Architectural Revolution with Augmented and Virtual Reality
This post originally appeared on blog.up.co
There is a lot of buzz around virtual reality (VR) gaming lately. Recently, Sony entered the VR market by launching the PlayStation VR headset, competing against the HTC Vive and the Oculus Rift.
At Startup Weekend Copenhagen, our team really wants to explore the possibilities of building a VR case within a business-to-business (B2B) setting. New software and hardware platforms are emerging that allow immersive environments to be integrated with motion tracking and capture on ordinary computers. VR can therefore be much more than just gaming.
Since architecture plays a big part in Copenhagen, the thought of combining architectural design with the VR world is very interesting. It allows designers to envision and virtually immerse themselves in three-dimensional conditions where they can design with intuitive hand and body motions.
First, new interfaces and custom workflows need to be created: the traditional keyboard and mouse must take a back seat in the design process. Second, augmented reality (AR) and virtual reality (VR) platforms need to take their place in the designer's hands.
VR is advancing, and it is easy to imagine architectural practice following the trend. However, combining the architectural drawing process with immersive 3D environments in which we design with our hands is a bit harder to envision. Developing an advanced software solution (or something else entirely) for "visualized drawing" that creates links between visualization and simulation is essential.
I believe this will trigger a paradigm shift in how we comprehend scale drawing within an immersive spatial condition. Moreover, 3D modeling can be put to better use: designers will look at their 3D models not merely as visuals, but as fully aware, visualized constructions.
Why is this such a big benefit? It allows drawing (modeling) to become closely correlated with making itself. To give an example of what I mean: a drawn line is no longer merely a depiction of a surface; it is the surface itself. If you think about it, this new capability moves drawing well beyond the definition and representation of space that architects have used from the Renaissance through the 20th century.
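To make the idea concrete, here is a minimal sketch of what "the line is the surface" could mean in software. All names are hypothetical and not tied to any particular CAD tool: a hand-drawn stroke is extruded directly into wall geometry instead of staying a 2D depiction.

```python
from dataclasses import dataclass

@dataclass
class Point3:
    x: float
    y: float
    z: float

def extrude_stroke(stroke, height):
    """Turn a drawn polyline directly into wall geometry: each drawn
    segment becomes a quad (two triangles) of a vertical surface."""
    vertices = []
    triangles = []
    for p in stroke:
        vertices.append(Point3(p.x, p.y, p.z))           # bottom edge
        vertices.append(Point3(p.x, p.y, p.z + height))  # top edge
    for i in range(len(stroke) - 1):
        b0, t0, b1, t1 = 2 * i, 2 * i + 1, 2 * i + 2, 2 * i + 3
        triangles.append((b0, b1, t1))
        triangles.append((b0, t1, t0))
    return vertices, triangles

# An L-shaped stroke of three points becomes two wall panels.
stroke = [Point3(0, 0, 0), Point3(2, 0, 0), Point3(2, 3, 0)]
verts, tris = extrude_stroke(stroke, height=2.5)
```

The point of the sketch is that the drawing gesture and the built geometry are one and the same data; there is no separate "representation" step.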
New Workflows for the Design Process
For immersive spatial simulation to be practical, architects can combine existing hardware and software stages. Using head-mounted displays (HMDs), it is possible to give the designer an experience of the design at full scale. At the desktop application level, designers need to be able to integrate into the virtual context easily, without adopting entirely new software workflows. Together, these give the designer the power to translate 3D geometry along with its surface texture maps (see the video above).
Photorealistic VR and Augmented Conditions
I imagine that a VR solution of this kind would start with a blank canvas for drawing and modeling. I would assume, though, that most designers would prefer to work on a project in the photorealistic rendering that VR makes possible.
Various augmented reality applications, combined with smartphone sensors, also make it possible to geolocate projects in the design software and "publish" them to the correct physical location, where they can then be viewed through the AR software.
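A rough sketch of what "publishing to the correct place" could involve, under stated assumptions: the design tool attaches real-world coordinates to a model, and an AR client uses the viewer's distance to decide whether to render it. The anchor format, function names, and model ID below are all illustrative, not any real AR platform's API.

```python
import json
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def make_geo_anchor(model_id, lat, lon, altitude_m, heading_deg):
    """Package a model reference with its real-world pose for 'publishing'."""
    return {
        "model": model_id,
        "latitude": lat,
        "longitude": lon,
        "altitude_m": altitude_m,
        "heading_deg": heading_deg,  # rotation around the vertical axis
    }

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance, so an AR client can check whether the viewer
    is close enough to the anchor to render the model."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# A hypothetical pavilion anchored in central Copenhagen.
anchor = make_geo_anchor("pavilion_v3", 55.6761, 12.5683, 5.0, 90.0)
payload = json.dumps(anchor)  # what the design tool would "publish"
```

Real AR frameworks handle the hard parts (sensor fusion, drift correction) behind similar anchor abstractions; the sketch only shows the data flow from design software to viewer.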
What Does the Future Hold?
Several VR and AR technologies have already been applied to design and construction processes on selected office projects. Using a project's building information model, one can not only coordinate the building design and its systems but also share a deeper comprehension of the project with the client through the available VR and AR technologies.
If you have further interest in VR in architecture, I recommend watching the TEDx talk by Gunita Kulikovska. She also points out a communication issue between clients and architects.
While this technology successfully immerses the user in the virtual environment, it leaves the user unable to interact with the physical environment itself.
Another challenge in incorporating these technologies into a consistent workflow is the lack of native hardware support across the multiple design platforms currently in use. So far these products are platform-specific, and protocols have to be custom-designed for each use of the device. For example, if you have tried VR before, you may have noticed that hand gestures are not recognized by default; they have to be defined first.
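The gesture problem can be sketched as follows. This is a hypothetical registry, not a real VR SDK: nothing is recognized by default, so each application must bind device-specific gesture events to design actions itself, and repeat that work for every platform.

```python
class GestureRegistry:
    """Hypothetical per-application mapping from gesture events to actions."""

    def __init__(self):
        self._handlers = {}

    def register(self, gesture_name, handler):
        """Bind a custom-designed gesture to a design action."""
        self._handlers[gesture_name] = handler

    def dispatch(self, gesture_name, *args):
        handler = self._handlers.get(gesture_name)
        if handler is None:
            return None  # unrecognized gesture: the default on every platform
        return handler(*args)

registry = GestureRegistry()
registry.register("pinch_drag", lambda start, end: ("extrude", start, end))

# A registered gesture produces a design action...
action = registry.dispatch("pinch_drag", (0, 0, 0), (0, 0, 2.5))
# ...while an unregistered one is simply ignored.
ignored = registry.dispatch("air_tap")
```

Cross-platform input standards aim to move these bindings out of individual applications, but until they are widely supported, every tool carries its own version of this table.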
These challenges (and probably many more) certainly stand out, but they will be addressed through constant improvements in VR and AR. I am starting to question whether we even want to solve the feeling of being unable to interact with physical space.
This blog post originally appeared on LinkedIn.