It takes a village to bring a building design from concept to reality. And with designers, architects, engineers, and clients bringing their own perspectives to a project, even the most sophisticated digital models can lack the depth that lets stakeholders actually feel how a design will play out in real life.
Hyperrealistic virtual worlds have long been the realm of the video-game industry, as anyone who’s lost too many nights to Grand Theft Auto or Call of Duty can attest. In construction—an industry still bogged down by slow, paper-based processes—immersive 3D modeling tools have been difficult and time-consuming to use, so they’re often deployed only to add marketing sheen to a final product.
But that’s changing, as tools adapted from video-game technology let designers create lifelike, interactive experiences that help project stakeholders get to the right design decisions sooner, no matter their technical skills.
Extended reality (XR), an umbrella term for virtual, augmented, and mixed reality (VR, AR, and MR), touches all aspects of architecture, engineering, and construction (AEC). An architectural firm bidding on a project can create a realistic VR environment to walk a client through an unbuilt space, and the client can take an active role in refining the design, with changes instantly implemented in the virtual model. Developers can sell a future building by navigating a stakeholder through a site. And when a project is finished, technicians in the field can use AR to maintain buildings and replace equipment.
From Video-Game Worlds to Virtual Buildings
Julien Faure is product marketing director at Unity, a software firm with roots in the video-game industry; developing tools for creating real-time, interactive digital models has been central to the company’s mission.
Faure highlights some ways immersive technology lets building designs be experienced from different points of view. For a sports stadium, for example, the model could simulate how fans view the game from various locations. “It helps optimize the positioning of the seats and even helps sell the private suites before they get built,” he explains. Other uses include simulating crowd movements to test security requirements and training facility staff before the ribbon cutting.
When building a complex project such as a hospital, gathering input from end users is a crucial part of the design phase. “How do you capture that feedback before you design things that actually don’t work in real life?” Faure asks. “The only way is to create an environment that looks and reacts exactly like the building and have people get immersed in this environment and give feedback.”
Engineering firms are using virtual environments to make these design changes long before construction begins. “They have surgeons and medical staff and nurses in the room with VR headsets, and immediately they see problems,” he says. The layout of an operating room might need to change to accommodate two surgeries at the same time, or a window that would bring in too much light could be eliminated. “The amount of feedback you get by allowing nonengineering professionals to experience the space is huge.”
In another example, to design Unity’s London offices, agency Oneiros and builders M Moser Associates developed an Autodesk 3ds Max software-to-Unity workflow to collaborate on room visualizations in real time.
Building Smarter, Faster, Safer
Rapid tinkering with designs before they’re set in pixels—or steel—saves time and money. Builders and contractors can leverage 3D environments to better sequence construction processes. An interactive model can pinpoint how long each granular step will take, including tasks such as excavation, pouring concrete, assembling prefab HVAC units, brickwork, and roof laying. According to Faure, some companies have reduced project timelines by as much as 35% by better sequencing their work.
And, when construction starts, teams on the ground can use AR to overlay BIM models on jobsites, which is a lot easier than shuffling through thousands of paper documents or PDFs.
Moving building design into a fully immersive 3D space creates opportunities to use virtual environments as machine-learning test labs—running simulation experiments again and again, refining designs as challenges arise.
For example, edge-case scenarios such as flooding, fires, or explosions are nearly impossible to simulate in the real world. Re-creating these hazardous situations at scale in virtual environments enables the collection of data necessary to train teams and autonomous systems.
“This is already how self-driving vehicles are learning, saving automotive companies from driving large fleets of sensor-equipped vehicles for billions of miles to collect the right amount of data,” Faure says. “For the AEC industry, where accidents and injuries are still too common, it will be a game changer to develop better safety equipment, construction robots, and building sensors.”
VR models can also tackle acoustic engineering in AEC by simulating sonic sensory input. “The majority of the world’s population lives in cities where millions suffer from high noise exposure,” Faure says. “Creating spaces that are beautiful and environmentally friendly but also quiet and soundproof is critical.”
By taking a BIM model into a platform such as Unity, a designer can simulate the acoustics of sound waves passing through a facility and bouncing off specific materials; users can hear the difference between sound reflecting on a tree versus a piece of stone, or an open versus closed window.
This fall, Unity will launch the Unity Reflect 3D visualization plug-in for Autodesk Revit. Unity Reflect translates BIM models into an immersive 3D model that retains BIM metadata and requires little technical expertise to explore and alter. Changes to the Revit model show up immediately in the Unity Reflect model.
“The idea of Unity Reflect is to take the data-optimization process from weeks to seconds,” Faure says. Unity is designed as an open platform, and the software automatically integrates data sources from different disciplines. “If you have a mechanical engineer working on one aspect of the model and an interior designer working on another aspect, we can merge all models into one.” (SHoP Architects integrated Unity Reflect into its design process for 9 DeKalb, a residential tower set to become the tallest structure in Brooklyn, NY.)
When Virtual Becomes Reality
Looking to the future of XR in AEC, Faure expects fewer barriers to entry and more intuitive ease of use, as well as deeper integration of simulation-based machine learning into everyday life. Reactive and dynamic environments require machine-learning trial and error to interpret human behavior; AEC digital models could be the petri dish. “Maybe your furniture will detect who is in the room and adjust to your configuration preferences,” he says. “Your chair will know that you are about to sit, and morph into the right shape for your body.”
Faure anticipates more convergence between manufacturing and AEC industries and more interoperability between building-scale AEC simulations and simulations of surrounding urbanism. “Auto manufacturers need AEC content in their virtual environments to simulate autonomous vehicles, and AEC companies need to integrate autonomous systems in their designs.”
For example, a set of digital models could test how hot cars pulling into a parking garage on a summer afternoon affect the structure’s ability to mitigate the urban heat-island effect. An AEC modeling application for this use might sound like the world’s most boring video game, but the cumulative effect is nearly limitless. The XR simulations of tomorrow will be refined by one another as much as by people. And as digital models talk to each other, their conclusions could be just as transformative and astonishing as any video game, yielding smart buildings that could only arise from smart models.