As a child in Mexico, one of Ernesto Pacheco’s favorite pastimes was playing video games on an Atari console with his father. The trained architect—now CannonDesign’s director of visualization—had little idea that his early passion for gaming, and later interactive design, would evolve into a career on the cusp of technological transformation in architecture: multi-user virtual reality (VR).
Even more unbelievable to Pacheco is that the visualization technology that helped create the chomping yellow face of Pac-Man would mature into systems capable of generating lifelike, emotionally responsive 3D avatars based on real human beings. Now, in his St. Louis, MO, office, he watches designers and clients strap on HTC Vive Pro VR headsets and step inside prospective buildings and rooms together—making cocreative decisions that effectively shape the future before it happens.
Pacheco’s 18-year career has been a wild ride. Although at times the exponential acceleration of digital product development can give him fits—say, when a $13,000 piece of hardware goes obsolete in months—he relishes his job. Here, he offers a lens into how multi-user VR helps him read bodies and moods while transforming early-stage design and prototyping by inviting clients into immersive worlds.
What attracted you to multi-user VR as a design tool?
VR is a very isolating experience, and it has been really hard to break that barrier. First of all, you have to wear something on your head. So that’s uncomfortable. And then, depending on your level of comfort, you might experience dizziness and so forth. So it doesn’t translate well to people who are not used to the technology.
But then NVIDIA Holodeck came out in late 2017. Through our partnership with NVIDIA, we helped inform Holodeck’s beta testing for architectural design. It was the first VR platform that had the tools we needed—and not only that, it was multi-user. That changed everything for us because we can have up to 16 people in a room interacting in the space, which makes it more comfortable for everybody. Once you’re inside VR, you can see other users, talk to them, collaborate, pick up things. It’s amazing. When I first tried it, I was blown away.
CannonDesign developed a preliminary design for a teaching hospital at the University of Houston College of Nursing. How did VR change the design-review process?
In health-care projects, it’s common to build low-fidelity mockups of key spaces on-site. Nurses walk around and see where equipment is positioned, how high the countertops are, things like that. Now we can give nurses the opportunity to explore patient rooms in VR. If we need to make changes, we can do this quickly, export the rendered files, and try it again. You don’t have to rebuild the physical mockup. It takes around two or three weeks to build a physical mockup, and it can cost up to $35,000. And they’re wasteful because, in the end, you have to trash them.
Virtual reality is opening the door for us to create more mockups for clients, and it speeds up the design process. Clients can get back to us with change requests the same day, and we can make changes in a day or two. Before Holodeck, we’d have to rebuild entire mockups, which could take a week or two.
Does that kind of rapid feedback also help in the construction phase?
Absolutely. You can see heights, for one thing. So when you put on a headset and enter the virtual space, your height is true to life. If you’re tall like me—6 feet, 5 inches—you look like a 6-foot-5 guy. Also, your avatar behaves like you and has your mannerisms. So when you see someone in the space, you say, “Oh, that’s Andrew.” You know the avatar is Andrew by appearance and body language, without even talking to him.
How did your clients at the University of Houston react after seeing the patient rooms—and their animated selves—in VR?
It went surprisingly well. We had two clients jumping into Holodeck, trying it for the first time. And they picked it up with only brief instructions. We looked at the school from a dollhouse perspective. Then we went one-to-one so they could get a feel for the space.
In terms of the design, the CEO was concerned about several things: where the mechanical room was, how much space we’d allowed for the beds, and the materials used for the nursing station. One of the comments was, “What if we put glass, instead of wood, in front of the nursing station so practitioners can see equipment behind the wall without having to walk around the station?” And we did it on the fly.
What types of emotions can you see in clients using VR?
Well, you see a lot of emotions—frustration, for example: You can see people’s shoulders coming down and their heads slumping, and it’s just crazy. Virtual-reality platforms like NVIDIA Holodeck, in conjunction with the HTC Vive Pro’s sensors and AI [artificial intelligence], help our virtual avatars mirror users’ body positions. If you’re bending down, the avatar’s knees bend down. You also have animated facial expressions. You can smile; you can be sad, uninterested, or happy, and your avatar will reproduce this body language to suggest your mood.
When you see someone looking frustrated in a design review, how do you react?
I ask: What’s the matter? What is making you feel this way? Do you have any questions? We have an opportunity to address people’s concerns, to reflect their emotions right there. That empathy keeps people engaged with the experience.
What do you envision as the future for design visualization?
It’s going to be tied to AI and generative design. The idea is that, for instance, CannonDesign will have its own flavor of buildings, its own signature, and each type will have restrictions on site, height, or other conditions. And then you can plug this visual logic into Autodesk Dynamo Studio to explore concepts and automate certain tasks. The computer will run through thousands of design iterations and then let you handpick whatever you think is best for a project.
It’s going to improve lives. We don’t have the time to go through 10,000 options for a project. We can do 10 or 20 within a project’s life span, and that’s a lot. But now we have the opportunity to let the computer do the heavy lifting and focus on something else, something more specific and personalized to each building.
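The generate-then-handpick loop Pacheco describes can be sketched in a few lines of Python. Everything here is a hypothetical illustration—the dimensions, constraints, and ranking are invented assumptions, not CannonDesign’s design rules or Dynamo Studio’s actual API:

```python
from itertools import product

# Hypothetical constraints for one building "flavor" (all values are
# illustrative assumptions, not real project requirements):
MAX_FOOTPRINT_M2 = 2_000   # site restriction
MAX_HEIGHT_M = 45          # height restriction
MIN_TOTAL_AREA_M2 = 12_000 # program requirement
FLOOR_HEIGHT_M = 4

def generate_options():
    """Enumerate candidate building massings, keep only those that
    satisfy the constraints, and rank them for a designer to review."""
    options = []
    # Sweep width/depth (20-60 m in 5 m steps) and floor counts (2-12).
    for width, depth, floors in product(range(20, 61, 5),
                                        range(20, 61, 5),
                                        range(2, 13)):
        footprint = width * depth
        height = floors * FLOOR_HEIGHT_M
        total_area = footprint * floors
        if (footprint <= MAX_FOOTPRINT_M2
                and height <= MAX_HEIGHT_M
                and total_area >= MIN_TOTAL_AREA_M2):
            options.append({"width": width, "depth": depth,
                            "floors": floors, "area": total_area})
    # Rank by total floor area so a designer can handpick from the top.
    return sorted(options, key=lambda o: o["area"], reverse=True)

candidates = generate_options()
print(f"{len(candidates)} options pass the constraints; best: {candidates[0]}")
```

The computer does the heavy lifting of running every combination; the designer’s judgment enters only at the final handpicking step, which is the division of labor the answer above describes.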