Connecting digital and physical worlds to unlock new design possibilities

How the AEC and immersive industries can work together to shape better human-centred environments.


With ever-rising interest in the concept of the “metaverse” and real-time/immersive technologies becoming more sophisticated and affordable every day, everybody is trying to envision what the future of digital and physical environments will look like and how their mix will change people’s lives.

I was so fascinated by the subject that a few years ago I made a (pretty smooth) career change, moving from an MEng in Architectural Engineering, through computational design and research, and eventually landing in real-time development and programming.

In this article, I would like to share some thoughts on how architecture may help shape digital spaces and, vice versa, what the immersive industry can do for the AEC sector (Architecture, Engineering and Construction) to help design and manage physical ones, now that the boundaries between the two worlds are inevitably starting to fade.

Digital to physical

How real-time technologies can help architects to shape better physical spaces

It is no surprise that real-time technologies are already being leveraged by the AEC industry for a multitude of purposes, such as producing photorealistic ArchViz, enhancing workflows, speeding up the design process, and easing communication.

In the last few years, real-time giants like Epic and Unity have been expanding their businesses to target the AEC industry, and the market is growing rapidly. Many events are taking place in the field: Epic, for example, recently held “Build: Architecture 2021”, which highlighted a series of case studies, new partnerships and advancements in products developed specifically for architectural firms.

It is worth mentioning one of the most popular tools Epic provides specifically for architects: Twinmotion (powered by Unreal Engine). Its goal is to make ArchViz extremely easy and quick, allowing architectural firms to produce realistic real-time visualizations themselves in a very short time and without any prior knowledge of real-time technologies. Twinmotion can be easily integrated into the AEC pipeline, as it offers a seamless way to import CAD and BIM files and keep them in sync with software like Archicad, Revit, Rhino, RIKCAD, SketchUp Pro, and Vectorworks. Once a model is imported, the user can apply materials, access an extensive built-in asset library, drag and drop static and animated assets (including Quixel Megascans) such as objects, vegetation, and people, and experiment with different lighting and weather conditions. The result can then be visualized in real time on desktop, web, and VR platforms.

Real-time 3D allows architects to quickly test design alternatives, which can even be adjusted while discussing them with clients or colleagues, at any stage of the project rather than just at the end. The AEC world is typically full of complex and slow processes, so facilitating communication between different firms and departments can dramatically speed those processes up and help avoid mistakes and the financial losses that follow from them.

If we think about the possibility of creating multiplayer, shared environments where designers, engineers and other stakeholders can discuss, review and modify their designs in real time, even remotely, it becomes clear that we are on the cusp of a radical shift in what communication and interaction in the work environment will look like in the near future.

But this is just the beginning. What else can real-time technologies do for the AEC industry? Possibilities are endless and waiting to be explored.

Datasmith makes it easy to import BIM and CAD data directly into Unreal Engine and Twinmotion, using a live link connection to keep all data in sync. An SDK was also recently released so that third-party software vendors can build their own Datasmith integrations with Unreal Engine, meaning an ever-growing number of file formats will be importable into the engine.
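
To give a concrete feel for this workflow, here is a minimal sketch of scripting a Datasmith import with Unreal's Python editor API. It assumes the Python Editor Script and Datasmith Importer plugins are enabled, and the file and destination paths are placeholders:

```python
# Minimal sketch: scripting a Datasmith import inside the Unreal
# Editor via its Python API. The .udatasmith path and the /Game
# destination folder are placeholder values.
import unreal

scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(
    "C:/Exports/tower.udatasmith"
)
if scene is None:
    raise RuntimeError("Could not open the Datasmith file")

# Pull the geometry, materials and metadata into the project.
result = scene.import_scene("/Game/Architecture/Tower")
if not result.import_succeed:
    raise RuntimeError("Datasmith import failed")

scene.destroy_scene()  # release the temporary import context
```

From there, the imported content can be kept up to date with the source model through Datasmith's reimport and Direct Link features.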

It’s also possible to import 3D scans and photogrammetry thanks to Quixel, which offers a wide variety of assets in its Megascans library, and Capturing Reality’s RealityCapture, which allows users to process their own scans. GIS data can also be streamed and parsed in Unreal Engine to recreate large 3D environments.

Centre Block Project by HOK: reconstruction of the environment into point clouds and meshes using RealityCapture photogrammetry.

It is important to note that when we talk about data, we don’t just mean 3D geometry and 3D scans. Any sort of data related to a building or a city can be imported and processed inside Unreal Engine: construction sequencing data, costs, physical material properties, building services data, real-time temperature and humidity, streamed data from IoT sensors installed in building equipment, occupancy data, people’s movement, city-scale transportation data and so on.
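
As an illustration of the ingestion side, here is a small, engine-agnostic sketch of normalising streamed sensor readings before they drive a digital twin. The endpoint URL and JSON field names are invented for the example:

```python
# Hypothetical sketch of ingesting streamed IoT readings for a digital
# twin. The endpoint URL and JSON field names are invented; a real
# deployment would use whatever the building exposes (MQTT, a BACnet
# gateway, a vendor REST API, ...).
import json
import urllib.request
from dataclasses import dataclass

@dataclass
class SensorReading:
    room_id: str          # maps the reading onto a space in the 3D model
    temperature_c: float
    humidity_pct: float
    occupancy: int

def fetch_readings(url: str) -> list[SensorReading]:
    """Poll a (hypothetical) REST endpoint and normalise its payload."""
    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)
    return [
        SensorReading(
            room_id=item["room"],
            temperature_c=item["temp"],
            humidity_pct=item["humidity"],
            occupancy=item["people"],
        )
        for item in payload["readings"]
    ]

# Inside the engine, each reading would then drive the visual state of
# the matching room, e.g. tinting it by temperature or occupancy.
```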

Recreating exact digital copies of real buildings and cities and combining them with real-time data showing how they behave brings them very close to their “real” equivalents: that’s why they are often referred to as “Digital Twins”.

This scenario opens up endless possibilities, including apps to monitor and display data visually and intuitively, facilitate decision-making, engage communities in a city, and more.

Moreover, data don’t necessarily have to be streamed in; they can also be streamed out. Real-time simulations, user movements and any other data coming from the user can be used to control physical features in a real environment. Examples range from controlling home automation in a house, to modulating real-time light and audio effects during an event, to operating a robot for off-site digital fabrication.
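
Going in the other direction can be as simple as publishing commands on a message bus. The following sketch uses the paho-mqtt client (>= 2.0); the broker address, topic scheme and payload keys are all placeholders, not a real building-automation protocol:

```python
# Hypothetical sketch of "streaming out": publishing commands from a
# real-time scene to physical devices over MQTT using paho-mqtt.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect("broker.example.local", 1883)
client.loop_start()  # network loop runs in a background thread

def on_user_dimmed_light(room_id: str, level: float) -> None:
    """Called by the engine when a user adjusts a virtual light."""
    client.publish(
        f"building/{room_id}/lighting/set",
        json.dumps({"brightness": max(0.0, min(1.0, level))}),
    )

on_user_dimmed_light("meeting-room-2", 0.35)

client.loop_stop()
client.disconnect()
```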

But this doesn’t have to be the final part of the puzzle. What if we come back to the early stages of design with all these new technologies in mind?

During the last few years, architects have been using procedural mesh generation techniques and software (like Grasshopper for Rhino and Dynamo for Revit) to push architectural design possibilities, optimize and automate complex modelling, and integrate parameters such as structural and energy data to inform the procedural design process from the very early stages. It is easy to see how that same process can be pushed to an entirely new level when powered by the aforementioned real-time data and technologies.
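
As a toy illustration of this parameter-driven approach, the sketch below generates a grid of building masses whose heights follow an invented “solar score”; a real workflow would plug in actual analysis results:

```python
# Toy, engine-agnostic sketch of parametric massing: tower heights on a
# grid are driven by an input parameter (here a fake "solar score"),
# mimicking how Grasshopper/Dynamo graphs feed analysis data into form.
import math

def solar_score(x: int, y: int) -> float:
    """Stand-in for real solar-exposure analysis (0 = shaded, 1 = sunny)."""
    return 0.5 + 0.5 * math.sin(x * 0.8) * math.cos(y * 0.8)

def generate_massing(cols: int, rows: int, max_floors: int) -> list[tuple]:
    """Return (x, y, floors) for each plot, taller where sunnier."""
    return [
        (x, y, max(1, round(solar_score(x, y) * max_floors)))
        for x in range(cols)
        for y in range(rows)
    ]

for x, y, floors in generate_massing(4, 3, 20):
    print(f"plot ({x},{y}): {floors} floors")
```

Re-running the generator whenever the input data changes is what would make the same idea work at runtime inside a real-time engine.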

Insomniac’s Spider-Man used procedural techniques to generate its New York City.

The real-time industry commonly uses procedural mesh generation and other techniques that closely resemble the “non-real-time” ones used by architects. What if architects were able to start the design process inside a real-time environment and procedurally create their conceptual models at runtime, linking their generation to real data and real-time simulations? UX and UI would have to be re-envisioned to accommodate those new interactions, and VR/AR could be used to facilitate the understanding of an increasingly complex environment.

There is already some cutting-edge research on the subject in academic and R&D environments. Earlier this year, ETH Zurich launched a new Center for Augmented Computational Design in Architecture, Engineering and Construction (“Design++”) and opened its new Immersive Design Lab (“IDL”). The IDL enables researchers and students to explore real-time interaction with virtual design and engineering models, as well as human-computer design and fabrication interfaces.

Another way real-time environments could add value to the design of physical spaces is by making it possible to bring real humans inside them. When a person enters a photorealistic digital twin using AR/VR/XR technologies, their perception is very close to the real one, and the way they interact with the environment is real. This brings an incredibly valuable new type of data into the equation: data about people’s behaviour. Heat maps can be generated from eye-tracking data, and movements and interactions can be recorded, stored in a database and used for biometric analysis, which may help inform the design of the real-world environment.
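
As a simple illustration of that aggregation step, the sketch below bins recorded gaze or movement samples into a heat map with NumPy; the sample coordinates and grid resolution are placeholders:

```python
# Sketch of aggregating recorded gaze/movement hit points (x, y
# positions on a floor plan, in metres) into an attention heat map.
import numpy as np

# Hypothetical samples logged during a VR design review session.
points = np.array([
    [1.2, 3.4], [1.3, 3.5], [4.0, 1.1],
    [1.25, 3.45], [4.1, 1.0], [2.0, 2.0],
])

heatmap, x_edges, y_edges = np.histogram2d(
    points[:, 0], points[:, 1],
    bins=(10, 10),                   # 10x10 cells over the plan
    range=[[0.0, 5.0], [0.0, 5.0]],  # plan extents in metres
)

# Normalise to [0, 1] so the map can be rendered as a colour overlay.
heatmap = heatmap / heatmap.max()
print(heatmap)
```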

Physical to digital

How architecture can help shape better immersive digital spaces

Digital spaces are not always going to be used to design and simulate real ones.

On the contrary, some digital spaces are destined to become the final product: the actual environment where people will want to spend part of their time. These immersive digital spaces are shaping the idea of the “metaverse” (to which the recent Facebook Connect 2021 conference brought even more attention).

Facebook Connect 2021 Keynote.

There are various definitions of the term, but the concept includes a series of interconnected cross-platform immersive spaces that will allow people to meet from remote locations using virtual avatars, perform any sort of activity together or alone, buy and share objects across different apps and environments and so on.

According to Meta’s vision, the metaverse will be a persistent digital reality that will give a sense of presence and will be accessible from a range of different platforms, using AR, VR, desktop or mobile. It will permeate all aspects of our lives and change the way we work, learn, do sports, interact with others and spend our free time.

Persistence in augmented and mixed reality means that the digital world (be it private or public) will be tightly coupled with the physical one. Extending this concept a bit further, and considering the possibility of people being massively involved in the metaverse in the future, it’s easy to envision a rapidly growing number of persistent digital layers starting to populate and augment our homes, public spaces, venues and so on.

Of course, such augmented content will not be subject to the laws of physics, but it will become a part of the human environment anyway. As such, who better than architects has the culture, knowledge and expertise to help shape a respectful and coherent relationship with the pre-existing physical space?

Human biology and psychology won’t change any time soon: people will still perceive space the way they always have, the feeling of comfort and ergonomics won’t change, and people will still have the same basic needs that architects have been trying to satisfy for centuries.

It is only a matter of time before we start questioning how to regulate the explosion of augmented and digital content that will permeate our lives. Alongside other issues, we will need to start thinking of ways to ensure a good quality of life in the virtual world.

Moreover, the design of buildings, infrastructure and urban plans always starts with assumptions about the use and purpose those spaces will have. Constructions are built and engineered against those assumptions, and when old buildings are reused for new activities, they have to be verified as still suitable for them.

We should consider that the metaverse will likely bring a massive reuse and re-functionalization of existing physical spaces for virtual activities, which may appear overnight, leaving little time for regulation. If people start using existing spaces in completely different ways, risks to health and safety may arise.

Last but not least, let’s not forget that architects may also help the immersive industry by designing new physical spaces particularly suited to hosting and complementing virtual content. VR/AR/MR headsets will likely evolve to become lighter and closer to wearable accessories, but other pieces of hardware may still be required for certain types of immersive experiences (today we would say projectors, screens, sensors, cameras, lights, speakers, markers, custom furniture and so on, but who knows what technology will be developed in the future!). These technologies may need to be integrated within real spaces in a flexible way, creating elegant and functional physical spaces that can be seamlessly “augmented” with good-quality “virtual layers” on top.

It is clear that the AEC and immersive industries are already working together, and will continue to do so, to advance the possibilities of physical and virtual spaces.

There are still many challenges to overcome, but architects, engineers, designers, real-time developers and artists will all share the responsibility to build a better future environment for human beings to live in, be it physical, digital or somewhere in between.
