Building Cross Platform, Cross Reality, Social Experiences Using XR
Earlier this year, our Global Director of Innovation Sol Rogers and I presented “Building Cross Platform, Cross Reality Social Experiences Using XR” at the MetroXRAINE 2024 conference. If you missed the event, this article captures the highlights of our presentation and provides details on our Connected Spaces Platform (CSP) and how it can help developers create interoperable social experiences using XR.
What is Cross-Reality (XR)?
XR is the convergence of Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), creating environments where virtual and physical spaces blend seamlessly. It allows users to step into shared digital worlds, interact with persistent virtual objects anchored to real-world locations, and collaborate in real time.
What is Cross-Platform?
Cross-platform XR ensures these immersive experiences are accessible on a variety of devices, from mobile phones and desktops to VR headsets and AR glasses. Whether you’re wearing a high-fidelity Meta Quest 3 headset or joining on a smartphone, the content remains consistent and synchronized.
Why Multiplayer is Essential to XR
In a world that’s information-rich but experience-poor, the true magic of XR lies in bringing people together. Multiplayer XR experiences create shared environments where people can interact with each other in real time, whether it’s a virtual concert where attendees in VR and AR see the same digital performers or a global team collaborating on a shared 3D prototype anchored in physical space. This kind of interaction opens up new possibilities for entertainment, collaboration, and community building.
A Multiplayer Example
Prototypes like Cryptic Cabinet, a mixed-reality escape room we developed with Meta, showcase what’s possible. Using Meta’s Presence Platform and CSP, this experience enabled multiple users to interact with digital objects within a physical room while maintaining low-latency synchronization. Such prototypes highlight how XR can blend the physical and digital to create engaging, real-time multiplayer experiences.
Technologies Making Cross-Reality Social Experiences Possible
We’ve built a powerful toolset to tackle the unique challenges of XR development. Central to this is our Connected Spaces Platform (CSP) — an open-source middleware that simplifies the creation of cross-reality, cross-platform experiences.
Key Features of CSP:
Real-Time Multiplayer: Low-latency, synchronized interactions across platforms (see the sketch after this list).
Cross-Platform Compatibility: Operability with major engines like Unity, Unreal, and WebXR.
Spatial Anchoring: Ensures objects and interactions remain persistent across sessions and devices.
Scalable Infrastructure: CSP supports everything from intimate gatherings to large-scale events, adapting in real time to user demand.
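To make the multiplayer feature above more concrete, here is a minimal sketch of the replication pattern a platform like CSP provides: clients join a shared space and receive transform updates for entities changed by other participants. The names and classes below are illustrative assumptions for this article, not the actual CSP API.

```typescript
// Hypothetical replication sketch: clients join a shared space and receive
// transform updates published by other participants. Illustrative only.

type Vec3 = { x: number; y: number; z: number };

interface EntityUpdate {
  entityId: string;
  position: Vec3;
  rotation: { x: number; y: number; z: number; w: number }; // quaternion
  senderId: string;
}

type UpdateHandler = (update: EntityUpdate) => void;

// In-memory stand-in for the real-time service that relays updates to peers.
class SharedSpace {
  private handlers = new Map<string, UpdateHandler>();

  join(clientId: string, onUpdate: UpdateHandler): void {
    this.handlers.set(clientId, onUpdate);
  }

  leave(clientId: string): void {
    this.handlers.delete(clientId);
  }

  // Broadcast an update to every participant except the sender.
  publish(update: EntityUpdate): void {
    for (const [clientId, handler] of this.handlers) {
      if (clientId !== update.senderId) handler(update);
    }
  }
}

// Usage: two clients share one space; client A moves an entity, client B sees it.
const space = new SharedSpace();

space.join("client-B", (u) =>
  console.log(`client-B sees ${u.entityId} at`, u.position)
);
space.join("client-A", () => {});

space.publish({
  entityId: "prototype-table",
  position: { x: 1.2, y: 0.0, z: -0.5 },
  rotation: { x: 0, y: 0, z: 0, w: 1 },
  senderId: "client-A",
});
```

In a real deployment, the in-memory SharedSpace would be replaced by the platform’s real-time service, and receivers would typically interpolate incoming transforms to smooth out network jitter.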
Using an alpha version of this tech, we created the world’s largest connected space for Expo 2020 Dubai. Over 39 months, we designed and developed a 4 km² digital twin of the Expo site, allowing millions of physical and virtual visitors to interact with a unified digital layer.
How It Worked:
Real-Time Synchronization: On-site and remote visitors experienced events in perfect sync, thanks to our real-time multiplayer capabilities.
Spatial Persistence: Digital objects and interactions remained anchored in the physical Expo site, allowing for a consistent and engaging experience across sessions (see the sketch below).
Cross-Platform Accessibility: Participants could join via mobile, web, or VR, ensuring inclusivity for a global audience.
The project demonstrated how XR can bridge the gap between physical and digital spaces, creating a shared experience for people "there" and "not there."
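As a rough illustration of the spatial persistence idea above, the sketch below stores content relative to a named anchor rather than in raw device coordinates, so any later session that can resolve the same anchor restores the content in the same physical spot. The anchor IDs, storage, and omission of rotation are simplifying assumptions, not the Expo implementation.

```typescript
// Hypothetical spatial-persistence sketch: content is authored relative to a
// named anchor, then restored once a later session resolves that anchor.

type Vec3 = { x: number; y: number; z: number };

interface AnchoredObject {
  anchorId: string;   // e.g. an ID shared via a cloud-anchor service
  localOffset: Vec3;  // object position expressed relative to the anchor
  assetUrl: string;
}

// Pretend persistence layer (in production this would be a backing service).
const store = new Map<string, AnchoredObject[]>();

function saveObject(obj: AnchoredObject): void {
  const list = store.get(obj.anchorId) ?? [];
  list.push(obj);
  store.set(obj.anchorId, list);
}

// On a later session (or another device), the anchor is resolved to a world
// pose by the platform's tracking stack; content is then placed relative to it.
function restoreObjects(anchorId: string, anchorWorldPos: Vec3): Vec3[] {
  const objects = store.get(anchorId) ?? [];
  return objects.map((o) => ({
    x: anchorWorldPos.x + o.localOffset.x,
    y: anchorWorldPos.y + o.localOffset.y,
    z: anchorWorldPos.z + o.localOffset.z,
  }));
}

// Session 1: author content next to the "expo-plaza" anchor.
saveObject({
  anchorId: "expo-plaza",
  localOffset: { x: 0.5, y: 1.0, z: 0.0 },
  assetUrl: "https://example.com/exhibit.glb",
});

// Session 2: the same anchor resolves to a device-specific world position,
// and the content reappears in the same physical spot.
console.log(restoreObjects("expo-plaza", { x: 10, y: 0, z: -3 }));
```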
Overcoming Technical Challenges
Building cross-reality, cross-platform social experiences isn’t without its hurdles. Developers face several key challenges:
Coordinate Systems: Every engine and device uses its own coordinate conventions (handedness, up axis, units), requiring careful transformations to keep content consistent across platforms.
Interoperability: Ensuring a seamless experience across hardware ecosystems like Meta Quest, ARCore, and WebXR requires extensive testing and integration.
Rendering Optimization: Different devices have varying performance capabilities, making it crucial to optimize rendering for smooth, immersive experiences.
We’ve streamlined this process by developing several plugins for Web, Unity, and Unreal on top of the Connected Spaces Platform to handle these challenges for you. These are part of a set of cross-platform multiplayer applications that we call OKO.
Our APIs convert coordinates into the convention appropriate to the platform you are targeting and provide functionality to replicate transforms and related metadata, ensuring that your player remains interoperable across viewing modes (AR / VR / default) and platforms (Unreal / Unity / Web).
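To illustrate the kind of conversion involved, here is a simplified sketch that assumes one possible set of conventions: a canonical right-handed, Y-up, metre-based space (as used by glTF and WebXR), Unity’s left-handed, Y-up, metre-based space, and Unreal’s left-handed, Z-up, centimetre-based space. The exact conventions OKO uses internally may differ; this only shows the handedness, axis, and unit swaps such a layer has to perform (rotations need an equivalent treatment, omitted here).

```typescript
// Simplified coordinate conversion under the assumptions stated above.

type Vec3 = { x: number; y: number; z: number };

// Canonical (right-handed, Y-up, metres) -> Unity (left-handed, Y-up, metres):
// flip the Z axis to change handedness.
function toUnity(p: Vec3): Vec3 {
  return { x: p.x, y: p.y, z: -p.z };
}

// Canonical -> Unreal (left-handed, Z-up, centimetres):
// forward (-Z) maps to +X, right (+X) maps to +Y, up (+Y) maps to +Z,
// and metres are scaled to centimetres.
function toUnreal(p: Vec3): Vec3 {
  return { x: -p.z * 100, y: p.x * 100, z: p.y * 100 };
}

// One metre in front of the origin, at roughly head height, in canonical space.
const canonical: Vec3 = { x: 0, y: 1.6, z: -1 };
console.log(toUnity(canonical));  // { x: 0, y: 1.6, z: 1 }
console.log(toUnreal(canonical)); // { x: 100, y: 0, z: 160 }
```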
Furthermore, our plugins adapt to the rendering capabilities of the target device (desktop or mobile), using performance optimization techniques such as LODs, culling, and simplified rendering pipelines for lower-end devices.
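As a hedged sketch of what that adaptation can look like, the snippet below picks a settings profile per device tier and biases LOD selection accordingly. The tiers, thresholds, and settings are illustrative assumptions for this article, not OKO’s actual heuristics.

```typescript
// Illustrative per-tier render settings and distance-based LOD selection.

type DeviceTier = "low" | "medium" | "high";

interface RenderSettings {
  lodBias: number;         // higher bias selects coarser LODs sooner
  maxDrawDistance: number; // metres before objects are culled
  shadows: boolean;
  postProcessing: boolean;
}

function settingsForTier(tier: DeviceTier): RenderSettings {
  switch (tier) {
    case "low": // e.g. older mobile phones
      return { lodBias: 2, maxDrawDistance: 50, shadows: false, postProcessing: false };
    case "medium": // e.g. standalone headsets, recent phones
      return { lodBias: 1, maxDrawDistance: 150, shadows: true, postProcessing: false };
    case "high": // e.g. desktop GPUs
      return { lodBias: 0, maxDrawDistance: 500, shadows: true, postProcessing: true };
  }
}

// Simple distance-based LOD selection, biased by the device tier.
function selectLod(distance: number, lodBias: number, lodCount: number): number {
  const base = Math.floor(distance / 25); // one LOD step every 25 m (illustrative)
  return Math.min(lodCount - 1, base + lodBias);
}

const settings = settingsForTier("medium");
console.log(settings, selectLod(60, settings.lodBias, 4)); // -> LOD 3
```

Keeping these decisions in one place makes it straightforward to tune per-platform budgets without touching scene content.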
To learn more about OKO, check out these resources:
What’s Next for XR?
As technology advances, several trends are shaping the future of XR:
Lightweight AR Glasses: Innovations like Meta’s Project Orion and Snap’s Spectacles are making AR more accessible and user-friendly.
5G and Cloud Rendering: Faster data speeds and cloud-based processing allow devices to offload computationally intensive tasks, enabling lighter, more mobile hardware.
Gaussian Splats and Digital Twins: These technologies enhance realism in XR environments, improving spatial interactions and visual fidelity.
These trends will empower developers to build richer, more inclusive XR experiences, driving adoption across industries like healthcare, education, and entertainment. It’s an exciting time to explore what’s possible and to push the limits of how we connect in the digital age!