We’re now an Unreal Engine Service Partner!


Since Magnopus was founded a decade ago, the team has been creating next-generation experiences using Unreal Engine. Over the years, we’ve worked hand-in-hand with the Epic team, using their tools and even helping them build more to empower creators and developers. 

When we heard they were looking for teams who were “Helpful, honest, and creative problem solvers with a willingness to share their knowledge,” we felt seen. The vetting process was a higher bar than we expected, but if you’re going to help people sort out complex problems, it damn well should be a high bar.

Now, we’re proud to have been named an Unreal Engine Service Partner by Epic Games. This recognition places us within a small network of Epic-approved partners, offering top-tier execution, co-dev, and technical support for people exploring the future.

Our Unreal Engine expertise spans experience design, world-building, systems engineering, AR, VR, Geospatial, IoT, and Virtual Production. But our mission has always been single-minded: to unite physical and digital to fill the world with extraordinary experiences. We believe in interoperability and open standards, and we’ve got the GitHub commits to back it up.

If you’re interested, here’s a little more about what we can do for you with Unreal Engine:

Cross-reality, spatial experiences

Creating compelling immersive experiences requires insane complexity to be rendered invisible to the user, all at exceptional quality and eye-watering performance. Unreal Engine delivers unparalleled visuals and capabilities, allowing us to make content that enhances both the physical and digital worlds.

Our team of 180+ artists, designers, and engineers, working from studios in Los Angeles and London, uses UE to achieve the highest levels of immersion, interactivity, and visual fidelity. These experiences are the product of years spent building deep UE expertise, plus a software and hardware stack that makes next-gen content faster, better, and (if it’s your priority) cheaper. That applies to AR or VR, for consumers or enterprises, in-home or location-based.

The team here is attracted to every kind of cross-reality project. Though our chops might seem to hail from media and entertainment, our backgrounds span gaming, formal and informal education, research, cloud services, hardware, architecture, retail, and more. Naturally, we’ve shipped experiences in all those areas. We’ve done a lot, and we’re proud of the scars we’ve earned along the way.

We’ve worked with big music labels and up-and-coming artists to make virtual concerts that span traditional and emerging platforms. We’ve whipped together digital twins of major metropolitan areas for filmmakers to shoot in LED volumes. We’ve hijacked worlds from films and made them immersive and interactive so that families can discover new characters and stories in the shadows. We’ve done it with robots and lasers. We’ve done it at night in the rain. Our favorite way to do it is when you never know we were even there.

Production. Virtual or otherwise.

For years, we’ve been working in both physical and Virtual Production, bridging film and TV content into next-generation experiences. We often find ourselves in collaborative “applied R&D” partnerships with filmmakers to solve hard production problems or expand the storytelling canvas, inviting audiences to become participants in the narrative.

Our creative journey starts with interactive exploration and visualization in pre-production, using real-time graphics and virtual cinematography tools. This approach lets filmmakers plan their production and shoot sequences that get everyone on the same page, reducing costs. During production, we help identify the best technologies, including being critical about whether LED volumes are a good idea for your project. We handle hardware and software systems integration. Our Virtual Art Department captures and creates real-time sets used in pre-visualization and principal photography. The worlds we build for the production of film and TV content can become the foundation for experiences in the audience’s home or beyond.

These experiences engage audiences in the worlds we create while maximizing the use of the assets generated during production. We leverage the latest distribution platforms to ensure that these interactive immersive experiences are engaging, extensive, and, most importantly, aligned with the filmmaker's vision.

We believe the success of these techniques comes from putting hardware in the hands of seasoned professionals and getting out of the way. We focus on integrating and supporting the hardware already familiar to a film crew, so we can bridge physical and digital into a single experience. We hate technology for technology’s sake. We’ve delivered Virtual Production solutions to happy people with and without LED stages. And if we’re working with an LED stage, we can custom-design one to your needs and build it with the right vendors, or work hand-in-hand with teams at the network of existing stages.

We’ve also been collaborating with Epic Games on the development of Virtual Production for years, including designing and building a suite of filmmaker-focused tools for use during VR Scouts, Virtual Blocking, Techviz Development, or final pixel Virtual Cinematography. 

Now, as an Unreal Engine Service Partner, we’re in an even better position to push the boundaries as we continue our shared mission of creating better experiences for people across physical and digital. Wherever they are, with whatever they’ve got. 
