Extended Reality

New Worlds, New Possibilities

Facebook / Meta Video Productions (MVP) reached out to Xitelabs for the company’s expertise in Extended Reality, specifically to bring the latest technology and virtual production capabilities to Meta’s virtual production facility. Xite was selected as the winning vendor based on its extensive experience developing and operating small-volume xR LED stages, and led the creative and technical direction for the launch of MVP’s first xR stage.

Xitelabs built the “Multiplex” world in Unreal Engine, integrating DMX and live show control for optimal scenic flexibility during shoots. The Meta XR stage employs Disguise as the core media server, with RenderStream and rx machines driving the high-resolution AOTO LED panels. Xite is a certified Disguise workflow partner and liaised smoothly between MVP and Disguise technical support. Xite’s onsite work identified and catalyzed new development features in the Disguise software, collaboratively advancing the Disguise xR platform.

The final implemented workflow allowed multiple presenters to stand and deliver to camera, along with virtual live feeds and AR or ‘Front Plate’ elements that add depth and interaction with the virtual environment. Virtual camera switching was successfully implemented in Disguise. For final delivery, Xite helped create and direct an internal explainer video introducing the capabilities of the Meta Extended Reality stage to key stakeholders across Meta’s video and event production teams, including Mark Zuckerberg, who experienced Xite’s virtual production technology and photorealistic world designs in person.

The project was over two years in the making, from initial concept to the final permanent installation on campus. The stage was the first xR integration into Meta’s video production ecosystem.

 

Credits

Xitelabs:
META Virtual XR set
 
Vello Virkhaus, Technical & Creative Director
Greg Russell, Creative Director & EP
Kevin Aguirre, 3D Designer (Emoji Machine)
Rodel Aragon, Unreal Engine Artist & Programmer
Valerie Aragon, Unreal Engine Artist & Programmer
Myokung Shon, Lead 3D Designer
Jeremy Vannix, Technical Director
Pierce Cook, Camera Director, Camera Color
Nathan “Smokey” Williams, LED Color
Amber Robertson, Virtual and Physical Lighting / Unreal Engine Artist
Charles Dabezies, d3 Lead & TD
 
LMG:
Physical Installation and Technical Design Implementation
 
Derrick Beale, Producer
Sean Borowski, Technical Director
Erien Reinerth, Producer
 
 

Case Study Reel

A look into the Meta XR Stage and Virtual World

Meta XR Virtual World

The Meta XR project consists of a single virtual world containing multiple shooting locations within one virtual set. The MVP production team can jump between locations dynamically in the middle of live filming, providing remarkable variability. Using a lighting console and Stream Deck, MVP can adjust camera positions, change environmental attributes, and integrate live video calls, all turnkey and all in real time using Unreal Engine and Disguise. Each environment location has full dynamic control of numerous atmospheric attributes. Utilizing engine features such as time of day, emissive lighting, and complex geometries, our team delivered a hyper-realistic, extremely flexible, dynamic virtual set to prove the Extended Reality concept company-wide.
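
As an illustration of how this kind of dynamic location jumping can be wired up, the sketch below shows a minimal Unreal Engine C++ actor that exposes a Blueprint-callable function for moving a camera rig between named shooting locations inside the single virtual world. The class and names (AXRLocationSwitcher, JumpToLocation) are hypothetical stand-ins, not the Xitelabs project’s actual implementation; in practice the function would be triggered from the show-control layer (console, Stream Deck, or DMX/OSC bindings).

```cpp
// XRLocationSwitcher.h -- illustrative sketch only; class and names are hypothetical.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "XRLocationSwitcher.generated.h"

// Maps a human-readable location name to a transform inside the single virtual set.
USTRUCT(BlueprintType)
struct FXRShootLocation
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FName LocationName;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FTransform CameraRigTransform;
};

UCLASS()
class AXRLocationSwitcher : public AActor
{
    GENERATED_BODY()

public:
    // Shooting locations authored in the editor for this virtual world.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "XR")
    TArray<FXRShootLocation> Locations;

    // The camera rig (or tracked-camera origin) that gets repositioned.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "XR")
    AActor* CameraRig = nullptr;

    // Called from show control to jump to a named location mid-shoot.
    UFUNCTION(BlueprintCallable, Category = "XR")
    bool JumpToLocation(FName LocationName)
    {
        if (!CameraRig)
        {
            return false;
        }
        for (const FXRShootLocation& Location : Locations)
        {
            if (Location.LocationName == LocationName)
            {
                CameraRig->SetActorTransform(Location.CameraRigTransform);
                return true;
            }
        }
        return false;
    }
};
```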

Emoji Machine

Consisting of conveyors, slides, gears, paths, and numerous other elements, the emoji machine is a strikingly gargantuan piece of innovation. Each ball processed by the machine resembles a single emoji from the ever-popular Facebook platform. By utilizing both Unreal Engine’s PhysX and Blueprint Visual Scripting systems, we were able to portray realistic movement for the emoji balls as they travel through the machine, and for the moving parts of the machine itself.

Emoji Machine Overview

Since UE4’s PhysX system is frame-based, the team needed to ensure that the target frame rate was maintained; otherwise the physics of the rolling balls would exhibit erroneous behavior. For this reason, the logic had to be lightweight and the collision meshes simple, as overcomplicated collision meshes or logic would result in unnecessary frame drops.
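
The sketch below is a rough example of the kind of lightweight ball setup this implies: a ball actor whose collision is a simple analytic sphere rather than a per-triangle mesh, with continuous collision detection enabled so fast-moving balls are less likely to tunnel when frame time fluctuates. The class name AEmojiBall and the specific settings are illustrative assumptions, not the project’s actual content.

```cpp
// EmojiBall.h -- illustrative sketch; class name and settings are assumptions.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SphereComponent.h"
#include "Components/StaticMeshComponent.h"
#include "EmojiBall.generated.h"

UCLASS()
class AEmojiBall : public AActor
{
    GENERATED_BODY()

public:
    AEmojiBall()
    {
        // A simple analytic sphere does the physics work: far cheaper than
        // per-triangle collision against the detailed emoji mesh.
        Collision = CreateDefaultSubobject<USphereComponent>(TEXT("Collision"));
        Collision->InitSphereRadius(25.f);
        Collision->SetSimulatePhysics(true);
        Collision->SetCollisionProfileName(TEXT("PhysicsActor"));
        // Continuous collision detection reduces tunnelling when frame time
        // spikes, since the physics step is tied to the frame.
        Collision->SetUseCCD(true);
        RootComponent = Collision;

        // The visible emoji mesh carries no collision of its own.
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        Mesh->SetupAttachment(Collision);
        Mesh->SetCollisionEnabled(ECollisionEnabled::NoCollision);
    }

    UPROPERTY(VisibleAnywhere)
    USphereComponent* Collision;

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Mesh;
};
```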

The Invisible Sorter

A “sorting” element was implemented to ensure that both halves of the machine were used equally. Leaving the sorting entirely to physics wasn’t an option, as it could cause one side to be used rarely or not at all.
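
A minimal sketch of such a sorter is shown below: an invisible trigger volume that alternates which half of the machine each arriving ball is nudged toward, so both halves see roughly equal traffic. AEmojiSorter, the impulse strength, and the trigger extents are hypothetical values for illustration, not the shipped logic.

```cpp
// EmojiSorter.h -- illustrative sketch; names and values are assumptions.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "EmojiSorter.generated.h"

UCLASS()
class AEmojiSorter : public AActor
{
    GENERATED_BODY()

public:
    AEmojiSorter()
    {
        // Invisible volume placed where the machine splits into two halves.
        Trigger = CreateDefaultSubobject<UBoxComponent>(TEXT("Trigger"));
        Trigger->SetBoxExtent(FVector(50.f, 200.f, 50.f));
        Trigger->SetCollisionProfileName(TEXT("OverlapAllDynamic"));
        RootComponent = Trigger;
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        Trigger->OnComponentBeginOverlap.AddDynamic(this, &AEmojiSorter::OnBallEnter);
    }

    UFUNCTION()
    void OnBallEnter(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                     UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                     bool bFromSweep, const FHitResult& SweepResult)
    {
        if (!OtherComp || !OtherComp->IsSimulatingPhysics())
        {
            return;
        }
        // Alternate sides so physics jitter cannot starve one half of the machine.
        const FVector SideDir = bSendRight ? GetActorRightVector() : -GetActorRightVector();
        OtherComp->AddImpulse(SideDir * ImpulseStrength, NAME_None, /*bVelChange=*/true);
        bSendRight = !bSendRight;
    }

    UPROPERTY(VisibleAnywhere)
    UBoxComponent* Trigger;

    UPROPERTY(EditAnywhere)
    float ImpulseStrength = 150.f;

private:
    bool bSendRight = true;
};
```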

Movers

The conveyors in the scene do not actually move the rolling balls; the balls are moved by Blueprint logic. When a ball lands on a conveyor, the Blueprint tells it to move in a specific direction at a steady speed. These “movers” are signified by a red arrow and are implemented all along the emoji machine.
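
A minimal sketch of such a mover, assuming a box trigger aligned with the belt and balls simulating physics, might look like the following. The class name AConveyorMover and the default speed are illustrative stand-ins for the project’s Blueprint logic.

```cpp
// ConveyorMover.h -- illustrative sketch; names and values are assumptions.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "ConveyorMover.generated.h"

UCLASS()
class AConveyorMover : public AActor
{
    GENERATED_BODY()

public:
    AConveyorMover()
    {
        PrimaryActorTick.bCanEverTick = true;

        // Invisible volume covering the conveyor surface; the belt mesh itself
        // never moves, only the balls overlapping this volume do.
        Belt = CreateDefaultSubobject<UBoxComponent>(TEXT("Belt"));
        Belt->SetBoxExtent(FVector(300.f, 60.f, 30.f));
        Belt->SetCollisionProfileName(TEXT("OverlapAllDynamic"));
        RootComponent = Belt;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        TArray<UPrimitiveComponent*> Overlapping;
        Belt->GetOverlappingComponents(Overlapping);

        // Drive every simulating ball along the mover's forward (red arrow)
        // direction at a steady speed, keeping gravity's vertical component.
        for (UPrimitiveComponent* Comp : Overlapping)
        {
            if (Comp && Comp->IsSimulatingPhysics())
            {
                FVector Target = GetActorForwardVector() * BeltSpeed;
                Target.Z = Comp->GetPhysicsLinearVelocity().Z;
                Comp->SetPhysicsLinearVelocity(Target);
            }
        }
    }

    UPROPERTY(VisibleAnywhere)
    UBoxComponent* Belt;

    // Steady belt speed in cm/s (Unreal units).
    UPROPERTY(EditAnywhere)
    float BeltSpeed = 120.f;
};
```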

Meta's XR Stage Design

Xitelabs designed the physical XR Stage and the virtual world for Meta’s first XR shoot, partnering with LMG on the installation of the physical stage and the implementation of the technical design.