Live Augmented Reality and LED-Based Extended Reality Demo
IBC Show 2022
Watch Zero Density’s live augmented reality and LED-based extended reality demonstration at IBC Show 2022 to witness how its next-generation software and hardware products can transform a physical environment into a photorealistic XR stage with virtual and augmented reality graphics. You will also explore some practical use cases of an LED video wall for immersive storytelling and displaying data-driven graphics.
Photorealistic Augmented Reality
The first part of the demo showcases a photorealistic augmented reality production that conventional fill and key blending methods cannot produce. To reach this level of image fidelity, you need a ray-tracing renderer and a real-time 3D compositor like Reality Engine.
Augmented graphics become part of the physical environment: the reflections of the talent and of nearby objects appear on the graphics' surfaces. In the demo, the reflections of the presenter and the video wall are visible on the data-driven soccer scoreboard graphics.
Beyond basic transparency, Reality Engine renders realistic ray-traced frosted glass. Depending on the camera angle, you can also see lens flares in the set and an eye-catching bloom effect created by highlights on metal surfaces. Check out the glass panel graphics for the team logos in the demo.
As the camera moves toward the AR graphics, ray-traced reflections and refractions appear on them. The AR graphics are themselves reflected on reflective floor surfaces, but not on non-reflective surfaces like carpet. All of this takes the photorealism of the scene to a whole new level. This is what photorealistic augmented reality means.
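The optics behind those mirror-like surfaces come down to one rule: a reflected ray is the incoming ray mirrored about the surface normal, and a non-reflective material such as carpet simply skips that bounce. A minimal sketch of that rule (illustrative Python, not Reality Engine code):

```python
# Illustrative sketch only, not Zero Density code: the core of a
# ray-traced reflection is mirroring a ray direction about a normal.

def reflect(d, n):
    """Mirror direction d about unit normal n: r = d - 2*(d.n)*n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A ray heading straight down onto a flat floor (normal pointing up)
# bounces straight back up:
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
```

A renderer evaluates this per surface: on the reflective studio floor the bounced ray is traced into the scene, on carpet it is discarded.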
In a set with AR graphics, knowing the talent's exact position makes more interactive scenes possible. By sending the talent's real-time 3D position data to Reality Engine, the AI-driven talent tracking system TRAXIS talentS enables AR graphics that are triggered in and out by the talent's position. Check out the glass panels in the demo: they appear or disappear automatically based on the presenter's location in the set, which lets the presenter move more freely in the space. talentS won the Product of the Year Award at NAB Show 2022 and the Best of Show Award at IBC Show 2022.
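Conceptually, that trigger logic is a distance check against the tracked position. A minimal sketch, using hypothetical names (`panel_visible`, a trigger radius in metres) rather than the actual TRAXIS talentS or Reality Engine API:

```python
# Hypothetical sketch, not the TRAXIS talentS API: show or hide an AR
# panel based on the tracked talent's distance from the panel's anchor.

import math

def panel_visible(talent_pos, panel_pos, trigger_radius=1.5):
    """True when the talent is within trigger_radius metres of the panel."""
    return math.dist(talent_pos, panel_pos) <= trigger_radius

# Presenter walks toward a glass panel anchored at (4.0, 0.0, 2.0):
print(panel_visible((3.5, 0.0, 2.0), (4.0, 0.0, 2.0)))  # True  (0.5 m away)
print(panel_visible((0.0, 0.0, 0.0), (4.0, 0.0, 2.0)))  # False (~4.5 m away)
```

In production the check runs every frame on the streamed 3D position, driving an animate-in or animate-out of the panel rather than a hard on/off.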
Video Wall Graphics
Numerous real-time broadcast graphics workflows can be built around video walls. With the Reality ecosystem, video wall graphics become a powerful tool for storytelling and for displaying data-driven infographics for weather, sports and business, among many others.
RealityHub is an integral part of any kind of virtual production, providing a unified HTML5-based dynamic user interface to control the entire Reality ecosystem. With RealityHub, broadcast graphics can be populated from a simple spreadsheet file or linked with any external data source to display real-time data – without additional plugins or writing a single line of code.
In the demo, two important use cases for video walls are demonstrated: a five-day weather forecast table and a real-time currency exchange chart. The weather graphics pull text and image data from an external data repository and are updated in real time for any selected region around the globe. The currency exchange graphics also use a real-time data feed. This is how RealityHub organizes external data sources and binds them tightly to your design.
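Conceptually, this kind of no-code data binding maps the columns of a data source onto named fields of a graphic template. A minimal sketch with made-up field names (illustrative Python, not the RealityHub implementation):

```python
# Illustrative sketch, not RealityHub code: fill a table template from
# spreadsheet-style CSV rows, the way a no-code binding might map
# columns onto named text fields in a graphic.

import csv
import io

TEMPLATE = "{city}: {high}\u00b0C / {low}\u00b0C, {condition}"

def render_rows(csv_text):
    """Render one templated line per CSV row."""
    return [TEMPLATE.format(**row) for row in csv.DictReader(io.StringIO(csv_text))]

data = """city,high,low,condition
Amsterdam,18,11,Cloudy
Izmir,29,21,Sunny
"""
for line in render_rows(data):
    print(line)
# Amsterdam: 18°C / 11°C, Cloudy
# Izmir: 29°C / 21°C, Sunny
```

Swapping the CSV for a live API feed changes only the data source; the template and the on-air design stay untouched, which is the point of separating data from design.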
While empowering operators with easy-to-use yet powerful tools, RealityHub is also designed to fulfill all the requirements of a news workflow. It supports most of the Newsroom Computer Systems (NRCS) on the market through the MOS protocol. RealityHub 1.3 was recently released with a new Form Builder, advanced user management tools, support for Unreal Engine 5 and more.
Reality Engine supports all standard broadcast resolutions, such as HD and UHD 4K, but its adaptability goes beyond that: in the demo, the LED video wall is driven at its native resolution without compromising quality or performance.
Reality Engine can extend and composite virtual sets beyond the LED volume. By combining physical and virtual worlds seamlessly, it can transform small, ordinary physical spaces into enormous, dynamic 3D virtual worlds. In the demo, the physical LED screen behind the presenter shows an extension of the virtual Mars studio environment, placing the presenter on the Martian surface. This is built entirely through XR and facilitated by Reality Engine.
As the camera moves around, the perspective in the video wall changes accordingly, making the scene more lifelike. Also, check out NASA's Mars rover parked near the presenter. Together these create an immersive 3D environment for viewers, and broadcasters gain more space and flexibility for storytelling. This is what photorealistic extended reality means.
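The reason the wall's perspective must follow the camera is simple geometry: each virtual point "behind" the wall has to be drawn where the line from the tracked camera to that point crosses the wall plane. A minimal sketch, assuming a wall on the plane z = 0 and illustrative coordinates (not Reality Engine internals):

```python
# Illustrative sketch, not Reality Engine code: project a virtual point
# behind the LED wall onto the wall plane (z = 0) as seen from the
# tracked camera position.

def project_to_wall(cam, point):
    """Intersect the line cam -> point with the wall plane z = 0."""
    t = cam[2] / (cam[2] - point[2])  # parameter where the line crosses z = 0
    return (cam[0] + t * (point[0] - cam[0]),
            cam[1] + t * (point[1] - cam[1]))

virtual_rock = (1.0, 0.0, -2.0)  # a point 2 m "behind" the LED wall
print(project_to_wall((0.0, 1.6, 4.0), virtual_rock))
print(project_to_wall((2.0, 1.6, 4.0), virtual_rock))  # camera moved right
```

Moving the camera from x = 0 to x = 2 shifts the projected point from roughly x = 0.67 to x = 1.33 on the wall; re-rendering the wall content from the tracked camera every frame is what produces the parallax the viewer sees.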
– The production system used an end-to-end SMPTE ST 2110 IP workflow.
– The setup used a Grass Valley LDX 86N 4K system camera with a Canon CJ12EX4.3BIASE 4K lens.
– The LDX 86N was tracked by an Egripment T10 crane with mechanical tracking.
– TRAXIS talentS was used for talent tracking.
– RealityHub 1.3 was used to control the real-time graphics.
– The virtual studio and the AR/XR stage used Reality 4.27 and were ray-traced.
– Five RealityEngine AMPERE workstations with NVIDIA RTX A6000 GPUs and AJA Corvid 44 12G video I/O boards powered the demos: one for the virtual studio, one for AR, one for LED-based XR graphics, one for feeding the video wall and one for broadcast on-air graphics.
– Three demo pods used three additional RealityEngine AMPERE workstations with the same GPU and video I/O configuration.
Reality Engine: Real-Time Virtual Studio, AR and XR Platform
Hyper-realistic rendering and compositing in Unreal Engine
Create, control, produce in one system
Real-time Native 4K/UHD/SDI production
TRAXIS talentS: AI-Powered Markerless Talent Tracking System
No wearables and no markers
Power of AI and machine learning
Easy setup and calibration
One Hub to Control Them All
Control, customize, automate broadcast graphics from one single Hub
Real-time data integration from external sources
Unreal Engine integration into broadcast workflows
RealityEngine AMPERE: Ultimate Workstation for Virtual Sets & Broadcast Graphics
Tailored for virtual studio and broadcast graphics
Fastest available rendering, compositing & I/O performance
Multi camera rendering and compositing with a single engine
Reality Keyer: Image-Based Keyer on GPU
Sophisticated and easy-to-use keyer
3D Masks for “force-fill” and “spill bypass” for hybrid studio operations
Highest performance in HDR production
About Zero Density
Zero Density is a world leader in virtual studio, augmented reality and real-time graphics technologies for the broadcast, live events and esports industries. From the Olympics to World Cup coverage, Zero Density’s Unreal Engine-native platform, Reality Engine — which includes a real-time broadcast compositing system and its proprietary keying technology, Reality Keyer — has been used by some of the biggest companies in the world. Clients include The Weather Channel, RTL, Fox Sports and Warner Media.