Zero Density Virtual Studio Demo at NAB Show 2022

It was great to be back at NAB Show. Featuring three separate spaces: an AR demo stage, a state-of-the-art virtual studio and demo pods, the Zero Density booth provided a front-row seat to the real-time graphics technologies powering daily shows, live events and more from the world’s biggest broadcasters.

If you missed the virtual studio demo, you can check it out below. The AR stage demo video will be shared soon.

For this year’s Zero Density demonstration, The Weather Channel provided its brand-new virtual set, which the channel uses for daily live forecasts powered by the Reality suite. The virtual set was designed by creative trailblazer Myreze. It is an enormous virtual environment with a captivating design, yet the production was done from a relatively small green screen. The scenery can be changed with the click of a button, in real time.

ZD Ecosystem

Reality Engine 4.27 was at work during the demonstrations. The real-time, node-based compositor, which keys, renders and composites photoreal graphics in real time, is natively built to take advantage of Unreal Engine features that make virtual studio graphics more photorealistic. The virtual studio was ray-traced, showcasing that cinematic-quality visuals are possible for ultra-realistic, live on-air graphics and virtual sets. With ray tracing, reflections, soft shadows, area lights, refractions and other effects of light travelling through the scene produce a result superior in realism to rasterization.

RealityEngine AMPERE is the hardware product that enables the highest performance with the utmost stability for virtual studio and broadcast graphics productions. Each RealityEngine AMPERE runs at full performance and is built to deliver maximum stability under the toughest real-time production conditions. Zero Density’s testing and quality-control teams run tailored test procedures that simulate every aspect of a real-time production environment.

Reality Keyer, the image-based keyer that runs on the GPU, provides high-quality keying of contact shadows, transparent objects and sub-pixel details like hair, in any shot. With Reality Keyer, pulling focus back and forth between the presenter’s hair and the background becomes a walk in the park, without any visual defects.

Using a traditional chroma keyer requires an evenly lit cyclorama because it keys against a single tone of green. This means every pixel of the incoming background should have exactly the same RGB values, which is very unlikely in the real world.

Image-based keying solves this problem using a clean plate. After the camera and lighting setup is finalized, a clean plate is captured and used as the source key color. Reality Keyer keys each tone of green in the clean plate individually, delivering the best keying quality in the industry even when the cyclorama is not perfectly uniform in color.
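The difference between the two approaches can be sketched in a few lines of NumPy. This is a simplified illustration of the general idea, not Zero Density’s actual keyer; the function names, the distance metric and the tolerance value are all assumptions made for the sketch.

```python
import numpy as np

def chroma_key(frame, key_color, tolerance=20.0):
    """Traditional chroma key: alpha from the distance to ONE global
    key color, so it needs an evenly lit cyclorama to key cleanly."""
    dist = np.linalg.norm(frame.astype(np.float32) - key_color, axis=-1)
    return np.clip(dist / tolerance, 0.0, 1.0)  # 0 = background, 1 = foreground

def clean_plate_key(frame, clean_plate, tolerance=20.0):
    """Image-based key: alpha from the PER-PIXEL distance to a captured
    clean plate, so every local tone of green keys to zero."""
    dist = np.linalg.norm(frame.astype(np.float32) - clean_plate.astype(np.float32), axis=-1)
    return np.clip(dist / tolerance, 0.0, 1.0)

# Unevenly lit green cyclorama: green channel varies pixel to pixel.
rng = np.random.default_rng(0)
plate = np.zeros((8, 8, 3), np.float32)
plate[..., 1] = 180 + rng.uniform(-40, 40, (8, 8))

frame = plate.copy()
frame[2:5, 2:5] = [200.0, 30.0, 30.0]  # "talent" patch in the middle

alpha_image_based = clean_plate_key(frame, plate)          # background exactly 0
alpha_single_tone = chroma_key(frame, np.array([0.0, 180.0, 0.0]))  # background leaks
```

With the per-pixel clean plate, every uneven green pixel matches its own reference and keys fully transparent, while the single-tone keyer leaves residual alpha wherever the lighting drifts from the one chosen green.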

Coupled with TRAXIS talentS, the demo showcased how positional accuracy is vital for virtual studio productions to achieve the highest photorealism. talentS, an AI-powered talent tracking system, automatically recognizes the presenter’s 3D location and sends the positional data to Reality Engine, which then places the talent inside the 3D world at the correct depth. As a result, the presenter’s reflections and shadows fall automatically where they should.

This accurate positional data enables virtual objects to reflect and cast shadows into the real world, and the real world to reflect onto the virtual objects. The presenter can also interact with the AR objects to compose advanced scenarios, such as a virtual light following them, or AR objects triggered automatically by walking towards a defined location.
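Why tracked depth matters for occlusion can be illustrated with a minimal depth-aware composite: a keyed talent pixel is shown only where the talent is nearer to the camera than the CG geometry. This is a hypothetical sketch of the general technique, not Reality Engine’s actual compositing pipeline; the function name and array layout are assumptions.

```python
import numpy as np

def depth_composite(talent_rgb, talent_alpha, talent_depth, cg_rgb, cg_depth):
    """Place the keyed talent into the CG scene at the tracked depth.
    A talent pixel wins the depth test only where it is nearer than the
    CG geometry, so virtual objects can occlude (or be occluded by) the
    presenter, and reflections/shadows land at the right spot."""
    nearer = (talent_depth < cg_depth).astype(np.float32)
    a = (talent_alpha * nearer)[..., None]       # per-pixel coverage after depth test
    return a * talent_rgb + (1.0 - a) * cg_rgb   # standard "over" blend

# Toy 2x2 example: red talent, blue CG scene.
talent_rgb = np.full((2, 2, 3), [1.0, 0.0, 0.0])
cg_rgb = np.full((2, 2, 3), [0.0, 0.0, 1.0])
talent_alpha = np.ones((2, 2))
talent_depth = np.array([[1.0, 1.0],   # top row: talent in front of CG
                         [5.0, 5.0]])  # bottom row: talent behind CG
cg_depth = np.full((2, 2), 3.0)

out = depth_composite(talent_rgb, talent_alpha, talent_depth, cg_rgb, cg_depth)
```

In the top row the talent is nearer than the virtual geometry and stays visible; in the bottom row the virtual object sits in front and correctly hides the talent, which is exactly what a 2D key alone cannot do.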

You can watch it all in the demo video above.

Reality Engine: Real-Time Virtual Studio, AR and XR Platform

Hyper-realistic rendering and compositing in Unreal Engine
Create, control, produce in one system
Real-time native 4K/UHD/SDI production

TRAXIS talentS: AI-Powered Markerless Talent Tracking System

No wearables and no markers
Power of AI and machine learning
Easy setup and calibration

RealityHub: One Hub to Control Them All

Control, customize and automate broadcast graphics from one single hub
Real-time data integration from external sources
Unreal Engine integration into broadcast workflows

RealityEngine AMPERE: Ultimate Workstation for Virtual Sets & Broadcast Graphics

Tailored for virtual studio and broadcast graphics
Fastest available rendering, compositing and I/O performance
Multi-camera rendering and compositing with a single engine

Reality Keyer: Image-Based Keyer on the GPU

Sophisticated and easy-to-use keyer
3D masks for “force-fill” and “spill bypass” in hybrid studio operations
Highest performance in HDR production

About Zero Density
Zero Density is a world leader in virtual studio, augmented reality and real-time graphics technologies for the broadcast, live events and esports industries. From the Olympics to Louis Vuitton virtual fashion shows, Zero Density’s Unreal Engine-native platform, Reality Engine, which includes a real-time broadcast compositing system and its proprietary keying technology, Reality Keyer, has been used by some of the biggest companies in the world. Clients include The Weather Channel, RTL, Fox Sports and Warner Media.

Get in touch with our real-time graphics experts.
