It was great to be back at NAB Show. Featuring three separate spaces, an AR demo stage, a state-of-the-art virtual studio, and demo pods, the Zero Density booth offered a front-row seat to the real-time graphics technologies powering daily shows, live events, and more from the world’s biggest broadcasters.
If you missed the virtual studio demo, you can watch it below. The AR stage demo video will be shared soon.
At this year’s Zero Density demonstration, The Weather Channel showcased its brand-new virtual set, which the channel uses for daily live forecasts powered by the Reality suite. The virtual set was designed by creative trailblazer Myreze: an enormous virtual environment with a captivating design, produced from a relatively small green screen. The scenery can be changed with the click of a button, in real time.
Reality Engine 4.27 was at work during the demonstrations. The real-time, node-based compositor, which keys, renders, and composites photoreal graphics in real time, is natively built to take advantage of Unreal Engine features that make virtual studio graphics more photorealistic. The virtual studio was ray-traced, showing that cinematic-quality visuals are possible for ultra-realistic, live on-air graphics and virtual sets. With ray tracing, reflections, soft shadows, area lights, refractions, and the other cues of light traveling through a scene produce a result superior in realism to rasterization.
RealityEngine AMPERE is the hardware product that enables the highest performance with the utmost stability for virtual studio and broadcast graphics productions. Each RealityEngine AMPERE runs at full performance and is built to deliver maximum stability under the toughest real-time production conditions. Zero Density’s expert testing and quality-control teams run tailored test procedures that simulate every aspect of a real-time production environment.
Reality Keyer, the image-based keyer that runs on the GPU, delivers high-quality keying of contact shadows, transparent objects, and sub-pixel details like hair, in any shot. With Reality Keyer, racking focus between the presenter’s hair and the background becomes a walk in the park, without any visual defects.
A traditional chroma keyer requires an evenly lit cyclorama because it keys against a single tone of green. This means the incoming image should have exactly the same background RGB values at every pixel, which is very unlikely in the real world.
Image-based keying solves this problem using a clean plate. After the camera and lighting setup are finalized, a clean plate is captured to serve as the key-color reference. Reality Keyer keys each pixel against the corresponding tone of green in the clean plate, delivering industry-leading keying quality even when the cyclorama is not perfectly uniform in color.
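To see why the clean plate matters, here is a minimal sketch (illustrative only, not Zero Density’s implementation) contrasting a single-tone chroma key with a per-pixel clean-plate key on an unevenly lit green background:

```python
import numpy as np

def single_tone_matte(frame, key_rgb, tol=30.0):
    """Traditional keyer: one global key color compared against every pixel."""
    dist = np.linalg.norm(frame.astype(float) - np.asarray(key_rgb, float), axis=-1)
    return np.clip(dist / tol, 0.0, 1.0)  # 0 = background, 1 = foreground

def clean_plate_matte(frame, clean_plate, tol=30.0):
    """Image-based keyer: each pixel is compared against the clean plate
    captured with the same camera and lighting, so uneven lighting cancels out."""
    dist = np.linalg.norm(frame.astype(float) - clean_plate.astype(float), axis=-1)
    return np.clip(dist / tol, 0.0, 1.0)

# Unevenly lit green cyclorama: the green channel ramps from 150 to 220.
h, w = 4, 8
plate = np.zeros((h, w, 3), np.uint8)
plate[..., 1] = np.linspace(150, 220, w, dtype=np.uint8)

frame = plate.copy()
frame[1:3, 2:4] = (200, 60, 50)  # presenter pixels

alpha_single = single_tone_matte(frame, key_rgb=(0, 150, 0))
alpha_plate = clean_plate_matte(frame, plate)
# The clean-plate matte keys every empty pixel fully, even where the green
# is brighter; the single-tone matte misreads the bright side as foreground.
```

The tolerance value and the ramped test image are assumptions for the sake of the demo; a production keyer also handles spill suppression and soft edges.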
Coupled with TRAXIS talentS, the demo showcased how visual accuracy is vital for virtual studio productions to achieve the highest photorealism. talentS, the AI-powered talent tracking system, automatically recognizes the presenter’s 3D location and sends that positional data to Reality Engine, which places the talent inside the 3D world at the correct depth. As a result, the presenter’s reflections and shadows automatically fall where they should.
This accurate positional data enables virtual objects to reflect and cast shadows into the real world, and the real world to reflect on the virtual objects. The presenter can also interact with AR objects and compose advanced scenarios, such as a virtual light following them or AR objects triggered automatically by walking toward a defined location.
You can watch it all in the demo video above.