Virtual Studio and Augmented Reality Solution

Reality Engine® from Zero Density is the ultimate real-time, node-based compositor, enabling real-time visual effects pipelines with video I/O, keying, compositing and rendering in a single piece of software. As the most photo-realistic virtual studio production solution, Reality Engine® gives its clients the tools to create the most immersive content possible and to revolutionize storytelling in the broadcast, media and cinema industries.

Reality Engine® uses NVIDIA Quadro GPUDirect technology for high-performance video I/O in order to streamline real-time 4K UltraHD workflows.

TF1 2018 FIFA World Cup Russia® broadcast utilizing green-screen keying, portal window, ceiling extension and augmented reality. Set designed by Dreamwall. Images courtesy of TF1.


Reality Engine® uses Unreal Engine by Epic Games, the most photo-realistic real-time game engine, as its 3D renderer. With Unreal's advanced real-time visual effects capabilities, Reality ensures the most photo-realistic composite output possible.

Real-time Ray Tracing

Reality Engine® takes advantage of the latest technologies from industry leaders, such as NVIDIA® RTX® GPUs, to produce the highest-quality, most realistic rendering possible in the industry today. Real-time ray tracing brings cinematic-quality rendering to live production, providing realistic lighting, shadows and reflections.

Compositing in 3D space

As a unique approach in the market, Reality composites the green-screen image with graphics inside the 3D scene. This technique produces real-time, realistic reflections and refractions of the physical objects and the people on the green screen over the graphics, while bloom effects and lens flares are composited over the real elements. All compositing is performed in 16-bit floating point for HDR precision.

Reality Keyer®

Reality Keyer® is the world’s first and only real-time, image-based keyer that runs entirely on the GPU. It delivers spectacular results when keying contact shadows, transparent objects and sub-pixel details such as hair, and it can also work with 4:4:4 RGB camera sources. In addition to the garbage mask, the keyer supports 3D masks for “force-fill” and “spill bypass” buffers; these masks enable hybrid virtual studio operation, letting you combine virtual and real environments. Despite the quality of its results, Reality Keyer® remains very easy to set up and operate.
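The Reality Keyer® algorithm itself is proprietary, but the general idea behind image-based chroma keying can be illustrated with a minimal sketch: derive a per-pixel alpha matte from each pixel's distance to a reference backing color, then suppress green spill. The function names, thresholds and distance metric below are illustrative assumptions, not Zero Density's implementation.

```python
# Minimal image-based chroma-key sketch (illustrative only; NOT the
# Reality Keyer(R) algorithm). Pixels are (r, g, b) floats in [0, 1].

def key_pixel(pixel, backing=(0.0, 1.0, 0.0), tol=0.35, soft=0.25):
    """Return (alpha, despilled_pixel) for one pixel.

    alpha = 0 -> fully background (keyed out)
    alpha = 1 -> fully foreground
    """
    r, g, b = pixel
    # Euclidean distance from the reference backing colour.
    dist = ((r - backing[0]) ** 2 +
            (g - backing[1]) ** 2 +
            (b - backing[2]) ** 2) ** 0.5
    # Soft threshold: ramp alpha from 0 to 1 across [tol, tol + soft].
    # The soft ramp is what preserves sub-pixel detail such as hair
    # and semi-transparent objects.
    alpha = min(max((dist - tol) / soft, 0.0), 1.0)
    # Simple spill suppression: clamp green to the max of red and blue.
    despilled = (r, min(g, max(r, b)), b)
    return alpha, despilled

# A pure backing-colour pixel keys out; a skin-tone pixel stays opaque.
print(key_pixel((0.0, 1.0, 0.0))[0])   # -> 0.0
print(key_pixel((0.8, 0.6, 0.5))[0])   # -> 1.0
```

A production keyer layers far more on top of this (clean-plate analysis, edge processing, the 3D masks described above), but the matte-from-color-distance core is the same idea.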

As a unique approach in the industry, Zero Density’s Reality Engine® composites images using 16-bit floating-point arithmetic. Because it controls the full compositing pipeline, all gamma and color-space conversions are handled correctly, so the real world and the virtual elements blend perfectly. Although Reality Engine® can provide fill and key outputs for external devices (such as vision mixers) like conventional solutions, compositing inside Reality Engine® is recommended.
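Why gamma handling matters for compositing can be sketched with the standard "over" operation: convert gamma-encoded values to linear light, blend there, and convert back. The gamma value and function names below are illustrative assumptions, not Reality Engine®'s internals.

```python
# Sketch of an alpha-"over" composite performed in linear light
# (illustrative of the technique, not Reality Engine(R)'s pipeline).

GAMMA = 2.2  # assumed display gamma for this sketch

def to_linear(v):
    return v ** GAMMA

def to_gamma(v):
    return v ** (1.0 / GAMMA)

def over(fg, fg_alpha, bg):
    """Composite one gamma-encoded channel of foreground over background.

    Blending in linear light avoids the darkened edge fringing that
    appears when gamma-encoded values are mixed directly.
    """
    lin = to_linear(fg) * fg_alpha + to_linear(bg) * (1.0 - fg_alpha)
    return to_gamma(lin)

# A 50% blend of white over black, done in linear light, is noticeably
# brighter than the naive gamma-space average of 0.5.
print(round(over(1.0, 0.5, 0.0), 2))   # -> 0.73
```

Doing this blend in 16-bit floating point keeps HDR values above 1.0 (bright bloom, lens flares) intact instead of clipping them before the composite.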

Advanced Augmented Reality

With the tracked camera feed as an input image, Reality Engine® can render reflections and refractions of the real environment onto virtual objects. Real-time 3D reflections and the shadows of the virtual objects are also composited over the incoming camera feed, and blooms and lens flares caused by bright virtual pixels are blended into the final composite.

Videowall and Portal Window

Reality Engine® can feed your videowalls with high-resolution 3D graphics. Using camera tracking data, you can even turn a videowall into a portal to a virtual environment. Combining a portal window with a ceiling extension, you can build a virtual studio without a green screen.

Tracking Technology and Lens Calibration

Reality Engine® supports industry-standard mechanical and optical camera tracking systems, and it can use the lens-calibration data sent by the tracking devices.
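Lens-calibration data typically describes how a real lens deviates from an ideal pinhole camera, so virtual graphics land exactly where the lens would place them. A minimal sketch of one common ingredient, radial distortion, is shown below; the Brown-Conrady-style model, the coefficients k1 and k2, and the function names are illustrative assumptions, not a specific tracking vendor's format.

```python
# Sketch: applying radial lens distortion (a common part of
# lens-calibration data) to an ideal pinhole projection, so rendered
# graphics match what the physical lens actually sees. Illustrative only.

def distort(x, y, k1=-0.1, k2=0.01):
    """Apply radial distortion to normalised image coordinates (x, y)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def project(point3d, focal=1.0):
    """Pinhole projection of a camera-space point, then lens distortion."""
    X, Y, Z = point3d
    x, y = focal * X / Z, focal * Y / Z   # ideal pinhole projection
    return distort(x, y)

# With a negative k1 (barrel distortion), a point off the optical axis
# is pulled slightly inward compared with the ideal projection.
print(project((0.5, 0.0, 2.0)))
```

In practice calibration also covers field of view, lens-center shift and focus/zoom-dependent parameters, which the tracking system streams per frame alongside the camera pose.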

Talent Tracking
When talent-tracking data is acquired by Reality Engine®, the engine places the talent inside the virtual world at the correct position and depth. This automatically enables accurate reflections and refractions of the talent. Interaction between the real and the virtual is achieved with utmost realism thanks to Reality Engine®’s industry-first feature of compositing in 3D space.

Reality Broadcast Workflow Tools

Reality Engine® provides the control tools necessary for multi-camera studio operations. Reality Control Suite has plugins to drive Reality Engine® with external data for data-driven workflows.


IBC 2019 Reality Virtual Studio Demo

Sample Projects

FOX Sports NASCAR2019

Real-time Ray Traced Virtual Studio with NEP & Zero Density

Webinar | Reality Starter Contents