Reality is the most photorealistic virtual studio solution in today’s broadcast industry, powered by Unreal Engine 4. Reality has a powerful integrated toolset that provides solutions for all types of virtual studio production workflows, including augmented and data-driven graphics. Reality raises the bar in virtual studio production in terms of image quality, operation speed, keying technology and many other aspects.
Reality Processing Engine
Reality Processing Engine works as a node-based compositing system designed for real-time production. Every function of Reality, such as the Keyer, lens distortion and logarithmic conversions, is implemented as a node of the processing engine.
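The idea of a node-based compositing engine can be sketched as a small pull-based graph, where each node computes its per-frame output from its upstream nodes. This is an illustrative sketch only, not Reality's actual engine API; the node names and functions are hypothetical.

```python
# Illustrative sketch of a node-based compositing graph (not Reality's
# actual engine): each node computes its output from upstream nodes.

class Node:
    def __init__(self, name, fn, inputs=()):
        self.name = name        # node label, e.g. "Keyer"
        self.fn = fn            # processing function applied each frame
        self.inputs = inputs    # upstream nodes feeding this one

    def evaluate(self, cache=None):
        """Pull-based evaluation: compute each node's result once per frame."""
        cache = {} if cache is None else cache
        if self.name not in cache:
            args = [n.evaluate(cache) for n in self.inputs]
            cache[self.name] = self.fn(*args)
        return cache[self.name]

# Hypothetical per-frame pipeline: camera -> undistort -> keyer
camera    = Node("Camera",    lambda: 100)                  # stand-in pixel value
undistort = Node("Undistort", lambda x: x * 0.98, (camera,))
keyer     = Node("Keyer",     lambda x: x - 10,   (undistort,))

print(keyer.evaluate())  # 88.0
```

Because each node is evaluated at most once per frame (via the cache), even a graph where several nodes share one upstream source stays cheap to evaluate.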
The node-based system allows advanced users to design custom workflows for more complex operations.
Reality uses Unreal Engine 4, which delivers the most photorealistic image output available on today’s GPUs. This level of image quality is achieved through deferred rendering, unique anti-aliasing technology and advanced features such as depth of field, motion blur, light maps, and screen space reflections and refractions.
Reality Keyer™ HDR
Reality Keyer delivers spectacular keying of contact shadows, transparent objects and sub-pixel details such as hair. Reality Keyer can also work with 4:4:4 RGB camera sources and HDR cameras sending logarithmically encoded video. The keyer also supports 3D masks for “force-fill” and “spill bypass” buffers. Despite the quality of its results, Reality Keyer remains very easy to set up and operate.
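To make the idea of chroma keying concrete, here is a minimal green-screen matte sketch. Reality Keyer's actual algorithm is proprietary and far more sophisticated; this only shows the basic principle that alpha falls as green dominates the other channels. The `strength` parameter is an illustrative assumption.

```python
# Minimal green-screen matte sketch (illustrative only; Reality Keyer's
# actual algorithm is proprietary). Alpha falls as green dominates R/B.

def chroma_key_alpha(r, g, b, strength=2.0):
    """Return matte alpha in [0, 1] for a linear RGB pixel."""
    spill = g - max(r, b)           # how much green exceeds the other channels
    alpha = 1.0 - strength * spill  # more green dominance -> more transparent
    return min(max(alpha, 0.0), 1.0)

print(chroma_key_alpha(0.2, 0.9, 0.2))  # pure green screen -> 0.0 (keyed out)
print(chroma_key_alpha(0.8, 0.7, 0.6))  # foreground pixel  -> 1.0 (opaque)
```

A soft falloff like this is also why camera noise matters so much for keying: noisy green values near the threshold produce a flickering matte.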
HDR Compositing in 3D Space
As a unique approach in the market, Reality composites the talent with graphics in the 3D scene. This technique produces realistic reflections and refractions of the talent, and bloom effects and lens flares are also composited over the talent. All compositing is done in 16-bit floating point for HDR precision.
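The benefit of float-precision compositing can be seen in the classic straight-alpha "over" operator: values brighter than 1.0 (blooms, lens flares) pass through the blend unclipped instead of being crushed to white. This is a generic sketch of the operator, not Reality's internal implementation.

```python
# Straight-alpha "over" operator in linear light, float precision.
# HDR values above 1.0 (e.g. lens flares, blooms) survive unclipped,
# which is what 16-bit float compositing preserves.

def over(fg_rgb, fg_alpha, bg_rgb):
    """Composite a foreground pixel over a background pixel."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

# A 4.0-nit-equivalent flare at 50% alpha over a dim background stays HDR:
print(over((4.0, 4.0, 4.0), 0.5, (0.2, 0.2, 0.2)))  # (2.1, 2.1, 2.1)
```

With 8-bit integer compositing the foreground would have been clamped to 1.0 before the blend, losing the highlight energy.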
Low Latency Multi-format Video I/O
The Reality video subsystem supports HD (including interlaced modes) and UHD/4K standards, while maintaining very low I/O latency via NVIDIA GPUDirect DMA* and AJA Corvid video boards.
The video subsystem is designed around circular buffers, so it is possible to fine-tune latency against performance.
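The latency-versus-performance trade-off of a circular buffer can be sketched as a fixed-depth frame ring: a deeper ring tolerates more timing jitter but delays the output by more frames. This is an illustrative sketch, not Reality's video subsystem.

```python
from collections import deque

# Illustrative fixed-depth frame ring: depth trades latency for tolerance
# to timing jitter (deeper = safer against drops, but more delayed output).

class FrameRing:
    def __init__(self, depth):
        self.frames = deque(maxlen=depth)  # oldest frame dropped when full

    def push(self, frame):
        self.frames.append(frame)

    def pop_oldest(self):
        return self.frames.popleft() if self.frames else None

ring = FrameRing(depth=3)   # ~3 frames of buffering -> ~3 frames of latency
for i in range(5):
    ring.push(i)
print(ring.pop_oldest())  # 2 (frames 0 and 1 were dropped when the ring filled)
```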
Starting with version 2.0, Reality also supports the Rec. 2020 color space and the SMPTE ST 2084 transfer function for HDR video I/O.
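The SMPTE ST 2084 (PQ) transfer function maps a nonlinear signal in [0, 1] to absolute luminance up to 10,000 cd/m². The constants below come from the standard; the decode direction (EOTF) looks like this:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a nonlinear signal in [0, 1] to absolute
# luminance in cd/m^2, peaking at 10,000 nits. Constants per the standard.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Decode a PQ-encoded signal value to luminance in nits."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

print(round(pq_eotf(1.0)))  # 10000 (full signal = peak luminance)
print(round(pq_eotf(0.0)))  # 0
```

Unlike a gamma curve, PQ encodes absolute luminance, which is why an HDR pipeline must carry the transfer-function metadata end to end.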
*NVIDIA GPUDirect video I/O is available only on Quadro series boards. GeForce series boards are limited to standard GPU read-back operation.
Reality supports market leading camera tracking systems such as Mo-Sys, Stype, TrackMen, Xync, Vinten and many other devices supporting the FreeD protocol.
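To show what a tracking protocol carries, here is a sketch of decoding a FreeD "Type D1" camera-position packet. It follows the commonly documented 29-byte layout (pan/tilt/roll as 24-bit signed values in 1/32768 degree, position in 1/64 mm, zoom/focus as raw encoder counts); verify the exact scale factors against your tracking vendor's manual before relying on them.

```python
# Sketch of decoding a FreeD "Type D1" camera-position packet (29 bytes),
# following the commonly documented layout. Scale factors should be
# verified against the tracking vendor's documentation.

def _s24(b):
    """24-bit big-endian signed integer."""
    v = (b[0] << 16) | (b[1] << 8) | b[2]
    return v - (1 << 24) if v & 0x800000 else v

def parse_freed_d1(pkt):
    assert len(pkt) == 29 and pkt[0] == 0xD1, "not a FreeD D1 packet"
    return {
        "camera_id": pkt[1],
        "pan_deg":  _s24(pkt[2:5])  / 32768.0,  # degrees
        "tilt_deg": _s24(pkt[5:8])  / 32768.0,
        "roll_deg": _s24(pkt[8:11]) / 32768.0,
        "x_mm": _s24(pkt[11:14]) / 64.0,        # millimetres
        "y_mm": _s24(pkt[14:17]) / 64.0,
        "z_mm": _s24(pkt[17:20]) / 64.0,
        "zoom":  _s24(pkt[20:23]),              # raw encoder counts
        "focus": _s24(pkt[23:26]),
    }

# Example: a packet reporting camera 1 panned 45 degrees (45 * 32768 = 0x168000)
pkt = bytes([0xD1, 0x01, 0x16, 0x80, 0x00]) + bytes(24)
print(parse_freed_d1(pkt)["pan_deg"])  # 45.0
```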
Reality provides lens matching tools for radial lens distortion. Reality also has an advanced feature called focal distance calibration, which allows the camera operator to intuitively pull focus of the graphics using the physical lens.
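Radial distortion is usually modeled with a polynomial in the squared distance from the optical centre. Below is an illustrative two-coefficient (Brown-Conrady style) sketch of such a model; Reality's own lens-matching tools are configured through its calibration workflow, not this code.

```python
# Illustrative two-coefficient radial distortion model (Brown-Conrady
# style). Negative k1 gives barrel distortion, positive gives pincushion.

def distort(x, y, k1, k2):
    """Map an undistorted normalized image point to its distorted position."""
    r2 = x * x + y * y                       # squared distance from centre
    scale = 1.0 + k1 * r2 + k2 * r2 * r2     # radial scaling polynomial
    return x * scale, y * scale

# Barrel distortion (k1 < 0) pulls edge points toward the centre:
print(distort(1.0, 0.0, -0.1, 0.0))  # (0.9, 0.0)
```

Matching the renderer's virtual lens to these coefficients is what keeps graphics locked to the distorted live image out to the frame edges.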
Reality .net API
The Reality API gives users the ability to develop third-party control applications for specific needs, such as data-driven graphics.
Action Based Control Interface
Reality allows you to create action groups from multiple custom actions on a timeline, so studio operators can trigger multiple actions with custom timings in one click. Using custom actions, you can change any property of any item in the scene, trigger any animation, and connect any data source to ingest external data into the system.
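The "one click, many timed actions" idea can be sketched as a group of callables fired at offsets from a single trigger. This is a hypothetical illustration of the concept; the class and action names are not the Reality API.

```python
import time

# Hypothetical sketch of an "action group": several custom actions fired
# at custom offsets from a single trigger (names are illustrative).

class ActionGroup:
    def __init__(self):
        self.actions = []            # list of (offset_seconds, callable)

    def add(self, offset, action):
        self.actions.append((offset, action))

    def trigger(self):
        """Run all actions in timeline order, honouring their offsets."""
        start = time.monotonic()
        for offset, action in sorted(self.actions, key=lambda a: a[0]):
            delay = offset - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            action()

group = ActionGroup()
group.add(0.0, lambda: print("set property: screen visible"))
group.add(0.1, lambda: print("trigger animation: slide-in"))
group.trigger()  # one click -> both actions, 0.1 s apart
```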
Reality can be configured for multi-channel rendering, so it is possible to set up a whole virtual studio system using only one Reality Engine.
Multi-viewer & Vision Mixer
With the built-in multi-viewer and vision mixer, the system can preview all sources (cameras, live video inputs, still images, video clips, etc.) and make various transitions between them. Reality Vision Mixer supports cut, dissolve and wipe transitions with selectable durations.
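Of the transition types named above, a dissolve is the simplest to express: a per-pixel linear mix between the two sources as a parameter runs from 0 to 1 over the chosen duration. This is a generic sketch, not Reality Vision Mixer's implementation.

```python
# Illustrative dissolve: per-pixel linear mix between two sources over
# the transition duration (t runs 0 -> 1 across the chosen duration).

def dissolve(src_a, src_b, t):
    """Blend pixel values from A to B; t=0 is all A, t=1 is all B."""
    return src_a * (1.0 - t) + src_b * t

print(dissolve(100.0, 200.0, 0.25))  # 125.0, a quarter of the way into B
```

A cut is the degenerate case (t jumps from 0 to 1), and a wipe replaces the global `t` with a per-pixel threshold that sweeps across the frame.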
For details, see: Reality Hardware Specs
To get maximum performance from your engine, dedicate one engine machine per camera: run Reality in single-channel rendering mode and use an external video switcher/multi-viewer.
As with all keyers, low noise levels improve keying quality. Aim for the lowest possible signal noise from the camera for the best keying results.
Reality uses the Unreal® Engine. “Unreal® is a trademark or registered trademark of Epic Games, Inc. in the United States of America and elsewhere.” “Unreal® Engine, Copyright 1998–2016, Epic Games, Inc. All rights reserved.”
Although you can import your complete set design as one huge FBX file into Unreal Editor, we never recommend doing that. The best approach is to export each of your models as a separate mesh and bring your assets in one by one. You can then place your meshes in Unreal Editor, instancing the same geometry multiple times with little performance penalty.
You also have to design lighting and materials within Unreal Editor. The materials imported from FBX files are only very basic materials, and they won’t let you unleash Unreal Engine’s rendering quality.
So, unless you are creating interactive gaming content for your audience, you are free to use Reality in your broadcast and movie productions.
Royalties for the product Reality are handled by Zero Density.
○ Lens calibration curve editors
○ Ability to export Unreal actors, functions and properties
For example, you can control the vision mixer node or change a keyer parameter with the Action Builder. You can also call an exported blueprint function or change an exported property of an actor in the same way.