Reality is the most photorealistic virtual studio solution in today’s broadcast industry. Reality has a powerful integrated toolset that provides solutions for all types of virtual studio production workflows. Reality raises the bar in virtual studio production in terms of image quality, operation speed, keying quality, and many other aspects.

 

Reality Processing Engine

Reality has a real-time, node-based processing engine, which is the core of the Reality Virtual Studio.

 

Photo-realism

Reality uses a state-of-the-art real-time 3D rendering engine that delivers the most photo-realistic image output available on today’s GPUs. This image quality is achieved through deferred rendering, a unique anti-aliasing technology, and advanced features such as depth of field, light maps, reflections, and refractions.

 

Reality Keyer™

Reality provides a GPU-accelerated keyer with patent-pending “tracking-aware” keying technology, a unique approach to matte generation. With this technology inside, “the world’s first tracking-aware real-time keyer”, Reality Keyer™, competes with the most expensive keyers on the market. Beyond its spectacular results in preserving detail and softness, Reality Keyer™ is very easy to set up and operate.
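The details of Reality Keyer™’s patent-pending algorithm are not public. As a rough illustration of what matte generation and compositing mean, here is a textbook green-difference matte, a minimal sketch that is not Zero Density’s method and uses hypothetical function names:

```python
def green_difference_matte(r, g, b):
    """Classic green-difference matte for one pixel (channels in 0..1):
    0.0 keeps the pixel (talent), 1.0 makes it fully transparent
    (green screen)."""
    matte = g - max(r, b)            # how much greener than red/blue
    return min(max(matte, 0.0), 1.0)

def composite_channel(fg, bg, r, g, b):
    """Blend one channel of the keyed foreground over the background."""
    alpha = 1.0 - green_difference_matte(r, g, b)  # talent opacity
    return alpha * fg + (1.0 - alpha) * bg
```

A production keyer adds softness controls, spill suppression, and, in Reality’s case, camera tracking data; this sketch only shows the core matte idea.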

3D Compositing

As a unique approach in the market, the talent is keyed and composited with the graphics in 3D space. This technique produces realistic reflections and refractions of the talent.

 

Lens Calibration

Reality uses its own lens-matching tools. It provides essential features such as chromatic aberration correction, which is vital for keying, and focal-distance tracking, which matches the graphics’ depth of field to the real world.

 

Action Based Control Interface

Reality allows you to create action groups from multiple custom actions on a timeline, so studio operators can trigger multiple actions with custom timings in a single click. Using custom actions, you can change any property of any item in the scene, trigger any animation, connect data sources (databases, RSS, Excel, etc.), and ingest external data into the system.
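As an illustration of the concept, an action group can be thought of as a list of actions, each with its own offset on a shared timeline. The structure and names below are hypothetical, not the actual Reality action format:

```python
from dataclasses import dataclass

@dataclass
class Action:
    offset: float  # seconds after the group is triggered
    name: str      # e.g. "play_intro" or "show_lower_third"

class ActionGroup:
    """One trigger fires every action at its own timeline offset."""

    def __init__(self, actions):
        # keep actions ordered by when they should fire
        self.actions = sorted(actions, key=lambda a: a.offset)

    def schedule(self, trigger_time):
        """Return (absolute_time, action_name) pairs for one trigger."""
        return [(trigger_time + a.offset, a.name) for a in self.actions]
```

For example, triggering a group containing an intro animation at 0.0 s and a lower third at 1.5 s yields both firing times from a single click.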

 

Integrated Multi-viewer & Vision Mixer

With the built-in multi-viewer and vision mixer, the system can preview all sources (cameras, live video inputs, still images, video clips, etc.) and perform various transitions between them. Reality Vision Mixer has a touch-based interface that supports cut, dissolve, wipe, and fly transitions with selectable durations.

 

Reality .net API

The Reality .NET API gives users the ability to develop third-party control applications for their specific needs.

Product Gallery

Product Video

Related News

Reality Storms into Educational Content Production with Groupe Média TFO

Groupe Média TFO introduces LUV: the virtual universe production laboratory of a new digital generation for French-language educational content, powered by Reality Virtual Studio. The [...]

Thank you all for visiting Zero Density at EXPO SYNC LISBOA 2016

We would like to thank all our stand visitors for taking their time to visit our stand during EXPO SYNC LISBOA 2016. We enjoyed meeting [...]

Partners

Stype, NewTek, Mo-Sys, TrackMen, Blackmagic Design, AJA

FAQ

Reality is designed to be compatible with off-the-shelf standard PC hardware. We strongly suggest an Nvidia GTX 1080 GPU, and make sure you have a decent motherboard with a PCIe x16 Gen3 slot for the graphics board. For the CPU, you can go with either a Xeon or an i7, depending on your requirements.

The rule for the CPU is to favor higher clock rates over core count. We suggest 32 GB of RAM and a 1 TB SSD connected over SATA (please don’t use a PCIe SSD; those cards occupy PCIe slots needed for future expansion).

Reality currently supports AJA Corvid and Blackmagic DeckLink video boards. Our top picks are the AJA Corvid 88 and the DeckLink 4K Pro.
At least Windows 7 today, but we strongly suggest Windows 10, since it supports DirectX 12, which will become the graphics API we recommend in the future; it provides much more performance than DirectX 11.
That depends on your expectations and workflow. The “Reality” architecture supports both internal and external production models. In the internal production model, you use the Reality Multiviewer and Reality Switcher applications in conjunction with the Reality Engine in one box, and you can connect all camera inputs to that same box. “Reality” is capable of dual-channel rendering, which means it can provide a preview and a program channel in full 1080p resolution while delivering a multi-viewer output at the same time. However, since the same hardware drives both channels and all the I/O, this may limit the complexity of the sets you can run at real-time performance.

If you wish to get maximum performance from your engine, assign one engine machine per camera: run “Reality” in single-channel rendering mode and use an external video switcher / multi-viewer.

Definitely not; you wouldn’t want that. We developed our own patent-pending keyer technology that lets you create post-production-quality keys in real time. Furthermore, since we composite internally, we can achieve realistic reflections and correct depth blending. If the video were keyed externally, there would be no way to achieve this level of photorealism in the final composited image.
We currently support Mo-Sys, Stype, TrackMen, and Vinten tripods; FreeD support is on its way. However, if you have another tracking device with a different protocol, we can integrate that device into “Reality”.
Basically, we work with ⅔-inch HD broadcast cameras during our development and trade shows. However, you are not limited to those cameras; we can custom-calibrate any sensor size or lens setup you have.
We prefer to have 1080p images from the cameras whenever possible. Even if you broadcast at 1080i, this extra resolution helps our keyer to produce much better results.

As with all the keyers, low noise levels improve keying quality. Try to have the lowest signal noise from the camera for best keying results.

Reality supports radial lens distortion models. To match optical lens distortion, we have to render larger images (the more distortion, the larger the image), which takes more render time. So, to maximize render performance, it is best if your lenses have little optical distortion.
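A standard radial (Brown) distortion model illustrates why the rendered image must be oversized. This is a minimal sketch of the general technique; the coefficients are placeholders, not Reality’s calibration values:

```python
def apply_radial_distortion(x, y, k1, k2=0.0):
    """Map an undistorted normalized point (x, y), with (0, 0) at the
    image center, through a two-coefficient radial distortion model."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor
```

With barrel distortion (k1 < 0), points near the frame edge move toward the center, so the engine must render beyond the nominal frame to still fill the edges after warping.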
Basically, the lens should have minimal chromatic aberration; this optical defect directly affects keying quality, since it results in misaligned color channels. We can digitally correct chromatic aberration in “Reality”, but it is best to avoid it optically.
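To first order, lateral chromatic aberration behaves as if each color channel were imaged at a slightly different magnification. The sketch below shows the general correction idea only, with illustrative scale values, and is not Reality’s implementation:

```python
def ca_sample_coords(x, y, red_scale=1.0, blue_scale=1.0):
    """For an output pixel at normalized (x, y) from the image center,
    return (red, green, blue) sampling coordinates so that radially
    shifted channels realign; green is the reference channel."""
    return ((x * red_scale, y * red_scale),
            (x, y),
            (x * blue_scale, y * blue_scale))
```

Resampling the red and blue channels at these corrected coordinates realigns the color fringes that would otherwise contaminate the matte edge.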
Since “Reality” uses Unreal Engine (https://www.unrealengine.com/), you can use the standard Unreal Editor to design virtual sets and then import your set design directly into “Reality” for your production.

 

Reality uses the Unreal® Engine. Unreal® is a trademark or registered trademark of Epic Games, Inc. in the United States of America and elsewhere. Unreal® Engine, Copyright 1998–2016, Epic Games, Inc. All rights reserved.
Reality currently supports Unreal Engine 4.12.
Unreal Engine has a growing marketplace (https://www.unrealengine.com/marketplace) where you can buy environments, archviz designs, and PBR materials. The marketplace content grows every day; you can buy stock assets there and easily use them in your virtual sets.
You can use any 3D application to design your models, as long as it can export FBX files in the 2014 format.

Although you can import your complete set design as one huge FBX file into Unreal Editor, we never suggest doing that. The best approach is to export each of your models as a separate mesh and bring your assets in one by one. You can then place your meshes in Unreal Editor, instancing the same geometry multiple times with little performance penalty.

You also have to design lighting and materials within Unreal Editor. Materials imported from FBX files are only very basic materials and won’t let you unleash Unreal Engine’s rendering quality.

You can use Reality for broadcasting and owe no royalty, as clearly stated in the Unreal Engine EULA (https://www.unrealengine.com/eula, no-royalty section, #3).

So, unless you create interactive gaming content for your audience, you are fine using Reality in your broadcast and movie productions.

Royalties for the product Reality are handled by Zero Density.

In order to use the modified Unreal Engine 4 editor (Reality DevKit), yes, you need to be an Unreal Engine licensee.
As Reality uses Unreal Engine 4, our DevKit includes a version of Unreal Engine 4 Editor modified for virtual studio requirements.
The “Reality DevKit” is provided for free; it is essentially a fork of Unreal Editor. As stated in the Unreal Engine EULA (https://www.unrealengine.com/eula), an Unreal Engine licensee can download our customized editor from GitHub. This may change in the future if Epic develops another distribution method for modding kits like “Reality”.
The additional features in the “Reality” editor are:

○ Lens calibration curve editors

○ Ability to export Unreal actors, functions and properties

If you want to control Unreal actors or “Reality” nodes, we provide an Action Builder application for this job. You don’t have to write a single line of code for simple control applications that don’t need complex integrations.

For example: you can control the vision mixer node, or change a keyer parameter with the Action Builder. Or you can call an exported blueprint function or change an exported property of an actor with the same method.

Product Updates

Reality 1.1.5

Reality 1.1.5 is released. New Features:  AJA 10bit video input, Reality Keyer control panel, FFmpeg [...]

Reality 1.1.0

Reality Virtual Studio 1.1.0 is released. New Features:  Unreal Engine 4.12.5 support, New universal virtual [...]