Alhurra TV Elects Zero Density: Broadcasting the U.S. Midterms with a Virtual Studio
What if you could discover who won a U.S. election from within the Capitol Rotunda? If you were watching the Midterms on Alhurra TV, you got to experience just that.
Thanks to Zero Density technology, the Arabic-language TV channel broadcast 30 hours of live election coverage filled with realistic augmented reality and virtual studio graphics – including a virtual twin of the Rotunda itself. The result was an engaging election day unlike anything Alhurra’s vast audience had seen before. From conception to delivery, everything was done in-house.
Operated by Middle East Broadcasting Networks (MBN), Alhurra and its digital platforms were scheduled to start U.S. Midterm Election Day coverage on November 8th and continue as results came in that evening and throughout the next day. For each hour of coverage, Alhurra’s team planned to focus on a prominent election topic, such as foreign affairs or another key issue that influenced the race. The team also wanted to weave historical information into the coverage to provide better context for the incoming election results.
The team decided to integrate augmented reality into their workflow because the network wanted to present vast amounts of data alongside real-time election results, helping audiences visualize complicated data in an easy-to-understand, immersive way.
Described as the Capitol’s symbolic and physical heart, the United States Capitol Rotunda in Washington, D.C. is a perfect space for presenting election results – and its vast interior is well suited to storytelling. The Alhurra team therefore set out to build a digital twin of the Rotunda – both the interior and the exterior – as the virtual set for the coverage. Because the Rotunda is a large, domed, circular room housing a national and historical showcase of architecture, sculpture and painting, depicting the landmark accurately in a virtual environment was no easy task. The team also faced the challenges of integrating live voting data from a third-party source into the graphics in real time and of seamlessly placing presenters in this immersive virtual world within a live multi-camera studio setup.
Needing a robust and reliable virtual studio and AR workflow for the entirely live broadcast, the Alhurra team opted for real-time graphics products from Zero Density, which assisted the network during installation and provided training to ensure Alhurra was ready to go live. Despite the tight schedule and the challenges, after completing lens calibration in two weeks the team went on-air as planned on November 8th and stayed there over the course of three days as America elected its House and Senate. Zero Density’s real-time graphics engines ran non-stop, without ever needing to be shut down or rebooted.
Building a Hyper-realistic 3D Twin of the Capitol
Using Reality Editor, a fork of Unreal Engine with a built-in broadcasting toolset, the in-house team began the Midterm coverage work by designing a hyper-realistic virtual twin of the Capitol in just three weeks. This involved building a 360-degree virtual recreation of both the outside and inside of the Rotunda, with two different lighting setups to represent daytime and nighttime. The paintings and artefacts were accurately replicated, down to their textures, which were sourced from photos the team took of the Rotunda halls. Because all the Rotunda graphics were created virtually, there was no need to send a camera team to the Capitol to shoot, saving Alhurra both time and money. To go beyond the physical limits of the studio (300 cm high, 500 cm wide, 400 cm deep), the team also used Reality Engine’s flycam feature to fly in and out of the Rotunda, which added sweeping transitions to the coverage.
Transforming the Studio into a Rich Theatre
Graphically speaking, Middle East Broadcasting Networks pulled out all the stops for the coverage. The network achieved a fresh, highly immersive election graphics look, all while telling the story as it unfolded in real time. On the live set, a series of data-driven, dynamic augmented reality objects – including 3D charts, numbers, maps and infographics – broke down states and key facts and visually conveyed the live national vote in real time. These were displayed at the center of the set, on the floor or around the presenters. Correspondents from 10 states interviewed voters about significant issues and appeared on AR video walls.
AR provided the audience with an enhanced viewing experience while helping to make sense of election polls, predictions, historical data and, finally, the election results themselves. As the results rolled in, augmented reality also allowed for interactive data updates and lively animated maps that made the complex voting system understandable.
As with the virtual studio graphics, all AR graphics were designed and built in Reality Editor and rendered by three Reality Engine AMPERE units, creating a powerful real-time broadcast compositing system for the most photorealistic production possible. No physical set elements were used: everything was virtual.
Accurate Real-Time Graphics
To ensure the election data updated automatically for all 30 hours, Alhurra’s in-house developers integrated all graphics sources with Zero Density’s control platform, RealityHub. Using its REST API and open-source SDK, the developer team wired the third-party election data into the graphics from the documentation alone – without filing a single support ticket or requesting remote support. Throughout the 30-hour election coverage, RealityHub let the Alhurra team turn the data-driven information into viewer-friendly graphics that conveyed the incoming election results quickly, accurately, and error-free.
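Alhurra’s actual integration code is not public, but the general pattern – pull results from a data provider over REST, map them into the fields a graphics template expects, and push them to the graphics controller – can be sketched in Python. Everything below is a hypothetical illustration: the URLs, payload schemas, node name, and `push_to_hub` helper are assumptions for the sketch, not RealityHub’s or any provider’s real API.

```python
import json
import urllib.request

# Hypothetical endpoints: the real URLs and payload schemas would come
# from the data provider's and RealityHub's documentation.
RESULTS_URL = "https://example-provider.invalid/v1/senate/results"
HUB_URL = "http://localhost:8080/api/v1/nodes/senate-board/properties"


def to_graphic_payload(race: dict) -> dict:
    """Map a raw race record into the fields a graphics template expects."""
    total = sum(c["votes"] for c in race["candidates"]) or 1
    leader = max(race["candidates"], key=lambda c: c["votes"])
    return {
        "state": race["state"],
        "reporting_pct": race["precincts_reporting_pct"],
        "leader_name": leader["name"],
        "leader_party": leader["party"],
        "leader_share": round(100 * leader["votes"] / total, 1),
    }


def push_to_hub(payload: dict) -> None:
    """POST the mapped payload to the controller (hypothetical endpoint)."""
    req = urllib.request.Request(
        HUB_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    # Sample record standing in for one fetch from RESULTS_URL.
    raw = {
        "state": "PA",
        "precincts_reporting_pct": 87.4,
        "candidates": [
            {"name": "Candidate A", "party": "D", "votes": 2412345},
            {"name": "Candidate B", "party": "R", "votes": 2189012},
        ],
    }
    print(to_graphic_payload(raw))
```

In a live deployment this loop would run on a timer for the full broadcast, which is why keeping the mapping step a pure, easily testable function matters: a malformed record can be caught and skipped before it ever reaches an on-air graphic.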