ASK Co. Conducts Zero Density “Reality” Workshop, Revealing the Background of a Real-time Composited Drama

*This article was originally published on PRONEWS JAPAN.

A hot tool that can combine graphics, VFX, and CG with live action

Zero Density’s 3D virtual compositing software “Reality” is much talked about in the VFX industry. Although it is a virtual production tool, its CG looks overwhelmingly real, as proven by prestigious awards including the Broadcast Tech Awards 2019, the IBC 2017 Innovation Awards, and the 39th Sports Emmy Awards. Zero Density was only established in 2014, yet it is safe to call it a game changer in the VFX industry.

A two-day workshop introducing Zero Density’s Reality was held at Arkbell in Ota-ku, Tokyo, on February 26 and 27.

Utilizing Unreal Engine, which dominates the gaming industry, for broadcast virtual production

First, let’s introduce the venue, Arkbell. Arkbell will operate a virtual studio through the end of this year, and its three walls are not the usual hanging-curtain type but are painted in digital green. It is a studio dedicated to chroma keying, and a virtual studio of this size is rare in Tokyo.

At the workshop, camera A was an ALEXA fitted with an Angenieux lens, with a RedSpy attached to its handle. RedSpy is stYpe’s high-precision optical camera tracking system; it collects and transmits camera position and lens data in real time, allowing the camera to work together with a CG rendering engine.

Camera B was Panasonic’s 4K integrated PTZ camera, the AW-UE150, which supports the FreeD protocol, widely used for transmitting camera tracking information.
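To give a concrete idea of what this tracking data looks like: a FreeD-compatible camera such as the AW-UE150 typically pushes small fixed-length packets (the “Type D1” message) over the network, containing pan, tilt, roll, position, zoom, and focus. The Python sketch below shows how such a packet could be decoded on the receiving side. The UDP port is a placeholder, and the field layout and scaling follow commonly published FreeD conventions rather than anything specific to Reality Engine, so treat it as an illustration, not a reference implementation.

```python
import socket

# Sketch of a FreeD "Type D1" camera tracking packet parser.
# Layout/scaling follow the commonly published FreeD convention
# (29-byte packets: angles in 1/32768 degree, position in 1/64 mm);
# check your camera's documentation, as implementations can differ.

PACKET_SIZE = 29  # Type D1 message length in bytes

def _signed24(b: bytes) -> int:
    """Interpret 3 bytes as a big-endian signed 24-bit integer."""
    value = int.from_bytes(b, "big")
    return value - (1 << 24) if value & 0x800000 else value

def parse_freed_d1(packet: bytes) -> dict:
    if len(packet) != PACKET_SIZE or packet[0] != 0xD1:
        raise ValueError("not a FreeD Type D1 packet")
    return {
        "camera_id": packet[1],
        "pan_deg":  _signed24(packet[2:5])  / 32768.0,
        "tilt_deg": _signed24(packet[5:8])  / 32768.0,
        "roll_deg": _signed24(packet[8:11]) / 32768.0,
        "pos_x_mm": _signed24(packet[11:14]) / 64.0,
        "pos_y_mm": _signed24(packet[14:17]) / 64.0,
        "pos_z_mm": _signed24(packet[17:20]) / 64.0,
        "zoom":  int.from_bytes(packet[20:23], "big"),   # raw lens encoder value
        "focus": int.from_bytes(packet[23:26], "big"),   # raw lens encoder value
    }

if __name__ == "__main__":
    # Listen for tracking packets pushed over UDP. Port 40000 is an example;
    # the destination address/port is configured on the camera side.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 40000))
    while True:
        data, _addr = sock.recvfrom(64)
        if len(data) == PACKET_SIZE and data[0] == 0xD1:
            print(parse_freed_d1(data))
```

In a real pipeline this decoding happens inside the render engine: each incoming pose drives the virtual camera so the CG background is rendered from exactly the same viewpoint as the live camera.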

Reality Engine served as the CG renderer. The workstation was an HP Z4 G4 equipped with a Quadro RTX 6000 GPU and an AJA Corvid 44 12G for HD-SDI input/output. Two workstations were on set, since each camera requires its own rendering engine. A Network Attached Storage (NAS) unit sits in the middle of the workflow to share content, and the shared content is rendered from the respective viewpoints of camera A and camera B.

Five companies form an alliance to handle Zero Density in Japan: import by ASK Co., sales by NAC Image Technology, technical support by reinphase, system integration by Ahum-Labs, and rental and production by Tokushu Eizai.

Reality goes far beyond live events and broadcast: introducing the live compositing method used in drama production

The highlight of this workshop was a dialogue between Jun Katatae from reinphase and Shunsuke Ito from Tokushu Eizai. According to them, Reality can be used not only for live broadcasting but also for commercials and movies.

In fact, Arkbell, the venue for this workshop, has already been used for shooting a drama with Reality. Let’s look at their conversation.

Jun Katatae, reinphase (Mr. Katatae): You said that for the shooting of a drama for a video on demand (VOD) service, the VFX compositing was performed in real time using the green studio. What led you to try a real-time compositing system like Reality for this drama shoot?

Shunsuke Ito, Tokushu Eizai (Mr. Ito):

Real-time compositing technology has only just begun to take hold around the world. When I demonstrated Zero Density at Inter BEE, many visitors asked me, “What can we use it for?” At that point we could only show them the technology and a product overview of Reality; we could not present actual cases like “we used it in this way.”

My main job is director of photography (DP), but after a year of learning Reality, I thought it would be perfect for drama shooting. Dramas normally go through a post-production workflow, so there is no strict need to composite in real time, but Japanese productions have been working with small budgets recently. When I suggested to the producer that we composite as much as possible on set, we got the positive reply, “It looks like that could reduce our budget,” and we moved forward with introducing Reality.

Mr. Katatae: Were you actually able to reduce production costs?

Mr. Ito: 

I think so. We had the usual drama shooting for 20 days, shooting about 100 cuts daily. After that, we shot the virtual parts over about 4 days. At first I assumed about 20 cuts a day would be manageable…but in the end we shot 100 cuts per day, the same number as the regular drama shooting. That’s a really considerable number.

Mr. Katatae: That sounds hard. Were you able to shoot 100 virtual cuts a day because, after all, the results could be reviewed in real time?

Mr. Ito: 

Yes, but this was something of a trial run, so I thought it would be difficult to finalize all 400 cuts just by shooting and outputting them in real time. That’s because some things didn’t work well once we actually started using the system.

Even if we could not finalize all 400 cuts, I think finishing even 100 cuts on set is meaningful, because until now post-production has had to adjust all 400 cuts for a drama project like this.

Mr. Katatae: I decided not to wear white clothes this morning because I knew I would be on camera in the virtual studio for today’s workshop. Did costumes cause any difficulties for compositing during the shoot?

Mr. Ito: 

We struggled with the shoot because the main character of the drama had gray hair and appeared in a shiny white kimono.

Besides the main character, we had lots of gamer cosplayers as characters, and the drama was full of glittering items such as thin swords, glossy swords, and silver armor.

In the end, the key point is the performance of the lights. With high-quality lights we can key properly, while poor ones will not work well for keying.

Mr. Katatae: In standard green screen shooting, it is difficult for actors to grasp what kind of scene they are in, because the background is just green. With Reality, however, a monitor lets actors check the composited result in real time. Did that feature have any positive effect on the actors’ motivation or performance?

Mr. Ito: 

I noticed that being able to see in real time the situation they were performing in had a major impact on the actors’ motivation. It also allowed them to gain a deeper understanding of what the directors wanted to do.

Mr. Katatae: How was the response from the directors using Reality on site?

Mr. Ito: 

At the beginning of our meetings, the directors said, “Just do the best you can,” but in the latter half of the shoot, once they found out Reality could do just about anything, they started asking for everything they wanted. I had my hands full dealing with that, and it was really hard. (laughs)

Mr. Katatae: Is there anything we should be aware of when operating the system?

Mr. Ito:

You’ll need a considerable quantity of light. The amount of light from below must at least exceed the amount of light hitting the green surfaces from above; otherwise you won’t be able to blow out the green spill reflected onto people. You also need lights with clean output, and my recommendation is ARRI’s SkyPanel.

Also, I think the fact that the studio was painted in digital green rather than using simple hanging curtains had a great effect on the result. Reality captures the green studio as a texture on its Cyclorama, so with simple hanging paper walls, noise breaks out the moment the curtain shakes. Considering this, a painted green background was a really important point for making the most of Reality. It made the whole workflow much easier and freed up more space in the shooting studio.

Mr. Katatae: Arkbell will keep this virtual studio for the long term, right?

Mr. Ito:

Yes, and I’m very grateful. Bringing in the equipment and wiring it takes quite a lot of time, and calibration takes time as well. If Reality can stay installed in this studio, the rental period will be shortened by at least 2-3 days, which will lead to cost savings of hundreds of thousands or even millions of yen.

If anyone in Japan wants to try using Reality, I think Arkbell’s studio is the best place to start.

About Zero Density
Zero Density is an international technology company dedicated to developing creative products for industries such as broadcasting, augmented reality, live events, and e-sports. Zero Density offers the next level of virtual production with real-time visual effects. It provides the Unreal Engine-native platform “Reality Engine®”, with advanced real-time compositing tools and its proprietary keying technology. Reality Engine® is the most photo-realistic real-time 3D virtual studio and augmented reality platform in the industry.