How Sony and Unreal Engine Are Leading the Way for Virtual Production

By Assemble

July 14, 2022

Virtual production technology has been on a steady climb since the term spread beyond Hollywood and reached viewers through productions like The Mandalorian. The making-of series for the show on Disney+ showed off Industrial Light & Magic’s innovative use of emerging technology, capturing the attention of audiences whether or not they knew anything about movie production. Lifting the proverbial curtain on such new technology could have ruined the magic behind the show, but it didn’t. Now, virtual production as done in The Mandalorian is everywhere.

The technology has only gotten better since then, with more software and camera makers joining the fray. HTC VIVE just announced the Vive Mars CamTrack, a module that tracks a camera’s position on a virtual production set. LED display makers such as Absen and INFiLED have also been busy. But when it comes to hardware and software, two products have a more established residency in this space: Sony’s Venice camera and Epic Games’ Unreal Engine.

Sony Venice 2

The Sony Venice 2 is a cinema camera that follows up the very popular Sony Venice. Its flagship upgrade is a new 8.6K sensor block, a huge jump from the 6K sensor on the original Venice. That 6K sensor block remains compatible with the Venice 2, so filmmakers can swap between the 8.6K and 6K sensors as needed. Switching doesn’t require a clean room because the sensor blocks are sealed, meaning they can be swapped out at any time, and because the camera instantly recognizes the sensor, there’s very little setup involved.

Sony Venice 2 specs. Image via Sony.

In addition to the hulking new sensor and the ability to swap it out at a moment’s notice, the camera has improvements that make it well-suited to virtual production shooting: the ability to capture images in Sony’s proprietary S-Gamut3 color space, and 16 stops of exposure latitude. The S-Gamut3 color space is much wider than Rec.2020, the industry-standard color space for UHD and HDR displays, an advantage that lets filmmakers capture richer color. For virtual production, that’s useful because it helps the camera sell the virtual backdrop as a real setting. According to Naoki Tokunaga from Sony’s Imaging Products & Solutions Business Group, the S-Gamut3 color space “ensures no color is missed when filming LED displays.” He also said that virtual productions needing a more realistic look and higher resolution would not be disappointed with the Venice 2.

S-Gamut3 color space comparison. Image via Sony.

Further, the 16 stops of exposure latitude give filmmakers ample room to light scenes on a virtual production volume. That wide latitude lets the camera hold both the brightest highlights and the darkest shadows, so it’s better able to extract lighting detail from high-contrast LED panels. With 16 stops of exposure latitude, filmmakers won’t have to compromise their ideal lighting to compensate for uneven exposure between the lightest and darkest parts of a scene.
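For a rough sense of what those 16 stops mean, each stop doubles the luminance range the camera can record. The sketch below is simple back-of-the-envelope math, not a Sony specification.

```python
# Rough, illustrative math only: each stop of exposure latitude doubles the
# luminance range a camera can capture, so n stops ~ a 2**n : 1 scene contrast.
import math

stops = 16
contrast_ratio = 2 ** stops                           # ~65,536:1 from brightest to darkest detail
dynamic_range_db = 20 * math.log10(contrast_ratio)    # the same range expressed in decibels

print(f"{stops} stops ≈ {contrast_ratio:,}:1 contrast ≈ {dynamic_range_db:.0f} dB")
# -> 16 stops ≈ 65,536:1 contrast ≈ 96 dB
```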

The Sony Venice 2 being used on set. Image via Sony.

While the Venice 2 offers features that work well for in-camera VFX — the type of virtual production involving LED displays — it’s not the only camera available for virtual production. In fact, essentially any camera can work for virtual production shooting. Speaking with Nate Strayer and Ace Patel of Stray Vista Studios, a virtual production volume set to open this fall in Austin, Strayer, the studio’s founder, explained that virtual production doesn’t need a “specific camera.” Patel, the virtual production supervisor, added, “you could be shooting with a DSLR because it [Unreal Engine] doesn’t really need to know what camera you’re using.” All Unreal Engine needs to know is the lens you’re using and where the camera is located in real time. However, Patel did note that it’s “advisable to use the industry-standard cinema camera.” In Stray Vista’s case, they’re planning to offer clients the Alexa Mini LF.

The Alexa Mini LF is a widely used industry favorite with a 4.5K large-format sensor, but Sony’s other virtual production tech is what highlights the qualities of the Venice 2.

Crystal LED and Atom View

Sony’s virtual production tech doesn’t stop at cameras; the company is also heavily invested in LED panels of its own. The Crystal LED Virtual Production System Solution combines several Sony technologies, including the Venice camera and the Crystal LED B-series display.

In virtual production, panels need to be very bright so the image isn’t washed out by the studio lights. Think of sunlight hitting a phone or laptop screen: the glare makes the image hard to see until the display’s brightness is turned up enough to overpower it. The same logic applies on a volume, so brighter panels give a cinematographer more room to light the scene.

The Crystal LED B-series display panel is incredibly bright, with a rated brightness of 1,800 cd/m² (candelas per square meter, or nits) and a 1,000,000:1 contrast ratio. To put that into perspective, 1,800 nits is more than four times brighter than an M1 MacBook Air at 100% brightness, which is already very bright. The panels also feature an anti-reflective coating to further cut down on glare. A secondary benefit of bright panels is that they act as a light source themselves, lighting the actors and real objects placed on the set. And because the panels are curved, subjects can be lit from the back and sides. Finally, the high contrast ratio allows for detailed highlights and shadows, which the Venice 2 can capture thanks to its wide exposure latitude.
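A quick back-of-the-envelope check shows why those spec-sheet numbers matter on set. The roughly 400-nit laptop figure below is an assumed reference value for comparison, not something from Sony’s materials.

```python
# Back-of-the-envelope math on the published panel specs. The ~400-nit laptop
# brightness is an assumed reference figure, not part of Sony's materials.
panel_brightness_nits = 1_800
contrast_ratio = 1_000_000
laptop_brightness_nits = 400          # typical rated peak brightness for an M1 MacBook Air

black_level_nits = panel_brightness_nits / contrast_ratio
times_brighter = panel_brightness_nits / laptop_brightness_nits

print(f"Panel black level: {black_level_nits} nits")        # 0.0018 nits: nearly true black
print(f"Panel vs. laptop: {times_brighter:.1f}x brighter")   # ~4.5x
```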

That’s not a real house in the background; it’s the image of a house recreated on Sony’s LED panels. Image via Sony.

Because Sony makes both the LED panels and the cameras, Tokunaga says, “multiple pieces of equipment can work together to reproduce colors and create seamless images.” To create those “seamless images,” Sony uses Atom View, a proprietary technology that turns real settings into virtual ones. Atom View works by capturing an environment in 3D to create a three-dimensional computer graphic (3DCG) virtual environment, which can then be used as a backdrop on a virtual production volume. While it can’t help you if you’re planning to shoot a scene set on Jupiter, it can turn any environment on this planet into a highly detailed 3DCG virtual environment. Atom View is a middleman of sorts, connecting the high-resolution Crystal LED B-series panels to the equally high-resolution Venice 2.

Still, Atom View is simply part of the suite of hardware and software technologies that enable real-time in-camera VFX. What brings everything together is Unreal Engine and 3D camera tracking.

P.S. If you love technology, learn more about the future of AI, NFTs, community building, and more in our Creatives Offscript podcast episode with Dirk Van Ginkel, the Executive Creative Director at Jam3.

Unreal Engine

Epic Games’ Unreal Engine is a big deal in the video game industry, but its relatively recent pivot to virtual production has been swift, and the toolset is steadily becoming more powerful. The word “engine,” as used here, refers to a centralized piece of software that contains the tools needed for virtual production. That means Unreal Engine can be used to create 3D environments, render them in real time like a video game, light scenes, place virtual cameras, and implement other in-camera VFX.
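As a small illustration of what “placing a virtual camera” or “lighting a scene” can look like in practice, here is a minimal sketch using Unreal Editor’s Python scripting (assuming the Python Editor Script Plugin is enabled; the positions and the 35mm focal length are illustrative values, not details from the article).

```python
# A minimal sketch of Unreal Editor's Python scripting (UE 4.27-era API),
# assuming the Python Editor Script Plugin is enabled. The locations, rotation,
# and 35mm focal length are illustrative values, not figures from the article.
import unreal

# Place a virtual cinema camera in the current level.
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor,
    unreal.Vector(0.0, 0.0, 150.0),      # x, y, z in centimeters (UE units)
    unreal.Rotator(0.0, 0.0, 0.0),       # roll, pitch, yaw in degrees
)

# Match the virtual lens to the physical lens mounted on the tracked camera.
cine_component = camera.get_cine_camera_component()
cine_component.set_editor_property("current_focal_length", 35.0)  # mm

# Light the virtual set with a simple directional (sun-style) light.
unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.DirectionalLight,
    unreal.Vector(0.0, 0.0, 500.0),
    unreal.Rotator(0.0, -45.0, 0.0),     # pitch the light down 45 degrees
)
```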

Unreal Engine 5 officially came out of early access in April, but with it still being so new, many, like Patel, are sticking with UE 4.27. He says he’s waiting for the bugs in UE 5 to be worked out, though he expects it to be “pretty smooth for virtual production.” In the meantime, UE 4.27 is no slouch for in-camera VFX shooting. But to get it working properly, there’s one piece of hardware that hasn’t been mentioned yet: a 3D camera-tracking system. At Stray Vista, the 3D camera-tracking system of choice is made by Mo-Sys.

The Mo-Sys StarTracker Mini attached to a Venice 2 camera. Image via Sony.

The Mo-Sys StarTracker is an optical camera-tracking solution. When attached to a camera, it communicates the camera’s position to Unreal Engine by referencing reflective stickers placed around the studio. Without this, the image on the LED panels stays static, unable to adapt to the camera’s position. Strayer says that clients with a lower budget might opt for this non-tracking type of setup. But if you want the most realistic result, a 3D tracking system is as necessary as the camera or the panels.
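To make the idea concrete, the sketch below shows the kind of per-frame data a tracking system continuously hands to the render engine: position, rotation, and lens information. It is purely conceptual; the JSON layout, field names, and UDP port are invented for illustration and are not the actual Mo-Sys or Live Link wire format.

```python
# Purely conceptual sketch of the kind of per-frame pose data a tracking system
# streams to the render engine. The JSON layout, field names, and UDP port are
# invented for illustration; this is NOT the Mo-Sys or Live Link wire format.
import json
import socket
import time

ENGINE_ADDR = ("127.0.0.1", 54321)   # hypothetical listener on the render node
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

for frame in range(240):             # a few seconds of poses at ~24 fps
    pose = {
        "frame": frame,
        "position_cm": [120.0, -30.0, 160.0],   # where the camera sits on stage
        "rotation_deg": [0.0, 5.0, -2.0],       # roll, pitch, yaw
        "focal_length_mm": 35.0,                # lens data the engine also needs
    }
    sock.sendto(json.dumps(pose).encode("utf-8"), ENGINE_ADDR)
    time.sleep(1 / 24)
```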

In addition to the camera’s position, Unreal Engine also needs to know what lens is being used. It requires this information to render the camera view — also known as the inner frustum render. The frustum, according to The Virtual Production Glossary, is “The region of a virtual world which appears as a viewport to the camera. On an LED volume, the inner frustum moves in sync with the camera, while the outer frustum is unseen by the camera and maintains the remainder of the environment static to provide consistent, realistic lighting.” The inner frustum render replicates what the camera would see in a real setting, changing the perspective of the virtual set in relation to the camera’s position. The technique is also common in video games, which render only what the camera sees in great detail so as not to waste computing resources. Without the 3D tracking or the lens information, the inner frustum camera view would not be possible.
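The lens matters because the camera’s field of view, and therefore how large the inner frustum appears on the LED wall, follows directly from the focal length and the sensor size. The 35mm lens and 36mm-wide sensor below are example values used only to show the standard formula.

```python
# Illustrative only: how a camera's horizontal field of view (and thus the size
# of the inner frustum) falls out of the lens focal length and sensor width.
# The 35mm lens and 36mm-wide sensor are example values, not spec-sheet data.
import math

focal_length_mm = 35.0    # physical lens mounted on the tracked camera
sensor_width_mm = 36.0    # full-frame-style sensor width

horizontal_fov_deg = 2 * math.degrees(
    math.atan(sensor_width_mm / (2 * focal_length_mm))
)

print(f"Horizontal FOV: {horizontal_fov_deg:.1f} degrees")  # ~54.4 degrees
```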

An example of inner frustum culling. Image via Epic Games’ Unreal Engine.

Inner frustum rendering with two cameras for a multi-cam shoot. Image via Epic Games’ Unreal Engine.

All these pieces — Unreal Engine, 3D tracking, and inner frustum rendering — are still only a fraction of everything that goes into shooting with in-camera VFX. They stand out, however, because they’re a key part of what connects the Sony technology mentioned above. For example, Atom View’s 3DCG environments are static on their own, but everything comes together once they’re brought into Unreal Engine via a plugin. Unreal Engine then puts those environments to work, relying on 3D camera tracking and inner frustum rendering to breathe life into the 3D images.

Back at CES 2020, Sony, Epic Games, and Mo-Sys partnered for a presentation on this technology, using the Ecto-1 car from Ghostbusters to show off the power of Atom View. In the two years since, in-camera VFX virtual production has gone from largely experimental to mainstream.

Here to Stay 

The Venice 2 camera is not a “virtual production camera” per se. Still, it’s ideal for such use because it captures more color and lighting detail, and because of the synergy between the Venice 2, Sony’s LED panels, and Atom View. So, while it’s true that almost any camera can be used for virtual production, the Venice 2 arguably stands on firmer ground thanks to the supporting virtual production technologies Sony has developed. And central to any virtual production, not just those using Sony tech, is Epic Games’ Unreal Engine.

The two companies have been there from the start and continue to innovate on a production technique that’s, for better or worse, impacting Hollywood in a big way. Of course, just as there are filmmakers who will not budge on shooting on film, some will insist on shooting on location. But virtual production isn’t going anywhere.

“If you can shoot it practically, it’s always going to look best if you can go to these locations,” said Strayer, acknowledging the irony in making such a statement when he owns a virtual production volume. “But it’s an enormously helpful tool, and I do believe it’s the future. I don’t think it’s going away anytime soon. It shouldn’t be used for everything, but the stuff it should be used for, it’s the future for sure.” 

 
