Want to know how to use Unity for virtual production and real-time iteration? Read on to discover how this innovative studio helped its clients stay on time and on budget while fine-tuning textures and lighting in real time.
As the pandemic swept the world into chaos, virtual reality (VR) and augmented reality (AR) experiences were in high demand. But traditional filmmaking processes were also disrupted: productions needed to pivot quickly to accommodate safety concerns and the challenges of remote work. One indie studio recognized the opportunity to expand its scope.
Founded in 2017 as a group of freelance creatives, programmers and technical artists, VRFX Realtime Studio has been operating under a simple but ambitious mission: to bring stories to life.
Based in Lucerne, Switzerland, the five-person team began their journey with Unity several years ago, when they recognized the potential of VR/AR and its wide applicability across a variety of industries – and saw a gap in the market for agencies that could develop these tailored solutions. Once the team realized how open and accessible Unity’s VR/AR developer framework was, and how wide a variety of software and experiences it would let them create, choosing Unity as their platform was obvious.
VRFX has brought the power of real-time 3D to clients in industries that run the gamut from theater production companies to geological research centers to manufacturers. Pictured here is a VR experience using Unity created for a film festival.
As self-described “technovisual problem solvers,” the studio works with companies in a variety of industries to tailor solutions that leverage the power of real-time 3D. By gathering in-depth knowledge about their clients’ customers, products, and positioning, they can collaborate closely with all parties to create the perfect visual content, be it a classic TV commercial or an AR experience for an art museum. The studio produces content that spans several formats, so the team constantly needs to stay up to date on the latest trends and find new and interesting ways to work with hardware and software.
A VR theater show developed in 2018, also made with Unity
“As we’ve expanded our team, we’ve looked for not only technical expertise with Unity and other animation software, but people with a positive attitude, a desire to own full workflows, and a thirst for new challenges,” explains Pascal Achermann, VRFX cofounder and technical director. “As a result, we’re a group of people who love what we do. I think this has been essential to our success thus far.”
An indie studio’s foray into real-time filmmaking
While VR and AR experiences remained the company’s sweet spot in 2020, the year presented opportunities for VRFX to expand into new fields, such as virtual production. Indeed, the team’s professional history in visual effects, 3D compositing, film production, and audio engineering positioned them well to explore the realm of real-time filmmaking.
Configuring an animated scene inside of Unity HDRP
While the Unity Editor and MiddleVR are their core tools, VRFX also leverages packages from other Unity developers to accomplish specific tasks, be it a tool for creating plants and trees with clean meshes and UVs, a DMX controller for driving on-set lighting in real time, or their in-house “Snap Shot” tool for making changes to a scene while in Play mode.
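Driving stage lights from software like this typically means speaking DMX over Ethernet. As an illustration only – this is not VRFX’s actual controller, and the IP address is hypothetical – a minimal ArtDMX (Art-Net) packet can be assembled with nothing but the Python standard library:

```python
import socket
import struct

def artnet_dmx_packet(universe: int, channels: bytes) -> bytes:
    """Build a minimal Art-Net ArtDMX packet (DMX-over-Ethernet).

    Illustrative sketch of the wire format only, not a production tool.
    """
    header = b"Art-Net\x00"                 # 8-byte protocol ID
    opcode = struct.pack("<H", 0x5000)      # OpDmx, little-endian
    version = struct.pack(">H", 14)         # protocol version, big-endian
    seq_phys = b"\x00\x00"                  # sequence disabled, physical port 0
    uni = struct.pack("<H", universe)       # sub-universe + net bytes
    length = struct.pack(">H", len(channels))
    return header + opcode + version + seq_phys + uni + length + channels

# Set channel 1 (say, a dimmer) to full on universe 0; DMX frames carry 512 channels.
data = bytes([255]) + bytes(511)
packet = artnet_dmx_packet(0, data)
# To actually send it, a UDP socket would do (address is hypothetical):
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("192.168.1.50", 6454))
```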
“When we don’t know how to do something, we research to find out who’s solved the problem before,” says Nick Schneider, VRFX engineer. “We have yet to encounter something that we can’t create. We’ve always found a solution by looking to the Unity community. It’s a fantastic resource.”
View of “Snap Shot,” a tool that VRFX developed in-house
Project Silvester: Virtual production with Unity HDRP and ArtEngine
The studio’s debut in virtual production with LED walls was a series of TV commercials for the Lucerne government, encouraging residents to practice social distancing during the Christmas holidays to slow the spread of the coronavirus. VRFX pitched a virtual production setup, arguing it was the only way to stay on budget and meet the tight deadline. With about a week to write and deliver the content, there was little time for activities such as location scouting and postproduction. VRFX knew that with a little bit of preproduction planning, they could not only deliver the project with nearly zero postproduction costs but also offer the client the ability to reuse the same environment for future commercial spots. The client was sold, and so planning began.
VRFX was responsible for the virtual production setup and real-time content, while partner agency Orisono wrote the story and handled the on-set lighting, filming, and postproduction. The teams worked closely together at their home base, Soundville Media Studios, to deliver the entire project in about a week, with the Unity work done in about three days.
VRFX art director Claudio Antonelli testing the camera tracking system for a virtual production
One of VRFX’s key tasks was to create the virtual backdrop content. During preproduction, VRFX built a replica living room in Unity to serve as the background the real camera would see, leveraging Unity’s High Definition Render Pipeline (HDRP) and MiddleVR. With this setup, the team could use real-world lighting values, simulate the lens distortion of their Blackmagic URSA camera, and project the 3D content onto the LED wall. Using Unity ArtEngine, they quickly replicated the on-set materials, which were then used to texture the virtual clone in Unity.
Setting up the virtual environment during preproduction
The first step in setting up a virtual production environment involved replicating the real set – a festive living room – in 3D. To do so, the team measured the set components and modeled a virtual clone in Modo.
A digital replica of the real set, made in 3D modeling program Modo
With the digital model, VRFX created a quick previz mockup using Marmoset Toolbag to help the client and film crew visualize the scene. With previz approved, the team then created a higher fidelity lookdev render to illustrate the vision for lighting, mood, and materials.
With the lookdev scene, VRFX worked with director of photography Alex Stratigenas to align on shot framing. After another round of client feedback, the scene was imported into Unity HDRP to apply the lighting and materials.
Diagram showcasing the shot framing with the LED wall
Iterating on set with ArtEngine
A key component of any virtual production setup is flexibility. While a real-time engine is at the core of this setup, other tools that support quick iteration and enable creativity, such as Unity ArtEngine and MiddleVR, play a critical role in cultivating an agile workflow.
For example, while on set, the crew decided to swap out several materials to better match the live set and artistic vision. Previously, the materials were placeholders downloaded from a public library. Using their phone, a team member snapped a couple of photos of the set carpet and studio walls, imported them into ArtEngine, and within minutes had created tileable physically based rendering (PBR) materials.
“The speed at which we’re able to create textures from photos using ArtEngine is amazing,” explains Achermann. “For this production, we created the carpet material minutes before shooting. In a few clicks, we color-corrected the photo, removed seams and unwanted artifacts, and generated all the PBR maps. The ability to quickly iterate on set like this has been critical to our success in real-time filmmaking.”
Original photo of the set’s carpet, taken on a phone
Using ArtEngine’s Content-Aware Fill node, an artist was able to quickly remove unwanted noise and dust
Final node graph inside ArtEngine
The 2K textures from ArtEngine were then painted into an 8K atlas using the Quixel Suite; the atlas was then imported into Unity.
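As a back-of-the-envelope check on that layout: an 8K (8192 px) atlas holds a 4×4 grid of 2K (2048 px) tiles, and remapping a tile’s UVs into its slot is simple arithmetic. The sketch below is illustrative only (slot numbering and function name are our own, not part of the Quixel Suite workflow):

```python
ATLAS = 8192            # 8K atlas edge length in pixels
TILE = 2048             # 2K texture edge length in pixels
GRID = ATLAS // TILE    # 4 tiles per row/column -> 16 slots total

def atlas_uv_transform(slot: int):
    """UV scale and offset that map a tile's [0,1] UVs into its atlas slot.

    Slots are numbered left-to-right, row by row, starting at 0.
    """
    col, row = slot % GRID, slot // GRID
    scale = 1.0 / GRID
    return scale, (col * scale, row * scale)

# Slot 5 sits in the second row, second column of the 4x4 grid:
scale, offset = atlas_uv_transform(5)
```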
Close-up of the carpet material in Unity, with lighting applied
The background scene in Unity, which was projected onto the LED screen
The scene was then projected onto the LED wall. “It was an amazing feeling seeing Unity projected onto such a big screen, and even cooler when we saw the scene captured from a real film camera,” explained Achermann.
The final virtual production setup, as seen through the camera
Matching lighting in the real and virtual worlds
Matching the lighting in Unity to that captured by the camera in real life is a critical part of ensuring realism with virtual production. In this case, the lighting and post-processing setup was relatively simple.
In Unity, the team opted for Mixed lighting (as opposed to fully Baked or Realtime). Since lighting in HDRP is physically based, VRFX was able to set light intensity in lux or candela and color temperature in kelvins – standard units of measurement for physical lights in the real world.
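These photometric units follow simple physical relationships – for instance, illuminance in lux falls off with the square of the distance from a point source measured in candela. As a quick sanity check of what the numbers mean (plain Python arithmetic, not Unity code):

```python
import math

def illuminance_lux(intensity_cd: float, distance_m: float) -> float:
    """Illuminance (lux) at a distance from a point source,
    via the inverse-square law: E = I / d^2."""
    return intensity_cd / distance_m ** 2

def flux_lumens(intensity_cd: float) -> float:
    """Total luminous flux (lumens) of an isotropic point source:
    phi = 4 * pi * I."""
    return 4 * math.pi * intensity_cd

# An ~800 lm household bulb, if treated as isotropic, is roughly
# 800 / (4*pi) candela; from that, the lux it casts at 2 m follows:
bulb_cd = 800 / (4 * math.pi)
lux_at_2m = illuminance_lux(bulb_cd, 2.0)
```

Being able to plug in measured real-world values like these is what lets the virtual lighting match the physical set lighting captured by the camera.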
For the environment lighting, the team used a standard image-based lighting workflow, lighting the scene with high-dynamic-range images (HDRIs) captured on the real-world set – similar to the workflow used in other digital content creation (DCC) tools. For an added sense of global illumination, VRFX also baked lightmaps on the GPU, combining the bake with OptiX denoising to keep the process fast.
Once the director of photography was satisfied with the lighting, shooting began
Post-processing was minimal. The team applied simple Screen Space effects (Reflections, Ambient Occlusion, etc.), which were sufficient to make the environment look great.
Actors play out a fondue dinner scene in front of the LED wall
The final results were a series of short, tongue-in-cheek TV commercials that aired during the holidays. The client was pleased with the results and expressed hope for further collaboration with VRFX in the future.
Project Eichenfresser: Animation lookdev in HDRP and ArtEngine
VRFX’s expansion into real-time filmmaking has not stopped at live action. The company has recently been experimenting with animation in Unity. Part of this exploratory process has involved creating “playground projects,” one of which – nicknamed Project Eichenfresser – focused on using Unity HDRP for lookdev and on experimenting with storytelling in a spooky world. Such ongoing projects have allowed VRFX to play around with workflows using animation-ready assets and bolster their expertise for future client projects.
Working with the camera inside Unity
The character art for this project, originally hand-drawn, was transformed into a 2.5D look using Cinema 4D, and then imported into Unity as baked animations. The vision was to keep the character designs hand-drawn and use 3D for the lookdev cinematics and final animations.
The aesthetic of the Eichenfresser world mixes cartoonish, paper-like assets with real-world, scan-based textures and 3D models. For the initial lookdev, VRFX leveraged content from Quixel Megascans. As the concept developed, however, they wanted to swap out the Megascans content for biomes true to Switzerland, where the Eichenfresser story takes place, so they began photographing their own textures and models, using ArtEngine to clean up the images.
ArtEngine graph of the material used in the cave (see screenshot below), created by combining two materials into a single material
View of the project being configured inside of Unity
Photographs taken of a forest floor in Switzerland, transformed into digital materials with ArtEngine for use in Project Eichenfresser
VRFX continues to experiment with real-time animation in Unity, particularly around creating character movements and controlling the camera and shots. One current challenge, for example, is adapting the character rigs so they can be driven directly in Unity to create movements. For this, they have been experimenting with motion capture techniques, assembling the results with Unity’s animation state machine; however, because of the deliberate 2.5D look and the way the characters’ topology is built, this remains an ongoing process.
You can see a teaser of Project Eichenfresser here:
Project Leolina: The road to real-time animation in Unity
In the early days of the pandemic, VRFX’s art director, Claudio Antonelli, also began working on another playground project that has since grown into a concept for an animated kids TV series. The stories, developed in partnership with teachers and parents, portray a family of three going about the ups and downs of daily life and are intended to teach a young audience meaningful lessons about the real world. The artwork was designed to represent life inside a dollhouse, with the characters and most objects appearing to be made out of wood.
A still from Project Leolina, rendered in Unity HDRP-DXR with RTX ray tracing
Project Leolina began as a classic 3D animation project and is now being adapted for a real-time pipeline using Unity. Though the characters are wooden puppets, they were created to work well with a motion capture rig and can be driven by a character controller for animation segments that need to be repeatable.
Inspector view in the Editor for the scene above
Again, VRFX used ArtEngine to author materials for the scene. For example, to create the white wall paneling, VRFX downloaded a flat image from Textures.com and used ArtEngine to make several adjustments before generating the full PBR material.
Graph inside ArtEngine
Stay tuned in the coming weeks for a teaser of Project Leolina.
The next chapter has yet to be written
VRFX is of the mindset that real-time filmmaking is a journey: workflows evolve, tools advance, and the community continues to uncover new problems and develop solutions for them. With recent successes in client virtual production projects, and seeds planted in the animation space, VRFX remains hopeful about its future in the world of filmmaking with Unity.
To learn more about the studio’s work or share feedback, you can email them, visit their website, or connect on LinkedIn.
- Claudio Antonelli, Art Director
- Pascal Achermann, Technical Director
- Nick Schneider, Engineer
- Rolf Stalder, Business Strategy Manager
- Daniel Wipf, Engineer
If you want to experience how AI material creation can augment your real-time filmmaking pipeline, we invite you to give ArtEngine a try. Until May 17, ArtEngine is available for $19/month (versus the regular price of $95/month).
Virtual production and real-time film tools
- Unity Universal Render Pipeline
- Unity High Definition Render Pipeline
- Unity ArtEngine | AI-based material authoring tool
- MiddleVR | Multiscreen rendering and camera-tracking solution
Texture libraries and packages
Renderers and 3D DCC tools
- Marmoset Toolbag 4 | Realtime lookdev renderer
- Cinema 4D, Nuke, Modo, DaVinci Resolve | VFX and 3D
- Quixel Suite, Quixel Mixer | Painting, authoring and packing textures
Unity Asset Store tools and developers
- Lennart’s Vegetation Studio Pro | Tree and vegetation rendering
- Boxophobic’s The Vegetation Engine | Vegetation shading and animation
- Jason Booth’s MicroSplat | Fast and accessible terrain tools
- Dawies DE Shaders | A huge variety of useful Unity shaders, and more
- Time of Day | Easy-to-use, tight sky shader system for day/night lighting
- R.A.M River Automatic Material | High-quality river and lake creation tool for Unity
- CTAA V3 | Cinematic Temporal Anti-Aliasing
- Horizon Based Ambient Occlusion | Amazing AO, especially for URP and VR
- Crest Ocean System URP | Great water simulation system
- GPU Instancer | GPU-based instancing system