Volumes of Creativity with Virtual Production

by Paula Parisi

The green wall won’t be disappearing anytime soon. “If you have a one-page scene in Paris, you should do a green screen and traditional visual effects. But if you have a show where one of the characters lives in Paris, and it goes on for 22 episodes, you can have a virtual backlot in Paris,” build the world in 3D and let the characters inhabit it on an LED set, Stargate Studios founder and CEO Sam Nicholson explains in the ETC @USC whitepaper “Virtual Production and Beyond,” a case study of making the 20-minute short Fathead. The proof-of-concept film pushed boundaries in many ways. The first project shot at Amazon Studios’ Stage 15 Virtual Production facility in Culver City, the dystopian fantasy relied exclusively on in-camera VFX (ICVFX) to depict an adolescent turf war in a sculpturesque junkyard. Written and directed by C. Craig Patterson (MFA, School of Cinematic Arts), the story was an ideal candidate for the volume stage, given the cost of constructing such an elaborate physical set and the dangers of filming child actors in a potentially hazardous environment.

In the summer the School of Cinematic Arts got a new LED screen of its own, a generous gift from Sony Electronics Inc., following a visit to the School by a delegation led by Kimio Maki, President and CEO of Sony Corporation. On an LED volume stage, a camera tracking system constantly gathers the position of the physical camera, and that data is used to precisely align the virtual camera in the game engine with the same position in 3D space. The virtual camera drives the inner frustum — the region of the virtual world that is visible to the physical camera — ensuring that only those portions are rendered at full quality in real time on the wall. While the interactivity of the game engine is liberating, it has triggered a change in the workflow, shifting labor formerly reserved for fixing things after the fact to the front end: previsualizing and planning ahead. “We’ve heard this to the point where it’s almost a cliché, but preproduction is the new post,” notes ETC Executive Director Ken Williams. Understanding that paradigm shift was perhaps the biggest lesson of Fathead. Says Williams: “It requires a complete rethinking of the workflow by every content contributor if they’re going to meaningfully use these tools.” This fall, the Film & Television Production Division has configured a trio of classes designed to illuminate each step in the virtual production process, culminating in a volume shoot.
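The tracking-to-frustum relationship can be sketched in miniature. The toy geometry below is a hypothetical simplification — a flat wall and a camera looking straight ahead — not the actual implementation in Unreal or Unity, but it shows why the rendered region follows the physical camera: the tracked position and lens field of view together determine exactly which slice of the wall the camera can see.

```python
import math

def inner_frustum_extent(cam_pos, wall_z, h_fov_deg):
    """Toy model of the inner frustum on an LED wall.

    The wall is idealized as a flat plane at z = wall_z; the camera sits
    at cam_pos = (x, z) looking straight down +z with horizontal field of
    view h_fov_deg. Returns the horizontal span (x_min, x_max) on the
    wall that falls inside the camera's view — the region a volume stage
    would render at full quality. Real systems track full 6-DoF camera
    pose and project per-pixel in the game engine.
    """
    cx, cz = cam_pos
    depth = wall_z - cz  # distance from camera to the wall
    if depth <= 0:
        raise ValueError("camera must be in front of the wall")
    half_width = depth * math.tan(math.radians(h_fov_deg) / 2)
    return (cx - half_width, cx + half_width)
```

Moving the camera closer to the wall shrinks the span — `inner_frustum_extent((0, 0), 4, 90)` covers eight feet of wall, while the same lens two feet closer covers only four — which is why the tracking data must update the render continuously as the operator moves.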

Caption: SCA students use the new LED screen, generously gifted by Sony, to craft a captivating winter scene on the Akira Kurosawa Stage in the Robert Zemeckis Center.

The three classes are “conceived as a series, each building on the next,” explains Gail Katz, Chair of the Production Division and a Professor of Cinematic Arts. First up, “Realtime CG Filmmaking” (CTPR 445) provides an introduction to game engines and the software and processes used to create narrative-based projects, the goal being to complete a previsualization. Next, “Virtual Production” (CTPR 446) collaboratively advances a previously created previsualization, with instruction on motion tracking and creating the backgrounds that will display on the LED walls. The third class, “Virtual Production in LED Volumes” (CTPR 499), will take place on the Akira Kurosawa Stage of the Sony Virtual Production Studio in the Robert Zemeckis Center. A team of six instructors, including a director, cinematographer, visual effects producer, art director and editor, will address all aspects of a volume shoot. “This is where we tie it all together,” says Katz. The studio, operational as of fall 2022, boasts a 20 x 90-foot LED wall that is one of the finest in operation, featuring an 8K Sony Crystal LED B Series display. For spring, Katz says SCA is working on a class on direction and performance in the virtual volume as a collaboration with the USC School of Dramatic Arts.

“If you’ve done preproduction correctly, the day of the shoot is just execution. You’ll work quickly and efficiently and can focus on the most important live element, your connection to the actors and getting those performances,” explains Fathead virtual production producer Tom Thudiyanplackal. The goal is not to be rigid, but to figure out where you would like flexibility and plan for that. Interviewing VFX supervisor Kevin Baillie for the Fathead whitepaper, Thudiyanplackal wrote that the longtime Robert Zemeckis collaborator and his team “did a virtual camera layout for every set and filmed and edited a nearly complete version” of 2022’s Pinocchio “before they shot a single frame.” Portions of their previsualized world made it onto the LED wall for the live-action shoot.

Caption: The set of “Fathead” at Amazon Studios’ Stage 15 Virtual Production facility in Culver City.

While previsualization can be done using any combination of the software mentioned in this article, “Realtime CG Filmmaking” instructor Emre Okten, also the VFX producer for “LED Volumes,” says he likes to keep things simple for the introductory class and has students take their projects from previz to final pixel inside the game engine, using Unity or Unreal, in the spirit of “independent filmmaking, where they’re doing everything themselves.” Game engines, he says, “bring a lot of different procedures together in one software. So, while you can use Maya to animate and import that, the game engine can do animation” and has a host of other VFX tools that are sufficient. “The beauty of game engines is that they deal with all the lighting and shadow and color. A lot of that is ready made,” Brinson adds.

While user-friendly engine interfaces eliminate coding as a barrier to entry, a basic understanding, and even some visual scripting, can help, according to Okten, who majored in computer science as an undergrad. About half the USC Games students are enrolled in the Viterbi School and half at SCA. Megan Friedberg, who graduated in August with her master’s from Viterbi, has worked on VP projects at Disney and says “knowing how to visual script is extremely useful, even as a non-engineer or TD on set.” For those interested in cross-pollination, Friedberg helped design two introductory Viterbi classes — ITP115 and ITP116 — “intended for non-engineering majors who want to learn how to program.” The emergence of generative text-to-code AI suggests a little knowledge will soon go a long way, though human oversight will remain as important to the technological processes as it is to the creative endeavors. “AI, procedural learning, machine learning — it’s work, not magic,” says Patterson, who shares that “it took 610 prompts to generate 15 reference images and short videos” using Midjourney, Firefly and Blockade to create previsualization for his new projects. “That takes time, but the hours you put in might save days for your first AD, costume designer or production designer.” Practicalities aside, Patterson says “it allows you to dream, freeing the imagination to create characters and develop stories for any medium.”