Dark Asset, the latest film project from director Michael Winnick (Guns, Girls and Gambling), is a tale of spies, microchips, and military scientists – one of whom is played by T-1000 himself, Robert Patrick. Talk about the perfect combo for an action thriller.

We won’t spoil the plot, but we can reveal some of the virtual production methods that Setareh Samandari, the film’s lead visual effects (VFX) artist, used to bring the project to life and accelerate the overall production workflow via previsualization (previs) in Unity. In this article, Setareh – who studied at the Gnomon School of Visual Effects in Hollywood – shares insights from her four years of experience as a VFX artist.

We’d love to hear a bit about the early production process for Dark Asset. What led you to choose a virtual set for this production?

Michael Winnick and I discussed whether we should build a virtual location or find an actual one. We talked through the benefits, and ultimately, it was both cost and time savings that led us to build a virtual set. We created two environments for use on LED walls, and two other scenes that were shot entirely on green screen.

A virtual set provided a host of other benefits due to the nature of working in real time with Unity. With previs, we could instantly discuss which shots would be CG, which would be shot on green screen, and how we would storyboard and choose shot angles – all over Zoom.

How did using real-time assets affect the dynamics on set? 

With Dark Asset – and indeed, with other projects that I’ve worked on, such as the music video “Breathe Free” for the artist Shani – using Unity meant that key creatives like the director, DP, production designer, and VFX supervisor could see the virtual set and interact with it directly in real time, which made things more efficient. Both communication and direction were clearer.

We could also do the keying on set and composite the virtual background to make sure the green-screened characters fit. All of this helped give Michael and the rest of the team a lot of options, and meant that we could remain close to our creative intent.
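The on-set keying Setareh mentions is chroma keying: pixels dominated by the green channel are treated as green screen and replaced by the virtual background. Here is a minimal, illustrative pure-Python sketch of that idea – not the actual Unity pipeline, which does this on the GPU with far more sophisticated mattes. The function name and threshold are assumptions for illustration only.

```python
def chroma_key_composite(foreground, background, threshold=50):
    """Composite foreground pixels over background pixels.

    Each image is a flat list of (r, g, b) tuples of equal length.
    A pixel is treated as green screen when its green channel exceeds
    both the red and blue channels by more than `threshold`.
    """
    out = []
    for fg_px, bg_px in zip(foreground, background):
        r, g, b = fg_px
        if g - max(r, b) > threshold:
            out.append(bg_px)  # green screen: show the virtual set behind
        else:
            out.append(fg_px)  # actor/prop: keep the live-action pixel
    return out

# Tiny example: one "actor" pixel and one green-screen pixel
fg = [(200, 180, 170), (20, 240, 30)]
bg = [(10, 10, 40), (10, 10, 40)]
print(chroma_key_composite(fg, bg))  # [(200, 180, 170), (10, 10, 40)]
```

Production keyers soften this hard threshold into a partial-transparency matte so hair and motion blur blend naturally, but the core test – “is this pixel green-dominant?” – is the same.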

It helped the actors as well. In cases where a green screen was used instead of LED volumes, we could use the same tools to provide a real-time preview composite of the actors on the virtual stage, and also give them a preview of camera moves.

All the lighting and virtual set dressing could be adjusted interactively, which was so much easier than the old way of going back into NUKE and figuring it out. It saved us a lot of time in post-production and was a bit of a lifesaver, especially after having done all the previs work. It meant that everyone was much more aware of what would happen on set.

That sounds like a big change for the better. Can you take us through a scene that you created using these real-time assets?

There’s one scene where our female protagonist is sitting in an office, and I built the 3D background for it in Unity. I could match the perspective of the live camera and integrate the real lighting with the green screen to make it look convincing.
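Matching a virtual background to a live camera’s perspective comes down to using the same projection model as the physical lens. The pinhole-camera sketch below is purely illustrative – in Unity this is handled by the Camera component and its projection matrix, and the function name and numbers here are assumptions, not anything from the film’s pipeline.

```python
def project_point(point3d, focal_length_px, principal_point):
    """Project a 3D camera-space point to 2D pixel coordinates.

    Uses the pinhole-camera model: a point (x, y, z) in front of the
    camera lands at (f*x/z + cx, f*y/z + cy) on the image plane.
    """
    x, y, z = point3d
    cx, cy = principal_point
    u = focal_length_px * x / z + cx
    v = focal_length_px * y / z + cy
    return (u, v)

# A point 4 m in front of the camera, with a 100 px focal length
# and the principal point at the center of a 640x480 frame
print(project_point((1.0, 2.0, 4.0), 100.0, (320.0, 240.0)))  # (345.0, 290.0)
```

If the virtual camera shares the real lens’s focal length and principal point, CG elements land where the plate expects them, which is what makes a green-screen composite read as one coherent shot.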

I had so much fun working with Michael to design the office style he wanted for the scene. I could just grab some assets and materials that I already had in Unity and lay out the office, while Michael directed me to get the look he wanted.

Source: Unity Technologies Blog
