Since Unreal Engine would be used throughout the creation process, Magnopus made use of its built-in virtual scouting tools to get their cameras set up so they could test the lighting before diving into the special effects. But first, they needed the performance.

The benefits of virtual production for music 

Unlike most motion capture shoots, where everyone can be together in one place, The Madison Beer Immersive Concert Experience was a remote affair driven by teams across the US. In LA, Madison Beer performed in a mocap suit and head-mounted camera. In Philadelphia, Hyperreal CEO Remington Scott directed her in real time, using a VR headset that not only allowed him to view Madison’s avatar face-to-face live within the virtual Sony Hall, but also to adhere to the COVID-19 restrictions that were keeping them apart.

Because Unreal Engine operates in real time, virtual productions can use its remote collaboration tools to stream 3D environments anywhere in the world, completely synced across locations. This allowed Madison’s performance to be recorded in one continuous take, with no cuts and no edits, which was important for a team that wanted the performance to feel as authentic as possible.

After the motion capture shoot was completed and the experience was polished, cameraman Tom Glynn built out the shot selections for the final 9.5-minute performance.

“There are moments where you can’t believe this was done in a game engine,” says Tom Glynn, Managing Director at Gauge Theory Creative. “There’s a 3D world with a performance happening, and it’s happening at the same time that I’m moving a camera around. It’s hard to believe what I was seeing in the viewfinder while I was shooting it. I’m looking at an avatar of Madison Beer and it feels like a real person I’m standing in the room with. It kind of blows my mind.”

Source: Unreal Engine Blog