Using Real-Life Environments in a Virtual Production with Brett Danton

Born in New Zealand and raised in Britain, Brett Danton spent the first two decades of his career as a photographer. His interest in photography and the moving image led him to become a director, providing creative stills and TVCs for commercial campaigns and projects.

Drawing on the technical overlap between shooting stills and moving images, and on his background as a cinematographer, the director is an eager adopter of the latest creative technologies – lecturing on behalf of Canon and helping the camera company develop its workflows, equipment and technology. His work, represented by NM Productions, includes campaigns for automakers Jaguar and Land Rover and music videos for pop-rock band Bastille.

Brett, fascinated by virtual production after his recent experiences with it, sits down with LBB’s Nisna Mahtani to share his thoughts on two of the industry’s leading production tools, Omniverse and Unreal Engine, and on how real environments can support the virtual process.

LBB> When did you first get into virtual production?

Brett> I was thrown in at the deep end on a first job for Hogarth and WPP, who wanted to see how far virtual production could go and where it might head in the future. That was interesting because we decided to go to Scotland and scan a forest with LiDAR, to see what kind of resolution we could bring into a virtual world.

We looked at two different cases. The first was whether it could function as a purely digital environment and be used as a background on a volume stage. The second thing we realized – and it doesn’t matter how good your digital environments are – is that you still need traditional in-camera backplates. So we had to work backwards to figure out how to make the digital assets sit alongside traditional ones.

I had never done 3D work before, but we used Omniverse, whose path-traced renderer gives lifelike reflections, and that’s what I’m working in today. We scanned an entire forest, and nobody had done that before! It took four weeks just to open the files, but we could see where the potential lay.
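The file sizes Brett mentions are the usual bottleneck with raw LiDAR. A common first step – not described in the interview, and sketched here in plain Python purely as an illustration – is voxel-grid downsampling: every point that falls inside the same small cube is collapsed into a single centroid point, shrinking the cloud before it ever reaches the renderer.

```python
import random
from collections import defaultdict

def voxel_downsample(points, voxel=0.05):
    """Collapse all points that share a cubic voxel (default 5 cm)
    into their centroid, reducing a dense scan to a manageable size."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        buckets[key].append((x, y, z))
    out = []
    for pts in buckets.values():
        n = len(pts)
        out.append((sum(p[0] for p in pts) / n,
                    sum(p[1] for p in pts) / n,
                    sum(p[2] for p in pts) / n))
    return out

# Dense synthetic "scan": 10,000 random points inside a 1 m cube.
random.seed(0)
cloud = [(random.random(), random.random(), random.random())
         for _ in range(10_000)]
thinned = voxel_downsample(cloud, voxel=0.1)  # 10 cm voxels
print(len(cloud), "->", len(thinned))
```

With 10 cm voxels the unit cube holds at most 1,000 occupied cells, so the 10,000-point cloud collapses by roughly an order of magnitude; production tools apply the same idea, just at far larger scale.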

If you look at most worlds that are made purely out of CG, they’re overkill. So this technique also works if you want to use the scan as a background reference, because all of a sudden you have a background that’s real and organic, and you can put real assets over it.

LBB> So the original project you worked on was a car driving through the woods?

Brett> Yes! This was used in videos and in a keynote speech at GTC [a tech conference] too, and we were very honored by that. However, if we went back to the Volvo campaign render and did it now, it would be 20x better. It was made a year and a half ago as a test on software that was still unstable. The software has evolved a lot since: new pieces have been released, the rendering is much better and it’s much more stable.


For me this project was the first time I’d seen anything like it, and as far as I was concerned the light looked right and the reflections were correct – that was my excitement about it. As a cinematographer, I can then take that scene of the forest in Scotland and light it like we normally would on set. Better, actually, because if I suddenly want to put a 100-meter softbox over it, I can do that in a virtual world.

LBB> What are the most exciting areas of virtual production as you look to the future?

Brett> Suddenly you can do things you’ve always dreamed of doing. Yes, scanning is quite expensive and time-consuming – there’s a lot involved – but once you’ve got it, you’ve got everything that goes with it. When you need to shoot a car you need lighting and generators, and the budget goes up and up. If you need to mount a 30ft softbox over the roof of a car in the middle of a Scottish forest, you can do it from your desktop. Suddenly my desktop has become my whole studio. Originally it felt like sacrificing look and lighting, but I don’t feel like that’s happening now – the light reflects off the surfaces of the cars, for example. It needs to be pushed further, but it seems to get better and better every three months.

Once AI can step in and analyze the scene, I’m sure things will improve as well. Currently the software struggles with things like weeds, so someone needs to work on telling the software what they are. I’ve also spoken to lens manufacturers, who I think should be selling virtual equivalents of their physical lenses so you can use them on a virtual-world shoot – then you could assign those lenses to a DoP. We said it wouldn’t happen with stills, but it did, so I guess that’s what opens up. Omniverse also includes physics, so you can see things like the suspension moving, the car taking off and the camera tracking it.

LBB> Talk to us about the software – you mentioned Omniverse, but do you use others, such as Unreal Engine?

Brett> We jump back and forth between different pieces of software. Omniverse is the one we’re using at the moment, but we just finished a project for the metaverse which, funnily enough, was in Unreal Engine. For that project Unreal Engine was better suited – its real-time rendering is faster and it could drive the content on an LED wall. They each have their pros and cons, and I personally believe people need to learn both.


LBB> Many say you can teach yourself how to use virtual production software. What would you say to that?

Brett> So for the work we did with Bastille, I had never used Unreal Engine before. I taught myself how to do it, although I’m sure anyone who actually knew the software would look at it and have an absolute heart attack. But it’s out there. The clip we rendered was finished as a 360 VR piece, and we used the same city we’d built in 360 to produce the video for another single, released as a normal clip. That music video has been viewed 650,000 times in three weeks – and it’s by someone who had never touched Unreal Engine before.

I would say both have a steep learning curve. I could see the industry heading here about five years ago and I tried the technology, but eventually I gave up. When the job came, though, I focused and taught myself. Depending on the job, you may only have to create a mock-up, after which specialists in each area are at your side. There are different ways of doing things depending on your budget.

LBB> When you talk about virtual production, which parts are you talking about? And why does it naturally become a fusion of different aspects?

Brett> That’s what I call a big conglomerate. What we’re trying to do is build a pipeline that runs from the real world into virtual production. The assets we created for the Bastille piece in Unreal Engine came from having the band on a volume stage with a big LED wall – a 1.5mm pixel-pitch screen – to shoot against the environment. We had fans live-streaming in as avatars from around the world, and we chose that screen because it let us keep the sharpness practical, whereas some of the larger pixel pitches don’t come through as sharp on camera.

Because of this, the band could literally reach out and touch the fans performing as avatars. Then, for the VR experience, we incorporated that back into billboards built in virtual space and rendered in 360. In short, it was achieved using all the available technologies. If they wanted, the band could give another virtual concert in that city on the billboards. The whole idea was to create a music experience, and at that point people were really looking for other ways to release music.


LBB> Things like virtual location scouting and intellectual-property questions now come into play. Can you tell us more?

Brett> It’s an absolute minefield. The thing is, we’re not using the original material. We could use some nice wide aerials, but you wouldn’t know where in the world those angles are. We can also scan an environment and change things: I can take a mountain and move it to the left, or decide I don’t like this square and drop in another one. If you do all that, suddenly your location doesn’t look like your location anymore. I think there are two ways of doing things. You can either scan assets and capture them yourself, or you can buy a virtual asset – like a virtual Leaning Tower of Pisa – to drop in. In that case, however, I assume you’d have to pay license fees.

There is also really easy-to-use software out there that lets you go out with your iPhone, do a LiDAR scan, drop it into a 3D world and rough everything out. It was developed for DoPs who want to create a physically accurate model. Yes, it’s pixelated, but it’s good enough for most things.
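To illustrate how lightweight those phone scans can be to move around, here is a minimal sketch – an assumed workflow, not any specific app – that dumps raw XYZ points into an ASCII PLY file, a lowest-common-denominator format that most 3D packages can import:

```python
def write_ascii_ply(path, points):
    """Write XYZ points as a minimal ASCII PLY file, a simple
    point-cloud format widely supported by 3D tools."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

# Three corners of a hypothetical phone scan.
write_ascii_ply("scan_proxy.ply", [(0, 0, 0), (1, 0, 0), (0, 1, 0)])
print(open("scan_proxy.ply").readline().strip())
```

A real scanning app would of course carry far more data (normals, color, confidence), but even this stripped-down form is enough to block out a location in a 3D viewport.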

LBB> What hurdles or aspects are you currently trying to overcome?

Brett> The biggest thing is the amount of processing power you need – that’s a massive stumbling block for most people. The other is getting the different pieces of software to talk to each other and recognize devices, because what we find is that nobody has an out-of-the-box solution, so you end up with five different scanners of different types. Each one has its own proprietary format and way of capturing data, so what is meant to be a universal format ultimately ends with none of them talking to each other. Because of this we spend a lot of time stitching everything together, and this workflow is one of the hardest things we’ve ever done.
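The interoperability problem Brett describes – every scanner exporting its own flavor of data – is typically handled with a thin adapter layer that normalizes each vendor dump into one shared representation. A toy sketch (both vendor formats here are hypothetical, invented for illustration):

```python
def parse_csv_xyz(text):
    """Hypothetical vendor A: comma-separated x,y,z per line."""
    return [tuple(map(float, line.split(",")))
            for line in text.strip().splitlines()]

def parse_space_xyz(text):
    """Hypothetical vendor B: whitespace-separated x y z per line."""
    return [tuple(map(float, line.split()))
            for line in text.strip().splitlines()]

# One parser per format; the rest of the pipeline only ever
# sees the shared list-of-(x, y, z) representation.
PARSERS = {"vendor_a": parse_csv_xyz, "vendor_b": parse_space_xyz}

def to_common(fmt, text):
    return PARSERS[fmt](text)

pts = to_common("vendor_a", "0,0,0\n1.5,2.0,3.0")
pts += to_common("vendor_b", "4 5 6")
print(pts)
```

Real pipelines face the same shape of problem at vastly larger scale – coordinate systems, units and metadata all differ per device – which is why stitching the workflow together takes the time Brett describes.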

In addition, a lot of the equipment was not designed for this kind of work – suddenly you have industrial mining equipment that you want to use in the film world. We’re working on bringing everything together. Things like covid-19 slowed the process because we had to wait for hardware or new gear to show up. Sometimes we have a conversation and agree to revisit it in six months, when new hardware comes out.
