Talking Virtual Production with Shutterstock Studios’ Drew Weigel

Jesse Radonski | Virtual Production

Has the thought of skyrocketing location fees or unpredictable weather ever dampened your filmmaking spirit? Virtual production offers a valuable solution. Whether you’re the director of photography or a 3D artist creating real-time 3D assets for film and TV, this filmmaking technique has transformed the way visual content is created. And companies are already reaping the benefits.

But how much time and effort does it take to adjust to the workflow? Probably not as much time as you think.

Shutterstock Senior Director of 3D and Immersive Drew Weigel sat down with us to shed some light on Shutterstock Studios’ virtual production capabilities and how TurboSquid makes it easy to create custom projects without having to burn time and money on travel to faraway destinations.

If you’re unfamiliar with the name, Shutterstock Studios is a global collective of production and marketing specialists from every sector—commercial, TV, and traditional production to 3D and beyond.

Can you tell us how TurboSquid factors into the creative and logistical decisions for your virtual production?

DW: So often on commercial production schedules, we’re in a race against the clock and need the highest-quality assets to build out something that fits within our timeline and budget. You see, having a quality library to tap into for pre-rendered and real-time assets is pretty essential. TurboSquid is a necessary piece of the virtual production puzzle, whether we’re combining things or using its library to augment things we’ve already built.

How much time does Shutterstock Studios have to complete a project?

DW: I’d love to say that we work on a project for months, but the reality is we generally get about four weeks of preparation. It depends on the project of course, since building a set extension of a house for shooting XR is a much smaller task than building an underground cave.

Larger environments and fantasy worlds take a lot more time to create, so I’d never promise an alien world on that type of timeline. Right now, we’re working to make our clients feel comfortable with virtual production technology, leveraging anything that makes us more efficient—efficiencies like stock assets for set dressing, existing environments, and things that can convert easily. If we can easily change the color or texture of an object, that’s a good thing to keep in our asset library.

Does the TurboSquid library factor into any of the testing phases?

DW: We use a lot of assets for pre-visualization. When you need to quickly composite and build out a scene, I believe it’s essential to prove an idea before you even get into production. During this phase, we’re gauging whether we have the right type of real-time assets or whether we need to create a custom model for a plane, train, automobile, or whatever.

Whether we’re working natively in Unreal Engine or setting up mock environments in Cinema 4D or Blender, building a scene with TurboSquid assets is extremely useful for getting a full picture of where we’re going once we move into production.
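To make that concrete, here is a minimal sketch of what that kind of previz blocking can look like using Blender’s Python API (bpy). The file paths, prop names, and positions are hypothetical placeholders, not part of Shutterstock Studios’ actual pipeline.

```python
# Rough previz sketch in Blender's Python API (bpy): batch-import a few
# downloaded FBX props and block them out in a scene. File paths, names,
# and positions below are hypothetical placeholders.
import bpy

# Hypothetical local folder of assets downloaded for previz
PROPS = {
    "props/limestone_rock.fbx": (0.0, 0.0, 0.0),
    "props/wooden_table.fbx": (2.5, 1.0, 0.0),
    "props/potted_plant.fbx": (3.0, -1.5, 0.0),
}

for path, location in PROPS.items():
    # Import the FBX; the newly imported objects end up selected
    bpy.ops.import_scene.fbx(filepath=path)
    for obj in bpy.context.selected_objects:
        obj.location = location  # rough blocking only; refined later in layout
```

The same idea applies whether the mock environment lives in Unreal Engine or Cinema 4D; the point is simply to get rough geometry into a scene quickly so the team can judge composition before production.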

Can you explain what Shutterstock Studios’ asset selection process is like for virtual production?

DW: We do so many different types of projects with commercial executions on XR that it really depends on where the virtual setting is located. Geography can be a major component of a virtual production project, so some of the assets we need already exist on TurboSquid; however, there’s still a lot of customization to be done to ensure the scene matches the look and feel of the brand.

We’re currently working on a project situated virtually inside a vast limestone cave. Some of the topologies are easier to generate using in-engine tools, but we always need to populate a scene with props and there’s no better place to start for that stuff than TurboSquid.

How often do you modify models?

DW: We frequently modify models, but how often depends on the project. Some assets can more or less be used right away, such as foliage and rocks. But we may need to change the textures or materials on other objects. Sometimes we use TurboSquid assets to inspire a custom model we’ll build.

One thing we enjoy doing in virtual production is building our sets specific to the brand we’re working with. For example, if we’re creating an extension of a house, we can take existing models and update the materials and textures to match the brand’s visual identity. So if we’re working on an ad for Delsym, we can easily create an all-orange interior; likewise, if we’re creating a living room set for Lysol, we can accent it blue like the brand colors. Having that level of flexibility and control enables us to do art direction in ways that previously would have been much more difficult to execute.

TurboSquid essentially became the virtual prop house for the Shutterstock Studios art department. Plants, tables, basically anything available for commercial use is something we can pull in to build out these scenes. It saves us so much time.
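As a rough illustration of the kind of brand-color retexturing Drew describes, the snippet below uses Unreal Engine’s editor Python API to re-tint an existing material instance. The asset path, the “BrandColor” parameter name, and the color values are hypothetical and depend on how the source material was authored.

```python
# Minimal sketch (Unreal Editor Python): re-tint a material instance toward a
# brand color. The asset path and "BrandColor" parameter name are hypothetical.
import unreal

instance_path = "/Game/Props/LivingRoom/MI_Curtains"   # hypothetical asset path
instance = unreal.EditorAssetLibrary.load_asset(instance_path)

# Set a vector parameter on the material instance, e.g. a brand-style blue
brand_blue = unreal.LinearColor(r=0.05, g=0.25, b=0.85, a=1.0)
unreal.MaterialEditingLibrary.set_material_instance_vector_parameter_value(
    instance, "BrandColor", brand_blue
)
unreal.EditorAssetLibrary.save_asset(instance_path)
```

Exposing color and texture choices as material parameters is what makes these brand-specific tweaks cheap to apply across an entire set rather than remodeling anything.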

How long does it take to swap in and out models when you’re on set?

DW: If they’re real-time assets or even if they’re quality FBX files, usually we can simply download, drag, and drop it into a scene on the fly. It depends on what we’re attempting to alter, but we try to have alternative assets loaded up already on set so it doesn’t take long.

Having a library of different options preloaded so we can quickly swap them in and out is essential. However, we’ve had some last-minute changes where we jumped on TurboSquid to get some curtains or art to add to a scene.

Virtual production offers some flexibility, but just like traditional live-action filming, the decisions you make in advance are extremely important once you’re on set. That preparation enables you to focus on the good stuff, like direction and lighting.
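For readers curious what the kind of last-minute swap Drew describes can look like under the hood, here is a minimal sketch using Unreal Engine’s editor Python API to import a freshly downloaded FBX and drop it into the current level. The file path, destination folder, asset name, and spawn transform are hypothetical.

```python
# Minimal sketch (Unreal Editor Python): import a downloaded FBX and place it
# in the open level. Paths, names, and the transform below are hypothetical.
import unreal

task = unreal.AssetImportTask()
task.filename = "C:/OnSetAssets/curtains_replacement.fbx"  # hypothetical download
task.destination_path = "/Game/OnSet/Swaps"
task.automated = True
task.save = True

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

# Place the newly imported mesh in the level at a rough position
mesh = unreal.EditorAssetLibrary.load_asset("/Game/OnSet/Swaps/curtains_replacement")
unreal.EditorLevelLibrary.spawn_actor_from_object(
    mesh, location=unreal.Vector(0.0, 0.0, 0.0), rotation=unreal.Rotator(0.0, 0.0, 0.0)
)
```

In practice, as Drew notes, the faster path is to have these alternates imported and preloaded before the shoot, so the swap on set is just a matter of dragging in an asset that has already been prepared.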

It sounds like lighting is a big part of virtual production.

DW: One hundred percent. We often go on set the day before and pre-light the scene to calibrate the foreground and background lighting. We need to make sure everything within the Unreal Engine scene is projected on the wall uniformly, because it’s the fine details that take this from feeling like a green-screen comp to something more integrated. It also helps the on-set talent feel immersed in a scene, no matter how fantastic the setting.

To me, it’s best when you treat 3D lighting like physical, real-world lighting. So when we’re working on set with a director of photography (DP), there’s a shared language and it doesn’t have to be people butting heads between what’s real and what’s not real. I don’t think it was as easy to bring in a DP into a fully pre-rendered environment when working on fully utilized green screen shoots.

Do you have any other tips for working with 3D models on a virtual production set?

DW: It’s always going to save time to test your assets beforehand and come prepared with them already loaded onto a device. I don’t recommend going to an online library to search for replacement assets when you’re under a tight deadline for an on-set shoot, but it’s doable if absolutely necessary. It’s just a matter of knowing to search for assets in universal formats that translate to real-time 3D models with clean textures. Usually, we have either a digital asset manager (DAM) setup or some sort of quick-access storage that we can pull things from depending on what we need to do.

Aside from having the equipment to download and work with the assets we need, another quick fix we’ve used is Megascans for changing textures or interiors on the fly.

Because we do a lot of heavy previz, we don’t get a ton of requests to change a car into an airplane, or anything that drastic. We try to get all of those major things approved ahead of time so that the fixes on set are more creative: Are the blocking and lighting right? Should we remove those curtains? That’s where we try to focus when we’re on set.

Are there any technologies or techniques you’re excited about trying in virtual production?

DW: I’m very interested in trying out some live mocap options. I’ve been talking to Zero Space recently about their ability to volumetrically capture talent and then put them into environments on the fly, so you can get different shots offering flexibility with how you shoot your featured and background extras.

One of my favorite things about this type of wall setup is that you can rotate an entire scene 180 degrees and turn your actors 90 degrees to get a whole different set of shots. That sort of spatial Rubik’s Cube thinking is a lot of fun, and it’s practical, too, because you don’t need to set up a bunch of different shots within a single location.

What do you think Shutterstock Studios’ clients want from virtual production?

DW: Virtual production is still a new technology, so I believe trust is a big part of what they need. The way we earn their trust is by being upfront about what the technology can and can’t do, and helping them visualize that early and often. Then we make the previz process feel familiar, similar to our fully live-action production processes.

We get the director and storyboard artists involved so we can introduce everyone to the real-time technology, moving them through the world and giving them a sense of what will be projected onto the wall. It’s technology and things can go sideways, so this relationship building makes everything easier in the long run.

When you’re beginning work with a new client and building that trust, do you usually take key stakeholders through a virtual production set?

DW: It’s totally helpful because I believe many people think they understand virtual production in the abstract or have seen videos of it, but showing people the way the perspective can be distorted gives everyone an understanding of how you can only trust the camera lens.

If you only use a monitor to guide people through an environment inside a real-time game engine, it can look like you’re walking through a video game. It’s not an incorrect assumption, by the way … it does look like that. What changes is the incorporation of lensing, lighting, and cinematography—things that we know from 125 years of making movies—that bring it all to life. Showing people who are new to the process how virtual production works on a stage is very eye opening.

Is there anything else you’d like people to know about using 3D models in virtual production?

DW: One of the cool things we’ve been getting into is spatial capture like NeRFs and Gaussian Splatting, creating virtual environments from existing photography and video.

But really, the ease of using the existing assets on TurboSquid is something we don’t take for granted. That was always true in 3D production, but when you’re trying to match commercial production schedules to XR, which is more time and labor intensive on the previz side, having TurboSquid as a sort of toolbox makes our virtual production efforts so much more possible.

Want to learn more about Shutterstock Studios and its virtual production capabilities? Head to studios.shutterstock.com.


Want access to the 3D models the pros use?

Shutterstock Studios uses many of the same TurboSquid models you’ll find in the library today.
