Job Description: VFX Supervisor
Emmy-nominated VFX supervisor Justin Mitchell breaks down his work on The Nevers, HBO’s fantasy drama set in Victorian London, and explains how new technology will shape television production for years to come.
From Iron Man 3 and Captain America: The Winter Soldier to Justice League and Tomb Raider, VFX supervisor Justin Mitchell has worked on some of the biggest blockbusters to have graced cinemas in recent years. But with credits that also include Lost and Game of Thrones, he is well placed to see how television has grown to rival the big screen over the last decade.
“I started my career in television working on shows like Charmed and The Invisible Man, as well as doing some graphics on television for sports events,” Mitchell tells DQ from his home in LA. “But back in the day, television was a notch down in terms of the calibre of work that was done. It was certainly a lot more prestigious to work on feature film projects.”
What’s changed, he says, is the rise of streamers such as Netflix and Disney+, and the continued evolution of premium cable networks like HBO, which in recent years have been producing work that matches the quality of the most action-packed feature films.
“I’m not sure how much we’re going to go back to the times when we were viewing things in cinema [after the pandemic],” he says. “I’m sure a lot of people will go back to the cinema. But now people have 65-inch TVs at home with good sound systems, we’re really seeing a change in the tide. As far as the calibre of work goes, television is right up there with feature films.”
Working for ScanlineVFX, Mitchell’s most recent television job was on HBO’s The Nevers, a fantastical drama set in a steampunk vision of Victorian London where a supernatural event gives certain people – mostly women – abnormal abilities. Whatever their “turns,” the members of this new underclass find themselves in grave danger, leaving mysterious widow Amalia True (Laura Donnelly) and brilliant young inventor Penance Adair (Ann Skelly) to protect and shelter these gifted “orphans” from the brutal forces that want to annihilate their kind.
From the outset, the visual style of The Nevers was “certainly ambitious,” says Mitchell, with physical sets being digitally extended and backdrops of Victorian London being added into certain scenes. The impact of the pandemic, however, placed unforeseen demands on the show’s VFX team. Crowd sequences due to be filmed on location with a large number of extras suddenly had to be scaled back: the scenes were moved to an interior studio, where the smaller number of extras was digitally cloned to fill out the crowds.
But arguably the most challenging sequence in the series was one in which villain Odium (played by Martyn Ford) walks on water during a fight scene with Amalia. The sequence features in episode three, Ignition, which earned Mitchell and the show’s VFX team an Emmy nomination for Special Visual Effects in a Single Episode. The category, in which The Nevers competes with Star Trek: Discovery, The Crown, The Umbrella Academy and Vikings, will be decided when the winner is announced next month.
“It was very challenging to devise a unique way of showing somebody walking on water,” Mitchell says. “We’ve all seen the gag where they put Plexiglass under the water and someone just walks across the surface without affecting the water in any sort of supernatural way. But we really wanted to invent something unique that hadn’t been seen before, so that was a real challenge.”
To find their unique effect, the team studied small insects that can walk on water to see how they affect the water around them. “Surface tension is a property that exists at that scale that doesn’t really apply normally to humans. We obviously sink right through the water normally, but we took those ideas and then scaled them up,” Mitchell says.
“Odium, the villain who can walk on water, then bent the surface of the water the way an insect might when it walks on the surface of water. It was really like mixing different techniques and different disciplines to achieve that effect. There was a lot of cloth simulation involved that drove the fluid effects and added the details to the shots.”
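The scaling idea Mitchell describes can be made concrete with the Bond number, a standard dimensionless ratio of gravitational to surface-tension forces: values well below 1 mean surface tension dominates, as it does for a water strider. A minimal illustration (the length scales here are indicative, not anything from the show’s pipeline):

```python
# Bond number Bo = rho * g * L^2 / sigma compares gravity to surface
# tension at a characteristic length scale L.
RHO = 1000.0    # density of water, kg/m^3
G = 9.81        # gravitational acceleration, m/s^2
SIGMA = 0.072   # surface tension of water at room temperature, N/m

def bond_number(length_m: float) -> float:
    """Ratio of gravitational to surface-tension forces at scale length_m."""
    return RHO * G * length_m ** 2 / SIGMA

# A water strider's leg (~1 mm): Bo << 1, so surface tension supports it.
print(bond_number(1e-3))   # ~0.14

# A human foot (~0.3 m): Bo >> 1, so gravity wins and we sink.
print(bond_number(0.3))    # ~12000
```

The four-orders-of-magnitude gap is exactly why the effect reads as supernatural when the insect behaviour is scaled up to a person.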
Upon joining the project, which debuted on HBO in April, Mitchell teamed up with fellow VFX supervisor Johnny Han and VFX producer Jack Geist to develop their ideas for the show. For the water sequence, they had already done some testing with walking on trampolines and wet surfaces to figure out how best to shoot the scene, with Mitchell supporting their endeavours. He then travelled to the UK during filming, part of which took place in a water tank at Pinewood Studios – the home of James Bond and many Star Wars films.
“We had the actors and some stunt people in harnesses fighting in the water and then in a green-screen tank environment, which was then replaced with the park setting also shot in the UK,” he says. “I was there at the tank stage helping to supervise that with Johnny and then also on location when they were shooting the actual environment.
“A lot of what happened in the actual environment was shooting tiles or plates of the park that we would then stitch together to create a 360-degree panorama of the environment that was used as the backdrop to replace the green-screen footage that was shot on the tank stage. I was really involved early on, from the design stage through shooting, and then obviously in post-production as well.”
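Stitched plates like these typically end up in an equirectangular (longitude/latitude) panorama, where each pixel corresponds to a view direction. As an illustrative sketch of that mapping – not the show’s actual pipeline – this converts a 3D view direction into panorama pixel coordinates:

```python
import math

def direction_to_equirect(x: float, y: float, z: float,
                          width: int, height: int) -> tuple[int, int]:
    """Map a view direction (x right, y up, z forward) to pixel
    coordinates in an equirectangular 360-degree panorama."""
    norm = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, z)          # -pi..pi, 0 = straight ahead
    lat = math.asin(y / norm)       # -pi/2..pi/2, 0 = horizon
    u = (lon / (2 * math.pi) + 0.5) * (width - 1)
    v = (0.5 - lat / math.pi) * (height - 1)
    return round(u), round(v)

# Looking straight ahead lands near the centre of the panorama.
print(direction_to_equirect(0, 0, 1, 4096, 2048))
```

Replacing green-screen footage is then the inverse lookup: each camera ray is turned into a direction and sampled from the panorama as the backdrop.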
The team were also responsible for the concept and design of the Galanthi alien spaceship, which features in episode one.
“I always enjoy that, conceptualising and working with the filmmakers to work on the design of certain elements,” Mitchell continues. “That was a key element. It’s a unique and unusual spaceship design, something we haven’t really seen before, which melds together creature-like design elements with glass and light energy.
“Then, of course, there are the spots of light that radiate from the Galanthi during that episode that ultimately fall down upon the people of London. That was also an interesting challenge to create simulations of scale, as some are seen very close up and others are very broad.”
Another standout effects sequence, from episode five, Hanged, features Su Ping Lim (Pui Fan Lee) jumping off a building and using shields created by Nimble Jack (Vinnie Heaven) as stepping stones to the ground. That was also shot on a green-screen stage, with Lee wearing a harness as she jumped down real steps covered in green fabric.
“They had built the bottom part of the stage, but we essentially created a digital replica of the stage and then extended the top half of the environment,” Mitchell says. “Everything you see in the background was either a fully digital creation or built out of various photographic elements, like the crowd that was replicated in that shot.”
With a single VFX shot sometimes taking several months to complete, depending on its complexity, multiple departments are involved from the planning stages and then across pre-production, production and post-production.
“There are a lot of people who have to touch a shot at the end of the day. It’s a very time-consuming process,” notes Mitchell, who started out studying acting at the University of Southern California before switching career paths into VFX, a field that has advanced at an astonishing rate over his 20-year career.
“We’re certainly doing things today that were not possible when I first started working. Every year, the evolution of the technology is amazing. We feel like we’ve seen everything, but I don’t think that’s the case. We’re destined to see really amazing things in the future, particularly with the advent of artificial intelligence, which is going to be a real game changer for content creation. We’re probably a decade or so away from really seeing that take hold.”
While Mitchell agrees that anything that can be imagined can almost certainly be created already, he believes AI will help to create a larger number of effects more quickly and at greater cost efficiency, which is always a sticking point no matter how big a production’s budget may be.
What’s more, “we’ll see the creation of truly digital people and characters where we might even see the creation of digital celebrities, which is hard to imagine,” Mitchell says, “but I really think that that will be the case. You think about the cost of paying a star actor – there’s a lot of incentive to create digital characters that might be more cost effective.”
When creating visual effects, Mitchell likes to imagine each one as a short story. Rather than a building simply collapsing, the movement will be broken down into beats, much like the story in a script, beginning with a hairline crack in one part of the structure, for example.
“Then that opens up and we see there’s a supporting structure that has become compromised and a metal girder is starting to bend. Then that leads into the whole top part of the building starting to topple over,” he says. “There are certain moments where you lead the audience in the same way that an actor might with a certain part of the dialogue or certain actions.”
But while the use of AI Mitchell described might still be several years away, what is going to become more prominent in the near future is virtual production. Already in use on shows like Disney+’s Star Wars series The Mandalorian, virtual production is when a digital environment is created before shooting – rather than afterwards – and scenes are then filmed on a stage surrounded by LED screens as a computer changes the background environment in real time, as if the actors were really in a particular location.
“In that case, the environment can be changed in the computer at a moment’s notice and the actors are standing inside this environment. Rather than trying to imagine they’re on a sci-fi set or on the surface of a planet where a spaceship’s landing behind them, they can actually see it and then react to it,” Mitchell says.
“That’s something we’re really going to see a lot more in the near term because it allows a lot of control [that you don’t get] with traditional filming. Being on location, weather and crowd control and all of those sorts of things are real prohibitive issues when it comes to filmmaking.
“That’s a direction we definitely think has a lot of merit and is going to be important. You can basically change from one location to another in a split second, and that’s pretty remarkable. You get all of the interactive lighting from the screen as well, so rather than the actors being cast in green light because the lights are bouncing off a green screen, they actually have the light emitting from the appropriate environment, whether that’s a city street or surface of a planet.
“It really adds to the integration in the shot. And you’re capturing everything in camera, so all the nuances you get from shooting something in camera, whether it’s lens flares or light wrap, it all comes for free. Those are the things we painstakingly work hard on in visual effects to try to recreate when somebody is filmed in front of the green screen.”
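The “window into the virtual world” effect Mitchell describes depends on re-rendering the background every frame from the tracked camera position, using an asymmetric (off-axis) frustum fitted to the LED wall so parallax stays correct. A simplified 2D sketch of computing those frustum extents – the coordinates and dimensions are purely illustrative:

```python
# For a flat LED wall lying in the plane z = 0, an off-axis frustum is
# fitted from the tracked camera position so the wall behaves like a
# window: as the camera moves, the frustum becomes asymmetric and the
# background shifts with correct parallax.

def off_axis_extents(cam_x: float, cam_z: float,
                     wall_left: float, wall_right: float,
                     near: float) -> tuple[float, float]:
    """Left/right frustum extents at the near plane for a camera at
    (cam_x, cam_z), cam_z > 0, looking at a wall spanning
    [wall_left, wall_right] at z = 0."""
    scale = near / cam_z                     # similar triangles
    left = (wall_left - cam_x) * scale
    right = (wall_right - cam_x) * scale
    return left, right

# Camera centred on a 4 m-wide wall, 2 m back, 0.1 m near plane:
print(off_axis_extents(0.0, 2.0, -2.0, 2.0, 0.1))   # symmetric frustum
# Camera moves 1 m right: the frustum skews, shifting the background.
print(off_axis_extents(1.0, 2.0, -2.0, 2.0, 0.1))
```

In production systems this generalises to a full 3D off-axis projection matrix per wall panel, driven by the camera-tracking data.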
When Mitchell started in visual effects, a technique such as ray tracing – which simulates realistic reflections and refractions on objects – wasn’t possible, and when it did emerge, the technology was prohibitively expensive. Today, you have to stay ahead of the technology or risk being left behind.
“Now, those sorts of effects are considered run of the mill and ordinary, and we’re doing things that are far more complex with thousands of characters on screen at once, complex water simulations, global illumination and other more complicated lighting techniques,” says Mitchell.
“Being on top of things is critical. Otherwise you fall behind the times. That just doesn’t do.”