‘Avatar: The Way of Water’ VFX Supervisors Talk Pandora, How Kiri Was Created & More – Interview
Avatar: The Way of Water is arguably the biggest visual effects-driven movie of the year. While upping the scale and epic scope of the first Avatar movie, the sequel strives to push the boundaries even further. One of the many ways it achieves that is through the wonderful team at Weta FX. I had the opportunity to speak with a few VFX supervisors for The Way of Water and discuss their process.
Firstly, I spoke with Wayne Stables and Dan Barrett. Stables is the VFX Supervisor who oversaw the jungle scenes, and he heavily credited Production Designers Ben Proctor and Dylan Cole for establishing the world of Pandora through their gorgeous concept art and designs. One of the many cool things he pointed out during our conversation was that all of the green plant life in the jungle reflects light onto the Na’vi’s skin.

Stables said, “When we got sunlight reflecting off the greenery, the blue Na’vi are bouncing and absorbing that light. So those greens and blues are everywhere…and a lot of the purple plants were personal splashes from James Cameron himself.”
Stables further detailed how many of those colors seen in the forest can also be found in the ocean, truly connecting the ecology of Pandora’s world. Of course, that parallel is clearest when Kiri plugs into the underwater “Cove of the Ancestors” which mirrors Jake’s first interaction with the Tree of Souls from the first film.

Amongst other interesting Cameron touches were the motions of certain plants. While most were relatively static, some plants interacted with the characters. Stables described a shot of the youngest member of the Sully family, Tuk, putting her hand inside a flower.
“James liked when Tuk (Trinity Jo-Li Bliss) reaches for a flower on set,” Stables explained. “So we animated it a little bit when she touches it. It’s a beautiful moment.”
That blend of live-action and animation is where Barrett steps in. Barrett is the Senior Animation Supervisor of The Way of Water, which makes him responsible for translating what was captured on set into the digital world. That includes movements, water simulations, and facial performances for the actors. Through Weta’s specialized performance capture technology, the VFX team of the first movie was able to pull surfaces and textures from the actors’ faces and translate them to the Na’vi. Improving upon the first, this sequel utilized a new “deep-learning” system that registers muscle movements underneath the surface.
“We hadn’t decomposed the face to sort of truly understanding what all of the muscles did. So this new system, that’s essentially what we did. You know, 170-odd muscle strains within the face,” Barrett said. “We had detailed actor puppets, so we could, you know, see what the actor looked like in a 3D space once we had the performance and then transferred those across.”
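The muscle decomposition Barrett describes can be loosely pictured as an inverse problem: given a basis of per-muscle surface displacements, recover the activation weights that best explain an observed facial deformation. The sketch below is a hypothetical, heavily simplified illustration using random toy data and a plain linear least-squares solve — Weta’s actual deep-learning system is far more sophisticated and nonlinear:

```python
import numpy as np

# Hypothetical illustration of a muscle-based face solve. All numbers and
# names here are invented for demonstration; this is NOT Weta's system.

rng = np.random.default_rng(0)

n_vertices = 500   # points on a toy face mesh
n_muscles = 170    # roughly the muscle count Barrett mentions

# Each column: the surface displacement produced by one muscle at full activation.
basis = rng.normal(size=(n_vertices * 3, n_muscles))

# Simulate an observed deformation from some "true" activations.
true_activations = rng.uniform(0, 1, size=n_muscles)
observed = basis @ true_activations

# Solve for the activations that best reproduce the observation.
solved, *_ = np.linalg.lstsq(basis, observed, rcond=None)

print(np.allclose(solved, true_activations, atol=1e-6))  # → True
```

The appeal of decomposing a performance this way is that the recovered activations are actor-independent: once you know which muscles fired, you can drive a different face — say, a Na’vi puppet — with the same activations.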
Breaking new ground on visual effects technology is nothing new for the team at Weta FX. But pulling off a massive project like The Way of Water requires all hands on deck. Since they shot on both virtual cameras and inside a large “Volume” LED studio, a lot of the animation needed to be prepared ahead of time. Making this movie required a unique merger of pre and post-production.

Barrett said, “I think effects artists tend to hate animators and vice versa. So, very early on, we sat down and knew that we’d have to work very much hand in hand, that our workflows would need to be parallel.”
Collaboration was key to making this movie work. That was a sentiment echoed by The Way of Water Visual Effects Supervisor Pavani Boddapati who said, “Visual effects is not a solo effort. It is a group effort. There is no visual effects that are done by one person sitting at a computer. It is a whole bunch of people collaborating.”
Boddapati started working with Weta in 2009 during the first Avatar movie and was largely responsible for the sequel’s Metkayina/Reef Village location effects. Before jumping into the computer graphics world, she attended architecture school at the Academy of Art in San Francisco. During that time, she took a few film classes that taught her the general principles of lighting and cameras. After her CGI-filled thesis that she shot in her home country of India, visual effects companies began knocking at her door. Thus, when she got the call for the first Avatar movie nearly 14 years ago, she moved to New Zealand, and she remains a proud female visual effects supervisor at Weta FX to this day.
“I remember my first day at Weta, I was in the kitchen, just standing there making coffee. And I can hear at least 12 – 15 languages, from all over the world,” remarked Boddapati.
Just a few years later, Boddapati would reconnect with her former Rhythm & Hues colleague Johnathan Nixon at Weta FX. Nixon grew up in Detroit, Michigan, and graduated from Savannah College of Art and Design. His buddy Aaron Ramos encouraged him to leave the United States for the first time and come to New Zealand in 2014.
Speaking to two people of color heading the Avatar visual effects team felt very inspiring. As an aspiring filmmaker, I was curious about their experiences in the industry. To my surprise, Boddapati and Nixon both described the Weta workplace enthusiastically.
“It’s a pretty diverse workplace, which is awesome,” Nixon said. “You get to meet people from not only different colors and nationalities, but just different walks of life… different economic status. That barrier isn’t there at Weta.”
While Weta’s worldwide reach connects effects artists from all different parts of the world, everyone on the team shares a love and passion for visual effects. They open their doors to new recruits every day, thanks to their “Assistant Technical Director” program. This is a Weta FX on-the-job training program for young people interested in any field of visual effects. Best of all, you don’t need any prior experience to participate.
Nixon said, “[Pavani] has an architectural background, I’ve got a fine art degree. There’s people with computer science degrees. There’s people who were dentists before. It’s all people from different backgrounds who just gravitated to the filmmaking process. And the community around visual effects has grown so globally that if you’re really passionate, you can pick it up and learn it. You don’t have to have a physics background or be a wiz at math. You don’t have to have some high-powered degree or spend a bunch of money.”
While many young and talented artists found their footing at Weta during The Way of Water, a solid group of Hollywood veterans was still steering the ship. Senior VFX Supervisor Joe Letteri has a career that dates all the way back to Cameron’s first underwater adventure The Abyss. He has a total of four Academy Awards with Weta FX and has worked with a number of A-list directors. When I asked how James Cameron’s approach is different, he said, “It’s the engineering approach. You know, Peter (Jackson) loves visual effects. Steven (Spielberg) loves visual effects. But Jim also likes the engineering aspect of it. He wants to know how it works under the hood. And if you have a problem, he wants to get in and help solve it.”

Amongst the many new inventions created to make Avatar 2 was their new robotic “eye-line” system – a pre-programmed monitor propped on the head of a 9-foot-tall Na’vi stand-in on set. This allowed the human actors to play physically opposite CGI characters in the same scene.
Letteri said, “His capture could be played live on the scene on a camera that was basically on, like, a Spidercam rig that we moved around the set so that the actors, like Jack Champion who’s playing Spider, could actually react to Stephen’s performance. It was a one-to-one performance live on set.”
This system helps integrate the pre-shot performance capture with the live-action production by adding context to each character’s relationship to the space. They even came up with the idea of “three-dimensional depth,” where every pixel in the image has its own depth. As a result, they had a live, real-time video feed of roughly what the finished visual effects would look like. For example, when Edie Falco’s character interacts with Stephen Lang, she acts next to an actual rendered-out look at a blue, recombinant Colonel Quaritch.
“So we can see in the live (rendered visual effects) and combine it with the performance capture, combined with the eye line monitor, now the actor knows where to look. So in real-time, you see the actor looking at the correct place in space, with everything combined graphically correctly,” Letteri said.
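The per-pixel depth idea Letteri describes is, in spirit, classic z-depth compositing: for every pixel, the layer nearest the camera wins, so a CG character can correctly pass in front of or behind a live-action actor. Here is a minimal toy sketch of that general technique, with all names, shapes, and depth values invented for illustration:

```python
import numpy as np

# Toy z-depth composite: live-action plate vs. a CG layer, where each
# pixel carries its own depth. Values are illustrative only.

h, w = 4, 4

live_rgb = np.full((h, w, 3), 0.8)   # bright live-action plate
live_z   = np.full((h, w), 5.0)      # plate depth, in scene units

cg_rgb = np.full((h, w, 3), 0.2)     # dark CG character
cg_z   = np.full((h, w), 10.0)       # CG starts behind the plate...
cg_z[:, 2:] = 2.0                    # ...but comes in front on the right half

# Per pixel: keep whichever layer is closer to the camera.
nearer = (cg_z < live_z)[..., None]  # broadcast mask over RGB channels
composite = np.where(nearer, cg_rgb, live_rgb)

print(composite[0, 0, 0], composite[0, 3, 0])  # → 0.8 0.2
```

Because the depth test runs independently per pixel, the same comparison happily handles a character half-hidden behind a set piece, which is what makes a real-time “graphically correct” preview like the one Letteri describes possible.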
This creative mixture of practical and computer effects adds texture and depth to each scene. When I asked Richard Baneham (Lightstorm’s Visual Effects Supervisor/Virtual Second Unit Director) what practical effect stood out the most, he talked about bringing the underwater creatures to life.
“We had a couple of rigs. One of them is essentially an underwater raft that has a very flexible neck that was steered by one performer and ridden by another one. It was able to give us movements and some gesture behaviors for the creatures. It was used for Skimwing and Ilu, but mostly for Ilu. This is a practical effect that allows us to bring a digital effect to life,” Baneham said.

A film as technically complicated as The Way of Water required a lot of problem-solving. When I asked Baneham which major effect stood out, he mentioned the challenge of translating Sigourney Weaver’s performance into her teenage character Kiri.
Baneham said, “When [Sigourney] was younger, she used to habitually sit with her jaw slightly forward. It was one of those things that we couldn’t tell without looking at references when she was younger. Bringing that to the character really made it feel life-like.”
The end result is a convincing, believable performance from Weaver and from all the members of the cast. In my Geeks of Color review, I complimented how natural the young child performers felt in their roles. Now we know why.
Every technique developed to make this movie served the filmmaking and the story. Most of these details are imperceptible to the average viewer, but they matter toward making something that feels real, true, and authentic. Everything was in service to the audience, from the “deep learning” facial software to eye-line monitors to 3D pixels.