THE DISPATCH · VOL. 1 · ISSUE 17 · APR 25 2026
Geeks of Color
NEWS

How Wētā FX Turned up the Heat for ‘Avatar: Fire and Ash’


Matt Fernandez
10 min

As it turns out, working in animation can literally be like rocket science.

If there is one film franchise in recent, if not all, of film history that is defined by its special effects, it’s James Cameron’s Avatar. Though Cameron had the story in his head for years, he had to wait for filmmaking and special effects technology to catch up to his vision before he even made the first Avatar film in 2009. As more films in the series have been made, like this year’s Avatar: Fire and Ash, the technology behind them has also continued to advance.

“James Cameron and Joe Letteri, who is our most senior effects advisor, have a very good eye for what’s right or not, which has helped push this technology further,” said John Edholm, Staff Research Engineer and Senior Team Lead at Wētā FX.

To handle the demanding number of shots and the technical complexity of the effects in the Avatar sequels, The Way of Water and Fire and Ash (which were originally planned as a single film), the special effects teams at Wētā FX developed a program called Pahi, named after a type of Tahitian canoe. Previously, animators were limited in their ability to simulate character interactions with water, according to Alexey Stomakhin, Principal Research Engineer at Wētā FX.

For example, an animator would have to model how water interacted with a character’s hair separately from how it interacted with debris in the water, and they would have to jump back and forth between simulations for every adjustment. Pahi enables artists to run all those simulations simultaneously.

(Image credit: Wētā FX - Avatar: Fire and Ash)

Avatar: The Way of Water won the Oscar for “Best Achievement in Visual Effects” in 2023.

Though they work in animation, Stomakhin and Edholm aren’t animators or artists; they’re scientists and researchers who just so happen to apply their expertise towards entertainment. Stomakhin compares his work to that of a structural engineer, who uses equations and builds models to determine a building’s structural integrity.

“We do the exact same things, we solve the same equations and use basically the same methods, but what we’re interested in is when [a building is] going to fall apart,” Stomakhin said. “We want to see them happen in the most spectacular way, we want them to crumble and see dust and parts break down in the most natural way. But under the hood, it’s the exact same equations that a computational physicist or engineer would use.

“But the focus is different. For engineers, the focus is on accuracy because lives depend on it…For us, it’s less about realism or accuracy. We call it believability. We can simplify models somewhat so that simulations don’t take forever to complete, because obviously we need to [finish] the movie on time, but it needs to look believable to the viewers.”

As the Wētā team transitioned to the third film in the series, they found they needed to develop an easier way for animators to create impressive yet realistic-looking fires so the film could live up to its title, Fire and Ash. Prior to their work, Stomakhin said that animators would try to use easy hacks to imitate the physics of fire.

“A simple way to make a fire is to make a couple of heat fields, make a velocity field, and then use some very, very simplistic physics to have heat, or temperature, evolve in some kind of velocity field,” Stomakhin said. “You’ll get something that looks kind of wispy and looks like fire, but it’s not exactly correct.

(Image credit: Wētā FX)

“[This method is] simple, it’s artist friendly,” he continued. “It’s very art-directable, as we say. So there’s a lot of knobs that you can tweak and make it look exactly how you want, but it doesn’t give you physics. A lot of artists end up sitting there tweaking the look and trying to achieve the physics where, you may think, ‘Well, that’s the job of some programmer to do that kind of thing, they should be focusing on more artistic tasks.'”
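The hack Stomakhin describes — evolving a temperature field through a velocity field with no real combustion model — can be sketched in a few lines. This is a purely illustrative toy (the grid size, velocities, and cooling rate are all made up, and Wētā’s actual tools are far more sophisticated): heat is injected at the bottom of a grid, carried upward by a fixed velocity field, and cooled as it rises, producing something wispy and fire-ish with no chemistry at all.

```python
import numpy as np

def fake_fire_step(temp, vel_y, dt=0.1, cooling=0.02):
    """One step of the 'cheap fire' hack: advect a temperature field
    through a fixed velocity field and cool it. No combustion
    chemistry -- wispy and fire-like, but not physically correct."""
    ny, nx = temp.shape
    ys, xs = np.indices(temp.shape)
    # Semi-Lagrangian advection: each cell samples the cell it "came
    # from". Positive vel_y samples from below, so heat drifts upward.
    src_y = np.clip(ys + vel_y * dt, 0, ny - 1).astype(int)
    advected = temp[src_y, xs]
    # Exponential cooling so the plume fades with height.
    return advected * (1.0 - cooling)

# Hypothetical setup: hot base row, constant upward velocity field.
temp = np.zeros((32, 16))
vel_y = np.full((32, 16), 10.0)  # 10 * dt = one grid row per step
for _ in range(8):
    temp[-1, :] = 1000.0         # keep injecting heat at the base
    temp = fake_fire_step(temp, vel_y)
```

After a few steps the hot column has climbed several rows and cooled along the way — exactly the “knobs to tweak” workflow Stomakhin mentions: changing `cooling`, `dt`, or `vel_y` reshapes the look directly, with no physical meaning behind any of it.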

Enter Kora.

Like Pahi, Kora is a Polynesian-named physics simulator designed by Wētā to help artists create visual effects simulations more accurately and efficiently. Where Pahi is used for water effects, Kora deals with fire.

Stomakhin, Edholm and the rest of their team applied their knowledge of fire physics, thermodynamics, chemistry and literal rocket science (one of their bosses at the time was a consultant for SpaceX) to build a program capable of simulating the most realistic possible combustion models.

“It produced fires that looked like nothing we’ve seen before,” Stomakhin said proudly. “That was kind of a big breakthrough.” However, that success presented an unforeseen issue.

“What we quickly discovered was that we can run it because we understand chemistry…artists can’t,” he said. “It’s really hard to explain to them what is happening underneath, and most importantly how you control it.

“Take the simple request: ‘How do you make the fire taller?’ It’s not obvious. What exactly do you have to change? Is it more fuel, more oxygen? Do you have to crank up the ignition temperature?”

Before being used in Avatar, the combustion-solving software was tested out on Kingdom of the Planet of the Apes. While working on that film, Edholm found that although the program could simulate physically accurate flames with multiple variables running simultaneously, artists weren’t using it because it was difficult to operate.

(Image credit: Wētā FX)

The solution was to create an artist-focused interface as an additional layer that provided animators with more intuitive controls and presets, enabling them to use the combustion and thermodynamics model much more easily and on a scale that allowed them to complete the thousands of shots required for Fire and Ash.

“These [simulations] that you see on screen are typically run for eight to 10 to 20 hours,” Edholm said. “Until [users] build up the intuition of what parameters to tweak, it can be hard to do a sim, get some feedback or notes on it and then need to know what you need to change in order to achieve that [result]. And then you need to wait overnight until you can actually see the results. One of the biggest challenges we were trying to solve was making turn-arounds quicker.”

Stomakhin and Edholm said that developing Kora was a unique project for them because of how involved James Cameron was.

“James Cameron really likes details, and he’s a bit of a physics geek himself, so you could actually talk to him about physics,” Stomakhin said. “I think a lot of the respect comes from the fact that the more physically accurate you make everything about a shot, the more immersive the experience of a person watching the movie would be. And you may not realize it, right?…We want people to watch the movie without realizing that this is all computer generated.”

One of the effects that Kora was used for in Avatar: Fire and Ash was a flamethrower. According to Edholm, because digitally generated characters interact with the weapon, Cameron couldn’t just use a real one in the film. A flamethrower works by shooting liquid fuel out of the weapon, which then mixes with air, vaporizes, and ignites. To achieve a believable effect in a movie, artists have to animate the entire process from beginning to end, which used to be long and frustrating. But not with Kora.

“We did our first flamethrower prototype for Avatar 2, and there were just a couple, and though they looked super real, they took forever to simulate. There was a lot of frustration for artists using that system,” Stomakhin said. “So we actually broke it down into smaller components. We let them simulate the stream separately and tweak that to perfection, and then we would set it on fire as a separate layer. That was a way to simplify the process.”
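The layered breakdown Stomakhin describes — iterate on the fuel stream by itself, then run ignition over the cached result as a separate pass — is a common decomposition pattern, and can be sketched abstractly (every function name and number here is hypothetical, not Wētā’s actual pipeline):

```python
# Toy sketch of a two-pass effect: pass 1 simulates just the fuel
# stream (cheap to iterate on), pass 2 ignites the cached stream
# as a separate layer. All names and values are illustrative.

def simulate_stream(frames, speed=2.0):
    """Pass 1: how far the fuel stream's lead point travels per frame."""
    return [frame * speed for frame in range(frames)]

def ignite(stream_cache, ignition_point=4.0):
    """Pass 2: mark cached stream samples past the ignition point as burning."""
    return [(x, x >= ignition_point) for x in stream_cache]

stream = simulate_stream(frames=5)   # artists tweak this pass to perfection
flames = ignite(stream)              # ignition then runs over the cached result
```

The point of the split is that the expensive, hard-to-control passes only rerun when their own inputs change, so an artist adjusting the stream never pays for the fire, and vice versa.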

Another prominent effect that Kora was used for was a fire tornado. Thankfully, for everyone not working on this project, fire tornadoes are exceedingly rare in nature.

“First of all, obviously nobody watches fire tornadoes on a regular basis in real life, so there’s a little bit of leeway in terms of what we can achieve,” Stomakhin said. “It is something a little bit out of a fantastical or otherworldly domain. It doesn’t necessarily need to look super real because nobody would be able to tell. [However] it should still look a little like fire, so we couldn’t completely throw the physics out the window.”

The team ran tests in Kora like dumping a bunch of fuel onto a fire to create an intense updraft of air. The team experimented with multiple variables, such as planetary spin and air buoyancy, to determine how a flaming, spinning column of air could theoretically be created. When those tests didn’t work, they “cheated” the physics.

“We just, you know, cheated the gravity a little bit…but we didn’t cheat completely, it was still physically inspired,” Stomakhin said. “It looked realistic enough that the director responded really well to it, and I think the viewers like it as well.

“It’s kind of like this delicate game you play all the time with visual effects, because on one hand, you want physics, but if you just do straight physics, it’s usually boring and oftentimes it also hurts storytelling,” he added. “Well, the story wins, but also you want it to be believable. And so it’s just kind of like a push and pull process.”

Recently, Gore Verbinski, best known for directing the first three Pirates of the Caribbean films, said that modern CGI and visual effects do not look as good as those in films from the early 2000s because of the widespread use of the video game software Unreal Engine rather than dedicated filmmaking tools.

Though Edholm and Stomakhin say they are impressed by how far gaming technology has come in recent years and how successfully it has been used on shows like The Mandalorian, they note that software like Unreal Engine is built to generate images in real time, so corners may be cut or quality compromised with that goal in mind.

“The business we’re in is we want the highest realism possible, so we don’t mind something running eight hours to see the first frame render,” Stomakhin said.

(Image credit: Wētā FX)

And what about artificial intelligence? Generative AI is under fire from much of the entertainment world for eliminating jobs, for churning out soulless, generic, and flawed content that some refer to as “slop,” and for the devastating environmental effects of data centers and server farms. Cameron himself has been very vocal about his distaste for the technology and banned its use in the Avatar films.

“It’s something we think about a lot as well. I mean, it could be that it will just become another tool for the VFX artists to use,” Edholm said. “I think that’s sort of how we’re trying to approach it…combining it with our more traditional kind of tools.”

Stomakhin thinks that the future of generative AI in entertainment lies in finding where it will be useful. He argued that AI exists on a spectrum of controllability and usefulness, and that even simple tasks like linear regression could be considered AI. Currently, he thinks the biggest issue with using generative AI for artistic purposes is that it’s difficult to control the results, and the programs will only give you about 80 percent of what you ask for.
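Stomakhin’s point about the low end of that spectrum is easy to make concrete: linear regression, one of the oldest “learning” techniques, is just fitting a line to data by least squares. A minimal example (the data points here are invented for illustration):

```python
import numpy as np

# Fit y = a*x + b to sample points by ordinary least squares --
# the kind of simple, fully controllable "AI" Stomakhin alludes to.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0                      # points on an exact line
A = np.vstack([x, np.ones_like(x)]).T  # design matrix [x, 1]
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
# a ≈ 2.0 (slope), b ≈ 1.0 (intercept)
```

Unlike a generative model, every knob here is interpretable and the output is fully determined by the inputs — which is exactly the controllability that, in Stomakhin’s view, current generative tools lack.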

“We use [AI] to some degree, but…more like as an assistant at the moment, and not like something that will fully do the work for you,” Stomakhin said. “It’s sort of the right way to approach it, because you don’t want to take artistry out of it. You don’t want the product to be responsible for that. You still want artistic talent, people, to be responsible for the vision, for the artistry.”

The Wētā team received eight nominations for the Visual Effects Society Awards and won seven of them, including “Outstanding Visual Effects in a Photoreal Feature” and an “Emerging Technology Award” for the Kora Fire Toolset. Avatar: Fire and Ash has also been nominated for the Oscar for “Best Achievement in Visual Effects.”

“It’s been many years put into this, and to get that kind of recognition is a real honor,” Edholm said. “It’s quite a bit disconnected from being in Hollywood, since most of Wētā is in Wellington and quite a few of us are remote. So it’s more of a technology job than a film job…but also working on these massive film projects, that’s really cool.”

“I think for us as scientists, it’s quite exciting,” Stomakhin said. “Just seeing the movie for ourselves for the first time is super awesome. Our typical day is just looking at the screen, doing a lot of coding, looking at some plots and whatnot. A lot of it is very mundane, not very visual research at times, and it can get quite daunting.

“But then at the end, where it all kind of comes together and turns into this beautiful image, that’s where most of my excitement comes from. That’s why I love this job. Because, at the end of the day, it’s using scientific methods to produce something visually appealing and beautiful.”

END.