Pixar's Elemental hit U.S. theaters on June 16, and it was called out for taking 3D visuals to new heights. The movie has taken in about $395 million in box office revenues worldwide. It was Pixar's 27th film since 1995, and once again it involved adopting a new kind of technology.
Pixar, a division of Disney, began working with Vast in 2018 and used the company's volumetric rendering technique in its acclaimed 2020 film Soul. For the next film, Pixar asked whether Vast could handle three times more data than Soul required.
However, unlike the geometric surfaces and materials used in earlier Pixar projects, the broadly applied volumetric animation methods in Elemental turned out to produce six times the data footprint and computational demands of Soul.
In Soul, when the lead character dies and goes up to the soul world, you notice that the apparition has a kind of glow to him. All of those light particles were actually simulated in 3D by Pixar, at that level of granularity, rather than painted as a bunch of dots. They weren't traditional models; they were simulations, said Eric Bermender, head of data center and IT infrastructure at Pixar Animation Studios.
“One of the important things to realize is that it’s the story and the animation that dictates our pace of innovation here at Pixar. We have some of the most talented storytellers in the world, and they come up with concepts that nobody’s ever thought of,” Bermender said in a video. “And so our technology needs will be dependent on what the story is calling for.”
The challenge had to do with file caching. With older characters like Woody and Buzz Lightyear, Bermender said, there was a single model with textures on it, and it was the same model in every shot. A volumetric character, by contrast, comes out of a simulator: each frame of each character is independent and different from the one before it. Traditional models weren't going to work.
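The difference Bermender describes can be sketched in a few lines: a traditional character is one reusable asset cached once and shared across every shot, while a volumetric character writes an independent simulation cache for every frame. The sizes and frame counts below are purely illustrative assumptions, not Pixar's actual figures.

```python
# Rough sketch of why per-frame volumetric caches dwarf reusable models.
# All sizes here are illustrative assumptions, not Pixar's real numbers.

def traditional_footprint(model_gb: float, num_frames: int) -> float:
    """One rigged model plus textures is cached once and reused in every frame."""
    return model_gb  # footprint is independent of frame count

def volumetric_footprint(cache_gb_per_frame: float, num_frames: int) -> float:
    """Each frame stores an independent simulation cache (e.g. voxel grids)."""
    return cache_gb_per_frame * num_frames

frames = 24 * 60 * 90                             # ~90 minutes of film at 24 fps
mesh = traditional_footprint(2.0, frames)         # a hypothetical 2 GB reusable asset
volume = volumetric_footprint(5.0, frames)        # hypothetical 5 GB of voxel data per frame

print(f"traditional: {mesh:,.0f} GB")             # stays constant
print(f"volumetric:  {volume:,.0f} GB")           # grows linearly with frame count
```

The point is not the specific numbers but the scaling: the volumetric footprint grows linearly with frame count while the traditional one does not, which is why caching strategies built for reusable models break down.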
“The animators were just starting to talk about a new style of film production called volumetric rendering,” said Jeff Denworth, cofounder of Vast Data, in an interview with VentureBeat. “What volumetric rendering requires you to do is — as opposed to just creating a scene — actually go and simulate the characters in 3D. So it’s a full simulation down to almost the atomic or the molecular level.”
Making the film
You can see this just by looking at the film, which features the characters Ember and Wade, who are made of fire and water, respectively.
“It was really important for Pete that Ember be made of fire, and that when she moved, she moved in a fire-like way and not adhere to a strict, skeletal structure,” said visual effects supervisor Sanjay Bakshi, in a statement. “For example, when she reaches for something, her arm can stretch and get really narrow, like fire can. Ember needed to be able to really change shape and be amorphous. While our animators have a lot of tools at their disposal to make a character like Ember angry—from changing her posture, her eyebrows, and her facial expression—we also wanted to change the characteristics of the fire when she got angry.”
Pixar said that to create Ember and Wade, in addition to a complex backdrop, another phase of production was added to run simulations on the characters in every frame of the film. The filmmakers adjusted their pipeline to allow more time after animation to tackle the massive effects and complex lighting needs.
Pixar said it tapped resources at Disney Research Studios in Zurich, Switzerland, to help shape ideas into technological innovations. According to Bakshi, this allowed them to organize the flames into more stylized shapes using a machine learning technique called Volumetric Neural Style Transfer. Pixar wanted the flames around Ember to look real without being hyperrealistic. The company said it was about getting a bunch of experts—a fire expert, a shading expert, an animation expert, a rigging expert, and a lighting expert—in the same room and iterating until it was right.
Ultimately, Pixar turned to a company that wasn't nearly as well known, Vast Data, to build a bigger data cluster.
Denworth said the company was founded at the start of 2016 on the idea of building a system that over time would be used for artificial intelligence workloads, where it was designed to make infrastructure much simpler as enterprises walk the path towards the AI future.
In its early days, the company positioned itself as a maker of software for a flash-based storage product that could be deployed in the data center or, eventually, in the cloud.
“We found ourselves talking to customers about workloads that essentially were at the intersection of very high levels of performance but also large data capacities,” Denworth said. “What we built is a next-generation distributed systems architecture. We think it is the first new style of a distributed database system since Google introduced a concept called the Google File System about 20 years ago. And it’s a system that is just designed for massive levels of scale, massive levels of data processing, massive amounts of efficiencies, such that you can afford flash for all of your data.”
The company is now valued at around $4 billion and has about 550 people. Vast Data is a remote-first company based in New York. It’s also cash-flow positive.
“It’s a rare mix of efficiency and growth that you typically don’t find in any startup,” Denworth said. “The way that we’ve achieved that is by dealing with organizations that only have very large appetites for infrastructure and, consequently, can cut large checks.”
Working with Vast Data
To meet the technical requirements of the Pixar creative team, Pixar turned to the Vast Data Platform to power the demanding render pipeline required for Elemental and future productions.
“When I met with the founders of Vast that first time, I was blown away by how they had not only the answers to how they were going to fix all our problems, but they actually understood the engineering required to fix those problems,” Bermender said. “It’s been fantastic working with Vast over these last four years. The Vast platform has allowed us to build on our current level of animation into future levels of animation that nobody has even thought of.”
By moving up to 7.3 petabytes of data into a single datastore cluster, managed across one high-performance namespace built on low-cost flash, Vast can provide real-time access. That keeps Pixar’s render farm constantly busy while enabling improved observability and analytics.
“We had to come up with a new solution that allowed us to have high-speed flash access to our data and Vast had this great proposition that allowed us to get the performance we needed and the capacities we needed in order to support this new volumetric pipeline,” Bermender said.
Moreover, it simplified collaboration and development across multiple productions running at the same time, while laying the foundation to leverage AI and deep learning for future films.
“Pixar is the world’s leading innovator in animation, and in order to continue to deliver new stories, breath-taking visual environments and memorable characters – we require the industry’s most innovative technologies to help us bring the animators’ vision to life,” said Pixar’s Bermender, in a statement. “‘Elemental’ is the most technically complex film that Pixar has ever made, but with VAST we were able to build beyond our current level of animation and consider new techniques that no one had ever considered before because we didn’t have the technology in place to support it. VAST’s technology has allowed us to change the way we store and access data while also opening the door to new potential visual pipelines.”
The compute requirements for Elemental demanded fast and concurrent data access from hundreds of thousands of processors used in the render pipeline. Vast delivered fast, uninterrupted performance even during the film’s peak rendering usage, which required nearly two petabytes of capacity at one time, compared with films over the previous five years that used only about 300 to 500 terabytes.
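Taking the article's figures at face value, the size of that jump is easy to check with back-of-envelope arithmetic (treating 1 petabyte as 1,000 terabytes), and it lines up with the roughly six-fold increase over Soul cited earlier:

```python
# Back-of-envelope check of the capacity jump described above (1 PB = 1,000 TB).
elemental_peak_tb = 2_000            # "nearly two petabytes" at peak rendering
previous_range_tb = (300, 500)       # typical range for films in the prior five years

ratio_low = elemental_peak_tb / previous_range_tb[1]   # against the high end of the range
ratio_high = elemental_peak_tb / previous_range_tb[0]  # against the low end of the range

print(f"Elemental needed roughly {ratio_low:.0f}x to {ratio_high:.1f}x the capacity")
```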
With different productions in different stages of rendering at any given time, Pixar depended on the Vast Data Platform to handle the volumetric demands of several films being developed simultaneously. These pipelines enjoy the uptime needed to meet Pixar’s intense film production schedules and release dates.
Thanks to the parallel data access, high-performance, and scale of the Vast Data Platform, Pixar was able to render nearly 150,000 volumetric frames in Elemental alone, each of which contributes to the film’s unique look. The partnership with Vast has also helped Pixar to build the foundation for future generations of AI-powered cinematography, using machine learning and deep learning training models for automated and improved production processes.
“At Vast, we’re so proud to work with Pixar as they create timeless cinematic magic by simultaneously pushing the boundaries of story-telling and technology,” said Denworth. “For ‘Elemental’ and future films, we’re delivering a data platform that powers the animation and rendering workflows for their most data-intensive and computationally heavy projects, while enabling its AI and ML pipeline for the future in order to further the ambitions of Pixar artists and the stories they’re able to tell.”
Using Vast’s Data Platform, Pixar projects now have access to the types of high-speed data processing required to employ new animation techniques built on machine learning and deep learning.
“What you need is a significant amount of more computational capability. And as the compute needs rise, so does the need to access and process data,” said Denworth.
Previously, Pixar used systems from Dell for unstructured data. But those systems weren’t scalable or affordable enough to meet Pixar’s needs.
To infinity and beyond
Today, Vast Data’s biggest customers spend hundreds of millions of dollars on single machines, and Vast Data is working with multiple partners on that level.
“Scale is important, but that also requires customers then to be able to get a global view of data,” Denworth said.
As AI becomes more and more compute intensive and gets applied everywhere, Vast Data sees its challenges getting bigger and bigger.
“That’s what we’re building to, is this idea that, over time, the machines will be able to understand and make their own discoveries, and you need to enable them by interconnecting them all around the world, giving them tools to catalog and categorize data, and then opening up access to data such that they can process whatever they need to do,” Denworth said.
Whether that day comes or not, Denworth said that world will require a lot of data.
You can bet that Pixar films are going to get even more complex in the future.
“Fire and water is where it gets really hard to render,” Denworth said. “When it comes to the visuals, all the reviews have been very positive. It’s one of the most stunning movies Pixar has ever made.”