The Evolution of Rendering for Media & Entertainment – and the Correlating Need for More Compute Power

CGI animated features and visual effects have come a long way in the last 25 years. Iconic movies like Jurassic Park and Toy Story broke new frontiers and demonstrated the possibilities of new technologies, visionary thinking, and skilled artistry. Since then, we've seen continued advances in detail and fluidity, as filmmakers continue to push boundaries and keep audiences thrilled.

Rendering is a critical part of bringing CGI films and VFX to life: it converts 3D models into the 2D images that are displayed on-screen. But it's a resource-intensive process, and a single shot like a T-Rex bursting from the jungle can take days, weeks, or even years to render on a single computer. Even as workstations and render farms have become significantly more advanced, so too have rendering demands, as artists pack more detail into shots that must live up to higher visual standards such as 4K and HDR and exceed audiences' expectations.
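To make that conversion concrete: at the heart of every renderer is a projection step that maps 3D geometry onto a 2D image plane. Here's a minimal, illustrative sketch of that step using a simple pinhole-camera model; it's not any production renderer's code, just the core idea in a few lines of Python.

```python
# Minimal illustration of the heart of rendering: projecting 3D points
# onto a 2D image plane (pinhole-camera model). Production renderers
# like RenderMan do vastly more (shading, lighting, antialiasing), but
# all of them perform a projection step like this one.

def project(point3d, focal_length=1.0, width=1920, height=1080):
    """Project a camera-space 3D point to 2D pixel coordinates."""
    x, y, z = point3d
    if z <= 0:
        return None  # behind the camera; a real renderer would clip
    # Perspective divide: points farther away land closer to center.
    ndc_x = (focal_length * x) / z
    ndc_y = (focal_length * y) / z
    # Map normalized coordinates to pixel space (y flipped: screens grow down).
    px = (ndc_x + 1.0) * 0.5 * width
    py = (1.0 - (ndc_y + 1.0) * 0.5) * height
    return px, py

# One vertex of a model, five units in front of the camera:
print(project((0.5, 0.25, 5.0)))  # -> (1056.0, 513.0)
```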

Rendering techniques and technologies have evolved too, with cloud rendering offering faster, easily scalable options for studios of all sizes. Here's a look at what it took to render some of history's most iconic films, and at the options creators now have to ease the brunt of rendering.

The '90s boom

Jurassic Park certainly wasn't the first film to incorporate a high degree of CGI; Industrial Light & Magic wowed with its work on The Abyss and Terminator 2: Judgment Day. However, Steven Spielberg's 1993 movie made a monumental impact on Hollywood, with realistic computer-generated dinosaurs that brought the long-extinct creatures back to life much like the film's fictional scientists.

According to a SIGGRAPH talk recap from 1994, Industrial Light & Magic artists used as much off-the-shelf software as possible, including Pixar's RenderMan, and relied on SGI's (Silicon Graphics Inc.) workstations to handle the intense computing and rendering load. ILM had to triple its computer animation team to handle just 52 total shots in the entire film, with rendering typically taking between two and four hours per frame—but six hours per frame during the rainy T-Rex scene.

The Stanford bunny is a computer graphics 3D test model developed by Greg Turk and Marc Levoy in 1994 at Stanford University.

Two years later, Pixar's Toy Story raised the bar for all CGI animated films that followed, and it was an enormous undertaking for the team. Pixar initially thought it could produce a feature film with eight animators and 53 render nodes, but it ultimately ended up with 33 animators and 117 SPARCstations: 87 dual-processor and 30 quad-processor machines. Of course, those were primitive machines compared to the computers studios use today. As former Pixar employee Chris Good suggests in this post, the SPARCstation 20 render farm cluster that rendered Toy Story had only about half the power of a 2014 Apple iPhone 6!

According to a WIRED article from 1995, "After the processor was fed massive amounts of digital information to determine the animation, shading, and lighting, RenderMan software stirred the mix slowly (taking anywhere from two to 15 hours per frame) in a huge, simmering computational soup."

A pair of influential films from just before the turn of the century, 1999's The Matrix and Fight Club, pushed VFX boundaries further still, and both required a significant rendering load. Manex Visual Effects handled many of The Matrix's most memorable shots, including the "bullet time" sequence dreamed up by VFX supervisor John Gaeta. The team rendered those shots on 32 Dell Precision 410 workstations with dual Pentium II/450 processors, running the open-source FreeBSD operating system.

Meanwhile, Fight Club had a number of truly striking VFX-aided scenes, but one in particular required some serious rendering time: the reverse tracking shot out of a garbage can. It was reportedly the final shot added to the film due to the sheer amount of time needed to execute David Fincher's vision. Each frame took about eight hours to render because of all the reflective surfaces in view, and the entire shot took three weeks to render in full.

Into the 2000s

Final Fantasy: The Spirits Within was an ambitious undertaking, as video game maker Square attempted to transfer its cinematic skill into big-screen storytelling. The 2001 film's ultra-realistic CGI human characters impressed, and they took a huge amount of processing power. Square had some 960 Pentium workstations in its render farm, with the 106-minute film's 141,964 frames requiring about 90 minutes each to render.
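Those figures invite a back-of-the-envelope calculation. Assuming the reported numbers and a perfectly parallel farm (an idealization that ignores retakes, failed frames, and compositing passes), the minimum wall-clock rendering time works out like this:

```python
# Back-of-the-envelope math for Final Fantasy: The Spirits Within,
# using the figures reported above. An idealized lower bound: it
# assumes every workstation renders nonstop with perfect utilization.
frames = 141_964        # total frames in the 106-minute film
hours_per_frame = 1.5   # ~90 minutes of rendering per frame
workstations = 960      # Pentium machines in the render farm

total_machine_hours = frames * hours_per_frame
wall_clock_days = total_machine_hours / workstations / 24

print(f"{total_machine_hours:,.0f} machine-hours of rendering")   # 212,946
print(f"~{wall_clock_days:.1f} days if the farm never sat idle")  # ~9.2 days
# On a single workstation, the same job would take roughly 24 years.
```

In other words, even a 960-machine farm needed more than a week of flawless, around-the-clock rendering just to produce a single pass of the film.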

Big Buck Bunny (2008) was created by the Blender Foundation.

The following year saw the release of The Lord of the Rings: The Two Towers, the middle chapter of Peter Jackson's epic adaptation of J.R.R. Tolkien's trilogy, famed for the fully CGI character Gollum. Gollum's movements were captured from a performance by Andy Serkis, and each frame of the character required between two and four hours of rendering time.

However, that's nothing compared to Treebeard, whose detail was so dense that the character took up to 48 hours per frame to render. It's unclear just how many processors Weta Digital tapped for that rendering haul, but a Computerworld article from 2004 suggests that the studio added 500 more during the film's production to deal with the added demand. We have to imagine that extra firepower came in handy for Return of the King.

In 2009, James Cameron's sumptuous visual effects feast Avatar hit cinemas, and its VFX unsurprisingly took an incredible amount of time and processing power to bring to life. Lead vendor Weta Digital housed its render farm in a 10,000-square-foot facility: 4,000 Hewlett-Packard BL2x220c servers packed with 35,000 processor cores, 104 terabytes of RAM, and three petabytes of local storage. Even with that elaborate hardware, frames took several hours apiece to render, according to Geek.com.

Disney's 2013 CGI animated film Frozen required 26,000 processor cores to handle its rendering needs, but the following year's Big Hero 6 blew past that figure. According to a Fast Company report, Disney Animation connected “four rendering farms—three in Los Angeles and one in San Francisco—into a giant supercomputer”; collectively, the 4,600 computers provided 55,000 cores for rendering. The studio's render management system could queue up 400,000 rendering jobs overnight and have them ready for the next day.
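That Fast Company quote hints at what a render manager actually does: keep a queue of frame jobs full and hand work to whichever machine frees up next. A toy sketch of that dispatch loop (purely illustrative, not Disney's actual system) might look like this:

```python
# Toy sketch of a render manager's dispatch loop: worker nodes pull
# frame jobs off a shared queue until it drains. Purely illustrative;
# production systems add priorities, retries, job dependencies, and
# machine-capability matching on top of this basic pattern.
import queue
import threading

jobs = queue.Queue()
for frame in range(1, 101):            # queue up 100 frames "overnight"
    jobs.put(("shot_042", frame))

def render(shot, frame, node):
    print(f"{node}: rendered {shot} frame {frame:04d}")  # stand-in call

def worker(node_name):
    while True:
        try:
            shot, frame = jobs.get_nowait()
        except queue.Empty:
            return                     # queue drained; node goes idle
        render(shot, frame, node_name)
        jobs.task_done()

nodes = [threading.Thread(target=worker, args=(f"node-{i:02d}",))
         for i in range(8)]            # 8 nodes here instead of 4,600
for n in nodes:
    n.start()
jobs.join()                            # block until every frame is done
```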

To the cloud

As those examples show, rendering demand has only skyrocketed over time, even with much more powerful hardware on hand. With many live-action films incorporating far more VFX shots, and CGI animated features packing in far more detail and nuance, the need for immense processing power has grown exponentially over the last 25 years.

That need is unlikely to slow anytime soon. These days, 4K resolution has become standard, and it requires far more detail and data than a 1080p/Full HD shot. And the bar will keep rising: 8K televisions are starting to roll out, with higher-fidelity color, audio, and picture sure to follow. On top of that, high dynamic range (HDR) support requires significantly more data, too, which means longer rendering times.
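"Far more data" is easy to quantify, because pixel counts grow quadratically with resolution; each step up multiplies the per-frame rendering work:

```python
# How much more image data each resolution step represents.
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "4K (UHD)":        (3840, 2160),
    "8K (UHD-2)":      (7680, 4320),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.0f}x Full HD)")
# 1080p: 2,073,600 (1x); 4K: 8,294,400 (4x); 8K: 33,177,600 (16x)
# HDR adds bit depth rather than pixels: 10-bit channels can hold
# four times as many values as 8-bit ones, so each pixel gets heavier.
```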

Milk VFX scaled to nearly 10x its on-premises capacity with cloud connectivity. Read the full story here.

The advent of cloud computing has transformed how studios large and small handle their rendering needs. The ability to spin up virtual machines at any time to handle rendering off-site lets companies scale with ease: teams can expand beyond their local render farms when larger projects pop up, save on the power expenses of running (and cooling) a farm, or even ditch local farms entirely and rely on the flexibility and speed of cloud rendering.
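In practice, "spinning up virtual machines at any time" can come down to a few API calls. Here's a hedged sketch using the AWS SDK for Python (boto3); the AMI ID, instance type, and counts are placeholders for illustration, not recommendations:

```python
# Minimal sketch of bursting a render farm onto EC2 with boto3.
# The AMI ID, instance type, and counts below are placeholders; a
# real pipeline would bake a render-node image and size instances
# to its renderer's CPU/memory profile (Spot capacity is common too).
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder render-node AMI
    InstanceType="c5.24xlarge",       # compute-optimized for rendering
    MinCount=1,
    MaxCount=50,                      # burst up to 50 nodes for a deadline
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "role", "Value": "render-node"}],
    }],
)
instance_ids = [i["InstanceId"] for i in response["Instances"]]
print(f"Launched {len(instance_ids)} render nodes")
```

When the job finishes, the same API can terminate the fleet, so studios pay only for the hours they actually render.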

Tangent Animation used AWS Thinkbox Deadline to tackle Next Gen, its largest project to date in both size and scope. Requiring four 2K-resolution deliverables (including mono and stereo versions in English and Mandarin), and with only 25 percent of the project's rendering completed three months before delivery, the team at Tangent scaled its compute resources with Amazon Web Services (AWS) Elastic Compute Cloud (EC2), ultimately completing more than 65 percent of the film's rendering on AWS.

“AWS Cloud was a godsend for us on ‘Next Gen;’ it allowed us to render about two and a half versions of the movie in just 36 days and far outstripped our on-premises capabilities. Without it, lighting and rendering would have to have started nearly eight months ahead of time and that would have required an entirely different creative strategy,” said Jeff Bell, Tangent Studios COO and Co-founder.

For the live-action film Adrift, Milk VFX was challenged to deliver 170 fluid-simulation shots to recreate an ocean, a job 10 times larger than anything the studio had attempted before. With AWS, Milk quickly scaled up from an average need of 80,000 CPU rendering cores to a peak of 132,000. "To complete a project like Adrift using on-premises resources, we'd need 10-15 times more rendering nodes, but only some of the time," said Benoit Leveau, head of pipeline at Milk. Savings like that in cost and hassle let studios do much more with less.

As filmmakers strive to push the boundaries further and deliver the most stunningly impactful images possible, rendering demands aren't likely to subside. But with an intelligent render manager like AWS Thinkbox Deadline and the option to use both local and cloud farms, studios now have more power than ever to decide how to tackle their rendering challenges in speedy, reliable, and cost-effective ways. 
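For studios running Deadline, splitting work between local and cloud farms largely comes down to job routing. Below is a sketch of a scripted submission using Deadline's deadlinecommand utility, which accepts a job info file and a plugin info file; the pool name, paths, and plugin settings are illustrative, and the exact keys available depend on your Deadline version and plugins:

```python
# Sketch of submitting a frame-range job to Deadline from a script.
# The pool name, executable path, and scene path are illustrative.
import subprocess
import tempfile

job_info = "\n".join([
    "Plugin=CommandLine",      # generic plugin; studios often use a
    "Name=shot_042_beauty",    # renderer-specific plugin instead
    "Frames=1-240",
    "ChunkSize=10",            # frames handed to a node per task
    "Pool=cloud",              # route to cloud nodes vs. a 'local' pool
    "Priority=75",
])
plugin_info = "\n".join([
    "Executable=/opt/studio/bin/render",  # placeholder render binary
    "Arguments=--scene /shows/demo/shot_042.scn --frame <STARTFRAME>",
])

with tempfile.NamedTemporaryFile("w", suffix=".job", delete=False) as j, \
     tempfile.NamedTemporaryFile("w", suffix=".job", delete=False) as p:
    j.write(job_info)
    p.write(plugin_info)

# Assumes deadlinecommand is on PATH (it ships with the Deadline client).
subprocess.run(["deadlinecommand", j.name, p.name], check=True)
```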
