A render farm is a computer network of high-performance CPU and GPU cores working in tandem to produce and render CGI. Since the ’90s, this process has helped design and production studios quickly create beautiful renders of complex scenes for animation and visual effects in film, television, and gaming. Keep reading to learn more about the origins of render farms, whether it’s worth building one, and whether cloud-based rendering is a better option for your project.
Building the First Render Farm
The first known render farm dates back to July 1990. Autodesk’s Multimedia Division was tasked with creating the promotional short The Bored Room, an entirely computer-generated animated film, to demonstrate the power and production value of Autodesk’s new software 3D Studio DOS Release 1 set to launch that year. (That software would come to be known as 3DS Max.)
The final animation is only about 8 minutes long, but at the time it was a massive undertaking, and rendering something so ambitious would take an exceptionally long time.
The animation was rendered in a room filled with Compaq 386 workstations running on the Intel 80386 32-bit microprocessor, the fastest CPU available at the time. However, these workstations weren’t networked, so each machine rendered its frames independently of the others; the finished frames were then collected, or “harvested,” onto a large-format optical storage drive in sequential order. Once complete, the animation was recorded frame by frame onto a Sony LaserDisc and then transferred to VHS.
The story goes that during production, Jamie Clay, the technical engineer tasked with overseeing the rendering process, would regularly wear farmer’s overalls to work, and coworkers would joke that Clay was “farming the frames.” Clay soon began calling his collection of workstations a render farm.
What Are the Benefits of a Render Farm?
Since The Bored Room, rendering technology has advanced dramatically, from processor power and speed to task performance and image quality. Today, optimizations in networking and communication between workstations allow for faster render times and higher-quality imagery, while the rise of the GPU may have had the single greatest impact on expanding the role of render farms as an effective tool for 3D modeling. Moreover, high-speed Internet access and modern technologies like online file storage and FTP have made possible a new industry of cloud-based render farms that let you use rendering services from anywhere in the world.
A single powerful workstation with a capable GPU may be acceptable for personal 3D modeling projects, but for large-scale animation or architectural design, rendering on one computer simply isn’t practical. Most projects will eat up your production time and all of your CPU’s or GPU’s resources. Render farms are built to tackle these large projects: a queue manager distributes the work across many machines, letting you render elaborate animations, test sequences, or highly detailed images in a fraction of the time a single workstation would take.
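Conceptually, the queue manager’s job is a parallel map over frame numbers: hand each frame to an idle node, then collect the finished frames back in order. A minimal single-machine sketch of that pattern in Python, using worker threads to stand in for render nodes and a hypothetical `render_frame` placeholder instead of a real renderer:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for invoking a real renderer on one frame
# (on a real farm this would be an external process, e.g. a
# command-line render of a single frame of the scene file).
def render_frame(frame_number):
    return f"frame_{frame_number:04d}.png"

def render_animation(first_frame, last_frame, workers=4):
    """Distribute frames across workers, the way a farm's queue manager
    distributes them across nodes, and collect ("harvest") the results
    in sequential order."""
    frames = range(first_frame, last_frame + 1)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() yields results in submission order, so the sequence
        # stays sequential even if frames finish out of order.
        return list(pool.map(render_frame, frames))

print(render_animation(1, 8))
```

On an actual farm each worker is a separate machine rather than a thread, but the pattern is the same: frames are independent of one another, so the job parallelizes almost perfectly.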
On-site vs. Cloud-based Render Farms
On-site render farms usually require technical engineers to operate, control, monitor, and maintain the network during the rendering process. This option is practical for profitable studios that have the human resources available to dedicate to managing the server racks.
Money is also an important factor when considering building a render farm. Though high-powered workstations, advanced core microarchitectures, and top-of-the-line products are democratizing content creation and the entertainment landscape, these technologies don’t come cheap, and rendering is no exception. Building on-site render servers at home or in the studio is a fairly simple process and can potentially increase your production output, but you may find it becomes more expensive in the long run.
Rendering projects can take hours or even days to finish, and these sophisticated machines must stay on the entire time to complete the render. Keeping a PC and server network running on-site 24 hours a day for extended periods will most likely:
- significantly increase your electricity bill,
- require debugging for system errors,
- put components at risk of frying due to overheating.
Replacing and repairing these machines is also costly. This is why going with a cloud-based render farm is often a much more feasible alternative.
Cloud-based render farms are a way to harness the productivity of a render farm at a much lower cost than maintaining an on-site setup. They rely on high-speed Internet access and file transfer protocols such as FTP to send your source files to a queue, which processes and renders them on machines equipped with the most powerful processors on the market. Each of these machines is referred to as a “node.”
The best online render farms utilize hundreds, if not thousands, of nodes with thousands of GPU cores working in tandem on a single project. This can shrink rendering times to the blink of an eye: a project that would take weeks to complete on-site can be rendered within minutes.
The most attractive feature is not just the time savings but also the cost. At just pennies per minute, these services are inexpensive compared to on-site farms, and most use a queue manager that tracks your file’s progress under a pay-as-you-go model, letting you easily calculate the total rendering cost so you never have to worry about going over budget.
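Under a pay-as-you-go model, the estimate is simple arithmetic: billed node-minutes stay the same no matter how many nodes you use, while wall-clock time shrinks with the node count. A quick sketch, assuming a hypothetical flat per-node-minute rate (real services publish their own rate cards):

```python
def estimate_render_job(frame_count, minutes_per_frame, rate_per_node_minute, nodes):
    """Estimate total cost and wall-clock time for a pay-as-you-go farm.

    Assumes a flat per-node-minute rate purely for illustration;
    actual pricing varies by service, hardware tier, and priority.
    """
    total_node_minutes = frame_count * minutes_per_frame
    cost = total_node_minutes * rate_per_node_minute
    # You pay for every node-minute, but they elapse in parallel.
    wall_clock_minutes = total_node_minutes / nodes
    return cost, wall_clock_minutes

# A 1,200-frame animation at 2 node-minutes per frame, billed at a
# hypothetical $0.04 per node-minute, spread across 200 nodes:
cost, wall = estimate_render_job(1200, 2, 0.04, nodes=200)
print(f"${cost:.2f} total, about {wall:.0f} minutes of wall-clock time")
```

The same 2,400 node-minutes of work would tie up a single workstation for well over a day, which is the trade-off the article describes: the farm charges for the compute either way, but compresses the waiting.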
Are Render Farms Worth It?
As the line between professional and consumer-grade technology continues to blur, smaller independent studios and freelance designers now have the opportunity to take advantage of cloud-based render farm services. If you still have questions about whether this type of service is worth the cost, we suggest reading through our Guide to Cloud-based Rendering, where you can learn more about how easily your production process can be streamlined.