By TREVOR HOGG
The idea of a network of mainframe computers accessible to multiple users dates back to the 1950s, when technology companies such as IBM provided access to schools and corporations. This concept of “server rooms” and “dumb terminals” remains the central component of what has become known as cloud services. As usage costs have become more affordable, widespread adoption and integration have followed, paving the way for a virtual highway that will enable lifelike digital interaction.
Amazon Web Services, Microsoft Azure and Google Cloud Platform dominate a marketplace that is still far from its full potential, offering three models of service: software, infrastructure and platform.
“All three models are important and part of the same ecosystem,” notes Gretchen Libby, Director of Visual Computing for Amazon Web Services. “Platform providers are enabling virtual workstations, while software providers facilitate pixel streaming, and we provide infrastructure at scale. They’re all key components to cloud-based workflows. We’re excited to continue partnering with independent software vendors and solutions integrators to evolve and adapt to our customers’ needs.”
Two significant areas of concern are cost and security. “We have always countered the cost spiral by operating our own data center,” explains Ralph Huchtemann, CEO of RebusFarm. “This means that we do not have to pass on the high fees of the large billion-dollar companies to our customers. In addition, having our own data center prevents third-party access. This makes a big difference compared to rendering providers who only rent the necessary hardware from the big players like AWS, GCP or Azure. We have full control over the data and can guarantee that there is no third-party access. Secondly, we have ensured complete redundancy of hardware resources. All components in the data center are designed to be fail-safe. The power supply, Internet connection of the data center, the cooling, the central servers for monitoring the processes, all HDD- and NVME HDD storage and all other components are redundant and guarantee a smooth operation 24/7.”
Typically, data is stored in a particular region for the selected cloud provider. “If a studio needs to move that data, either to a different region or another cloud provider, it will likely incur egress fees, and they can be significant,” states Mac Moore, Head of Media & Entertainment at CoreWeave. “CoreWeave is the exception here and allows users to move data out to other cloud providers at no cost. Hefty egress charges break down the business model for using the cloud in VFX and animation. Part of MovieLabs’ 2030 Vision [white paper] for new technologies in content production is an emphasis on interoperability and open standards. I think when there’s improved interoperability and limited egress between cloud providers and regions so that data can flow unimpeded, it will greatly benefit cloud users from a cost and flexibility perspective.”
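Moore's point about egress fees is easy to make concrete with back-of-the-envelope arithmetic. The sketch below assumes a hypothetical $0.09/GB egress rate, a common list-price ballpark rather than any provider's actual fee:

```python
def egress_cost_usd(terabytes: float, rate_per_gb: float = 0.09) -> float:
    """Estimate the fee to move `terabytes` of data out of a cloud region.

    The default rate is an assumed illustrative list price, not a quote
    from AWS, GCP, Azure or CoreWeave.
    """
    return terabytes * 1024 * rate_per_gb

# A mid-sized show's frames and assets can reach tens of terabytes.
print(f"Moving 50 TB out at $0.09/GB costs about ${egress_cost_usd(50):,.0f}")
```

At that scale the transfer fee alone can rival the render spend itself, which is why zero-egress offerings change the calculus for multi-cloud pipelines.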
Scalability and elasticity of resources for computation and rendering are a major benefit. “It’s important to note that the small margins in visual effects are a challenge, and this has been the case for as long as I can remember due to the unpredictable nature of those workloads,” Libby states. “Delayed decision-making makes it tough for visual effects companies to predict render capacity needs. The cloud has helped mitigate some of those historical bottlenecks by enabling studios to work in a more elastic way. It also provides studios with the option to expand and contract when needed and removes barriers to entry. A big capex investment isn’t required to get started; more work can be done and at a faster pace when needed with the cloud.”
The core technology behind cloud computing in visual effects and animation has been in place for a decade. “The true evolution is how studios are accessing the technology,” Moore says. “Early on, people looked to the cloud as they needed more machines, and they treated them like on-premises machines in that they’d spin up cloud resources and leave them going. This approach was incredibly inefficient and led to cost overruns. Visual effects companies already have such limited margins that burning machines like that was highly problematic. What’s helped mitigate some of these early inefficiencies is the rise of services to help automate the scaling process, both up and down; in parallel, more studios are building out staff experienced with using the cloud. There are also solutions providers dedicated to helping studios set up cloud-enabled workflows.”
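The scale-up/scale-down automation Moore describes can be sketched as a simple sizing rule. Everything here is illustrative: the function name, inputs and numbers are assumptions, not any vendor's API:

```python
import math

def target_nodes(queued_frames: int, frames_per_node_hour: int,
                 deadline_hours: float, max_nodes: int) -> int:
    """Size the render fleet to just meet the deadline, capped at max_nodes.

    Returning 0 for an empty queue is the key fix for the early-cloud
    mistake of spinning up machines and leaving them running.
    """
    if queued_frames <= 0:
        return 0
    needed = math.ceil(queued_frames / (frames_per_node_hour * deadline_hours))
    return min(needed, max_nodes)

print(target_nodes(1200, 4, 10, 100))  # plenty of time: 30 nodes suffice
print(target_nodes(1200, 4, 1, 100))   # tight deadline: hit the cap of 100
print(target_nodes(0, 4, 10, 100))     # nothing queued: release everything
```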
Globalization of the workforce is driving the adoption of cloud computing. “Maybe the term digital transformation has been used too much, but people are gradually moving towards cloud computing and services because there are more distributed teams working with offices around the world and clients around the world; everything is connected,” observes Vladimir Dragoev, Product Manager at Chaos. “It’s not just the pandemic that contributed to this digital transformation. When working with a local render farm, teams have to consider a lot of additional work around it such as maintenance, provisioning, updates and installation. We have clients who understand that and want to offload this maintenance cost, and cloud computing is good at that.” Cloud services are democratizing technology. “The visual effects and animation industries have long been dominated by giant studios with access to expensive hardware and software,” states Le Quang Hieu, CEO of iRender. “But that is no longer the case. The introduction of online render farms/cloud rendering services has made those resources accessible to everyone.”
“Moving to the cloud will help unify workflows and data across the production pipeline, enabling teams to access and manage their assets wherever and whenever they’re needed,” notes Paolo Tamburrino, Senior Industry Strategy Manager for Autodesk. “At Autodesk, we envision a future where data flows seamlessly across teams, and film/animation studios as well as visual effects facilities can collaborate better, so artists stay in the creative zone, focused on creating high-quality content rather than rebuilding assets from project to project. Autodesk is working on Autodesk Flow, our industry cloud for the media & entertainment industry. It will further connect workflows, data and teams across the entire production pipeline to a single secure source of truth for all assets, versions and feedback. With a foundation built on open standards, studios will also be able to customize Flow for their unique production needs and workflows.” Client input has been essential. “To ensure that we are always aware of and able to support our clients through these rapid market changes, Autodesk’s M&E business is investing in cloud. Flow will connect existing content creation and production management capabilities with new platform-native capabilities to unify and extend customers’ workflows on the desktop and in the cloud. Maya, ShotGrid and Moxion will be the first to be integrated to create new extended workflows across the entire production life cycle. Enabled by open standards, Autodesk Flow will allow our existing tools, as well as third-party tools and APIs, to plug into an open ecosystem to speed innovation,” Tamburrino says.
“We manage the workflow from submitting the scene to the cloud computers,” Dragoev explains. “We start a virtual machine, provision it, load the scene, do other pre-steps, then start V-Ray, which renders the scene and generates image outputs. After that, we store the images, and users can download them in several different ways [including integration with cloud storage providers like Google Drive, OneDrive and Dropbox]. Chaos Cloud also offers traditional services like sharing VR previews with end clients. There are all sorts of things that we do. It’s not strictly for rendering.”
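The submission pipeline Dragoev walks through can be sketched as an ordered list of stages. The step names mirror his description, but the code is a hypothetical illustration, not the Chaos Cloud API:

```python
RENDER_STEPS = [
    "start_vm",       # boot a fresh virtual machine for the job
    "provision",      # install the renderer and its dependencies
    "load_scene",     # pull the submitted scene onto the machine
    "render",         # run V-Ray to generate the image outputs
    "store_outputs",  # persist images for download or cloud-storage sync
]

def run_job(scene: str) -> list[str]:
    """Walk a scene through every stage in order, returning a simple log."""
    return [f"{step}: {scene}" for step in RENDER_STEPS]

for entry in run_job("shot_010.vrscene"):  # hypothetical scene file name
    print(entry)
```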
Ease of use is a priority. “The most important thing is to provide a simple and smooth but powerful process,” Huchtemann observes. “We already achieve this through our advanced interface. There is beauty in simplicity. The customers basically just want to press one button and get what they need, which is fast rendering. But we will soon release our new interface, which will simplify and empower the workflow even more. We are already good at that, but the new version will have some additional features that people will like, and many performance optimizations. There will be a new feature that allows collaboration with teams of any size and at the same time gives full cost control to the budget manager.” It is also important for the UI to be unobtrusive. “In terms of workflow, users can render their images without too much hassle or having to go through a lot of cumbersome steps,” Dragoev remarks. “We’re trying to have fewer steps and settings that users have to customize. That’s why, for example, we don’t have job prioritization. We treat everyone with the same highest priority and start each job as soon as possible on its own dedicated machine. Every user benefits and there’s no wait time for them. Because we strive to have a UX that helps, we have only one button in the Chaos Cloud Rendering integrations in each supported DCC or CAD application. We also have a live preview of the rendering so you can pause or stop it before completion. Chaos Cloud Rendering also offers a no-UI Submit interface if users are confident enough to start the rendering directly without customizing any settings. But the majority of the workflows still involve going through the submission page and adjusting the available settings.”
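The no-prioritization model Dragoev describes amounts to dispatching every job immediately onto its own dedicated machine rather than sorting a shared queue. A minimal sketch, with `Machine` as a hypothetical stand-in for a provisioned cloud VM:

```python
class Machine:
    """Hypothetical stand-in for a freshly provisioned cloud VM."""
    def __init__(self, job: str):
        self.job = job  # each machine is dedicated to exactly one job

def dispatch(jobs: list[str]) -> list[Machine]:
    # No priority sorting and no shared queue: every job starts as soon
    # as it arrives, on its own machine, so no user waits behind another.
    return [Machine(job) for job in jobs]

fleet = dispatch(["archviz_a", "product_shot_b", "animation_c"])
print(len(fleet), "machines running")  # one machine per submitted job
```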
“Maybe the term digital transformation has been used too much, but people are gradually moving towards cloud computing and services because there are more distributed teams working with offices around the world and clients around the world; everything is connected. It’s global, not so domestic.”
—Vladimir Dragoev, Product Manager, Chaos
Real-time, AI and machine learning have a role to play. “Real-time is being adopted more than ever, driven by tools like Epic Games’ Unreal Engine,” Libby remarks. “We’re seeing a lot of virtual production technology coming together and real-time rendering improving. Still, we’re just scratching the surface of how real-time tech can accelerate the ways digital artists create. AWS has been developing AI and ML technologies for over 20 years, but we’re still evolving how we apply these technologies to M&E applications. What’s happening across the industry with generative AI is super interesting. Generative AI will change the entertainment business in a way that we haven’t seen in a long time.” Machine learning is pivotal in creating an automated system. “This will include concepts like up-resing footage, denoising images and black frame or partial render detection,” Moore notes. “Historical data can train ML models to help identify patterns and inform problem detection and correction. Just as AI/ML helps artists focus on the end-product, the emerging technology will help studios and wranglers render scenes in a more cost-effective way and reduce the need to manually analyze and/or intervene at each phase of the project. In summary, the mundane tasks of cloud computing have been automated, and now intelligence is being built to optimize the workflow.”
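Black frame detection, one of the automated checks Moore mentions, reduces at its simplest to flagging frames whose average luminance is near zero; a trained model generalizes the same idea to partial renders and subtler failures. A minimal sketch using plain lists as stand-ins for decoded frames, with an assumed threshold:

```python
def is_black_frame(pixels: list[float], threshold: float = 0.02) -> bool:
    """Flag a frame whose mean luminance (0..1) falls below `threshold`.

    The threshold is an assumed value; in practice an ML model would
    replace this heuristic to catch partial renders as well.
    """
    return sum(pixels) / len(pixels) < threshold

failed_render = [0.0, 0.01, 0.0, 0.02]  # near-black output from a crash
normal_frame = [0.4, 0.6, 0.5, 0.7]
print(is_black_frame(failed_render))  # True
print(is_black_frame(normal_frame))   # False
```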
Partnerships are being forged. “We are a diamond member of the Blender Development Fund to support community development of this open-source 3D software,” Hieu states. “We are also a long-term user of AWS service in terms of cloud storage. Currently, we are working closely with Maxon to provide licenses for Cinema 4D and Redshift as part of our plan to provide PaaS-based cloud rendering services. In the future, we continue looking for further cooperation with hardware and software vendors to further enhance our services.” A lot can be learned from clients. “Because of the characteristics of the IaaS render farm model, we are a bit more technology and infrastructure-oriented. By interacting with clients when supporting them using the service, we have gained more insight into 3D software and tools for content creation. iRender provides a rating and commenting system after each session, so we can get more direct feedback from clients. What we are really grateful for is our clients constantly providing feedback and useful suggestions. Thanks to them, we know where to focus on developing so that we can deliver a better user experience,” Hieu adds.
Workflows are becoming smoother. “You could go back to the beginning of the film industry to see we’ve always had hard stops,” notes Tim Moore, CEO of Vu Technologies. “When you were shooting on film before, you had to stop production, develop the film and then go into editing. Once we made it in digital form where you capture it digitally and edit it digitally, there is a uniformity across that exchange because it’s in the same format. Now when you look across all of the creative exchanges in the life cycle of a video production, we’re hitting that point where almost all is going to be uniformly digital.”
There is an ongoing evolution occurring in cloud services. “Once people stop trying to replicate on-premises environments in the cloud and embrace the cloud for its true potential, we’ll see innovation in different ways,” states Libby. “Studios won’t have to worry about IT management and can instead put all their effort into creating amazing visuals. We’re working to help our customers achieve that and helping to spur the growth of the industry at large.” The market is growing. “Every year, the cloud computing and rendering market grows by 25%, thus doubling the market in four years,” Hieu notes. “According to our assessment, cloud computing and rendering is a must-have choice in the next 10 years when client standards and graphics standards don’t stop at HD, 2K or 4K. The higher resolution is increasingly popular and becoming mandatory for the end-product. And to meet such a high standard, studios are required to use cloud computing and rendering.”
Despite being built on brick-and-mortar studios, Vu Technologies acknowledges that the future is headed toward virtual studios. “AWS has been able to help connect all of our studios for co-creation and remote control, and a lot of the platform is built on their cloud computing network,” Tim Moore remarks. “We’ve created some core applications that it runs on. One of them is an AI orchestration layer that manages all of the other apps in the platform. This is the biggest paradigm shift since the advent of the digital camera. For the past 100 years, studios have not been connected to the process or the workflow of a cloud-based production. The tools and workflows that we’re building are to help that digital transformation, and our vision is that in the future there are no physical studios. It’s an interesting time in the creative marketplace. Fortnite has opened up its creator tool and is saying that it will pay creators to build worlds. You can build an island now on Fortnite, charge people to come onto that island and then play a multiplayer game. Now imagine in the future if you and I want to go shoot a movie on an island; the creator would get royalties for the movie being made, or there would be a permit to actually shoot there.” Photorealism is being achieved with generative AI such as Midjourney 5.1. “Is it going to take away the fun of actually blowing up the car?” Tim Moore reflects. “Yes. Anytime you make technology this accessible, it takes away the creative constraint that people are put under to figure these things out. But I will say this: Whenever you democratize or make a technology more accessible, the participation rate increases so dramatically that it pushes the creative work further. Practical will still be an option, but in five to 10 years it’s all going virtual.” This digital transformation would not be possible without cloud computing and rendering, Tim Moore believes.
“It has to be, because what other way can we co-create and connect with other creators? It’s all going to be a cloud-based environment.”