By TREVOR HOGG
The dictionary definition of visual effects as the sole domain of film and entertainment does not remotely offer a complete picture of how widely visual effects are used today. Hollywood has popularized and refined digital technology that began life in laboratories at universities, military organizations and corporations such as IBM and Xerox. It is therefore not surprising that actors and filmmakers like Tom Cruise and Christopher Nolan are not the only beneficiaries of making the impossible possible through virtual means; innovative practitioners work in a vast range of areas such as education, policing, healthcare, automotive design and space exploration.
“The accuracy obtained with the latest generation of simulators emulates real flying with a precision not seen before, giving the pilot great confidence in the capabilities of the equipment he is using,” retired pilot Ed Acosta observes. “Also, all kinds of emergency situations can be introduced that prepare a crew to confront them with confidence should they happen in the real world. This is the essence of flight training in a simulator. And, with no risk, these situations can be taken to the very limit.”
Computer animation has not changed how cars are developed. “But it brought an emotional side that engineering software couldn’t deliver,” notes Albert Kirzinger, Head of Design for Volkswagen Commercial Vehicles. “The designer takes the ‘key sketch’ that defines the design idea best and transforms it into 3D using CAD, like Autodesk Alias or Blender 3D. The latter has become the go-to product after the program’s complete redesign back in 2018. We’re all about 3D after that, both digitally and physically [scale and full-size models].”
“Clay models are irreplaceable because you can only add or remove material with clay. But we build the data digitally for milling clay models. Once milled, they’re manually altered. After the design is locked, it’s reverse engineered with 3D printing for foam or materials like metal.”
—Albert Kirzinger, Head of Design, Volkswagen Commercial Vehicles
The reliance on clay models remains. Kirzinger says, “Clay models are irreplaceable because you can only add or remove material with clay. But we build the data digitally for milling clay models. Once milled, they’re manually altered. After the design is locked, it’s reverse engineered with 3D printing for foam or materials like metal.” Virtual reality has streamlined the design process. “VR simplifies prototyping for both interior and exterior designs, slashing costs and speeding up the review of concepts before committing to a physical prototype,” Kirzinger adds. The evolution of digital tools in automotive design is hard to predict. “One trend has been on the block for the past two years: generative AI tools speeding up design processes and making tedious tasks disappear,” Kirzinger notes.
Digital tools have become more sophisticated, allowing complex workflows to be completed in shorter periods of time. “Before the introduction of web mapping tools, web map development required significant time and effort from programmers,” explains Gary Wheeler, Ministry Spokesperson, Ministry of the Environment, Conservation and Parks, Ontario, Canada. “Geomatics uses graphics and visualizations to provide data analytics services to manage environmental issues across the ministry. The ministry uses tools including Canva for creating graphics [using predeveloped elements] to add visual interest to digital products and/or summarize complex concepts/data in an easy-to-understand way. We also use a variety of tools for data visualization, including maps. Examples include Power BI and Esri ArcGIS. Graphics and visualization allow complex data and information to be communicated in a way that is easily understood by the end user.”
For students wanting to visit different places around the world or to teach life skills to those with learning disorders, RobotLAB has created a series of VR experiences. “There is VR software being used to help students on the autistic spectrum to understand things like, if you’re in a high school hallway, how do you look someone in the eye and have small talk with them, or how do you engage with the police in a way that they can understand that you need a different sort of interaction than another person,” remarks Amy George, K-12 Account Manager at RobotLAB.
“It is a cool and interesting software that comes with an autism VR pack.” A classroom endeavor is VR Expeditions in partnership with Encyclopaedia Britannica. “When Google Expeditions was canceled, we decided to create our own VR platform,” remarks Maria Galvis, Marketing Communications Manager for Education at RobotLAB. “We have three types of questions with our expeditions. What do you see? What did you learn? What if? Teachers also have pointers so they can tell students, ‘Check here. Learn this. Let’s listen to what they are trying to say.’ That’s another thing that helps education change in the sense that we’re not just learning from a book but we’re actually living it. Students are seeing it through their eyes.”
“In support of scientists seeking to explain submarine phenomena, we have created a model of Canadian oceans and seabeds. This modeling features a number of whale pods and studies the potential impact of military probes deployed in our waters on whale behavior. This digital picture becomes indispensable when it is either impossible or too costly to capture and represent these phenomena. We made use of the unparalleled real-time rendering capabilities of Unreal Engine, lit the environment with Lumen and rendered millions of polygons using Nanite, a new technology.”
—Defence Research and Development Canada
CG Pro School is the first Unreal Engine-authorized training center in North America and specializes in real-time and virtual production. “We do public training, which tends to be related more to media and entertainment, and private training, such as teaching virtual production to NASA. There were simulation engineers working on the Artemis mission, which is getting ready to go back to the South Pole of the Moon, who were simulating the landing in Unreal Engine so they can understand what it looks like before going there,” explains Edward Dawson-Taylor, Co-Founder and Head of School, CG Pro.
“Where advances in technology should be focused is where they’re useful and make something better,” Dawson-Taylor advises. Technology is agnostic as to where it can be applied. “Films have taken all of the industrial use cases and improved the visual quality and user experience. At this point, what we’re unlocking is real-time computer graphics, and that’s not only about the speed of producing the image; it’s the workflow itself. It’s changing the way that people work together and collaborate. But it is predominately visualization and simulation, being able to see something that otherwise would be hard to see.”
Tailoring technology to the needs of clients, whether in the entertainment industry, advertising or corporate events, is the mandate for global virtual production studio Final Pixel. “We had to figure out how to meet fairly large client needs that had pressing deadlines, so one of the models that we built through the production approach was a popup stage,” remarks Michael McKenna, CEO and Co-Founder of Final Pixel. “We would put up a stage, film on it and bring it back down again. We’ve done that multiple times in Los Angeles, New York and London, and we’re doing a lot more internationally now as well. We are also building a trusted stage network and have developed a way of interfacing, because how you get the best out of virtual production is to have a seamless way through pre-production, the stage and post.” Democratizing the knowledge and expertise associated with the latest virtual production techniques and skills through online and in-person training is the Final Pixel Academy. “We deliver it for four major groups: corporate, professional crew, general public and clients. What we saw from the beginning was the skill shortage, and I was happy to start an academy to share everything that we’ve been learning in order to build a knowledge base for the industry. People can go to an academy and work for other companies as well.”
Commencing in 2009, simulation technology for driving, marksmanship and judgment has been incorporated into the Cadet Training Program at the RCMP Academy. “When the driving simulators were first purchased, it was because we needed to address a gap in our training program,” remarks Dr. Gregory P. Krätzig, Ph.D. (Psyc), Director Research and Strategic Partnerships, Depot Division for the RCMP. “Teaching emergency vehicle operations [driving with emergency equipment activated] was not possible in a civilian environment. What the simulators allow is to expose cadets to civilian traffic and pedestrians in a realistic setting. If mistakes are made in the simulated environment, they are made in a safe setting where those errors are addressed and teaching can be reinforced. This can also be applied to firearms simulation training where multiple cadets can be trained in situations previously limited to one or two cadets at a time.” The marksmanship simulation has shown the most progress. “New developments have resulted in the simulated pistols functioning close to a real pistol while maintaining a safe, risk-free environment. It is expected that the technology will mature in the coming months and will provide opportunities to be adopted in the field for experienced officers. The driving simulation is also advancing and has been used to teach decision-making and situational awareness,” Dr. Krätzig says.
As for the expected proliferation of simulators, part-task trainers and extended reality devices in Air Force schools, contracted training centers and operation training units, the Royal Canadian Air Force states, “With advancements in artificial intelligence, enhanced digital animation, graphics and effects incorporated in these modern training aids, the training realism afforded to students continues to improve their learning while at the same time reducing costs and operational impacts of using real-life aircraft or facilities for training. For example, students at the Canadian Forces School of Aerospace Control in Cornwall, Ontario, make use of control tower simulators with sophisticated graphics allowing them increased depth perception and detail to spot and control aircraft and ground vehicles, thereby improving the fidelity of the training experience in a way not achievable to the same degree with earlier generation graphics/simulators.”
When it comes to simulating experiments, evaluations, staff training and rehearsals before operational deployment, the Canadian Joint Warfare Center explains the reasons for doing so. “There are many different types of digital tools from flight and driving simulators to learning to operate equipment in a combat zone. We can teach CAF members basic familiarization of their environment and an aircraft maintenance task without leaving a classroom. These are immersive or virtual environments that you can interact with. This technology is probably the hardest to develop but will be the most productive and rewarding in the future. Especially as we start to include machine learning as part of the system.”
Aiming to make the invisible visible and visualize the impossible is the Science Visual Documentation team at Defence Research and Development Canada, which released the following response about the use of videography, photography, 3D animation, graphic design and illustration. “In support of scientists seeking to explain submarine phenomena, we have created a model of Canadian oceans and seabeds. This modeling features a number of whale pods and studies the potential impact of military probes deployed in our waters on whale behavior. This digital picture becomes indispensable when it is either impossible or too costly to capture and represent these phenomena. We made use of the unparalleled real-time rendering capabilities of Unreal Engine, lit the environment with Lumen and rendered millions of polygons using Nanite, a new technology.”
After a 50-year absence, NASA plans to return astronauts to the Moon in preparation for human missions to Mars through the Artemis Program, which encompasses the engineering, planning and training. “Digital tools [virtual reality, ray tracing, virtual production] have enabled us to explore the various challenges that a crew may encounter while on a mission in a more immersive environment than was possible before,” states Lee Bingham, Aerospace Engineer, Simulation and Graphics Branch of the Software, Robotics, and Simulation Division, NASA. “This allows us to create realistic simulations and tools that are used in assessments and studies that can influence requirements in engineering and mission planning. For example, the areas around the Lunar South Pole that the Artemis missions plan to explore have unique lighting challenges that were not experienced in the Apollo missions. By creating a realistic and immersive simulation with visualization products, we can get a first-order approximation of how those lighting conditions impact design and mission operations. Additionally, we can perform end-to-end human-in-the-loop simulated missions with crew to determine how all the elements and systems interact with each other. Game engines are becoming more suitable for engineering and simulation development by supporting real-world physics models, integrating features that support the engineering design process and interoperability standards with other analytical and simulation tools.”
Not everything is environmentally oriented; Talespin makes use of AI-enabled CG characters in virtual workplace scenarios that are designed to develop the skills of employees in 15 different industries, including healthcare, insurance, hospitality and financial services. “We have made it drag-and-drop simple for an enterprise user who has no experience with game engines, has never had a 3D file on their computer, doesn’t know the first thing about optimization or IK rigs or animation,” states Kyle Jackson, CEO and Co-Founder of Talespin. “We built a toolkit for them where they can come in and focus on the simulation that they’re trying to achieve because of a business outcome. In the healthcare space, there can be difficult conversations between doctors and patients or doctors and staff.” In most use cases, it is beneficial not to have photorealistic CG characters. Comments Jackson, “It’s the same way with filmmaking where you’ve done your job well if, by the second act, people are not thinking about all the false rules that are making your world unbelievable. In learning, we have the same kind of barriers, but they’re more about getting people to unblock their biases and distractions or focus. If you cross the uncanny valley, people still know it’s software, and at this point in time, we’re more fascinated with the idea that we’re talking to a virtual human rather than focusing on the goals of what the conversation is about. The focus is in the wrong place, so it actually helps to not be photoreal because then people focus on the engagement rather than the science of it, or the fascination with the technology.”
Online shopping has taken consumers beyond brick-and-mortar stores. Emperia, aiming to bridge the gap between e-commerce and the in-store experience, provides a virtual store platform for retail and fashion enterprises. “The construction of the environment is important and how you position the product in it to make sure the product drives the focus,” remarks Olga Dogadkina, Co-Founder & CEO at Emperia. “Even though you’re [virtually] on the Moon, you can still see what you’re shopping for clearly. The layout of virtual stores versus physical ones is quite different. The first thing we found out early on is that no one wants to see the replica of a physical store in virtual space because you can simply go to the store and explore it yourself, whereas having a digital world brings a whole other aspect of interaction into how you can shop. Rules of merchandising are a bit different in the virtual world. Placing best-selling products closer to the start of the experience helps users to ground themselves and explore further, while in a physical store, the best-sellers always go in the back; otherwise, you create a crowd at the entrance and no one goes through. The other part of this is we have launched a product for 3D creators that is focused on how to enable others to create these environments that are targeted towards e-commerce.”
In an effort to standardize medical procedures and communicate the latest information about technological advances to healthcare professionals, Osso VR has created a VR training and assessment platform. “There is a lot that we’re pulling from gaming and visual effects, but also from the simulation work done by aviation and the military,” remarks Dr. Justin Barad, Chief Strategy Officer of Osso VR. “We’re not at the final period of the so-called augmented surgeon. There are a lot of different components that will come into play, but certainly, the pieces are starting to come into focus. A lot of things happen outside of the operating room that are critical. The problems I was seeing in my surgical residency firsthand are four things. First, there are many procedures that are expanding all of the time, and it’s gotten beyond the ability of any human to know how to do all of these different procedures. Second, most of these procedures are more complicated than what came before, so to learn them takes a lot longer and many more repetitions, and those repetitions are on people. Third, we don’t have a way to assess the abilities to perform these procedures. Fourth, surgery is a highly coordinated team activity whereas in the past it was just the surgeon who did it. Not everybody has the ability to train together. Training is such an important part of this whole ecosystem because your ability to perform a procedure has a major impact on patient outcomes. Even more important than training is deciding to do surgery. Generative AI has an incredible potential to enhance our ability to figure out when we do and don’t want to do surgery.”
Inhabiting entertainment and non-entertainment industries as a graphics design studio is the Territory Group, which has clients ranging from Marvel Studios to Audi. The brief is similar whether the client is a Hollywood studio or an automotive manufacturer. “In a movie, I need that visual to read in three seconds, and our graphic is probably not the most important thing in the shot,” remarks David Sheldon-Hicks, Founder of the Territory Group. “It has to be intentional and direct, but also needs a visual sophistication that still feels cinematic to an entertainment-level production value. The way I translate that in the real world is that people still have short attention spans; it still needs to translate, be immediate, transmit to a mass audience and have brand sophistication. When we’re doing those digital interfaces for Audi cars or General Motors or Porsche, there has to be a point of difference in the interface; otherwise, how do you know why you choose one brand over another one? You’re always communicating functionality and purpose, both in terms of storytelling and utility. And you have a second layer of that emotional resonance whether that is at a brand or storytelling level.” There is a co-dependency that exists. Sheldon-Hicks concludes, “Life imitates art and art imitates life. It’s this beautiful cycle.”