By JOE HENDERSON
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Visual Effects – 3rd Edition
Edited by Jeffrey A. Okun, VES and Susan Zwerman, VES
VR as a Tool
The impact of virtual reality (VR) on the industry cannot be overstated. VR is now used for internal asset review and digital tech scouting, and it gives content creators a suite of tools for becoming immersed in, and interacting with, their environments. Production designers, costume designers, set decorators, DPs, directors, and even writers can now access worlds that previously existed only in their minds, and manipulate and dress those environments to their liking. A creator can now share an (almost) fully realized moving vision with their team. This means sharing not just the look – as drawings, still images, and text – but also the emotional texture that comes from seeing the lighting, the camera's field of view and moves, and the basic overall tone.
Visualization in Engine
With more and more filmmakers pushing the look of their previs to unprecedented levels of quality, it is important to answer two key questions: Is the goal to use the engine to render content? Or is creating and running the shots in real time the goal? Distinguishing between these two uses of the game engine is essential, as the choice has a massive impact on the production's workflow and budget.
Render in Engine
If content is rendered entirely in the game engine, then standard game engine concerns such as polygon count, clean UVs, and texture size matter far less, as long as the asset looks reasonable in engine and does not need to run in real time. That said, visualization is meant to be rapid prototyping for film, so renders should not be bogged down by the ingest of overly heavy assets, lighting, and/or FX (including screen-space effects).
Visualization in Real-Time
Real-time visualization is where virtual production and visualization shake hands. To accomplish this, the specifications for building visualization assets need to shift: all assets need "clean" topology, tidy UVs, and restrained texture sizes in order to run in real time.
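To make the distinction between the two workflows concrete, here is a minimal sketch of how a production might encode asset budgets for each profile and flag assets that break them. All names and threshold values (triangle counts, texture sizes) are illustrative assumptions, not figures from the VES Handbook or any specific engine; real budgets would come from the show's own performance targets.

```python
# Hypothetical sketch: validating a visualization asset against two budget
# profiles, a looser one for render-in-engine work and a stricter one for
# real-time playback. All thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AssetStats:
    name: str
    triangle_count: int
    max_texture_size: int   # longest texture edge, in pixels
    has_clean_uvs: bool     # e.g., no overlapping or flipped UV islands

# Illustrative budgets: render-in-engine tolerates far heavier assets
# than real-time playback does.
BUDGETS = {
    "render_in_engine": {"max_triangles": 2_000_000, "max_texture": 8192},
    "real_time":        {"max_triangles": 100_000,   "max_texture": 2048},
}

def validate(asset: AssetStats, profile: str) -> list[str]:
    """Return human-readable budget violations (empty list = passes)."""
    budget = BUDGETS[profile]
    problems = []
    if asset.triangle_count > budget["max_triangles"]:
        problems.append(f"{asset.name}: {asset.triangle_count:,} tris "
                        f"exceeds {budget['max_triangles']:,}")
    if asset.max_texture_size > budget["max_texture"]:
        problems.append(f"{asset.name}: {asset.max_texture_size}px texture "
                        f"exceeds {budget['max_texture']}px")
    if profile == "real_time" and not asset.has_clean_uvs:
        problems.append(f"{asset.name}: UVs need cleanup for real-time use")
    return problems

hero_set = AssetStats("hero_set_piece", 450_000, 4096, has_clean_uvs=False)
print(validate(hero_set, "render_in_engine"))  # passes: []
print(validate(hero_set, "real_time"))         # flags tris, texture, and UVs
```

The same asset can be perfectly acceptable for render-in-engine work and entirely unusable for real-time playback, which is why the two goals need to be settled before asset building begins.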
A number of issues arise with engine-only components – components that live in-engine and more than likely can never be exported to another file format or passed down the VFX pipeline. These include items such as effects and depth of field.
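One practical response, sketched below under stated assumptions, is to track which parts of a shot can be handed downstream and which are engine-only and will have to be rebuilt. The component names, type labels, and the two category sets are hypothetical; they stand in for whatever manifest or naming convention a given show uses.

```python
# Hypothetical sketch: partitioning a shot's components into those that can
# be exported down the VFX pipeline (e.g., as FBX/USD) and those that live
# only in-engine. Type names are illustrative, not a real engine API.
EXPORTABLE_TYPES = {"static_mesh", "skeletal_mesh", "camera", "light"}
ENGINE_ONLY_TYPES = {"particle_fx", "post_process_volume", "depth_of_field"}

shot_components = [
    ("hero_ship", "static_mesh"),
    ("shot_cam", "camera"),
    ("engine_smoke", "particle_fx"),      # lives only in-engine
    ("dof_settings", "depth_of_field"),   # ditto
]

exportable = [n for n, t in shot_components if t in EXPORTABLE_TYPES]
engine_only = [n for n, t in shot_components if t in ENGINE_ONLY_TYPES]

print("Hand off to VFX:", exportable)      # ['hero_ship', 'shot_cam']
print("Rebuild downstream:", engine_only)  # ['engine_smoke', 'dof_settings']
```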
Lighting poses a few special challenges worth considering. For example, baking the lighting into asset textures yields a higher-fidelity look, but be aware that doing so reduces or eliminates the ability to rapidly adjust the lighting on the fly. Establishing priorities clearly will help prevent issues later in production.
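The tradeoff is easiest to manage when the decision is recorded explicitly rather than made ad hoc per environment. A minimal sketch follows; the enum values and the single decision input are illustrative assumptions, not tied to any particular engine's lighting system.

```python
# Hypothetical sketch: capturing the baked-vs-dynamic lighting decision as
# an explicit setting so the fidelity/flexibility tradeoff is agreed up front.
from enum import Enum

class LightingMode(Enum):
    BAKED = "baked"      # higher fidelity, but lights can't move on the fly
    DYNAMIC = "dynamic"  # fully adjustable, at a runtime/fidelity cost

def choose_lighting(needs_live_adjustment: bool) -> LightingMode:
    """If the team must relight on the fly, baking is off the table."""
    return LightingMode.DYNAMIC if needs_live_adjustment else LightingMode.BAKED

print(choose_lighting(needs_live_adjustment=True))   # LightingMode.DYNAMIC
print(choose_lighting(needs_live_adjustment=False))  # LightingMode.BAKED
```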
AR as a Tool
Augmented reality (AR) has been used in some capacity on film sets for many years; this is traditionally referred to as Simulcam. With the advent of technologies such as the HoloLens and ARKit, many smaller-budget projects can now utilize AR visualization on set. AR's core value is its ability to create visualizations quickly on set and/or during location scouting. If any previs or techvis has been created prior to the need for AR, then a lot of useful information about the film has already been gathered. As all of that information collates into an AR device on set, the benefits become obvious: production designers can see how their sets are going to be utilized, directors can see all of their blocking, and DPs can see whether their lighting placements will work. As with all aspects of pre-production, the more information and resources gathered early on, the better the shoot and the stronger the AR tool kit becomes.