By HELENE SEIFER
A Babylonian princess created the world’s first museum in 530 BC in what is now Iraq. This repository of Mesopotamian artifacts was neatly arranged and each ancient treasure accompanied by a chiseled clay cylinder with a description in three languages. Although more than 2,500 years have passed since Princess Ennigaldi preserved the past for all to see, for much of history the basic organization of museums remained the same: display some cool stuff and label it. In this age of exploding technology, however, the creative approach has kicked into hyper-drive. Welcome to the new museum experience, where systems developed for film and gaming now reign.
“How many times will you dive with a whale?” asks Jake Rowell, Director of “theBlu,” a VR adventure from Wevr that brings us to the depths of the ocean for underwater encounters without getting wet. “Most people are not scuba certified!” “TheBlu,” which started as a home application, made its museum debut in early 2017 at the Dubai Aquarium & Underwater Zoo, followed by an extended stint at the Natural History Museum of Los Angeles, and more locations are planned. Museum patrons virtually encounter a giant blue whale, a jellyfish migration, and a whale fall, where a deceased whale’s bones have become an undersea ecosystem. “It’s a celebration of the ocean and ocean life,” says Neville Spiteri, CEO and Co-founder of Wevr. “We worked with oceanographers and scientists to ensure a degree of accuracy.”
The Natural History Museum version of “theBlu” consisted of six 9’ x 9’ pods that held one person at a time for a personal immersive experience. “We go to great lengths to consider the audience,” continues Spiteri. “VR is a powerful medium and you can instill a high degree of empathy and can also scare people easily. We aim for the right level of awe. No cheap scares. No shark jumps out at you.” Rowell adds, “Exploration is the heart of what ‘theBlu’ is.” Meeting a full-sized whale, even virtually, is awesome. “The creature acknowledges you’re there and they’re there. The connection is very powerful.”
According to Spiteri, reactive sea anemones and bioluminescent creatures are possible because, “It’s a combination of software, custom development of code and algorithms, and creating a simulation engine that determines how the fish move and swim.” Explains Rowell, “‘TheBlu’ code base lives in Unity – the rendering engine. It’s flexible and customizable and can achieve many different looks. We set a series of ‘go-to’ points in the story. Some parts are scripted, but the ocean still needs to feel alive and random.” Art is also critical, and the work of Academy Award-winning visual effects artist Andy Jones was key.
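Rowell’s mix of scripted beats and lifelike randomness can be sketched in a few lines. The toy agent below (a hypothetical illustration in Python, not Wevr’s actual Unity code) steers a fish through a series of “go-to” points while layering on random drift, so the path is authored but the motion never looks canned:

```python
import math
import random

class Fish:
    """Toy fish agent: follows scripted 'go-to' points, with random
    drift on top so motion never looks fully deterministic.
    All names here are illustrative, not Wevr's code."""

    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, target, speed=1.0, jitter=0.3):
        # Steer toward the current scripted waypoint...
        dx, dy = target[0] - self.x, target[1] - self.y
        dist = math.hypot(dx, dy) or 1e-9
        self.x += speed * dx / dist
        self.y += speed * dy / dist
        # ...then add a small random wander so the ocean feels alive.
        self.x += random.uniform(-jitter, jitter)
        self.y += random.uniform(-jitter, jitter)

def reached(fish, target, radius=2.0):
    return math.hypot(target[0] - fish.x, target[1] - fish.y) < radius

# Drive one fish through a scripted series of go-to points.
waypoints = [(10, 0), (10, 10), (0, 10)]
fish = Fish(0, 0)
for wp in waypoints:
    while not reached(fish, wp):
        fish.step(wp)
print(round(fish.x), round(fish.y))  # ends near the last waypoint
```

The same split – authored waypoints, emergent wander – is what lets a scene hit its story beats while still feeling random.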
The Museum of Science, Boston has taken a deep dive into technology to excite visitors and inspire lifelong learning. “Particle Mirror,” which opened this year, sparks an interest in physics. Kids and adults alike gambol in a snowstorm, bounce giant colorful orbs and frolic through pixie dust – and none of it is real.
Created by Karl Sims, a digital media artist, computer scientist, and recipient of a MacArthur Fellowship, the wall-sized virtual mirror uses a Microsoft Xbox One Kinect depth-sensor camera to capture visitors’ motions and depth and project them into the AR environment. Sims, who previously founded the software company GenArts, developed the systems for “Particle Mirror” in C, OpenGL and OpenCL, running on a Linux computer with an NVIDIA GTX 1080 graphics card. Participants are led through a series of nine revolving scenarios, interacting with a changing variety of dots and spots, all following the laws of physics and the properties programmed into each. Gravity causes “snow” to fall from the top of the screens – and, just like real snow, it collects on heads and shoulders, and can be scooped into snowballs. Music enhances the magic – when particles collide, different sounds are generated. “The bubbly effect makes bubbly sounds. They gurgle when pushed,” Sims explains. “The goal is to inspire kids to learn more.”
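The snow behavior Sims describes comes down to particles falling under gravity until they hit a surface – the floor, or the silhouette of a visitor captured by the depth camera. This toy sketch (illustrative Python, not the exhibit’s C/OpenGL/OpenCL code; the grid size and the height-map “visitor” are invented) shows the settling logic:

```python
import random

# Toy "snow" effect: flakes fall under gravity and pile up where they
# hit an obstacle silhouette. A 1-D height map stands in for the
# depth-camera image of a visitor. Purely illustrative.
WIDTH, HEIGHT = 20, 30
GRAVITY = 1  # cells per tick

# silhouette[x] = height of the "visitor" in column x (0 = open floor)
silhouette = [0] * WIDTH
for x in range(8, 12):        # a head-and-shoulders bump in the middle
    silhouette[x] = 12

snow_depth = [0] * WIDTH      # how much snow has piled up per column

def drop_flake(x):
    """Let one flake fall in column x until it lands."""
    y = HEIGHT
    floor = silhouette[x] + snow_depth[x]
    while y - GRAVITY > floor:
        y -= GRAVITY          # gravity pulls the flake down
    snow_depth[x] += 1        # flake settles, raising the pile

random.seed(42)
for _ in range(200):
    drop_flake(random.randrange(WIDTH))

# Snow sits higher wherever the silhouette is - heads and shoulders.
print(sum(snow_depth), "flakes settled")
```

In the real exhibit the same idea runs per-particle on the GPU, with the Kinect supplying a live 2-D silhouette instead of a fixed height map.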
The museum also houses Sims’s “Reaction-Diffusion Media Wall,” installed in 2016. He explains that the exhibit simulates “two chemicals that make emergent dynamic patterns.” Visitors at a kiosk manipulate patterns projected on 24 hi-def screens. To create the effects, Sims used consumer-gaming hardware: a graphics processor with 2,000 processing cores in a Linux machine.
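Reaction-diffusion systems of the kind behind the media wall are well studied; the Gray-Scott model is a textbook example of two “chemicals” whose interaction produces emergent patterns. This minimal sketch (illustrative Python/NumPy with standard textbook parameters – not Sims’s GPU implementation, whose exact equations aren’t public) runs a few hundred simulation steps:

```python
import numpy as np

# Minimal Gray-Scott reaction-diffusion: chemicals U and V diffuse and
# react, producing emergent patterns. Illustrative only; the exhibit
# itself runs on a GPU with thousands of cores.

def laplacian(Z):
    # 5-point stencil with wrap-around (toroidal) edges
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

def step(U, V, Du=0.16, Dv=0.08, feed=0.035, kill=0.065, dt=1.0):
    reaction = U * V * V                      # U + 2V -> 3V
    U += dt * (Du * laplacian(U) - reaction + feed * (1 - U))
    V += dt * (Dv * laplacian(V) + reaction - (feed + kill) * V)
    return U, V

n = 64
U = np.ones((n, n))          # chemical U everywhere
V = np.zeros((n, n))
V[28:36, 28:36] = 0.5        # seed a small square of chemical V
for _ in range(200):
    U, V = step(U, V)
print("pattern emerged:", V.max() > 0)
```

On the wall, each of the 2,000 GPU cores updates a patch of cells in parallel, and visitor input at the kiosk perturbs the chemical fields the way the seeded square does here.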
The Museum of Science, Boston develops most things in-house with a dedicated nine- or 10-person team, including a 3D designer, a physical-manipulative engineer, software developers and builders. According to VP of Exhibition Development Christine Reich, special effects excite and engage people in learning. “Neuroscience is now teaching us that emotions are the starting point for behavior. When people are in a heightened state, we can push them beyond their expectations. Triggering that emotional reaction leads to deeper learning. One way is through digital immersion.”
Equally important is referencing cultural touchstones. As Reich notes, the Museum is always trying to determine “how we can leverage pop culture to teach STEM (Science, Technology, Engineering and Mathematics).” Last year, their “Star Wars®: Where Science Meets Imagination” experience concluded its eight-year, 20-venue tour. Their “The Science Behind Pixar” exhibition elucidates the engineering and computer science behind animation technology. Seen by over 800,000 people thus far, the exhibition now travels in two versions, each with over 40 interactive components, including a simulation allowing visitors to program the grass in a movie scene to determine if blade density affects movement.
Understanding how actions affect our environment is the goal of the 2,500-square-foot interactive “Connected Worlds” at the New York Hall of Science. Visitors are surrounded by six interconnected ecosystems: Desert, Mountain Valley, Wetlands, Reservoir, Jungle and Grasslands. A 40-foot-high waterfall, rivers, indigenous plants and native creatures are projected onto the walls and floor, and environments thrive or die depending on what’s done to them. Infrared cameras react when practical logs, wrapped in reflective material, are used to divert the path of a virtual stream to feed or starve flora and fauna. Gesture-reading Kinect cameras react to the actions museum-goers take: whether interacting with animals or planting virtual seeds, every human action counts in how well the entire ecosystem works.
According to Geralyn Abinader, Creative Producer at NYSCI, “When you talk about systems thinking, it’s really hard for even adults to grasp the relationships in a living system. When an individual does an action, or there are aggregate behaviors that cause changes, sometimes the reactions are immediate, but sometimes they are over the long term. Sometimes they are nearby, sometimes on the other side of the world.” Visitors can see that if they plant enough seeds and supply enough water, animals will appear. If food sources dry up, animals will migrate elsewhere.
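The feedback loops Abinader describes – water feeds plants, plants attract animals, scarcity drives migration – can be modeled as a tiny simulation. This sketch is purely illustrative Python; every rule and threshold in it is invented for the example, not NYSCI’s logic:

```python
# Toy version of the feedback loop "Connected Worlds" teaches: plants
# need water, animals need plants, and animals migrate when food runs
# out. All thresholds here are made up for illustration.

class Environment:
    def __init__(self, name, water=0):
        self.name = name
        self.water = water
        self.plants = 0
        self.animals = 0

    def plant_seeds(self, n):
        self.plants += n

    def tick(self):
        # Plants thrive only while there is water to consume.
        if self.water > 0 and self.plants > 0:
            self.water -= 1
        else:
            self.plants = max(0, self.plants - 1)  # flora withers
        # Animals appear when food is plentiful, leave when it is gone.
        if self.plants >= 3:
            self.animals += 1
        elif self.plants == 0:
            self.animals = 0                       # fauna migrates away

desert = Environment("desert", water=5)
desert.plant_seeds(4)
for _ in range(3):
    desert.tick()
print("with water:", desert.animals, "animals")   # animals arrived
for _ in range(10):
    desert.tick()       # the water has run out; plants wither
print("after drought:", desert.animals, "animals")  # they migrated
```

The delayed second reaction is the point: the drought’s effect on the animals arrives several ticks after the water is gone, which is exactly the kind of long-term consequence the exhibit makes visible.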
Although based on real scientific models about the ways water systems work, the projected worlds are fanciful, filled with imaginary critters. “We wanted children to understand one or two important concepts and not get bogged down with expectations about what they were seeing,” offers “Connected Worlds” developer/designer Theo Watson, who, with his wife Emily, founded Design I/O, the company that designed and programmed “Connected Worlds.” “The more straightforward the characters were, the more kids said they liked them, but they had very little to say about them. The weirder the characters, the more they had to say. That helped us because it captured their interest and attention.” The exhibit planning included scientific collaboration, as Watson explains. “Some of the people who advised the UN on climate change were involved.”
The complex responsiveness of the exhibit was a technological challenge, as Watson attests. “Almost every aspect of this was on the verge of being impossible to do.” He developed the software on openFrameworks, built in C++. “Each of the six environments is a single projector connected to a single computer. The floor is another computer, but the content comes from seven projectors, so it feels like one continuous digital surface. All the environments talk to each other over a network. When a creature flies from the desert to the forest, the forest knows that and makes sure the bird shows up on the forest screen.” Design I/O used the Xbox Kinect camera, but the interaction and tracking aspects are customized. “Most of our installations are closed source. Sometimes we’ll solve really hard problems that would be helpful to millions of people, so we’ll open source that part of it.”
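The hand-off Watson describes – one environment telling its neighbor that a creature has crossed over – is, at heart, message passing between nodes. This sketch is hypothetical Python, with in-process queues standing in for the network; Design I/O’s actual networking code is closed source, and all names here are invented:

```python
import queue

# Sketch of how networked environments could hand off a creature:
# each environment owns a screen, and when a creature crosses its
# edge it sends a message so the neighbor can spawn it.

class EnvironmentNode:
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()   # stands in for a network socket
        self.creatures = []
        self.neighbors = {}

    def link(self, direction, other):
        self.neighbors[direction] = other

    def send_creature(self, direction, creature):
        # The creature leaves this screen and travels "over the wire".
        self.creatures.remove(creature)
        self.neighbors[direction].inbox.put(creature)

    def poll(self):
        # Spawn any creatures that arrived from neighboring machines.
        while not self.inbox.empty():
            self.creatures.append(self.inbox.get())

desert = EnvironmentNode("desert")
forest = EnvironmentNode("forest")
desert.link("east", forest)

desert.creatures.append("bird")
desert.send_creature("east", "bird")  # bird flies off the desert screen
forest.poll()                         # the forest knows, and shows it
print(forest.creatures)
```

With one computer per environment, a scheme like this keeps every screen authoritative over its own creatures while the whole wall behaves as one world.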
An added attraction in “Connected Worlds” is the “Living Library,” a 2-foot by 4-foot physical book. Look through the book and find an interesting section, and a hi-res camera mounted above will recognize the page and trigger an interactive digital display.
The Franklin Institute in Philadelphia has embraced immersive technology as a means of engagement and learning, with both VR and AR strategies. “We stay up on what the latest available technologies are,” explains Susan Poulton, the Institute’s Chief Digital Officer. “If the technology exists in the public, we want it here.” Under her guidance the museum has VR stations with HTC Vive and Oculus Rift headsets which transport visitors to the depths of the sea, into outer space, or inside the human body. The Institute also has four movable, 6-foot-tall, 10-person VR carts outfitted with 10 iPod touches, Samsung headsets, and access to the museum’s full array of 360-degree photos and videos, including all 26 of NASA’s VR properties. Visitors can download the app on their phones and access them at home, as well. To make sure that’s possible, the museum is giving away over 10,000 Google Cardboard viewers. Additionally, in a further commitment to immersive tech, two flight simulators have been upgraded to recreate the Apollo 11 landing in a 4D ride using HTC Vive.
“The real goal,” according to Poulton, “is providing technology accessibility. To experience VR and all the things it can do. But the big issue is throughput. How many visitors can we put through in a day? The VR stations are limited to 24 people an hour, so that’s approximately 100 a day. It’s a challenge finding great content that works for the rhythm and flow of the museum.”
The other prong in their strategy is augmented reality experiences. For example, the “Terracotta Warriors” exhibit opened in September. Visitors can download an app to access AR on their phones. “Activate the AR and hold the phone over a warrior,” Poulton explains, “and it might show the original weaponry, created with CGI. Some will show the chemical decay process or the original coloring of the statues.”
“We have to meet expectations,” she says. “This younger generation – the phone is a third eye, not a device. They need and will expect to use it, and if we’re not figuring that out now, 10 years from now, even 5 years from now, we’re in trouble.” She continues, “We are learning how to react to audience needs and do things quicker and not get so hung up on the details. We need hyper-relevant contexts – not just a hurricane exhibit, but a hurricane that’s happening right now in Japan. Museums are going to radically change.”
Even art museums are jumping aboard the technology train. Design I/O was commissioned to create two pieces for an interactive video wall at the Cleveland Museum of Art. In reveal mode, visitors’ movements effectively “paint” an item from the collection. Zoom turns each museum-goer into a living magnifying glass. As Watson details, “In reveal, you’re basically throwing paint strokes off your body. Zoom is the simplest idea. As you walk up to the video wall, you see an object from the collection. Your body position can zoom into it until it’s 10-20 times magnified.”
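The zoom interaction amounts to mapping a visitor’s distance from the wall onto a magnification level. A minimal sketch (illustrative Python; the distance range and the linear response curve are assumptions for the example, not Design I/O’s actual values):

```python
def zoom_factor(distance_m, near=0.5, far=4.0, max_zoom=20.0):
    """Map a visitor's distance from the wall to a magnification:
    far away = 1x, right up close = max_zoom. Illustrative mapping
    only; the real exhibit's curve and range are Design I/O's."""
    d = min(max(distance_m, near), far)   # clamp to the sensing range
    t = (far - d) / (far - near)          # 0.0 at far, 1.0 at near
    return 1.0 + t * (max_zoom - 1.0)

print(zoom_factor(4.0))   # standing back: no magnification (1.0)
print(zoom_factor(0.5))   # right at the wall: fully zoomed (20.0)
```

A real installation would smooth the tracked distance over time so the zoom doesn’t jitter with the depth camera’s noise.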
In an effort to entice young adults to appreciate older art, the Montreal Museum of Fine Arts has projected an enchanted garden in its Romanticism gallery. Leaves rustle and winds blow, creating an alluring room. The museum also developed a fascinating duo of exhibitions combining old-school craftsmanship with a complex projection system to convey the magic of a fashion icon. “The Fashion World of Jean Paul Gaultier: From the Sidewalk to the Catwalk,” whose five-year, 12-venue world tour ended last year, and the currently displayed “Love is Love: Wedding Bliss à la Jean Paul Gaultier” both feature mannequins that talk with realistic articulation – but they’re not animatronic.
Nathalie Bondil, Director General and Chief Curator of the museum, worked with UBU Theatre, an innovative company known for incorporating video masks into its productions. A selection of real people, including Gaultier’s top model and Gaultier himself, were tapped for the show. Bondil explains that first UBU artisans “made precise molds of their heads. Then they created a new head in plaster. After that, they recorded film of the same characters acting.” This part of the process was not easy. “Each person must combine the perfect features for the perfect 3D head. We cannot have someone with big gestures; the person can’t move their head left to right while talking, and must look straight in front of him.”
Then UBU devised a means to realistically project the talking head onto that person’s sculpted head mold. “Magic comes from the combination of the sculpture and the recording of the same person projected on the same head. A completely unique specific system had to be created to avoid bizarre effects. In Korea, they thought we hired people to be on stage.”
Stéphanie Jasmin, Artistic Co-director of UBU, adds, “There were tricks we developed. When we project an image, it’s a square. The image is flat. A face is not square – it’s 3D. And we needed to make invisible cuts [in the monologues] or there’d be a jump cut in the face!” The systems created were so precise that a team of technicians had to travel with the exhibition to set it up properly for each venue.
With their ability to engage and excite, special effects techniques are sure to expand throughout museum exhibitions. Wevr’s Rowell observes, “VR is a powerful form of storytelling. I’m a huge fan of film, television, video games, animation. What I get excited about with VR is that it’s a new way to communicate, to show people the story.”