A special effects legend and Wevr release their first nightmarish VR short together

Mad God VR is a nightmarish technical achievement

Nightmarish creatures stand in a brown hellscape | Tippett Studio/Wevr

Tippett Studio had never done VR before. But the team, led by industry legend Phil Tippett (Star Wars, Indiana Jones, RoboCop, Willow, Jurassic Park), isn't new to trying new things or to working with new technology and techniques.

Mad God VR, out today via the Wevr Transport app on Gear VR and the HTC Vive, is a dark, virtual reality take on Tippett's 2013 short film Mad God. The artists at Tippett Studio and Wevr wanted to see how stop-motion animation would look and feel when the viewer could look in any direction at any time while feeling like they were the same size as the hand-animated characters.

“This is a man whose unique vision has affected some of the most iconic films of our lifetime,” Anthony Batt, the co-founder and executive vice president of Wevr, told Polygon. “You couple that with the fact that he is introducing this almost forgotten practice of stop-motion animation into the medium. This installment may be short in duration, but there’s a lot of craft and skill that goes into creating stereoscopic stop-motion VR, and people who are students of the field will recognize what an incredible achievement it is.”

Tippett Studio/Wevr

But how the hell do you animate these physical characters by hand when the audience has complete control over where to look?

“Originally we’d intended to create the entire arena practically, building trapdoors by which the animators could pop into the scene to adjust the physical puppets for each frame,” Tippett told Polygon. “That idea did not go over well with the animators — considering we had to collect 360-degree photography, compounded by the laboriously animated frames themselves. That was going to wreck them, squatting and standing for each angle of each frame.”

Creating every frame physically would have also dramatically slowed down production while dramatically increasing the likelihood of small mistakes — you have nowhere to hide any imperfections in the 360-degree view of virtual reality — so that idea was scrapped early in production.

“We ended up building the arena separately and animating the characters on green screen using Dragon software, then putting each one on a card that was added to the scene in Nuke to achieve the level of parallax we needed while preserving the stop-motion feel,” Tippett explained.

And preserving the feeling of stop motion, of embracing the artifice of the process instead of trying to hide it, was a large part of the project. It was also a way to stand out from other VR experiments.

“We had twenty-something of these characters we call the shit men,” Tippett said in the official blog post. “They’re small six-inch stop motion characters that are made out of foam rubber with articulated skeletons and they are covered, I took cat hair from my vacuum cleaner at home and put that on their surface so every time an animator touched them it would disturb the cat hair. So the contour of the characters crawling all the time creates the kind of otherworldly distance.”

A character is animated by hand before being placed in the scene
Tippett Studio/Wevr

Technical issues in production are one thing, but telling the story itself proved challenging, as the presentation of VR isn’t like traditional film. The two-minute short is more emotionally evocative than narrative, and drawing the eye from scene to scene proved tricky.

“One of the things I was warned about was that in certain instances you need a bit of lag time between the sound cue that’s intended to change the viewer’s field of view, and the actual event you’re pointing them toward,” Tippett explained to Polygon. “With a cut you can do that immediately, but with VR you need to make it feel like the person is not being forcibly directed.”

You can get a sense of this challenge yourself by watching the trailer for the original Mad God short, which was meant to be shown on traditional screens. How would you direct these scenes if you couldn’t explicitly force, or at least tell, the viewer where to look? If you couldn’t cut from scene to scene?

You can achieve that with sound, and the fact that you can hear something coming — with time to look that way and anticipate the arrival of a new character — even adds drama to the scene. You don’t feel like you’re watching something; you feel as if you’re one of the voiceless characters in the scene.

That ability to look anywhere, and to hear the little details, even helps you anticipate the violent end of the short. There are flying bricks above you, and from time to time they nearly collide and angrily “honk” at each other.

“That honking sound effect was an important cue for the end of the experience, which is where the VR vs. flat-screen disconnect became very interesting for me,” Tippett said. There was an opportunity to use a bit of misdirection against the viewer and, although this story provides a bit of a hint about how the short ends, we won’t go into details.

“What’s interesting about the VR experience is that it is a private, non-shared experience; therefore everyone walks away to some extent with a different experience,” Tippett told Polygon.

You can show the short to one person while the next viewer waits in the same room, and that next viewer won’t be able to see what happens ... even if they can somewhat intuit the rhythm of the experience from the body language and reactions of the person inside virtual reality.

“What happened by accident underlined a sort of existential truth that can occur at the end of one’s life,” Tippett said. “There are generally only two ways to go out: One, you see the bus that’s going to hit you. The other, you don’t.”