What’s the best way to show off what it feels like to be in virtual reality? It’s an open question with no single right answer, at least not yet, and the best solution used to be something called mixed reality.
This is the process in which someone goes into virtual reality and is shot with a camera attached to a motion controller. Software then places the video image of the player into the game world. A recent sketch on Conan O’Brien’s show used this technique perfectly.
The problem is that mixed reality videos are complex, and not just in terms of software. You need a large area for the green screen, and you’re compositing live video into a virtual world. Owlchemy Labs — the developers of Job Simulator — are already tackling some of these problems.
But videographer Kert Gartner wanted to try a different approach for some of the trailers being released alongside the Oculus Touch, including Space Pirate Trainer and Fantastic Contraption.
His idea? Shooting the entire thing within virtual reality, using a motion-tracked virtual camera. The player would still be playing, and would be “recorded” live on set, but the camera would be shooting a near-photorealistic character model from within virtual reality. There is nothing “real” in the trailer at all.
This is how it was put together.
You start with a character model
Shooting a trailer in this manner is easier in some ways, but it requires a lot of technical work on the part of the developer upfront.
“Initially I suggested just doing a head model along with the hands to save on the amount of work it would take, but Dirk [Van Welden, Space Pirate Trainer’s developer] wanted to take it to a full on avatar, which I thought would be awesome if they could pull it off,” Gartner said.
“The idea of shooting in-game avatars is something we did in a smaller scope back on the Mixed Reality Fantastic Contraption trailer,” he continued. “There’s a few insert shots there that are all in-game that use the exact same technique, but I wanted to expand on it and see what we could do shooting a whole trailer this way with different focal lengths and camera moves like you would on a film set.”
Van Welden did manage to create a full avatar, but that wasn’t enough. The avatar had to move the same way the player moved, even though the game was only tracking three points of the human player: the head and both hands. The rest of the animations were extrapolated using a technique called inverse kinematics.
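The core math behind that extrapolation is worth a quick look. Space Pirate Trainer is built in Unity, and Van Welden’s actual implementation isn’t public, but the classic building block for posing an arm from just a hand position is two-bone inverse kinematics: given the shoulder and hand positions plus the two bone lengths, the law of cosines gives the elbow bend. Here is a minimal Python sketch of that idea (all names and numbers are illustrative, not the game’s code):

```python
import math

def solve_elbow_angle(upper_len, fore_len, shoulder, hand):
    """Two-bone IK sketch: given shoulder and hand positions plus the
    upper-arm and forearm lengths, return the elbow bend angle (radians)
    via the law of cosines."""
    dx = hand[0] - shoulder[0]
    dy = hand[1] - shoulder[1]
    dz = hand[2] - shoulder[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Clamp the reach so the arm neither overstretches nor fully collapses.
    dist = max(abs(upper_len - fore_len) + 1e-6,
               min(dist, upper_len + fore_len - 1e-6))
    # Law of cosines: angle at the elbow between upper arm and forearm.
    cos_elbow = (upper_len ** 2 + fore_len ** 2 - dist ** 2) / (2 * upper_len * fore_len)
    return math.acos(max(-1.0, min(1.0, cos_elbow)))
```

With the hand at nearly full reach the elbow angle approaches a straight 180 degrees; as the hand comes closer to the shoulder, the elbow folds. A full-body solver layers heuristics on top of this — guessing shoulder and hip placement from the headset — which is where most of Van Welden’s week of work likely went.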
Experimenting with VR Body IK and Animations today. #VR #madewithunity pic.twitter.com/UmmCGiMysN— Dirk Van Welden (@quarkcannon) September 26, 2016
“I'm guessing I've spent a week of work in total — without the modeling itself — so that's doable,” Van Welden told Polygon. “It might take another week to make it more user friendly.”
Once the character model and inverse kinematic system were done, they still had to tweak the camera system so they could “shoot” the trailer like a movie.
“Basically, there’s a slider to change the field of view so we could shoot with wide or telephoto lenses, and smoothing controls for both position and rotation,” Gartner said. “We also had options to turn off the score/lives indicators, the floor bounds, and the in-game ad stations so there wasn’t any distracting elements on screen.”
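The smoothing Gartner describes is typically just an exponential blend applied to the tracked pose each frame, which is what turns shaky hand-held controller motion into steadicam-like footage. A minimal Python sketch of the position half of that (the class and parameter names are assumptions for illustration, not Space Pirate Trainer’s code):

```python
class SmoothedCamera:
    """Illustrative sketch: exponential smoothing of a tracked camera
    position, with an adjustable field of view like the in-game slider."""

    def __init__(self, fov_degrees=60.0, smoothing=0.1):
        self.fov = fov_degrees      # lower FOV reads as telephoto, higher as wide
        self.smoothing = smoothing  # 0 = frozen, 1 = raw (unsmoothed) tracking
        self.position = None

    def update(self, tracked_position):
        """Blend the previous smoothed position toward the new tracked one."""
        if self.position is None:
            self.position = list(tracked_position)
        else:
            a = self.smoothing
            self.position = [p + a * (t - p)
                             for p, t in zip(self.position, tracked_position)]
        return self.position
```

Rotation gets the same treatment with quaternion slerp instead of a per-axis blend; the two sliders Gartner mentions would map to those two blend factors.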
This is the final version of the in-game camera controls they used to shoot the trailer:
Once the camera system was done and they had a character model that moved on a 1:1 basis with a live player, they could begin to shoot the trailer itself. This was done by mounting a standard Vive controller to an inexpensive gimbal system; no actual physical camera was ever used to shoot the trailer.
They acted as if the controller was a camera, however, and were able to frame shots by moving it around an actual human player while “shooting” the same thing inside of virtual reality.
“I’m really excited by the prospect of being able to do this kind of virtual cinematography in VR,” Gartner told Polygon. “This technique isn’t new - James Cameron did a lot of virtual cinematography on Avatar, but the setup they used was in the hundreds of thousands of dollars and infinitely more complex to operate. Now we can do basically the same thing for the cost of a Vive, another controller and a bit of development work. It’s incredible that we can shoot a live virtual action sequence in my basement.”
This is a time-lapse of the shooting process:
“There are two other techniques we used too. We attached a camera to the drones in-game to get some cool sweeping moves for ‘free’ without having to manually move the camera, and moving the camera manually with an Xbox controller,” Gartner said. “There’s a few of those wide shots peppered in the trailer and I think they really help sell the scope and scale of the environment you’re in and give a different perspective on the action.”
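Attaching a camera to a drone is conceptually simple: each frame, the virtual camera copies the entity’s position and facing, usually with a small offset so the subject stays in frame. A hedged Python sketch of that parenting trick (the function, names, and offsets are assumptions, not the game’s actual code):

```python
def follow_entity(entity_pos, entity_forward, offset_back=0.5, offset_up=0.2):
    """Illustrative sketch: place a virtual camera relative to an in-game
    entity (e.g. a drone), slightly behind and above it, sharing its
    view direction, so sweeping moves come 'for free' as the drone flies."""
    cam_pos = [
        entity_pos[0] - entity_forward[0] * offset_back,
        entity_pos[1] - entity_forward[1] * offset_back + offset_up,
        entity_pos[2] - entity_forward[2] * offset_back,
    ]
    return cam_pos, entity_forward  # camera position and look direction
```

In an engine this is usually just parenting the camera transform to the drone; the payoff is that the game’s own AI flight paths become free crane and dolly moves.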
Those amazing shots in the trailer that would seem to require some tricky programming or physically impossible movements on the part of the camera person? All they did was strap a virtual camera to an in-game enemy and shoot things from its point of view. This is one example of what that looks like in action:
The team also had to get creative with some of the camera angles, and would sometimes have the performer themselves hold the “camera” to simulate an over-the-shoulder effect.
They shot for three days to get everything they needed and, since they weren’t using a large crew or even studio space, they had plenty of time to take breaks, plan out shots and try different ideas.
Without any further building of anticipation, here is the finished trailer:
Compare that to the mixed reality demo O’Brien shot at YouTube’s offices, which required a much larger team and a full room dedicated to the green screen:
Van Welden thinks it would take about another week of work to make the model and inverse kinematics system user-friendly enough that he could release it to streamers, letting anyone film themselves playing the game in this manner. “A random dude with a headset on doesn't look as cool as a real 3D-modeled space pirate,” he told Polygon.
It’s amazing to think that shots like this are basically live motion capture using only three points of data, with the camera person and editor having enough freedom to cut the scene together in a way that feels cinematic and exciting. This isn’t how the game looks when you’re watching someone play it, this is how the game feels when you’re playing it.
While the in-game models aren’t quite as detailed, Gartner also used this method to shoot the latest trailer for Fantastic Contraption, which you can watch below:
Gartner has a few more details on his official blog post about the trailers, if you feel like taking a deeper dive into how they were put together and why shooting VR in strictly first-person views is so limiting. These are the early days of shooting what amount to short films in VR, and it’s only going to get better from here.
Space Pirate Trainer is available now on Steam and tomorrow through Oculus Home.