Gaming sensors and peripherals have come a long way since the original Wii’s waggle controls.
Handheld peripherals like the PlayStation Move are now more precise than they’ve ever been. The Xbox One’s Kinect can now detect color in addition to depth and heat. The Leap Motion sensor is able to track intricate finger movements. And mobile phone gyroscopes can even measure physical activity. Developers in the sensory gaming industry believe that the technology can be taken much further, though. Instead of improving on what we currently have, they believe sensory gaming platforms can open doors to new kinds of game design and experiences.
"There are already a number of things that are possible," said Tan Le, CEO of neuroengineering company Emotiv. "We’ve always dreamed of being able to control things with our mind, and it is possible to have some limited mental commands that have very distinct and very precise results."
According to Le, EEG headsets like the NeuroSky and Emotiv’s own headset can already detect if players are relaxed, frustrated or engaged. They can also reliably detect facial expressions and can differentiate between smiles, grimaces and eyebrow raises. Developers like the team behind Throw Trucks With Your Mind are taking advantage of this technology by giving their players Jedi-like powers, allowing them to use their minds to pick up and throw objects in the game. This is an exciting development for the sensory gaming community, but Le has bigger ideas for ways the technology can change games. For example, what if the player’s facial expressions translated into their avatar’s face? And what if games recognized the player’s facial expressions and responded accordingly? Sony Online Entertainment has already introduced such a feature into EverQuest 2, but Le sees potential for this to be widely implemented in more games.
"Say you meet a character in the game and you smile at them — it seems more natural than typing something to them," she said. "Being able to convert facial gestures in the game itself is something that doesn’t exist right now … [but it would] create a level of immersion that doesn’t really exist today."
Le also sees potential for neuro-gaming technology to play a role in dynamic difficulty adjustment. Instead of asking players to choose a difficulty level before the game begins, the game could detect when a player is frustrated, engaged or bored, and adjust the difficulty accordingly.
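The feedback loop Le describes could look something like the following minimal sketch. The function name, the score ranges, and the thresholds are all illustrative assumptions, not taken from any real EEG SDK:

```python
# Hypothetical sketch of dynamic difficulty adjustment driven by
# detected emotional state. The inputs stand in for the kind of
# frustration/boredom scores an EEG headset might report; the
# thresholds and 1-10 difficulty scale are illustrative assumptions.

def adjust_difficulty(difficulty, frustration, boredom):
    """Nudge difficulty down when frustrated, up when bored.

    frustration and boredom are assumed to be scores in [0, 1].
    """
    if frustration > 0.7:
        difficulty = max(1, difficulty - 1)   # ease off
    elif boredom > 0.7:
        difficulty = min(10, difficulty + 1)  # raise the challenge
    return difficulty

print(adjust_difficulty(5, frustration=0.9, boredom=0.1))  # eases to 4
print(adjust_difficulty(5, frustration=0.1, boredom=0.9))  # rises to 6
```

In practice a game would call something like this periodically, smoothing the sensor readings over time rather than reacting to a single spike.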
On the motion-control front, Leap Motion co-founder and chief technology officer David Holz said motion tracking could become even more precise than what we have today, which could allow players to achieve things in games that are currently not possible.
"It’s worth thinking about it in terms of dimensionality," he said. Instead of only tracking hand movements based on whether the hands are moving up, down, left, right, forward or backward, imagine technology that can track the various angles and joints, and capture 22 dimensions of the hand. "It’s such a departure from what we’ve had in the past," he said.
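To make the dimensionality point concrete, here is one hypothetical way a 22-dimensional hand pose could be laid out as a flat feature vector. The exact breakdown (palm position, palm orientation, joint angles) is an assumption for illustration, not Leap Motion's actual model:

```python
# Illustrative sketch: a hand pose as a 22-dimensional feature vector.
# The split into 3 position + 3 orientation + 16 joint-angle values is
# an assumed breakdown, not Leap Motion's real representation.

from dataclasses import dataclass


@dataclass
class HandPose:
    position: tuple      # (x, y, z) of the palm -- 3 dims
    orientation: tuple   # (pitch, yaw, roll) of the palm -- 3 dims
    finger_joints: list  # bend angle at each tracked joint -- remaining dims

    def as_vector(self):
        """Flatten the pose into a single feature vector."""
        return list(self.position) + list(self.orientation) + self.finger_joints


pose = HandPose(
    position=(0.0, 120.0, -30.0),
    orientation=(0.1, -0.4, 0.0),
    finger_joints=[0.2] * 16,   # e.g. 16 joint angles across five fingers
)
print(len(pose.as_vector()))  # 22
```

Compare that with a classic motion controller, which effectively reports only the first few of those dimensions.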
Holz used Minecraft as an example of a game that could benefit from accurate hand gestures. What if players could grab two blocks with their hands and merge them together? Or what if, when controlling a swarm of fighter jets, players could create complex formations and trace shapes like three-dimensional helixes using hand gestures?
Developing neuro-gaming technology can be risky, though, because it’s difficult to tell if it will catch on with players. Le said there is the risk of "missing the boat" by "being too early." But it’s a risk that Emotiv and other developers are willing to take.
"The pragmatic side says make what the consumer wants," she said. "But there’s a passion inside me that says let’s create something I love and see where it goes."