Even hardcore gamers may be interested in some of the ways the Xbox One's Kinect can augment gameplay.
In a demo today, Microsoft's Jeff Henshaw showed how the Kinect can be used to carefully watch a gamer's movements, tracking them to do things like raise a shield in a firefight, activate an X-ray mode, or call in artillery strikes.
The stripped-down first-person shooter Henshaw used to show off some of these new features was played like most games in the genre, using the thumbsticks to move and aim, and triggers to fire. But when Henshaw raised his controller up in front of him to shoulder level, a shield popped up to protect him from enemy fire.
Tapping his temple with his right index finger toggled an X-ray view, and leaning left or right caused his character to side-step incoming attacks. Henshaw said that the Kinect isn't just looking at body movement; it is "literally looking at your spine as if it's a third thumbstick."
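The gestures described above amount to mapping tracked skeleton data to discrete game actions each frame. Here is a minimal sketch of how such a mapping could look; the joint names, coordinate conventions, and thresholds are illustrative assumptions, not Microsoft's actual Kinect API.

```python
# Hypothetical sketch: mapping Kinect-style skeleton data to in-game actions.
# Joint names, (x, y) coordinates (y increasing upward), and thresholds are
# assumptions for illustration only.

def gesture_action(joints):
    """Return a game action for one frame of tracked joint positions.

    `joints` maps joint names to (x, y) positions in a normalized space.
    """
    # Controller raised to shoulder level -> raise shield.
    if joints["hands"][1] >= joints["shoulders"][1]:
        return "raise_shield"
    # Right index finger near the temple -> toggle X-ray view.
    hx, hy = joints["right_hand"]
    tx, ty = joints["head"]
    if abs(hx - tx) < 0.05 and abs(hy - ty) < 0.05:
        return "toggle_xray"
    # Spine lean left or right -> side-step (the "third thumbstick").
    lean = joints["spine_top"][0] - joints["spine_base"][0]
    if lean < -0.1:
        return "step_left"
    if lean > 0.1:
        return "step_right"
    return None
```

In practice a real recognizer would smooth joint positions over several frames before firing an action, so a single noisy reading doesn't raise the shield by accident.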
In a later round, Henshaw pointed at flying enemies on the screen to paint them with targeting reticles and then said "fire missile" to automatically fire off a barrage of missiles at the enemies.
"When I play a traditional first-person shooter, most of the time my hands stay on the controller," he said. "By leveraging subtle gestures and combining them with hardcore controls, the Kinect becomes meaningful even for hardcore gamers."
Earlier, Henshaw walked the room full of journalists through a demo that he said was built to show off how cloud processing can be used to augment the power of the Xbox One.
He said Microsoft engineers used data collected by NASA to create a simulation that showed the position of asteroids in space.
Using just the console's processing power, the app was able to calculate and render the movements of about 40,000 asteroids. But once the Xbox One started using Microsoft servers to augment the processing, he said, that number jumped to more than 300,000.
He said they were using data centers from around the world to help with those calculations, "piping in about 500,000 data calculations per second."
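The jump from roughly 40,000 to more than 300,000 asteroids amounts to partitioning each frame's position updates between the console and remote servers. The sketch below uses the capacity figures quoted in the demo, but the simple fill-local-then-overflow-to-cloud split is an assumption, not Microsoft's actual architecture.

```python
# Illustrative sketch (not Microsoft's architecture): dividing per-frame
# asteroid updates between the console and cloud servers. Capacities use
# the figures from the demo; the overflow strategy is an assumption.

CONSOLE_CAPACITY = 40_000   # asteroids the Xbox One alone could simulate
CLOUD_CAPACITY = 260_000    # additional asteroids the servers could absorb

def partition_updates(n_asteroids,
                      console_capacity=CONSOLE_CAPACITY,
                      cloud_capacity=CLOUD_CAPACITY):
    """Return (console_share, cloud_share, dropped) for one frame."""
    console = min(n_asteroids, console_capacity)
    cloud = min(n_asteroids - console, cloud_capacity)
    dropped = n_asteroids - console - cloud  # beyond combined capacity
    return console, cloud, dropped
```

Under this model the combined capacity is 300,000 bodies per frame, matching the scale of the demo; anything beyond that would have to be dropped or simulated at a lower fidelity.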
That cloud processing, Henshaw said, will be used to help develop bigger, more detailed persistent worlds and experiences.