
Apple's iOS controller API isn't solving a problem, it's opening a door to the future

Ben Kuchera makes a solid case that the iPhone doesn't need buttons for gaming.

He's right — it's solving a problem that doesn't exist. But that's not the point. It's not about the buttons. It's not about making the iPhone better at gaming. It's about making the iPhone better at everything.

Ben talks about how this first wave of button-based iPhone controllers feels like something you'd find at Kentia Hall at E3 — a place where people are hawking all kinds of insane, and mostly useless, hardware fever dreams. One of the beautiful things about walking through Kentia Hall is that sometimes you can see past the cheaply made hardware and the guy in the ill-fitting suit, and instead you can see the dream.

So rather than talking about a button-based interface for a machine that doesn't need buttons, let's instead talk about a dream. A dream that you can take the computer in your pocket and turn it into anything you want.

Let's jump a little bit into the future. Time's up at work, so you pack up your laptop and head home. On your way, you get a text on your phone. You play a game. You get home, plop down on your couch, and fire up the TiVo on your big screen. Switch to Netflix. Play a game on your PS4. Maybe find a new recipe for dinner on your iPad, or order food online. You could do all of that from five or six separate devices, or you could do all of it from your iPhone.

The catch is, you don't want to do all of that from your phone, because the screen is tiny and the interface sucks.

You love your big screen. You love your physical keyboard. You love your game pad. Heck, more than any of those things, you love the spaces you've made to experience those things, and the particular ways in which you interact with the different pieces of hardware that facilitate the different elements of your life.

Change the interface, not the device

It might seem strange to imagine doing all of that from your phone, but why? It's a monstrously powerful CPU that you carry around with you all the time. The limiting factor is how you can interact with it.

A world where all your interactions with your phone happen via voice commands is a ways off, if it's desirable at all. A world where your displays are all piped into your eyes via optical implants isn't happening any time soon. For all the innovation in the interface space, such as Google Glass or Myo, the simple fact is that it's still very difficult to interact with the CPU in your pocket in the way that you want to interact with it.

Let's imagine a slightly different world than before.

You sit at your desk, working. In front of you is a screen and a keyboard. There is no computer. Your screen is a mirror of your iOS device in "work" mode. You can do anything you could do in OS X; it's just running on your phone. When you're done with work, you leave with your phone in your pocket. Your screen, keyboard, and mouse stay behind.

On the train, you play your iPhone games. Touchscreen-specific games, maybe running leaner graphics to save battery: experiences designed from the ground up to be mobile, played wherever you are in small chunks.

When you get home, you plop down on the couch and turn on the TV. Not broadcast or cable: you tell Siri to watch NBC, and NBC is streamed through your phone to the screen. But now you want to play a game. Pick up the controller on your table, which is functionally the same as an XB1 or PS4 controller, and immediately you're in game mode. No need to tell it anything; you already told it you want to play by picking up the controller.

Now you have a selection of controller-enabled games. Your Halos and Killzones, GTs and Forzas, all the things that don't work well on a touchscreen. But you don't have a library of physical games. You don't even have a console. You're running the games off the phone in your pocket, but you're using a familiar controller on an amazing screen with awesome surround sound on your couch.

What I'm asking you to imagine is a future where you only use one computer. Where a screen is just a display, and a controller is just an interface. Instead of a specialized device for everything you want to do, you have one CPU with myriad interfaces. A future where your computer displays to whatever screen is appropriate for what you're doing, and you interact with it using whatever interface is best for the job.


The craziest thing is that this isn't even that ambitious a pipe dream. The only things preventing it from happening right now are widespread adoption of AirPlay mirroring, some tweaks to iOS, and new controller interfaces. This could happen tomorrow with a concerted effort from the right developers, with nothing more than a slight adjustment of how Apple lets people interact with their devices.

I believe that a standardized controller API isn't a misstep, even if it's solving a problem that doesn't exist. It's the first step in a complete re-imagining of which computers we use and how we use them. It's a dream about sowing the seeds of a revolution.

One I'd happily press A to start.

Seppo Helava co-founded Self Aware Games, whose games include Taxiball, Word Ace, Fleck, Big Fish Bingo and Big Fish Casino. His team's hard at work on something new and different. To keep an eye on it, join them here.
