Games displayed at 1080p while running at a smooth 60 frames per second have become part of the marketing for many new games.
It's a magical bullet point on a list of features that proves that your game is next-generation, and there are groups of players online who sneer at anything that can't quite hit that goal.
Diablo 3 on the Xbox One, for instance, couldn't quite get there. The original response was to drop the resolution down to 900p and keep the frame rate. A smooth game, many would argue, is worth sacrificing a few lines of resolution. The problem is that this solution created a PR problem for Microsoft, a company that is scared of consumers seeing the Xbox One as a less-powerful console compared to the PlayStation 4.
"That's what we demoed and were showing around E3 time. And Microsoft was just like, 'This is unacceptable. You need to figure out a way to get a better resolution.' So we worked with them directly, they gave us a code update to let us get to full 1080p," Blizzard's John Hight told Eurogamer.
The results were positive, and they show the power of a platform holder invested in helping a developer deliver the best possible game.
The frame rate isn't locked at 60fps
"The good news is that Blizzard has indeed achieved a full 1080p resolution on Xbox One, and has done so with no impact to visual quality. This is aside from what we suspect may be some light tweakery to the shadow maps, an occasional, subtle change that takes effect on the updated build too," Eurogamer reported. "From an image quality perspective, though, first impressions suggest the two console versions are now absolutely identical — a point we hope to elaborate on once we dissect a greater breadth of areas for the full comparison."
Here's the problem: The frame rate isn't locked at 60fps, and in moments with a lot of monsters or effects onscreen, frames are dropped. The report goes to great lengths to point out that both issues, the lowered resolution from the first pass and the dropped frames that you see after installing the day one patch, are subtle and won't necessarily impact the game's play in a major way, but we don't have a choice in which one we prefer.
"[The dropped frames are] not game-breaking, and many players may not even notice — but it's clear that a 44 per cent boost to resolution doesn't come for free: in the same scenarios, the 900p version proved smoother," Eurogamer stated.
The marketing demands 1080p resolution, so that's what the market gets. I would prefer a smoother frame rate over the increase in resolution, but I don't get to make that call. The talking points of that resolution and that target frame rate are going to decide how we play our games, for better or worse.
I think it's worse. Or better yet, I think the player should be given a choice.
This isn't a new argument
I've written about the many options given to PC players when it comes to visuals, and some of them are beginning to sneak over to the console side of things.
You can lock the frame rate of Killzone: Shadow Fall at 30 fps if you'd like, although the controversy over whether or not that game runs at 60 fps while displaying a 1080p resolution has led to a lawsuit. Because to some people, it's that important, apparently. The PS4 release of The Last of Us also allows you to lock the frame rate; Sony seems to be comfortable giving at least a small bit of control to the player.
But let's swing back to the Diablo example, because it's recent and shows what we're dealing with here. There are two types of gamers out there: those who care about these numbers, and those who don't. The ones who don't care just want a game that looks good and plays smoothly. The actual pixel count or frame rate targets don't matter, or they matter less.
The other type of gamer cares about these numbers dearly, and they want the absolute best experience in their games. Fair enough. This is the gamer Microsoft worries about when it tries to improve games that perform better on the PS4, but dropped frames can be a pain in the butt in games like Diablo, and there's no way this hyper-plugged-in gamer isn't going to read about them.
So the solution, to up the resolution and deal with the hit to frame rate from time to time, is imperfect. It's marketing, and the gamer that cares will read reports that discuss exactly what happened. The benefit of hitting these arbitrary targets is lost, and we still don't have a say in which way we'd like to play the game.
So why not give us a choice? I'd much rather be able to choose between increasing the resolution and losing frames if I can't have it both ways, and Microsoft pressuring companies to provide these bullshit numbers takes that choice away from us. The fact that these stats are used in marketing isn't a bad thing; what's annoying is when hitting those targets becomes justification for decisions that could impact the play itself.
Sony successfully set the terms for this fight when it stressed how many games are running at the desired resolution and frame rate, and it's a battle the more-powerful PS4 is well-equipped to fight. Microsoft is playing a dangerous game fighting back in this area, and the differences in graphical fidelity between future games may not be as subtle as what we see in the latest version of Diablo.
We're going to see more of this as the consoles fight each other, and the only way to really escape it is to play on the PC where you can pour money into new components and not have to choose between resolution and performance. But then many gamers prefer the direct control of the characters you get from using a console-style controller. There's no option to play this version of Diablo on the PC, so players have a choice to make.
In the case of Diablo 3? I'm playing it on the PS4.