Why frame rate and resolution matter: A graphics primer

Samit Sarkar (he/him) is Polygon’s deputy managing editor. He has more than 15 years of experience covering video games, movies, television, and technology.

Graphics have always been the foremost battleground in the console wars. The participants in those never-ending debates try to bring some objectivity to the fight by quoting numbers, and when it comes to visual prowess, resolution and frame rate are the two figures most commonly cited.

But what do they even mean? What's the difference between 720p and 1080p, or between 30 frames per second and 60 frames per second — and is it an academic distinction, or a meaningful one? In other words, why should you care?

What are resolution and frame rate, anyway?

Let's begin with some basic definitions, for the uninitiated.

Frame rate

Standard video, such as film or broadcast television, consists of still images that are captured consecutively and played back in quick succession. A "frame" is a single one of those images, and the "frame rate" is a measure of frequency: how often the video is updated with a new frame. For video games, that translates to the frequency at which the game redraws the image on the screen. Frame rate is measured in frames per second (fps).

Refreshing the frame must happen very quickly in order for the player to perceive motion. How quickly, you ask? Movies are traditionally filmed and exhibited at 24 fps. Television in North America is broadcast at 30 fps or 60 fps. Developers we spoke with for this article indicated that 30 fps is a general baseline that game makers don't want to fall short of — below that threshold, things start to look choppy, like the audio from a skipping record.

A game's frame rate is separate from the screen it's being displayed on. Displays have their own frequency: the "refresh rate," or how often the device (such as a TV or monitor) refreshes its screen; this is measured in hertz (Hz), where 1 Hz is one cycle per second. Most modern TVs and monitors have a refresh rate of 60 Hz, so the optimal situation is for an image source (like a game console or Blu-ray player) to deliver a frame rate that divides evenly into 60. Think about it this way: A standard TV refreshing at 60 Hz would go through all 60 frames of a 60 fps feed in a single second, one frame every one-sixtieth of a second. The same TV would show each of the 30 frames in a 30 fps feed twice in a row, leaving each frame on screen for one-thirtieth of a second.
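
For the sake of illustration, here's a quick back-of-the-envelope sketch in Python (our own, not something from the developers we spoke with) that works out how long a 60 Hz display holds each frame of a 30 fps or 60 fps feed, assuming the frame rate divides evenly into the refresh rate:

```python
# Minimal sketch: how long each frame stays on screen for a display refreshing
# at 60 Hz, assuming the frame rate divides evenly into the refresh rate.
REFRESH_RATE_HZ = 60

def refreshes_per_frame(fps: int, refresh_rate: int = REFRESH_RATE_HZ) -> int:
    """Number of refresh cycles the display spends showing each frame."""
    if refresh_rate % fps != 0:
        raise ValueError("frame rate does not divide evenly into the refresh rate")
    return refresh_rate // fps

for fps in (60, 30):
    cycles = refreshes_per_frame(fps)
    print(f"{fps} fps on a {REFRESH_RATE_HZ} Hz display: each frame is shown for "
          f"{cycles} refresh cycle(s), or {cycles * 1000 / REFRESH_RATE_HZ:.2f} ms")
```

Run it and you get one refresh cycle (16.67 milliseconds) per frame at 60 fps, and two cycles (33.33 milliseconds) per frame at 30 fps.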

Thus, games run into problems on most displays when they're not running at 30 or 60 frames per second. If a game ignores a screen's refresh rate and runs at whatever frame rate it can manage — a state known to PC gamers as "v-sync off" — it causes screen tearing (parts of multiple frames are visible at once). If you turn v-sync (vertical synchronization) on, which caps the frame rate at the display's refresh rate, it can cause stuttering and input lag whenever the frame rate falls below that refresh rate.
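
To make the v-sync behavior a little more concrete, here's a deliberately simplified Python sketch (it ignores buffering and pipelining, and the render times are hypothetical) showing why a frame that misses the refresh window leaves the previous image on screen for an extra cycle:

```python
import math

# Simplified model: with v-sync on, a finished frame waits for the next screen
# refresh before it can be shown. Buffering and pipelining are ignored here.
REFRESH_INTERVAL_MS = 1000 / 60  # one refresh every ~16.67 ms at 60 Hz

def refreshes_until_shown(render_time_ms: float) -> int:
    """How many refresh intervals pass before a frame with this render time appears."""
    return math.ceil(render_time_ms / REFRESH_INTERVAL_MS)

for render_ms in (15.0, 16.0, 18.0, 33.0):
    n = refreshes_until_shown(render_ms)
    print(f"{render_ms:4.1f} ms render -> shown after {n} refresh(es); "
          f"previous image stays up for ~{n * REFRESH_INTERVAL_MS:.2f} ms")
```

In this model, any frame that takes even slightly longer than 16.67 milliseconds to render isn't shown until the refresh after next, and that held-over image is the stutter described above.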

Resolution

The size of an image is known as its "resolution." Modern widescreen displays feature an aspect ratio of 16:9 (the width is 1.78 times the height), and resolution is literally a measurement of the width and height, in pixels, of an image. The minimum resolution that qualifies as "high definition" is 1280x720 ("720p" for short), while the higher-fidelity standard of most new HDTVs is 1920x1080 ("1080p"). Just as with megapixels in a digital camera, a higher pixel count — higher resolution — provides more detail in an image.

The "p" stands for "progressive," shorthand for "progressive scan," a technique in which a display draws every line of a single frame sequentially, from top to bottom, within the space of a single refresh cycle (one-sixtieth of a second). This is different from the older method, "interlaced" scanning, in which TVs alternated between drawing the odd- and even-numbered lines of the image, refreshing each set every one-thirtieth of a second. Progressive scan makes for a smoother image that's less susceptible to the flickering that interlaced video suffers from.
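
Here's a toy Python sketch of the difference (not how any real display is implemented, just which scan lines each method covers in a single pass):

```python
# Toy model of the two scan methods on a display with just eight lines.
LINES = list(range(1, 9))  # line numbers 1..8, top to bottom

def progressive_pass():
    """Progressive scan: every line is drawn in each refresh cycle."""
    return LINES

def interlaced_pass(field: int):
    """Interlaced scan: odd-numbered lines in one field, even-numbered in the next."""
    wanted_remainder = 1 if field % 2 == 0 else 0
    return [line for line in LINES if line % 2 == wanted_remainder]

print("progressive, every cycle:      ", progressive_pass())
print("interlaced, first field (odd): ", interlaced_pass(0))
print("interlaced, second field (even):", interlaced_pass(1))
```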

A 1080p image contains 2.25 times as many pixels as a 720p image. So it's notably tougher for a game to generate a 1080p image than a 720p image. The PlayStation 4, Wii U and Xbox One are all capable of outputting games in 1080p. Sony and Microsoft have actually been touting the availability of 1080p gaming since the start of the previous console generation; the PlayStation 3 and Xbox 360 could also do 1080p. But we're very early in the life cycle of the PS4 and Xbox One, and at this point, games that run in native 1080p are relatively rare — especially on the Xbox One, although that's an argument we'll skirt here.
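
The 2.25 figure is simple arithmetic; here it is spelled out in Python:

```python
# The pixel counts behind the 2.25x figure.
RESOLUTIONS = {"720p": (1280, 720), "1080p": (1920, 1080)}

pixel_counts = {name: width * height for name, (width, height) in RESOLUTIONS.items()}
for name, count in pixel_counts.items():
    print(f"{name}: {count:,} pixels")
print(f"1080p / 720p: {pixel_counts['1080p'] / pixel_counts['720p']:.2f}x")
```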

Instead, some games are rendered at a sub-1080p resolution in order to maintain the visual fidelity of the game; the console then upscales the image to 1080p before it sends the picture to the TV. For example, the PS4 version of Watch Dogs runs at 900p (1600x900), while the Xbox One version runs at 792p (1408x792). Developers make those decisions depending on the game and console in question.
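
Using those published figures, a little more arithmetic shows how much of a full 1080p frame each version actually renders before upscaling (a rough sketch; it says nothing about how good the upscaled image looks):

```python
# Share of a full 1080p frame rendered natively, using the Watch Dogs figures above.
FULL_HD_PIXELS = 1920 * 1080

versions = {"PS4 at 900p": (1600, 900), "Xbox One at 792p": (1408, 792)}
for label, (width, height) in versions.items():
    native = width * height
    print(f"{label}: {native:,} pixels rendered, "
          f"{native / FULL_HD_PIXELS:.0%} of a native 1080p frame")
```

That works out to roughly 69 percent of a native 1080p frame for the PS4 version and about 54 percent for the Xbox One version.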

How do developers prioritize frame rate and resolution?

Naughty Dog's Cort Stratton is a senior programmer on Sony's ICE team, which develops graphics technology that is shared across Sony Computer Entertainment's first-party studios and with third-party developers. According to Stratton, frame rate and resolution are related, but it isn't as simple as them being inversely proportional.

Generally, resolution is entirely under the purview of the GPU. Stratton provided this simplified explanation: "The CPU sends the GPU a list of objects to draw and a resolution at which to draw them; the GPU hunkers down and runs the dizzyingly complex calculations to figure out the appropriate color at each pixel." For example, doubling the resolution wouldn't affect CPU performance, but it would require the GPU to pump out four times as many pixels.
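
A deliberately crude model of Stratton's description (the "work units" and object counts here are made up for illustration, not real profiling data) makes the asymmetry easy to see:

```python
# Crude model: CPU cost tracks the number of objects submitted for drawing,
# while GPU cost tracks the number of pixels that have to be shaded.
def cpu_work_units(object_count: int) -> int:
    """Stand-in for CPU cost: one unit per object in the draw list."""
    return object_count

def gpu_work_units(width: int, height: int) -> int:
    """Stand-in for GPU cost: one unit per pixel to color."""
    return width * height

OBJECTS_IN_SCENE = 5_000  # made-up number of objects submitted per frame
for width, height in ((960, 540), (1920, 1080)):
    print(f"{width}x{height}: CPU work {cpu_work_units(OBJECTS_IN_SCENE):,} units, "
          f"GPU work {gpu_work_units(width, height):,} units")
# Doubling both dimensions leaves the CPU's list unchanged but quadruples the GPU's pixels.
```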

"While it's true that rendering at higher resolutions is more work for the GPU, this only affects the overall frame rate if rendering is the main performance bottleneck. It's often possible to increase resolution (to a point!) without affecting the frame rate," Stratton explained. "So, it's certainly not as simple as a dial that developers can turn, with 'silky-smooth [first-person shooter]' at one end and 'jaggy-less 4K resolution' at the other."

Peter Thoman, who is known for making a Dark Souls mod that vastly improved the graphics in the game's Windows PC port, agrees.

"While increasing the resolution only increases GPU load, increasing the frame rate also increases the CPU load significantly," Thoman told Polygon. "So in cases which are CPU-limited (or limited on the GPU by some very specific factors), you might be able to increase resolution while not affecting (or only slightly affecting) frame rates."

While the impact of resolution on performance depends on a variety of factors, there's a concrete measure of the difference between frame rates. The "frame time" is the time it takes to execute a single frame, and is generally expressed in milliseconds. At 30 fps, developers have one-thirtieth of a second, or 33.33 milliseconds, to render each frame. Doubling the frame rate to 60 fps cuts the frame time in half to one-sixtieth of a second, or 16.67 milliseconds. It takes time to render everything on the screen — objects, particle effects like explosions, visual effects like antialiasing and more — so whatever the target frame rate is, the total rendering time can't exceed the corresponding frame time.
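
Here's that budget math in Python, along with a hypothetical set of per-frame rendering costs (the passes and timings are invented for illustration) checked against both targets:

```python
# Frame time budgets for the two common frame rate targets.
for fps in (30, 60):
    print(f"{fps} fps target: {1000 / fps:.2f} ms to render each frame")

def fits_budget(pass_times_ms, target_fps):
    """True if the combined per-frame rendering work fits within the frame time."""
    return sum(pass_times_ms) <= 1000 / target_fps

# Hypothetical per-frame costs: geometry, particles, anti-aliasing, post-processing.
passes = [18.0, 6.0, 4.0, 3.0]
for fps in (30, 60):
    verdict = "fits" if fits_budget(passes, fps) else "over budget"
    print(f"{sum(passes):.1f} ms of work against the {fps} fps budget: {verdict}")
```

In this made-up example, 31 milliseconds of rendering work comfortably fits a 30 fps budget but blows well past a 60 fps one, which is exactly the kind of gap that forces the cuts Stratton describes below.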

In Stratton's experience, developers decide on a frame rate target, not a resolution, and go from there. Simply making the game you want to make and seeing where the frame rate and resolution end up wouldn't be a good use of development resources. Although Stratton said he hasn't been personally involved in that decision-making process, he listed a few potential factors: the technical limitations of the hardware and engine, a desire to keep up with competing games, a studio's history and the art department's creative vision.

"For all the games I've worked on, the frame rate target is the fixed point against which all other performance decisions are made," said Stratton. "If your target is 30 fps, then you cut whatever corners are necessary to hit 30 fps as consistently as possible — reduce the scene complexity, use cheaper/lower-quality algorithms or yes, render at a lower resolution." Stratton noted that he has heard of some developers choosing to drop from 60 fps to 30 fps rather than make cuts in visual quality. But usually, the frame rate target is "the line in the sand that doesn't get crossed."

Do developers care, and do they think others should?

When it comes to frame rate and resolution, Stratton told Polygon that his personal bare minimum is 720p at 30 fps, since "anything below those levels is actively unpleasant." However, he continued, "anything higher isn't necessarily a clear improvement." Asked to choose one or the other, he said he'd probably prefer a higher resolution than a higher frame rate, since he "[doesn't] personally place much value on frame rates above 30 fps."

This is a your-mileage-may-vary area of graphics because your personal preference is likely to depend on the kinds of games you play. The higher the frame rate, the lower the theoretical input lag (the delay, in milliseconds, between entering a command on a controller and seeing an action on the screen). According to an analysis of responsiveness originally published in Game Developer magazine, the real-world minimum input lag for a 60 fps console game is three-sixtieths of a second, or 50 milliseconds — a figure that is doubled to one-tenth of a second, or 100 milliseconds, for a 30 fps console game. That gap can make a difference in games that depend on very responsive controls.
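
Here's how that figure converts from frames to milliseconds, assuming the roughly three-frame minimum the analysis describes:

```python
# Converting the minimum-input-lag figure from frames into milliseconds,
# assuming the roughly three-frame floor cited in the analysis.
MIN_LAG_FRAMES = 3

for fps in (60, 30):
    frame_time_ms = 1000 / fps
    print(f"{fps} fps: {MIN_LAG_FRAMES} frames of lag = "
          f"{MIN_LAG_FRAMES * frame_time_ms:.0f} ms before input shows on screen")
```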

"I acknowledge that ultra-low input latency has tangible benefits for certain genres of games (fighters, multiplayer shooters, etc.), but those aren't the games I tend to be playing," said Stratton.

"In most cases, I'd rather a team uses the extra 16 milliseconds per frame," he continued, referring to the difference in frame time between games at 60 fps and 30 fps, "to make the game world significantly more beautiful/fun/rich/complex/awesome, instead of wasting it all to make the game marginally more responsive to player input."

Thoman has a higher threshold for performance than Stratton's 30 fps, but his opinions are otherwise similar.

"What's more important to me depends on the genre. In anything but very reaction-based action games I can live with 45+ fps, and then try to increase [image quality] as much as possible while staying above that boundary," he said. "I'm generally not that hung up on frame rates above ~45 [fps]."

Even in the land of facts and figures, personal preferences play a major role in discussions about visual quality. Both Thoman and Stratton agreed that people argue over the importance of frame rate and resolution partly because they're trying to justify their investment in a particular console. But according to Thoman, these figures do matter beyond serving as ammunition in the console wars, and they should matter.

"I strongly believe that gamers should care about resolution and frame rate, because the former makes games look much better and the latter does that, and in addition makes them more playable," he said. "I'm always surprised when publishers claim that resolution differences don't matter — if that is the case, then why are most of them sending out screenshots rendered at 8K?"

"I don't believe the average consumer can tell the difference between 720p and 1080p"While a higher-resolution image is ostensibly better than a lower-resolution one, it's more difficult to distinguish between two images when they're both in HD — especially considering how far most people sit from their TVs.

"In a side-by-side comparison, I don’t believe the average consumer can tell the difference between 720p and 1080p," said Barry Sandrew, PhD, a digital imaging expert who is the founder, chief technology officer and chief creative officer of Legend3D, a well-known stereoscopic 3D and visual effects firm. "If you get into the higher resolutions like 4K or 8K, there is an obvious difference, but the difference is best appreciated when sitting close to the screen and the TV is larger than 55 inches."

What does the future hold?

According to Stratton and Thoman, developers are able to wring more and more performance out of a particular hardware platform as they become increasingly familiar with it. In fact, Stratton explained, this is what keeps consoles relevant long after they've been outclassed by newer, more powerful computers.

"Uncharted: Drake's Fortune and The Last of Us were made by many of the same people, and run on the exact same hardware, yet [The Last of Us] looks significantly better than [Uncharted: Drake's Fortune]," said Stratton. "It's worth pointing out that the PlayStation 3's hardware is nearly 10 years old. Go look at PC games from 10 years ago and compare them to the PS3 games being released today. That's what years of experience with a fixed hardware platform gets you."

Stratton also noted that he didn't fully believe this was true before he began working with console hardware for a living. As a student, he figured that developers under-utilized a console's power early on in its life cycle in order to "give themselves some headroom to improve in later titles." But an experienced developer who visited his school disabused Stratton of that notion.

The individual explained that every game uses 100 percent of the system resources that the developers can access. "We can't add anything to the sequel without first improving our existing tech to make room for it," Stratton recalled the developer saying.

"rest assured that we're nowhere near hitting the full potential of next-gen consoles yet"

"Game developers are constantly learning, constantly optimizing, constantly swapping tricks with each other to make better use of each generation of hardware," Stratton continued. "The same is true at the platform level; first-party core tech teams like mine are tirelessly improving the performance of the OS and core rendering libraries, and passing the improvements on to developers."

Six months into the life of the PS4 and Xbox One, there aren't many games on either console that reach the holy grail of 1080p60. But it's possible, perhaps even likely, that developers will get there over time.

This also applies to any differences in power that may exist between the consoles. Compare the current situation to the previous console generation. Because the Xbox 360 was easier for developers to work with, games on that system often ran better than their PS3 counterparts during the early years of the generation. But game makers eventually figured out the idiosyncrasies of the PS3 hardware, and were able to eliminate significant disparities in performance.

In mid-November, just before the launch of the new consoles, we spoke with a developer of one major multiplatform title who told us that although the PS4 may be ahead of the Xbox One at this point, the systems are similar enough that — given enough time with both — the performance gap is "completely temporary."

"While I can't go into specifics," said Stratton, whose work includes writing the official rendering API for the PS4, "please rest assured that we're nowhere near hitting the full potential of next-gen consoles yet."
