One of the most intense episodes of Game of Thrones’ eighth and final season, “The Long Night,” was dark and full of terrors. In fact, it was so dark that viewers had a tough time making out the terrors in question. Based on behind-the-scenes footage, that was intentional on the part of the filmmakers.
It’s unclear where to lay the blame. Was it a creative decision to prioritize low-light photography? Are modern broadcast television and streaming video technologies poorly equipped to handle dark, fast-moving, action-heavy footage like the scenes from the Battle of Winterfell? Do viewers just need to calibrate their TVs properly?
After extensive research, the answer I’ve come to is: all of the above, and because of that, most people may never see Game of Thrones the way it was meant to be seen.
[Ed. note: Any Game of Thrones screencaps in this article — the images with captions discussing video quality — are 1080p crops of unretouched PNG files captured via Windows’ native screenshot tool.]
The intended reason Game of Thrones looks dark
Longtime Game of Thrones watchers may have noticed that the show darkened over the years — and I’m not just talking about the subject matter. In a 2017 interview with Insider, cinematographer Robert McLachlan noted that he and the show’s other directors of photography had changed their lighting philosophy following the first season, which he said had “a lot of unmotivated backlight.” (The phrase refers to artificial illumination, lighting that doesn’t have an actual in-world source in the scene.)
“We’re all very much on the same page where we’re trying to be as naturalistic as possible,” McLachlan explained. “[We want] to make these sets and locations feel as if they’re absolutely not lit by us, but only by mother nature or some candles or what have you, so that it feels more naturalistic albeit enhanced in some cases.”
The interview was published during Game of Thrones’ seventh season, by which point winter had come to the world of the show. With the arrival of colder weather, the people of Westeros kept their shutters closed to battle the chill, even during the daytime — which made it much more challenging for the show’s cinematographers to illuminate interior scenes, according to McLachlan.
That was even more true for “The Long Night.” Most of the previous fight sequences on the show — Hardhome, the Battle of the Bastards, the “loot train attack” — took place during the day. But the Night King stayed true to his name this time, laying siege to Winterfell under cover of darkness (and enveloping the castle in a chilling fog that made it difficult for Daenerys’ two dragons to navigate the skies). Anyone unsuited for combat hid underground in the castle’s crypts ... which were lit only by candles, lanterns, and torches.
“Everything we wanted people to see is there,” said Fabian Wagner, the episode’s director of photography, in a Wired U.K. interview. He added that Game of Thrones’ showrunners “decided that this had to be a dark episode” because they wanted to “find a unique way of portraying the story” compared to the previous daytime battles. And as Joanna Robinson noted at Vanity Fair, the filmmakers made clever use of light — particularly in the form of fire — to symbolize hope for humanity, whether it was evaporating at the start of the battle in the failed Dothraki charge or (briefly) returning with Melisandre lighting the trench.
For the most part, director Miguel Sapochnik — who also directed the Game of Thrones episodes featuring the Battle of the Bastards and the massacre at Hardhome — was able to convey the story beats and character moments clearly enough, using brief shots of people like Jaime, Brienne, Sam, and Tormund to let us know they were still alive amid the carnage. The quick-cut editing of the battle sequences occasionally produced an undesirable level of confusion, though that may have been the point.
Wagner told Wired U.K. that he doesn’t believe it’s necessary to be able to see everything, because “it’s more about the emotional impact.” Instead, he chalked up the complaints about the episode’s darkness to the distribution methods and viewing situations out there. Wagner said that the video compression used by broadcasters can reduce image quality, especially for dark scenes, and also pointed the finger at viewers, saying that many people “don’t know how to tune their TVs properly” and don’t watch the show in an appropriately darkened room.
How TV technology can make Game of Thrones look worse
In addition to being broadcast as a linear television channel by various cable and satellite providers, HBO is available via two streaming services (HBO Go and HBO Now) on a wide variety of devices, including smartphones, tablets, computers, and video game consoles. Streaming and broadcast television rely on different technologies when it comes to video compression, delivery, and display, but any content provider’s goal is to provide a viewing experience that’s generally comparable across all platforms (at least, under ideal network conditions).
Providers like HBO take the raw video file of an episode — which would likely be tens if not hundreds of gigabytes in size — and compress it for transmission to broadcasters (which may compress it further) or for streaming to consumers. This is necessary because bandwidth isn’t unlimited or cheap. There’s a balancing act involved: The goal is to reduce file size as much as possible, without going so far that the compressed version’s video/audio quality fails to maintain the essence of the original product. Compression is used even for transfers to physical media, because storage space is limited.
Video makes up the vast majority of the data. Game of Thrones viewed on Blu-ray might see a video bit rate ranging from 15-35 megabits per second, while 7.1-channel Dolby TrueHD surround sound — an audio format that uses lossless compression — might play at 3-7 Mbps, one-fifth as much. HBO Go/Now support the lossy Dolby Digital Plus audio format, which Netflix also now offers in a range from 192-640 kilobits per second, depending on the available bandwidth. (According to Netflix’s engineers, Dolby Digital Plus at 640 kbps is a “perceptually transparent” level of compression — it’s indistinguishable from the uncompressed studio master — so it doesn’t make sense to waste additional bandwidth on higher-bit-rate audio.)
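To put those numbers in context, here’s a back-of-the-envelope sketch in Python that converts an average bit rate and a runtime into a file size. The roughly 80-minute runtime and the use of decimal gigabytes are assumptions made for illustration; none of this reflects HBO’s actual encoding settings.

```python
# Back-of-the-envelope math: how big is one episode at a given average bit rate?
# The runtime is approximate and the bit rates are the ranges cited above; none
# of this reflects HBO's actual encoding settings.

def stream_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Convert an average bit rate (megabits per second) and a runtime into gigabytes."""
    megabits = bitrate_mbps * minutes * 60
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes (decimal)

runtime_minutes = 80  # rough runtime of "The Long Night" (an assumption for illustration)

for label, mbps in [
    ("Blu-ray video, low end (15 Mbps)", 15),
    ("Blu-ray video, high end (35 Mbps)", 35),
    ("Dolby Digital Plus audio at 640 kbps", 0.64),
]:
    print(f"{label}: ~{stream_size_gb(mbps, runtime_minutes):.1f} GB")
```

Even the low end of the Blu-ray range works out to several gigabytes of video per episode, which is precisely why streamed versions get compressed so much more aggressively.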
I should note that bit rate isn’t everything, and a given number means little without knowing which format it’s attached to. Compression technology has gotten better over time, so newer standards are significantly more efficient than previous ones. One of the most widely used video formats today, H.264/MPEG-4 AVC, can deliver the same quality as the old MPEG-2 standard at half the bit rate or less. Dolby Digital Plus will provide higher-quality audio than basic Dolby Digital at the same bit rate.
Comparing bit rates across streaming video and physical media isn’t an apples-to-apples exercise, either. Local playback of a high-definition physical format is one thing, and the use cases are limited: a Blu-ray player or game console that’s hooked up to a TV, or an optical drive in a computer that’s connected to a monitor. But if a video is going to be streamed over the internet, the file is encoded in a way that is designed specifically for that purpose. And the format often varies by the recipient device: The video feed that a streaming service would send to an Xbox One for playback on a 1080p TV is unlikely to be the same one that it would beam to a smartphone with a 1080p screen.
[Image: Macroblocking artifacts in a firelit shot from “The Long Night”]
There are other reasons why bit rate isn’t the be-all and end-all of video quality. Despite the strong performance of today’s video codecs — the software that converts data from one format into another for transmission or storage, with an “encoder” on one end and a “decoder” on the other — they do struggle with certain kinds of material, with an episode like “The Long Night” primed to expose those weaknesses.
Modern video codecs rely on a technique known as motion compensation. The encoding software uses an algorithm to predict a frame in a video based on how much the pixels in that frame (which is split into groups of pixels called macroblocks) move compared to previous and/or future frames. This accounts for the movement of the camera and of anything in the scene, and it greatly reduces file size because it discards data for the parts of the image that don’t change between reference frames.
However, motion compensation can introduce a video compression artifact known as “macroblocking,” which manifests on the screen as large, jagged blocks of color where there should be smooth edges. Macroblocking is exacerbated by fast motion — the more movement there is, the harder it becomes for the algorithm to predict frames — and there was plenty of that in the Battle of Winterfell. (It can be caused by hitches in the network connection, too.)
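To make the idea concrete, here’s a minimal, illustrative Python sketch of block matching, the search at the heart of motion compensation. The block size, search window, and toy frames are arbitrary choices for the example, not the parameters of H.264 or any real encoder.

```python
import numpy as np

def best_match(prev_frame, block, top, left, search=8):
    """Search a window of the previous frame for the offset that best matches
    `block` (a macroblock from the current frame), by minimizing the sum of
    absolute differences (SAD)."""
    h, w = block.shape
    best_sad, best_vec = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > prev_frame.shape[0] or x + w > prev_frame.shape[1]:
                continue  # skip offsets that fall outside the frame
            candidate = prev_frame[y:y + h, x:x + w].astype(int)
            sad = np.abs(candidate - block.astype(int)).sum()
            if sad < best_sad:
                best_sad, best_vec = sad, (dy, dx)
    return best_vec, best_sad

# Toy grayscale frames: a bright square slides 3 pixels to the right.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = np.zeros((64, 64), dtype=np.uint8)
prev[16:32, 16:32] = 200
curr[16:32, 19:35] = 200

# Encode one 16x16 macroblock of the current frame as "a piece of the previous
# frame, shifted by a motion vector" instead of storing all 256 pixels again.
vector, difference = best_match(prev, curr[16:32, 16:32], top=16, left=16)
print(vector, difference)  # (0, -3), 0: a perfect match, so only the vector needs to be stored
```

When the motion is chaotic (thousands of wights, a whirling camera, falling snow), good matches are hard to find, the leftover differences balloon, and an encoder working within a fixed bit rate budget has to approximate those differences more coarsely. That coarseness is what shows up on screen as macroblocking.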
[Image: Color banding around a dragon in “The Long Night”]
Because of the way they’re designed, video codecs also have a tough time reproducing dimly lit scenes. The human eye isn’t very good at discerning detail in darkness, and video compression takes advantage of that fact by using less data to encode darker footage. (Turn on the bit rate display the next time you’re watching something — you’ll see that the number usually drops during dark scenes.)
This issue dovetails with broadcast television’s limitations in reproducing colors. The current standard for live TV — whether you’re getting it over the air, from a cable/satellite provider, or from a streaming service like PlayStation Vue — is 8-bit color depth. Long story short, that translates to a palette of nearly 16.8 million possible colors, which sounds like a lot. But 8-bit color can’t accurately represent subtle gradations; it simply doesn’t have enough intermediate values to render a smooth transition.
When the footage in question features a hazy mess of dragons flying through thick clouds of ice at night, the shades of black, very dark gray, and slightly lighter gray are likely to show up in the picture as distinct bands of color — instead of a gradual shift from darkness to lighter tones. The phenomenon is known, appropriately enough, as “banding.”
This problem would be greatly mitigated by the introduction of HDR video in broadcast television. Multiple HDR formats exist today; by far the most common ones are HDR10 and Dolby Vision, which are suited only for recorded content like 4K Blu-rays and streaming video. But there’s one that has been designed explicitly for live broadcast TV: hybrid log-gamma (HLG). Like HDR10, HLG uses 10-bit color, which is capable of reproducing 1.07 billion different colors, 64 times as many as 8-bit depth supports. The more colors you can represent, the less you’ll run into banding. (HDR also offers a wider range of brightness levels, so the picture can deliver inky blacks and blinding whites.)
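The arithmetic behind those numbers, and the banding effect itself, is easy to reproduce. The sketch below is a simplified illustration (it ignores gamma encoding and broadcast video’s limited-range code values) that quantizes the same dim gradient at 8-bit and 10-bit precision; the specific brightness range is an arbitrary stand-in for a foggy night sky.

```python
import numpy as np

# Shades per channel at each bit depth, and total representable colors.
levels_8 = 2 ** 8    # 256
levels_10 = 2 ** 10  # 1,024
print(levels_8 ** 3)                    # 16,777,216 (~16.8 million colors)
print(levels_10 ** 3)                   # 1,073,741,824 (~1.07 billion colors)
print(levels_10 ** 3 // levels_8 ** 3)  # 64: the "64 times as many" figure

# A smooth, very dark gradient across a 1,920-pixel-wide frame, expressed as a
# fraction of full white. The exact range is an arbitrary stand-in for a night sky.
ramp = np.linspace(0.02, 0.06, 1920)

eight_bit = np.round(ramp * (levels_8 - 1))  # quantize to 8-bit code values
ten_bit = np.round(ramp * (levels_10 - 1))   # quantize to 10-bit code values

print(len(np.unique(eight_bit)))  # ~11 distinct steps -> visible bands
print(len(np.unique(ten_bit)))    # ~42 distinct steps -> a far smoother transition
```

More gradations between “black” and “very dark gray” means the eye sees a ramp instead of stripes.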
HLG and 4K resolution are among the cutting-edge features built into the next generation of broadcast television standards, which are collectively known as ATSC 3.0. But that applies only to over-the-air transmissions, not subscription TV from a cable or satellite provider. The rollout of ATSC 3.0 in the U.S. likely won’t begin until 2020 at the earliest.
Now, HBO could start mastering its content in 4K HDR, like Netflix does with its live-action originals. That would allow the company to broadcast the existing 1080i standard-dynamic-range version of Game of Thrones via the linear HBO television channel, and deliver a 4K HDR feed to streaming viewers via HBO Go/Now. (Granted, such a move might risk pissing off cable/satellite subscribers and incentivize them to cut the cord, and HBO may still be wary of jeopardizing its relationships with companies like Comcast — a problem that Netflix doesn’t have.) As far as 4K and HDR, HBO currently says on the HBO Now support site, “We’re exploring what it will take to support these formats in the future.”
The consumers and the screens
As much as I might personally be horrified by people who watch movies and TV with motion smoothing enabled, it’s not reasonable to expect most folks to be tech-savvy enough to know what that is, let alone why or how to turn it off. Filmmakers like Fabian Wagner, the Game of Thrones cinematographer, may hope that people decide to watch the show on a big-screen TV with the lights turned down. But the fact of the matter is that as an artist, once you make something and release it into the world, you essentially relinquish any control over how people experience that art.
Videophiles may watch Game of Thrones live via HBO Now on a home theater setup, with thousands of dollars’ worth of finely tuned audio and video equipment to make the most of the 1080p/Dolby Digital Plus 5.1 stream. Others might watch the show on a small LCD TV with a decade-old cable box hooked up via composite video cables, not realizing that they’re getting a standard-definition experience. Maybe people will stream an episode over 4G on their smartphone’s 5-inch screen while riding the bus home from work. Or perhaps they’ll wait until the entire run of the series is available on 4K Blu-ray, because they want the highest-quality viewing and listening experience possible.
Nobody likes to be told that they’re doing it wrong, and aside from the person with the obsolete cables, none of the people above are. The average TV viewer won’t and shouldn’t be expected to spend time calibrating the color temperature and gamma on their display. (In a situation like “The Long Night,” they might simply try bumping up the brightness, which would just end up washing out the picture.)
Aside from the physical media purists, everyone interested in a TV series like Game of Thrones is watching week to week. That means they have a limited number of options for accessing the show: either broadcast cable/satellite TV, or a streaming service. And as I’ve explained, those viewing methods all come with their own limitations in terms of image quality and faithfulness to the filmmakers’ original vision.
Many media companies — and considering its famous “it’s not TV” slogan, HBO more so than most — would say they endeavor to support an “uncompromising” vision from their creative partners. But for most of the 20th century, television producers were forced to compromise on lighting. For a number of technical reasons dating back to the days before color TV, as explained in this terrific Slate article, everything was lit for a low-contrast look and minimal shadows — which is why nothing on TV looked remotely like the movies.
At the time, television makers couldn’t count on parity between their finished product and the actual broadcast experience. “When we saw it on a standard definition TV, it was just horrible, and you wondered why you bothered,” cinematographer Arthur Albert, a veteran of The Wonder Years and Better Call Saul, told Slate. “The only place it looked good would be in the color correction suite, and then after that you didn’t know what you were going to get.”
[Image: The Arnold family in The Wonder Years]
In the 21st century, a number of technological advancements — the widespread availability and adoption of digital cameras, digital color correction, and digital post-production — have facilitated the proliferation of low-light cinematography in TV shows. When Game of Thrones’ Wagner told TMZ, “I know [the episode] wasn’t too dark because I shot it,” he wasn’t being arrogant; he and his team of professionals monitored luminance levels as they filmed the shadowy Battle of Winterfell.
But paradoxically, new technologies may have brought us back to a visual standard akin to the pre-cinematic era of television. At present, the delivery mechanisms and viewing methods for TV are not up to the task of fully realizing the creative vision behind the shows — and therefore, there’s less of a guarantee that what the filmmakers see is what the viewers will get.
I’m sure that the raw video file of “The Long Night,” the final cut as assembled by the filmmakers in the editing room, looks immaculate on a professional-grade studio monitor. The closest thing to it that consumers will get to experience is the 4K Blu-ray release of Game of Thrones’ final season in HDR.
It’s easy to put the blame on the content provider, but the calculations for HBO are different from those of streaming-only companies like Amazon, Netflix, and Hulu. The minimum recommended bandwidth for HBO Go/Now is just 5 Mbps; let’s double that to 10 Mbps to guarantee a consistent high-quality streaming experience. It would take a good deal more speed to reliably stream 4K — Netflix suggests at least 25 Mbps — and HDR would require a bit of extra bandwidth.
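For a sense of what those per-stream numbers imply at scale, here’s a rough, purely hypothetical calculation in Python. The concurrent-viewer figure is an arbitrary round number chosen for illustration, not an HBO statistic.

```python
# Aggregate delivery bandwidth for a hypothetical number of simultaneous streams.
# The viewer count is an arbitrary illustration, not an HBO figure.

def aggregate_gbps(per_stream_mbps: float, concurrent_streams: int) -> float:
    """Total outbound bandwidth in gigabits per second."""
    return per_stream_mbps * concurrent_streams / 1000

streams = 5_000_000  # hypothetical simultaneous viewers
for label, mbps in [
    ("1080p at 5 Mbps", 5),
    ("1080p at 10 Mbps", 10),
    ("4K HDR at 25 Mbps", 25),
]:
    print(f"{label}: ~{aggregate_gbps(mbps, streams):,.0f} Gbps of total delivery")
```

Whatever the real figures are, each step up in per-stream quality multiplies that delivery load.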
[Image: Eleven in a dark scene from Stranger Things season 2]
Considering how popular Game of Thrones is, it would likely cost HBO a lot of money to deliver a higher-quality (i.e., larger) video file to broadcasters and customers. And it might cost even more for the company to rework its entire production pipeline to generate 4K HDR video at the start. HBO has to weigh those costs against the benefits. Would it produce an appreciable jump in subscribers if HBO Go/Now began streaming in 4K and HDR? It’s hard to believe that that’s a sticking point for anyone who’s considering signing up to watch Game of Thrones. Right now, 1080p is good enough for most people.
More than 4K or anything else, HDR has the potential to be the most impactful advancement in TV technology since the introduction of HD resolution and digital television in the previous decade. But it hasn’t yet reached widespread adoption on the content side or the consumer side. While 4K TV sets have been on the market for years, we’ve only just gotten to the point where HDR is a standard feature on the majority of new TV models. And the landscape is confusing not only because of the multiple HDR formats vying for supremacy, but because the quality and availability of the experience can vary wildly. (On many TVs, you won’t get HDR if your source is plugged into the wrong HDMI port.)
Game of Thrones ended at a moment of technological flux. No one and everyone is to blame for “darkness” or any other visual degradation. There is promise for the streaming era; the debut of forward-looking TV standards may lead the industry to rethink its production pipelines, and push TV manufacturers to focus on HDR and other foundational display technologies rather than gimmicks like motion smoothing. Major improvements in broadband speeds and accessibility — tangible ones, not marketing hype like 5G — should make it easier for content providers to deliver higher-fidelity video and audio to customers everywhere, and make it more feasible for people to receive and experience that content.
If this sounds like the future is still too far off, don’t despair: A lot can change in a relatively short period. A decade ago, you couldn’t stream HBO programming on demand in most of the U.S. — HBO Go didn’t launch nationwide until 2010. Five years ago, you couldn’t get HBO without a cable/satellite TV subscription — HBO Now didn’t exist until Game of Thrones’ fifth season in April 2015. The long night may be coming to an end.