Bridging the generation gap: Porting games to new platforms

With remasters in vogue and ports in the spotlight, we look at the art of transferring games to new hardware.

Naughty Dog built its post-apocalyptic adventure The Last of Us for one of the most idiosyncratic pieces of gaming hardware ever designed. Released towards the end of the PlayStation 3’s life cycle, the game ran on an engine written to extract every ounce of performance from the platform’s unique — and aging — Cell architecture. So when a team at the studio set out to port the title to the PC-like hardware of Sony’s PlayStation 4, nobody expected the process to be easy.

"I wish we’d had a button [labeled] ‘Turn On PS4 Mode’," said Neil Druckmann, creative director at Naughty Dog, to Edge Magazine just prior to the game’s launch. "But no ... we expected it to be Hell, and it was Hell."

The process of reworking a game for a new platform — one which may not have existed during development of the original title — can be a colossal undertaking. Success calls for a phenomenal degree of engineering expertise, but also careful consideration of logistical and artistic challenges. Developers must contend with misplaced assets, unfamiliar pipelines and hundreds of thousands of lines of code that, in most cases, they did not write themselves.

HD remasters have enjoyed a renaissance in recent years, with over 30 titles already arriving on the latest generation of consoles. The challenges faced by video game archivists are starting to gain legal recognition, while troubled ports continue to expose the difficulties inherent to the reproduction of interactive experiences.

Polygon recently spoke to staff at Naughty Dog, Bluepoint Games, Nixxes Software, Aspyr Media and Digital Eclipse to see what goes into today’s ported and remastered games.

Ico on PlayStation 3 (Image: Team Ico/Sony Interactive Entertainment)

"Just don’t screw it up ..."

From a commercial standpoint, remastering games can be a lucrative strategy for publishers. It offers a way to capitalize on a back catalog for less than the cost of developing a new game, while also representing an opportunity to bring an existing title to a new audience. A well-timed release can rekindle interest in a franchise before a full sequel or reboot, or help recoup the costs of previous investments.

Gamers often benefit as well. The multi-generational maturity of major franchises, combined with the patchy backwards compatibility offered by the last two generations of consoles, has made remastered ports a hugely popular way to enjoy the classics.

As a result, dozens of studios have sprung up over the years specializing in remastered games. For Bluepoint Games — the team behind acclaimed titles including the Metal Gear Solid HD Collection and Uncharted: The Nathan Drake Collection — securing its first commissioned remaster was simply a case of being in the right place at the right time.

"[Sony was] looking for a studio ... We were standing right in front of them."

"In 2009, we had been in operation for just a couple of years," says Marco Thrush, Bluepoint’s chief technical officer and co-founder. The fledgling studio — formed by key members of the Metroid Prime team — had made a name for itself by developing the shooter Blast Factor, one of two games that were ready in time for the launch of the PlayStation Network, on an engine built in-house. "[Sony was] looking for a studio to bring God of War and God of War 2 to the PS3 in advance of the next release in the franchise," says Thrush. "We were standing right in front of them."

Bluepoint’s mantra with remasters is simple: "just don’t screw it up." The end result must be the same game people remember playing, says Thrush, and all enhancements must stay true to the original experience. Subtle as these changes may appear, though, he says the remastering process involves considerably more than routine coding and spruced-up art assets.

Teams tasked with the restoration of older titles will often use emulators: ‘wrappers’ of code that mimic the software’s original environment. For recent titles, though, the overhead of emulation can exceed even the computational power of modern consoles, so developers must instead re-engineer these games to run natively on new hardware. For Bluepoint and certain other developers, this process begins not with source code, but with a retail disc.

"We always try to get the data from the retail disc [and] reverse engineer the formats," says Thrush. "That way, we have a 100% known data set that we can use to replicate the original game." Source code archives, he explains, often don’t match the final shipped assets. One need only look at the problems encountered by Konami during production of the Silent Hill HD Collection for a demonstration of these risks; the archived code with which the team worked came from an incomplete, buggy beta.

"More work needs to be done to do justice to a title on a later-gen platform."

Last-minute changes made on local drives or build machines often escape the backup process, making the retail data an attractive starting point. If the original studio can supply higher-resolution assets, the remastering team can match them against the disc data and swap them in later.

From there, the process becomes impossible to predict. "Every project has its own unique challenges," says Thrush. "You can’t anticipate what some of those problems will be until you get your hands dirty." In the case of the Ico & Shadow of the Colossus Collection, this meant firing up a hard drive to discover that every file on it could only be opened under a Japanese build of Linux. This, combined with an unfamiliar build pipeline, meant huge amounts of work for the team simply to learn its way around the dataset.

To make matters even more complicated, gamers’ expectations are getting higher. The last generation of remastered titles — games like Metal Gear Solid 2 & 3 — benefited from the shift to HD. Freed from the confines of standard definition, titles often looked stunning simply as a result of being rendered at the higher resolutions available on the PlayStation 3 and Xbox 360.

Since the launch of the PlayStation 4 and Xbox One, however, standards have changed. "More work needs to be done to do justice to a title on a later-gen platform," says Thrush. "[There are] increased expectations, from players as well as ourselves." With 1080p and 60 frames per second becoming the common baseline standard for modern remasters, it takes significantly more work to raise the bar on a re-release.

The Last of Us

Endure and survive

Although Naughty Dog shipped The Last of Us (TLOU) less than six months prior to the debut of the PlayStation 4, the studio did not initially develop the game with a port in mind.

"[We were] always building on ‘what do we need to do to get an incredible game on the PS3’", says Ricky Cambier, lead designer at Naughty Dog. "No choices were made within that regard for ‘well, in future we might wanna do this.’" However, the game’s production pipeline afforded the team luxuries often outside the reach of external porting specialists such as Bluepoint: an intimate understanding of the game’s code, and well-archived high-resolution source assets.

"Uncharted 4 gets to sit on top of all that hard work that went into [TLOU] Remastered."

Development of The Last of Us Remastered for PS4 began with a small group of programmers following the release of the original game in June 2013. As the project progressed, the team expanded. "It was predominantly programmers, with that initial bar of ‘just get it up, get it working,’" says Cambier. "Then once we hit a certain level [of performance], we’d get a couple of artists helping with the textures and the models, and making sure that pipeline worked."

Porting the game was a test of the studio’s tools. Naughty Dog's previous migration from the PS2 to the PS3 involved scrapping most of the engine which supported the Jak & Daxter titles, and as a result, the team designed the engine powering the first Uncharted on the PS3 for long-term viability. Migrating The Last of Us — a ready-made game experience — served as a way for the studio to test its technology on Sony’s new console.

"This was an evolutionary process," says Cambier. "We were able to take all the hard work on the PS3, move it over, and then start to build on top of that." The decision to leave the game’s content untouched gave the team what Cambier refers to as a "controlled environment" for technical exploration. The studio could then take this new expertise forward to a new title; as Cambier puts it, "Uncharted 4 gets to sit on top of all that hard work that went into [TLOU] Remastered ... and now, we can push even further."

The PS3 has one CPU and six satellite processors, whereas the PS4 features eight CPU cores. During a presentation at the SINFO conference, Naughty Dog lead programmer Jason Gregory described these cores as "higher quality more powerful processors" than those in the PS3, making the PS4 "a highly parallel machine." Reworking TLOU’s engine to take advantage of this design meant reorganizing all the tasks the console must execute for the game to run.

"There were a lot of people that were scared at this point."

At the 2015 Game Developers Conference, lead programmer Christian Gyrling recalled that the initial port of the game — prior to any significant engine optimization — ran at less than 10 frames per second. Even after the rendering code had been reworked, TLOU Remastered was still not hitting 30 frames per second, let alone the team’s goal of 60.

By this stage, the team was due to ship the game in under three months. "There were a lot of people that were scared at this point," says Gyrling.

Naughty Dog’s solution was to split the CPU’s workload in two. Rather than processing one frame at a time, first computing gameplay and then rendering it, the CPU could work on gameplay and rendering simultaneously, but for separate frames: while preparing to render one frame, it could compute gameplay for the frame after that. This allowed the team to hit its target of 60 frames per second at 1080p resolution.
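
As a rough illustration of that idea, the C++ sketch below overlaps the simulation of frame N+1 with the rendering of frame N by handing the rendering work to a second thread. It is a toy version of the concept, not Naughty Dog’s engine code, which spreads far finer-grained jobs across the PS4’s cores.

```cpp
// Toy sketch of frame pipelining: simulate frame N+1 while frame N renders.
#include <cstdio>
#include <thread>

struct FrameData { int frameNumber; /* camera, draw lists, etc. */ };

FrameData simulate(int n) { /* AI, physics, animation ... */ return {n}; }
void render(const FrameData& fd) { std::printf("rendered frame %d\n", fd.frameNumber); }

int main() {
    FrameData current = simulate(0);
    for (int n = 1; n <= 5; ++n) {
        // Kick off rendering of the frame we just simulated ...
        std::thread renderThread(render, current);
        // ... and simulate the next frame on this thread in parallel.
        FrameData next = simulate(n);
        renderThread.join();  // both stages must finish before advancing
        current = next;
    }
    render(current);  // render the final simulated frame
    return 0;
}
```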

The higher frame rate posed some surprising challenges of its own. "We had to resample a lot of our animations," says Cambier, referring to the process of generating the extra frames from the raw animation data. "[Animations] were sampled at a certain rate for memory, so now we had to go back and increase all of these ... because we’re now refreshing so much more." The team placed its emphasis on increasing the fidelity of the game’s original design while leaving the core experience untouched.
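
A hedged sketch of what such resampling can involve: keys stored at a low rate to save memory are re-evaluated at the display rate, so the engine isn’t visibly interpolating across large gaps at 60 frames per second. Plain linear interpolation stands in here for whatever curve representation a production pipeline actually uses, and a real pipeline would resample from the original source animation rather than from already-decimated keys.

```cpp
// Illustrative resampling of a single animation channel from 15Hz to 60Hz.
#include <cstdio>
#include <vector>

std::vector<float> resample(const std::vector<float>& keys,
                            float srcRate, float dstRate) {
    const float duration = (keys.size() - 1) / srcRate;  // track length, seconds
    const size_t dstCount = static_cast<size_t>(duration * dstRate) + 1;
    std::vector<float> out(dstCount);
    for (size_t i = 0; i < dstCount; ++i) {
        const float t = (i / dstRate) * srcRate;  // position in source keys
        const size_t k = static_cast<size_t>(t);
        if (k + 1 >= keys.size()) { out[i] = keys.back(); continue; }
        const float frac = t - static_cast<float>(k);
        out[i] = keys[k] * (1.0f - frac) + keys[k + 1] * frac;  // lerp
    }
    return out;
}

int main() {
    const std::vector<float> src = {0.0f, 1.0f, 0.5f, 2.0f};  // 4 keys at 15Hz
    const auto dst = resample(src, 15.0f, 60.0f);
    std::printf("resampled %zu keys to %zu\n", src.size(), dst.size());
    return 0;
}
```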

"We think about what benefits could be had if the games were developed today."

"So much effort and time went into creating a certain aesthetic," says Cambier. "This isn’t just ‘OK, well this GPU is different, so whatever the lighting engine turns out, that’s fine’; … it’s still a stylized world, so going back and making sure we can match that aesthetic [is key]." The developers purposefully limited the extent of their changes. Even points of criticism — such as several difficulty spikes throughout the game — remain present.

The team did, however, add several small touches to encourage the feeling of a ‘native’ PS4 experience. These range from the very subtle — the ‘click’ of the flashlight emanating from the controller’s speaker, and the light bar indicating health — to entirely new features.

The much-lauded Photo Mode added the ability to pause the game, then pan and zoom around the scene while adjusting depth of field and vignetting. Combined with the DualShock 4’s share button, it added a social dimension to the game’s single-player campaign. "[It] cultivated even more of a community," says Cambier. "We always talked about how important exploration was in TLOU, and all these little nooks and crannies that we put so much detail into, that people were exploring again but in this new way."

Tailoring to the new platform is something that Bluepoint’s Thrush agrees is vital to a successful remaster. "We try to stay as close as possible to the original experience," he says, "but each game or franchise is in a unique position to benefit from new console capabilities." The key, he believes, is staying true to original intentions: "We think about what benefits could be had if the games were developed today."

Call of Duty: Black Ops 3

Any port in a storm

Naughty Dog’s challenges are not unique to remastered games. Look beyond remastering to other kinds of platform shift and it becomes apparent that many of the same problems are inherent in porting as a whole.

Particularly complex from a technical standpoint is the practice of downporting. As the PS4 and Xbox One consoles continue to build momentum, publishers keen to reach a broad audience sometimes hire external teams to port new games back to last-gen hardware. Call of Duty: Black Ops 3 is a recent example; while Treyarch developed the core game, Activision outsourced pared-back versions for the PS3 and Xbox 360 to Beenox and Mercenary Technology.

Production of a downport is effectively a de-mastering. The developers strip back the game’s engine, compress its assets and sometimes remove features while preserving as much of the experience as possible. The circumstances of their production, however, often demand a simultaneous release on all platforms; this eliminates the possibility of working on retail code and entails a whole new set of challenges.

"We’re pretty much on high alert all the time in a co-dev downport situation."

"Potentially the most challenging thing about a downport like Titanfall was that we were in co-development with the lead team," says Thrush of Bluepoint, developer of the Xbox 360 version of Respawn’s flagship shooter. "If we were working on final code, it would be ‘easier’ — for lack of a better word — to devise the strategies for getting as much quality and performance out of the last-gen console."

As Bluepoint worked on the game’s Source engine, compressing 5GB of assets into just 512MB of memory, the lead team at Respawn continued to make changes to the game. Bluepoint’s developers had to bring these fixes and tweaks over to their Xbox 360 branch in order to maintain parity with the lead version. Given that the frequency of changes to a game’s codebase tends to increase as the ship date draws nearer, the team faced an intense crunch towards the end of the project. "We’re pretty much on high alert all the time in a co-dev downport situation," says Thrush, "whereas we have a better idea of where we are on the roadmap in a remaster."

Remastering and downporting generally involve moving games between platforms that are generations apart from one another. What, then, can be said of ports involving systems with broadly similar architectures and capabilities? Given the relative homogeneity of this generation’s gaming hardware — AMD famously got its processors inside every new console — are ports to and from the PC any easier today than they once were?

"I feel that the importance of these common architectures is somewhat overstated," says Jurjen Katsman, founder of Nixxes Software. As the studio behind acclaimed ports including the PC version of Deus Ex: Human Revolution and that of its upcoming sequel Mankind Divided, Nixxes is intimately familiar with how the PC’s unique requirements as a gaming platform have changed compared to those of today’s consoles. "The platforms still have very different APIs with their own strengths and weaknesses," says Katsman, referring to the toolsets which define how software components can interact. "There are [also] meaningful differences in the number of cores that is available for GPU or CPU."

"Over 90% of the major content we license and develop for Mac is likely going to be developed first on the PC."

Modern consoles may bear an architectural resemblance to today’s PCs, but this does little to address the complications that have always faced PC ports. "The main challenges when going to PC are still in scalability to both lower-end and higher-end hardware, and in good control schemes on mouse and keyboard," says Katsman. Consoles provide developers with low-level access to the system’s hardware, minimal overhead from the OS and a known environment in which to test. The fragmented PC platform introduces thousands of additional variables, and affords creators no such luxury.

Porting games to the Mac entails issues similar to those encountered in PC ports, but the platform carries a raft of additional complications. Elizabeth Howard, vice president of publishing at Mac specialist Aspyr Media, says that it’s often necessary to develop two separate builds of each game: one that takes advantage of Steam and its Steamworks features, and one that meets Apple’s strict requirements for sale on the Mac App Store.

These guidelines include a ‘sandboxing’ policy which prohibits reliance on third-party software such as Steam. "Over 90% of the major content we license and develop for Mac is likely going to be developed first on the PC utilizing Steam and SteamWorks," says Howard. "Because the Mac App Store is such a significant part of the Mac market, we often cut features and create a new separate version of the game in order to accommodate Apple requirements."
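
As a loose illustration of how a single codebase can feed both builds, the sketch below compiles achievement reporting against the Steamworks API for the Steam version and swaps in a stub for a sandboxed Mac App Store version. The MAC_APP_STORE flag and the Game Center fallback are assumptions made for this example, not Aspyr’s actual setup.

```cpp
// Illustrative only: one translation unit, two targets.
#include <cstdio>

#if defined(MAC_APP_STORE)
// Sandboxed Mac App Store build: Apple's guidelines forbid relying on Steam,
// so the Steamworks call is replaced (here by a stub standing in for a
// Game Center implementation).
void unlockAchievement(const char* id) {
    std::printf("achievement %s: report via Game Center instead\n", id);
}
#else
#include "steam_api.h"  // Steamworks SDK, linked only in the Steam build

void unlockAchievement(const char* id) {
    // Assumes SteamAPI_Init() succeeded during startup.
    SteamUserStats()->SetAchievement(id);
    SteamUserStats()->StoreStats();
}
#endif

int main() {
    unlockAchievement("ENDURE_AND_SURVIVE");  // hypothetical achievement ID
    return 0;
}
```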

"When hiring, we don’t underestimate the value of someone who has contributed to open source projects."

In addition, Windows games commonly rely on Microsoft’s proprietary DirectX APIs, as opposed to the cross-platform OpenGL. "Converting from Direct3D (D3D) to OpenGL can be challenging," says Jez Sherlock, Aspyr’s director of development. "We have developed technology that now handles many of the fine details for us and allows us to very quickly get a D3D engine running in OpenGL, but it’s not always straightforward."
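
The shape of such a translation layer might resemble the sketch below, in which a Direct3D 9-style device interface (reduced here to two methods, with simplified signatures) forwards each call to its nearest OpenGL equivalent. Aspyr’s actual technology is proprietary; this only conveys the flavor of the mapping.

```cpp
// Minimal D3D-to-OpenGL shim sketch with simplified, D3D9-flavored methods.
#include <GL/gl.h>  // <OpenGL/gl.h> on the Mac

class D3DDeviceShim {
public:
    // D3D9's Clear() takes a packed ARGB D3DCOLOR; OpenGL wants normalized floats.
    void Clear(unsigned long color) {
        glClearColor(((color >> 16) & 0xff) / 255.0f,   // red
                     ((color >> 8)  & 0xff) / 255.0f,   // green
                     ( color        & 0xff) / 255.0f,   // blue
                     ((color >> 24) & 0xff) / 255.0f);  // alpha
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }

    // Mapping draw calls is the easy part; the hard work lives in state,
    // shader and coordinate-convention differences between the two APIs.
    void DrawPrimitive(GLenum mode, GLint firstVertex, GLsizei vertexCount) {
        glDrawArrays(mode, firstVertex, vertexCount);
    }
};
```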

The process, says Sherlock, is never static; each project is unique, and the challenges continue to evolve. "Our technology was for a time focused around D3D9, and then 11 came along, now there is 12. Keeping pace with Windows technologies is a challenge."

The development of outsourced ports calls for a unique set of skills. "The most important thing someone can bring to the table is the ability of not being daunted by hundreds of thousands of lines of code that they didn’t author themselves," says Howard. "When hiring, we don’t underestimate the value of someone who has contributed to open source projects."

The varied and ever-changing nature of the work involved in porting can make it a great learning experience for developers. Much as Naughty Dog’s remaster of The Last of Us provided a test of the studio’s tools and an exercise in PS4 development, the opportunity to remaster an existing game exposes creators to new technologies and workflows. "The nature of these projects is also that you relatively quickly see a lot of different engines and different systems," says Katsman of Nixxes. "It can be a great learning experience for more junior staff as well, as long as they get the right guidance."

Mega Man Legacy Collection

The definitive experience?

Video games, given their dependence on short-lived specialist hardware and the nature of their interactivity, are more troublesome to preserve authentically than film or literature. "Video games are built for specific platforms and are not easily portable to anything else," says Frank Cifaldi, head of restoration at Digital Eclipse. "With all the moving parts involved in engineering for specific platforms it’s I would say impossible to exactly replicate."

Porting a game to run on new hardware is challenging in itself. Attempts to recreate multiplayer experiences, or to provide what Cifaldi refers to as a "contextualization" of the original game, entail a raft of additional complications.

Some in the gaming community have expressed disillusionment with the large number of remastered titles seen in recent years. They are a profitable alternative to true backwards compatibility, and — in a sense — they exist to exploit players’ nostalgia by creating a supposedly improved or ‘definitive’ version of a past experience. Might this lucrative publishing strategy, and the resulting influx of remastered titles, be coming at the expense of more original games?

"Preservation isn’t just a case of retaining the original."

"I think that in general people have a hard time understanding that any big video game has a budget," says Cifaldi. "It’s easy to be like, ‘why did they just remaster this game instead of making a new one?’ It’s probably because a new game would cost about five to 10 times as much as porting an old one, and would be much, much riskier to try to sell."

Commentators have also leveled criticism at the arguably destructive nature of certain re-releases. These titles exist on a spectrum; to use film as an example, there is an obvious difference between Criterion’s restoration work and Lucasfilm’s treatment of the Star Wars films.

Cifaldi argues that true remasters — distinct from remakes or reinterpretations — respect the original artistic intent. "If we’re talking about The Last of Us Remastered, we’re talking about 3D assets," says Cifaldi. "You’re actually going to the original source elements and presenting them in an even cleaner way than before. And I would argue that that is a totally valid approach for that kind of game; it is the equivalent of putting [Star Trek:] The Next Generation on Blu-ray."

"Preservation isn’t just a case of retaining the original," says Cifaldi. "I think there’s an argument that awareness is a form of preservation. And I would say that keeping these games alive via remastering is preserving that game’s heritage, and intent, and place in this world, absolutely." Babykayak