Sony’s brand-new artificial intelligence laboratory revealed its first breakthrough technology on Wednesday — GT Sophy, an AI that combines superhuman reaction times with airtight racing strategy to beat the most skilled human drivers. It’s the cover star of this week’s edition of the science journal Nature, but don’t go looking for it at launch in Gran Turismo 7 next month.
GT Sophy, Sony says, is an achievement not so much because the AI can evaluate and execute complicated decisions with lightning speed, or because it can master the racing lines of three Gran Turismo Sport tracks down to the millimeter. Motorsport, noted Michael Spranger, Sony AI’s chief operating officer, also relies on etiquette — hard, aggressive driving that still plays fair and observes the spirit of the rules as much as, if not more than, the letter of what is legally allowed.
Coding an AI in a conventional racing video game to drive hard but fair is tremendously difficult because of the vagaries of racing etiquette. Kazunori Yamauchi, the creator of Gran Turismo and chief executive of its studio, Polyphony Digital, said that GT Sophy was developed not only to show respect to its competitors, but to race in a way that earns human drivers’ respect for its performance.
“The agent should be a friend, a comrade, a buddy to human beings, an agent that people can feel sympathy with,” Yamauchi said, through a translator. “Also, the agent can stimulate the emotion of people, so that the agent and human beings can mutually respect each other.”
In a racing demonstration following a half-hour presentation, four GT Sophy bots raced against four Gran Turismo esports competitors — Tomoaki Yamanaka (2021 TGR GT Cup champion), Takuma Miyazono (2020 Nations Cup world champion and 2021 runner-up), Ryota Kokubun (2018 Nations Cup Asia/Oceania champion), and Shotaro Ryu (runner-up, 2019 Japan National Esports champion, youth division). GT Sophy had cars in the 1st, 3rd, 5th, and 7th positions on the grid, followed by Yamanaka, Miyazono, Kokubun, and Ryu. The eight raced at Autodrome Lago Maggiore, a fictional course in Gran Turismo Sport, in the Porsche 911 RSR Type 991.
GT Sophy Rouge, the pole-sitter, led wire-to-wire and won by 5.8 seconds over Yamanaka; Rouge’s fastest lap was 1:54.373, more than two seconds faster than Yamanaka’s best at 1:56.422. For the uninitiated, both are absolutely dominating margins, especially given the level of competition. Yamanaka drew within a second of Rouge partway through the second lap (of three); but the nonstop pressure from GT Sophy Lavande, starting and finishing third, required too much defense for Yamanaka to mount any serious overtaking attempt.
Rouge ran wide on several apexes, to the point that commentators guessed human racing stewards would flag the AI for violating track limits. Evidently Rouge took every last allowable millimeter of kerb, but no more, as no penalties were assessed. Lavande almost overtook Yamanaka in turn 7 of the third lap, but he slammed the door shut and concentrated on defending, conceding the race to Rouge. The first four on the grid all finished in order, with Kokubun and Ryu taking fifth and sixth after GT Sophy Emeraude smacked the wall on lap 2 — a racing risk that seemed to affirm GT Sophy’s sophistication more than undermine it.
Yamauchi, in a Q&A following the demonstration, acknowledged that developing an unbeatable AI might be a technical achievement, but not much fun for everyday players. “In any sense, GT Sophy will always understand the surrounding environment and the conditions, and that includes the level of the players, too,” Yamauchi said. “So I’m sure that, ultimately, we will be able to provide joy and fun as GT Sophy races with people.”
GT Sophy will join Gran Turismo 7, which launches on PlayStation 4 and PlayStation 5 on March 4, but the AI will be added later through an update. Yamauchi and Sony didn’t give a window for when that update would come.
But to give an example of the kind of heat Sophy can bring when necessary, Yamauchi explained that the AI has mastered a type of cornering that most human drivers are never taught and wouldn’t conventionally attempt.
Typically, racing drivers are taught to brake in a straight line under a “slow in [to the curve] fast out [exit]” philosophy. “Gran Turismo Sophy doesn’t do that, necessarily,” Yamauchi said. “When Gran Turismo Sophy goes into a curve, it actually brakes as it turns into the curve. Usually when you go into a curve, the load is only on the two front tires; but Gran Turismo Sophy’s case is that you have the load on three tires, two in the front and one in the rear as well. It allows the car to brake as it is turning.
“We notice that, actually, top drivers such as [seven-time Formula One world drivers’ champion] Lewis Hamilton or [2021 world champion] Max Verstappen actually are doing that, using three tires, going fast in and fast out, all these things that we thought were unique to GT Sophy,” Yamauchi said.
It points both to how much GT Sophy has learned — more than 45,000 hours of machine learning, in fact — and to how much more it can discover.