Introducing Gran Turismo Sophy: a Champion-level Racing AI Agent Trained Through Deep Reinforcement Learning

Gran Turismo Sophy, a racing AI agent developed through a collaboration between Polyphony Digital Inc. (PDI), Sony AI and Sony Interactive Entertainment (SIE), was announced on 9 February.

Gran Turismo Sophy was created using state-of-the-art deep reinforcement learning technology developed by Sony AI together with the Gran Turismo team, as well as large-scale training via SIE’s cloud gaming infrastructure. This takes machine learning to the next level by placing an AI agent in a hyper-realistic racing simulation that demands continual real-time decisions for the duration of the race.

Michael Spranger, Sony AI COO, describes Gran Turismo Sophy as “an AI agent that learned to drive by itself at a very competitive level and is able to compete with the best drivers in the world.”

Gran Turismo Sophy began as a blank slate and evolved from an AI that could barely maintain a straight line on a track to a racer that can compete against the best Gran Turismo Sport drivers in the world.

Gran Turismo Sophy opens up new possibilities for gaming and entertainment. Below we explain how this exciting project came to life.

The First True Test

Gran Turismo Sophy’s training began in April 2020 with the formation of Sony AI. From that point onward, the Sony AI team worked closely with Polyphony Digital to develop and improve the agent’s capabilities. The first ‘Race Together’ event, held on 2 July 2021, saw Gran Turismo Sophy compete for the first time against a team of four of the best human drivers, led by Takuma Miyazono, three-time champion of the ‘2020 FIA Gran Turismo Championships.’

In solo Time Trial scenarios, Gran Turismo Sophy showed superhuman speed, recording faster lap times than the human drivers. However, racing against other humans proved to be a different challenge.

“I think we all underestimated how hard it would be to get the sportsmanship side of it right, and learn to do that without being overly aggressive or overly timid in the face of competitors,” said Peter Wurman, Sony AI Director and Project Lead.

Racing Done Right

An AI agent’s performance is bounded by the complexity of the challenges it faces, and Gran Turismo presents a formidable one: it faithfully captures the dynamics and physics of the sport, where similar games model racing physics only approximately.

“I wanted to recreate cars and the entire culture around cars in a video game,” said Kazunori Yamauchi, President of Polyphony Digital. Charles Ferreira, a PDI engineer, added: “The realism of Gran Turismo comes from the detail that we put into the game, from the engine, tyres, and suspension to the tracks and car models.” This realism is what makes Gran Turismo a unique AI challenge, and what pushed the Sony AI team and Gran Turismo Sophy to new heights.

Training with Mass-Scale Infrastructure

Using a technique called reinforcement learning, Gran Turismo Sophy learned to drive through positive and negative feedback, drawing on observations such as its speed, the direction its wheels are pointing, and the curvature of the track. Mirroring the estimated 10,000+ hours humans need to become proficient at a skill, Gran Turismo Sophy duplicated itself and took on many different scenarios simultaneously. This required a great deal of computing power, which was provided by Sony Interactive Entertainment.
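The feedback loop described above can be sketched at toy scale. The example below uses tabular Q-learning, a classic reinforcement-learning technique; the track, speed limits, actions, and reward are invented for illustration and are not PDI’s or Sony AI’s actual formulation, which uses deep neural networks and a far richer state:

```python
import random

# Toy 1D circuit: each segment has a maximum safe speed (its "curvature").
# Everything here is an illustrative sketch, not real Gran Turismo Sophy code.
TRACK_LIMIT = [3, 3, 1, 1, 2, 3, 2, 1, 3, 3]
ACTIONS = [-1, 0, 1]  # brake, coast, accelerate

def step(segment, speed, action):
    """Advance one segment; reward speed, heavily penalize exceeding the limit."""
    speed = max(0, min(3, speed + action))
    reward = speed - (5 if speed > TRACK_LIMIT[segment] else 0)
    return (segment + 1) % len(TRACK_LIMIT), speed, reward

def train(episodes=3000, alpha=0.1, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning: positive/negative feedback alone shapes the policy."""
    rng = random.Random(seed)
    q = {}  # (segment, speed, action) -> estimated long-term value
    for _ in range(episodes):
        seg, speed = 0, 0
        for _ in range(50):
            if rng.random() < eps:  # explore occasionally
                a = rng.choice(ACTIONS)
            else:                   # otherwise act on current knowledge
                a = max(ACTIONS, key=lambda x: q.get((seg, speed, x), 0.0))
            nseg, nspeed, r = step(seg, speed, a)
            best_next = max(q.get((nseg, nspeed, x), 0.0) for x in ACTIONS)
            old = q.get((seg, speed, a), 0.0)
            q[(seg, speed, a)] = old + alpha * (r + gamma * best_next - old)
            seg, speed = nseg, nspeed
    return q

def greedy_reward(q, steps=30):
    """Total reward when the learned policy drives a few laps on its own."""
    seg, speed, total = 0, 0, 0
    for _ in range(steps):
        a = max(ACTIONS, key=lambda x: q.get((seg, speed, x), 0.0))
        seg, speed, r = step(seg, speed, a)
        total += r
    return total
```

After enough episodes, the learned policy should slow down before tight segments rather than carrying too much speed into them, even though nothing ever told it to brake; only the reward signal did.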

“With a standard AI simulation, a model is created and then run. Analysis occurs and updates are then added to that simulation and run again. This process can sometimes become extremely time-consuming,” said Justin Beltran, Sr. Director, Future Technology Group, Sony Interactive Entertainment.

“However, leveraging SIE’s worldwide mass-scale cloud gaming infrastructure, Gran Turismo Sophy was powered to deploy state-of-the-art learning algorithms and training scenarios and successfully run tens of thousands of simultaneous simulations in this cutting-edge environment that supported this revolutionary technology,” continued Beltran.
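The pattern Beltran describes, many simulation instances running concurrently and feeding one learner, can be illustrated at toy scale. The worker counts, rollout contents, and function names below are all placeholders, not SIE’s infrastructure:

```python
from concurrent.futures import ThreadPoolExecutor
import random

def simulate_race(seed, steps=100):
    """One independent simulation instance. The (state, action, reward)
    tuples here are random placeholders for real game-state observations."""
    rng = random.Random(seed)
    return [(rng.random(), rng.choice([-1, 0, 1]), rng.random())
            for _ in range(steps)]

def collect_experience(n_rollouts=64, n_workers=8):
    """Run many simulations concurrently and merge their experience into a
    single batch for the learner, instead of simulating one race at a time."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        batches = list(pool.map(simulate_race, range(n_rollouts)))
    return [sample for batch in batches for sample in batch]
```

The point of the design is throughput: the time-consuming simulate/analyze/update cycle Beltran mentions shrinks when thousands of simulations contribute experience to the same model update in parallel.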

Returning to the Track

On 21 October 2021, the second race day arrived, along with hopes that Gran Turismo Sophy would win every competition, including the team race.

Not only did it dominate across the board, the team also watched it adapt under pressure: after a wipeout early in the third race, it recovered to finish first. Gran Turismo Sophy took first and second place in all three races and won the team competition with double the human team’s points.

While GT Sophy proved its capabilities by outracing the human drivers, the intention of the project is not to replace or diminish human interaction; instead, it is about expanding and enriching the gaming experience for all players. Sony AI calls this ‘AI for Gamers.’ “We’re going to create artificial intelligence that will unleash the power of human creativity and imagination,” says Hiroaki Kitano, Sony AI CEO.

“I want to race with GT Sophy more in the future. I really learned a lot from the AI agent,” commented Takuma Miyazono after racing against GT Sophy. “The goal with Gran Turismo Sophy is ultimately to entertain people,” added Kazunori Yamauchi.

“We envision a future where AI agents could introduce developers and creators to new levels of innovation and unlock doors to unimagined opportunities,” said Ueli Gallizzi, SVP of Future Technology Group, Sony Interactive Entertainment. “We could see new levels of user engagement, better gaming experiences, and the entry of a whole new generation into the world of gaming.”

We can’t wait to see what lies ahead as the worlds of artificial intelligence and interactive entertainment are bridged, and Gran Turismo Sophy is the next step in this exciting adventure.