AI Outraces Human Champs at the Video Game Gran Turismo
To hurtle around a corner along the fastest “racing line” without losing control, race car drivers must brake, steer and accelerate in precisely timed sequences. The process depends on the limits of friction, which are governed by known physical laws—which means self-driving cars can learn to complete a lap at the fastest possible speed (as some have already done). But the problem becomes much knottier when the automated driver has to share space with other cars. Now scientists have unraveled the challenge virtually by training an artificial intelligence program to outpace human competitors at the ultrarealistic racing game Gran Turismo Sport. The findings could point self-driving car researchers toward new ways to make this technology function in the real world.
Artificial intelligence has already conquered human players in certain video games, such as StarCraft II and Dota 2. But Gran Turismo differs from those games in significant ways, says Peter Wurman, director of Sony AI America and co-author of the new study, which was published in Nature. “In most games, the environment defines the rules and protects the users from each other,” he explains. “But in racing, the cars are very close to each other, and there’s a very refined sense of etiquette that has to be learned and deployed by the [AI] agents. In order to win, they have to be respectful of their opponents, but they also have to preserve their own driving lines and make sure that they don’t just give way.”
To teach their program the ropes, the Sony AI researchers used a technique called deep reinforcement learning. They rewarded the AI for certain behaviors, such as staying on the track, remaining in control of the vehicle and respecting racing etiquette. Then they set the program loose to try different ways of racing that would enable it to achieve those goals. The Sony AI team trained multiple versions of its AI, dubbed Gran Turismo Sophy (GT Sophy), each specialized in driving one particular car on one particular track. Then the researchers pitted the program against human Gran Turismo champions. In the first test, conducted in July 2021, the humans achieved the highest overall team score. In the second run, in October 2021, the AI broke through: it beat its human foes both individually and as a team, achieving the fastest lap times.
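The reward-driven training loop described above can be illustrated with a deliberately tiny sketch. This is not Sony AI's system (GT Sophy uses deep neural networks and a full racing simulator); it is a minimal tabular Q-learning example on a hypothetical one-dimensional "track" with a single corner, where the agent is rewarded for making fast progress but crashes if it takes the corner at speed. All names (`step`, `train`, `CORNER`) are invented for illustration.

```python
import random

# Toy track: positions 0..9; position 5 is a sharp corner.
TRACK_LEN = 10
CORNER = 5
ACTIONS = [1, 2]  # speed per step: 1 = brake/slow, 2 = accelerate/fast

def step(pos, action):
    """Advance the car one time step; going fast through the corner crashes."""
    if action == 2 and pos <= CORNER < pos + action:
        return pos, -10.0, True          # crashed at the corner: big penalty
    new_pos = pos + action
    if new_pos >= TRACK_LEN:
        return TRACK_LEN, 10.0, True     # finished the lap: bonus reward
    return new_pos, float(action), False # reward progress (staying in control)

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(TRACK_LEN + 1) for a in ACTIONS}
    for _ in range(episodes):
        pos, done = 0, False
        while not done:
            # Explore occasionally; otherwise act greedily on current estimates.
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: q[(pos, x)])
            new_pos, r, done = step(pos, a)
            best_next = 0.0 if done else max(q[(new_pos, x)] for x in ACTIONS)
            q[(pos, a)] += alpha * (r + gamma * best_next - q[(pos, a)])
            pos = new_pos
    return q
```

After training, the greedy policy accelerates on the straights and brakes only for the corner: the agent was never told the rules of cornering, it simply discovered which action sequences maximize its cumulative reward, which is the core idea behind GT Sophy's far larger training setup.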
The human players seem to have taken their losses in stride, and some enjoyed pitting their wits against the AI. “Some of the things that we also heard from the drivers was that they learned new things from Sophy’s maneuvers as well,” says Erica Kato Marcus, director of strategies and partnerships at Sony AI. “The lines the AI was using were so tricky, I could probably do them once. But it was so, so difficult—I would never attempt it in a race,” says Emily Jones, who was a world finalist at the FIA-Certified Gran Turismo Championships 2020 and later raced against GT Sophy.
(Excerpt from ‘AI Outraces Human Champs at the Video Game Gran Turismo’, scientificamerican.com, 2022)
What is this text mainly about?