“OpenAI Text Generator GPT-2 Creates Video Game Walkthrough for ‘Most Tedious Game in History’”, Andrew Whalen, 2020-02-20:

When OpenAI announced the automatic text generator GPT-2 in February of 2019, its language model had a simple objective: predict the next word. Since its release—and despite high computational barriers—programmers, tinkerers and artificial intelligence researchers have explored creative ways to use the advanced language model, developing applications for GPT-2 far beyond simple text generation. In January, AI researcher Shawn Presser demonstrated how GPT-2 can empower video game design, beginning with “the most tedious game in history.” “You can prompt the model with whatever text you want, and it will try to guess how to complete it”, Presser told Newsweek.
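GPT-2 itself is far too large to reproduce here, but the "predict the next word" objective can be illustrated with a toy model: count which word follows which in a small corpus, then complete a prompt by repeatedly emitting the most frequent successor. Everything below (the corpus, the function names) is illustrative, not from the article:

```python
from collections import Counter, defaultdict

# Toy illustration of the "predict the next word" objective:
# count word bigrams in a tiny corpus, then complete a prompt
# by repeatedly emitting the most frequent successor.
corpus = (
    "push the switch and the door will open . "
    "kill the guard and push the switch . "
    "the door will open ."
).split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def complete(prompt, n_words=5):
    """Greedily extend the prompt one word at a time."""
    words = prompt.split()
    for _ in range(n_words):
        best = successors[words[-1]].most_common(1)
        if not best:
            break
        words.append(best[0][0])
    return " ".join(words)

print(complete("push the"))
```

GPT-2 does the same thing in spirit, but conditions on the entire preceding context rather than one word, and learns its statistics from tens of gigabytes of web text rather than three sentences.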

…Using thousands of game walkthroughs and FAQs, scraped from sites around the web (a 50 megabyte data set provided by Twitter’s @me_irl), Presser prompted GPT-2 to generate its own walkthroughs. The result is walkthroughs of video games that never existed; guides to adventures no one has ever programmed. Presser described one of GPT-2’s creations as “a walkthrough for the most tedious game in history”: a dense set of instructions for something that sounds a lot like a first-person shooter. “When the room opens, go forward. You should find a rocket launcher”, the walkthrough begins. “Push the switch and a door opens. Take cover in the corner and shoot the guard. The door will close when he dies. Now jump over the gap and kill the guards. In the next area is a switch. Push it and the door will open. In the next area is a scientist. Kill him. Go back to the previous room and push the switch. Open the next door. In the next room is a scientist. Kill him.”

…But renting a “TPU pod” for cloud computing can cost millions, making pods prohibitively expensive for all but large companies—organizations unlikely to try out playful experiments. So Presser developed a technique he dubbed “swarm training” to employ 80 individual TPUs on a single data set. “In swarm training, we can run dozens or hundreds of TPUs in a loose network which swaps updates on the fly”, Presser told Newsweek. “It’s chaotic, but it winds up working pretty well: it’s much faster than using just a few TPUs, but much cheaper than renting entire TPU pods. We’re hopeful that swarm training will be very useful to other researchers.”
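The article does not give implementation details, but the general idea—independent workers training on the same objective and occasionally swapping parameters with a peer, rather than synchronizing every step the way a TPU pod does—can be sketched on a toy problem. The loss, hyperparameters, and swap schedule here are all illustrative assumptions:

```python
import random

# Conceptual sketch of "swarm training": independent workers each take
# gradient steps on the same objective, and every few steps a random
# pair of workers averages their parameters ("swaps updates on the fly").
# The quadratic toy loss (w - TARGET)**2 and all constants are illustrative.

TARGET = 3.0
N_WORKERS = 8
STEPS = 200
LR = 0.1
SWAP_EVERY = 10   # how often a pair of workers gossips

def grad(w):
    # derivative of the toy loss (w - TARGET)**2
    return 2.0 * (w - TARGET)

random.seed(0)
weights = [random.uniform(-10, 10) for _ in range(N_WORKERS)]

for step in range(1, STEPS + 1):
    # each worker takes an independent gradient step
    weights = [w - LR * grad(w) for w in weights]
    # occasionally, a random pair of workers averages their parameters
    if step % SWAP_EVERY == 0:
        i, j = random.sample(range(N_WORKERS), 2)
        avg = (weights[i] + weights[j]) / 2.0
        weights[i] = weights[j] = avg

print(weights)  # every worker ends up near TARGET
```

The appeal of the loose-coupling design is fault tolerance: because workers only have to rendezvous occasionally, individual cheap TPUs can join or drop out without stalling the whole run, unlike a pod where every chip must stay in lockstep.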

…GPT-2 has also proved adept at gaming functions beyond just generating games-related text. Presser previously collaborated with technology writer and researcher Gwern Branwen to train GPT-2 to play chess, by feeding it legal chess moves (in standard algebraic notation) and asking it to output its own responses. After hours of training GPT-2 on which responses are valid moves in an ongoing chess game and which are nonsensical, the text generation engine was eventually able to complete a full game.
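Mechanically, this setup amounts to serializing the game so far as notation text, asking the model for the next token, and rejecting outputs that are not legal moves. A sketch of that loop is below; the `fake_model` sampler stands in for GPT-2, and the legal-move set is supplied by hand here rather than computed by a chess engine, so everything except the notation format is an illustrative assumption:

```python
import random

# Sketch of the chess setup described above: the game so far is rendered
# as standard-algebraic-notation text, a model proposes the next move,
# and candidates are filtered against the legal moves. fake_model() is a
# stand-in for GPT-2; the legal-move set is hand-supplied for illustration.

def moves_to_prompt(moves):
    """Render a move list as a numbered PGN-style prompt, e.g. '1. e4 e5 2. '."""
    parts = []
    for i, move in enumerate(moves):
        if i % 2 == 0:
            parts.append(f"{i // 2 + 1}.")  # move number before White's move
        parts.append(move)
    return " ".join(parts) + " "

def fake_model(prompt, vocabulary, rng):
    # stand-in for GPT-2: emit a random move-like token
    return rng.choice(vocabulary)

def next_move(moves, legal_moves, rng, max_tries=100):
    """Sample completions until one is a legal move (rejection sampling)."""
    prompt = moves_to_prompt(moves)
    vocabulary = ["e4", "e5", "Nf3", "Nc6", "O-O", "Qh5", "Ke2"]
    for _ in range(max_tries):
        candidate = fake_model(prompt, vocabulary, rng)
        if candidate in legal_moves:   # reject invalid completions
            return candidate
    return None

rng = random.Random(42)
game = ["e4", "e5"]
print(moves_to_prompt(game))    # '1. e4 e5 '
print(next_move(game, {"Nf3", "Nc6"}, rng))
```

The "hours of training" in the article correspond to fine-tuning the model so that legal continuations dominate its output distribution, which makes the rejection step above succeed far more often.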

While it may be years before game designers are employing text-generating language models in their designs, Presser said he already sees potential practical applications. “If you prompt the model with descriptions of some spells from your tabletop campaign, the model can generate new spells”, Presser said. “It’s quite versatile.” For example, Dungeons & Dragons players could input spells like Fireball, including a description of its HP damage, and get back from GPT-2 new attack spells to use in tabletop roleplaying sessions. “I think there’s an opportunity to build new indie games using GPT-2”, Presser said. “Imagine making a mod for Skyrim that uses GPT-2 to generate new quests. You’d have infinite replayability. It’d be like AI Dungeon 2 in 3D.”
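The spell-generation idea is few-shot prompting: list a few known spells in a consistent format, end the prompt mid-pattern, and let the model continue it. A minimal sketch of the prompt construction, with an illustrative format and spell text (the actual sampling call to GPT-2 is omitted):

```python
# Sketch of the few-shot prompting idea described above: list known
# spells with their stats in a fixed format, end with a bare "Name:"
# line, and let the model continue the pattern with a new spell.
# The format and spell descriptions here are illustrative.

KNOWN_SPELLS = [
    ("Fireball", "A bright streak blossoms into an explosion. 8d6 fire damage."),
    ("Magic Missile", "Three glowing darts strike unerringly. 1d4+1 force damage each."),
]

def build_prompt(spells):
    lines = []
    for name, description in spells:
        lines.append(f"Name: {name}")
        lines.append(f"Description: {description}")
        lines.append("")               # blank line between entries
    lines.append("Name:")              # the model completes from here
    return "\n".join(lines)

print(build_prompt(KNOWN_SPELLS))
```

Because the prompt ends immediately after "Name:", a model trained to continue text will tend to invent a new spell name and then, following the established pattern, a matching "Description:" line—which is exactly the behavior Presser describes.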