Google AI learns to play open-world video games by watching them
A Google DeepMind artificial intelligence model can play a range of open-world video games, including No Man's Sky, much as a human does: by watching the video feed from the screen. The work could be a step towards generally intelligent AIs that operate in the physical world.

Playing video games has long been a way to test the progress of AI systems, as with Google DeepMind's earlier mastery of chess and Go, but those games have clear ways to win or lose, making it relatively straightforward to train an AI to succeed at them. Open-world games such as Minecraft, with more abstract objectives and extraneous information that must be ignored, are harder for AI systems to crack. Because the array of choices available in these games makes them a little more like everyday life, they are seen as an important stepping stone towards AI agents that could do jobs in the real world, such as controlling robots, and ultimately towards artificial general intelligence.

Now, researchers at Google DeepMind have developed an AI they call a Scalable Instructable Multiworld Agent, or SIMA, which can play nine different video games, as well as virtual environments it hasn't seen before, using just the video feed from the game.
Mar-13-2024, 14:15:48 GMT