Google’s DeepMind has developed a robotic arm that can compete with amateur table tennis players, showcasing impressive agility and skill, and marking a step forward in AI’s physical capabilities.
DeepMind, Google’s AI subsidiary, has made waves in the world of robotics by developing a robotic arm capable of playing table tennis at an amateur level. The robot, equipped with advanced AI and motion capture technology, has shown remarkable agility, successfully competing with amateur human players in recent tests.
In a newly published research paper, DeepMind revealed that its table tennis-playing robot won 13 of 29 full matches against amateur-level opponents. While it may not yet be able to challenge professional players, reaching this level of skill is an impressive achievement for an AI system. The robot can handle a range of shots, including backhands, forehands, and returns with spin, and it can even recover when a ball grazes the net, showing a solid grasp of the game’s dynamics.
The human players who faced off against the robot reported that the matches were both engaging and challenging. According to MIT Technology Review, many players found the experience enjoyable and noted that the robot could serve as a valuable practice partner, potentially helping them improve their own skills. A video released by DeepMind shows the bot in action, deftly responding to a range of play styles and even appearing to ‘hop’ in place during particularly intense rallies, despite lacking legs.
Pannag Sanketi, the engineer who led the project at DeepMind, expressed his surprise at the robot’s performance. “Even a few months back, we projected that realistically the robot may not be able to win against people it had not played before,” he said. “The system certainly exceeded our expectations. The way the robot outmaneuvered even strong opponents was mind-blowing.”
DeepMind’s approach to training the robot involved two key steps. First, the AI was trained in a virtual environment, using simulations that closely mimicked the physics of table tennis, which let the system develop its hitting technique in a controlled setting. The robot’s skills were then refined with real-world data collected from actual gameplay.
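DeepMind has not released the training code described in the paper, but the general shape of a “simulate first, then fine-tune on real play” recipe can be illustrated with a minimal, entirely hypothetical Python sketch. Everything below — the class names, the toy linear policy, the made-up learning signal — is an illustrative assumption, not DeepMind’s actual system.

```python
import numpy as np

class SimulatedRally:
    """Toy stand-in for a table tennis physics simulator."""
    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)

    def sample_ball_state(self):
        # Position (x, y, z) and velocity (vx, vy, vz) of an incoming ball.
        return self.rng.normal(size=6)


class HittingPolicy:
    """Toy linear policy mapping a ball state to a paddle command."""
    def __init__(self, state_dim=6, action_dim=3):
        self.weights = np.zeros((action_dim, state_dim))

    def act(self, ball_state):
        return self.weights @ ball_state

    def update(self, ball_state, target_action, lr=0.01):
        # Regression-style update nudging the policy toward a target command.
        error = target_action - self.act(ball_state)
        self.weights += lr * np.outer(error, ball_state)


def train_in_simulation(policy, sim, steps=1000):
    # Stage 1: practice against simulated physics in a controlled setting.
    for _ in range(steps):
        state = sim.sample_ball_state()
        # In a real system the learning signal would come from rewards in the
        # simulator; here a made-up "ideal return" stands in for it.
        target = np.tanh(state[:3])
        policy.update(state, target)


def finetune_on_real_data(policy, logged_rallies, lr=0.001):
    # Stage 2: refine the same policy on (ball_state, paddle_command)
    # pairs logged during real-world play.
    for state, command in logged_rallies:
        policy.update(state, command, lr=lr)


if __name__ == "__main__":
    sim = SimulatedRally()
    policy = HittingPolicy()
    train_in_simulation(policy, sim)
    real_data = [(sim.sample_ball_state(), np.zeros(3)) for _ in range(50)]
    finetune_on_real_data(policy, real_data)
    print("Fine-tuned policy weights shape:", policy.weights.shape)
```

The point of the two stages is the same as in the article: simulation provides cheap, safe repetition, and real-world data corrects for the ways the simulator inevitably differs from a physical table, ball, and opponent.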
During live matches, the robot relies on a pair of cameras to track the ball’s movement and on motion capture technology to monitor its human opponent. The opponent’s paddle is fitted with LEDs so the system can follow it precisely and read the player’s style, and all of this data is fed back into the AI’s simulations, enabling continuous improvement with each game.
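The paper does not spell out the perception pipeline at code level, but tracking a ball with two cameras typically comes down to stereo triangulation: each camera reports where the ball appears in its image, and the two views are combined into a single 3D position. The sketch below is a generic illustration of that idea, not DeepMind’s implementation; the projection matrices and image coordinates are invented for the example.

```python
import numpy as np

def triangulate_ball(P1, P2, pixel1, pixel2):
    """Recover a 3D ball position from matching image coordinates seen by
    two calibrated cameras, using linear (DLT) triangulation."""
    u1, v1 = pixel1
    u2, v2 = pixel2
    # Each camera contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenise to world coordinates

# Invented example: two calibrated cameras, the second offset 0.5 m to the side.
# Image coordinates are normalized camera coordinates, not raw pixels.
P_left = np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
ball_3d = triangulate_ball(P_left, P_right, (0.42, 0.13), (0.25, 0.13))
print("Estimated ball position:", ball_3d)
```

Running this kind of estimate on every camera frame yields a stream of 3D ball positions from which speed and trajectory can be predicted; the opponent-tracking data plays a similar role on the human side of the table.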
Despite its impressive capabilities, the robot still has some limitations. It struggles with extremely fast shots, balls that are hit far off the table, and those with heavy spin, as it currently lacks the ability to accurately measure ball rotation. However, DeepMind believes that future upgrades to predictive AI modeling and collision detection could address these issues.
While the project might seem like a fun experiment, it has broader implications for AI development. The research highlights the potential of AI systems to perform complex physical tasks safely in natural environments, such as homes or warehouses. This table tennis robot is not just a showcase of advanced technology but also a step forward in the journey towards integrating AI into everyday tasks.