It might not look that special, but the robot above is, according to a new measure, the most dexterous one ever created. Among other tricks, it could sort through your junk drawer with unrivaled speed and skill.
The key to its dexterity is not in its mechanical grippers but in its brain. The robot uses software called Dex-Net to determine how to pick up even odd-looking objects with incredible efficiency.
Dex-Net was developed by Ken Goldberg, a professor at UC Berkeley, and one of his graduate students, Jeff Mahler. The software runs on an off-the-shelf industrial machine made by ABB, a Swiss robotics company. Goldberg demonstrated the latest version of his system at EmTech Digital, an event in San Francisco organized by MIT Technology Review and dedicated to artificial intelligence.
Goldberg’s system is a lot closer to matching the adroitness of a human than anything developed previously. Industrial robots with better dexterity could find application in warehouses and factories as well as hospitals and homes.
What’s especially clever about Dex-Net is how it learns to grasp. The software tries picking up objects in a virtual environment, training a deep neural network through trial and error. Even in simulation, this is a laborious task. Crucially, though, Dex-Net can generalize from an object it has seen before to a new one. The robot will even nudge an item to get a better look at it if it isn’t sure how it should be grasped. The latest version of the system includes a high-resolution 3-D sensor and two arms, each controlled by a different neural network. One arm is equipped with a conventional robot gripper, the other with a suction system. The robot’s software scans an object and then consults both neural networks to decide, on the fly, whether it makes more sense to grab or suck that particular object.
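The article doesn’t include Goldberg’s code, but the grab-or-suck decision can be sketched as a simple arbitration between two learned quality estimators. In the toy Python below, `gripper_quality` and `suction_quality` are hypothetical placeholders standing in for the two trained networks, and `plan_pick` is an illustrative policy, not Dex-Net’s actual API; all numbers are made up.

```python
import numpy as np

# Hypothetical stand-ins for the two grasp-quality networks. Each scores a
# candidate grasp on a depth image; the real versions are convolutional
# models trained on simulated grasp attempts.

def gripper_quality(depth_image: np.ndarray, grasp: tuple) -> float:
    """Placeholder: estimated probability a parallel-jaw grasp succeeds."""
    x, y, angle = grasp
    # Toy heuristic in place of the learned network: favor the image center.
    h, w = depth_image.shape
    dist = np.hypot(x - w / 2, y - h / 2) / np.hypot(w / 2, h / 2)
    return float(1.0 - dist) * 0.9

def suction_quality(depth_image: np.ndarray, grasp: tuple) -> float:
    """Placeholder: estimated probability a suction cup seals and holds."""
    x, y, _ = grasp
    # Toy heuristic: suction prefers flat (low-variance) local patches.
    patch = depth_image[max(0, int(y) - 5):int(y) + 5,
                        max(0, int(x) - 5):int(x) + 5]
    return float(np.exp(-10.0 * patch.var()))

def plan_pick(depth_image, candidates):
    """Score every candidate under both modalities; return the overall best."""
    best = None
    for grasp in candidates:
        for arm, quality in (("gripper", gripper_quality),
                             ("suction", suction_quality)):
            score = quality(depth_image, grasp)
            if best is None or score > best[2]:
                best = (arm, grasp, score)
    return best  # (which arm to use, where to grasp, estimated success prob.)

depth = np.random.rand(64, 64) * 0.05 + 0.5          # fake depth image
candidates = [(32, 32, 0.0), (10, 50, 1.2), (50, 12, 0.7)]
arm, grasp, p = plan_pick(depth, candidates)
print(f"use {arm} arm at {grasp} (estimated success {p:.2f})")
```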
UC Berkeley researchers also developed a better way to measure the performance of a picking robot: a metric called “mean picks per hour,” calculated by multiplying the average rate of picks (the inverse of the average time per pick) by the average probability of success, over a consistent set of objects.
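That definition reduces to a one-line calculation. The sketch below is a minimal illustration of the metric as described above; the function name and the example numbers are mine, chosen only to land inside the 200-to-300 range Goldberg reports below.

```python
def mean_picks_per_hour(avg_seconds_per_pick: float, success_rate: float) -> float:
    """Mean picks per hour: average picking rate times average success probability."""
    picks_per_hour = 3600.0 / avg_seconds_per_pick   # raw attempt rate
    return picks_per_hour * success_rate             # discounted by reliability

# Illustrative numbers only: a robot that attempts a pick every 13 seconds
# and succeeds 90% of the time scores roughly 249 mean picks per hour.
print(mean_picks_per_hour(13.0, 0.90))  # ~249.2
```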
The new metric will help research labs working on picking robots share their results. “We’ve been talking about how to align our results so that we see progress,” Goldberg says. “It all depends what robot you’re using, what sensor you’re using, and—very importantly—what objects you’re using.”
Humans are capable of between 400 and 600 mean picks per hour. In a recent contest organized by Amazon, the best robots managed between 70 and 95. The new machine reaches 200 to 300 mean picks per hour, Goldberg says. The results will be presented at a conference in Australia later this year.
During his presentation, Goldberg added that within five years, he expects that robots will reach "human or even superhuman mean picks per hour."
Grasping and manipulating awkward and unfamiliar objects is a fundamental challenge in robotics, and one that has held the technology back. The robots found in car factories, for instance, are fast and precise but have no ability to adapt to a changing or unfamiliar setting. Besides factory or warehouse work, more sophisticated manipulation may lead to the first useful robots for helping people in places such as hospitals and elder-care facilities.
Recent progress in this aspect of robotics is the result of several simultaneous trends. Smaller, safer robots have proliferated, new kinds of grippers and end effectors have emerged, and—most significant—big strides have been made in machine learning.
In addition to Goldberg’s work and research at several other academic labs, researchers at places like DeepMind and OpenAI have begun exploring how machine learning could be used to make robots smarter and more adaptable. Advances in robotics may well feed back into other areas of AI, such as perception.
“Machine learning is having an unprecedented impact on robotics,” says Russ Tedrake, a professor at MIT who has seen the UC Berkeley robot demoed. “There is incredible value in getting robots to proliferate to the point that we finally have big data for robotics.”