Dex-Net
Learning suction graspability considering grasp quality and robot reachability for bin-picking
Ping Jiang, Junji Oaki, Yoshiyuki Ishihara, Junichiro Ooga, Haifeng Han, Atsushi Sugahara, Seiji Tokura, Haruna Eto, Kazuma Komoda, Akihito Ogawa
Deep learning has been widely used to infer robust grasps. Human-labeled RGB-D datasets were initially used to learn grasp configurations, but preparing such large datasets is expensive. To address this problem, images were instead generated by a physics simulator, and a physically inspired model (e.g., a contact model between a suction vacuum cup and an object) was used as a grasp quality evaluation metric to annotate the synthesized images. However, such contact models are complicated and require experimental parameter identification to ensure real-world performance. In addition, previous studies have not considered manipulator reachability, e.g., cases where a grasp configuration with high grasp quality cannot be reached because of collisions or the physical limitations of the robot. In this study, we propose an intuitive, geometry-based analytic grasp quality evaluation metric, and we further incorporate a reachability evaluation metric. We annotate pixel-wise grasp quality and reachability with the proposed metrics on images synthesized in a simulator, and use them to train an auto-encoder-decoder called suction graspability U-Net++ (SG-U-Net++). Experimental results show that our intuitive grasp quality metric is competitive with a physically inspired one. Learning reachability helps reduce motion planning computation time by removing obviously unreachable candidates. The system achieves an overall picking speed of 560 PPH (pieces per hour).
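The paper's core output is a pair of pixel-wise maps, grasp quality and reachability, predicted from a depth image. As a minimal sketch of how such maps might be combined to pick a suction grasp point (this is not the authors' implementation; the weighting `alpha` and the masking rule are my assumptions for illustration):

```python
import numpy as np

def select_suction_grasp(quality_map, reachability_map, alpha=0.5):
    """Combine pixel-wise grasp quality and reachability scores and
    return the (row, col) of the best suction grasp candidate.

    quality_map, reachability_map: 2-D arrays in [0, 1], same shape.
    alpha: hypothetical weight trading quality against reachability.
    """
    score = alpha * quality_map + (1.0 - alpha) * reachability_map
    # Exclude pixels the planner could never reach.
    score = np.where(reachability_map > 0.0, score, -np.inf)
    idx = np.unravel_index(np.argmax(score), score.shape)
    return idx, score[idx]
```

Filtering by reachability before motion planning is exactly what the abstract credits for the reduced computation time: obviously unreachable candidates never reach the planner.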
A Choice of Grippers Helps Dual-Arm Robot Pick Up Objects Faster Than Ever
We've been following Dex-Net's progress towards universal grasping for several years now, and today in a paper in Science Robotics, UC Berkeley is presenting Dex-Net 4.0. The new and exciting bit about this latest version of Dex-Net is that it's able to successfully grasp 95 percent of unseen objects at a rate of 300 per hour, thanks to some added ambidexterity that lets the robot dynamically choose between two different kinds of grippers. For some context, humans are able to pick objects like these nearly twice as fast, between 400 and 600 picks per hour. And my guess would be that human success rates are as close to 100 percent as you can reasonably expect, perhaps achieving 100 percent if you allow for multiple tries to pick the same object. We set a very, very high bar for the machines.
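The ambidexterity described above amounts to a policy-selection step: each gripper's policy proposes candidate grasps with predicted qualities, and the robot commits to the gripper whose best candidate scores highest. A minimal sketch of that selection (the data structure and names are illustrative, not Dex-Net 4.0's actual API):

```python
def choose_gripper(candidates):
    """Pick the gripper whose best candidate grasp has the highest
    predicted quality.

    candidates: dict mapping gripper name -> list of (grasp, quality)
    pairs. Returns (gripper, grasp, quality) for the overall best.
    """
    best = None
    for gripper, grasps in candidates.items():
        for grasp, quality in grasps:
            if best is None or quality > best[2]:
                best = (gripper, grasp, quality)
    return best
```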
This One-Armed Robot Is Super Manipulative (in a Good Way)
Give a man a fish, the old saying goes, and you feed him for a day--teach a man to fish, and you feed him for a lifetime. The same goes for robots, except that robots feed exclusively on electricity. The problem is figuring out the best way to teach them. Typically, robots get fairly detailed coded instructions on how to manipulate a particular object. But give one a different kind of object and you'll blow its mind, because machines aren't yet great at learning and applying their skills to things they've never seen before.
Robots Can't Hold Stuff Very Well. But You Can Help
Imagine, for a moment, the simple act of picking up a playing card from a table. You have a couple of options: Maybe you jam your fingernail under it for leverage, or drag it over the edge of the table. Now imagine a robot trying to do the same thing. Tricky: Most robots don't have fingernails, or friction-facilitating fingerpads that perfectly mimic ours. So many of these delicate manipulations continue to escape robotic control.
Robots get closer to human-like dexterity
It might not look that special, but the robot above is, according to a new measure, the most dexterous one ever created. Among other tricks, it could sort through your junk drawer with unrivaled speed and skill. The key to its dexterity is not in its mechanical grippers but in its brain. The robot uses software called Dex-Net to determine how to pick up even odd-looking objects with incredible efficiency. The new robot was built by Ken Goldberg, a professor at UC Berkeley, and one of his graduate students, Jeff Mahler. Goldberg will demonstrate the latest version of it at EmTech Digital, an event in San Francisco organized by MIT Technology Review and dedicated to artificial intelligence.
This robot arm's AI thinks like we do about how to grab something
Robots are great at doing things they've been shown how to do, but when presented with a novel problem, such as an unfamiliar shape that needs to be gripped, they tend to choke. AI is helping there in the form of systems like Dex-Net, which uses deep learning to let a robotic arm improvise an effective grip for objects it's never seen before. The basic idea behind the system is rather like how we figure out how to pick things up. You see an object, understand its shape and compare it to other objects you've picked up in the past, then use that information to choose the best way to grab it. Dex-Net doesn't have the advantage of being a living person with eyes and a memory, so its creators gave it more than six million artificial 3D representations of objects and had it work out the best way, theoretically, to pick up each.
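The ranking idea in this paragraph, scoring each candidate grasp from local depth data and taking the best, can be sketched as follows. The `quality` function below is a toy stand-in for Dex-Net's learned quality network (which was trained on millions of synthetic examples); the patch size and the flat-surface heuristic are assumptions for illustration only:

```python
import numpy as np

def quality(depth_patch):
    # Toy stand-in for a learned grasp quality network: favors flat,
    # uniform surfaces by penalizing depth variance in the patch.
    return 1.0 / (1.0 + depth_patch.var())

def rank_grasps(depth_image, candidates, patch=8):
    """Score each candidate (row, col) pixel by the quality of the
    local depth patch around it; return candidates sorted best-first
    as ((row, col), score) pairs."""
    scored = []
    for (r, c) in candidates:
        r0 = max(r - patch // 2, 0)
        c0 = max(c - patch // 2, 0)
        window = depth_image[r0:r0 + patch, c0:c0 + patch]
        scored.append(((r, c), quality(window)))
    return sorted(scored, key=lambda x: -x[1])
```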
Why It's So Hard For Robots To Get A Grip
Berkeley robotics professor Ken Goldberg is turning an empty coffee mug around and around in his hands. "Oh, it's so complicated for a robot to be able to make sense of that kind of data," he says, eyeing his fingers grasping the cup with a look of wonder. Artificial intelligence is taking on complex cognitive tasks, such as assisting in legal and medical research, but a manual job like picking up laundry off the floor is still science fiction. Universities like Berkeley and Cornell and companies like Amazon and Toyota are working to close the gap with mechanical hands that approach human dexterity. Success would unleash a new robotics revolution with positive effects like reducing household drudgework, and fraught effects such as eliminating jobs in places like warehouses. Machines have been taking over manual labor for centuries, but they've been limited to specific, predictable tasks, as in factories.