Robotic Dough Shaping
Jan Ondras, Di Ni, Xi Deng, Zeqi Gu, Henry Zheng, Tapomayukh Bhattacharjee
The ability to deform soft objects remains a great challenge for robots due to difficulties in defining the problem mathematically. In this paper, we address the problem of shaping a piece of dough-like deformable material into a 2D target shape presented upfront. We use a 6 degree-of-freedom WidowX-250 Robot Arm equipped with a rolling pin and information collected from an RGB-D camera and a tactile sensor. We present and compare several control policies, including a dough shrinking action, in extensive experiments across three kinds of deformable materials and three target dough shape sizes, achieving an intersection over union (IoU) of 0.90. Our results show that: i) rolling dough from the highest dough point is more efficient than rolling from the 2D/3D dough centroid; ii) it might be better to stop the roll movement at the current dough boundary rather than at the target shape outline; iii) the shrink action might be beneficial only if properly tuned with respect to the expand action; and iv) the Play-Doh material is easier to shape to a target shape than Plasticine or kinetic sand. Video demonstrations of our work are available at https://youtu.be/ZzLMxuITdt4.
Keywords: Robotics and Mechatronics, Machine Vision and Perception, Sensors and Actuators
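The IoU score reported above measures overlap between the achieved dough region and the target shape. A minimal sketch of how IoU can be computed from two binary masks (the function name and toy masks are illustrative, not taken from the paper's code):

```python
import numpy as np

def iou(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Intersection over union of two boolean masks of the same shape."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return float(inter) / float(union) if union else 0.0

# Example: two 3x3 square regions offset by one pixel inside a 4x4 grid
a = np.zeros((4, 4), dtype=bool); a[:3, :3] = True   # 9 pixels
b = np.zeros((4, 4), dtype=bool); b[1:, 1:] = True   # 9 pixels
print(iou(a, b))  # 4 overlapping pixels / 14 union pixels ≈ 0.2857
```

An IoU of 1.0 means the dough exactly fills the target outline; 0.90 indicates a near-complete match with only a thin mismatched border.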
AI-powered robot learned to make letters out of Play-Doh on its own
A robot has learned how to mould modelling clay into letters that it has never seen before. Creating complex shapes out of doughy materials is a skill that could be put to use in the future in the form of a dumpling-making robot chef. "Deformable objects are ubiquitous in our daily life," says Yunzhu Li at the Massachusetts Institute of Technology. Robots capable of gently handling such objects could one day cook, do housework or even help care for elderly people, he says.
Robots learn to shape letters using Play-Doh
MIT CSAIL researchers have created a system, RoboCraft, that teaches robots how to work with the kid-friendly goo. The platform first takes the image of a shape (in this case, a letter of the alphabet) and reinterprets it as a cluster of interlocking particles. The bot then uses a physics-oriented neural network to predict how its two "fingers" can manipulate those spheres to match the desired outcome. A predictive algorithm helps the machine plan its actions. The technology doesn't require much time to produce usable results.
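The first step the article describes, turning an image of a target shape into a cluster of particles, can be approximated by sampling points from a binary mask of the shape. This is only an illustrative sketch under that assumption; the function name, particle count, and toy mask are not from RoboCraft itself:

```python
import numpy as np

def mask_to_particles(mask: np.ndarray, n_particles: int = 300,
                      seed: int = 0) -> np.ndarray:
    """Sample a fixed-size set of 2D particle positions from a binary shape mask."""
    ys, xs = np.nonzero(mask)                      # pixel coordinates inside the shape
    coords = np.stack([xs, ys], axis=1).astype(float)
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(coords),
                     size=min(n_particles, len(coords)),
                     replace=False)                # subsample without repeats
    return coords[idx]

# Toy letter "I": a vertical bar in a 32x32 mask
mask = np.zeros((32, 32), dtype=bool)
mask[4:28, 14:18] = True
particles = mask_to_particles(mask, n_particles=50)
print(particles.shape)  # (50, 2)
```

A learned dynamics model can then predict how gripper actions move such a particle set toward the particle set of the desired letter.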
How your poop can help train AI
San Francisco (CNN) The next time you go to the bathroom, a couple of startups are hoping you'll snap a photo before you flush. Two companies -- Auggi, a gut-health startup that's building an app for people to track gastrointestinal issues, and Seed Health, which works on applying microbes to human health and sells probiotics -- are soliciting poop photos from anyone who wants to send them. The companies began collecting the photos online on Monday via a campaign cheekily called "Give a S--t" (you can imagine what the dashes stand for) with the goal of creating the first known data set of human poop images. These pictures -- the companies hope to collect 100,000 photos in total -- can then be used to build AI for research into gut-related diseases and to help people with such health conditions more easily track their own bowel movements. "We like to say it's basically a data dump that gets flushed away each day that could really inform science," Seed cofounder and co-CEO Ara Katz told CNN Business.