Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. Playground cofounder and CTO Peter Barrett discusses why the firm is betting big on the future of robots. Robots are finally learning how to open doors.
Apparently, one of the standards by which we should be measuring the progress of useful robotic manipulation is through the assembly of Ikea furniture. With its minimalistic and affordable Baltoscandian design coupled with questionably credible promises of effortless assembly, Ikea has managed to convince generations of inexperienced and desperate young adults (myself included) that we can pretend to be grownups by buying and putting together our own furniture. It's never as easy as that infuriatingly calm little Ikea manual dude makes it look, though, and in terms of things we wish robots would solve, Ikea furniture assembly has ended up way higher on the priority list than maybe it should be. We've seen a variety of robotic systems tackle Ikea in the past, but today in Science Robotics there's (perhaps for the first time) a mostly off-the-shelf system of a few arms and basic sensors that can put together the frame of a Stefan chair kit autonomously(ish) and from scratch. This research comes from the Control Robotics Intelligence (CRI) group at NTU in Singapore, and they've been working on the whole Ikea chair assembly thing for a while.
A dog's purpose can take on new meaning when humans strap a GoPro camera to her head. Such "dog cam" video clips have helped train computer vision software that could someday give rise to robotic canine companions. The idea behind DECADE, described as "a dataset of ego-centric videos from a dog's perspective," is to directly model the behavior of intelligent beings based on how they see and move around within the real world. Vision and movement data from a single dog--an Alaskan Malamute named Kelp M. Redmon--proved capable of training off-the-shelf deep learning algorithms to predict how dogs might react to different situations, such as seeing the owner holding a bag of treats or throwing a ball. "The near-term application would be to model the behavior of the dog and try to make an actual robot dog using this data," said Kiana Ehsani, a PhD student in computer science at the University of Washington in Seattle.
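To make the setup concrete: DECADE pairs windows of ego-centric video with the dog's subsequent movements, and a model learns that mapping. The researchers train deep networks; the toy nearest-neighbor stand-in below only illustrates the input/output structure, and every feature vector and label in it is made up for illustration.

```python
import math

def predict_next_action(window, dataset):
    """Toy behavior predictor: given a window of per-frame feature vectors
    (standing in for ego-centric video features), return the recorded action
    whose feature window is closest in Euclidean distance. A deep network
    would replace this lookup in the real system."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2
                             for frame_a, frame_b in zip(a, b)
                             for x, y in zip(frame_a, frame_b)))
    return min(dataset, key=lambda sample: dist(sample[0], window))[1]

# Tiny made-up dataset: (feature window, next joint-movement label) pairs.
dog_data = [
    ([[0.0, 0.0], [0.0, 1.0]], "sit"),
    ([[1.0, 1.0], [1.0, 0.0]], "run"),
]
```

A query window close to the first sample would return "sit"; the point is only that the learning problem maps recent visual context to the dog's next movement.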
It takes a lot of practice to fly a drone with confidence. Whether it's a multirotor or a fixed-wing drone, there are a lot of complicated things going on all at once, and most of the control systems are not even a little bit intuitive. The first-person viewpoint afforded by drone-mounted cameras and VR headsets helps, but you're still stuck with trying to use a couple of movable sticks to manage a flying robot, which takes both experience and concentration. EPFL has developed a much better system for drone control, taking away the sticks and replacing them with intuitive and comfortable movements of your entire body. It's an upper-body soft exoskeleton called FlyJacket, and with it on, you can pilot a fixed-wing drone by embodying the drone--put your arms out like wings, and pitching or rolling your body will cause the drone to pitch or roll, all while you experience it directly in immersive virtual reality.
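The body-to-drone mapping lends itself to a short sketch. The dead zone, gain, and limits below are illustrative assumptions, not FlyJacket's published parameters: torso pitch and roll (which a real system would read from the jacket's motion sensors) are shaped into bounded attitude setpoints for the fixed-wing drone.

```python
import math
from dataclasses import dataclass

@dataclass
class AttitudeCommand:
    pitch_deg: float
    roll_deg: float

def body_to_drone_command(torso_pitch_deg, torso_roll_deg,
                          dead_zone_deg=3.0, gain=0.8, limit_deg=30.0):
    """Map torso lean angles to drone pitch/roll setpoints (hypothetical
    parameters): a dead zone ignores small posture noise, a proportional
    gain scales the lean, and saturation respects the drone's limits."""
    def shape(angle_deg):
        if abs(angle_deg) < dead_zone_deg:
            return 0.0
        sign = math.copysign(1.0, angle_deg)
        effective = (abs(angle_deg) - dead_zone_deg) * gain
        return sign * min(effective, limit_deg)
    return AttitudeCommand(shape(torso_pitch_deg), shape(torso_roll_deg))
```

The dead zone matters in practice: without it, the operator's breathing and small postural sway would constantly jitter the drone.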
Oh yes, this is an excellent idea. Cassie Blue is controlling the motion of the Segway by body lean, just as a human rider would do.
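Lean-based Segway control can be sketched as a proportional mapping from lean angle to a speed command; the gain and speed limit below are invented for illustration, not anything from the Cassie Blue work.

```python
def lean_to_speed(lean_deg, gain=0.1, v_max=1.5):
    """Hypothetical rider model: forward speed command proportional to the
    rider's lean angle (degrees), saturated at the platform's top speed."""
    return max(-v_max, min(v_max, gain * lean_deg))
```

Leaning forward commands forward motion, leaning back commands reverse, and standing upright commands zero, which is why the same interface works for a human rider or a bipedal robot.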
Robots make fantastic remote-sensing systems, ideal for sending in to disaster areas or for search-and-rescue. Drones in particular can move rapidly over large areas or through structures, identifying damage or looking for survivors by sending a video feed from their on-board cameras to a remote operator. While the data that drones provide can be invaluable, managing them can be quite difficult, especially once they get beyond line-of-sight. Researchers from Graz University of Technology, in Styria, Austria, led by Okan Erat, want to change the way we interface with drones, using augmented reality to turn them from complicated flying robots into remote cameras that an untrained user can easily control. Through a HoloLens--Microsoft's mixed reality head-mounted display--a drone can enable a sort of X-ray vision, allowing you to see straight through walls and making controlling the drone as easy as grabbing a virtual drone and putting it exactly where you want it to be.
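The "grab the virtual drone and place it" interaction ultimately reduces to sending the drone a position setpoint. Here is a minimal sketch of one go-to-waypoint control tick; the function name, gains, and tolerances are assumptions for illustration, not the Graz system's actual flight stack.

```python
import math

def step_toward_waypoint(pos, waypoint, v_max=0.5, dt=0.1, tol=0.05):
    """One control tick of a toy waypoint follower: advance the drone's
    position toward the AR-placed goal at a bounded speed (v_max, m/s).
    Returns (new_position, reached)."""
    delta = [w - p for p, w in zip(pos, waypoint)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist < tol:
        return list(waypoint), True
    scale = min(1.0, v_max * dt / dist)  # cap the step at v_max * dt meters
    return [p + d * scale for p, d in zip(pos, delta)], False
```

The operator never sees this loop: they just move the virtual drone in the headset, and the controller repeatedly steps the real drone toward wherever the hologram was placed.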
In 2010, the U.S. House of Representatives passed resolution H.Res. 1055 to make the second week of April officially National Robotics Week. Now celebrating its eighth year, National Robotics Week is more national and more robotics-y than ever, with hundreds of events taking place all over the country. We know that most of you live robotics every single day anyway (as you should), and so National Robotics Week might not seem like it's worth celebrating, but what about your friends and family who have no idea how cool robots are, and who maybe have no idea what it is that you actually, you know, do? Fundamentally, NRW is all about celebrating how cool robotics is, and getting as many people involved as possible--think about finding an event near you that looks like fun, and then dragging someone along who doesn't (yet) understand why robotics is the best thing ever. They'll either thank you for it, or think you're crazy, but it's a win either way, right?
Only try this at home on April 1. You might remember the Flowbee hair-cutting vacuum device from commercials back in the late 1980s.
A diving trip to the Great Barrier Reef may have unlocked a new way to build a GPS-like sensor that works underwater. The device is based on recent scientific understanding of how marine animals sense their geolocation based on the signature polarization patterns of light entering the water. A few years ago, U.S. and Australian researchers developed a special camera inspired by the eyes of mantis shrimp that can see the polarization patterns of light waves, which resemble those in a rope being waved up and down. That means the bio-inspired camera can detect how light polarization patterns change once the light enters the water and gets deflected or scattered. Those researchers now realize that they can use those underwater polarization patterns to deduce the sun's position--and use that to figure out the location of the camera itself.
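The inversion the researchers describe, from a measured sun direction back to a position, can be sketched with a simplified solar ephemeris and a brute-force search. Everything below (the ephemeris approximation, the grid resolution, the angle conventions) is an assumption for illustration, not the team's actual method.

```python
import math

def sun_position(lat_deg, lon_deg, day_of_year, utc_hours):
    """Simplified solar ephemeris (rough, ~1 degree class): returns the
    sun's (elevation_deg, azimuth_deg) for a location and UTC time."""
    lat = math.radians(lat_deg)
    dec = math.radians(-23.44 * math.cos(
        math.radians(360.0 / 365.0 * (day_of_year + 10))))
    solar_time = utc_hours + lon_deg / 15.0       # crude local solar time
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    sin_el = (math.sin(lat) * math.sin(dec) +
              math.cos(lat) * math.cos(dec) * math.cos(hour_angle))
    el = math.asin(max(-1.0, min(1.0, sin_el)))
    cos_az = ((math.sin(dec) - math.sin(el) * math.sin(lat)) /
              (math.cos(el) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:                            # afternoon: sun to the west
        az = 2.0 * math.pi - az
    return math.degrees(el), math.degrees(az)

def locate_from_sun(el_meas, az_meas, day_of_year, utc_hours, step=1.0):
    """Invert the ephemeris by grid search: find the (lat, lon) whose
    predicted sun position best matches the measured one."""
    best, best_err = None, float("inf")
    lat = -60.0
    while lat <= 60.0:
        lon = -180.0
        while lon < 180.0:
            el, az = sun_position(lat, lon, day_of_year, utc_hours)
            d_az = (az - az_meas + 180.0) % 360.0 - 180.0  # wrap azimuth
            err = (el - el_meas) ** 2 + d_az ** 2
            if err < best_err:
                best, best_err = (lat, lon), err
            lon += step
        lat += step
    return best
```

The underwater twist is in where `el_meas` and `az_meas` come from: the mantis-shrimp-inspired camera recovers the sun's direction from the polarization pattern of the refracted light rather than by imaging the sun directly.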
In the United States, there are over 5 million young adults between the ages of 18 and 35 living alone, and that number is growing. While many of them may be living alone by choice, living alone can also be socially isolating, if you're into that whole being social thing. The situation is similar in many other countries, especially in Asia. There are plenty of robots under development (and even available) for elderly people with social isolation issues, but younger people are expected to, uh, just go outside or something. At the ACM/IEEE International Conference on Human-Robot Interaction last month, roboticists from Korea introduced a robot called Fribo, which is designed to provide a way for young adults who live alone to maintain daily connections with one another.