Last month, we wrote about autonomous quadrotors from the University of Pennsylvania that use just a VGA camera and an IMU to navigate together in swarms. Without relying on external localization or GPS, quadrotors like these have much more potential to be real-world useful, since they can operate without expensive and complex infrastructure, even indoors.
The vast majority of the fancy autonomous flying we've seen from quadrotors has relied on some kind of external localization for position information. Usually it's a motion capture system, sometimes it's GPS, but either way, there's a little bit of cheating involved. This is not to say that we mind cheating, but the problem with cheating is that sometimes you can't cheat, and if you want your quadrotors to do tricks where you don't have access to GPS or the necessary motion capture hardware and software, you're out of luck.
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. A new RoboBee from Harvard can swim underwater, and then launch itself into the air with a microrocket and fly away. At the millimeter scale, the water's surface might as well be a brick wall.
HAX, the hardware startup investor and accelerator, along with Airbus, is looking for startups to join a four-month accelerator program aimed at advancing urban air mobility. "Transportation in megacities needs fresh ideas to improve the way we live," said Mathias Thomsen, urban air mobility general manager at Airbus, in a press statement. "We believe that adding the vertical dimension to urban mobility will improve the current congested megacity transport systems." The selected startups will receive at least $100,000 in seed money and spend four months in Shenzhen, China, turning their ideas into prototypes with help from HAX and Airbus engineers. Applications can be submitted here.
During the Hands Free Hectare project, no human set foot on the field between planting and harvest--everything was done by robots. To make decisions about the crop, robot scouts (including drones and ground robots) surveyed the field from time to time, sending back measurements and bringing back samples for humans to examine from the comfort of someplace warm and dry and clean. With fully autonomous farm vehicles, you can use a bunch of smaller ones much more effectively than a few larger ones--the opposite of the trend toward ever-larger machines that prevails when each vehicle needs a human sitting in the driver's seat. Robots are only going to get more affordable and efficient at this sort of thing, and our guess is that it won't be long before fully autonomous farming surpasses conventional farming methods in both overall output and sustainability.
Dean Kamen's DEKA R&D firm, with support from DARPA's Revolutionizing Prosthetics Program, designed the advanced prosthetic LUKE Arm to give amputees "dexterous arm and hand movement through a simple, intuitive control system." The LUKE Arm, which stands for Life Under Kinetic Evolution but is also a reference to Luke Skywalker's bionic hand, "allows users to control multiple joints simultaneously and provides a variety of grips and grip forces by means of wireless signals generated by sensors worn on the feet or via other easy-to-use controllers."
The week is almost over, and so is the 2017 IEEE International Conference on Robotics and Automation (ICRA) in Singapore. We hope you've been enjoying our coverage, which has featured aquatic drones, stone-stacking manipulators, and self-folding soft robots. We'll have lots more from the conference over the next few weeks, but for you impatient types, we're cramming Video Friday this week with a special selection of ICRA videos. We tried to include videos from many different subareas of robotics: control, vision, locomotion, machine learning, aerial vehicles, humanoids, actuators, manipulation, and human-robot interaction. We're posting the abstracts along with the videos, but if you have any questions about these projects, let us know and we'll get more details from the authors. Have a great weekend everyone! This letter presents a physical human–robot interaction scenario in which a robot guides and performs the role of a teacher within a defined dance training framework.
If you take a common brown rat and drop it into a lab maze or a subway tunnel, it will immediately begin to explore its surroundings, sniffing around the edges, brushing its whiskers against surfaces, peering around corners and obstacles. After a while, it will return to where it started, and from then on, it will treat the explored terrain as familiar. Roboticists have long dreamed of giving their creations similar navigation skills. To be useful in our environments, robots must be able to find their way around on their own. Some are already learning to do that in homes, offices, warehouses, hospitals, hotels, and, in the case of self-driving cars, entire cities. Despite the progress, though, these robotic platforms still struggle to operate reliably under even mildly challenging conditions.
Communication with a robot using brain activity from a human collaborator could provide a direct and fast feedback loop that is easy and natural for the human, thereby enabling a wide variety of intuitive interaction tasks. This paper explores the application of EEG-measured error-related potentials (ErrPs) to closed-loop robotic control.