Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!). Let us know if you have suggestions for next week, and enjoy today's videos. A new RoboBee from Harvard can swim underwater, and then launch itself into the air with a microrocket and fly away. At the millimeter scale, the water's surface might as well be a brick wall.
The MIT researchers' secret to peeking around corners is detecting slight differences in light patterns reflected from moving objects or people. MIT's "CornerCameras" system can reveal the number of moving people or objects as individual lines on a graph that tracks angular velocity over time. Laser-based systems can detect even stationary objects with fairly high precision, whereas the new MIT CornerCameras system can only detect moving objects. The MIT CornerCameras system is fairly simple and needs nothing more than a basic webcam or an iPhone 5s camera, along with a laptop to run the software.
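The core idea behind CornerCameras, reading faint intensity changes near the corner as a function of angle and time, can be sketched in a few lines. This is a toy one-dimensional version under my own assumptions (the real system works on raw video and does substantially more signal processing): bin the floor patch near the corner into angular wedges, subtract the static-scene baseline, and look for ridges drifting across the angle axis.

```python
import numpy as np

def corner_traces(frames):
    """Toy 1-D sketch of corner-camera imaging.

    frames: (T, N) array -- each row is the floor patch near the corner,
            already binned into N angular wedges around the corner edge.
    Returns a (T, N) array of baseline-subtracted intensities; a hidden
    moving object shows up as a ridge drifting along the angle axis,
    which is the "line on a graph" described above.
    """
    frames = np.asarray(frames, dtype=float)
    baseline = frames.mean(axis=0, keepdims=True)  # estimate of the static scene
    return frames - baseline
```

Feeding this a synthetic sequence in which one wedge brightens slightly at each time step produces a diagonal trace, one moving object, one line.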
Dean Kamen's DEKA R&D firm, with support from DARPA's Revolutionizing Prosthetics Program, designed the advanced prosthetic LUKE Arm to give amputees "dexterous arm and hand movement through a simple, intuitive control system." A series of research flights at NASA's Dryden (now Armstrong) Flight Research Center in the summer of 2005 validated the premise that using thermal lift could significantly extend the range and endurance of small unmanned air vehicles (UAVs) without a corresponding increase in fuel requirements. This 1-minute, 53-second video, taken on October 1, 2011, shows the NASA Dryden (now Armstrong) Flight Research Center's Dryden Remotely Operated Integrated Drone (DROID) sub-scale test bed aircraft moving up to the flight test big leagues! The center's Automatic Collision Avoidance Technology team conducted test flights of new software architecture on the radio-controlled large model aircraft to demonstrate that even the simplest flight systems may benefit from Automatic Ground Collision Avoidance System (Auto GCAS) software.
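The thermal-soaring premise behind those 2005 flights reduces, at its simplest, to a mode switch: cruise until the aircraft senses rising air, then circle to stay in it. The thresholds and the two-state logic below are illustrative assumptions on my part, not NASA's actual controller, which estimated lift from total-energy state.

```python
def soaring_mode(climb_rate_mps, mode, enter=0.5, leave=0.0):
    """Hysteresis switch between CRUISE and THERMAL flight modes.

    climb_rate_mps: vertical speed estimate (positive = rising air).
    The 0.5 m/s entry and 0.0 m/s exit thresholds are placeholder
    values for illustration only.
    """
    if mode == "CRUISE" and climb_rate_mps > enter:
        return "THERMAL"   # circle to stay inside the rising air
    if mode == "THERMAL" and climb_rate_mps < leave:
        return "CRUISE"    # lift gone; resume course toward the goal
    return mode
```

Every minute spent climbing in a thermal is propulsive energy the UAV doesn't have to spend, which is where the endurance gain comes from.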
In January, we wrote about a cybernetic micro air vehicle under development at Draper called DragonflEye. The backpack interfaces directly with the dragonfly's nervous system to control it, and uses tiny solar panels to harvest enough energy to power itself without the need for batteries. The unique thing about DragonflEye (relative to other cyborg insects) is that it doesn't rely on spoofing the insect's sensors or controlling its muscles, but instead uses optical electrodes to inject steering commands directly into the insect's nervous system, which has been genetically tweaked to accept them. This means that the dragonfly can be controlled to fly where you want, without sacrificing the built-in flight skills that make insects the envy of all other robotic micro air vehicles.
Even aircraft designed to hover, like helicopters and quadrotors, have preferential directions of orientation and travel where their particular arrangement of motors and control surfaces makes them most effective. ETH Zurich's Omnicopter goes about flying in a totally different way. We have developed a computationally efficient trajectory generator for six degrees-of-freedom multirotor vehicles, i.e., vehicles that can independently control their position and attitude. The fetching work comes from Dario Brescianini and Raffaello D'Andrea at the Institute for Dynamic Systems and Control (IDSC), ETH Zurich, Switzerland.
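Because the Omnicopter can control all six degrees of freedom independently, a trajectory generator can plan each axis separately with a closed-form motion primitive. The quintic below is a generic rest-to-rest minimum-jerk primitive, a common textbook choice, not necessarily the exact form used by the ETH generator:

```python
def min_jerk(x0, xf, T, t):
    """Rest-to-rest minimum-jerk position at time t along one axis.

    Classic quintic: x(t) = x0 + (xf - x0) * (10 s^3 - 15 s^4 + 6 s^5),
    with s = t / T. Velocity and acceleration are zero at both endpoints,
    so primitives can be chained smoothly. For a six-DoF vehicle, the
    same formula can be evaluated independently for each position and
    attitude coordinate.
    """
    s = t / T
    return x0 + (xf - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)
```

Being closed form, it can be evaluated (and re-planned) thousands of times per second, which is the kind of computational efficiency the quoted abstract is after.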
We decode ErrP signals from a human operator in real time to control a Rethink Robotics Baxter robot during a binary object selection task. The KONE video was produced to support KONE's 24/7 Connected Services, which uses the IBM Watson IoT platform and other advanced technologies to bring intelligent services to elevators and escalators. This video presents the robotic system designed by Team NimbRo Picking for the Amazon Picking Challenge 2016. In this talk, Shuo will review some key technologies DJI has developed, then talk about RoboMasters, a robotics competition that uses these technologies to nurture next generation engineers.
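The elegance of the ErrP approach is that, for a binary task, the control logic is trivial once the decoder exists: if the operator's brain produces an error-related potential after the robot commits to an object, flip the choice. The sketch below assumes a hypothetical `classifier` callable standing in for the trained EEG decoder; the actual decoding pipeline is the hard part and is not shown.

```python
def errp_corrected_choice(robot_choice, eeg_epoch, classifier):
    """Binary object selection with ErrP feedback.

    robot_choice: 0 or 1, the object the robot reached for.
    eeg_epoch: EEG samples time-locked to the reach onset.
    classifier: callable returning True if the epoch looks like an
                error-related potential. This is a stand-in -- the real
                system trains its own real-time EEG decoder.
    """
    if classifier(eeg_epoch):
        return 1 - robot_choice  # the human disagreed: switch objects
    return robot_choice
```

Because the human never issues an explicit command, the interaction cost is essentially zero: the operator just watches.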
These autonomous ground robots can be deployed to establish a mobile microgrid. Muscular activity contains information on motion intention. By decoding the muscular activity of an arm during reaching-to-grasp motions, Billard Lab was able to detect grasp type in the early stages of a reaching motion, which enables fast activation of a robotic hand by teleoperation.
Our goal is to develop AirSim as a platform for AI research to experiment with deep learning, computer vision and reinforcement learning algorithms for autonomous vehicles.
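The early-detection idea, classifying grasp type from muscle activity before the hand arrives, can be sketched with standard EMG machinery: compute per-channel RMS features over only the earliest window of the reach, then match against per-grasp templates. The windowing, feature choice, and nearest-centroid classifier here are illustrative assumptions, not the Billard Lab pipeline.

```python
import numpy as np

def emg_rms_features(emg, win=50):
    """Per-channel RMS over the first `win` samples of a reach.

    emg: (T, C) array of rectified EMG. Using only the earliest window
    mirrors the early-detection idea: classify before the reach ends.
    """
    w = np.asarray(emg[:win], dtype=float)
    return np.sqrt((w ** 2).mean(axis=0))

def classify_grasp(features, centroids):
    """Nearest-centroid grasp type; centroids maps name -> feature vector."""
    return min(centroids, key=lambda k: np.linalg.norm(features - centroids[k]))
```

Classifying from the first fraction of the motion is what buys the time needed to pre-shape and close the teleoperated hand in sync with the user's own arm.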
Making a fully autonomous delivery robot (whether it's flying or not) is a very hard problem. Rather than try to develop a fully autonomous delivery robot from scratch, PFF is instead starting with something simpler: A pleasingly roundish robot called Gita ("gee-tah") that will follow you around, carrying 19 kilograms of tools, groceries, or whatever you want. But PFF also wants Gita to eventually be able to navigate completely by itself, even if the user isn't nearby (a capability that would let the robot make deliveries, like the autonomous robot delivery service being developed by London startup Starship Technologies). PFF's choice to rely solely on stereo cameras for outdoor localization in mostly unstructured environments is interesting, since there are many outdoor situations in which cameras aren't great, like at night, or looking into low sun angles.
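The reason stereo cameras struggle at night or into low sun is visible in the basic triangulation formula: depth comes from matching the same feature in both images, Z = f * B / d, and when matching fails the disparity d, and therefore the depth, is garbage. A minimal sketch (symbol names are mine):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a feature from a rectified stereo pair: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: camera separation in
    meters; disparity_px: horizontal pixel shift of the same feature
    between the left and right images. In darkness or glare, feature
    matching breaks down, so disparity -- and hence depth -- becomes
    unreliable, which is the failure mode noted above.
    """
    if disparity_px <= 0:
        raise ValueError("no valid match: disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Note also that depth error grows roughly with the square of distance (a one-pixel disparity error matters much more for far objects), another reason camera-only outdoor localization is a bold choice.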
Nuno Vasconcelos, a visual computing expert at the University of California, San Diego, says bikes pose a complex detection problem because they are relatively small, fast, and heterogeneous. Consider the Deep3DBox algorithm presented recently by researchers at George Mason University and stealth-mode robotic taxi developer Zoox, based in Menlo Park, Calif. On an industry-recognized benchmark test, which challenges vision systems with 2D road images, Deep3DBox identifies 89 percent of cars. More automakers are expected to follow suit as European auto safety regulators begin scoring AEB systems for cyclist detection next year. In December, Wiedenmeier warned that self-driving taxis deployed by Uber Technologies were violating California driving rules designed to protect cyclists from cars and trucks crossing designated bike lanes.
On the stage next to Krafcik stood a self-driving hybrid minivan from Fiat Chrysler equipped with his company's technology. A short-range lidar covers areas beside the minivan that would otherwise have been in the shadow of the rooftop sensor. Exactly how much of Waymo's self-driving prowess comes from such hardware, rather than improved software and road mapping, isn't clear. What is clear is that Waymo wants to supply the entire auto industry with packages that can be fitted to just about any vehicle.