Last month, we wrote about autonomous quadrotors from the University of Pennsylvania that use just a VGA camera and an IMU to navigate together in swarms. Without relying on external localization or GPS, quadrotors like these have much more potential to be real-world useful, since they can operate without expensive and complex infrastructure, even indoors.
The vast majority of the fancy autonomous flying we've seen from quadrotors has relied on some kind of external localization for position information. Usually it's a motion capture system, sometimes it's GPS, but either way, there's a little bit of cheating involved. This is not to say that we mind cheating, but the problem with cheating is that sometimes you can't cheat, and if you want your quadrotors to do tricks where you don't have access to GPS or the necessary motion capture hardware and software, you're out of luck.
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. A new RoboBee from Harvard can swim underwater, and then launch itself into the air with a microrocket and fly away. At the millimeter scale, the water's surface might as well be a brick wall.
HAX, the hardware startup investor and accelerator, along with Airbus, is looking for startups to join a four-month accelerator program aimed at advancing developments in urban air mobility. "Transportation in megacities needs fresh ideas to improve the way we live," said Mathias Thomsen, urban air mobility general manager at Airbus, in a press statement. "We believe that adding the vertical dimension to urban mobility will improve the current congested megacity transport systems." The selected startups will receive at least $100,000 in seed money, and spend four months in Shenzhen, China, turning their ideas into prototypes with help from HAX and Airbus engineers. Applications can be submitted here.
During the Hands Free Hectare project, no human set foot on the field between planting and harvest: everything was done by robots. To inform those decisions, robot scouts (including drones and ground robots) surveyed the field from time to time, sending back measurements and bringing back samples for humans to have a look at from the comfort of someplace warm and dry and clean. With fully autonomous farm vehicles, you can use a bunch of smaller ones much more effectively than a few larger ones, reversing the trend toward ever-bigger machines that only makes sense when each vehicle needs a human sitting in the driver's seat. Robots are only going to get more affordable and efficient at this sort of thing, and our guess is that it won't be long before fully autonomous farming surpasses conventional farming methods in both overall output and sustainability.
Dean Kamen's DEKA R&D firm, with support from DARPA's Revolutionizing Prosthetics Program, designed the advanced prosthetic LUKE Arm to give amputees "dexterous arm and hand movement through a simple, intuitive control system."

A series of research flights at NASA's Dryden (now Armstrong) Flight Research Center in the summer of 2005 validated the premise that using thermal lift could significantly extend the range and endurance of small unmanned air vehicles (UAVs) without a corresponding increase in fuel requirements.

This 1-minute, 53-second video, taken on October 1, 2011, shows the NASA Dryden (now Armstrong) Flight Research Center's Dryden Remotely Operated Integrated Drone (DROID) sub-scale test bed aircraft, which is moving up to the flight test big leagues! The center's Automatic Collision Avoidance Technology team conducted test flights of new software architecture on the radio-controlled large model aircraft to demonstrate that even the simplest flight systems may benefit from Automatic Ground Collision Avoidance Software (GCAS).
I suppose you could decide that this project from MIT's Tangible Media Group isn't really a robot, but I think it's arguably robotic enough (and definitely cool enough) that we can let it slide for this week: "We present AnimaStage: a hands-on animated craft platform based on an actuated stage."

At the end of every semester, UC Berkeley has a design showcase in Jacobs Hall. My modified Racing Roomba takes on the obstacle course at UC Berkeley's annual student vehicle challenge.

If so, they didn't put it on this table: "Two modules of the EJBot propeller-type climbing robot, which uses a hybrid actuation system."
These results validate the performance of aerial grasping based on our proposed whole-body grasp planning and motion control method.

However, for most vehicles, high performance over rough terrain reduces the travel speed and/or requires complex mechanisms.

We extend guided policy search (GPS) in the following ways: (1) we propose the use of a model-free local optimizer based on path integral stochastic optimal control (PI2), which enables us to learn local policies for tasks with highly discontinuous contact dynamics; and (2) we enable GPS to train on a new set of task instances in every iteration by using on-policy sampling: this increases the diversity of the instances that the policy is trained on, and is crucial for achieving good generalization.

To increase the spike decision rates, iterative spiking training with actual blockers is required.
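The core of a PI2-style update is a reward-weighted average of exploration noise: sample noisy rollouts of the policy parameters, score them, and recombine the noise with softmax weights so that low-cost rollouts dominate. Here is a minimal toy sketch of that update loop (my own illustrative example, not the paper's implementation; the quadratic cost, the annealing schedule, and all parameter values are assumptions):

```python
import numpy as np

# Toy PI2-style (path integral) policy update on a made-up quadratic cost.
# Illustrative sketch only: the cost function and hyperparameters are assumptions.
rng = np.random.default_rng(0)

def cost(theta):
    # Hypothetical cost with its optimum at theta = [1, -2]
    return float(np.sum((theta - np.array([1.0, -2.0])) ** 2))

theta = np.zeros(2)            # policy parameters
sigma, lam, K = 0.5, 0.1, 64   # exploration std, temperature, rollouts per iteration

for _ in range(100):
    eps = rng.normal(0.0, sigma, size=(K, 2))       # exploration noise
    costs = np.array([cost(theta + e) for e in eps])
    # Softmax weighting: low-cost rollouts get exponentially larger weight
    w = np.exp(-(costs - costs.min()) / lam)
    w /= w.sum()
    theta = theta + w @ eps                         # reward-weighted noise average
    sigma *= 0.97                                   # anneal exploration over time

print(np.round(theta, 2))      # converges near [1, -2]
```

Because the update never differentiates the cost, it tolerates the highly discontinuous contact dynamics mentioned above, which is exactly why a model-free optimizer like PI2 is attractive there.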
While running a SLAM algorithm, a robot can explore unfamiliar terrain, building a map of its surroundings while at the same time positioning, or localizing, itself within that map. Wyeth had long been interested in brain-inspired computing, starting with work on neural networks in the late 1980s. He and Milford didn't aim to create maps built with costly lidars and high-powered computers; they wanted their system to make sense of space the way animals do. To mimic this structure and behavior in software, Milford adopted a type of artificial neural network called an attractor network.
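The defining property of an attractor network is that a localized "bump" of neural activity persists on its own, held in place by local excitation and global inhibition, which lets it represent the robot's current pose. A minimal ring-attractor sketch (my own illustrative toy, not RatSLAM's actual code; the network size, kernel width, and inhibition strength are assumptions):

```python
import numpy as np

# Toy ring attractor: N neurons on a ring, local excitation minus global
# inhibition. Illustrative sketch only; all parameter values are assumptions.
N = 100
idx = np.arange(N)
# Distance on the ring between every pair of neurons
d = np.minimum(np.abs(idx[:, None] - idx[None, :]),
               N - np.abs(idx[:, None] - idx[None, :]))
W = np.exp(-(d ** 2) / (2 * 5.0 ** 2)) - 0.05   # excitation - inhibition

a = np.zeros(N)
a[50] = 1.0                                      # seed a bump at neuron 50
for _ in range(50):
    a = np.maximum(W @ a, 0.0)                   # recurrent update + rectification
    a /= a.max()                                 # keep activity bounded

print(int(np.argmax(a)))  # the bump persists, centered near neuron 50
```

In a pose-tracking context, asymmetric input driven by odometry would shift this bump around the ring, which is the behavior the attractor dynamics make robust to noise.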