If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. We already posted about the unveiling of Sony's new Aibo, but here's a bit of extra video from the event showing the little robotic dog in action: In this video we show a compilation of our research over the last 4 years on autonomous navigation of bipedal robots. It is part of the DFG-funded project "Versatile and Robust Walking in Uneven Terrain" (German Research Foundation) and includes developments in environment perception and modeling, motion planning, and stability control.
A new RoboBee from Harvard can swim underwater, and then launch itself into the air with a microrocket and fly away. At the millimeter scale, the water's surface might as well be a brick wall.
A revolutionary NASA Technology Demonstration Mission project called Dragonfly, designed to enable robotic self-assembly of satellites in Earth orbit, has successfully completed its first major ground demonstration. Over time, the system will integrate 3-D printing technology, enabling the automated manufacture of new antennas and even replacement reflectors as needed. Vijay Kumar kicks things off with a talk about "research to enhance tactical situational awareness in urban and complex terrain by enabling the autonomous operation of a collaborative ensemble of microsystems." Next, Sean Humbert from CU Boulder talks about developing the fundamental science, tools, and algorithms to enable mobility of heterogeneous teams of autonomous micro-platforms for tactical situational awareness.
During the Hands Free Hectare project, no human set foot on the field between planting and harvest--everything was done by robots. To make these decisions, robot scouts (including drones and ground robots) surveyed the field from time to time, sending back measurements and bringing back samples for humans to have a look at from the comfort of someplace warm and dry and clean. With fully autonomous farm vehicles, you can use a bunch of smaller ones much more effectively than a few larger ones, reversing the trend toward ever-bigger machines that prevails when each one needs a human sitting in the driver's seat. Robots are only going to get more affordable and efficient at this sort of thing, and our guess is that it won't be long before fully autonomous farming surpasses conventional farming methods in both overall output and sustainability.
Dean Kamen's DEKA R&D firm, with support from DARPA's Revolutionizing Prosthetics Program, designed the advanced prosthetic LUKE Arm to give amputees "dexterous arm and hand movement through a simple, intuitive control system." A series of research flights at NASA's Dryden (now Armstrong) Flight Research Center in the summer of 2005 validated the premise that using thermal lift could significantly extend the range and endurance of small unmanned air vehicles (UAVs) without a corresponding increase in fuel requirements. This 1-minute, 53-second video taken on October 1, 2011 shows how the NASA Dryden (now Armstrong) Flight Research Center's Dryden Remotely Operated Integrated Drone (DROID) sub-scale test bed aircraft is moving up to the flight-test big leagues! The center's Automatic Collision Avoidance Technology team conducted test flights of new software architecture on the radio-controlled large model aircraft to demonstrate that even the simplest flight systems may benefit from Automatic Ground Collision Avoidance Software (GCAS).
I suppose you could decide that this project from MIT's Tangible Media Group isn't really a robot, but I think it's arguably robotic enough (and definitely cool enough) that we can let it slide for this week: We present AnimaStage: a hands-on animated craft platform based on an actuated stage. At the end of every semester, UC Berkeley has a design showcase in Jacobs Hall. My modified Racing Roomba takes on the obstacle course at UC Berkeley's annual student vehicle challenge. If so, they didn't put it on this table: Two modules of the EJBot propeller-type climbing robot, which uses a hybrid actuation system.
We decode ErrP signals from a human operator in real time to control a Rethink Robotics Baxter robot during a binary object selection task. This video has been produced to support KONE's 24/7 Connected Services, which uses the IBM Watson IoT platform and other advanced technologies to bring intelligent services to elevators and escalators. This video presents the robotic system designed by Team NimbRo Picking for the Amazon Picking Challenge 2016. In this talk, Shuo will review some key technologies DJI has developed, then talk about RoboMasters, a robotics competition that uses these technologies to nurture next-generation engineers.
These autonomous ground robots can be deployed to establish a mobile microgrid. Muscular activity contains information on motion intention. By decoding the muscular activity of an arm during reaching-to-grasp motions, Billard Lab was able to detect grasp type in the early stages of a reaching motion, which enables fast activation of a robotic hand by teleoperation. Our goal is to develop AirSim as a platform for AI research to experiment with deep learning, computer vision, and reinforcement learning algorithms for autonomous vehicles.
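The grasp-detection idea can be illustrated with a minimal sketch: extract a simple amplitude feature (RMS) per EMG channel from an early window of the reaching motion, then classify the grasp type with a nearest-centroid rule. This is not the Billard Lab pipeline; the feature, classifier, and the two-channel toy data are all illustrative assumptions.

```python
import numpy as np

def rms_features(emg_window):
    """Root-mean-square amplitude per EMG channel (rows = time samples)."""
    return np.sqrt(np.mean(emg_window**2, axis=0))

def train_centroids(windows, labels):
    """Mean RMS feature vector per grasp type."""
    feats = np.array([rms_features(w) for w in windows])
    labels = np.array(labels)
    return {lab: feats[labels == lab].mean(axis=0) for lab in set(labels)}

def predict_grasp(centroids, emg_window):
    """Assign the grasp type whose centroid is closest in feature space."""
    f = rms_features(emg_window)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

# Toy data: two synthetic grasp types with different channel activation
# (hypothetical; real EMG would have many channels and richer features).
rng = np.random.default_rng(1)
power = [rng.normal([1.0, 0.2], 0.05, (50, 2)) for _ in range(10)]
pinch = [rng.normal([0.2, 1.0], 0.05, (50, 2)) for _ in range(10)]
cents = train_centroids(power + pinch, ["power"] * 10 + ["pinch"] * 10)
print(predict_grasp(cents, rng.normal([1.0, 0.2], 0.05, (50, 2))))
```

Because the features come from an early window rather than the full motion, a prediction like this could be issued before the hand reaches the object, which is what makes fast teleoperated hand activation possible.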
Making a fully autonomous delivery robot (whether it's flying or not) is a very hard problem. Rather than try to develop a fully autonomous delivery robot from scratch, PFF is instead starting with something simpler: a pleasingly roundish robot called Gita ("gee-tah") that will follow you around, carrying 19 kilograms of tools, groceries, or whatever you want. But PFF also wants Gita to eventually be able to navigate completely by itself, even if the user isn't nearby (a capability that would let the robot make deliveries, like the autonomous robot delivery service being developed by London startup Starship Technologies). PFF's choice to rely solely on stereo cameras for outdoor localization in mostly unstructured environments is interesting, since there are many outdoor situations in which cameras aren't great, like at night, or looking into low sun angles.
Indeed, for the robot to localize with respect to the gap, the selected trajectory must guarantee that the quadrotor always faces the gap, and it must be re-planned multiple times during execution to cope with the varying uncertainty of the state estimate (the uncertainty increases quadratically with the distance from the gap) while respecting the vehicle dynamics. It is not trivial to combine all these constraints into a single path-planning problem, since the set of feasible trajectories shrinks significantly as the quadrotor approaches the gap. To approach the gap, we use a trajectory generation method that allows us to evaluate a very large set of candidate trajectories; for each candidate trajectory, we compute the optimal vehicle orientation that aligns the quadrotor's onboard camera as closely as possible with the gap direction. Can this research be applied to other kinds of high-speed maneuvering, like avoiding tree branches, or urban obstacles like lamp posts?
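The sample-and-score idea behind evaluating a large set of candidate trajectories can be sketched as follows. This is a simplified stand-in for the actual method: the constant-acceleration trajectory model, the gap location, the sampling ranges, and the alignment score (mean angle between the direction of travel and the line of sight to the gap) are all assumptions made for illustration.

```python
import numpy as np

GAP = np.array([5.0, 0.0, 1.5])  # assumed gap location in the world frame (m)

def candidate_trajectory(v0, accel, duration=1.0, steps=20):
    """Position samples of a constant-acceleration motion from the origin."""
    t = np.linspace(0.0, duration, steps)[:, None]
    return v0 * t + 0.5 * accel * t**2

def misalignment(traj):
    """Mean angle (rad) between the travel direction and the line to the gap.

    A proxy for how well the onboard camera can be kept pointed at the gap
    along this trajectory: lower is better.
    """
    angles = []
    for i in range(1, len(traj)):
        vel = traj[i] - traj[i - 1]          # direction of travel
        to_gap = GAP - traj[i]               # where the camera should look
        denom = np.linalg.norm(vel) * np.linalg.norm(to_gap) + 1e-9
        cosang = np.dot(vel, to_gap) / denom
        angles.append(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return float(np.mean(angles))

# Sample many candidates and keep the one that best keeps the gap in view.
rng = np.random.default_rng(0)
candidates = [candidate_trajectory(rng.uniform(-1, 4, 3), rng.uniform(-2, 2, 3))
              for _ in range(500)]
best = min(candidates, key=misalignment)
```

A real planner would additionally reject candidates that violate the vehicle dynamics and re-run this selection repeatedly during flight as the state estimate improves near the gap.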