Machine learning helps robot swarms coordinate

#artificialintelligence

Engineers at Caltech have designed a new data-driven method to control the movement of multiple robots through cluttered, unmapped spaces so they do not run into one another.

Multi-robot motion coordination is a fundamental robotics problem with wide-ranging applications, from urban search and rescue to the control of fleets of self-driving cars to formation flying in cluttered environments. Two key challenges make multi-robot coordination difficult: first, robots moving in new environments must make split-second decisions about their trajectories despite having incomplete data about their future paths; second, larger numbers of robots in an environment make their interactions increasingly complex (and more prone to collisions).

To overcome these challenges, Soon-Jo Chung, Bren Professor of Aerospace, and Yisong Yue, professor of computing and mathematical sciences, along with Caltech graduate student Benjamin Rivière (MS '18), postdoctoral scholar Wolfgang Hönig, and graduate student Guanya Shi, developed a multi-robot motion-planning algorithm called "Global-to-Local Safe Autonomy Synthesis," or GLAS, which imitates a complete-information planner using only local information, and "Neural-Swarm," a swarm-tracking controller augmented to learn complex aerodynamic interactions in close-proximity flight.

"Our work shows some promising results to overcome the safety, robustness, and scalability issues of conventional black-box artificial intelligence (AI) approaches for swarm motion planning with GLAS and close-proximity control for multiple drones using Neural-Swarm," says Chung.
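
To make the phrase "imitates a complete-information planner with only local information" concrete, here is a minimal sketch of the idea, not the authors' implementation: a decentralized policy that maps each robot's local observation (relative positions of nearby robots and obstacles, plus its own goal direction) to a velocity command. Every name, constant, and the hand-coded policy below is a hypothetical stand-in; in GLAS the policy is a neural network trained to imitate a global planner and paired with a safety module.

# Illustrative sketch only (hypothetical names, not the published GLAS code).
# Each robot acts on purely LOCAL information: what it can sense within
# SENSE_RADIUS, plus the direction to its own goal.
import numpy as np

SENSE_RADIUS = 2.0   # robots only "see" neighbors/obstacles within this range
MAX_SPEED = 0.5      # velocity commands are clipped to this magnitude

def local_observation(i, positions, obstacles, goal):
    """Build robot i's local view: goal direction plus relative positions of
    nearby robots and obstacles."""
    p = positions[i]
    neighbors = [q - p for j, q in enumerate(positions)
                 if j != i and np.linalg.norm(q - p) < SENSE_RADIUS]
    nearby = [o - p for o in obstacles if np.linalg.norm(o - p) < SENSE_RADIUS]
    return goal - p, neighbors, nearby

def policy(obs):
    """Hand-coded stand-in for the learned policy: attract to the goal, repel
    from whatever is nearby. A GLAS-style policy would instead be a network
    trained to reproduce a complete-information planner's decisions from
    exactly this kind of local observation."""
    to_goal, neighbors, nearby = obs
    v = np.array(to_goal, dtype=float)
    for rel in neighbors + nearby:
        d = np.linalg.norm(rel) + 1e-6
        v -= rel / d**2                      # simple repulsive term
    speed = np.linalg.norm(v)
    return v * (min(speed, MAX_SPEED) / speed) if speed > 0 else v

# One decentralized step for a toy three-robot swarm sharing a goal region:
# all commands are computed first, then applied, so no robot uses another's plan.
positions = np.array([[0.0, 0.0], [0.5, 0.2], [0.1, 0.6]])
obstacles = [np.array([1.0, 0.5])]
goal, dt = np.array([2.0, 1.0]), 0.1
commands = [policy(local_observation(i, positions, obstacles, goal))
            for i in range(len(positions))]
positions = positions + dt * np.array(commands)
print(positions)

The point of the sketch is the interface, not the numbers: each robot's command depends only on what it can observe locally, which is what lets the approach scale to larger swarms and unmapped spaces.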


Machine Learning Helps Robot Swarms Coordinate - ScienceBlog.com

#artificialintelligence

When GLAS and Neural-Swarm are used, a robot does not require a complete and comprehensive picture of the environment that it is moving through, or of the path its fellow robots intend to take.


Machine Learning Helps Robot Swarms Coordinate

#artificialintelligence

To test their new systems, Chung's and Yue's teams implemented GLAS and Neural-Swarm on quadcopter swarms of up to 16 drones and flew them in the open-air drone arena at Caltech's Center for Autonomous Systems and Technologies (CAST). The teams found that GLAS could outperform the current state-of-the-art multi-robot motion-planning algorithm by 20 percent in a wide range of cases. Meanwhile, Neural-Swarm significantly outperformed a commercial controller that does not account for aerodynamic interactions; tracking errors, a key metric of how well the drones orient themselves and track desired positions in three-dimensional space, were up to four times smaller when the new controller was used.

Their research appears in two recently published studies. "GLAS: Global-to-Local Safe Autonomy Synthesis for Multi-Robot Motion Planning with End-to-End Learning" was published in IEEE Robotics and Automation Letters on May 11 by Chung, Yue, Rivière, and Hönig. "Neural-Swarm: Decentralized Close-Proximity Multirotor Control Using Learned Interactions" was published in the Proceedings of the IEEE International Conference on Robotics and Automation on June 1 by Chung, Yue, Shi, and Hönig.
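
The tracking-error comparison above can be illustrated with a toy version of the Neural-Swarm idea: a nominal controller plus a learned residual term that cancels the aerodynamic disturbance (downwash) a drone feels when a neighbor flies close above it. This is a minimal sketch under made-up dynamics and gains, not the published controller; the "learned" model below is a hand-written stand-in, and all names and values are hypothetical.

# Illustrative sketch only: nominal PD control with and without a residual
# correction for a neighbor's downwash. All names/values are hypothetical.
import numpy as np

KP, KD, MASS, G = 4.0, 2.5, 0.03, 9.81   # toy PD gains and quadrotor mass (kg)

def interaction_force(rel_neighbor_pos):
    """Stand-in for the learned interaction model: predicted downwash force (N)
    from a neighbor at the given relative position. Neural-Swarm learns such a
    term from real flight data; this is a crude hand-made decay."""
    dist = np.linalg.norm(rel_neighbor_pos) + 1e-6
    overhead = max(0.0, rel_neighbor_pos[2])   # stronger when the neighbor is above
    return np.array([0.0, 0.0, -0.05 * overhead / dist**2])

def control(pos, vel, pos_des, rel_neighbor_pos, use_residual):
    """PD tracking force plus gravity compensation; optionally subtract the
    predicted interaction force so it is canceled before it perturbs the drone."""
    f = MASS * (KP * (pos_des - pos) - KD * vel) + np.array([0.0, 0.0, MASS * G])
    if use_residual:
        f -= interaction_force(rel_neighbor_pos)
    return f

def mean_tracking_error(use_residual):
    """Hover with a neighbor 0.3 m directly above; return mean position error (m)."""
    pos, vel, pos_des = np.zeros(3), np.zeros(3), np.zeros(3)
    rel_neighbor = np.array([0.0, 0.0, 0.3])
    dt, errors = 0.01, []
    for _ in range(500):
        disturbance = interaction_force(rel_neighbor)   # the "true" downwash in this toy world
        f = control(pos, vel, pos_des, rel_neighbor, use_residual)
        acc = (f + disturbance - np.array([0.0, 0.0, MASS * G])) / MASS
        vel = vel + dt * acc
        pos = pos + dt * vel
        errors.append(np.linalg.norm(pos - pos_des))
    return float(np.mean(errors))

print("mean tracking error, baseline controller:  ", mean_tracking_error(False))
print("mean tracking error, with learned residual:", mean_tracking_error(True))

Because the toy "learned" model matches the toy disturbance exactly, the residual controller tracks perfectly in this sketch; in the real system the learned model only approximates the true aerodynamics, which is why the reported improvement is a factor of up to four rather than exact cancelation.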


AI helps drone swarms navigate through crowded, unfamiliar spaces

#artificialintelligence

Drone swarms frequently fly outside for a reason: it's difficult for the robotic fliers to navigate in tight spaces without hitting each other. Caltech researchers may have a way for those drones to fly indoors, however. They've developed a machine learning algorithm, Global-to-Local Safe Autonomy Synthesis (GLAS), that lets swarms navigate crowded, unmapped environments. The system works by giving each drone a degree of independence that lets it adapt to a changing environment. Instead of relying on existing maps or the routes of every other drone in the swarm, GLAS has each machine learn how to navigate a given space on its own, even as it coordinates with the others.