BERKELEY – When scale model helicopters pass through a makeshift "urban canyon" in a test field, or engage in a game of aerial "chicken", the drills may look like a robotic stunt show to outside eyes. Members of the university's Berkeley Aerial Robot (BEAR) program have successfully conducted a series of field tests with 130-pound helicopters that not only fly autonomously -- without human control -- but that also react to avoid obstacles in their flight path. "Our BEAR group is the first to successfully develop a system where autonomous helicopters can detect obstacles, stationary or moving, and recompute their course in real-time to reach the original target destination," said David Hyunchul Shim, a research engineer on the project who first began this work as a UC Berkeley Ph.D. student in mechanical engineering. With these achievements, the researchers are inching towards a future of robo-copters that could maneuver through city streets or forested landscapes. The development of reliable systems that can handle obstacle-avoidance tasks is still several years away, researchers said, but the computational foundations for such unmanned aerial vehicles (UAVs) have been laid.
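The detect-and-replan behavior the BEAR researchers describe can be sketched in miniature: a vehicle follows a planned path and, whenever a newly sensed obstacle blocks the remaining route, recomputes a course to the original target. The grid world, A* planner, and `fly` loop below are illustrative stand-ins, not the BEAR group's actual system.

```python
import heapq

def astar(grid, start, goal):
    """4-connected A* on a boolean grid (True = blocked); returns a list of cells."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    frontier = [(h(start), start)]
    came, cost = {start: None}, {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                ncost = cost[cur] + 1
                if ncost < cost.get(nxt, float("inf")):
                    cost[nxt], came[nxt] = ncost, cur
                    heapq.heappush(frontier, (ncost + h(nxt), nxt))
    return None

def fly(grid, start, goal, sense_and_update):
    """Follow the plan one cell at a time; replan whenever a newly
    sensed obstacle blocks the remaining path to the original target."""
    pos, path = start, astar(grid, start, goal)
    while pos != goal:
        sense_and_update(grid, pos)            # simulated obstacle detection
        if any(grid[r][c] for r, c in path):   # remaining route now blocked?
            path = astar(grid, pos, goal)      # recompute course, same destination
        path = path[1:]
        pos = path[0]
    return pos
```

Here `sense_and_update` is a hypothetical callback standing in for the helicopter's onboard obstacle sensors; a real system would replan continuously against a dynamic world model rather than a static grid.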
Picking a fight with a drone may seem a bizarre way of testing your theories, but one Stanford researcher has done just that. Ross Allen decided the perfect way to test his drone collision-avoidance system was to attack it with a sword: wearing full fencing gear, he recorded a video putting the drone through its collision-avoidance paces. Researchers are testing quadrotor drones that can dodge obstacles, and are showing off this achievement through fencing. A new video shows a human opponent taking jabs at a drone, which seems to 'see' the thrusts coming and avoids being probed. Stanford University's Department of Aeronautics and Astronautics proposes a framework that uses 'an offline-online computation paradigm, neighborhood classification through machine learning, sampling-based motion planning with an optimal control distance metric, and trajectory smoothing to achieve real-time planning for aerial vehicle,' according to the published paper.
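The offline-online split the Stanford paper describes can be illustrated with a toy probabilistic-roadmap planner: the expensive roadmap is precomputed offline, and the online query only links the start and goal into it and searches. Plain Euclidean distance stands in for the paper's optimal-control metric, and collision checking is omitted; everything below is a simplified sketch, not the published framework.

```python
import heapq
import math
import random

def build_roadmap(n_samples, radius, seed=0):
    """Offline phase: sample states in the unit square and connect
    all pairs within `radius` (Euclidean stand-in for the metric)."""
    rng = random.Random(seed)
    nodes = [(rng.random(), rng.random()) for _ in range(n_samples)]
    edges = {i: [] for i in range(n_samples)}
    for i in range(n_samples):
        for j in range(i + 1, n_samples):
            d = math.dist(nodes[i], nodes[j])
            if d <= radius:
                edges[i].append((j, d))
                edges[j].append((i, d))
    return nodes, edges

def query(nodes, edges, start, goal, radius):
    """Online phase: link start/goal into the roadmap, then Dijkstra."""
    nodes = nodes + [start, goal]
    edges = {k: list(v) for k, v in edges.items()}
    s, g = len(nodes) - 2, len(nodes) - 1
    edges[s], edges[g] = [], []
    for i in range(len(nodes) - 2):
        for q in (s, g):
            d = math.dist(nodes[i], nodes[q])
            if d <= radius:
                edges[q].append((i, d))
                edges[i].append((q, d))
    dist, prev, pq = {s: 0.0}, {}, [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == g:                         # reconstruct the waypoint list
            path = [g]
            while path[-1] != s:
                path.append(prev[path[-1]])
            return [nodes[i] for i in reversed(path)]
        if d > dist.get(u, float("inf")):  # stale heap entry
            continue
        for v, w in edges[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return None
```

The design point is the split itself: the all-pairs neighbor computation is quadratic and happens once offline, while the online query touches only the two new nodes, which is what makes real-time replies feasible.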
Autonomous aerial cinematography has the potential to enable automatic capture of aesthetically pleasing videos without requiring human intervention, empowering individuals with the capability of high-end film studios. Current approaches either only handle off-line trajectory generation, or offer strategies that reason over short time horizons with simplistic representations of obstacles, which result in jerky movement and low real-life applicability. In this work we develop a method for aerial filming that is able to trade off shot smoothness, occlusion, and cinematography guidelines in a principled manner, even under noisy actor predictions. We present a novel algorithm for real-time covariant gradient descent that we use to efficiently find the desired trajectories by optimizing a set of cost functions. Experimental results show that our approach creates attractive shots, avoiding obstacles and occlusion 65 times over 1.25 hours of flight time, re-planning at 5 Hz with a 10 s time horizon. We robustly film human actors, cars and bicycles performing different motions among obstacles, using various shot types.
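The covariant gradient descent the abstract mentions can be sketched on a toy problem: a 1-D trajectory with fixed endpoints is optimized against a smoothness cost plus an obstacle penalty, with each update preconditioned by the inverse of the smoothness metric so that steps stay smooth along the whole trajectory. The obstacle field, weights, and backtracking step size here are illustrative assumptions, not the authors' cost functions.

```python
import numpy as np

def covariant_descent(n=50, iters=100, eta=0.2):
    """Toy covariant (metric-preconditioned) trajectory optimization.
    xi holds the n interior waypoints of a 1-D path with endpoints fixed at 0."""
    K = np.zeros((n + 1, n))               # finite-difference operator
    np.fill_diagonal(K[:n], 1.0)
    np.fill_diagonal(K[1:], -1.0)
    A = K.T @ K                            # smoothness metric
    A_inv = np.linalg.inv(A)
    t = np.linspace(0.0, 1.0, n + 2)[1:-1]
    w = np.exp(-((t - 0.5) ** 2) / 0.01)   # obstacle active mid-trajectory
    y_obs, sig = -0.1, 0.02                # hypothetical obstacle location/width

    def cost(xi):
        smooth = 0.5 * np.sum((K @ xi) ** 2)
        obstacle = np.sum(w * np.exp(-((xi - y_obs) ** 2) / sig))
        return smooth + obstacle

    def grad(xi):
        bump = w * np.exp(-((xi - y_obs) ** 2) / sig)
        return A @ xi + bump * (-2.0 * (xi - y_obs) / sig)

    xi = np.zeros(n)                       # start from a straight line
    costs = [cost(xi)]
    for _ in range(iters):
        direction = A_inv @ grad(xi)       # covariant step: precondition by A^-1
        s = eta
        while cost(xi - s * direction) >= costs[-1] and s > 1e-8:
            s *= 0.5                       # simple backtracking for stability
        if cost(xi - s * direction) >= costs[-1]:
            break                          # no further improvement
        xi = xi - s * direction
        costs.append(cost(xi))
    return xi, costs
```

Because `A_inv @ grad` spreads a local obstacle gradient over neighboring waypoints, the trajectory bends smoothly around the penalty region instead of producing the jerky, pointwise corrections a plain gradient step would.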
Target tracking is one of the most popular applications of unmanned aerial vehicles (UAVs), used in missions ranging from intelligence gathering and surveillance to reconnaissance. Target tracking by autonomous vehicles could also prove a beneficial tool for the development of guidance systems: pedestrian detection, dynamic vehicle detection, and obstacle detection can all improve the features of a guidance-assistance system. An aerial vehicle equipped with object recognition and tracking could play a vital role in drone navigation and obstacle avoidance, video surveillance, aerial views for traffic management, self-driving systems, monitoring of road conditions, and emergency response. Target detection capability in drones has made stupendous progress of late; earlier drone systems mostly relied on vision-based target-finding algorithms.
A drone is a flying, autonomous camera. So much so that in the release video for its newest drone, dronemaker DJI doesn't even say the word "drone," instead referring to the latest unmanned aerial vehicle as a "flying camera." The features that make it drone-like, such as remote controls, onboard stabilization and obstacle-avoidance sensors, and GPS location and navigation programs, are really just camera accessories, part of the built-in airborne body of the new selfie machine. And the body is compact: the Mavic, DJI's latest drone, folds up and is designed for backpacks and large pouches rather than large, bulky, specialized carrying cases.