
Cinematography on the fly

MIT News

But a team of researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and ETH Zurich hopes to make drone cinematography more accessible, simple, and reliable. Their system lets a director specify a shot's framing; then, on the fly, it generates control signals for a camera-equipped autonomous drone that preserve that framing as the actors move. "With our solution, if the subject turns 180 degrees, our drones are able to circle around and keep focus on the face," the researchers say. They tested the system at CSAIL's motion-capture studio, using a quadrotor (four-propeller) drone.
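To make that "circle around" behavior concrete, here is a minimal sketch of one way such a controller could be set up: keep the drone on a fixed-radius circle around the subject, on the side the subject is facing, with the camera yawed back toward them. The function name, radius, and height offset are illustrative assumptions, not the CSAIL/ETH implementation.

```python
import numpy as np

def orbit_setpoint(subject_pos, subject_heading, radius=3.0, height=1.5):
    """Place the drone on a circle of the given radius around the
    subject, directly in front of wherever the subject is facing,
    and yaw the camera back toward the subject. Names and values
    are illustrative assumptions, not the published system."""
    # Point on the orbit circle along the subject's facing direction.
    offset = radius * np.array([np.cos(subject_heading),
                                np.sin(subject_heading)])
    drone_xy = subject_pos[:2] + offset
    # Camera yaw: look from the drone back toward the subject.
    to_subject = subject_pos[:2] - drone_xy
    yaw = np.arctan2(to_subject[1], to_subject[0])
    return np.array([drone_xy[0], drone_xy[1], subject_pos[2] + height]), yaw

# If the subject turns 180 degrees, the setpoint jumps to the far
# side of the circle, so the drone orbits around to stay face-on.
pos, yaw = orbit_setpoint(np.array([0.0, 0.0, 0.0]), np.pi)
```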


Cinematography on the fly

#artificialintelligence

In recent years, a host of Hollywood blockbusters -- including "The Fast and the Furious 7," "Jurassic World," and "The Wolf of Wall Street" -- have included aerial tracking shots provided by drone helicopters outfitted with cameras. Those shots required separate operators for the drones and the cameras, and careful planning to avoid collisions. But a team of researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and ETH Zurich hopes to make drone cinematography more accessible, simple, and reliable. At the International Conference on Robotics and Automation later this month, the researchers will present a system that allows a director to specify a shot's framing -- which figures or faces appear where, and at what distance. Then, on the fly, it generates control signals for a camera-equipped autonomous drone that preserve that framing as the actors move.
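That kind of framing specification can be expressed as screen-space constraints: where each subject should appear in the image and how far away it should be. The sketch below is a hedged illustration of that idea using a standard pinhole-camera model, not the authors' actual formulation; a controller would generate drone velocities that drive this error to zero.

```python
import numpy as np

def framing_error(subject_cam, desired_px, desired_dist, f=600.0):
    """Screen-space framing error under a pinhole camera model.
    subject_cam: subject position in the camera frame (x right,
    y down, z forward); desired_px: target pixel offset from the
    image center; desired_dist: target subject distance in meters.
    Illustrative sketch only, not the published controller."""
    x, y, z = subject_cam
    actual_px = np.array([f * x / z, f * y / z])   # perspective projection
    # Stack the pixel error with the distance error; driving both to
    # zero preserves the director's framing as the subject moves.
    return np.concatenate([desired_px - actual_px, [desired_dist - z]])

# Subject 0.5 m to the right and 4 m ahead; we want it centered at 3 m.
err = framing_error((0.5, 0.0, 4.0), np.array([0.0, 0.0]), 3.0)
```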


Aggressive Quadrotors Conquer Gaps With Ultimate Autonomy

IEEE Spectrum Robotics

Just a few weeks ago, we posted about some incredible research from Vijay Kumar's lab at the University of Pennsylvania that gets quadrotors to zip through narrow gaps using only onboard localization. This is a big deal, because it means that drones are getting closer to being able to aggressively avoid obstacles without depending on external localization systems. The one little asterisk on that research was that the quadrotors were given the location and orientation of the gap in advance, rather than having to figure it out for themselves. Yesterday, Davide Falanga, Elias Mueggler, Matthias Faessler, and Professor Davide Scaramuzza, who leads the Robotics and Perception Group at the University of Zurich, shared some research they've just submitted to ICRA 2017. It's the same kind of aggressive quadrotor maneuvering, except absolutely everything is done on board, including obstacle perception.
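One standard building block for this kind of aggressive maneuver is a polynomial trajectory connecting the drone's current state to a "traverse state" at the gap: crossing the gap center at a chosen speed along the gap's normal. The sketch below shows that building block for a single axis; the boundary values and duration are made-up assumptions, and the UZH planner itself is considerably more involved.

```python
import numpy as np

def quintic_coeffs(p0, v0, a0, p1, v1, a1, T):
    """Coefficients of a quintic p(t) = c0 + c1*t + ... + c5*t^5
    matching position, velocity, and acceleration at t = 0 and t = T."""
    A = np.array([
        [1, 0, 0,    0,       0,        0],        # p(0)
        [0, 1, 0,    0,       0,        0],        # v(0)
        [0, 0, 2,    0,       0,        0],        # a(0)
        [1, T, T**2, T**3,    T**4,     T**5],     # p(T)
        [0, 1, 2*T,  3*T**2,  4*T**3,   5*T**4],   # v(T)
        [0, 0, 2,    6*T,     12*T**2,  20*T**3],  # a(T)
    ])
    return np.linalg.solve(A, np.array([p0, v0, a0, p1, v1, a1]))

# One axis: start at rest, reach a gap center 1.8 m ahead in 1.2 s,
# crossing it at 3 m/s along the gap normal (all values assumed).
c = quintic_coeffs(0.0, 0.0, 0.0, 1.8, 3.0, 0.0, 1.2)
```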


MIT's Leading the Pack With This Cool New Autonomous Drone Tech

#artificialintelligence

Any Star Wars fan knows that the chances of successfully navigating an asteroid field are approximately 3,720 to 1. The odds would be even worse for today's autonomous drones, which fly quite a bit slower than sublight speed and without the mad skills of Han Solo. Researchers at MIT believe they have hit upon a solution--more than one, actually--to train drones to move quickly through crowded, complex environments, though we're probably light-years away from navigating hostile star systems. One solution, dubbed "Flight Goggles," involves streaming a virtual-reality environment to the drone as it flies through empty space. "The system is at the intersection of motion capture equipment, drone technology, and high-bandwidth communications," Sertac Karaman, associate professor of aeronautics and astronautics at MIT, told Singularity Hub.
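The core loop behind that idea is easy to sketch: motion capture provides the drone's true pose, a renderer synthesizes what the onboard camera would see in a virtual scene at that pose, and the image is streamed back to the drone's perception stack. The functions below are placeholders standing in for those three subsystems, not the real Flight Goggles API.

```python
# Minimal sketch of the VR-in-the-loop idea: the drone flies in empty
# physical space while its cameras are fed a rendered, cluttered world.

def read_mocap_pose(t):
    """Ground-truth pose of the physical drone from motion capture."""
    return {"t": t, "xyz": (0.0, 0.1 * t, 1.0), "yaw": 0.0}

def render_virtual_view(pose):
    """Photorealistic image of the simulated scene from that pose."""
    return f"frame@{pose['xyz']}"

def stream_to_drone(image):
    """Send the rendered frame to the onboard vision pipeline."""
    print("onboard perception receives:", image)

for t in range(5):  # a few ticks of the closed loop
    stream_to_drone(render_virtual_view(read_mocap_pose(t)))
```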


Autonomous drone cinematographer: Using artistic principles to create smooth, safe, occlusion-free trajectories for aerial filming

arXiv.org Artificial Intelligence

Autonomous aerial cinematography has the potential to enable automatic capture of aesthetically pleasing videos without requiring human intervention, empowering individuals with the capabilities of a high-end film studio. Current approaches either handle only off-line trajectory generation, or offer strategies that reason over short time horizons and simplistic representations of obstacles, which result in jerky movement and low real-life applicability. In this work we develop a method for aerial filming that is able to trade off shot smoothness, occlusion, and cinematography guidelines in a principled manner, even under noisy actor predictions. We present a novel algorithm for real-time covariant gradient descent that we use to efficiently find the desired trajectories by optimizing a set of cost functions. Experimental results show that our approach creates attractive shots, avoiding obstacles and occlusion 65 times over 1.25 hours of flight time, re-planning at 5 Hz with a 10 s time horizon. We robustly film human actors, cars, and bicycles performing different motions among obstacles, using various shot types.
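As a rough illustration of the optimization at the heart of that approach (a plain gradient-descent sketch with made-up weights, not the paper's covariant update or its full occlusion and cinematography costs), one can discretize the drone trajectory and descend on a weighted sum of differentiable costs:

```python
import numpy as np

def smoothness_cost_grad(xi):
    """Penalize accelerations along the discretized trajectory xi
    (shape N x 3): sum_k ||x[k] - 2*x[k+1] + x[k+2]||^2."""
    acc = xi[:-2] - 2 * xi[1:-1] + xi[2:]
    grad = np.zeros_like(xi)
    grad[:-2] += 2 * acc
    grad[1:-1] -= 4 * acc
    grad[2:] += 2 * acc
    return (acc ** 2).sum(), grad

def shot_cost_grad(xi, actor, d_star=4.0):
    """Keep the camera d_star meters from the actor at every step
    (a stand-in for the paper's shot-quality terms)."""
    diff = xi - actor
    dist = np.linalg.norm(diff, axis=1, keepdims=True)
    err = dist - d_star
    grad = 2 * err * diff / np.maximum(dist, 1e-6)
    return (err ** 2).sum(), grad

# Actor walks along +x; start the drone offset to the side and above.
actor = np.stack([np.linspace(0, 10, 50), np.zeros(50), np.zeros(50)], axis=1)
xi = actor + np.array([0.0, 6.0, 2.0])
for _ in range(200):                       # plain (non-covariant) descent
    _, g_smooth = smoothness_cost_grad(xi)
    _, g_shot = shot_cost_grad(xi, actor)
    xi -= 0.02 * (g_smooth + g_shot)
```

The paper's covariant variant preconditions this update with a metric on the trajectory space so that a single gradient step perturbs the whole path smoothly rather than point by point, which is what makes 5 Hz re-planning over a 10 s horizon practical.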