Crazyflie drone


Multi Agent Framework for Collective Intelligence Research

Dochian, Alexandru

arXiv.org Artificial Intelligence

This paper presents a scalable, decentralized multi-agent framework that facilitates the exchange of information between computing units over computer networks. The architectural boundaries imposed by the tool make it suitable for collective intelligence research, with experiments ranging from agents that exchange "hello world" messages, to virtual drone agents exchanging positions, and eventually to agents exchanging information via radio with real Crazyflie drones in the VU Amsterdam laboratory. Field modulation theory is implemented to construct synthetic local perception maps for agents, built from neighbouring agents' positions and nearby points of interest dictated by the environment. By constraining the experimental setup to a 2D environment with discrete actions, constant velocity, and parameters tailored to the VU Amsterdam laboratory, Crazyflie UAVs running a hill-climbing controller followed collision-free trajectories and bridged the sim-to-real gap.
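The controller described above can be sketched as hill climbing over a synthetic local perception value: at each discrete step the agent evaluates its candidate positions and takes the one with the highest value. The potential below (attraction to points of interest, inverse-distance repulsion from neighbours) is an illustrative assumption, not the paper's exact field-modulation formula.

```python
import numpy as np

def perception_value(pos, neighbours, points_of_interest):
    """Synthetic local perception value: attraction to points of interest
    minus repulsion from neighbouring agents (illustrative potential;
    the exact field-modulation form is an assumption)."""
    attract = sum(-np.linalg.norm(pos - p) for p in points_of_interest)
    repel = sum(1.0 / (np.linalg.norm(pos - n) + 1e-6) for n in neighbours)
    return attract - repel

def hill_climb_step(pos, neighbours, points_of_interest, step=0.1):
    """One hill-climbing step: pick the discrete action (stay, or move
    one step along +-x / +-y) that maximises the perception value."""
    actions = [np.array(a) for a in
               [(0.0, 0.0), (step, 0.0), (-step, 0.0),
                (0.0, step), (0.0, -step)]]
    return max((pos + a for a in actions),
               key=lambda q: perception_value(q, neighbours, points_of_interest))

# Toy run: one agent climbs toward a point of interest at (1, 0)
# while being repelled by a neighbour at (0.5, 1.5).
pos = np.array([0.0, 0.0])
goal = [np.array([1.0, 0.0])]
other = [np.array([0.5, 1.5])]
for _ in range(5):
    pos = hill_climb_step(pos, other, goal)
```

Because actions are discrete and the map is recomputed from local neighbours each step, the same loop runs unchanged on a simulated agent or on a real drone fed with positioning data.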


Testing Spacecraft Formation Flying with Crazyflie Drones as Satellite Surrogates

de la Barcena, Arturo, Rhodes, Collin, McCarroll, John, Cescon, Marzia, Hobbs, Kerianne L.

arXiv.org Artificial Intelligence

As the space domain becomes increasingly congested, autonomy is proposed as one approach to enable small numbers of human ground operators to manage large constellations of satellites and tackle more complex missions such as on-orbit or in-space servicing, assembly, and manufacturing. One of the biggest challenges in developing novel spacecraft autonomy is finding mechanisms to test and evaluate its performance. Testing spacecraft autonomy on-orbit can be high risk and prohibitively expensive. An alternative method is to test autonomy terrestrially using satellite surrogates such as attitude test beds on air bearings or drones for translational motion visualization. Against this background, this work develops an approach to evaluate autonomous spacecraft behavior using a surrogate platform, namely a micro-quadcopter drone developed by the Bitcraze team, the Crazyflie 2.1. Crazyflie drones are becoming ubiquitous in flight testing labs because they are affordable, open source, readily available, and include expansion decks which allow for features such as positioning systems, distance and/or motion sensors, wireless charging, and AI capabilities. In this paper, models of Crazyflie drones are used to simulate the relative motion dynamics of spacecraft under linearized Clohessy-Wiltshire dynamics in elliptical natural motion trajectories, in pre-generated docking trajectories, and via trajectories output by neural network control systems.


An ARGoS plug-in for the Crazyflie drone

Stolfi, Daniel H., Danoy, Grégoire

arXiv.org Artificial Intelligence

We present a new plug-in for the ARGoS swarm robotics simulator that implements the Crazyflie drone, including its controllers, sensors, and some expansion decks. We based our development on the former Spiri drone, upgrading the position controller and adding a new speed controller, LED ring, onboard camera, and battery discharge model. We compared this new plug-in, in terms of accuracy and efficiency, with data obtained from real Crazyflie drones. All our experiments showed that the proposed plug-in worked well, presenting high levels of accuracy. We believe this is an important contribution to robot simulation, extending ARGoS's capabilities through our proposed, open-source plug-in.
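A battery discharge model like the one the plug-in adds can be sketched as a per-tick drain scaled by flight load. The class below is a minimal illustrative model with assumed parameters (capacity and hover draw roughly in the Crazyflie's range); it is not the plug-in's actual implementation or its fitted values.

```python
class BatteryModel:
    """Illustrative linear battery discharge model for a simulated drone.
    Parameters are assumptions, not the ARGoS plug-in's fitted values."""

    def __init__(self, capacity_mah=250.0, hover_draw_ma=1500.0):
        self.capacity_mah = capacity_mah    # nominal cell capacity [mAh]
        self.hover_draw_ma = hover_draw_ma  # current draw while hovering [mA]
        self.charge_mah = capacity_mah      # remaining charge [mAh]

    def tick(self, dt_s, load_factor=1.0):
        """Drain charge over dt_s seconds; load_factor scales the draw
        (e.g. > 1 while accelerating, < 1 while idle on the ground)."""
        draw_ma = self.hover_draw_ma * load_factor
        self.charge_mah = max(0.0, self.charge_mah - draw_ma * dt_s / 3600.0)

    @property
    def level(self):
        """Remaining charge as a fraction of capacity, in [0, 1]."""
        return self.charge_mah / self.capacity_mah

# One minute of hovering drains 1500 mA * 60 s / 3600 = 25 mAh.
battery = BatteryModel()
battery.tick(60.0)
```

In a simulator loop, `tick` would be called once per control step, and the `level` fraction is what gets compared against logs from real drones when validating the model.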


Blind as a bat: audible echolocation on small robots

Dümbgen, Frederike, Hoffet, Adrien, Kolundžija, Mihailo, Scholefield, Adam, Vetterli, Martin

arXiv.org Artificial Intelligence

For safe and efficient operation, mobile robots need to perceive their environment, and in particular, perform tasks such as obstacle detection, localization, and mapping. Although robots are often equipped with microphones and speakers, the audio modality is rarely used for these tasks. Compared to the localization of sound sources, for which many practical solutions exist, algorithms for active echolocation are less developed and often rely on hardware requirements that are out of reach for small robots. We propose an end-to-end pipeline for sound-based localization and mapping that is targeted at, but not limited to, robots equipped with only simple buzzers and low-end microphones. The method is model-based, runs in real time, and requires no prior calibration or training. We successfully test the algorithm on the e-puck robot with its integrated audio hardware, and on the Crazyflie drone, for which we design a reproducible audio extension deck. We achieve centimeter-level wall localization on both platforms when the robots are static during the measurement process. Even in the more challenging setting of a flying drone, we can successfully localize walls, which we demonstrate in a proof-of-concept multi-wall localization and mapping experiment.
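The core geometry behind active echolocation is the round-trip delay: a wall at distance d returns an echo after Δt = 2d/c, so d = c·Δt/2. The sketch below recovers a simulated wall distance by cross-correlating an emitted chirp with the recording; it is a generic matched-filter illustration, not the paper's model-based frequency-domain algorithm, and all signal parameters are assumptions.

```python
import numpy as np

C = 343.0  # speed of sound in air [m/s]

def wall_distance(signal, recording, fs):
    """Estimate wall distance from the round-trip delay between the
    emitted signal and its echo, found via cross-correlation."""
    corr = np.correlate(recording, signal, mode="full")
    lag = int(np.argmax(corr)) - (len(signal) - 1)  # delay in samples
    return C * (lag / fs) / 2.0                     # round trip -> one way

fs = 44100
t = np.arange(0.0, 0.005, 1.0 / fs)
chirp = np.sin(2 * np.pi * (2000.0 + 4e5 * t) * t)  # simple 5 ms up-chirp

# Simulate an echo from a wall 0.50 m away: delay = 2*d/C seconds.
delay = int(round(2 * 0.5 / C * fs))
recording = np.zeros(len(chirp) + delay)
recording[delay:] += 0.3 * chirp  # attenuated, delayed copy of the chirp

d = wall_distance(chirp, recording, fs)
```

At fs = 44.1 kHz the delay quantization alone limits range resolution to about c/(2·fs) ≈ 4 mm, which is why centimeter-level wall localization is attainable even with low-end audio hardware.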