
A Survey on Small-Scale Testbeds for Connected and Automated Vehicles and Robot Swarms

Mokhtarian, Armin, Xu, Jianye, Scheffe, Patrick, Kloock, Maximilian, Schäfer, Simon, Bang, Heeseung, Le, Viet-Anh, Ulhas, Sangeet, Betz, Johannes, Wilson, Sean, Berman, Spring, Paull, Liam, Prorok, Amanda, Alrifaee, Bassam

arXiv.org Artificial Intelligence

Connected and automated vehicles and robot swarms hold transformative potential for enhancing safety, efficiency, and sustainability in the transportation and manufacturing sectors. Extensive testing and validation of these technologies are crucial for their deployment in the real world. While simulations are essential for initial testing, they often have limitations in capturing the complex dynamics of real-world interactions. This limitation underscores the importance of small-scale testbeds. These testbeds provide a realistic, cost-effective, and controlled environment for testing and validating algorithms, acting as an essential intermediary between simulation and full-scale experiments. This work serves to facilitate researchers' efforts in identifying existing small-scale testbeds suitable for their experiments and to provide insights for those who want to build their own. In addition, it delivers a comprehensive survey of the current landscape of these testbeds. We derive 62 characteristics of testbeds based on the well-known sense-plan-act paradigm and offer an online table comparing 22 small-scale testbeds based on these characteristics. The online table is hosted on our designated public webpage www.cpm-remote.de/testbeds, and we invite testbed creators and developers to contribute to it. We closely examine nine testbeds in this paper, demonstrating how the derived characteristics can be used to present testbeds. Furthermore, we discuss three ongoing challenges concerning small-scale testbeds that we identified, i.e., small-scale to full-scale transition, sustainability, and power and resource management.


A Low-Cost Lane-Following Algorithm for Cyber-Physical Robots

Gupta, Archit, Easwaran, Arvind

arXiv.org Artificial Intelligence

Duckiebots are low-cost mobile robots that are widely used in the fields of research and education. Although there are existing self-driving algorithms for the Duckietown platform, they are either too complex or perform too poorly to navigate a multi-lane track. Moreover, it is essential to leave memory and computational resources free on a Duckiebot so it can perform additional tasks such as out-of-distribution input detection. In order to satisfy these constraints, we built a low-cost autonomous driving algorithm capable of driving on a two-lane track. The algorithm uses traditional computer vision techniques to identify the central lane on the track and obtain the relevant steering angle. The steering is then controlled by a PID controller that smooths the movement of the Duckiebot. The performance of the algorithm was compared to that of the NeurIPS 2018 AI Driving Olympics (AIDO) finalists, and it outperformed all but one finalist. The two main contributions of our algorithm are its low computational requirements and very quick set-up, with ongoing efforts to make it more reliable.
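The control loop the abstract describes (estimate the lateral error to the central lane, then smooth steering with a PID controller) can be sketched as follows. The gains, the time step, and the error definition are illustrative assumptions, not values from the paper.

```python
# Minimal PID steering sketch for a lane follower.
# The error is assumed to be the lateral offset of the detected central
# lane from the image center (illustrative; not the authors' exact setup).
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        # Accumulate integral term and estimate the error derivative.
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical gains; real values would be tuned on the robot.
pid = PID(kp=0.8, ki=0.01, kd=0.1)
steering = pid.update(error=0.2, dt=0.05)
```

On an embedded platform like the Duckiebot's, this per-frame update is a handful of arithmetic operations, which is consistent with the paper's emphasis on low computational requirements.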


Demo Abstract: Real-Time Out-of-Distribution Detection on a Mobile Robot

Yuhas, Michael, Easwaran, Arvind

arXiv.org Artificial Intelligence

In a cyber-physical system such as an autonomous vehicle (AV), machine learning (ML) models can be used to navigate and identify objects that may interfere with the vehicle's operation. However, ML models are unlikely to make accurate decisions when presented with data outside their training distribution. Out-of-distribution (OOD) detection can act as a safety monitor for ML models by identifying such samples at run time. However, in safety critical systems like AVs, OOD detection needs to satisfy real-time constraints in addition to functional requirements. In this demonstration, we use a mobile robot as a surrogate for an AV and use an OOD detector to identify potentially hazardous samples. The robot navigates a miniature town using image data and a YOLO object detection network. We show that our OOD detector is capable of identifying OOD images in real-time on an embedded platform concurrently performing object detection and lane following. We also show that it can be used to successfully stop the vehicle in the presence of unknown, novel samples.
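The safety-monitor pattern the abstract describes can be sketched generically: compute a scalar novelty score for each input, flag it as OOD when the score exceeds a threshold calibrated on in-distribution data, and stop the vehicle on a flag. The score source (e.g. a reconstruction error) and the quantile are assumptions for illustration, not the paper's actual detector.

```python
import numpy as np

# Generic OOD safety-monitor sketch. Any scalar novelty score works here
# (e.g. an autoencoder reconstruction error); the threshold is calibrated
# as a high quantile of scores seen on in-distribution training data.
def calibrate_threshold(in_dist_scores, quantile=0.99):
    """Scores above this threshold are flagged as OOD."""
    return float(np.quantile(in_dist_scores, quantile))

def is_ood(score, threshold):
    return score > threshold

# Toy calibration data standing in for in-distribution novelty scores.
rng = np.random.default_rng(0)
train_scores = rng.normal(loc=1.0, scale=0.2, size=1000)
thr = calibrate_threshold(train_scores)

# At run time, a flagged sample would trigger a safe stop.
novel_sample_score = 2.5
stop_vehicle = is_ood(novel_sample_score, thr)
```

The real-time constraint in the demo then amounts to ensuring the score computation itself fits in the frame budget alongside object detection and lane following.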


Cautious Adaptation For Reinforcement Learning in Safety-Critical Settings

Zhang, Jesse, Cheung, Brian, Finn, Chelsea, Levine, Sergey, Jayaraman, Dinesh

arXiv.org Machine Learning

Reinforcement learning (RL) in real-world safety-critical target settings like urban driving is hazardous, imperiling the RL agent, other agents, and the environment. To overcome this difficulty, we propose a "safety-critical adaptation" task setting: an agent first trains in non-safety-critical "source" environments such as in a simulator, before it adapts to the target environment where failures carry heavy costs. We propose a solution approach, CARL, that builds on the intuition that prior experience in diverse environments equips an agent to estimate risk, which in turn enables relative safety through risk-averse, cautious adaptation. CARL first employs model-based RL to train a probabilistic model to capture uncertainty about transition dynamics and catastrophic states across varied source environments. Then, when exploring a new safety-critical environment with unknown dynamics, the CARL agent plans to avoid actions that could lead to catastrophic states. In experiments on car driving, cartpole balancing, half-cheetah locomotion, and robotic object manipulation, CARL successfully acquires cautious exploration behaviors, yielding higher rewards with fewer failures than strong RL adaptation baselines. Website at https://sites.google.com/berkeley.edu/carl.
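The cautious-adaptation idea can be sketched with a toy risk-averse planner: an ensemble of learned dynamics/reward models captures uncertainty, and each candidate action is scored by its worst case over the ensemble, so an action that any model predicts to be catastrophic is avoided. The function names and toy models below are illustrative assumptions, not CARL's actual implementation.

```python
# Risk-averse action selection in the spirit of cautious adaptation:
# score each action by its worst-case predicted return over a model
# ensemble, so uncertainty about catastrophic outcomes discourages risk.
def cautious_action(actions, ensemble):
    best_action, best_score = None, float("-inf")
    for a in actions:
        # Worst case over the ensemble (risk-averse aggregation).
        score = min(model(a) for model in ensemble)
        if score > best_score:
            best_action, best_score = a, score
    return best_action

# Toy ensemble of two predicted-return models: action 1 looks best under
# one model but catastrophic under the other, so the cautious agent
# prefers the modest but safe action 0.
ensemble = [lambda a: [5.0, 10.0][a],
            lambda a: [4.0, -100.0][a]]
chosen = cautious_action([0, 1], ensemble)  # safe action
```

A risk-neutral agent averaging the two models would pick action 1 (mean return −45 vs. 4.5 is actually worse here, but with a milder penalty it could flip); the worst-case aggregation is what makes exploration in the new environment cautious.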


Learn to Program Self-Driving Cars (and Help Duckies Commute) With Duckietown

IEEE Spectrum Robotics

There is a strong and natural relationship between robots and rubber duckies. Being small, cheap, colorful, and pleasingly compliant, duckies became a sort of physical Stanford Bunny--when you want to show the scale of a robot, or give a robot something to visually locate or grasp or something, just toss a duckie in there. This relationship was formalized through the 2016 ICRA conference, where duckies inspired a bunch of videos and some poetry that is surprisingly not terrible. Since then, duckies have been taking over in robotics--at this point, I'm fairly certain that Andrea Censi at ETH Zurich is held hostage by (and doing the bidding of) a small army of little yellow duckies. This would explain why an entire duckie village full of duckie-sized autonomous cars that you can learn how to program is now on Kickstarter, with the hope that you'll help them take over the entire world.


Video Friday: Teaching a Robot to Pick Up a Knife, and More

IEEE Spectrum Robotics

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. Researchers at the Human Robot Interaction Laboratory at Tufts are teaching their PR2 to pick up objects by giving the robot instructions using natural language. Pretty cool project, but do they have to use a...knife?


Open Source Stories: Road to A.I.

#artificialintelligence

Duckietown is a hands-on, project-based course at MIT that focuses on self-driving vehicles and high-level autonomy. In Spring 2016, Liam Paull served as Duckietown's CEO and Teddy Ort worked as a vehicle autonomy engineer in training. Since the course began at MIT, it has spread to other universities around the globe, and is now taught in universities from Beijing to Zurich.


Self-driving cars, meet rubber duckies

AITopics Original Links

MIT has offered courses on everything from pirate training to "street-fighting math," but a new robotics class is truly one for the birds. This spring, a hands-on course housed at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) took students on a trip to "Duckietown." The class' goal was to create a fleet of 50 duckie-adorned self-driving taxis that can navigate the roads of a model city with just a single on-board camera and no pre-programmed maps. Beyond the class, Duckietown's leaders have larger ambitions: to work with roboticists around the world to incorporate their open-source teaching materials and $100 "Duckiebot" design into other schools' programs.


Won't you take me to Duckietown? MIT is using rubber ducks to test self-driving tech

#artificialintelligence

In order to make self-driving cars viable, the automotive industry has recruited some of the best software developers, hardware engineers, and mobility analysts humanity has to offer. There's a new community working to push autonomous technology forward, but these researchers aren't human at all. Buried deep within the halls of MIT's Computer Science and Artificial Intelligence Lab (CSAIL) lies a small suburb called Duckietown, a mock-up municipality used to test and develop driverless technology. Populated entirely by rubber ducks riding on autonomous robo-taxis, Duckietown is the culmination of a graduate-level class that could prove invaluable to automakers in the future. "We believe a tool like this will help create a common platform and language for researchers to build on," said CSAIL postdoctoral associate Liam Paull, who co-leads the Duckietown course.


Quacky races! Self-driving 'duck taxis' can navigate a tiny town

#artificialintelligence

This experiment may look quackers, but it is an important step in teaching engineers of the future to train self-driving vehicles to navigate a town. Experts have created 'Duckietown' - a miniature town with complex road junctions that's home to up to 50 taxis 'driven' by rubber ducks. The self-driving duck taxis are fitted with cameras that allow them to read road signs and avoid crashing into obstacles. Duckietown is the brainchild of computer scientists at MIT's Computer Science and Artificial Intelligence Lab (CSAIL), where students are taught about autonomous vehicle technologies using 50 duck-mobiles. As part of the class, students had to build a fleet of duckie-adorned robo-taxis that use a single camera to navigate.