A Low-Cost Lane-Following Algorithm for Cyber-Physical Robots
Gupta, Archit, Easwaran, Arvind
Duckiebots are low-cost mobile robots widely used in research and education. Although self-driving algorithms exist for the Duckietown platform, they are either too complex or perform too poorly to navigate a multi-lane track. Moreover, it is essential to leave a Duckiebot enough memory and computational resources to perform additional tasks such as out-of-distribution input detection. To satisfy these constraints, we built a low-cost autonomous driving algorithm capable of driving on a two-lane track. The algorithm uses traditional computer vision techniques to identify the central lane on the track and obtain the relevant steering angle. The steering is then controlled by a PID controller that smooths the movement of the Duckiebot. The performance of the algorithm was compared to that of the NeurIPS 2018 AI Driving Olympics (AIDO) finalists, and it outperformed all but one finalist. The two main contributions of our algorithm are its low computational requirements and very quick set-up, with ongoing efforts to make it more reliable.
- Transportation > Ground > Road (0.36)
- Information Technology > Robotics & Automation (0.36)
- Automobiles & Trucks (0.36)
- Leisure & Entertainment (0.34)
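The abstract above describes deriving a steering error from the detected central lane and smoothing it with a PID controller. The paper does not publish its gains or interfaces, so the following is only a minimal sketch of that control loop; the helper names and gain values are illustrative assumptions, not the authors' implementation.

```python
class PID:
    """Discrete PID controller: output = kp*e + ki*∫e dt + kd*de/dt."""

    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error: float) -> float:
        # Accumulate the integral and estimate the derivative by finite difference.
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def lane_error(lane_center_x: float, image_width: int) -> float:
    """Signed offset of the detected central lane from the image midline, in [-1, 1]."""
    half = image_width / 2
    return (lane_center_x - half) / half


# Hypothetical usage: one control step per camera frame at 10 Hz.
pid = PID(kp=1.0, ki=0.0, kd=0.1, dt=0.1)
steering = pid.step(lane_error(lane_center_x=400, image_width=640))
```

The derivative term damps oscillation around the lane center, which is the "smoothing" role the abstract attributes to the PID controller.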
Demo Abstract: Real-Time Out-of-Distribution Detection on a Mobile Robot
Yuhas, Michael, Easwaran, Arvind
In a cyber-physical system such as an autonomous vehicle (AV), machine learning (ML) models can be used to navigate and identify objects that may interfere with the vehicle's operation. However, ML models are unlikely to make accurate decisions when presented with data outside their training distribution. Out-of-distribution (OOD) detection can act as a safety monitor for ML models by identifying such samples at run time. However, in safety critical systems like AVs, OOD detection needs to satisfy real-time constraints in addition to functional requirements. In this demonstration, we use a mobile robot as a surrogate for an AV and use an OOD detector to identify potentially hazardous samples. The robot navigates a miniature town using image data and a YOLO object detection network. We show that our OOD detector is capable of identifying OOD images in real-time on an embedded platform concurrently performing object detection and lane following. We also show that it can be used to successfully stop the vehicle in the presence of unknown, novel samples.
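The demo treats the OOD detector as a run-time safety monitor that flags samples scoring outside the training distribution. The abstract does not specify the detector's scoring function or threshold, so the sketch below only illustrates the generic monitor pattern: calibrate a score threshold on in-distribution data, then flag any frame whose score exceeds it. The function names and the 5% false-positive budget are assumptions for illustration.

```python
import numpy as np


def calibrate_threshold(id_scores: np.ndarray, fpr: float = 0.05) -> float:
    """Pick the threshold as the (1 - fpr) quantile of in-distribution scores,
    so roughly fpr of nominal frames are (wrongly) flagged."""
    return float(np.quantile(id_scores, 1.0 - fpr))


def is_ood(score: float, threshold: float) -> bool:
    """Flag a frame as out-of-distribution when its score exceeds the threshold."""
    return score > threshold


# Hypothetical usage: scores from a calibration set of in-distribution frames.
calibration_scores = np.linspace(0.0, 1.0, 101)
threshold = calibrate_threshold(calibration_scores, fpr=0.05)
# At run time, a high-scoring frame would trigger the safety stop described above.
```

In the demonstrated system such a flag would gate the vehicle's actuation, stopping it when novel samples appear.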
SynchroSim: An Integrated Co-simulation Middleware for Heterogeneous Multi-robot System
Dey, Emon, Hossain, Jumman, Roy, Nirmalya, Busart, Carl
With the advancement of modern robotics, autonomous agents can now host sophisticated algorithms that enable them to make intelligent decisions. However, developing and testing such algorithms directly on real-world systems is tedious and may waste valuable resources, especially for heterogeneous multi-agent systems in battlefield environments, where communication is critical in determining the system's behavior and usability. Because simulators of separate paradigms (co-simulation) are needed to simulate such scenarios before deployment, synchronization between those simulators is vital. Existing works aimed at resolving this issue fall short of addressing diversity among deployed agents. In this work, we propose \textit{SynchroSim}, an integrated co-simulation middleware for simulating heterogeneous multi-robot systems. We propose a velocity-difference-driven adjustable window size approach with a view to reducing packet loss probability: it takes into account the respective velocities of deployed agents to calculate a suitable window size before transmitting data between them. Our algorithm is simulator-agnostic, but for the implementation results we used Gazebo as the physics simulator and NS-3 as the network simulator. We also design our algorithm around the Perception-Action loop inside a closed communication channel, an essential factor in contested scenarios that require high fidelity in terms of data transmission. We validate our approach empirically at both the simulation and system level for line-of-sight (LOS) and non-line-of-sight (NLOS) scenarios. Compared to a fixed window size-based synchronization approach, ours achieves a noticeable reduction in packet loss probability ($\approx$11\%) and average packet delay ($\approx$10\%).
- North America > United States > Maryland > Baltimore County (0.04)
- North America > United States > Maryland > Baltimore (0.04)
- Asia > Japan > Honshū > Kansai > Hyogo Prefecture > Kobe (0.04)
- Information Technology (1.00)
- Government > Military > Army (0.49)
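The SynchroSim abstract describes choosing a synchronization window size from the velocity difference between two agents before transmitting data. The paper's actual formula is not given in the abstract, so the following is only a plausible sketch of the idea: shrink the window as the relative velocity grows, since faster relative motion leaves less time before the agents' states diverge. All constants and the inverse-proportional form are illustrative assumptions.

```python
def window_size(v1: float, v2: float,
                base: int = 64, k: float = 0.5, min_size: int = 8) -> int:
    """Pick a synchronization window from the agents' velocity difference.

    A larger relative velocity |v1 - v2| yields a smaller window, bounding
    how stale the exchanged state can get and thus the packet loss risk.
    """
    dv = abs(v1 - v2)
    return max(min_size, int(base / (1.0 + k * dv)))


# Hypothetical usage: a slow ground robot synchronizing with a faster one.
w = window_size(v1=0.0, v2=8.0)  # large velocity gap -> small window
```

A fixed-window scheme, by contrast, would use `base` regardless of the agents' velocities, which is the baseline the abstract reports improving on.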
Learn to Program Self-Driving Cars (and Help Duckies Commute) With Duckietown
There is a strong and natural relationship between robots and rubber duckies. Being small, cheap, colorful, and pleasingly compliant, duckies became a sort of physical Stanford Bunny--when you want to show the scale of a robot, or give a robot something to visually locate or grasp or something, just toss a duckie in there. This relationship was formalized through the 2016 ICRA conference, where duckies inspired a bunch of videos and some poetry that is surprisingly not terrible. Since then, duckies have been taking over in robotics--at this point, I'm fairly certain that Andrea Censi at ETH Zurich is held hostage by (and doing the bidding of) a small army of little yellow duckies. This would explain why an entire duckie village full of duckie-sized autonomous cars that you can learn how to program is now on Kickstarter, with the hope that you'll help them take over the entire world.
- Europe > Switzerland > Zürich > Zürich (0.25)
- North America > Canada > Quebec > Montreal (0.05)
- Transportation > Ground > Road (1.00)
- Education (1.00)
- Information Technology > Robotics & Automation (0.86)
- Transportation > Passenger (0.72)
Won't you take me to Duckietown? MIT is using rubber ducks to test self-driving tech
In order to make self-driving cars viable, the automotive industry has recruited some of the best software developers, hardware engineers, and mobility analysts humanity has to offer. There's a new community working to push autonomous technology forward, but these researchers aren't human at all. Buried deep within the halls of MIT's Computer Science and Artificial Intelligence Lab (CSAIL) lies a small suburb called Duckietown, a mock-up municipality used to test and develop driverless technology. Populated entirely by rubber ducks riding on autonomous robo-taxis, Duckietown is the culmination of a graduate-level class that could prove invaluable to automakers in the future. "We believe a tool like this will help create a common platform and language for researchers to build on," said CSAIL postdoctoral associate Liam Paull, who co-leads the Duckietown course.
- Automobiles & Trucks (1.00)
- Information Technology > Robotics & Automation (0.96)