Researchers release open-source photorealistic simulator for autonomous driving


VISTA 2.0 is an open-source simulation engine that can make realistic environments for training and testing self-driving cars. Hyper-realistic virtual worlds have been heralded as the best driving schools for autonomous vehicles (AVs), since they've proven fruitful test beds for safely trying out dangerous driving scenarios. Tesla, Waymo, and other self-driving companies all rely heavily on expensive, proprietary photorealistic simulators fed with real-world data, since nuanced I-almost-crashed data is rarely easy or desirable to gather and recreate. To that end, scientists from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) created "VISTA 2.0," a data-driven simulation engine in which vehicles can learn to drive in the real world and recover from near-crash scenarios. What's more, all of the code is being open-sourced to the public.

Welcome to Waabi World, the "ultimate simulator" for autonomous vehicles


Raquel Urtasun, the former chief scientist at Uber's Advanced Technologies Group, promised an "AI-centric approach" to autonomous vehicles when she founded her own company, Waabi, last year. Now, she's ready to deliver on that promise with the announcement of the company's first product: a virtual world called "Waabi World" in which to test its autonomous vehicles. Urtasun says she doesn't want to rely on a large fleet of vehicles driving millions of miles on public roads and gathering data in service of training AI systems to drive better and more safely than humans. That's expensive, time-consuming, and ultimately doesn't capture the seemingly endless number of edge cases that could confuse a self-driving vehicle. Urtasun claims that simulation is cheaper and more efficient than real-world testing.

Semi-Autonomous Road Train Trial Is a Success

A semi-autonomous, four-vehicle road train has been successfully demonstrated at Volvo's test track in Hällered, Sweden, paving the way for on-road trials. Road trains, also known as platoons, feature vehicles that can monitor and mimic the actions of the car or truck immediately ahead. In a road train, cars and trucks with the same destination are grouped together, and control is handed over to a "lead vehicle" that's under the command of a professional driver. That allows the semi-autonomous vehicles in the train to follow very closely together, reducing congestion and decreasing energy use by up to 20 percent. Indeed, in the trial shown above, cars were a mere 20 feet from each other and travelled at speeds up to 56 mph, all while the folks in the driver's seats checked out their iPads.
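For a sense of how tight that spacing is, the 20-foot gap at 56 mph works out to a time separation of roughly a quarter of a second, far below the two-second-or-so following gap usually recommended for human drivers. A minimal sketch of that arithmetic (the function and constant names here are ours, purely illustrative, not from the trial's software):

```python
# Illustrative arithmetic: express the trial's 20-foot gap at 56 mph
# as a time gap between successive vehicles in the platoon.

MPH_TO_FPS = 5280 / 3600  # miles per hour -> feet per second

def time_gap_seconds(gap_feet: float, speed_mph: float) -> float:
    """Time separation between two vehicles for a given gap and speed."""
    speed_fps = speed_mph * MPH_TO_FPS
    return gap_feet / speed_fps

if __name__ == "__main__":
    gap = time_gap_seconds(20, 56)
    print(f"{gap:.2f} s")  # about 0.24 s between vehicles
```

Only automated control of braking and steering makes such a small time gap workable, which is exactly why platooning hands control to the lead vehicle.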

Right Tools are Important for Making Autonomous Cars Smart and Safe


Autonomous vehicles are advancing from cutting-edge dreams to current reality, and as the technology matures, personal and public transportation will be changed forever. In the long run, driverless vehicles will remove human drivers from the equation, banishing drowsy, impaired, and distracted drivers from the roads. Almost 40,000 people in the United States died on the roads in 2017, and according to the National Highway Traffic Safety Administration (NHTSA), around 90% of those crashes were due to human error. Autonomous cars depend on sensors, actuators, complex algorithms, machine learning frameworks, and powerful processors to execute their software. They create and maintain a map of their surroundings based on an array of sensors located in different parts of the vehicle.

Researchers release open-source photorealistic simulator for autonomous driving


VISTA 2.0 builds off of the team's previous model, VISTA, and it's fundamentally different from existing AV simulators since it's data-driven -- meaning it was built and photorealistically rendered from real-world data -- thereby enabling direct transfer to reality. While the initial iteration supported only single car lane-following with one camera sensor, achieving high-fidelity data-driven simulation required rethinking the foundations of how different sensors and behavioral interactions can be synthesized. Enter VISTA 2.0: a data-driven system that can simulate complex sensor types and massively interactive scenarios and intersections at scale. With much less data than previous models, the team was able to train autonomous vehicles that could be substantially more robust than those trained on large amounts of real-world data. "This is a massive jump in capabilities of data-driven simulation for autonomous vehicles, as well as the increase of scale and ability to handle greater driving complexity," says Alexander Amini, CSAIL PhD student and co-lead author on two new papers, together with fellow PhD student Tsun-Hsuan Wang.