Researchers release open-source photorealistic simulator for autonomous driving
VISTA 2.0 builds on the team's previous model, VISTA, and is fundamentally different from existing AV simulators: it is data-driven, built and photorealistically rendered from real-world data, which enables direct transfer to reality. While the initial iteration supported only single-car lane-following with one camera sensor, achieving high-fidelity data-driven simulation required rethinking the foundations of how different sensors and behavioral interactions can be synthesized.

Enter VISTA 2.0: a data-driven system that can simulate complex sensor types and massively interactive scenarios and intersections at scale. With much less data than previous models, the team was able to train autonomous vehicles that were substantially more robust than those trained on large amounts of real-world data.

"This is a massive jump in capabilities of data-driven simulation for autonomous vehicles, as well as the increase of scale and ability to handle greater driving complexity," says Alexander Amini, CSAIL PhD student and co-lead author on two new papers, together with fellow PhD student Tsun-Hsuan Wang.
Jun-21-2022, 17:50:29 GMT