Waymo, the Alphabet subsidiary that hopes to someday pepper roads with self-driving taxis, today pulled back the curtains on a portion of the data used to train the algorithms underpinning its cars: The Waymo Open Dataset. Waymo principal scientist Dragomir Anguelov claims it's the largest multimodal sensor sample corpus for autonomous driving released to date. "[W]e are inviting the research community to join us with the [debut] of the Waymo Open Dataset, [which is composed] of high-resolution sensor data collected by Waymo self-driving vehicles," wrote Anguelov in a blog post published this morning. "Data is a critical ingredient for machine learning … [and] this rich and diverse set of real-world experiences has helped our engineers and researchers develop Waymo's self-driving technology and innovative models and algorithms." The Waymo Open Dataset contains data collected over the course of the millions of miles Waymo's cars have driven in Phoenix, Kirkland, Mountain View, and San Francisco, and it covers a wide variety of urban and suburban environments during day and night, dawn and dusk, and sunshine and rain.
It has been more than a decade in the making, but Waymo's self-driving taxis are officially picking up passengers without a human operator at the wheel. A group of early rider program members in Phoenix, Arizona received a message this week offering them a free ride with the fully driverless service. 'This car is all yours, with no one up front,' the pop-up notification from the Waymo app reads. Once the passenger is seated and the ride is underway, the car dials Waymo support to address any questions or concerns about the driverless ride, as many riders have never been carted around by a robot.
It seems the novelty of riding in a driverless car wears off quickly, if promotional footage from Google's Waymo is to be believed. Members of the public taking part in its Early Rider program in Arizona were recently invited to take trips in its now fully automated minivans. After their initial excitement wears off, the video clip shows them playing with their phones, taking selfies and even falling asleep. Waymo's first publicly available ride-hailing service is expected to be unveiled in Phoenix later this year, after the state gave the plans the go-ahead.
Size and coverage: This release contains data from 1,000 driving segments. Such continuous footage gives researchers the opportunity to develop models to track and predict the behavior of other road users.
Diverse driving environments: This dataset covers dense urban and suburban environments across Phoenix, AZ, Kirkland, WA, Mountain View, CA and San Francisco, CA, capturing a wide spectrum of driving conditions (day and night, dawn and dusk, sun and rain).
High-resolution, 360-degree view: Each segment contains sensor data from five high-resolution Waymo lidars and five front-and-side-facing cameras.
Dense labeling: The dataset includes lidar frames and images with vehicles, pedestrians, cyclists, and signage carefully labeled, capturing a total of 12 million 3D labels and 1.2 million 2D labels.
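For readers who want to reason about the dataset's scale, the figures above can be collected into a small summary structure. This is a minimal sketch in plain Python; the field names are illustrative only and are not part of the official Waymo Open Dataset API.

```python
# Illustrative summary of the Waymo Open Dataset release described above.
# All numbers come from the announcement; names here are hypothetical.
DATASET = {
    "segments": 1_000,                 # continuous driving segments
    "lidars_per_segment": 5,           # high-resolution Waymo lidars
    "cameras_per_segment": 5,          # front- and side-facing cameras
    "labels_3d": 12_000_000,           # labeled 3D objects across lidar frames
    "labels_2d": 1_200_000,            # labeled 2D objects across camera images
    "label_classes": ["vehicle", "pedestrian", "cyclist", "sign"],
    "locations": ["Phoenix, AZ", "Kirkland, WA",
                  "Mountain View, CA", "San Francisco, CA"],
}

def labels_per_segment(stats: dict) -> float:
    """Average number of 3D labels per driving segment."""
    return stats["labels_3d"] / stats["segments"]

print(labels_per_segment(DATASET))  # 12000.0 3D labels per segment on average
```

A back-of-the-envelope check like this makes the density of the labeling concrete: on average, each segment carries on the order of twelve thousand 3D labels.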
When a self-driving car passes by, you tend to notice. The towering sensors whirling around on top of the car more than stand out. But Chinese autonomous vehicle company Pony.ai is reimagining the roofline for its next generation of autonomous taxicabs, as part of a partnership with autonomous vehicle sensor maker Luminar announced Monday. Typical LiDAR sensors like those from Velodyne, Intel's Mobileye, and Waymo's own Laser Bear Honeycomb are mostly cone-shaped to help pull in a full 360-degree view from the top of and around the car.