Autonomous vehicles are on the rise as a way to combat the country's motor vehicle fatalities. This article by Red Hat's Pete Brey takes a deep dive into how machine learning, artificial intelligence, and deep learning work together to achieve this goal. Houston, we have a problem. So do Los Angeles, Atlanta, New York, D.C., Boston, and cities, towns, and counties throughout the United States. That problem is motor vehicle fatalities.
Waymo, the self-driving technology company owned by Google's parent company, Alphabet, released a dataset containing sensor data collected by its autonomous vehicles during more than five hours of driving. The set contains high-resolution data from lidar and camera sensors collected in several urban and suburban environments in a wide variety of driving conditions, and includes labels for vehicles, pedestrians, cyclists, and signage. The Waymo team announced the release of the Waymo Open Dataset in a blog post, describing it as "one of the largest, richest, and most diverse self-driving datasets ever released for research." The data was collected by Waymo's vehicles operating in the U.S. in Phoenix, AZ; Kirkland, WA; Mountain View, CA; and San Francisco, CA, at various times of day and night, and in good and bad weather. The dataset consists of 1,000 segments of 20 seconds each, collected at 10 Hz (i.e., 200,000 frames). Waymo also released a Google Colab notebook containing tutorials and a GitHub repository containing TensorFlow helper code for building models.
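The frame and duration figures quoted above follow directly from the segment count, segment length, and sample rate. A quick sanity check in plain Python (no Waymo tooling required):

```python
# Sanity-check the dataset size: 1,000 segments of 20 seconds each,
# with sensor data sampled at 10 Hz.
segments = 1_000
seconds_per_segment = 20
sample_rate_hz = 10

# Total frames across all segments.
frames = segments * seconds_per_segment * sample_rate_hz

# Total driving time captured, in hours.
total_hours = segments * seconds_per_segment / 3600

print(frames)                  # 200000 frames, matching the announcement
print(round(total_hours, 2))   # about 5.56 hours, i.e. "more than five hours"
```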
Machine learning (ML) is the science of helping computers discover patterns and relationships in data instead of being manually programmed. It's a powerful tool for creating personalized and dynamic experiences, and it's already driving everything from Netflix recommendations to autonomous cars. But as more and more experiences are built with ML, it's clear that UXers still have a lot to learn about how to make users feel in control of the technology, and not the other way round. As was the case with the mobile revolution, and the web before that, ML will cause us to rethink, restructure, displace, and consider new possibilities for virtually every experience we build. In the Google UX community, we've started an effort called "human-centered machine learning" (HCML) to help focus and guide that conversation.
Google has updated its Flights app with a pair of new features that should help weary (and wary) travelers get to grips with their next trip to the airport. The first uses machine learning to predict upcoming flight delays, and the second breaks down exactly what different airlines mean by "basic economy," explaining which amenities are and are not included in the so-called last class.