While the most sophisticated driverless cars on public roads can handle haboobs and rainstorms like champs, certain types of precipitation remain a challenge for them -- like snow. That's because snow covers the cameras critical to those cars' self-awareness, tricks sensors into perceiving obstacles that aren't there, and obscures road signs and other structures that normally serve as navigational landmarks. In an effort to spur the development of cars capable of driving in wintry weather, startup Scale AI this week open-sourced the Canadian Adverse Driving Conditions (CADC) data set, created with the University of Waterloo and the University of Toronto, which contains over 56,000 images captured in conditions including snow. While several corpora with snowy sensor samples have been released to date, including Linköping University's Automotive Multi-Sensor Dataset (AMUSE) and the Mapillary Vistas data set, Scale AI claims that CADC is the first to focus specifically on "real-world" driving in snowy weather. "Snow is hard to drive in -- as many drivers are well aware. But wintry conditions are especially hard for self-driving cars because of the way snow affects the critical hardware and AI algorithms that power them," wrote Scale AI CEO Alexandr Wang in a blog post.
In a test kitchen in a corner building in downtown Pasadena, Flippy the robot grabbed a fryer basket full of chicken fingers, plunged it into hot oil -- its sensors told it exactly how hot -- then lifted, drained and dumped maximally tender tenders into a waiting hopper. A few feet away, another Flippy eyed a beef patty sizzling on a griddle. With its camera eyes feeding pixels to a machine vision brain, it waited until the beef hit the right shade of brown, then smoothly slipped its spatula hand under the burger and plopped it on a tray. The product of decades of research in robotics and machine learning, Flippy represents a synthesis of motors, sensors, chips and processing power that wasn't possible until recently. Now, Flippy's success -- and the success of the company that built it, Miso Robotics -- depends on simple math and a controversial hypothesis of how robots can transform the service economy.
These days, machine learning and computer vision are all the rage. We've all seen the news about self-driving cars and facial recognition and probably imagined how cool it'd be to build our own computer vision models. However, it's not always easy to break into the field, especially without a strong math background. Libraries like PyTorch and TensorFlow can be tedious to learn if all you want to do is experiment with something small. In this tutorial, I present a simple way for anyone to build fully functional object detection models with just a few lines of code.
Self-driving cars, home automation, virtual assistants -- it's clear we've already seen some outstanding technological advances and are on the brink of more significant breakthroughs. Alain Fiocco, CTO for OVHcloud, calls 2020 "a new era" for technology. But with all these new advances, which will pull ahead in 2020? Here is a breakdown of the top five telecom trends to watch for in the year ahead. Right now, the world runs on 4G, also known as LTE.
Few months go by without another devastating earthquake somewhere in the world reminding us that we all remain at the mercy of major seismic events that strike without warning. But a new branch of geophysics powered by machine learning is uncovering fresh insights into the earth's slipping faults that often trigger these catastrophic earthquakes. Machine learning, which often goes by the catchier moniker of artificial intelligence, has captured the public's imagination with its promises of fully autonomous cars and the approaching "singularity" when machines out-think people. The current state of the art, however, shows few signs of true intelligence, such as the ability to abstract the principles behind a given phenomenon. In image recognition, AI systems learn by rote memorization to identify objects and are, therefore, often fooled.
Driver rudeness seems associated with certain brands, and AI self-driving cars ought to consider this. What kind of car do you drive? I don't mean whether it is a four-door or two-door, nor whether it is red in color or blue. Specifically, what brand of car do you drive? According to various studies (cited in a moment herein), the brand of car is supposedly a telltale indicator of how rude the driver behind the wheel is likely to be.
HONG KONG/BEIJING – Autonomous driving firm Pony.ai said it raised $462 million in its latest funding round, led by an investment by Toyota Motor Corp. Toyota invested around $400 million (¥44.2 billion) in the round, Pony.ai said in a statement Wednesday, marking Toyota's biggest investment in an autonomous driving company with a Chinese background. The latest fundraising values the three-year-old firm, already backed by Sequoia Capital China and Beijing Kunlun Tech Co., at slightly more than $3 billion. The investment by Japan's largest automaker comes at a time when global carmakers, technology firms, startups and investors -- including Tesla, Alphabet Inc.'s Waymo and Uber -- are pouring capital into developing self-driving vehicles. Over the past two years, 323 deals related to autonomous cars raised a total of $14.6 billion worldwide, according to data provider PitchBook, even amid concerns about the technology given its high cost and complexity. The Silicon Valley-based startup Pony.ai -- co-founded by CEO James Peng, a former executive at China's Baidu, and chief technology officer Lou Tiancheng, a former Google and Baidu engineer -- is already testing autonomous vehicles in California, Beijing and Guangzhou.
Car companies have been feverishly working to improve the technologies behind self-driving cars. But so far even the most high-tech vehicles still fail when it comes to safely navigating in rain and snow. This is because these weather conditions wreak havoc on the most common approaches for sensing, which usually involve either lidar sensors or cameras. In the snow, for example, cameras can no longer recognize lane markings and traffic signs, while the lasers of lidar sensors are scattered by falling snowflakes, producing false returns. MIT researchers have recently been wondering whether an entirely different approach might work.
Andreessen Horowitz has always been the most levelheaded of the major current year VC firms. While other firms were levering up on "cleantech" and nonsensical biotech startups that violate physical law, they quietly continued to invest in sane companies (also hot garbage bugman products like soylent). I assume they actually listen to people on the front lines, rather than to what their VC pals are telling them. Maybe they're just smarter than everyone else; definitely more independent-minded. Their recent review on how "AI" differs from software company investments is absolutely brutal.
Every year, companies that operate self-driving cars in California are required to submit data to the state's Department of Motor Vehicles listing the number of miles driven and the frequency at which human safety drivers were forced to take control of their autonomous vehicles (also known as a "disengagement"). And every year, those same companies raise a huge stink about it. Waymo, which drove 1.45 million miles in California in 2019 and logged a disengagement rate of 0.076 per 1,000 self-driven miles, says the metric "does not provide relevant insights" into its technology. Cruise, which drove 831,040 miles last year and reported a disengagement rate of 0.082, says the "idea that disengagements give a meaningful signal about whether an [autonomous vehicle] is ready for commercial deployment is a myth." Aurora, which only drove 13,429 miles and recorded a disengagement rate of 10.6 per 1,000 miles, calls them "misguided."
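The disputed metric itself is simple arithmetic: disengagements divided by autonomous miles, scaled to 1,000 miles. A quick sketch using the reported figures above (the disengagement counts below are back-calculated from the published rates for illustration, not taken from the DMV filings themselves):

```python
def disengagements_per_1000_miles(disengagements: int, miles: float) -> float:
    """The California DMV metric: human takeovers per 1,000 autonomous miles."""
    return disengagements / miles * 1000

# Waymo's reported 0.076 rate over 1.45 million miles implies roughly
# 0.076 * 1,450,000 / 1,000 ≈ 110 disengagements across the whole year.
rate = disengagements_per_1000_miles(110, 1_450_000)
print(round(rate, 3))  # → 0.076
```

The same arithmetic applied to Aurora's 13,429 miles at 10.6 per 1,000 implies around 140 disengagements, which illustrates the companies' complaint: the metric compounds mileage and takeover counts into a single number that says little about where, why, or how severely each disengagement occurred.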