While Google's physical self-driving cars logged 3 million miles last year, the virtual ones that drive in the Carcraft simulations logged an astounding 2.5 billion miles. The proprietary software works in tandem with a secret test base called 'Castle,' which is essentially a hidden mock city that can quickly be reconfigured to test different scenarios. There, Waymo is testing several types of self-driving cars, including popular Lexus models, retired Priuses, Chrysler Pacifica minivans and even autonomous vehicles labeled 'level four,' meaning they physically cannot be driven by humans. A Carcraft 'fuzzing' chart allows engineers to see the different combinations of variables that would influence a self-driving car's decisions. 'It's not enough to just track a thing through a space - you have to understand what it is doing.'
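The 'fuzzing' idea can be pictured as systematically enumerating every combination of scenario variables and running each one as its own simulation. Here is a minimal sketch of that enumeration; the variable names and values are illustrative assumptions, not Waymo's actual parameters:

```python
# Hypothetical sketch of scenario "fuzzing": enumerate every combination of
# simulation variables so each can be run as a separate virtual drive.
from itertools import product

scenario_space = {
    "oncoming_speed_mph": [15, 25, 35, 45],   # illustrative values only
    "pedestrian_offset_m": [0.5, 1.0, 2.0],
    "traffic_light": ["green", "yellow", "red"],
}

def fuzz_scenarios(space):
    """Yield one dict per combination of variable values."""
    keys = list(space)
    for values in product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

scenarios = list(fuzz_scenarios(scenario_space))
print(len(scenarios))  # 4 * 3 * 3 = 36 distinct simulated situations
```

Even this toy space shows why the approach scales so well in simulation: adding one more variable multiplies the number of situations, each of which would be a separate physical drive at Castle.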
The company's long-rumored Project Titan vehicle will instead debut in the form of a shuttle bus for Apple employees, according to a New York Times report that cites five anonymous people familiar with Project Titan. The self-driving shuttle service will be one of the first real-world tests of that new system. The actual shuttle service, dubbed PAIL (Palo Alto to Infinite Loop), will ferry company personnel between Apple's main campus and other offices around the Palo Alto area. Back in April, the California DMV granted Apple permission to test its autonomous system in three Lexus RX450h SUVs on the state's roads, one of which was reportedly spotted out in the wild -- but the rumored shuttle service will probably look more like the University of Michigan's MCity shuttle program launching this school year.
Airbus calls its self-flying car Vahana, and is working on it at its Silicon Valley outpost A 3 (pronounced "a cubed"). If all goes according to plan, Vahana will use a Near Earth Autonomy technology called Peregrine. Near Earth Autonomy founder Sanjiv Singh has spent 25 years working on sensors for autonomous cars and aircraft, and spun the company off from Carnegie Mellon University five years ago. That said, a lot of work remains to be done, and Neva Aerospace, a European consortium driving the development of key technologies for flying cars, believes fully autonomous flights remain a long way off.
Apple famously planned to build an entire self-driving car; we already know, thanks to many leaks and rumors, that it hired "hundreds" of engineers for that effort, dubbed "Project Titan." But Apple abandoned that idea to focus on autonomous vehicle technology à la Uber and Waymo. It reportedly plans to test the tech by building a self-driving shuttle (called PAIL, for Palo Alto to Infinite Loop) that will take employees between its current campus and the new "Spaceship" HQ. As expected, Apple will use another company's vehicle for the PAIL shuttle, much as Waymo has with Chrysler.
McKinsey analysis indicates that in 50 metropolitan areas around the world, home to 500 million people, integrated mobility systems could produce benefits, such as improved safety and reduced pollution, worth up to $600 billion. No matter how ready a city is to move toward advanced mobility models, municipal officials can already begin developing a vision for what integrated mobility ought to look like and how their cities might evolve accordingly. To help city leaders structure their thinking, we have created scenarios for how mobility might change in three types of cities: dense cities in developed economies, dense cities in emerging economies, and sprawling metropolitan areas in developed economies. As advanced mobility services and technologies have penetrated cities, public officials at the city, regional, and national levels have responded by establishing an array of new regulations.
"We're developing self-driving technology because the world is changing rapidly," Sherif Marakby, the company's vice president of autonomous vehicles and electrification, wrote in a Medium post Tuesday morning. Marakby further opened up about Ford's plans to develop self-driving cars. "We plan to develop and manufacture self-driving vehicles at scale, deployed in cooperation with multiple partners, and with a customer experience based on human-centered design principles," he wrote. "Our team has decades of experience developing and manufacturing vehicles that serve commercial operations such as taxi and delivery businesses."
In order to decipher these complex situations, autonomous vehicle developers are turning to artificial neural networks. In place of traditional programming, the network is given a set of inputs and a target output (in this case, the inputs being image data and the output being a particular class of object). The process of training a neural network for semantic segmentation involves feeding it numerous sets of training data with labels that identify key elements, such as cars or pedestrians. Machine learning is already employed for semantic segmentation in driver assistance systems, such as autonomous emergency braking.
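The inputs-plus-target-output training loop described above can be sketched with a deliberately tiny stand-in model: a per-pixel logistic classifier trained by gradient descent on synthetic data. Real segmentation systems use deep convolutional networks; everything here, including the data, is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "pixels": each has 3 features (think RGB); the label says whether
# the pixel belongs to the target class (1 = car, 0 = background).
X = rng.normal(size=(1000, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = (X @ true_w > 0).astype(float)          # synthetic ground-truth labels

# One logistic unit: sigmoid(w.x + b), trained by gradient descent to
# minimize cross-entropy between prediction and label.
w, b, lr = np.zeros(3), 0.0, 0.5
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted class probability
    grad_w = X.T @ (p - y) / len(y)         # cross-entropy gradient wrt w
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
accuracy = np.mean((p > 0.5) == y)
print(f"pixel accuracy: {accuracy:.2f}")
```

The key point the passage makes survives the simplification: the model is never told rules for what a car looks like; it only sees labeled examples and adjusts its parameters to reduce the gap between prediction and target.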
Before autonomous trucks and taxis hit the road, manufacturers will need to solve problems far more complex than collision avoidance and navigation (see "10 Breakthrough Technologies 2017: Self-Driving Trucks"). These vehicles will have to anticipate and defend against a full spectrum of malicious attackers wielding both traditional cyberattacks and a new generation of attacks based on so-called adversarial machine learning (see "AI Fight Club Could Help Save Us from a Future of Super-Smart Cyberattacks"). When hackers demonstrated that vehicles on the roads were vulnerable to several specific security threats, automakers responded by recalling and upgrading the firmware of millions of cars. The computer vision and collision avoidance systems under development for autonomous vehicles rely on complex machine-learning algorithms that are not well understood, even by the companies that rely on them (see "The Dark Secret at the Heart of AI").
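Adversarial machine learning attacks of the kind mentioned above exploit the model's own gradients. A toy version of the idea, loosely modeled on the fast-gradient-sign method, can be shown against a simple linear classifier; the weights, input, and step size below are all made up for illustration:

```python
import numpy as np

# Toy linear classifier: predicts "stop sign present" if w.x + b > 0.
w = np.array([0.8, -0.4, 0.3])
b = 0.1
x = np.array([0.9, 0.2, 0.5])      # input correctly classified as positive

clean_score = w @ x + b            # 0.89 -> "stop sign"

# Fast-gradient-sign-style perturbation: step each feature slightly in the
# direction that most decreases the score (the gradient of the score is w).
eps = 0.7
x_adv = x - eps * np.sign(w)

adv_score = w @ x_adv + b          # drops below 0 -> prediction flips
print(clean_score, adv_score)
```

Deep vision models are far more complex, but the same mechanism applies, which is why structured perturbations invisible to humans can flip a classifier's output and why these systems being "not well understood" is a security concern.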
With the rapid increases in computing power, it's easy to get seduced into thinking that raw computing power can solve problems like smart edge devices (e.g., cars, trains, airplanes, wind turbines, jet engines, medical devices). In chess, the complexity of each piece increases only slightly (rooks can move forward and sideways a variable number of spaces, bishops can move diagonally a variable number of spaces, etc.). Now think about the number and breadth of "moves" or variables that need to be considered when driving a car in a nondeterministic (random) environment: weather (precipitation, snow, ice, black ice, wind), time of day (daytime, twilight, nighttime, sunrise, sunset), road conditions (potholes, bumpy, slick), and traffic conditions (number of vehicles, types of vehicles, different speeds, different destinations). It's nearly impossible for an autonomous car manufacturer to operate enough vehicles in enough different situations to generate the amount of data that can be gathered virtually by playing in a simulated world like Grand Theft Auto.
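A back-of-the-envelope count makes the combinatorial point concrete. The categories come straight from the passage; the discrete value counts are illustrative, since most of these variables are really continuous:

```python
# Rough count of coarse driving scenarios from the variables listed above.
from math import prod

driving_variables = {
    "weather": ["clear", "precipitation", "snow", "ice", "black ice", "wind"],
    "time_of_day": ["daytime", "twilight", "nighttime", "sunrise", "sunset"],
    "road_conditions": ["normal", "potholes", "bumpy", "slick"],
    "traffic": ["light", "moderate", "heavy", "stop-and-go"],
}

combinations = prod(len(v) for v in driving_variables.values())
print(combinations)  # 6 * 5 * 4 * 4 = 480 coarse scenario combinations
```

Even at this crude granularity there are hundreds of combinations, before accounting for vehicle speeds, destinations, or pedestrian behavior; each added variable multiplies the total, which is why simulated miles outpace physical ones so dramatically.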
Hearing plays an essential role in how you navigate the world, and, so far, most autonomous cars can't hear. Waymo recently spent a day testing its sound-detection system with emergency vehicles from the Chandler, Arizona, police and fire departments. Police cars, ambulances, fire trucks, and even unmarked cop cars chased, passed, and led the Waymo vans through the day and into the night. Sensors aboard the vans recorded vast quantities of data that will help create a database of all the sounds emergency vehicles make, so in the future, Waymo's driverless cars will know how to respond.