If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
"We're developing self-driving technology because the world is changing rapidly," Sherif Marakby, the company's vice president of autonomous vehicles and electrification, wrote in a Medium post Tuesday morning. Marakby further opened about Ford's plans to develop self-driving cars. "We plan to develop and manufacture self-driving vehicles at scale, deployed in cooperation with multiple partners, and with a customer experience based on human-centered design principles," he wrote. "Our team has decades of experience developing and manufacturing vehicles that serve commercial operations such as taxi and delivery businesses.
In order to decipher these complex situations, autonomous vehicle developers are turning to artificial neural networks. In place of traditional programming, the network is given a set of inputs and a target output (in this case, the inputs being image data and the output being a particular class of object). The process of training a neural network for semantic segmentation involves feeding it numerous sets of training data with labels that identify key elements, such as cars or pedestrians. Machine learning is already employed for semantic segmentation in driver assistance systems, such as autonomous emergency braking.
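The training loop described above can be sketched in miniature. The following is a toy stand-in for a segmentation network: a per-pixel logistic classifier trained on synthetic labeled images. All data, dimensions, and class names here are invented for illustration; real systems use deep convolutional networks rather than a single linear layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training data": 8x8 RGB images plus a per-pixel label mask.
# Class 0 = background, class 1 = "car" (a bright blob stands in for a car).
def make_example():
    img = rng.normal(0.0, 0.2, size=(8, 8, 3))
    mask = np.zeros((8, 8), dtype=int)
    r, c = rng.integers(0, 5, size=2)
    img[r:r + 3, c:c + 3] += 1.0   # the "car" region is brighter
    mask[r:r + 3, c:c + 3] = 1
    return img, mask

# The simplest possible "segmentation network": one logistic unit per pixel.
W = np.zeros(3)
b = 0.0
lr = 0.5

for step in range(200):
    img, mask = make_example()
    x = img.reshape(-1, 3)                    # inputs: per-pixel features
    y = mask.reshape(-1)                      # targets: per-pixel labels
    p = 1.0 / (1.0 + np.exp(-(x @ W + b)))   # predicted P(car) per pixel
    grad = p - y                              # cross-entropy gradient
    W -= lr * (x.T @ grad) / len(y)
    b -= lr * grad.mean()

# After training, the classifier labels bright pixels as "car".
img, mask = make_example()
pred = (1.0 / (1.0 + np.exp(-(img.reshape(-1, 3) @ W + b))) > 0.5).astype(int)
accuracy = (pred == mask.reshape(-1)).mean()
```

The structure mirrors the paragraph: labeled examples in, per-pixel class predictions out, with the labels steering the weights toward the target output.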
Before autonomous trucks and taxis hit the road, manufacturers will need to solve problems far more complex than collision avoidance and navigation (see "10 Breakthrough Technologies 2017: Self-Driving Trucks"). These vehicles will have to anticipate and defend against a full spectrum of malicious attackers wielding both traditional cyberattacks and a new generation of attacks based on so-called adversarial machine learning (see "AI Fight Club Could Help Save Us from a Future of Super-Smart Cyberattacks"). When hackers demonstrated that vehicles on the roads were vulnerable to several specific security threats, automakers responded by recalling and upgrading the firmware of millions of cars. The computer vision and collision avoidance systems under development for autonomous vehicles rely on complex machine-learning algorithms that are not well understood, even by the companies that rely on them (see "The Dark Secret at the Heart of AI").
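One way to see what "adversarial machine learning" means in practice is the fast gradient sign method (FGSM), a well-known attack the article alludes to only in general terms. The classifier weights and input below are made up for illustration; real attacks target deep vision models, but the core idea is the same: nudge every input feature slightly in the direction that most increases the model's error.

```python
import numpy as np

# A fixed linear classifier standing in for a sign recognizer
# (weights are illustrative, not from any real system).
W = np.array([1.0, -2.0, 0.5])
b = 0.1

def predict(x):
    # P(class "stop sign") under a logistic model
    return 1.0 / (1.0 + np.exp(-(x @ W + b)))

x = np.array([0.9, 0.1, 0.4])   # an input the model classifies as "stop sign"

# FGSM: for a linear model the gradient of the logit w.r.t. x is just W,
# so stepping each feature by epsilon against sign(W) pushes the score
# toward the other class while changing the input only slightly.
epsilon = 0.4
x_adv = x - epsilon * np.sign(W)

print(predict(x))      # confidently "stop sign"
print(predict(x_adv))  # flipped below the 0.5 decision boundary
```

The unsettling part, as the paragraph notes, is that for deep networks nobody fully understands which small perturbations flip the output, which makes such attacks hard to anticipate.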
With the rapid increases in computing power, it's easy to get seduced into thinking that raw computing power can solve problems like those posed by smart edge devices (e.g., cars, trains, airplanes, wind turbines, jet engines, medical devices). In chess, the complexity of each piece's moves increases only slightly (rooks can move forward and sideways a variable number of spaces, bishops can move diagonally a variable number of spaces, etc.). Now think about the number and breadth of "moves" or variables that need to be considered when driving a car in a nondeterministic (random) environment: weather (precipitation, snow, ice, black ice, wind), time of day (daytime, twilight, nighttime, sunrise, sunset), road conditions (potholes, bumpy, slick), traffic conditions (number of vehicles, types of vehicles, different speeds, different destinations). It's nearly impossible for an autonomous car manufacturer to operate enough vehicles in enough different situations to generate the amount of data that can be gathered virtually in a simulation such as Grand Theft Auto.
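The counting argument above can be made concrete: even a handful of condition axes multiply into hundreds of distinct scenarios. The category lists below are drawn from the paragraph, with the traffic axis collapsed into three illustrative levels.

```python
from itertools import product

# Condition axes from the paragraph (traffic levels are illustrative).
conditions = {
    "weather": ["precipitation", "snow", "ice", "black ice", "wind"],
    "time":    ["daytime", "twilight", "nighttime", "sunrise", "sunset"],
    "road":    ["potholes", "bumpy", "slick"],
    "traffic": ["light", "moderate", "heavy"],
}

# Every combination of conditions is a distinct driving scenario.
scenarios = list(product(*conditions.values()))
print(len(scenarios))   # 5 * 5 * 3 * 3 = 225 scenarios from just four axes
```

Add more axes (pedestrian behavior, construction zones, sensor degradation) and the product grows multiplicatively, which is why simulated mileage is so attractive compared with physically driving every combination.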
Hearing plays an essential role in how you navigate the world, and, so far, most autonomous cars can't hear. Waymo recently spent a day testing its system with emergency vehicles from the Chandler, Arizona, police and fire departments. Police cars, ambulances, fire trucks, and even unmarked cop cars chased, passed, and led the Waymo vans through the day and into the night. Sensors aboard the vans recorded vast quantities of data that will help create a database of all the sounds emergency vehicles make, so in the future, Waymo's driverless cars will know how to respond.
As Waymo continues to improve its sensor technology to help its vehicles understand their surroundings and respond quickly and safely to unfolding events, it's also been considering how to deal with unavoidable collisions, whether with a "soft" human who could easily sustain an injury, or a harder object like another vehicle. A patent recently awarded to Waymo offers some insight into how the company is approaching the issue. In Waymo's own words: "The vehicle may contain tension members that are arranged so that a change in tension across one or more of the tension members will alter the rigidity of the vehicle's surface. The vehicle may identify and respond to a potential collision by altering the tension that is applied to one or more tension members, thereby altering the rigidity of the vehicle's surface."
Fiat Chrysler Automobiles (FCA) is teaming up with BMW Group, Intel, and Mobileye to develop autonomous cars. Intel acquired Mobileye for $15.3 billion to boost its competence in computer vision, and it will use that technology to help develop the autonomous car platform in cooperation with the carmakers. "In order to advance autonomous driving technology, it is vital to form partnerships among automakers, technology providers, and suppliers," said FCA CEO Sergio Marchionne, in a statement. In July 2016, BMW Group, Intel, and Mobileye announced that they were joining forces to make self-driving vehicles a reality by collaborating on a vehicle that would be in production by 2021.
It is an industry that has functioned largely without changes for the past hundred years, but with the emergence of technologies such as artificial intelligence, self-driving vehicles, and robotics, the basic paradigm of the industry is expected to change. While the Tesla Gigafactory 1 is just one of many examples of auto companies increasingly employing robots in production, it is the strongest indication that as the auto industry moves toward automation and robotics, human employment in the industry is set to decrease. According to the Information Handling Services (IHS) Technology's Automotive Electronics Roadmap Report, the use of AI-based driver-assistance systems in vehicles is set to jump from 7 million a couple of years ago to 122 million by 2025. Since cars are increasingly expected to be equipped with hardware such as camera-based machine vision units, radar-detection units, and driver evaluation units, AI will serve as the connecting interface between the regular car machinery and such hardware (e.g., advance brake warnings using object-detection feedback from the onboard cameras).
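As a concrete sketch of the advance-brake-warning example, here is a minimal time-to-collision check. The function names, threshold, and values are assumptions for illustration, not any vendor's actual logic; in a real system the distance and closing speed would come from the camera's object detector and radar.

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact with the lead object, given current closing speed."""
    if closing_speed_mps <= 0:
        return float("inf")   # not closing in: no collision course
    return distance_m / closing_speed_mps

def brake_warning(distance_m: float, closing_speed_mps: float,
                  threshold_s: float = 2.0) -> bool:
    """Warn the driver when time-to-collision drops below the threshold."""
    return time_to_collision(distance_m, closing_speed_mps) < threshold_s

print(brake_warning(40.0, 10.0))   # 4.0 s to impact: no warning
print(brake_warning(15.0, 10.0))   # 1.5 s to impact: warn
```

The point of the paragraph stands either way: the AI layer turns raw detections from cameras and radar into decisions the rest of the car's machinery can act on.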
Tech and automotive companies have quietly been trialing autonomous trucks since 2015. But a new kind of driverless truck is designed to stick out like a sore thumb. While you read this, an autonomous impact protection vehicle is making its way around Colorado. With this trial, the state is now the first to test a connected impact protection vehicle without a support driver at the wheel.
In 2012 the engineers working on Google's self-driving car realised they had a problem. And before those fully autonomous cars arrive and are widely adopted, hundreds of thousands of lives will be lost that might have been saved. Decades from now, when fully autonomous vehicles are available everywhere, these stopgap measures won't be necessary. A truly autonomous car won't care if its passengers are watching the road.