Robots in the workplace can perform hazardous or even 'impossible' tasks, such as toxic waste clean-up and desert and space exploration. AI researchers are also interested in the intelligent processing involved in moving about and manipulating objects in the real world.
Artificial intelligence (AI) is already having a huge impact on many industries, including aviation, manufacturing, technology, and a host of others. That's because facets of AI such as machine learning and deep learning can make companies more efficient, help them plan preventive maintenance of equipment, and even improve digital sales. But one sector already seeing dramatic changes from AI is the automotive industry. Artificial intelligence is creating entirely new ways for us to get around. It will soon influence how cars are produced, and it will change how we manage traffic in our cities.
A mere sprinkling of autonomous vehicles exists in a few dozen cities today, and none of them -- at least not yet -- has been deployed as a true commercial enterprise. While the bulk of this nascent industry fixates on the system of sensors, maps and AI necessary for vehicles to drive without a human behind the wheel, the founders of startup RideOS are directing their efforts to the day when fleets of self-driving cars hit the streets. It's there, where human-driven and automated vehicles will be forced to mingle, that RideOS co-founders Chris Blumenberg and Justin Ho see opportunity. The company, which has existed for all of 12 months, has raised $25 million in a Series B funding round led by Next47, the venture arm of Siemens.
It's coming quickly down the road: a world where we can get in a car anytime we want to but don't own one. Where accidents are drastically reduced, and we don't have to worry about dangerous drivers on the road. Where we can join conference calls or draft a report on our morning commute. Cars are changing more quickly and drastically now than at any other point in their history. And it's no wonder: from 2014 to 2017, start-ups, automakers and other stakeholders invested an estimated $80 billion into autonomous vehicle (AV) technology.
Norm Judah is chief technology officer of Microsoft Services. Artificial intelligence (AI) is already having a transformative impact across every industry, from helping employees at transportation companies predict arrival times and potential issues to detecting toxins in grain. It's helping scientists learn how to treat cancer more effectively, and farmers are figuring out how to grow more food using fewer natural resources. A 2017 study by PwC calculated that global GDP will be 14 percent higher by 2030 as a result of AI adoption, contributing an additional $15.7 trillion to the global economy.
Though nearly a third of American adults still report experiencing fear of flying, the majority of us implicitly trust that airlines will get us from point A to point B and back again in one piece. Indeed, few factors are as crucial in the transportation industry as trust. If we didn't believe in the safety of a train, plane or automobile, no one would ever travel that way. Planes, however, still have pilots, and trains usually have conductors, and both most likely will for the near future. Interestingly, the introduction of autopilot in commercial aviation, which effectively renders pilot control unnecessary for large parts of the flight, did not seem to bother most passengers (much like modern conductor-less locomotive systems).
As we get closer to the widespread deployment of autonomous vehicles, there are still discussions about what sensors will be required to make these vehicles safe for passengers, pedestrians, and other vehicles. Those discussions usually revolve around cameras, radar, LiDAR (Light Detection and Ranging), and ultrasonic sensors. There's one other sensor that isn't discussed: an audio microphone. I recently experienced a situation that caused me concern over how well an autonomous car would handle emergency vehicle awareness. I was stopped at a red light in the left-hand turn lane on a local road.
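The microphone idea above can be sketched as a simple spectral check: flag audio clips whose energy is concentrated in a frequency band typical of emergency sirens. This is a minimal illustration only; the band limits, energy threshold, and sample rate below are assumptions for the sketch, not values from any production autonomous-vehicle stack, which would need far more robust classification (direction finding, Doppler handling, learned models).

```python
import numpy as np

SAMPLE_RATE = 16_000          # Hz, assumed microphone sample rate
SIREN_BAND = (500.0, 1800.0)  # Hz, rough range of many siren tones (assumption)

def siren_band_ratio(samples: np.ndarray, rate: int = SAMPLE_RATE) -> float:
    """Fraction of total spectral energy that falls inside SIREN_BAND."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    in_band = (freqs >= SIREN_BAND[0]) & (freqs <= SIREN_BAND[1])
    total = spectrum.sum()
    return float(spectrum[in_band].sum() / total) if total > 0 else 0.0

def likely_siren(samples: np.ndarray, threshold: float = 0.6) -> bool:
    """Flag clips whose energy is concentrated in the siren band."""
    return siren_band_ratio(samples) >= threshold

# Synthetic one-second clips: a warbling tone inside the band vs. broadband noise.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
inst_freq = 1000 + 400 * np.sin(2 * np.pi * 3 * t)   # warbles 600-1400 Hz
phase = 2 * np.pi * np.cumsum(inst_freq) / SAMPLE_RATE
siren = np.sin(phase)
noise = np.random.default_rng(0).normal(size=SAMPLE_RATE)
```

A real system would run such a detector on short sliding windows and fuse its output with the camera, radar, and LiDAR tracks, so a siren heard before the vehicle is visible could still trigger a yield maneuver.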
The other drivers wouldn't have noticed anything unusual as the two sleek limousines with German license plates joined the traffic on France's Autoroute 1. But what they were witnessing -- on that sunny fall day in 1994 -- was something many of them would have dismissed as just plain crazy. It had taken a few phone calls from the German car lobby to get the French authorities to give the go-ahead. But here they were: two gray Mercedes 500 SELs, accelerating up to 130 kilometers per hour, changing lanes and reacting to other cars -- autonomously, with an onboard computer system controlling the steering wheel, the gas pedal and the brakes. Decades before Google, Tesla and Uber got into the self-driving car business, a team of German engineers led by a scientist named Ernst Dickmanns had developed a car that could navigate French commuter traffic on its own. The story of Dickmanns' invention, and how it came to be all but forgotten, is a neat illustration of how technology sometimes progresses: not in small steady steps, but in booms and busts, in unlikely advances and inevitable retreats -- "one step forward and three steps back," as one AI researcher put it. It's also a warning of sorts, about the expectations we place on artificial intelligence and the limits of some of the data-driven approaches being used today.
Yesterday afternoon, I rode an autonomous shuttle down a short section of Broadway in the heart of Times Square, and it was easily the most boring part of my day. I'm not saying that because my life is particularly exciting, either. The trip was boring because everything inside the Coast Autonomous P-1 worked exactly the way it was supposed to: The shuttle crawled up to a barricade on 47th Street, paused for a bit, and scooted back in the opposite direction toward 48th. In this case, the vehicle wasn't completely autonomous -- Coast CTO Pierre Lefevre manually started each leg of the trip with an Xbox Elite controller -- but the P-1 navigated its surroundings all on its own. That short trip was one of many small-scale tests the company has put on over the years, all of which speak to the commercial viability of tiny, driverless buses.
Boeing says it's aiming to create a traffic management system for drones that makes use of artificial intelligence, blockchain technology -- and one of the companies in its investment portfolio. SparkCognition will be Boeing's partner in the traffic management project. Last year, the Texas-based AI company benefited from a $32.5 million investment round that included funding from Boeing HorizonX Ventures. Boeing is also creating a new business group, known as Boeing NeXt, to leverage the company's research and development activities and investments in areas such as autonomous flight, smart cities, advanced propulsion and other parts of the wider transportation ecosystem. "We're at a point in history where technological advances and societal trends are converging to demand bold solutions and a different way to travel," Greg Hyslop, Boeing's chief technology officer, said today in a statement issued to coincide with this week's Farnborough International Airshow near London.