If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Not surprisingly, autonomy and electrification dominated transportation news at CES 2018. Toyota introduced its e-Palette concept mobility solution (which proceeded to dominate Engadget's Best of CES awards), while Silicon Valley startup Robomart unveiled plans to bring produce shopping to your front door. Automakers also announced a slew of upcoming self-driving technologies, ranging from Alexa integration and automated emergency brakes to Level 5 personal transport pods that do away with the steering wheel altogether. But even among the most adventurous concepts floated at this year's trade show, Nissan's vision for the future stands out: one in which driver and vehicle could someday work in perfect harmony, thanks to a brain-machine interface that instantly translates your thoughts into the vehicle's actions.
There's one thing that keeps Toyota CEO Akio Toyoda up at night. It's not a traditional car company like Honda, Ford or Nissan. Or what he's going to have for breakfast the next day. It's technology juggernauts like Facebook, Google and Apple and what might happen when they decide to enter the automotive industry proper. Will the company be ready?
If the Department of Transportation grants GM's latest Safety Petition, the automaker will be able to deploy its no-steering-wheel, pedal-less autonomous car next year. GM has not only revealed what its Level 4 self-driving vehicle will look like -- in a video you can watch after the break -- but also announced that it filed a Safety Petition to be able to deploy its completely driverless version of the Chevy Bolt, called the Cruise AV, in 2019. The company describes it as "the first production-ready vehicle built from the start to operate safely on its own, with no driver, steering wheel, pedals or manual controls." As you can see above, the Cruise AV looks much different from the self-driving Chevy Bolts GM is testing in California. It has no controls whatsoever, not even buttons you can push -- it treats you 100 percent as a passenger, no matter where you sit.
Shopping for fresh produce online has always been a bit of a gamble since you're not actually selecting the fruits and veggies yourself. Santa Clara, California-based startup Robomart aims to change that by bringing online produce shopping to your front door. The company, as part of NVIDIA's deep learning/AI "Inception Program", has developed what is essentially a self-driving bodega on wheels. The concept relies on a Sprinter Van-sized delivery vehicle outfitted with an array of LiDAR, radar and cameras, as well as a CAN motion control system and enough route planning and obstacle avoidance software to notch Level 5 autonomy -- that's the highest level a self-driving vehicle can achieve, requiring no human driver whatsoever. What's more, the vehicles use a fully electric drivetrain with an estimated 80-mile range and a 25 mph top speed, and come equipped with the HEVO wireless charging system.
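The autonomy levels cited throughout these announcements -- Level 2 for Intel and Mobileye, Level 4 for GM's Cruise AV, Level 5 for Robomart and the steering-wheel-free pods -- come from the SAE J3016 scale. As a quick reference (a paraphrased sketch of the standard, not taken from any company's materials), the scale can be summarized as a simple lookup:

```python
# Illustrative one-line summaries of the SAE J3016 driving-automation
# levels referenced in these CES announcements. Paraphrased for brevity;
# see the standard itself for the full definitions.
SAE_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: steering OR speed support (e.g. adaptive cruise)",
    2: "Partial automation: steering AND speed support, driver must supervise",
    3: "Conditional automation: system drives, human must take over on request",
    4: "High automation: no human needed within a limited operating domain",
    5: "Full automation: no human driver needed anywhere, under any conditions",
}

def describe(level: int) -> str:
    """Return a short description for an SAE automation level (0-5)."""
    return SAE_LEVELS[level]

print(describe(5))
```

The practical gap between Level 4 and Level 5 is the operating domain: GM's Cruise AV can drop the steering wheel because it only has to handle mapped areas and conditions, whereas a true Level 5 vehicle carries no such restriction.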
Developing self-driving vehicle technologies is hard -- just ask Google, or Uber, or Google that other time, or that one bus from Vegas. That's why a number of companies have been working to virtualize the development cycle so that untested technologies can crash and burn safely as their bugs are worked out. Among those companies is Optis, which announced on Tuesday that it will be partnering with two other firms to make virtual prototyping more accessible to the industry. The first is LeddarTech, which developed a signal-processing technology used in solid-state LiDAR. LeddarTech and Optis will be working to create a vehicle simulation system so that vehicle manufacturers and OEMs will be able to virtually prototype and test their LiDAR systems.
Now that Intel's Mobileye acquisition is complete, the tech titan is ready to get the ball rolling. In fact, we might see semi-autonomous vehicles powered by Mobileye's Road Experience Management (REM) system as soon as this year. Intel has signed contracts with 11 carmakers, which will use Mobileye's Level 2 autonomous driving tech on vehicles slated for release throughout 2018 and 2019. This particular technology will add semi-autonomous features, such as simple braking, steering and acceleration, to cars. It's worth noting, though, that REM was created to make fully autonomous cars possible, and that's still Intel's ultimate goal.
The days of owning your own car may soon be coming to an end, what with the growing popularity of ride sharing matched with the promise of autonomous vehicle technology. Should that self-driving car service future come about, Harman International wants to be ready with audio and connected vehicle systems that match whatever you're riding in to your specific mood and tastes. On Monday, Harman announced its new Configurable Entertainment and Moodscape experiences, built using the company's new AudioworX development platform. Configurable Entertainment is geared towards commercial ridesharing companies like Uber and Lyft and will allow them to "offer multiple in-car brand and entertainment experiences through a single set of in-vehicle hardware," according to the release. To that end, the company also announced that it is developing shape-shifting speakers.
At CES today, Panasonic announced a partnership with Amazon that will bring Alexa to your car. Dubbed Alexa Onboard, it works with Panasonic's Skip Gen IVI technology and is meant to make life with your virtual assistant more seamless as you move from your home to your vehicle. Alexa Onboard's functions were demonstrated on stage, and the most interesting aspect is that it will still work offline. As expected, Alexa answers queries like what the weather's like, how far away a destination is and where the nearest gas station is. You'll also be able to take advantage of Alexa's other skills, such as controlling your smart home devices, receiving news briefings and ordering from meal delivery services.
One of the reasons that automakers are pursuing self-driving cars is that, while they'll initially be too expensive to put up for sale to individuals, ride-hailing services (aka mobility) offer the technology the chance to mature in a way that's financially viable. Aptiv, a tier-one supplier of autonomous technology, is working hard to make sure its system is in a lot of those vehicles, and at CES this year, it showed off how that system might actually work in conjunction with Lyft. The cars (modified BMWs) were hailed by the Lyft app -- not the one on my phone, but one used by the team showing off the vehicles. Once the car arrived in the staging area near the Las Vegas Convention Center, our driver drove it out of the parking lot onto the street and put it into autonomous mode with Caesars Palace as our destination.
As the need for more powerful processors in emerging self-driving and semi-autonomous cars grows, NVIDIA is making sure it stays ahead of the trend. At CES, the GPU-building powerhouse detailed the Xavier SoC for AI car systems that it first announced at last year's CES. The Xavier has over 9 billion transistors with a custom 8-core CPU, a 512-core Volta GPU, an 8K HDR video processor, a deep-learning accelerator and new computer-vision accelerators. NVIDIA says the SoC can perform 30 trillion operations per second using only 30 watts of power, which the company claims is 15 times more efficient than the previous architecture.
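Those headline numbers boil down to a simple performance-per-watt figure. As a back-of-the-envelope sketch (our own arithmetic on NVIDIA's quoted specs, not from NVIDIA's materials):

```python
# NVIDIA's quoted Xavier figures: 30 trillion operations per second
# (30 TOPS) drawing 30 watts of power.
ops_per_second = 30e12
power_watts = 30.0

# Efficiency works out to one trillion operations per second per watt.
efficiency = ops_per_second / power_watts
print(f"{efficiency:.0e} ops/s per watt")

# NVIDIA's claimed 15x efficiency gain implies the previous
# architecture delivered roughly 1/15 of that figure per watt.
previous_efficiency = efficiency / 15
```

At 1 TOPS per watt, a figure like this is what lets a deep-learning inference stack run inside a car's power and thermal budget rather than requiring a trunk full of server hardware.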