Making Autonomous Vehicles Safer


While self-driving vehicles are beta-tested on some public roads in real traffic, the semiconductor and automotive industries are still getting a grip on how to test and verify that vehicle electronics systems work as expected. Testing can be high stakes, especially when done in public. Some of the predictions about how humans will interact with autonomous vehicles (AVs) on public roads are already coming true, but human creativity is endless. There have been attacks on Waymo test vehicles in Arizona, a DUI arrest of a Tesla driver sleeping at 70 mph on a freeway, and hacks using oranges and aftermarket gadgets to trick Tesla's Autopilot into thinking the driver's hands are on the wheel. But are those unsafe human behaviors any more dangerous than the drumbeat of technology hype, unrealistic marketing, and a lack of regulatory teeth in testing AVs on public roads, in the factory, and in the design lab?

Xilinx and ZF to Jointly Enable AI Innovation and Autonomous Driving Development


CES 2019 -- Xilinx, Inc. (NASDAQ: XLNX), the leader in adaptive and intelligent computing, and ZF Friedrichshafen AG (ZF), a global leader and Tier-1 automotive supplier in driveline and chassis technology as well as active and passive safety technology, today announced a new strategic collaboration in which Xilinx technology will power ZF's highly advanced artificial intelligence (AI)-based automotive control unit, called the ZF ProAI, to enable automated driving applications. ZF is using the Xilinx Zynq UltraScale+ MPSoC platform to handle real-time data aggregation, pre-processing, and distribution, as well as to provide compute acceleration for the AI processing in ZF's new AI-based electronic control unit. ZF selected this adaptable, intelligent platform because it provides the processing power, scalability, and flexibility essential for the ZF ProAI platform to be customized to each of its customers' unique requirements. "The unique selling proposition of the ZF ProAI is its modular hardware concept and open software architecture. Our aim is to provide the widest possible range of functions in the field of autonomous driving," explained Torsten Gollewski, head of ZF Advanced Engineering and general manager of Zukunft Ventures GmbH.

8 Things to Expect From CES 2019: AI, 5G, 8K, and More


Every January, more than 150,000 people make a pilgrimage to Las Vegas for the annual event known as CES. Organized by the Consumer Technology Association, CES is one of the world's largest technology trade shows. It's an exhilarating and nauseating display of gadgetry, a kaleidoscope eye into what's to come: blinking smart lights, liquid-looking displays, hovering drones, yogic phones, driver-free vehicles, newfangled wireless protocols, and intangible technologies that all come with the promise of making life better. Except the tech being shown at CES really isn't just for nerds. We carry it with us on our daily commutes; we talk to it in our kitchens and living rooms; we take it to bed at night.

Enabling Faster, More Capable Robots With Real-Time Motion Planning

IEEE Spectrum Robotics Channel

This is a guest post. The views expressed here are solely those of the authors and do not represent positions of IEEE Spectrum or the IEEE. Despite decades of expectations that we will have dexterous robots performing sophisticated tasks in the house and elsewhere, the use of robots remains painfully limited, largely due to insufficient motion-planning performance. Motion planning is the process of determining how to move a robot, or autonomous vehicle, from its current configuration (or pose) to a desired goal configuration: For example, how to reach into a fridge to grab a soda can while avoiding obstacles, like the other items in the fridge and the fridge itself. Until recently, this critical process has been implemented in software running on high-performance commodity hardware.
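The motion-planning problem the authors describe can be illustrated with a toy example: a breadth-first search over a 2D occupancy grid, finding a collision-free path from a start cell to a goal cell. Real planners work in high-dimensional configuration spaces and typically use sampling-based methods, but the core structure, exploring reachable configurations while avoiding obstacles, is the same. The grid encoding and function names below are illustrative assumptions, not taken from the article.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid:  list of strings; '#' marks an obstacle, '.' free space.
    start, goal: (row, col) tuples.
    Returns a shortest list of cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk parent links back to the start to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable
```

Even this tiny planner hints at why performance matters: the search space grows exponentially with the robot's degrees of freedom, which is why the authors argue for dedicated motion-planning hardware.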

EyeSight raises $15 million for AI-powered in-car monitoring


In 2013 alone, distracted driving claimed the lives of more than 3,154 people and injured 424,000. Today, about nine people are killed each day in the United States by an inattentive person behind the wheel. EyeSight, a Tel Aviv, Israel-based artificial intelligence (AI) and hardware startup, promises to eradicate the distracted driving problem once and for all -- at least in cars equipped with its hardware. To further that mission, it today announced a $15 million funding round led by Jebsen Capital, with participation from Arie Capital and Mizrahi Tefahot. EyeSight's tech leans on a combination of cameras and artificial intelligence to monitor driver activity.

Elon Musk eyes early 2019 release for Tesla's custom AI chip


Elon Musk has announced that Tesla's new custom AI chip is about six months away from being installed in new production cars. The CEO said that the chip, which was confirmed as being in development last December, will offer "somewhere between [a] 500% & 2000%" increase in its vehicles' autonomous driving performance. Existing Tesla owners who have already paid for full self-driving will be offered this "hardware 3" update for Autopilot free of charge. The announcement comes as v9 of Tesla's onboard software has already reportedly brought big improvements to its neural network, with a unified camera network that more seamlessly integrates all eight of the car's cameras. Musk has suggested that this software update delivered an approximate 400 percent increase in its capabilities.

Teraki wins backing from Infineon for its automotive AI technology


Teraki announced that Infineon Technologies will use its latest AI edge-processing software in a family of automotive microcontrollers intended to improve the safety of autonomous vehicles. The Berlin, Germany-based startup said that its software is designed to process large amounts of automotive sensor data with machine learning, achieving up to 10 times the processing speed using only existing automotive hardware. Normally, the constrained hardware environment of an automobile prohibits the processing of the large amounts of data that autonomous vehicle systems require without specialist chips. "Automobiles are adding ever larger amounts of sensors to enable autonomous vehicles, and this explosion in data is a problem because of latency," said Daniel Richart, CEO and co-founder of Teraki.
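The latency problem Richart describes is usually attacked by reducing data at the edge before it ever leaves the sensor. One generic technique is send-on-delta filtering: transmit a reading only when it differs meaningfully from the last transmitted value. The sketch below illustrates that idea; it is a minimal, assumed example of edge data reduction in general, not Teraki's proprietary algorithm.

```python
def reduce_samples(samples, threshold):
    """Send-on-delta filter: keep a sensor reading only when it
    differs from the last transmitted value by more than `threshold`.

    For slowly changing signals this can cut the data volume by an
    order of magnitude while preserving the significant transitions.
    """
    if not samples:
        return []
    kept = [samples[0]]  # always transmit the first reading
    for value in samples[1:]:
        if abs(value - kept[-1]) > threshold:
            kept.append(value)
    return kept
```

A downstream ML model then runs on the reduced stream, trading a bounded reconstruction error for lower bandwidth and latency.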

How startups and small hardware companies can take on big players like Intel in the booming sensor market


Oregon-based FLIR Systems, a sensor company that makes thermal imaging products for a variety of applications, has made a large strategic investment in CVEDIA, a Singapore-based machine learning and AI startup. Unless you're a sensor geek, that may seem like niche news. But the move is indicative of a new reality in an increasingly competitive sensor market and amid a proliferation of AI and machine learning technologies: Great hardware on its own is no longer enough for smaller companies, but even startups can thrive against major players if they pick a lane and package their hardware with smart AI engines. FLIR makes thermal imaging sensors for enterprise applications, such as field inspection and firefighting. Its sensors come in a variety of packages, including tablets and as smartphone add-ons.

Siri, get my iCar: Is Apple making a cool new ride or just dabbling with the techie parts?


Apple has become the world's first publicly traded company to be valued at $1 trillion, the financial fruit of stylish technology that has redefined what we expect from our gadgets. Its new 175-acre "spaceship" campus, dubbed Apple Park, was designed by Lord Norman Foster and cost roughly $5 billion; it will house 12,000 employees in more than 2.8 million square feet of office space, with nearly 80 acres of parking to accommodate 11,000 cars. SAN FRANCISCO – In a few weeks, Apple will unveil its newest iPhone.

Automotive Artificial Intelligence Market to Reach $26.5 Billion by 2025 - Novus Light Today


The automotive industry is among the sectors at the forefront of using artificial intelligence (AI) to mimic, augment and support the actions of humans, while simultaneously leveraging the advanced reaction times and pinpoint precision of machine-based systems. Indeed, today's semi-autonomous vehicles and the fully autonomous vehicles of the future will rely heavily on AI systems. However, according to a new report from Tractica, while autonomous driving will be a leading impetus for AI spending in the automotive industry, the use cases for AI in vehicles are in fact much broader. Key applications encompass automotive human machine interaction (HMI) functionality like voice/speech recognition, driver face analytics, emotion recognition and gesture recognition; maintenance and safety applications like predictive maintenance, automated on-road customer service and vehicle network and data security; and personalized services in cars, among many others. All told, across 15 such AI use cases, Tractica forecasts that revenue from automotive AI software, hardware and services will increase from $2.0 billion in 2018 to $26.5 billion by 2025, representing a compound annual growth rate (CAGR) of 46.9%.