Results


Waymo and Intel Combine to Power the Future of Self-Driving Cars

WIRED

For months now, major companies have been hooking up--Uber and Daimler, Lyft and General Motors, Microsoft and Volvo--but Intel CEO Brian Krzanich's announcement on Monday that the giant chipmaker is helping Waymo, Google's self-driving car project, build robocar technology registers as some seriously juicy gossip. Krzanich said Monday that Waymo's newest self-driving Chrysler Pacificas, delivered last December, use Intel technology to process what's going on around them and make safe decisions in real time. And last year, Google announced it had created its own specialized chip that could help AVs recognize common driving situations and react efficiently and safely. "Our self-driving cars require the highest-performance compute to make safe driving decisions in real-time," Waymo CEO John Krafcik said in a statement.


Nvidia hits another record high as AI takes center stage

#artificialintelligence

"Our sense is management believes that investors still severely underestimates the impact of AI and the size of the potential market," Evercore analyst C J Muse wrote in a note on Friday after hosting Nvidia's management. Nvidia has been rapidly expanding into newer technologies including artificial intelligence, cloud computing and self-driving cars, away from designing graphics-processing chips for which the company was known for. Bank of America Merrill Lynch analyst Vivek Arya listed Nvidia a "top pick", basing his view "on (Nvidia's) underappreciated transformation from a traditional PC graphics vendor, into a supplier into high-end gaming, enterprise graphics, cloud, accelerated computing and automotive markets," according to Seeking Alpha. In May, Nvidia announced a partnership with Toyota Motor Corp through which the Japanese car maker would use Nvidia's AI technology to develop self-driving vehicle systems planned for the next few years.


Drones and Robots Are Taking Over Industrial Inspection

MIT Technology Review

The effort shows how low-cost drones and robotic systems--combined with rapid advances in machine learning--are making it possible to automate whole sectors of low-skill work. Avitas uses drones, wheeled robots, and autonomous underwater vehicles to collect images required for inspection from oil refineries, gas pipelines, coolant towers, and other equipment. Nvidia's system employs deep learning, an approach that involves training a very large simulated neural network to recognize patterns in data, and which has proven especially good for image processing. It is possible, for example, to train a deep neural network to automatically identify faults in a power line by feeding in thousands of previous examples.
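
The recipe behind such a fault detector is conventional supervised image classification. As a rough sketch only, not Avitas Systems' or Nvidia's actual pipeline, the following fine-tunes a pretrained network on a hypothetical folder of labelled power-line photos; the dataset path, model choice and hyperparameters are all illustrative assumptions.

```python
# Minimal sketch: fine-tune a pretrained CNN to label inspection images as
# "fault" / "no_fault". Directory layout and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical dataset: powerline_images/train/{fault,no_fault}/*.jpg
train_set = datasets.ImageFolder("powerline_images/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the final layer
# with a two-class head (fault vs. no fault).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):  # a handful of epochs, purely for illustration
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```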


Is Your Tesla Eligible For HW 2.5 Update For Full Autopilot?

International Business Times

However, the company did not publicly announce the update, and in less than a year it has already started equipping all vehicles in production, including the Model 3, with the new HW 2.5 hardware, Electrek reported Wednesday. According to the Electrek report, the company opted for the upgrade because HW 2.0 was not capable of enabling Level 5 autonomy -- fully autonomous driving with no need for human intervention. Tesla's vehicles are based on Nvidia's Drive PX2 platform for autonomous driving. The company is also getting its cars ready for the day it can actually issue an over-the-air software update and enable full autonomy on its vehicles.


Driverless cars: Tim Cook says Apple AI is applicable to more than just cars

#artificialintelligence

The firms have established a startup support programme at Volkswagen's Data Lab to provide technical and financial support for international startups developing machine learning and deep learning applications for the automotive industry. Volvo Cars, Autoliv and Zenuity will use Nvidia's AI car computing platform as the foundation for their own advanced software development. Nvidia has partnered with automotive supplier ZF and camera perception software supplier Hella to bring AI technology to New Car Assessment Program (NCAP) safety certification, a step toward the mass deployment of self-driving vehicles. The firms will use Nvidia's Drive AI platform to develop software for scalable, advanced driver assistance systems that connect their imaging and radar sensor technologies to autonomous driving functionality.


Volvo is working with NVIDIA to develop self-driving car tech by 2021

Engadget

Volvo and NVIDIA have announced that they're teaming up with Zenuity to develop the next generation of self-driving vehicle systems, which will be built on NVIDIA's Drive PX AI module. What's more, NVIDIA hopes that integrating additional autonomous safety features like automatic braking will help increase the scores of AI-equipped vehicles taking the DOT's New Car Assessment Program (NCAP) crash test safety certification. With an increasing number of vehicles trading data with each other as they travel, why not have them talk to the infrastructure around them as well? "We'll be able to predict areas of potential congestion and really work with infrastructure, vehicles and navigation systems to optimize traffic flow and ultimately reduce congestion."


Watch Roborace's driverless car zooming around a track

Daily Mail

Roborace CEO Denis Sverdlov said the demonstration was a major milestone in the development of autonomous racing: 'Roborace is the only company in the world right now testing driverless technologies on city streets without a human in the car – this is something truly unique.' The car's technologies include five lidars, two radars, 18 ultrasonic sensors, two optical speed sensors, six AI cameras, GNSS positioning and a powerful Nvidia Drive PX2 'brain' processor, capable of 24 trillion AI operations per second. Roborace first revealed the stunning 4.8-metre-long (15.7 ft), two-metre-wide (6.5 ft) vehicle at February's Mobile World Congress in Barcelona. Roborace provides an open AI platform for companies to develop their own driverless software and push the limits in a safe environment.


World's first DRIVERLESS race car Roborace hits the track

Daily Mail

The car's technologies include five lidars, two radars, 18 ultrasonic sensors, two optical speed sensors, six AI cameras, GNSS positioning and a powerful Nvidia Drive PX2 'brain' processor, capable of 24 trillion AI operations per second. Roborace first revealed the stunning 4.8-metre-long (15.7 ft), two-metre-wide (6.5 ft) vehicle at February's Mobile World Congress in Barcelona. The futuristic vehicle completed a lap of the Paris ePrix circuit ahead of the city's 2017 Formula E race, which took place on Saturday. The public demonstration saw the car whip around 14 turns of the almost 2 kilometre (1.2 mile) track, driven entirely by AI and sensors. Mr Sverdlov said: 'This is a huge moment for Roborace as we share the Robocar with the world and take another big step in advancing driverless electric technology.' Mr Simon said: 'Roborace opens a new dimension where motorsport as we know it meets the unstoppable rise of artificial intelligence.'


How NVIDIA's Neural Net Makes Decisions

#artificialintelligence

With NVIDIA PilotNet, we created a neural-network-based system that learns to steer a car by observing what people do. What makes BB8 an AI car, and showcases the power of deep learning, is the deep neural network that translates images from a forward-facing camera into steering commands. This visualization shows us that PilotNet focuses on the same things a human driver would, including lane markers, road edges and other cars. Besides PilotNet, which controls steering, cars will have networks trained and focused on specific tasks like pedestrian detection, lane detection, sign reading, collision avoidance and many more.
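
NVIDIA has described PilotNet's layer layout publicly (a normalization layer, five convolutional layers and several fully connected layers applied to a 66x200 camera frame). The sketch below is a simplified PyTorch rendering of that style of end-to-end steering network, not NVIDIA's production code; the training details implied in the comments are assumptions.

```python
# Simplified PilotNet-style network: camera image in, steering command out.
# Layer shapes follow the published PilotNet description (66x200x3 input);
# loss and data choices are illustrative assumptions.
import torch
import torch.nn as nn

class PilotNetLike(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3), nn.ReLU(),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 1 * 18, 100), nn.ReLU(),  # 64x1x18 feature map for 66x200 input
            nn.Linear(100, 50), nn.ReLU(),
            nn.Linear(50, 10), nn.ReLU(),
            nn.Linear(10, 1),                        # predicted steering command
        )

    def forward(self, x):                            # x: (batch, 3, 66, 200)
        return self.regressor(self.features(x))

model = PilotNetLike()
frame = torch.randn(1, 3, 66, 200)                   # stand-in for a camera frame
steering = model(frame)
```

Trained with a regression loss against recorded human steering angles, a network like this learns the image-to-steering mapping described above; the task-specific networks mentioned for pedestrian detection, lane detection and sign reading would be separate models.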


Bosch and Nvidia create an AI supercomputer for self-driving tech

#artificialintelligence

The AI onboard computer is expected to guide self-driving cars through even complex traffic situations, or ones that are new to the car. "Automated driving makes roads safer, and artificial intelligence is the key to making that happen." Driverless cars are expected to be part of everyday life in the next decade. Bosch's AI onboard computer can recognize pedestrians or cyclists. As a result, a self-driving car with AI can recognize and assess complex traffic situations, such as when an oncoming vehicle executes a turn, and factor these into its own driving.
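
Bosch has not published how its onboard computer detects road users, but the basic capability of picking out pedestrians and cyclists in a camera frame can be illustrated with an off-the-shelf detector. The sketch below uses a pretrained COCO object detector from torchvision and simply filters for the 'person' and 'bicycle' classes, with bicycle detections standing in for cyclists; the image file and confidence threshold are assumptions, and this is a generic illustration rather than Bosch's or Nvidia's software.

```python
# Generic pedestrian/cyclist recognition sketch using a pretrained COCO
# detector; an illustration only, not Bosch's or Nvidia's onboard software.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    FasterRCNN_ResNet50_FPN_Weights,
    fasterrcnn_resnet50_fpn,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

# "street_scene.jpg" is a hypothetical dashcam frame.
image = read_image("street_scene.jpg").float() / 255.0

with torch.no_grad():
    detections = model([image])[0]

# COCO class ids: 1 = person, 2 = bicycle (used here as a proxy for cyclists).
for label, score, box in zip(detections["labels"],
                             detections["scores"],
                             detections["boxes"]):
    conf = score.item()
    if label.item() in (1, 2) and conf > 0.5:
        kind = "pedestrian" if label.item() == 1 else "cyclist"
        print(f"{kind} detected with confidence {conf:.2f} at {box.tolist()}")
```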