NVIDIA to buy autonomous vehicle mapping company DeepMap

Engadget

NVIDIA is to acquire DeepMap, a company that makes high-definition mapping technology for self-driving cars. "DeepMap is expected to extend our mapping products, help us scale worldwide map operations and expand our full self-driving expertise," said NVIDIA VP Ali Kani. DeepMap provides maps with high levels of precision. NVIDIA points out that maps accurate to within a few meters are fine for turn-by-turn GPS directions, but autonomous vehicles require greater accuracy. "They must operate with centimeter-level precision for accurate localization, [so that] an AV can locate itself in the world," NVIDIA wrote in a blog post.


Hyundai taps Nvidia for future fleet AI, infotainment services

ZDNet

Hyundai has announced that all future Hyundai, Kia, and Genesis vehicles will utilize Nvidia's in-vehicle artificial intelligence (AI) and infotainment platform. The South Korean automaker said on Monday that the firm's "entire fleet" will eventually feature Nvidia's technology. Nvidia's Drive platform has been designed with smart and self-driving vehicles in mind. Nvidia Drive is an end-to-end solution combining AI, imaging, and computing, as well as wireless connectivity -- a must for today's infotainment dashboards, which are expected to provide driver assistance, maps, streaming services, and more. Nvidia says the solution "delivers everything needed to develop autonomous vehicles at scale," and Hyundai appears to agree.


Nvidia Is Not a 'Videogame Company.' Its Founder Explains What It Really Is.

#artificialintelligence

Jensen Huang: This is the autonomous vehicle computer and all of the software and the infrastructure that goes along with it. We announced a partnership that relates to every single Mercedes-Benz car starting in 2024. We're going to build one architecture that spans the entire fleet. We are going to develop applications and then offer them to their customers and share the economics. Whereas a chip could go for a couple of hundred dollars, autonomous driving applications could go for several thousand dollars.


NVIDIA and Mercedes partner to create a next-gen car computer

Engadget

During a joint press conference held Wednesday, NVIDIA and Mercedes-Benz announced that they are teaming up to develop a "revolutionary in-vehicle computing system" for the automaker's next generation of luxury automobiles in 2024. Touted as "the most sophisticated and advanced computing architecture ever deployed in an automobile," per an NVIDIA press release, the new software system will enable Level 2 and Level 3 driving autonomy (roughly on par with, and exceeding, the current abilities of Tesla's Autopilot, respectively), as well as Level 4 parking autonomy. That means the vehicle will, by and large, be able to fit itself into parking stalls without help from the human driver. A human will still need to be on hand in case things go catastrophically sideways, but under normal conditions there won't be much call for them to intercede. What's more, the computing system, which is based on NVIDIA's DRIVE platform, will be able to "automate driving of regular routes from address to address," according to the release.


Self-Driving Sector Contends Its Cars Can Prevent Many More Crashes Than Insurance Study Says

U.S. News

Jack Weast, vice president of autonomous vehicle standards at Intel Corp's Mobileye, in an interview on Friday said the auto industry was assembling a vast list of likely road scenarios and human behavior that every driverless car should be able to navigate safely. Government agencies and insurance companies are part of that process, Weast said.


NVIDIA Announces DRIVE AGX Orin, One of the Most Advanced Platforms for Autonomous Vehicles

#artificialintelligence

At Nvidia's GTC (GPU Technology Conference) in China this week, the chipmaker unveiled the latest addition to its NVIDIA DRIVE platform, the AGX Orin. Orin is an advanced processor for autonomous vehicles and robots, the result of four years of R&D investment by Nvidia. The new platform is powered by a new system-on-a-chip (SoC) consisting of 17 billion transistors. The Orin SoC integrates NVIDIA's next-generation GPU architecture and Arm Hercules CPU cores, combined with new deep learning and computer vision accelerators, to deliver 200 trillion operations per second (200 TOPS), which Nvidia says is roughly 7 times the performance of the company's previous-generation Xavier SoC, rated at 30 TOPS. Orin can process over 200 gigabytes of data per second while drawing just 60 to 70 watts of power, according to Danny Shapiro, Nvidia's senior director of automotive.
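The quoted figures reduce to simple arithmetic. As a quick sanity check (a sketch using only the numbers reported in the article, not official NVIDIA benchmarks):

```python
# Sanity-check of the Orin figures quoted above (all values from the article).
orin_tops = 200     # trillion operations per second, AGX Orin
xavier_tops = 30    # previous-generation Xavier SoC

# 200 / 30 is about 6.7x, which Nvidia rounds to "7 times better performance".
speedup = orin_tops / xavier_tops

# Quoted power envelope of 60-70 W implies roughly 2.9-3.3 TOPS per watt.
power_watts = (60, 70)
efficiency = [orin_tops / w for w in power_watts]

print(f"speedup: {speedup:.1f}x, efficiency: {efficiency[1]:.1f}-{efficiency[0]:.1f} TOPS/W")
```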


DRIVE Labs: Detecting the Distance | NVIDIA Blog

#artificialintelligence

Editor's note: This is the latest post in our NVIDIA DRIVE Labs series, which takes an engineering-focused look at individual autonomous vehicle challenges and how NVIDIA DRIVE addresses them. The problem: judging distances is anything but simple. We humans, of course, have two high-resolution, highly synchronized visual sensors -- our eyes -- that let us gauge distances using stereo-vision processing in our brain. A comparable dual-camera stereo vision system in a self-driving car, however, would be very sensitive to synchronization. If the cameras are even slightly out of sync, the result is what's known as "timing misalignment," which produces inaccurate distance estimates.
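The sensitivity described above can be illustrated with the textbook pinhole stereo model, where distance is inversely proportional to pixel disparity between the two cameras. This is a minimal sketch with assumed camera parameters (focal length and baseline are hypothetical), not NVIDIA's actual pipeline:

```python
# Idealized pinhole stereo depth: distance = focal_px * baseline_m / disparity_px.
# Illustrates why a small disparity error (e.g. from timing misalignment between
# the two cameras) produces a large distance error at long range.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance (meters) to a point from its pixel disparity between two cameras."""
    return focal_px * baseline_m / disparity_px

focal = 1000.0    # focal length in pixels (assumed)
baseline = 0.3    # 30 cm between the cameras (assumed)

# A point 60 m ahead yields a disparity of 5 px under these parameters:
d_true = stereo_depth(focal, baseline, 5.0)    # 60.0 m

# A mere 1 px disparity error shifts the estimate to 75 m -- a 25% error:
d_skewed = stereo_depth(focal, baseline, 4.0)  # 75.0 m
```

Because disparity shrinks with distance, the same pixel-level error grows ever more damaging for faraway objects, which is exactly where an autonomous vehicle needs reliable range estimates.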


Lincoln Laboratory's new artificial intelligence supercomputer is the most powerful at a university

#artificialintelligence

The new TX-GAIA (Green AI Accelerator) computing system at the Lincoln Laboratory Supercomputing Center (LLSC) has been ranked as the most powerful artificial intelligence supercomputer at any university in the world. The ranking comes from TOP500, which publishes a list of the top supercomputers in various categories biannually. The system, which was built by Hewlett Packard Enterprise, combines traditional high-performance computing hardware -- nearly 900 Intel processors -- with hardware optimized for AI applications -- 900 Nvidia graphics processing unit (GPU) accelerators. "We are thrilled by the opportunity to enable researchers across Lincoln and MIT to achieve incredible scientific and engineering breakthroughs," says Jeremy Kepner, a Lincoln Laboratory fellow who heads the LLSC. "TX-GAIA will play a large role in supporting AI, physical simulation, and data analysis across all laboratory missions."


Volvo Group Selects NVIDIA to Transform Trucking | NVIDIA Blog

#artificialintelligence

Volvo Group and NVIDIA are delivering autonomy to the world's transportation industries, using AI to revolutionize how people and products move all over the world. At its headquarters in Gothenburg, Sweden, Volvo Group announced Tuesday that it's using the NVIDIA DRIVE end-to-end autonomous driving platform to train, test and deploy self-driving AI vehicles, targeting public transport, freight transport, refuse and recycling collection, construction, mining, forestry and more. By injecting AI into these industries, Volvo Group and NVIDIA can create amazing new vehicles and deliver more productive services. The two companies are co-locating engineering teams in Gothenburg and Silicon Valley. Together, they will build on the DRIVE AGX Pegasus platform for in-vehicle AI computing and utilize the full DRIVE AV software stack for 360-degree sensor processing, perception, map localization and path planning.