Cars are increasingly driven by code, from Advanced Driver Assistance Systems (ADAS) to infotainment systems. Yet as these features and capabilities evolve, the software that must be developed, managed, deployed and updated becomes increasingly complex. Arm is leading a collaborative effort to build a common development framework for the software-defined car to solve this problem. It's working with major technology and automotive partners to develop standards, software, developer resources, and specialized processing platforms designed for the safety and real-time needs of automotive applications. "We strongly believe a software-defined approach can help to address this growing software complexity," Chet Babla, Arm's VP of automotive, told reporters. "But without a common development framework, the industry can't maximize supply chain efficiencies or accelerate innovation."
At Nvidia's GTC conference in China this week, the chipmaker unveiled the latest addition to its NVIDIA DRIVE platform, the AGX Orin. Orin is an advanced processor for autonomous vehicles and robots, the result of a four-year R&D investment by Nvidia. The new platform is powered by a new system-on-a-chip (SoC) consisting of 17 billion transistors. The Orin SoC integrates NVIDIA's next-generation GPU architecture and Arm Hercules CPU cores, combined with new deep learning and computer vision accelerators that can deliver 200 trillion operations per second (200 TOPS) — roughly seven times the performance of the company's previous-generation Xavier SoC, which delivers 30 TOPS. Orin can transmit over 200 gigabytes of data per second while drawing just 60 to 70 watts of power, according to Danny Shapiro, Nvidia's senior director of automotive.
Autonomous vehicles depend on historical and real-time data, without which artificial intelligence and machine learning would be impossible. They aren't plug-and-play, because software developers, simulators and data modelers cannot possibly predict every potential scenario. Simulators, and to a degree connected and autonomous vehicles (CAVs) themselves, are also only as good as their algorithms and the data fed into them. Consequently, AI and machine learning in autonomous vehicles can be limited, and nobody should expect them to instantly cope with every potential scenario. Their development has to be approached with caution to prevent unintended consequences. There is also a need to educate consumers about what these vehicles can and cannot do safely.
Henry Ford may not have actually said that if he had asked people what they wanted, they would have said faster horses. But the automotive pioneer's apocryphal quote has become commonplace in business and innovation circles. More than a century after the Model T first hit the roads, those mythical words still resonate, not only because they shed light on a core truth of innovation — that groundbreaking developments come not merely from listening to what customers say they want, but from devising more creative and ingenious ways of meeting their needs — but also because they are a reminder of how much the car itself has been transformed. For the automobile's first few decades, it was easy to think of the vehicle as a faster, more efficient way to get from A to B. But with the gradual introduction of more and more features — safety mechanisms like airbags, navigation and entertainment systems, and Advanced Driver Assistance Systems (ADAS) — the car has become much more than a conveyance. It's now an information and data center on wheels running on hundreds of millions of lines of code.
NVIDIA and Hyundai Motor Group today announced that the automaker's entire lineup of Hyundai, Kia and Genesis models will come standard with NVIDIA DRIVE in-vehicle infotainment (IVI) systems starting in 2022. From entry-level to premium vehicles, these fleets will feature a rich, software-defined AI user experience that is perpetually updatable. Recent breakthroughs in AI and accelerated computing have opened the door for next-generation cars and trucks to gain new functionality, capabilities and enhanced safety features added after the car is purchased. With a centralized, software-defined computing architecture, future vehicles can always have the latest AI cockpit features. For Hyundai Motor Group, standardizing on the high-performance, energy-efficient NVIDIA DRIVE platform for its future models allows for a seamless and continuously enhanced in-vehicle AI user experience.