Video is the world's largest source of data, generated every day by more than 500 million cameras worldwide, a number slated to double by 2020. The potential, if we could actually analyze that footage, is off the charts: data from government property and public transit, commercial buildings, roadways, traffic stops, retail locations, and more. The result would be what NVIDIA calls AI Cities, a thinking robot with billions of eyes trained on residents and programmed to help keep people safe.
For months now, major companies have been hooking up--Uber and Daimler, Lyft and General Motors, Microsoft and Volvo--but Intel CEO Brian Krzanich's announcement on Monday that the giant chipmaker is helping Waymo, Google's self-driving car project, build robocar technology registers as some seriously juicy gossip. Krzanich said Waymo's newest self-driving Chrysler Pacificas, delivered last December, use Intel technology to process what's going on around them and make safe decisions in real time. And last year, Google announced it had created its own specialized chip that could help AVs recognize common driving situations and react efficiently and safely. "Our self-driving cars require the highest-performance compute to make safe driving decisions in real-time," Waymo CEO John Krafcik said in a statement.
Automated inspection company Avitas Systems, a GE Ventures company, is using Nvidia's DGX-1 and DGX Station to train its neural-network-based artificial intelligence to quickly and consistently identify defects in industrial equipment. Alex Tepper, Avitas founder and head of corporate and business development, explained in an interview that GE has long helped customers with industrial inspections, and has found that these customers spend hundreds of millions of dollars on inspections that involve a person driving out to, or flying a helicopter above, an asset. Additionally, automated inspection lets Avitas reliably replicate observation conditions: robots can take the same photograph or sensor reading from the same perspective over and over again. "Avitas started with a prototype version of our station, and soon they'll be getting an upgrade to our DGX Station with Volta [launched in May], and that'll be a huge performance gain," explained Nvidia GM of DGX Systems Jim McHugh.
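The repeatability described above is what makes simple automated comparison viable: when a robot re-takes the same photograph from the same perspective, a new frame can be checked against a reference frame pixel by pixel. A minimal sketch of that idea, with an illustrative function and threshold that are not Avitas' actual pipeline:

```python
import numpy as np

def changed_fraction(baseline, current, threshold=30.0):
    """Fraction of pixels differing from the baseline beyond a threshold.

    Because the inspection robot re-takes the shot from an identical
    perspective, a plain per-pixel comparison is meaningful -- no image
    registration step is needed before flagging candidate defects.
    """
    diff = np.abs(current.astype(float) - baseline.astype(float))
    return float((diff > threshold).mean())

# Two identical frames: nothing flagged.
frame = np.full((64, 64), 128.0)
print(changed_fraction(frame, frame))    # 0.0

# A bright 8x8 patch (a hypothetical surface defect): 64 of 4096 pixels flagged.
damaged = frame.copy()
damaged[10:18, 10:18] += 100.0
print(changed_fraction(frame, damaged))  # 0.015625
```

A real system would feed flagged regions to the trained neural network rather than relying on raw differencing, but the fixed viewpoint is what keeps the comparison honest across visits.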
Technologies like artificial intelligence, machine learning, big data, the Internet of Things (IoT), and deep learning will come together to help realize Industry 4.0. Similarly, PTC plans this year to link its Creo computer-aided design system to the company's ThingWorx IoT development platform. Autodesk's Fusion Connect Internet of Things software can help connect factory applications across a range of industrial machines and make sense of the information those machines return. Introduced last summer, Autodesk's Design Graph is another machine learning system that helps users manage 3D content, offering Google-search-like functionality for 3D models, says Mike Haley, who leads the machine intelligence group at Autodesk.
In other words, GPUs deliver better prediction accuracy, faster results, a smaller footprint, lower power consumption, and lower cost. What is fascinating about Nvidia is that it has a full-stack solution architecture for deep learning applications, making it easier and faster for data scientists and engineers to deploy their programs. As part of a complete software stack for autonomous driving, NVIDIA created a neural-network-based system, known as PilotNet, which outputs steering angles given images of the road ahead. In addition to learning obvious features such as lane markings, road edges, and other cars, PilotNet learns subtler features that would be hard for engineers to anticipate and program, for example, bushes lining the edge of the road and atypical vehicle classes (source: Cornell University CS department).
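As a rough illustration of what "images in, steering angle out" means, here is a toy end-to-end network in the spirit of PilotNet, with random weights standing in for a trained model. The layer sizes are illustrative only; NVIDIA's actual PilotNet stacks five convolutional and three fully connected layers, and its weights are learned by minimizing the error against a human driver's recorded steering.

```python
import numpy as np

def conv2d(x, kernel, stride=2):
    """Naive single-channel 'valid' convolution with stride."""
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.empty(((h - kh) // stride + 1, (w - kw) // stride + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = x[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)
    return out

def steering_angle(image, p):
    """Map one grayscale road image (66x200, as in PilotNet) to a steering angle."""
    x = image / 255.0 - 0.5                     # normalize pixels
    x = np.maximum(conv2d(x, p["k1"]), 0.0)     # conv + ReLU
    x = np.maximum(conv2d(x, p["k2"]), 0.0)     # conv + ReLU
    x = x.ravel()
    x = np.maximum(p["W1"] @ x + p["b1"], 0.0)  # fully connected + ReLU
    return float(p["W2"] @ x + p["b2"])         # single scalar output

rng = np.random.default_rng(0)
p = {
    "k1": rng.normal(size=(5, 5)) * 0.1,
    "k2": rng.normal(size=(5, 5)) * 0.1,
    "W1": rng.normal(size=(10, 658)) * 0.01,    # 658 = 14 * 47 after two convs
    "b1": np.zeros(10),
    "W2": rng.normal(size=10) * 0.1,
    "b2": 0.0,
}
image = rng.integers(0, 256, size=(66, 200)).astype(float)
angle = steering_angle(image, p)                # one steering command per frame
```

The point is the shape of the computation: raw pixels go in one end, a single steering command comes out the other, with no hand-coded lane-detection logic in between. That is why PilotNet can pick up cues, like roadside bushes, that no engineer wrote a rule for.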
The 2018 Audi A8, just unveiled in Barcelona, counts as the world's first production car to offer Level 3 autonomy. The feature works only at speeds up to 60 kilometers per hour (37 mph), which is why Audi calls it AI Traffic Jam Pilot. When the car ahead stops, the A8's AI hits the brakes in time to avoid rear-ending it. Audi said in a statement that it will follow "a step-by-step approach" to introducing the traffic jam pilot.
While most buildings get only occasional walk-through energy audits, Verdigris' digital system uploads electricity consumption data to the cloud around the clock. It can even integrate that data with building management systems to automate electricity usage controls. Chung estimates that training models on GPUs is roughly 20 times as fast as on CPUs. Eventually, Chung said, he'd like Verdigris to expand beyond smart-building optimization into enabling smart cities.
Chief Executive Jen-Hsun Huang forecast that carmakers may speed up their plans in light of technological advances, and that fully self-driving cars could be on the road by 2025. "Of course, we still have to prove that an autonomous car does better in driving and has less accidents than a human being," Bosch CEO Volkmar Denner told a news conference. Level 3 means drivers can turn away in well-understood environments such as motorway driving but must be ready to take back control, while Level 4 means the automated system can control the vehicle in most environments. Nvidia's Huang said he expected to have chips for Level 3 automated driving available by the end of this year and in customers' cars on the road by the end of 2018, with Level 4 chips following the same pattern a year later.
Nvidia will work with automotive supplier Bosch to mass-produce self-driving car systems using the upcoming Xavier chip. Bosch already supplies sensors and automotive parts for Tesla's cars with self-driving technology. Nvidia has said that its Drive PX computers will ultimately be able to recognize everything in sight as a car moves, helping autonomous vehicles make driving decisions. Landing a big car-parts supplier is a good way for Nvidia to get its technology into autonomous vehicles.
Automotive supplier Delphi has been promoting its Multi-Domain Controller (MDC) concept for several years. In November 2016, Delphi announced a partnership with Intel and Mobileye to commercialize the MDC using chips and software from the two companies: Mobileye's next-generation EyeQ4 and EyeQ5 chips would be combined with one of Intel's new automotive processors in Delphi's automated-driving MDC. Nvidia, by contrast, pairs one of its high-end graphics processing units, which handles the machine learning and sensor fusion, with a more general-purpose chip for path planning and control.