Key Points:
– AI already impacts many aspects of our daily lives at work and at home.
– Over the next decade, AI enterprise software revenue will grow from $644 million to nearly $39 billion.
– Here are the top 10 ways we predict AI will impact business over the next decade, including vehicular object detection, predictive maintenance and intelligent recruitment.

Artificial intelligence already impacts many aspects of our daily lives at work, at home and as we move about. Over the next decade, analyst firm Tractica predicts that annual global AI enterprise software revenue will grow from $644 million in 2016 to nearly $39 billion by 2025, with services-related revenue reaching almost $150 billion. These functional areas apply to many use cases and industries, and generate benefits for both businesses and individuals.
Caruma's artificial intelligence-based connected-vehicle platform will incorporate RoadBotics' algorithms, data-collection engine and processing to deliver advanced information on road surfaces, features, object detection and analysis. RoadBotics' proprietary technology was developed using standard single-lens data-collection tools and will now be integrated into Caruma's powerful open vehicle network to rapidly collect useful information, distributed through an interface that municipalities and applications can subscribe to based on the desired data profile and territory. The Caruma platform connects in-car cameras and sensor hardware to a continuous-learning cloud and an artificial intelligence-powered open vehicle network, which 'learns' the behavioral patterns of drivers and passengers, situational factors external to the vehicle, and road infrastructure changes. About Caruma Technologies, Inc.: Caruma is The Intelligent Connected-Vehicle Platform that utilizes the underlying technologies found in autonomous driving vehicles to improve safety, security and convenience.
Using computer vision technology, facial analysis of the driver and/or passenger can determine who out of a pool of drivers is sitting behind the wheel. Analyzing driver demographics in real time will also ensure that the radio or music streaming service (such as Spotify, Pandora or Apple Music) displays the most relevant ads. Using computer vision for iris (gaze) tracking, eye-openness detection, eyelid blink-rate tracking and head-pose detection, in-car sensing technology can determine the driver's drowsiness and inattentiveness in real time. For now, it's up to new technologies like in-car sensing, computer vision and AI to combat distracted driving and ensure our safety until fully autonomous vehicles hit the market.
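Eye-openness and blink-rate monitoring of this kind is commonly built on the eye-aspect-ratio (EAR) heuristic. The sketch below is illustrative only: the landmark coordinates, threshold and frame-count are assumed values, not taken from any specific in-car product.

```python
# Hypothetical sketch of EAR-based drowsiness detection. In a real system
# the six eye landmarks per frame would come from a face-landmark model.
from math import dist

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered p1..p6.
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it drops toward 0
    as the eyelid closes."""
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

EAR_THRESHOLD = 0.2        # below this the eye is treated as closed (assumed)
CLOSED_FRAMES_ALERT = 48   # ~1.6 s of closure at 30 fps before alerting (assumed)

def drowsy(ear_per_frame):
    """True if the eye stays closed for CLOSED_FRAMES_ALERT consecutive frames."""
    closed = 0
    for ear in ear_per_frame:
        closed = closed + 1 if ear < EAR_THRESHOLD else 0
        if closed >= CLOSED_FRAMES_ALERT:
            return True
    return False

# A wide-open eye: tall relative to its width, so EAR is high.
open_eye = [(0, 1), (2, 3), (4, 3), (6, 1), (4, -1), (2, -1)]
print(round(eye_aspect_ratio(open_eye), 2))
```

The consecutive-frame counter is the key design choice: a single blink dips the EAR briefly, while drowsiness keeps it low for seconds at a time.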
Over the past few years, machine learning and AI have pushed forward the capacity of computers to recognize images, understand context, and make decisions. A report from IHS Technology expects the number of AI systems in vehicles to jump from 7 million in 2015 to 122 million by 2025, bringing new opportunities to enhance the capabilities of connected cars as more data becomes available. In addition, AI will push advanced driver assistance systems (ADAS) into the mainstream. For that, vehicles need AI, which enables the camera-based machine vision systems, radar-based detection units, driver-condition evaluation and sensor fusion electronic control units (ECUs) that make autonomous vehicles work.
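The jump from 7 million to 122 million systems over ten years implies a compound annual growth rate (CAGR) of roughly a third per year; a quick check of the IHS figures:

```python
# Implied compound annual growth rate from the IHS forecast above.
def cagr(start, end, years):
    """Annualized growth rate that turns `start` into `end` over `years`."""
    return (end / start) ** (1 / years) - 1

# AI systems in vehicles: 7 million (2015) -> 122 million (2025).
print(f"{cagr(7e6, 122e6, 10):.1%}")
```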
"Although this Statement focuses on the enormous safety potential of these new technologies, they offer an even wider range of possible benefits. Vehicle control systems that automatically accelerate and brake with the flow of traffic can conserve fuel more efficiently than the average driver. By eliminating a large number of vehicle crashes, highly effective crash avoidance technologies can reduce fuel consumption by also eliminating the traffic congestion that crashes cause every day on our roads. Reductions in fuel consumption, of course, yield corresponding reductions in greenhouse gas emissions. To the extent vehicles can communicate with each other and with the highway infrastructure, the potential for safer and more efficient driving will be increased even more.
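The "accelerate and brake with the flow of traffic" behaviour the Statement describes can be sketched as a toy proportional controller. This is not from the Statement: the gains, comfort limits and the 2-second headway rule are all assumptions for illustration.

```python
# Toy adaptive-cruise controller: hold a 2-second gap behind a lead vehicle.
# All gains and limits are illustrative assumptions, not production values.

def acc_step(ego_speed, lead_speed, gap, dt=0.1,
             headway=2.0, k_gap=0.2, k_speed=0.5, max_accel=2.0):
    """One control step; returns the ego vehicle's new speed in m/s."""
    desired_gap = headway * ego_speed                 # the 2-second rule
    accel = (k_gap * (gap - desired_gap)              # close or open the gap
             + k_speed * (lead_speed - ego_speed))    # match the lead's speed
    accel = max(-max_accel, min(max_accel, accel))    # comfort/actuator limits
    return max(0.0, ego_speed + accel * dt)

# Follow a lead car doing a steady 25 m/s, starting 40 m back at 30 m/s.
ego, gap = 30.0, 40.0
for _ in range(600):                                  # simulate 60 seconds
    new_ego = acc_step(ego, 25.0, gap)
    gap += (25.0 - new_ego) * 0.1                     # lead pulls ahead or nearer
    ego = new_ego
print(round(ego, 1), round(gap, 1))                   # settles at 25.0 m/s, 50.0 m
```

Smooth convergence to the lead's speed, rather than the stop-and-go of a human driver, is exactly the fuel-saving effect the Statement points to.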
Enter Toyota's newest companion: the palm-sized Kirobo Mini robot. Toyota calls Kirobo Mini a "communication partner" and it can read facial expressions and recall past vehicle trips. Kirobo Mini connects to a mobile phone via Bluetooth and uses a camera and voice and facial-expression recognition software in a setup that someday may work its way into vehicles. Toyota introduced a first-generation talking Kirobo robot in 2013.
Like much of the technology needed to support and enable autonomous vehicles, intelligent vision systems already exist and are used in other industries, for example in industrial robots. Deploying them in vehicles will require processing power that is only now becoming available, through advances in System-on-Chip platforms, advanced software, deep learning algorithms and open source projects. It is enabled by the development of Heterogeneous System Architectures (HSA): platforms that combine powerful general-purpose microprocessor units (MPUs) with very powerful and highly parallel Graphics Processing Units (GPUs). The software infrastructure needed to develop intelligent vision systems, such as OpenCV (Open Source Computer Vision) and OpenCL (Open Computing Language), requires high-performance processing platforms to execute its advanced algorithms.
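To see why these workloads want highly parallel hardware, consider a Sobel edge filter, one of the basic kernels OpenCV provides: every output pixel is computed independently from a small neighborhood, which is exactly the pattern OpenCL offloads to GPU cores. A pure-Python sketch for illustration (the Sobel kernel weights are standard; the tiny test image is made up):

```python
# Sobel edge detection: an embarrassingly parallel per-pixel workload.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel

def sobel_magnitude(img):
    """Approximate gradient magnitude |Gx| + |Gy| for interior pixels."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):   # each pixel is independent: ideal GPU work
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = abs(gx) + abs(gy)
    return out

# A sharp vertical edge: dark left half, bright right half.
img = [[0, 0, 255, 255]] * 4
edges = sobel_magnitude(img)
print(edges[1])
```

On a GPU the two nested loops disappear: each of the millions of pixels in a camera frame becomes one work-item, which is why the HSA platforms described above matter for automotive vision.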
In the autonomous car, AI will advance machine vision systems, while it will also migrate into sensor fusion electronic control units (ECUs). In a phone interview with EE Times, Luca De Ambroggi, principal analyst for automotive semiconductors at IHS, told us, "AI is viewed as a key enabler for real autonomous vehicles." EE Times asked the IHS analyst to break down automotive AI, including its advancements, applications inside vehicles, and the hardware available to process AI algorithms. EE Times: What hardware is best suited to implement AI applications [for autonomous cars] today?
The market research firm expects the attach rate of AI-based systems in new vehicles to increase from 8 percent in 2015 (the vast majority of today's AI systems in cars are focused on speech recognition) to 109 percent in 2025; the rate exceeds 100 percent because IHS expects multiple AI systems of various types to be installed in many cars. In the human-machine interface in vehicles, IHS believes AI will play a role in speech and gesture recognition, eye tracking, driver monitoring and natural language interfaces.
Michael Karpf flew earlier this month to the Tesla Motors factory in Fremont, Calif., to pick up his new Model X electric sport-utility vehicle--known for a 200-plus mile battery range and Tesla CEO Elon Musk's claim that it's "the fastest SUV in history." The 75-year-old retiree planned to drive it across the country with his wife and son to their home in New Rochelle, N.Y. But the new-car gleam of Karpf's $138,000 titanium-on-beige P90D Model X faded with a string of problems as soon as he left the factory--delaying his journey. One of the wildly designed, upswinging "falcon wing" rear doors failed to close. The other falcon wing door failed to open, except from the inside.