According to our research, the total installed base of devices with Artificial Intelligence will grow from 2.694 billion in 2019 to 4.471 billion in 2024. Billions of petabytes of data flow through these AI devices every day, yet most of them currently work independently of one another. As the volume of data flowing through these devices increases in the coming years, technology companies and implementers will need a way for all of them to learn, think, and work together if they are to realize the full potential of AI. The key to making that a reality is multimodal learning, which is fast becoming one of the most exciting – and potentially transformative – fields of AI.
The total installed base of devices with Artificial Intelligence (AI) will grow from 2.7 billion in 2019 to 4.5 billion in 2024, forecasts global tech market advisory firm ABI Research. Billions of petabytes of data flow through these AI devices every day; the challenge now facing both technology companies and implementers is getting all these devices to learn, think, and work together. According to a recent whitepaper from ABI Research, Artificial Intelligence Meets Business Intelligence, multimodal learning is the key to making this happen, and it's fast becoming one of the most exciting -- and potentially transformative -- fields of artificial intelligence. "Multimodal learning consolidates disconnected, heterogeneous data from various sensors and data inputs into a single model," says ABI Research chief research officer Stuart Carlaw. "Learning-based methods that combine signals from different modalities can generate more robust inference, or even new insights, which would be impossible in a unimodal system."
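One simple way to combine signals from different modalities, as described above, is late fusion: each modality's model produces per-class confidence scores, and the scores are merged into a single prediction. The sketch below is purely illustrative; the class scores, the equal weights, and the audio/vision framing are hypothetical placeholders, not details from the ABI Research whitepaper.

```python
# Illustrative late-fusion sketch: combine per-class confidence scores
# from two independent "modalities" (e.g. an audio model and a vision
# model) into a single, more robust prediction.

def fuse_scores(modality_scores, weights):
    """Weighted average of per-class score vectors, one per modality."""
    num_classes = len(modality_scores[0])
    fused = [0.0] * num_classes
    for scores, weight in zip(modality_scores, weights):
        for i, score in enumerate(scores):
            fused[i] += weight * score
    return fused

# Hypothetical example: audio alone is ambiguous between classes 0 and 1,
# while vision is confident in class 1; fusing the two resolves it.
audio = [0.48, 0.47, 0.05]    # hypothetical audio-model confidences
vision = [0.10, 0.80, 0.10]   # hypothetical vision-model confidences

fused = fuse_scores([audio, vision], weights=[0.5, 0.5])
best_class = max(range(len(fused)), key=lambda i: fused[i])
print(fused, best_class)
```

Averaging scores rather than picking one modality's answer is what makes the combined inference more robust: a modality that is uncertain contributes weakly, so a confident signal from another sensor can tip the decision.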
Data sets are fundamental building blocks of AI systems, and this paradigm isn't likely to ever change. Without a corpus to draw on, much as human beings draw on experience daily, models can't learn the relationships that inform their predictions. But why stop at a single corpus? An intriguing report by ABI Research anticipates that while the total installed base of AI devices will grow from 2.69 billion in 2019 to 4.47 billion in 2024, comparatively few will be interoperable in the short term. Rather than combining the gigabytes to petabytes of data flowing through them into a single AI model or framework, they'll work independently and heterogeneously to make sense of the data they're fed.
Cognitive assistance may be valuable in applications that reduce costs and improve quality in healthcare systems. Use cases and scenarios include persuasion, i.e., the design, development, and evaluation of interactive technologies aimed at changing users' attitudes or behaviours through persuasion rather than coercion or deception. We motivate the use of persuasive technologies in healthcare systems and propose solutions from an artificial intelligence (AI) perspective for conceptual design and system implementation. The goal is to develop an IoT (Internet-of-Things) toolbox towards AI-based persuasive technologies for healthcare systems.
The biometrics-as-a-service (BaaS) model has been a key focus area among biometric service providers seeking to cater to the evolving requirements of end users in the BFSI sector. High upfront capital investment continues to challenge the adoption of biometrics in the BFSI sector. To increase their market penetration, biometric service providers are focusing on subscription-based offerings with predictable pricing for the BFSI industry. Faster installation and execution of the BaaS model compared to traditional models, along with maintenance and infrastructure costs being covered by the providers, is further expected to drive demand for biometrics-as-a-service in the BFSI sector. Although BaaS is gaining popularity as a feasible model, a higher risk of cyberthreats is associated with it.