Company acquires assets from Twenty Billion Neurons GmbH to bolster its AI team. Qualcomm Technologies, Inc. (QTI) is running a series of webinars titled "The Future of...", and the most recent edition is on AI. In this lively session, I hosted a conversation with Ziad Asghar, QTI VP of Product Management, Alex Katouzian, QTI SVP and GM of Mobile, Compute and Infrastructure, and Clément Delangue, co-founder and CEO of the open-source AI model company Hugging Face, Inc. I've also penned a short Research Note on the company's AI strategy, which can be found here on Cambrian-AI and which outlines some impressive AI use cases. Qualcomm believes AI is evolving exponentially thanks to billions of smart mobile devices, connected by 5G to the cloud and fueled by a vibrant ecosystem of application developers armed with open-source AI models.
Last year Qualcomm started rolling out its first chips for Android phones that supported upgradeable GPU drivers to optimize performance, and now it's doing something similar for on-device AI and machine learning. Droid-Life points out that during Google I/O, Google and Qualcomm announced updatable Neural Networks API (NNAPI) drivers, a new model that will roll out along with Android 12. While NNAPI drivers have usually been updated only alongside major OS updates, the companies say they can now roll out quickly via Google Play Services. Even better, the updates will be available for older chipsets and multiple versions of Android. In an I/O presentation about advancements in machine learning, Google developers said the NNAPI could boost performance as though the phone had two additional CPU cores, while using less power and generating less heat.
And to that end, the 34-year-old company already appears to have some wind in its sails. Revenues were down 17% year over year during the company's fiscal year fourth quarter, but it beat Wall Street's expectations and sent company stock up 7%. The company is hinging its future performance on 5G, and highlighted areas of momentum that it expects to fuel growth. CEO Steve Mollenkopf told analysts that the company is actively working with standards bodies to define forthcoming advancements in 5G and positioning itself to support the expansion of 5G into enterprise, industrial IoT, and automotive markets. "The complexity and expansion of cellular technologies beyond the smartphone into nearly every industry play directly to Qualcomm's strengths and are why we believe 5G will represent the single biggest opportunity in Qualcomm's history," he said during an earnings call, according to a Seeking Alpha transcript.
A few months ago, I published a blog that highlighted Qualcomm's plans to enter the data center market with the Cloud AI 100 chip sometime next year. While preparing the blog, our founder and principal analyst, Patrick Moorhead, called to point out that Qualcomm, not NVIDIA, probably has the largest market share in AI chip volume thanks to its leadership in devices for smartphones. Turns out, we were both right; it just depends on what you are counting. In the mobile and embedded space, Qualcomm powers hundreds of consumer and embedded devices running AI; it has shipped well over one billion Snapdragons and counting, all of which support some level of AI today. In the data center, however, NVIDIA likely holds well over 90% of the market for training.
Cristiano Amon, a longtime veteran of chip giant Qualcomm and the company's president, is convinced you will see amazing things from 5G wireless technology, a cellular network upgrade being rolled out by AT&T and others that remains something of a mystery to the average consumer. "Let's go for a trip down memory lane," he offered in a chat with ZDNet inside the mammoth Qualcomm booth at the Consumer Electronics Show in Las Vegas this week. "Remember when 4G started, everyone was saying, 'Why do I need a hundred-megabit-per-second device?'" he reflects. "Carriers told people it would be for connecting a laptop computer. Now we look at our smartphones and we say, 'How could we live without it?'"
Running artificial intelligence on mobile devices is a hot area of competition between vendors such as Apple and Samsung, as amply shown by Apple's continued emphasis on the "neural engine" circuitry within its "A-series" processors in the iPhone. But as a technology, mobile neural network reasoning is still a field evolving by fits and starts. Recent research highlights just how uneven the efforts to run neural nets on Google's Android operating system are. Benchmark results from researchers at the Swiss university ETH Zurich reveal that developing neural networks for mobile devices is still a hairy business, with incomplete frameworks, chipsets with mixed support for neural networks, and results that are difficult to benchmark reliably. In a paper posted on arXiv this week, titled "PIRM Challenge on Perceptual Image Enhancement on Smartphones," Andrey Ignatov and Radu Timofte, both of the computer vision laboratory at ETH Zurich, describe how they ranked teams of developers who competed with different types of neural networks running on Android phones.
In recent years, the computational power of mobile devices such as smartphones and tablets has grown dramatically, reaching the level of desktop computers available not long ago. While standard smartphone apps are no longer a problem for them, one group of tasks can still easily challenge even high-end devices: running artificial-intelligence algorithms. In this paper, we present a study of the current state of deep learning in the Android ecosystem and describe available frameworks, programming models, and the limitations of running AI on smartphones. We give an overview of the hardware-acceleration resources available on four main mobile chipset platforms: Qualcomm, HiSilicon, MediaTek, and Samsung. Additionally, we present real-world performance results for different mobile SoCs, collected with AI Benchmark, that cover all main existing hardware configurations.
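The core of any such benchmark is repeated timed inference: discard a few warm-up runs, then report the mean and spread over the timed runs. The sketch below illustrates that measurement pattern in plain Python; the workload is a stand-in (a small pure-Python matrix multiplication), not one of AI Benchmark's actual neural-network tests.

```python
import time
import statistics

def benchmark(workload, warmup=3, runs=10):
    """Time a workload the way mobile AI benchmarks typically do:
    discard warm-up runs, then report mean and spread (in ms)."""
    for _ in range(warmup):
        workload()
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        timings.append((time.perf_counter() - start) * 1000.0)  # ms
    return statistics.mean(timings), statistics.stdev(timings)

def matmul_workload(n=64):
    """Stand-in workload: an n-by-n matrix multiplication."""
    a = [[float(i + j) for j in range(n)] for i in range(n)]
    b = [[float(i - j) for j in range(n)] for i in range(n)]
    def run():
        return [[sum(a[i][k] * b[k][j] for k in range(n))
                 for j in range(n)] for i in range(n)]
    return run

mean_ms, stdev_ms = benchmark(matmul_workload())
print(f"mean {mean_ms:.2f} ms +/- {stdev_ms:.2f} ms")
```

The warm-up step matters on real devices, where the first few inferences pay one-off costs (driver compilation, frequency ramp-up) that would otherwise skew the average.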
In recent months there has been much talk of the arrival of artificial intelligence on cell phones. In fact, it is the feature that most companies highlight when describing their products. LG, for example, announced that it would launch a new version of its LG V30 that incorporates artificial intelligence into its camera, along with other optimizations that come hand in hand with Google Assistant. Last year Huawei announced, with great fanfare, the arrival of its Kirin 970 chip, which integrates a neural processing unit and powers the Mate 10, while Apple boasted of the neural engine in the A11 Bionic, present in the iPhone 8, 8 Plus, and X. Samsung, meanwhile, announced its Exynos 9810 chip, which, like Qualcomm's Snapdragon 845, also supports neural networks. The Galaxy S9 and S9+ will include one of these last two processors, depending on the country where they are sold.
After surpassing $1 billion in IoT revenue in FY2017, Qualcomm is announcing new product families purpose-built for IoT applications. The company began by announcing a new family of IoT chipsets, the QCS603 and QCS605, along with software and reference designs, all dubbed the Qualcomm Vision Intelligence Platform. The platform brings the image and artificial intelligence (AI) processing capabilities found on its Snapdragon chipsets for premium smartphones to a wide range of consumer and industrial applications. With the transition from connected devices to intelligent devices, there is a push to move AI processing from the cloud onto the devices we use, commonly called "edge devices" in reference to the edge of the network. Bringing AI to the edge reduces cloud and connectivity bandwidth requirements while improving security and system performance.
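The bandwidth argument is easy to make concrete with back-of-the-envelope numbers. The figures below are illustrative assumptions (not Qualcomm's): a camera that streams compressed video to the cloud for analysis needs a few megabits per second of uplink, while one that runs the detector on-device and sends only the results needs a few kilobits.

```python
# Illustrative, assumed numbers -- not from Qualcomm.
video_stream_kbps = 4000   # ~4 Mbps for a compressed 1080p camera feed
results_per_second = 10    # on-device detector emits 10 result messages/s
bytes_per_result = 200     # small JSON/protobuf payload per detection

# Uplink needed when inference runs on-device (kilobits per second).
edge_kbps = results_per_second * bytes_per_result * 8 / 1000   # = 16 kbps
reduction = video_stream_kbps / edge_kbps                      # = 250x

print(f"cloud inference uplink: {video_stream_kbps} kbps")
print(f"edge inference uplink:  {edge_kbps:.0f} kbps "
      f"(~{reduction:.0f}x less bandwidth)")
```

Even with generous assumptions about result payload size, moving inference to the edge cuts the uplink requirement by two orders of magnitude, which is the substance of the "reduces cloud and connectivity bandwidth" claim.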
Not surprisingly, this year's smartphones feature faster processors than last year's--that happens every year. But what is new this year is the predominance of machine learning features that just about every processor vendor is touting as a way of differentiating its devices. This is true for the phone vendors who design their own chips, the independent or merchant chip vendors who sell processors to phone vendors, and even the IP makers who design the cores that go into the processors themselves. First, a little background: all modern application processors include designs (often referred to as intellectual property, or IP) from other companies, notably firms like ARM, Imagination Technologies, MIPS, and Ceva. Such IP can appear in various forms--for example, ARM sells everything from a basic license for its 32-bit and 64-bit architecture, to specific cores for CPUs, graphics, image processing, etc., that chip designers can then use to create processors.