In his recent webinar, Accelerating Distributed AI Applications, Ziad Asghar, our Vice President, Product Management, Qualcomm Technologies, Inc., gave an insightful and pragmatic overview of where distributed AI is today and the Snapdragon® mobile platforms behind it.
Let’s take a quick look at some of the key highlights and insights from the webinar, along with resources that developers can use to build distributed AI solutions today.

5G and AI
The promises of 5G are now being realized through deployments around the world, and Ziad considers mmWave to be a game changer due to its ultra-low latency.
At the same time, the use of AI is taking off both in the cloud and at the edge. As 5G and AI converge, Ziad points out their symbiotic relationship: 5G allows AI inference to be distributed to different parts of the network, while AI helps make communication technology (e.g., modems) more efficient through intelligent signal handling in complex conditions.
Developers on Snapdragon mobile platforms now have access to a range of 5G modems and antenna modules for both mobile devices and fixed wireless access. For example, our latest generation Snapdragon mobile platform, the Snapdragon 888 5G Mobile Platform, is integrated with our Snapdragon X60 5G Modem-RF System. For a complete list of our 5G modems, see our recent blog: Riding the Wave of 5G, a Millimeter at a Time.
Power Efficiency
Powerful AI also has to be power efficient. In the webinar, Ziad discusses key features of the Snapdragon 888 5G Mobile Platform architecture, which can contribute to power consumption up to three times lower than previous-generation Snapdragon mobile platforms. He specifically highlighted its fusion of scalar, vector, and tensor processing capabilities, noting how they align with the different parts of today’s neural networks. He also mentioned the platform’s larger shared memory, which can reduce data transfers (e.g., for weights and biases) and enable faster context switching.
Developers can start with the Snapdragon 888 mobile hardware development kit (HDK). And as Ziad points out, developers can take advantage of a rich AI software stack, including the Qualcomm® Neural Processing SDK for artificial intelligence (AI), which provides a high-level pipeline for machine learning (ML) models, and the Qualcomm® Hexagon™ DSP SDK for low-level, bare-metal optimizations. In addition, developers can use our open-source AI Model Efficiency Toolkit (AIMET), which provides advanced model quantization and compression techniques for trained neural network models.
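To give a feel for that workflow, here is a minimal sketch of post-training quantization with AIMET’s PyTorch front end. The module path and call names (QuantizationSimModel, compute_encodings, export) are based on public AIMET releases and may differ in your installed version, so treat the exact signatures as assumptions rather than a definitive recipe.

```python
# Minimal sketch: simulating 8-bit quantization of a trained PyTorch model
# with AIMET. API names (QuantizationSimModel, compute_encodings, export)
# are assumed from public AIMET releases and may vary between versions.
import torch
from aimet_torch.quantsim import QuantizationSimModel

# A small stand-in for a trained model; use your own trained network here.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(16, 10),
).eval()
dummy_input = torch.randn(1, 3, 224, 224)

# Wrap the model so weights and activations are simulated at 8-bit precision.
sim = QuantizationSimModel(model,
                           dummy_input=dummy_input,
                           default_param_bw=8,
                           default_output_bw=8)

# Run representative data through the model so AIMET can compute per-layer
# quantization encodings (scale and offset).
def calibrate(quant_model, _):
    with torch.no_grad():
        quant_model(dummy_input)  # replace with a real calibration data loader

sim.compute_encodings(forward_pass_callback=calibrate,
                      forward_pass_callback_args=None)

# Export the model and its encodings for downstream deployment, e.g., via the
# Qualcomm Neural Processing SDK.
sim.export(path='./output', filename_prefix='model_int8',
           dummy_input=dummy_input)
```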
Cloud AI 100
Ziad points out that some cloud AI customers are now making 200 to 400 trillion inferences per day, causing the power consumption of some data centers to double each year. Contributing to this problem is the limited scalability of GPUs and CPUs for ML processing. This is why we have brought our Snapdragon mobile technology to the enterprise via the Qualcomm® Cloud AI 100. Using this platform, enterprises can scale up inference in the cloud while gaining the power efficiency benefits offered by our Snapdragon platforms.
Automotive
The automotive vertical is experiencing huge advances as electric cars become mainstream and autonomous functionality matures. Ziad also notes our expansion from infotainment into ADAS (Autonomy and Advanced Driver Assistance). This begins with driver monitoring, but he also acknowledges the need for safety through multi-layer redundancy and sustained system performance.
Our solution for autonomous functionality is our Qualcomm® Snapdragon Ride™ Platform. Licensed developers can develop for it using the Qualcomm® Autonomy and Advanced Driver Assistance (ADAS) SDK, a C++ API containing numerous classes and methods for performing image processing functions specific to automotive.
IoT
IoT devices, in conjunction with the myriad sensors available today, allow data collected at the edge to be processed locally or in the cloud. Ziad mentions Big IoT, where multiple devices collaborate and are coordinated to provide different levels of intelligence. Similarly, federated learning, another active area of research, gives developers the option to perform training at the edge.
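The idea behind federated learning is easy to sketch: each edge device trains on its own local data and shares only model updates, which a coordinator averages into a global model. The NumPy snippet below is a generic illustration of federated averaging and is not tied to any Qualcomm SDK.

```python
# Generic sketch of federated averaging (FedAvg): edge devices train locally
# and share only weight updates, which are averaged into a global model.
# Purely illustrative; not tied to any Qualcomm SDK.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, local_x, local_y, lr=0.1, epochs=5):
    """One device's training: a few steps of linear-regression SGD on local data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * local_x.T @ (local_x @ w - local_y) / len(local_y)
        w -= lr * grad
    return w

# Simulate three edge devices, each holding its own private data.
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    x = rng.normal(size=(50, 2))
    y = x @ true_w + 0.1 * rng.normal(size=50)
    devices.append((x, y))

global_w = np.zeros(2)
for round_num in range(10):
    # Each device trains locally; only the updated weights leave the device.
    local_weights = [local_update(global_w, x, y) for x, y in devices]
    # The coordinator averages the updates into a new global model.
    global_w = np.mean(local_weights, axis=0)

print("Learned global weights:", global_w)  # approaches [2.0, -1.0]
```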
Another key highlight for IoT was his vision for always-on AI. You’re probably already familiar with this concept through wake words used on today’s popular smart-speaker devices. But always-on AI is now expanding to fuse additional streams such as cameras and other sensors to provide new types of functionality (e.g., disabling certain functions for the driver of a car).
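A common way to structure always-on AI is to keep a very cheap detector running continuously and let it gate a heavier, fused model that wakes up only when triggered. The sketch below illustrates that pattern with a simple audio energy check; it is purely illustrative and not code for any specific Snapdragon feature.

```python
# Generic sketch of the always-on pattern: a cheap, always-running detector
# gates a heavier model so the expensive path only wakes up when needed.
# Illustrative only; not tied to any specific Snapdragon feature or API.
import numpy as np

ENERGY_THRESHOLD = 0.02  # assumed tuning value for the cheap detector

def cheap_always_on_check(audio_frame: np.ndarray) -> bool:
    """Low-power stand-in for a wake-word/voice-activity detector."""
    return float(np.mean(audio_frame ** 2)) > ENERGY_THRESHOLD

def heavy_model(audio_frame: np.ndarray, camera_frame: np.ndarray) -> str:
    """Placeholder for the expensive fused inference (speech + vision)."""
    return "driver_distracted" if camera_frame.mean() > 0.5 else "ok"

def process_streams(audio_frames, camera_frames):
    for audio, camera in zip(audio_frames, camera_frames):
        if cheap_always_on_check(audio):        # runs on every frame, cheaply
            result = heavy_model(audio, camera)  # runs only when triggered
            if result == "driver_distracted":
                print("Disabling distracting functions for the driver")
        # otherwise: stay in the low-power path and do nothing

# Tiny synthetic example: quiet frames, then a loud frame that wakes the model.
rng = np.random.default_rng(1)
audio = [0.001 * rng.normal(size=160) for _ in range(4)] + [0.5 * rng.normal(size=160)]
video = [rng.random((8, 8)) for _ in range(5)]
process_streams(audio, video)
```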
Developers can build IoT solutions on several of our SoCs, such as our APQ80xx series application processors and our Qualcomm® QCC51xx series SoCs like the Qualcomm® QCC5100.
Check it Out!
You can access Ziad’s webinar, which is about 40 minutes long, along with a podcast of the presentation at this link. And for additional updates from Ziad, be sure to follow his Twitter feed.
Snapdragon, Qualcomm Neural Processing, Qualcomm Hexagon, Qualcomm Autonomy and Advanced Driver Assistance, Qualcomm QCC51xx, and Snapdragon Ride are products of Qualcomm Technologies, Inc. and/or its subsidiaries. AIMET is a product of Qualcomm Innovation Center, Inc.