Qualcomm wants to put AI everywhere with a new software stack

Qualcomm is trying to lend a hand to anyone who wants to run artificial intelligence (AI) workloads on its growing lineup of chips, in a bid to boost its ambitions in connected smart edge devices.

The US semiconductor giant has rolled out the Qualcomm AI Stack, which unifies all of its existing software and tools into a single package, giving developers an end-to-end solution for deploying AI on a range of different chips.

The stack aims to let manufacturers and developers take AI models they have trained and optimized for one type of device, such as a smartphone, and easily transfer them to other hardware, such as a laptop or augmented reality (AR) headset, instead of having to start from scratch.

However, at present, the new AI software package only works with the company’s connected smart edge products.

“Hardware is critical,” said Ziad Asghar, vice president of product management at Qualcomm Technologies, during a recent briefing with reporters and analysts, “but increasingly, AI software is absolutely essential.” He added, “We believe this is the leadership stack for the connected smart edge.”

Full-stack ambition

CEO Cristiano Amon said he aims to transform Qualcomm into a “full stack” company capable of serving a large share of the edge computing market. The company is using its expertise in high-end mobile phone chips, where battery life is the highest priority, to win customers in new markets, including PCs and IoT. It is also integrating more powerful AI accelerators, such as its neural processing unit (NPU), into its processors.

The company has quickly become a central player in the automotive market, deploying its Snapdragon Ride family as the brains behind GM’s self-driving feature and other chips to power cars’ digital dashboards. It also wants to tackle other areas, including data centers, base stations and robots for use in factories.

For Qualcomm, the change in strategy intensifies competition with companies such as AMD, Intel and NVIDIA, all of which are betting on special-purpose hardware as AI spreads from the data center to the edge.

But rather than simply selling chips and letting customers figure out what to do with them, those rivals are also investing in software tools to program them, such as Intel’s oneAPI, NVIDIA’s CUDA and AMD’s upcoming unified AI stack.

Qualcomm hopes to stand out with a unified software stack that covers all processors in its portfolio. With it, developers can build an AI-powered feature once and then move it across different products and tiers.

The company said consolidating its AI software assets into a single stack mirrors its hardware strategy, which involves leveraging the building blocks of its flagship Snapdragon family of mobile chips to expand into new markets.

“We have now made the same leap [with software], where basically the same offering is able to cover all the business we have today,” Asghar said.

Stacked for the edge

By reducing the amount of work needed to adapt AI features from one Qualcomm chip type to another, the company promises savings on engineering resources as well as faster time to market.

“While we have expanded into new businesses, we are now giving OEMs the ability to do the same, without having to spend significantly more on [engineering resources],” Asghar said.

Qualcomm said the unified AI stack supports a wide range of popular AI frameworks and runtimes, including PyTorch, TensorFlow and ONNX, as well as various libraries, services, compilers, debuggers, common programming languages, system interfaces, accelerator drivers and other tools. “There is support at all the different levels of the stack,” Asghar noted.
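
As a rough illustration of the kind of framework interoperability described above, the sketch below exports a small PyTorch model to the ONNX interchange format so a downstream runtime can consume it. It is a generic example, not code from Qualcomm’s stack; the model, file name and input shape are placeholders.

```python
# Minimal sketch: export a trained PyTorch model to ONNX so a runtime
# (for example, one targeting Qualcomm silicon) can consume it.
# Generic illustration only; names and shapes are placeholders.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    """Stand-in for a model trained and optimized for one device type."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(16, num_classes),
        )

    def forward(self, x):
        return self.net(x)

model = TinyClassifier().eval()
dummy_input = torch.randn(1, 3, 224, 224)  # example input shape

# Export to ONNX, one of the runtimes listed as supported by the stack.
torch.onnx.export(
    model,
    dummy_input,
    "tiny_classifier.onnx",                 # placeholder output path
    input_names=["input"],
    output_names=["logits"],
    dynamic_axes={"input": {0: "batch"}},   # allow variable batch size
)
```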

Supported operating systems include Windows, Android, Linux distributions such as Ubuntu and CentOS, and several embedded and automotive-grade real-time operating systems such as QNX and Zephyr.

The stack provides direct access to Qualcomm’s AI Engine and the dedicated AI cores on its Cloud AI 100 accelerator. The new offering, called AI Engine Direct, will now scale across the AI accelerators in all of its processor families.

The AI Engine Direct offering is a software library capable of delegating and deploying existing models directly to the AI accelerators at the heart of Qualcomm’s hardware. With it, companies can program software even closer to the silicon to gain additional performance or reduce power consumption.
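
Qualcomm has not published AI Engine Direct code here, so as a loose analogue, the sketch below shows the general pattern of handing an exported ONNX model to a hardware-specific execution provider in ONNX Runtime, with a CPU fallback. The QNN provider name and its availability are assumptions about a particular onnxruntime build; this is not a description of AI Engine Direct itself.

```python
# Loose analogue of "delegating" a model to on-device AI accelerators:
# hand an ONNX model to a hardware-specific execution provider, falling back
# to the CPU when the accelerator path is unavailable. NOT AI Engine Direct;
# assumes an ONNX Runtime build that ships the QNN execution provider.
import numpy as np
import onnxruntime as ort

MODEL_PATH = "tiny_classifier.onnx"  # placeholder path from the earlier sketch

# Prefer the accelerator-backed provider if present, otherwise use the CPU.
available = ort.get_available_providers()
providers = (["QNNExecutionProvider", "CPUExecutionProvider"]
             if "QNNExecutionProvider" in available
             else ["CPUExecutionProvider"])

session = ort.InferenceSession(MODEL_PATH, providers=providers)

# Run a single inference with random data shaped like the exported input.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)
logits = session.run(["logits"], {"input": x})[0]
print("providers in use:", session.get_providers())
print("predicted class:", int(logits.argmax()))
```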

Tools in the toolbox

The unified stack includes a wide range of tools to make it easier for developers to deploy AI across its chip portfolio in a way that, Qualcomm said, outperforms competitors’ software offerings.

One of the standout features of the Qualcomm AI Stack is the AI Model Efficiency Toolkit. Asghar explained that the tool can condense a power-hungry AI model trained in the cloud with 32-bit floating-point operations into an 8-bit integer format so it runs more efficiently on battery-powered devices. “You’re able to deliver huge benefits in terms of energy consumption,” up to a 4x improvement in many cases, he said.
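
The article does not show how the AI Model Efficiency Toolkit performs that conversion, but the arithmetic behind moving from 32-bit floats to 8-bit integers is standard affine quantization. The sketch below is a generic illustration of that idea, including the roughly 4x reduction in weight storage; it is not AIMET code.

```python
# Generic illustration of FP32 -> INT8 affine quantization, the kind of
# conversion the AI Model Efficiency Toolkit is described as performing.
# Not AIMET code; it only shows the underlying arithmetic.
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto int8 using a per-tensor scale and zero point."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0 or 1.0  # avoid divide-by-zero for constant tensors
    zero_point = int(round(-128 - w_min / scale))
    q = np.clip(np.round(weights / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize_int8(q: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    return (q.astype(np.float32) - zero_point) * scale

weights = np.random.randn(256, 256).astype(np.float32)  # stand-in layer weights
q, scale, zp = quantize_int8(weights)
restored = dequantize_int8(q, scale, zp)

print("storage: %d bytes -> %d bytes (4x smaller)" % (weights.nbytes, q.nbytes))
print("max absolute error after round trip: %.5f" % np.abs(weights - restored).max())
```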

Another key element is the Neural Architecture Search tool co-developed with Google. It lets developers optimize AI models against constraints such as higher accuracy, lower power or lower latency. That way, the same AI model can be tuned for power efficiency on smartphones or IoT devices, where long battery life is vital, or for lower latency on industrial robots, where delays pose a safety risk.
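
The joint Qualcomm and Google tool is not detailed further here. Purely as a conceptual sketch, the code below shows the shape of constraint-driven architecture search: sample candidate configurations, discard those that miss a latency budget, and keep the best-scoring survivor. The search space, latency estimate and accuracy proxy are all invented for illustration; real NAS systems train or predict accuracy for each candidate instead.

```python
# Conceptual sketch of constraint-driven neural architecture search.
# Everything here (search space, cost model, accuracy proxy) is invented
# for illustration and is not Qualcomm's or Google's tool.
import random

SEARCH_SPACE = {
    "depth": [2, 4, 6, 8],        # number of blocks
    "width": [32, 64, 128, 256],  # channels per block
    "kernel": [3, 5],             # convolution kernel size
}

def estimate_latency_ms(cfg) -> float:
    # Toy cost model: deeper, wider and larger-kernel networks cost more.
    return 0.05 * cfg["depth"] * cfg["width"] * (cfg["kernel"] ** 2) / 10.0

def accuracy_proxy(cfg) -> float:
    # Toy proxy: more capacity helps, with diminishing returns.
    return 1.0 - 1.0 / (1.0 + 0.002 * cfg["depth"] * cfg["width"])

def search(latency_budget_ms: float, trials: int = 200):
    best_cfg, best_score = None, -1.0
    for _ in range(trials):
        cfg = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
        if estimate_latency_ms(cfg) > latency_budget_ms:
            continue  # violates the device constraint (e.g. a low-latency robot)
        score = accuracy_proxy(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# A tight budget (latency-sensitive robot) versus a looser one (battery-powered phone).
print(search(latency_budget_ms=5.0))
print(search(latency_budget_ms=50.0))
```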

“We believe that with Qualcomm’s AI stack, we enable developers and OEMs to do so much more with the AI capabilities we build into our devices,” Asghar said.

Customers can access these tools through a graphical user interface (GUI) for AI development.

Domain specific SDKs

Much of the software underlying Qualcomm’s AI stack isn’t new. Many of its recent software development kits (SDKs), including those for cars (Snapdragon Ride) and for AR and VR (Snapdragon Spaces), are built on elements of it.

It’s also a core part of the company’s Neural Processing SDK, which it says remains a popular tool for device makers to port AI models to a wider range of Qualcomm-powered hardware.

Unifying all of these building blocks into a single stack is a “leap forward” for its customers, Asghar said. But it also gives Qualcomm far more flexibility: the company said the unified AI stack will serve as the foundation for it to deploy domain-specific SDKs as new markets emerge and customers demand new and better tools.

Abdul J. Gaspar