by Joe Abajian and Elliot Mintz | 09/25/2025
Qualcomm sponsored VDC's travel to Snapdragon Summit 2025.
This week, Snapdragon Summit opened with Qualcomm President and CEO Cristiano Amon celebrating the event’s 10th anniversary by highlighting the company’s vision for a future transformed by AI. Over the last three years, AI has found its way into the lives of billions of people around the world, most commonly in the form of querying large language models (LLMs) running in the cloud. Amon and Qualcomm believe that “edge is essential for the future of AI.” While the development of AI models and certain intensive tasks will remain cloud-based, advanced computing architecture will enable on-device AI that is immediate, personal, and context-aware, truly functioning as a personal AI agent.
During the keynote, Qualcomm emphasized six key trends driving the future of AI:
Artificial intelligence will simultaneously use text, voice, images, and even gestures as inputs to gain a holistic and contextual view of the user’s environment. Agents will replace keystrokes and swipes, bringing connected intelligence beyond physical device interfaces.
Smartphones will no longer be the center of the technology ecosystem. In what Qualcomm calls “the ecosystem of you,” AI agents will interface directly with smartphones, watches, earbuds, smart glasses, and any other connected devices. The phone is not going anywhere, as Qualcomm emphasized, but our exclusive reliance on it for personal and professional functions such as schedule management, calendar organization, image and video capture, and leisure will fade in favor of agentic features powered by a wide range of interconnected, personalized devices. Announced on the second day of the event, the Snapdragon X2 Elite platform is Qualcomm’s foundation for an agent-enabled personal ecosystem. With a dedicated NPU to reduce the burden on other processing resources, Snapdragon chipsets will enable efficient and inexpensive AI functionality on mobile devices and at the edge.
Qualcomm designed its latest processor architecture to support agent-centric technology. Features include powerful AI processing, power-efficient NPUs for on-device AI, a new memory architecture, and agentic modems. This AI-first approach enables context-aware inference via multimodal processing based on text, vision, voice, and action.
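In practice, taking advantage of an NPU like this usually means going through an inference runtime rather than programming the silicon directly. As a rough illustration, and not tooling shown at the keynote, the sketch below loads an ONNX model with ONNX Runtime's QNN execution provider, which can offload inference to Qualcomm NPUs and fall back to the CPU when the NPU backend is unavailable; the model file name and input shape are placeholders.

```python
# Minimal sketch: running an ONNX model on a Qualcomm NPU via ONNX Runtime's
# QNN execution provider, with CPU fallback. The model path and input shape
# are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "assistant_model.onnx",  # placeholder model file
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"backend_path": "QnnHtp.dll"}, {}],  # HTP backend targets the Hexagon NPU
)

input_name = session.get_inputs()[0].name
dummy_input = np.zeros((1, 128), dtype=np.int64)  # assumed token-ID input shape for illustration
outputs = session.run(None, {input_name: dummy_input})

# Lists the execution providers enabled for this session (QNN first if available).
print(session.get_providers())
```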
Model training and fine-tuning, as well as intensive tasks like data cleansing, will remain in the cloud, but many inferencing tasks will move to the edge. Qualcomm officially endorsed a hybrid approach, which aligns well with the current model landscape: users already prefer different models for different tasks based on performance and task type. Large-scale inferencing will still rely on the latest models in the cloud, while lighter-weight tasks that do not require a large model, such as email summarization or image touch-up, can run directly on device.
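As a purely illustrative example of that split, the following Python sketch routes lightweight tasks to an on-device model and everything else to the cloud; the task names and stub functions are hypothetical stand-ins, not an actual Qualcomm or partner API.

```python
# Illustrative sketch of hybrid cloud/edge inference routing.
# All task names and functions are hypothetical stand-ins.

LIGHTWEIGHT_TASKS = {"email_summary", "image_touchup"}

def run_on_device(task: str, payload: str) -> str:
    # Stand-in for a small model executing locally on the NPU.
    return f"[on-device:{task}] processed {len(payload)} chars"

def run_in_cloud(task: str, payload: str) -> str:
    # Stand-in for a request to a large cloud-hosted model.
    return f"[cloud:{task}] processed {len(payload)} chars"

def route(task: str, payload: str) -> str:
    """Send lightweight tasks to the device; heavier inference goes to the cloud."""
    if task in LIGHTWEIGHT_TASKS:
        return run_on_device(task, payload)
    return run_in_cloud(task, payload)

print(route("email_summary", "Thread: Q3 planning follow-ups..."))
print(route("market_analysis", "Compare edge AI silicon roadmaps across vendors..."))
```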
Local, or “on-device,” AI has a few benefits. Local processing is far superior for applications that demand low latency, including many wearables, audio features, and autonomous systems. In short, you don’t want your self-driving car waiting on an inference response from the cloud. Additionally, on-device processing enables greater privacy, which has value for information-sensitive applications in aerospace and defense (A&D) and government work, as well as wearables that handle sensitive biometrics.
Models are currently trained on vast amounts of data from the internet, which is useful for general search queries but useless for personalization. Edge data is not only more secure but also allows AI to keep learning from unique user inputs. Eventually, on-device processing will help AI serve as a personalized “dynamic adaptive network of intelligence” for the user, one that interacts with other local features and data.
Qualcomm expects pre-commercial 6G devices to hit the market as early as 2028, delivering “context aware intelligence at scale.” 6G will drive the connection between cloud and edge devices and will itself be intelligent, with the ability to perceive sensor data.
To conclude the keynote, Amon brought out Rick Osterloh, Senior VP of Devices and Services at Google, to share how the two companies will collaborate to bring AI to all facets of technology. Beyond the most common applications on mobile devices and laptops, AI will empower rich computing experiences in wearables, automotive, XR, robotics, and fully autonomous technology. All of these combined will create Qualcomm’s “ecosystem of you.”