The race for software-defined vehicles is entering a more pragmatic phase: fewer lofty promises and more platforms capable of bringing advanced driver assistance system (ADAS) functions to various models, trims, and markets without redesigning the vehicle from scratch. In this context, ZF and Qualcomm Technologies have announced a technological collaboration to deliver a scalable ADAS solution based on the ZF ProAI automotive supercomputer and the Snapdragon Ride platform.
The goal, according to both companies, is to combine processing power, perception capabilities, and an open architecture that allows manufacturers to integrate third-party software and tailor the “function package” for each vehicle. Put simply: a single technological core that can be configured as a domain controller, zonal controller, or central controller, capable of scaling from entry-level systems (regulatory and safety functions) to more ambitious automation scenarios.
A common “brain” for different electrical and electronic architectures
The collaboration is based on a reality that many manufacturers have already embraced: vehicle electronics cannot continue growing merely by adding control units. The approach is shifting towards fewer, more powerful control units capable of executing multiple functions simultaneously and receiving updates via software.
This is where ZF ProAI comes in—a family of “automotive-grade” central computers designed for diverse platforms and applications. In its highest configuration, ProAI is envisioned as a multi-domain system with multiple performance modules and a processing power exceeding 1,500 TOPS. This figure—familiar in semiconductor marketing language—is less about the number itself and more about what it enables: consolidating functions, allowing for more complex models, and providing a clear path to scale capabilities without reinventing hardware with each product cycle.
The other pillar is Snapdragon Ride, Qualcomm’s platform for assisted and automated driving, which provides the system-on-chip (SoC) silicon, a perception stack, and a design philosophy centered on hardware-software co-optimization. In the announcement, both companies describe the possibility of grouping camera vision, sensor fusion, and decision logic, or even unifying them into an end-to-end (E2E) AI model, within an architecture ready to incorporate additional modules.
Snapdragon Ride Pilot: from camera perception to the global “safety stack”
Within the collaboration, Snapdragon Ride Pilot stands out: a camera-based perception stack (object detection, lane and traffic sign recognition) with functions such as parking assistance, driver monitoring, and real-time mapping. The system can scale from a basic configuration with a single front camera to multi-camera setups for surround perception.
To improve this “around the vehicle” perception, Qualcomm is targeting a high-fidelity bird’s-eye view (BEV) architecture, advanced processing of fisheye cameras, and radar integration with a focus on reducing latency and increasing safety in complex scenarios. The approach isn’t just about “seeing,” but consistently perceiving in real-world conditions: dense traffic, merging and exit ramps, poorly marked lanes, rain, harsh shadows, or reflections.
A key element in the commercial messaging is the “industrial” maturity of the stack: Ride Pilot is presented as a platform for safety and regulatory functions that has been validated and deployed in more than 60 countries. In an industry where approval, traceability, and version management matter as much as performance, that track record builds credibility for OEMs wary of long, costly integrations without guarantees of global deployment.
ZF: a modular catalog of ADAS functions for “pick and assemble”
ZF, for its part, highlights a modular software approach: the company talks about around 25 safety, comfort, and parking functions, including an advanced option such as “hands-off navigate on autopilot” (NOA). The idea is for manufacturers to select, individually and scalably, which functions to activate for each vehicle line, with the option to purchase them as standalone software “as a product.”
This concept aligns with a major industry shift: assisted driving is becoming a “menu” of capabilities that can be activated by versions, markets, or subscriptions, requiring platforms that can support variations without escalating engineering costs.
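To make the “pick and assemble” idea concrete, here is a minimal sketch of how such a per-trim function catalog could be modeled. The function names, trim levels, and packaging logic are illustrative assumptions, not ZF’s actual catalog or interfaces.

```python
# Hypothetical sketch of a per-trim ADAS function catalog ("pick and assemble").
# Function names and trims are illustrative, not ZF product identifiers.
from dataclasses import dataclass


@dataclass
class AdasFunction:
    name: str
    category: str             # "safety", "comfort" or "parking"
    regulatory: bool = False  # required for type approval regardless of trim


CATALOG = [
    AdasFunction("automatic_emergency_braking", "safety", regulatory=True),
    AdasFunction("lane_keeping_assist", "safety", regulatory=True),
    AdasFunction("adaptive_cruise_control", "comfort"),
    AdasFunction("hands_off_noa", "comfort"),
    AdasFunction("automated_parking", "parking"),
]

TRIM_PACKAGES = {
    "entry":   set(),
    "mid":     {"adaptive_cruise_control", "automated_parking"},
    "premium": {"adaptive_cruise_control", "hands_off_noa", "automated_parking"},
}


def functions_for_trim(trim: str) -> list[str]:
    """Regulatory functions are always active; optional ones follow the trim package."""
    selected = TRIM_PACKAGES.get(trim, set())
    return [f.name for f in CATALOG if f.regulatory or f.name in selected]


print(functions_for_trim("entry"))    # regulatory baseline only
print(functions_for_trim("premium"))  # baseline plus optional comfort/parking functions
```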
Open architecture, OTA updates, and tools to accelerate “time-to-market”
An additional focus of the announcement is Qualcomm’s open integration platform, built around a modular architecture that dynamically allocates processing resources and facilitates interoperability among heterogeneous ECUs. It addresses a real pain point: many manufacturers and suppliers carry legacy hardware and software that cannot be replaced overnight.
The solution is complemented by a development toolchain—including simulation, APIs, and software resources—intended to speed up prototyping, validation, and deployment. Most importantly, it supports fundamental features like OTA (over-the-air) updates and the ability to add or enhance functionalities over the vehicle’s lifespan.
In practice, this is what turns a “closed” ADAS into a living platform: a vehicle that not only maintains its original features but can also improve perception, logic, and behavior without requiring visits to a workshop—assuming regulatory frameworks and safety policies permit.
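As a rough sketch of the kind of gating such a lifecycle implies, the snippet below only activates an update if it is signed, matches the installed hardware revision, and is approved for the vehicle’s market. All field names and checks are invented for illustration and do not reflect the actual ZF or Qualcomm tooling.

```python
# Minimal sketch of an OTA activation gate: the update is applied only if it is
# signed, targets the installed hardware revision, and is approved for the
# vehicle's market. Every field name and check here is an illustrative assumption.
from dataclasses import dataclass


@dataclass
class OtaPackage:
    version: str
    signature_valid: bool
    target_hw_revision: str
    approved_markets: frozenset[str]


def can_activate(pkg: OtaPackage, hw_revision: str, market: str) -> bool:
    return (
        pkg.signature_valid
        and pkg.target_hw_revision == hw_revision
        and market in pkg.approved_markets
    )


pkg = OtaPackage("2.4.0", True, "proai-gen2", frozenset({"EU", "US"}))
print(can_activate(pkg, hw_revision="proai-gen2", market="EU"))  # True
print(can_activate(pkg, hw_revision="proai-gen1", market="EU"))  # False: wrong hardware
```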
Scaling from compliance to advanced automation
Both ZF and Qualcomm emphasize that their proposal aims to cover a broad range, from regulatory functions to higher levels of automation, with Level 3 scalability as the upper limit for this collaboration. This breadth is important because, in the real market, basic vehicles only need to meet standards, while premium models seek differentiation through more advanced capabilities.
Furthermore, the combined ProAI + Snapdragon Ride platform is presented as a foundation not just for ADAS but also for new electronic architectures where the vehicle is organized by domains or zones. This opens the door to integrating functions that previously operated in silos—such as driving assistance and infotainment—with a clearer pathway toward central controllers.
ZF: industrial-scale; Qualcomm: platform for the software-defined vehicle
ZF enters this movement with the weight of a global supplier: the company reports around 161,600 employees, €41.4 billion in sales for 2024, and a manufacturing footprint with 161 facilities across 30 countries. In platform discussions, this scale signifies manufacturing capacity, support, and continuity—three factors that, in automotive, are as critical as performance.
For Qualcomm, such collaborations reinforce its strategy to become the “technology layer” of the software-defined vehicle: providing computing, perception, and tools for OEMs to build their product experience, differentiation, and commercial strategies on top.
Frequently Asked Questions
What does it mean that ZF ProAI can act as a domain, zonal, or central controller?
It means the same system can adapt to different vehicle architectures: from controlling a set of related functions (domain), to managing a physical zone of the vehicle (zonal), to serving as the main computer that coordinates multiple domains (central).
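A simplified illustration of those three roles, with invented names rather than anything from the ProAI software itself:

```python
# Illustrative sketch of how one compute platform could take different roles
# in a vehicle E/E architecture. Role descriptions and names are invented.
from enum import Enum


class ControllerRole(Enum):
    DOMAIN = "domain"    # runs one functional domain, e.g. driving assistance
    ZONAL = "zonal"      # aggregates I/O and local functions for one physical zone
    CENTRAL = "central"  # coordinates several domains on a single main computer


def describe(role: ControllerRole) -> str:
    scope = {
        ControllerRole.DOMAIN: "one set of related functions (e.g. ADAS only)",
        ControllerRole.ZONAL: "sensors and actuators of one area of the car",
        ControllerRole.CENTRAL: "multiple domains consolidated on the main computer",
    }
    return f"{role.value} controller -> manages {scope[role]}"


for role in ControllerRole:
    print(describe(role))
```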
What are TOPS, and why does ZF talk about more than 1,500 TOPS?
TOPS (tera operations per second) measures how many trillion operations a processor can execute per second, a figure commonly associated with AI workloads. In ADAS, higher TOPS allow perception, sensor fusion, and decision logic to run more complex models or handle additional camera and radar inputs, and leave headroom for software evolution.
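As a back-of-the-envelope illustration (the frame rate and per-frame model workload below are assumptions, not figures from the announcement):

```python
# Back-of-the-envelope budget: what 1,500 TOPS means per camera frame.
# The frame rate and per-frame workload are illustrative assumptions.
TOPS = 1_500                    # tera (10^12) operations per second
ops_per_second = TOPS * 10**12  # 1.5e15 operations per second

frame_rate_hz = 30              # assumed camera frame rate
ops_per_frame_budget = ops_per_second / frame_rate_hz
print(f"{ops_per_frame_budget:.1e} operations available per 30 fps frame")  # ~5.0e13

# A hypothetical perception model needing 2e12 operations per frame would use
# only a few percent of that budget, leaving headroom for fusion and planning.
model_ops_per_frame = 2 * 10**12
print(f"{model_ops_per_frame / ops_per_frame_budget:.1%} of the per-frame budget")
```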
What advantages does an open architecture provide for an ADAS system?
It allows integration of third-party software (such as manufacturer-developed algorithms or specialized components), reduces dependence on a single supplier, and makes it easier to adapt the hardware to different needs without closing off the ecosystem.
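A toy sketch of what that openness implies in practice: supplier and OEM components registering against a shared interface rather than being baked into a closed stack. The interface and component names are invented for illustration and do not correspond to any real API from ZF or Qualcomm.

```python
# Toy illustration of an "open" integration point: third-party and in-house
# components plug into a shared interface. All names here are invented.
from typing import Protocol


class PerceptionComponent(Protocol):
    name: str

    def detect(self, frame: bytes) -> list[str]: ...


class SupplierLaneDetector:
    name = "supplier_lane_detector"

    def detect(self, frame: bytes) -> list[str]:
        return ["lane_left", "lane_right"]


class OemTrafficSignDetector:
    name = "oem_traffic_sign_detector"  # OEM-developed, plugged in alongside

    def detect(self, frame: bytes) -> list[str]:
        return ["speed_limit_80"]


registry: list[PerceptionComponent] = [SupplierLaneDetector(), OemTrafficSignDetector()]

frame = b"\x00" * 16  # stand-in for camera data
for component in registry:
    print(component.name, "->", component.detect(frame))
```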
How do OTA updates improve driving assistance functions?
They enable remote deployment of improvements and fixes—from perception optimizations to new features—always under security, validation, and regulatory compliance controls.
via: qualcomm

