In a pivotal expansion of its long-term corporate vision, Uber is moving beyond its roots in ride-hailing to position itself as a central infrastructure provider for the autonomous vehicle (AV) industry. The strategy focuses on leveraging its massive global network of millions of drivers to transform everyday vehicles into high-fidelity data-gathering hubs. By equipping driver-owned cars with advanced sensors, Uber aims to create the world’s largest real-time ecosystem for training the artificial intelligence that will power future self-driving systems.
A strategic pivot to AV Labs
The initiative is spearheaded by the recently launched Uber AV Labs, a division dedicated to accelerating autonomous mobility by collecting and sharing vast amounts of real-world driving data.
While the programme is currently in a “scrappy” early phase—utilising a limited fleet of sensor-equipped vehicles such as the Hyundai Ioniq 5—the long-term goal is to scale this technology across the platform. Chief Technology Officer Praveen Neppalli Naga recently highlighted that the primary hurdle for AV developers has shifted from writing software to acquiring diverse, high-volume training data, which Uber is uniquely positioned to provide.
Building the global AV Cloud
Central to this transition is the development of an AV Cloud, a searchable repository of multi-sensor driving data. The platform lets partners pull the specific datasets essential for reaching “Level 4” autonomy, such as recordings of vehicles navigating complex intersections or reacting to unpredictable weather. Uber has already formed partnerships with approximately 25 autonomous firms, including NVIDIA and Wayve, to supply the raw telemetry and imagery needed to refine their machine-learning models.
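To make the “searchable repository” idea concrete, here is a minimal sketch of scenario-tag filtering over a catalogue of recorded clips. The `Clip` schema, the tag names, and the `search` helper are all hypothetical illustrations of how a partner might query such a system, not Uber’s actual AV Cloud API.

```python
from dataclasses import dataclass, field


@dataclass
class Clip:
    """A short multi-sensor recording in the repository (hypothetical schema)."""
    clip_id: str
    city: str
    tags: set[str] = field(default_factory=set)  # e.g. {"intersection", "rain"}


def search(catalog: list[Clip], required_tags: set[str]) -> list[Clip]:
    """Return clips whose scenario tags include every requested label."""
    return [c for c in catalog if required_tags <= c.tags]


catalog = [
    Clip("c1", "Phoenix", {"intersection", "clear"}),
    Clip("c2", "Seattle", {"intersection", "rain", "pedestrian"}),
    Clip("c3", "Seattle", {"highway", "rain"}),
]

# A partner training a model on rainy intersections:
hits = search(catalog, {"intersection", "rain"})
print([c.clip_id for c in hits])  # prints ['c2']
```

A production system would of course index petabytes of telemetry rather than filter a list in memory, but the interface idea is the same: partners describe the scenario they need, and the repository returns matching data.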
Democratising data for safety
Uber’s leadership has emphasised that the primary objective is to “democratise” this information rather than to seek immediate monetisation. By providing unblurred, high-resolution footage that lets AI read fine-grained cues such as pedestrian eye contact or head turns, the company claims it can significantly widen safety margins for the entire industry. In parallel, “shadow mode” testing allows AI systems to simulate driving decisions during real-world trips without ever taking control of the vehicle, providing a safe yet rigorous training ground.
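The shadow-mode idea can be illustrated with a toy sketch: a candidate planner proposes a decision on every frame of a real trip, the proposal is logged and compared against what the human driver actually did, and nothing is ever actuated. Everything below (the `Frame` schema, the headway heuristic, the agreement metric) is a hypothetical simplification for illustration, not Uber’s implementation.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """One timestep of sensor input (hypothetical, simplified)."""
    speed_mps: float        # current vehicle speed
    obstacle_dist_m: float  # distance to nearest obstacle ahead
    human_action: str       # what the human driver actually did


def shadow_policy(frame: Frame) -> str:
    """Toy stand-in for an AV planner: brake inside a 2-second headway."""
    if frame.obstacle_dist_m < 2.0 * frame.speed_mps:
        return "brake"
    return "maintain"


def run_shadow_mode(trip: list[Frame]) -> list[dict]:
    """Score the planner against the human; decisions are logged, never executed."""
    log = []
    for i, frame in enumerate(trip):
        proposed = shadow_policy(frame)  # never sent to the vehicle controls
        log.append({
            "frame": i,
            "proposed": proposed,
            "human": frame.human_action,
            "agreed": proposed == frame.human_action,
        })
    return log


trip = [
    Frame(speed_mps=13.0, obstacle_dist_m=60.0, human_action="maintain"),
    Frame(speed_mps=13.0, obstacle_dist_m=20.0, human_action="brake"),
    Frame(speed_mps=8.0, obstacle_dist_m=10.0, human_action="maintain"),
]
log = run_shadow_mode(trip)
agreement = sum(e["agreed"] for e in log) / len(log)
print(f"agreement rate: {agreement:.0%}")  # prints "agreement rate: 67%"
```

Disagreements like the final frame above are exactly what makes the data valuable: each one flags a real-world situation where the model and an experienced human diverge, without any safety risk during collection.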
The road ahead
This data-centric shift aligns with Uber’s broader 2026 targets, which include offering autonomous rides in 15 cities and becoming the world’s largest facilitator of AV trips by 2029. By integrating its digital and physical foundations—combining geospatial mapping with real-world sensor data—Uber is transitioning from a service provider into a foundational engine for the next generation of global transport.