
NVIDIA DRIVE

NVIDIA DRIVE is a comprehensive, AI-powered platform developed by NVIDIA Corporation for enabling autonomous vehicles, integrating scalable hardware, software stacks, and developer tools to process sensor data, run deep neural networks, and support levels of autonomy from advanced driver assistance systems (ADAS) to full self-driving capabilities. Introduced at the Consumer Electronics Show (CES) in January 2015 with the initial DRIVE PX auto-pilot computing system, the platform has evolved to address the growing demands of production-ready autonomous driving, starting as an early supercomputer for vehicles and advancing through generations like DRIVE AGX (2018) and DRIVE Orin (announced in 2019). The core of NVIDIA DRIVE lies in its hardware components, particularly the DRIVE AGX family of in-vehicle computing systems, which provide energy-efficient processing for complex workloads such as perception, prediction, and planning. The DRIVE Orin generation delivers up to 254 tera operations per second (TOPS) of performance, enabling scalability across vehicle fleets from passenger cars to robotaxis while supporting safety-critical applications certified to automotive standards like ASIL-D. Complementing the hardware is the DRIVE OS software stack, a Linux-based operating system that includes middleware for sensor processing, deep learning acceleration via NVIDIA's CUDA and TensorRT libraries, and tools for simulation and validation through DRIVE Sim. This end-to-end ecosystem allows automakers and suppliers to develop, test, and deploy autonomous features efficiently, with built-in redundancy for fault tolerance. NVIDIA DRIVE has been adopted by major automotive manufacturers, powering production vehicles such as Mercedes-Benz models with DRIVE PILOT for Level 3 automated driving and Volvo's EX90 with advanced ADAS, including a September 2025 upgrade to a dual-Orin configuration.
Partnerships dating back to 2017, along with 2025 collaborations with Uber for scaling robotaxi fleets and with Tensor Auto for consumer-owned autonomous vehicles, further extend its reach into commercial fleets and ride-hailing services. As of November 2025, the platform has advanced with the release of the DRIVE AGX Thor developer kit in August 2025, targeting production in 2026 for even higher performance in next-generation autonomous mobility.

Overview

Purpose and Key Features

The NVIDIA DRIVE platform serves as an end-to-end, scalable computing solution for autonomous vehicles, enabling Level 2+ advanced driver assistance systems (ADAS) through Level 4 full autonomy by integrating hardware, software, and AI technologies to process complex in-vehicle workloads in real time. Initially announced in January 2015 at the Consumer Electronics Show (CES) as an extension of NVIDIA's Tegra processors for automotive applications, it has evolved from supporting basic driver assistance features to facilitating comprehensive self-driving capabilities, prioritizing safety through redundant computing architectures that ensure compliance with functional safety standards. Key features of NVIDIA DRIVE include energy-efficient processing, which optimizes power consumption for demanding automotive environments while delivering high-performance inference, and support for transformer-based models that enhance perception and prediction in dynamic scenarios. The platform's scalability allows for fleet-wide deployment, enabling automakers to standardize AI pipelines across diverse vehicle configurations and upgrade paths without overhauling entire systems. Additionally, built-in redundancy in hardware and software components, such as diverse processing units and fail-over mechanisms, underpins its emphasis on safety, achieving Automotive Safety Integrity Level D (ASIL-D) certification to mitigate risks in autonomous operations.

Target Markets and Applications

NVIDIA DRIVE targets primary markets including passenger vehicles, commercial trucking, robotaxis, and delivery vehicles, enabling advanced autonomous capabilities across diverse transportation sectors. In passenger vehicles, the platform supports the transition from driver assistance to higher levels of autonomy, powering features in electric and conventional cars for enhanced safety and convenience. Commercial trucking applications leverage DRIVE for long-haul operations, reducing driver fatigue and optimizing logistics through reliable autonomous piloting. Robotaxis utilize the system for driverless ride-hailing services, while delivery vehicles facilitate unmanned operations in logistics. The platform's applications span advanced driver-assistance systems (ADAS) and full autonomous vehicle (AV) functionalities. In ADAS, NVIDIA DRIVE enables features such as adaptive cruise control, which maintains safe following distances by adjusting speed in real time, and lane keeping, which assists in centering the vehicle within lanes to prevent unintended drifts. For full AV operation, it supports complex scenarios like urban navigation, where the system processes sensor data for obstacle avoidance and route optimization in dense city environments, and highway piloting, allowing seamless merging and lane changes at high speeds. These capabilities are built on scalable AI inference, ensuring robust performance across varying driving conditions. Integration with electric vehicles (EVs) highlights DRIVE's role in power-efficient processing, where the platform's low-energy architecture minimizes battery drain while handling intensive compute tasks for perception and planning. This is particularly vital for EVs, as it balances performance with range optimization, supporting features from Level 2+ assistance to Level 4 operations without compromising efficiency. The system's scalability extends from individual personal cars to large fleet operations, accommodating single-vehicle deployments in consumer markets to coordinated robotaxi and trucking fleets numbering in the tens of thousands.
Additionally, DRIVE supports vehicle-to-everything (V2X) communication, enabling real-time data exchange with infrastructure, other vehicles, and pedestrians to enhance safety and cooperative driving.
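The adaptive-cruise-control behavior described above can be illustrated with a toy proportional controller that converges toward a time-gap-based following distance. This is a conceptual sketch only; the function name, gains, and units are hypothetical and are not part of any NVIDIA DRIVE API.

```python
# Toy sketch of adaptive cruise control: keep a time-gap-based distance
# to the lead vehicle by nudging the ego speed command. All names and
# gains here are illustrative assumptions, not NVIDIA DRIVE code.

def acc_speed_command(ego_speed, lead_distance, lead_speed,
                      time_gap=2.0, gain=0.5):
    """Return a new speed command (m/s) for the ego vehicle."""
    desired_gap = ego_speed * time_gap          # distance we want to keep
    gap_error = lead_distance - desired_gap     # positive => too far back
    # Converge toward the lead vehicle's speed while closing the gap error.
    command = lead_speed + gain * gap_error / max(time_gap, 1e-6)
    return max(0.0, command)                    # never command reverse

# At the desired gap and matched speeds, the command is unchanged.
steady = acc_speed_command(25.0, 50.0, 25.0)    # 25.0 m/s
# When too close to a slower lead vehicle, the command drops.
closing = acc_speed_command(25.0, 30.0, 20.0)   # 15.0 m/s
```

A production stack would layer comfort limits, braking authority, and sensor-fused distance estimates on top of such a control law.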

History

Inception and Early Milestones

NVIDIA's entry into automotive AI began in 2014, building on the success of its Tegra processors in mobile devices, with the announcement of the Tegra K1 at CES as a foundational platform for in-vehicle supercomputing. The Tegra K1, featuring a 192-core GPU based on the Kepler architecture and support for CUDA, enabled early applications in advanced driver assistance systems (ADAS) such as pedestrian detection and lane monitoring, marking NVIDIA's initial push toward AI-enhanced vehicle perception. In January 2015, at CES, NVIDIA formally launched the DRIVE platform, introducing DRIVE PX as the world's first in-car computer designed for autonomous driving development. Powered by the Tegra X1 on the Maxwell GPU architecture, DRIVE PX processed data from multiple cameras to enable 360-degree surround vision and deep learning-based object detection, positioning it as a key enabler for self-driving prototypes. Concurrently, NVIDIA unveiled DRIVE CX, an in-vehicle infotainment system that bridged traditional cockpit computing with emerging autonomy features, utilizing the same Tegra X1 to deliver high-fidelity 3D mapping and graphics for driver interfaces. The platform's early milestones included strategic partnerships for prototype testing in 2016, with automakers integrating DRIVE PX into test vehicles for ADAS validation and Tesla adopting the subsequent DRIVE PX 2 hardware for image processing in production models. From inception, DRIVE emphasized deep learning for perception tasks, such as real-time object detection and scene understanding, demonstrated publicly at CES 2016 through live DRIVENet showcases that tracked vehicles and pedestrians using neural networks on DRIVE PX 2. These prototypes highlighted the platform's potential to advance from assisted to fully autonomous driving capabilities.

Evolution Through 2025

In 2017, NVIDIA shifted its DRIVE platform toward the AGX series, introducing the DRIVE PX Pegasus system in October, which combined two Xavier SoCs and two discrete GPUs to deliver over 320 TOPS for Level 4 (L4) autonomous capabilities, targeting robotaxis and highly automated vehicles. The DRIVE AGX Xavier, based on the Volta architecture, followed in 2018 as a production-ready SoC providing 30 TOPS, enabling scalable L4 development in partnerships with automakers for advanced driver-assistance systems (ADAS) and full self-driving. From 2018 to 2020, the platform emphasized redundancy and safety, with Xavier powering early L4 prototypes and entering volume production in 2020 for commercial AV testing. In December 2019, NVIDIA announced DRIVE AGX Orin, leveraging the Ampere architecture for up to 254 TOPS and significant efficiency improvements over Xavier, with production ramping in 2022 for vehicles from early-adopter partners. This marked a focus on software-defined vehicles, reducing power consumption by up to 70% while supporting generative AI workloads. NVIDIA revealed the DRIVE Atlan SoC in April 2021, built on a next-generation GPU architecture to target 1,000 TOPS for 2025 production, but canceled it in September 2022, citing the superior performance of the upcoming Blackwell architecture in the new DRIVE Thor platform. At GTC 2024 in March, NVIDIA detailed DRIVE Thor's integration of Blackwell GPUs for over 1,000 TOPS, unifying autonomous driving, ADAS, and parking in a single system for cost efficiency, with initial sampling to partners including Lucid. The DRIVE AGX Thor developer kit became available for pre-order in August 2025, shipping in September to accelerate L4 development. In June 2025, NVIDIA rolled out its full-stack AV software platform, including modular components for perception, planning, and simulation, enabling scalable deployment across Orin and Thor hardware. At CES 2025 in January, the DRIVE Hyperion platform achieved cybersecurity process approval from TÜV SÜD, with DRIVE OS 6.0 conforming to ASIL-D standards pending certification release.
NVIDIA launched the DRIVE AGX Hyperion 10 reference platform in October 2025, a modular L4-ready architecture combining Thor compute, sensors, and software for robotaxis, adopted by Uber and other partners to scale fleets starting in 2027. By late 2025, the DRIVE ecosystem encompassed a growing number of partners, including automakers and Tier 1 suppliers such as Bosch, Continental, and ZF, supporting production scaling for autonomous vehicle fleets starting in 2027.

Hardware Platforms

Maxwell and Pascal-Based Systems

The NVIDIA DRIVE CX, introduced in 2015, served as an early in-vehicle computer primarily targeted at infotainment and digital cockpit applications with basic vision processing capabilities. It utilized the Tegra X1 processor, featuring a Maxwell-based GPU with 256 cores and an integrated CPU comprising four Cortex-A57 cores and four Cortex-A53 cores. This configuration enabled the system to drive up to 16.8 million pixels across multiple displays for advanced graphics in instrument clusters and surround-view systems, while supporting initial computer vision tasks for vehicle differentiation in prototypes. The DRIVE PX platform, also introduced in 2015, marked NVIDIA's entry into autonomous driving compute with a focus on deep learning for advanced driver-assistance systems (ADAS). Built around dual Tegra X1 superchips employing the Maxwell GPU architecture, it delivered approximately 2.3 TFLOPS of floating-point performance and could process inputs from up to 12 high-resolution cameras at 1.3 gigapixels per second. Integrated ARM-based CPUs facilitated multi-camera fusion for 360-degree environmental awareness, making it suitable for early prototyping of ADAS features. This system was deployed in test vehicles, including automaker prototypes for real-world validation of perception algorithms. Announced in 2016, the DRIVE PX 2 represented a significant leap in compute power for L2+ ADAS, incorporating the Pascal architecture with two integrated Tegra X2 (Parker) SoCs, each with a 256-core Pascal GPU and a CPU pairing two Denver 2 cores with four Cortex-A57 cores, alongside two discrete GP106 Pascal GPUs. The platform achieved 8 TFLOPS of FP32 performance and 24 trillion operations per second (TOPS) at INT8 precision, enabling robust sensor fusion from cameras, lidar, radar, and ultrasonics via CUDA-accelerated deep learning. Operating at 250 watts with liquid cooling, it powered early autonomous vehicle trials, including mapping and perception pilots in urban environments.
These Maxwell- and Pascal-based systems emphasized CPU-GPU integration for efficient deep learning acceleration but were limited by lower scalability and higher power demands relative to subsequent generations, positioning them primarily for prototyping rather than production-scale deployment.

Volta and Turing-Based Systems

The Volta- and Turing-based systems marked a significant advancement in NVIDIA's DRIVE platform, introducing high-performance system-on-chips (SoCs) optimized for Level 3 and higher autonomous driving capabilities between 2018 and 2020. These systems emphasized scalable compute with enhanced safety and security features, transitioning from earlier prototype-oriented hardware to production-ready solutions capable of handling complex AI tasks. The DRIVE AGX Xavier, released in 2018, served as the foundational SoC of this era, integrating a Volta-based GV10B GPU with 512 CUDA cores and 64 Tensor Cores alongside an 8-core ARMv8.2 CPU cluster. It delivered 30 TOPS of INT8 performance for AI inference, enabling efficient processing of deep neural networks while supporting configurable power modes from 10 W to 30 W to balance performance and thermal constraints in automotive environments. This design facilitated advanced driver-assistance systems (ADAS) and partial automation, with the SoC's deep learning accelerators (DLAs) providing dedicated hardware for real-time inference on convolutional neural networks. Building on Xavier, the DRIVE AGX Pegasus platform launched in 2019 as a dual-SoC configuration combining two Xavier processors with additional Turing-based GPUs, achieving up to 320 TOPS of performance for highly automated applications. Targeted at Level 4 autonomy in robotaxis and trucking, Pegasus incorporated integrated redundancy through its dual-redundant architecture, ensuring fault-tolerant operation critical for safety-certified systems compliant with ISO 26262 standards. Key innovations across these systems included a hardware root-of-trust mechanism, leveraging on-chip fuses and cryptographic keys for secure boot chains that protect against tampering by verifying software integrity from the earliest boot stages. The DLAs further accelerated inference, offloading vision and deep learning workloads from the main GPU to maintain low latency. In terms of performance, these systems supported sensor fusion from over 12 cameras and lidar point cloud processing at 30 frames per second, enabling robust environmental perception for dynamic urban and highway scenarios.
Adoption was evident in pilot programs, with Zoox leveraging DRIVE AGX Xavier for early development before upgrading to subsequent platforms, and other automakers incorporating it into advanced ADAS testing initiatives.
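The hardware root-of-trust boot chain described above can be sketched conceptually: each stage's image is hashed and checked against a trusted value anchored in immutable hardware before control is handed over. The constants and function names below are hypothetical stand-ins for fuse-backed values; this is purely illustrative, not NVIDIA's boot implementation.

```python
import hashlib

# Conceptual sketch of a secure boot chain. The "trusted digests" stand in
# for values burned into on-chip fuses at manufacturing time (assumption
# for illustration); real secure boot uses signatures, not bare hashes.

def digest(image: bytes) -> str:
    return hashlib.sha256(image).hexdigest()

bootloader = b"stage1-bootloader"
kernel = b"stage2-kernel"

TRUSTED_BOOTLOADER_DIGEST = digest(bootloader)  # hypothetical fuse value
TRUSTED_KERNEL_DIGEST = digest(kernel)          # hypothetical fuse value

def secure_boot(bl_image: bytes, kernel_image: bytes) -> str:
    """Verify each stage before executing it; halt on any mismatch."""
    if digest(bl_image) != TRUSTED_BOOTLOADER_DIGEST:
        return "halt: bootloader integrity check failed"
    if digest(kernel_image) != TRUSTED_KERNEL_DIGEST:
        return "halt: kernel integrity check failed"
    return "boot ok"
```

The key property is transitive trust: a stage runs only if the previous, already-verified stage vouched for it.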

Ampere-Based Systems

The NVIDIA DRIVE AGX Orin, released in 2022, represents a significant advancement in Ampere-based systems for autonomous vehicle computing, integrating the GA10B GPU based on the Ampere architecture with a 12-core Cortex-A78AE CPU cluster. This platform delivers up to 254 TOPS of INT8 performance, enabling efficient handling of multiple inference pipelines for perception, prediction, and planning tasks. With a configurable thermal design power (TDP) of 45-60 W, Orin prioritizes power efficiency for production deployment in energy-constrained automotive environments. Orin variants cater to diverse applications, including the Nano variant, which provides 70 TOPS of performance in a compact, cost-sensitive form factor suitable for advanced driver-assistance systems (ADAS) at SAE Level 2+. In contrast, the full Orin configuration targets Level 4 (L4) autonomy, supporting complex workloads in robotaxis and highly automated vehicles. Key advancements include support for hardware-accelerated video decoding to process high-resolution camera feeds more efficiently and enhanced ray-tracing capabilities via second-generation RT cores, which accelerate realistic simulation rendering for virtual training environments. The system accommodates up to 32 sensors, including 16 GMSL2 cameras, multiple radars and lidars via high-bandwidth Ethernet (up to 30 Gb/s), and CAN interfaces for vehicle networking. Efficiency improvements are a hallmark of Orin, offering nearly 7x the AI performance of the previous-generation Xavier platform at comparable power levels, allowing for denser compute in space-limited production vehicles without excessive thermal demands. This scalability has facilitated real-world integrations, such as Mercedes-Benz's Drive Pilot system, which leverages Orin for Level 3 hands-free highway driving in models like the S-Class and EQS starting in 2023. Similarly, Volvo incorporated dual-Orin configurations in the EX90 electric SUV by 2024, enhancing over-the-air updates and advanced safety features.
These deployments underscore Orin's role in transitioning from development to scalable, production-grade autonomous driving hardware.

Blackwell-Based Systems

The NVIDIA DRIVE AGX Thor, introduced in 2024 as the cornerstone of Blackwell-based systems, integrates a custom Blackwell GPU with Arm Neoverse V3AE CPUs to deliver up to 1,000 TOPS of INT8 performance (2,000 FP4 TFLOPS), enabling advanced support for large language models (LLMs) in autonomous planning tasks. The platform also incorporates elements of the Grace CPU architecture for hybrid CPU-GPU computing, facilitating efficient processing of complex AI workloads at the edge. With a thermal design power (TDP) of 350 W, DRIVE AGX Thor prioritizes energy efficiency while handling demanding real-time computations essential for Level 4 and Level 5 autonomy. Key features of DRIVE AGX Thor include enhanced cybersecurity protections that guard sensitive AI models and data from unauthorized access during processing. The system achieves up to 10 times the performance of the prior Orin platform, particularly in generative AI applications for predicting driver and pedestrian behaviors, thereby improving safety in dynamic environments. Production samples became available to select partners in the first half of 2025, paving the way for scaled deployments targeting 2027 vehicle production timelines. As of November 2025, integrations are advancing with partners spanning passenger vehicles, robotaxis, and autonomous trucks, including International/PlusAI. The DRIVE AGX Thor Developer Kit was made available for preorder in August 2025, bundled with DRIVE OS 7 to accelerate prototyping for automakers and suppliers. It has since reached general availability, with integrations underway for partners such as Aurora and Continental, who are leveraging it for driverless truck development. AGX Thor also powers the NVIDIA DRIVE Hyperion 10 reference platform, combining dual Thor units for enhanced redundancy in production vehicles.

Software Stack

DRIVE Operating System

The NVIDIA DRIVE Operating System (DRIVE OS) serves as the foundational software layer for the DRIVE platform, delivering a certified, high-performance environment tailored for automotive and autonomous vehicle applications. Built on a Linux kernel with extensions for real-time processing and safety-critical operations, it integrates hardware accelerators, middleware, and development tools to enable efficient execution of complex workloads such as sensor fusion and AI inference. DRIVE OS emphasizes determinism, security, and scalability, supporting deployment across NVIDIA's DRIVE AGX hardware generations while adhering to automotive standards like ASIL-D for core development processes. The evolution of DRIVE OS began with version 5 in 2020, a Linux-based system optimized for earlier platforms like DRIVE AGX Xavier, providing essential libraries for building and deploying autonomous vehicle software. Version 6, released in 2022, advanced support for Ampere-based systems such as DRIVE AGX Orin, incorporating Orin-specific optimizations for enhanced GPU performance and energy efficiency in AI workloads. By 2025, version 7 introduced compatibility with Blackwell GPUs in DRIVE AGX Thor, featuring a real-time kernel extension for deterministic operations, dedicated Blackwell drivers supporting full production clock speeds, and integrated safety monitors like VMON for voltage monitoring and TMON for thermal oversight. Throughout this progression, DRIVE OS 6.0 and later versions conform to ASIL-D standards, with certifications from TÜV SÜD and TÜV Rheinland validating their suitability for safety-critical automotive use. Central to DRIVE OS is the NVIDIA DriveWorks middleware, which includes a Sensor Abstraction Layer that unifies data capture from diverse automotive sensors, such as cameras, lidar, and radar, through standardized APIs, enabling seamless integration and synchronized timestamping without hardware-specific coding.
Key features across versions encompass over-the-air (OTA) updates via the DRIVE Update mechanism, which handles secure firmware and partition management with post-quantum cryptography verification for multi-SoC systems. Virtualization capabilities, powered by a hypervisor, facilitate multi-OS support by allowing guest virtual machines (e.g., Linux or QNX) to fully utilize system cores while isolating safety-critical services, eliminating the need for a dedicated DriveOS core. Automotive-grade power management ensures reliability through features like suspend-to-RAM states, graceful shutdowns, and dynamic thermal/fan control via safety MCUs. DRIVE OS integrates deeply with NVIDIA's AI ecosystem, leveraging CUDA for parallel computing and TensorRT for optimized deep learning inference, which accelerate AI tasks directly on DRIVE hardware accelerators like GPUs and DLAs. This enables developers to deploy high-throughput models for perception and planning while maintaining low latency and power efficiency suitable for in-vehicle computing.
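The INT8 throughput figures quoted throughout this article rest on quantization: mapping floating-point tensors into 8-bit integers with a scale factor. The snippet below is a minimal NumPy sketch of symmetric per-tensor INT8 quantization, conceptually similar to what TensorRT calibration produces; it is not the TensorRT API, and the function names are invented for illustration.

```python
import numpy as np

# Illustrative sketch of symmetric per-tensor INT8 quantization, the
# numeric format behind the INT8 TOPS figures cited for DRIVE hardware.
# Not the TensorRT API; names and scheme are simplified assumptions.

def quantize_int8(x):
    """Map float values into int8 using a symmetric scale."""
    max_abs = float(np.abs(x).max())
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

x = np.array([-1.0, 0.0, 0.5, 1.0], dtype=np.float32)
q, s = quantize_int8(x)
x_hat = dequantize(q, s)   # recovers x to within one quantization step
```

Executing low-precision arithmetic like this on dedicated integer units is what lets automotive SoCs trade a small, bounded accuracy loss for large gains in throughput per watt.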

Perception and Planning Software

The NVIDIA DRIVE perception stack employs deep neural networks for object detection and semantic segmentation to enable environmental understanding in autonomous vehicles. Object detection models identify and classify elements such as vehicles, pedestrians, and traffic signs from sensor inputs, while semantic segmentation assigns pixel-level labels to scene components like roads and lanes for precise scene parsing. These networks leverage transformer-based architectures, including SegFormer, a vision transformer model developed by NVIDIA researchers for efficient semantic segmentation with lightweight decoders. SegFormer models are trained on datasets like nuScenes, a large-scale autonomous driving dataset featuring 1.4 million camera images and annotations from urban driving scenarios. The planning module in DRIVE integrates end-to-end driving models that combine motion prediction with trajectory planning to generate safe vehicle paths. These models, rolled out in the full-stack DRIVE AV software update in June 2025, unify perception, prediction, and planning into a single framework, replacing traditional modular approaches. By processing raw sensor data directly to output control commands, the system optimizes trajectories for collision avoidance and efficiency in dynamic environments. Key concepts in DRIVE's perception and planning include foundation models trained on extensive human driving datasets to mimic natural behaviors, such as nuanced speed adjustments and lane changes. Latency is reduced through edge inference on the DRIVE AGX platform, enabling real-time processing with optimizations like the LiteVLM pipeline, which achieves up to 2.5 times lower end-to-end latency compared to prior methods. The 2025 software updates introduce enhanced multi-modal fusion of camera, radar, and lidar data, improving robustness for Level 4 autonomy in urban settings. Performance in adverse weather is bolstered by simulation-trained models that replicate conditions like rain and fog, allowing the stack to maintain reliable perception and planning without real-world exposure to hazards.
These models, validated on benchmarks including nuScenes, demonstrate robust operation across diverse scenarios, supporting safe deployment in challenging environments.
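The pixel-level labeling described above reduces, at its final step, to taking the highest-scoring class at each pixel of a per-class score map. The following NumPy sketch shows that step on a tiny hypothetical grid; the shapes and class IDs are invented for illustration and are unrelated to any actual DRIVE model output format.

```python
import numpy as np

# Minimal sketch: turning per-class logits into a pixel-level label map,
# the final step of a semantic-segmentation network. Shapes and class
# IDs (0=road, 1=lane, 2=vehicle) are hypothetical.

NUM_CLASSES, H, W = 3, 2, 2
logits = np.zeros((NUM_CLASSES, H, W), dtype=np.float32)
logits[0] += 1.0            # "road" scores highest everywhere...
logits[2, 0, 0] = 5.0       # ...except one pixel, where "vehicle" wins

label_map = logits.argmax(axis=0)   # (H, W) array of class IDs per pixel
```

Real networks produce such score maps at full camera resolution, and downstream planning consumes the resulting label map (e.g., drivable-surface masks) rather than raw pixels.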

Development and Simulation Tools

The NVIDIA DRIVE SDK, encompassing the DriveWorks SDK and DRIVE OS components, provides developers with a comprehensive set of libraries, APIs, and tools for integrating sensors and building autonomous vehicle (AV) applications. DriveWorks includes modules for sensor abstraction, supporting cameras, radars, lidars, and IMUs through unified interfaces that enable raw data serialization, virtual sensor replay, and dynamic calibration based on vehicle motion and measurements. These APIs facilitate image and point cloud processing, egomotion estimation using odometry or IMU data, and compute graph frameworks optimized for NVIDIA DRIVE AGX platforms. Additionally, the SDK integrates NVIDIA NvMedia for efficient sensor data handling and NvStreams for high-throughput processing pipelines. For simulation, the DRIVE platform leverages NVIDIA Omniverse for physics-based AV testing and synthetic data generation (SDG), evolving from earlier DRIVE Sim deployments introduced in 2023. Current tools include Omniverse Sensor RTX for high-fidelity sensor simulation (cameras, radars, lidars) integrated with platforms like CARLA, and NVIDIA Cosmos for world foundation models that enable neural reconstruction (via NuRec) and photorealistic scene generation from text prompts or real-world data. Cosmos Predict supports multiview video synthesis for training scenarios, while Cosmos Curator aids in dataset filtering and retrieval to accelerate SDG pipelines. These Omniverse-powered workflows allow scalable validation of perception and planning algorithms in virtual environments, reducing reliance on real-world miles. Key development tools within the DRIVE ecosystem adapt the CUDA Toolkit for GPU-accelerated computing on automotive hardware, enabling parallelization of AV workloads. TensorRT optimizes models for low-latency inference, supporting quantization and layer fusion tailored to DRIVE AGX SoCs, which can achieve up to 8x performance gains in perception tasks compared to unoptimized frameworks.
Profiling is handled via NVIDIA Nsight Systems, which analyzes CUDA kernels, TensorRT execution timelines, and system latency to identify bottlenecks in sensor and AI pipelines. As of 2025, DRIVE tools integrate with NVIDIA DGX systems for cloud-based training, allowing developers to handle models exceeding 1 billion parameters through mixed-precision workflows on DGX Cloud, which combines high-performance compute with accelerated data pipelines. This supports end-to-end development from data ingestion to deployment. The overall workflow progresses from prototyping in simulation environments to hardware-in-the-loop (HIL) testing using DRIVE Constellation, a scalable data-center platform that connects simulated sensor feeds to physical DRIVE AGX hardware for validation against safety standards, culminating in certification-ready deployments. Constellation enables massive parallel simulations in data centers, simulating billions of virtual miles while interfacing with real ECUs for closed-loop testing.
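The sensor-abstraction idea behind DriveWorks, in which every sensor type hides its transport details behind a common read interface with timestamps on a shared clock, can be sketched in a few lines. All class and field names below are invented for illustration; this is a conceptual model, not the DriveWorks C++ API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Conceptual sketch of a sensor-abstraction layer: downstream fusion code
# sees one interface regardless of sensor hardware. Names are hypothetical.

@dataclass
class SensorFrame:
    timestamp_us: int      # capture time on a shared clock, microseconds
    sensor_id: str
    payload: bytes         # raw image / point cloud / radar return

class Sensor(ABC):
    @abstractmethod
    def read(self) -> SensorFrame: ...

class CameraSensor(Sensor):
    def __init__(self, sensor_id: str): self.sensor_id = sensor_id
    def read(self) -> SensorFrame:
        # A real driver would pull a frame over GMSL; stubbed here.
        return SensorFrame(1_000_000, self.sensor_id, b"jpeg-bytes")

class LidarSensor(Sensor):
    def __init__(self, sensor_id: str): self.sensor_id = sensor_id
    def read(self) -> SensorFrame:
        # A real driver would pull a sweep over Ethernet; stubbed here.
        return SensorFrame(1_000_050, self.sensor_id, b"point-cloud")

# Fusion code iterates over heterogeneous sensors through one interface.
frames = [s.read() for s in (CameraSensor("cam0"), LidarSensor("lidar0"))]
```

Because every frame carries a timestamp on the same clock, fusion and replay tooling can align camera, lidar, and radar data without sensor-specific code paths.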

Reference Platforms and Kits

DRIVE AGX Developer Kits

The NVIDIA DRIVE AGX Developer Kits provide scalable hardware platforms for developers to prototype and validate autonomous vehicle algorithms on production-grade system-on-chips (SoCs). The lineup spans three generations: the DRIVE AGX Xavier Developer Kit, released in 2018 with up to 30 INT8 TOPS of AI performance at 30 W; the DRIVE AGX Orin Developer Kit, launched in 2022 delivering up to 254 INT8 TOPS at up to 200 W; and the DRIVE AGX Thor Developer Kit, introduced in 2025 offering up to 1,000 INT8 TOPS (or 2,000 FP4 TFLOPS) at 350 W system power. These kits consist of reference carrier boards equipped with automotive-focused I/O interfaces, including GMSL2/3 camera inputs for surround perception, multi-gigabit Ethernet (up to 76 Gb/s on Thor) for high-bandwidth sensor and vehicle-to-everything (V2X) communication, and multiple CAN and other automotive bus interfaces for integration with vehicle networks. They incorporate thermal solutions compatible with automotive operating temperatures ranging from -40°C to 85°C, enabling reliable testing in real-world conditions. Primarily used for algorithm tuning, sensor integration testing, and early-stage software validation, the kits are available to members of the NVIDIA Developer Program through authorized distributors. The DRIVE AGX Thor kit notably integrates elements of the NVIDIA Grace CPU architecture, enhancing support for generative AI workloads in 2025 development cycles. These platforms are compatible with the DRIVE OS software stack for seamless progression to production.
Kit    | AI Performance (TOPS)   | Power Consumption (W) | Approximate Price | Supported DRIVE OS Versions
Xavier | 30 INT8                 | 30                    | $1,000+           | 5.x
Orin   | 254 INT8                | 200                   | $2,000+           | 6.x
Thor   | 1,000 INT8 / 2,000 FP4  | 350                   | $5,000+ (est.)    | 7.x

DRIVE Hyperion Platform

The NVIDIA DRIVE Hyperion platform serves as a complete end-to-end reference architecture for autonomous vehicle development, enabling developers to build and test systems from Level 2+ advanced driver assistance to Level 4 full autonomy. Introduced in 2021, it integrates compute hardware, software, and sensors into a unified package that accelerates prototyping and deployment. Key versions include Hyperion 8, released in November 2021 and powered by the Orin system-on-chip (SoC); Hyperion 7.1, an update with enhancements for Level 2+ applications around 2022-2023; and the latest Hyperion 10, unveiled in October 2025 and utilizing the Thor SoC for production-scale Level 4 capabilities. The platform's sensor suite provides comprehensive environmental perception, featuring 14 high-definition cameras for 360-degree visual coverage, 3 lidars for precise 3D mapping, 9 radars for all-weather object detection, and inertial measurement units (IMUs) for motion tracking and stabilization. This multi-modal configuration is modular, allowing developers to scale sensor configurations based on autonomy level requirements, from partial automation in L2+ systems to fully driverless operation in L4 environments, while ensuring redundancy for safety-critical functions. Integration within Hyperion combines the DRIVE AGX SoC family with the DRIVE OS automotive operating system, delivering a unified foundation for sensor data processing and AI inference. The platform has achieved ASIL-D certification, the highest level under ISO 26262, through rigorous verification of hardware and software components. Cybersecurity is enhanced by hardware-based isolation mechanisms, including a hypervisor that enables secure partitioning of compute domains to prevent interference between safety-critical and non-critical processes. In 2025, Hyperion reached significant milestones, including approvals from TÜV SÜD in January for automotive safety and cybersecurity compliance, validating its readiness for commercial applications.
Additionally, in October 2025, NVIDIA partnered with Uber to deploy up to 100,000 Hyperion 10-based robotaxis starting in 2027, marking a major step toward large-scale Level 4 mobility networks. The platform's scalability allows compute upgrades across SoC generations without requiring a full redesign, supporting robust 360-degree perception for diverse use cases in passenger and commercial vehicles.
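The Hyperion sensor suite described above can be modeled as a simple configuration check: redundant 360-degree perception requires vision plus at least two independent ranging modalities. The data model and rule below are invented for illustration and are not an NVIDIA configuration format.

```python
from dataclasses import dataclass

# Toy data model of the Hyperion 10 sensor suite described above.
# The redundancy rule is a simplified illustrative assumption, not an
# NVIDIA requirement.

@dataclass
class SensorSuite:
    cameras: int
    lidars: int
    radars: int
    imus: int

    def redundant_coverage(self) -> bool:
        # Require vision plus at least two independent ranging modalities.
        ranging_modalities = (self.lidars > 0) + (self.radars > 0)
        return self.cameras > 0 and ranging_modalities >= 2

hyperion10 = SensorSuite(cameras=14, lidars=3, radars=9, imus=1)
```

A camera-less or single-modality configuration fails the check, mirroring the article's point that redundancy across modalities underpins safety-critical operation.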

Ecosystem and Deployments

Partnerships with OEMs

NVIDIA DRIVE has formed extensive partnerships with original equipment manufacturers (OEMs) to integrate its AI computing platforms into production vehicles for advanced driver-assistance systems (ADAS) and autonomous driving capabilities. Key collaborations include Mercedes-Benz, which adopted the DRIVE Orin system-on-a-chip for its Drive Pilot Level 3 autonomous driving feature, enabling hands-free operation on highways in models like the EQS and S-Class starting in late 2023. Volvo selected DRIVE Orin for the core computing in its EX90 electric SUV, with plans to incorporate the more powerful DRIVE Thor platform in future models around 2026 to support enhanced AI-driven safety and infotainment features. Hyundai partnered with NVIDIA to deploy DRIVE Thor across its vehicle lineup, including autonomous vehicle prototypes based on the IONIQ 5 for robotaxi applications in collaboration with mobility partners. Tier-1 suppliers have also deepened integrations with DRIVE to support OEM deployments. Bosch announced in 2025 a strategic collaboration to embed DRIVE AGX Thor into its next-generation vehicle platforms, combining NVIDIA's AI compute with Bosch's safety systems for scalable software-defined vehicles. Continental expanded its long-standing partnership with NVIDIA, committing to mass-produce DRIVE Thor-based systems for autonomous trucking by 2027 in alliance with Aurora Innovation. ZF Friedrichshafen has utilized NVIDIA DRIVE since 2017 for its ProAI supercomputer, focusing on sensor fusion from cameras, lidar, and radar to enable AI-based perception in cars, trucks, and commercial vehicles. In the robotaxi sector, DRIVE powers deployments through key alliances. Uber partnered with NVIDIA in 2025 to scale a global ride-hailing network, deploying up to 100,000 DRIVE AGX Hyperion-ready vehicles starting in 2027, with initial L4 autonomous fleets from OEMs like Lucid.
Zoox, Amazon's autonomous mobility subsidiary, has leveraged NVIDIA compute since 2017 for its bidirectional robotaxi fleet, using NVIDIA-powered platforms for purpose-built, fully driverless ride-hailing services tested in U.S. cities such as Las Vegas and San Francisco.

By 2025, the NVIDIA DRIVE ecosystem encompassed over 370 partners worldwide, including more than 80 automakers and suppliers, with significant adoption among Chinese firms. BYD, the world's largest EV producer, expanded its partnership to incorporate DRIVE Orin across its Dynasty and Ocean series starting in 2023, transitioning to DRIVE Thor for next-generation software-defined EVs to enable automated driving and intelligent in-cabin experiences. Other featured OEM partners, including Toyota and JLR, are integrating DRIVE platforms for highly automated vehicle fleets. These partnerships often involve co-development of custom system-on-chips tailored to OEM requirements, alongside revenue streams from intellectual-property licensing for DRIVE software and AI algorithms, enabling scalable deployment without full hardware redesigns.
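The sensor-fusion approach used by systems like ZF's ProAI combines object hypotheses from cameras, lidar, and radar into a single world model. As a rough illustration of the late-fusion idea only (the names, weighting scheme, and independence assumption here are hypothetical, not NVIDIA or ZF APIs), two sensors' detections of the same object can be merged by confidence weighting:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object hypothesis from a single sensor, in a shared vehicle frame."""
    x: float           # longitudinal position (m)
    y: float           # lateral position (m)
    confidence: float  # sensor-reported score in [0, 1]

def fuse(detections: list[Detection]) -> Detection:
    """Confidence-weighted merge of detections assumed to be the same object."""
    total = sum(d.confidence for d in detections)
    x = sum(d.x * d.confidence for d in detections) / total
    y = sum(d.y * d.confidence for d in detections) / total
    # Combined confidence: probability that at least one sensor is correct,
    # under the (simplifying) assumption of independent sensors.
    miss = 1.0
    for d in detections:
        miss *= 1.0 - d.confidence
    return Detection(x, y, 1.0 - miss)

# A camera and a radar report the same pedestrian at slightly different positions.
camera = Detection(x=12.1, y=0.4, confidence=0.9)
radar = Detection(x=12.4, y=0.5, confidence=0.6)
fused = fuse([camera, radar])
```

Production stacks replace this averaging with tracked filters (e.g., Kalman-family estimators) and learned association, but the benefit is the same: the fused estimate is more confident than either sensor alone.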

Safety Certifications and Commercial Use

NVIDIA DRIVE platforms, including the AGX series starting with Orin, have been certified to ASIL-D standards for functional safety in automotive applications, ensuring high integrity for safety-critical systems. This certification covers core development processes for hardware and software, enabling reliable performance in autonomous driving scenarios. In January 2025, the DRIVE Hyperion platform achieved additional milestones with certifications from TÜV SÜD for ISO/SAE 21434 cybersecurity processes and from TÜV Rheinland confirming compliance with UNECE safety requirements for complex electronic systems. These validations address cybersecurity management under UNECE Regulation No. 155 and support secure engineering practices across SoCs, platforms, and software. The platform's redundant processing and fault-tolerant design allow vehicles to detect faults and reconfigure for continued safe operation, verified through billions of simulated miles in virtual environments.

Commercially, DRIVE powers advanced driver assistance in production vehicles such as the Polestar 3, which entered production in 2024 with NVIDIA DRIVE as its central computing platform for automated driving and over-the-air (OTA) updates. Partnerships with OEMs and mobility providers like Mercedes-Benz and Uber have enabled Level 3 approvals in models like the S-Class in 2023 and scaling toward over 100,000 autonomous vehicles by 2027. These deployments address rare edge cases via simulation-based testing and use OTA updates to mitigate recalls and enhance system reliability without physical interventions.
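The fail-operational behavior described above (detect a fault, then reconfigure to keep driving safely) can be sketched as a primary/secondary arrangement in which a monitor routes work to whichever compute path still has a recent heartbeat. This is a minimal illustrative pattern with made-up names and thresholds, not NVIDIA's actual safety architecture:

```python
import time

class ComputePath:
    """Stand-in for one redundant compute unit (e.g., one of two SoCs)."""
    def __init__(self, name: str):
        self.name = name
        self.healthy = True
        self.last_heartbeat = time.monotonic()

    def heartbeat(self) -> None:
        """Record that this path is alive; called periodically by the path itself."""
        self.last_heartbeat = time.monotonic()

class FailoverMonitor:
    """Selects the first path that is healthy and recently heartbeated."""
    def __init__(self, paths: list[ComputePath], timeout_s: float = 0.1):
        self.paths = paths
        self.timeout_s = timeout_s

    def active_path(self) -> ComputePath:
        now = time.monotonic()
        for p in self.paths:
            if p.healthy and (now - p.last_heartbeat) < self.timeout_s:
                return p
        # No redundant path left: a real system would trigger a
        # minimal-risk maneuver (e.g., controlled stop) here.
        raise RuntimeError("no healthy compute path available")

primary = ComputePath("primary")
secondary = ComputePath("secondary")
monitor = FailoverMonitor([primary, secondary])

assert monitor.active_path() is primary
primary.healthy = False   # simulated fault on the primary path
secondary.heartbeat()
active = monitor.active_path()  # traffic now flows to the secondary
```

Real implementations add lock-step cores, diverse hardware, and watchdogs enforced in silicon, but the routing logic follows this shape: degrade gracefully through redundant paths before falling back to a minimal-risk maneuver.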
