
Nvidia Jetson

NVIDIA Jetson is a family of compact, power-efficient computing modules and developer kits developed by NVIDIA for accelerating artificial intelligence (AI) workloads at the edge, particularly in robotics, embedded systems, and autonomous machines. These platforms integrate NVIDIA's GPU architecture with an ARM-based CPU and other components to deliver high compute performance in small form factors, enabling developers to build and deploy AI applications without relying on cloud infrastructure. The primary purpose of Jetson is to provide accelerated AI performance for edge devices across various industries, including manufacturing, robotics, automotive, healthcare, and smart cities. Key features include support for NVIDIA's CUDA-X software libraries, integration with cloud-native technologies, and configurable power management that allows operation in power-constrained environments, with envelopes ranging from 7 W to 130 W depending on the module. Jetson platforms are enabled by the NVIDIA JetPack SDK, which offers comprehensive tools for AI model development, deployment, and optimization, streamlining the process from prototyping to production. The Jetson product lineup spans multiple generations, offering scalability for different performance needs:
  • Jetson AGX Thor: Delivers up to 2070 FP4 TFLOPS with 128 GB memory and configurable power from 40–130 W, targeted at next-generation robotics.
  • Jetson AGX Orin: Provides up to 275 TOPS of AI performance, representing 8x the capability of its predecessor generation.
  • Jetson Orin NX: Offers up to 157 TOPS in the smallest form factor within the Orin series.
  • Jetson Orin Nano: Achieves up to 67 TOPS at 7–25 W, suitable for cost-effective edge deployments.
  • Jetson AGX Xavier: Features configurable 10–40 W power with 20x the performance of the Jetson TX2.
  • Jetson Xavier NX: Delivers up to 21 TOPS for compact AI applications.
  • Jetson TX2: Provides up to 2.5x the performance of the Jetson Nano for embedded systems.
  • Jetson Nano: A low-power module for entry-level embedded AI and IoT projects.
Supporting this hardware ecosystem, Jetson includes extensive software resources such as the JetPack SDK, which encompasses Linux-based operating systems, inference frameworks like TensorRT, and simulation tools for development. Applications powered by Jetson range from real-time perception in autonomous machines to generative AI on edge devices, fostering innovation in areas like smart cities and industrial automation. The platform's developer kits and community support further accelerate adoption by providing accessible entry points for prototyping AI solutions.

Introduction and History

Overview

The NVIDIA Jetson platform comprises a family of compact, low-power System-on-Chip (SoC) modules and developer kits engineered to accelerate AI inference and training in edge environments. These components form the foundation for edge AI hardware, targeting applications that require on-device processing without reliance on centralized cloud resources. At its core, Jetson enables the integration of advanced AI models into resource-constrained settings, such as robots, drones, and Internet of Things (IoT) devices, through built-in GPU acceleration powered by NVIDIA's CUDA platform. This design facilitates real-time execution in scenarios demanding low latency and power consumption, bridging the gap between data-center computing and portable, autonomous systems. Jetson supports a range of AI tasks at the edge, including computer vision for object detection and image analysis, speech recognition for voice interactions, and generative AI for creating content, all while maintaining power consumption ranging from 5 W to 130 W across modules, depending on configuration and workload. The platform's hardware relies on Tegra SoCs for seamless CPU-GPU integration, complemented by the JetPack software stack that provides optimized libraries and tools for development. Initially appealing to hobbyists and developers for prototyping, Jetson has matured into industrial-grade solutions capable of scaling to production-level deployments in demanding operational contexts.

Development Timeline

NVIDIA launched the Jetson platform in 2014 with the Jetson TK1 development kit, representing the company's initial effort to provide developers with an accessible embedded computing solution for AI applications. Announced at the GPU Technology Conference (GTC) on March 25, 2014, the TK1 was positioned as a "supercomputer on a module" to enable computer vision and early deep learning tasks in mobile and embedded systems. This inception aligned with the burgeoning deep learning revolution, allowing NVIDIA to extend GPU-accelerated computing beyond data centers to edge devices for prototyping innovative AI solutions. Between 2015 and 2017, NVIDIA expanded the platform with the Jetson TX1 in November 2015 and the Jetson TX2 in March 2017, focusing on advancements for mobile and automotive prototypes. The TX1 introduced higher performance for real-time computer vision, while the TX2 further optimized power efficiency and compute capabilities, broadening adoption in embedded development. These releases reflected NVIDIA's strategic push to address the rising demand for edge processing in response to the deep learning boom, fostering partnerships such as with Skydio to power autonomous drones. From 2018 to 2020, the introduction of the Volta-based Jetson Xavier series marked a shift toward production-ready modules suitable for industrial applications, while the Maxwell-based Jetson Nano provided an entry-level option for hobbyists and developers. Announced in June 2018 at GTC and made available that September, the AGX Xavier and the subsequent Xavier NX in 2019 emphasized scalable AI deployment in robotics and industrial environments. The Jetson Nano, announced in March 2019, offered a low-cost developer kit to democratize AI prototyping. The 2022-2023 period saw the rollout of the Ampere-based Jetson Orin series, prioritizing scalable performance for AI inference in autonomous systems. Announced in November 2021 and available starting March 2022, the AGX Orin and subsequent Orin NX and Orin Nano variants built on prior generations to support more complex workloads at the edge.
In August 2025, NVIDIA announced and released the Jetson AGX Thor, transitioning to the Blackwell architecture to advance physical AI and robotics capabilities. Unveiled on August 25, 2025, this platform targets next-generation humanoid and general-purpose robots, continuing Jetson's evolution toward ubiquitous edge AI.

Hardware

Modules and Developer Kits

Jetson products are available in compute module form factors designed for integration into custom systems, as well as developer kits that serve as reference designs for prototyping and evaluation. Compute modules, such as the Jetson Orin NX, AGX Orin series, and AGX Thor, enable developers to embed high-performance AI capabilities into tailored solutions. These modules feature standardized pinouts that support general-purpose input/output (GPIO) via 40-pin expansion headers, high-speed camera interfaces through MIPI CSI-2 lanes (up to 16 lanes on AGX Orin), and Ethernet connectivity options ranging from 1 GbE to 25 GbE. Developer kits provide complete, ready-to-use platforms with a Jetson module mounted on a reference carrier board, facilitating rapid prototyping and testing. The Jetson Nano Developer Kit, released in 2019, offers a compact design targeted at hobbyists, students, and entry-level makers, including essential peripherals for AI experimentation. In contrast, the Jetson AGX Orin Developer Kit, introduced in 2022, targets industrial and professional applications with an expansive array of ports for connectivity and expansion. The Jetson AGX Thor Developer Kit, released in August 2025, is designed for next-generation humanoid robotics and physical AI, featuring the Blackwell GPU architecture. A notable update in December 2024 enhanced the Jetson Orin Nano Developer Kit to the "Super" variant, reducing its price to $249 while delivering the improved performance through a software upgrade for broader accessibility. Jetson modules adopt compact form factors to suit diverse deployment needs, with the Jetson Nano measuring approximately 70 mm x 45 mm for space-constrained applications. The Orin NX and Orin Nano modules follow a similar small footprint at 69.6 mm x 45 mm, while the larger AGX Orin and AGX Thor modules span 100 mm x 87 mm to accommodate higher power and I/O demands.
Power delivery typically occurs via a DC jack supporting input voltages from 5 V to 20 V, with options for Power over Ethernet (PoE) through compatible carrier boards or add-on hats for simplified cabling in networked setups. Cooling solutions vary by configuration, including passive heatsinks for low-power modes and active fan-based systems for sustained high-performance operation. Production modules are optimized for volume deployment in end-user devices, featuring ruggedized designs for industrial environments, whereas developer kits emphasize ease of use with pre-integrated components like reference carrier boards. Key integration features across both include M.2 slots for NVMe storage (e.g., Key M with PCIe Gen3/4/5 support), multiple USB 3.2 ports for high-speed peripherals, and display outputs via HDMI or DisplayPort interfaces (up to 8K on AGX kits). These elements allow seamless embedding into robots, IoT devices, and edge systems without requiring extensive custom hardware redesign.

Key Specifications

NVIDIA Jetson platforms feature unified memory architectures that enable seamless sharing of system memory between the CPU and GPU, facilitating efficient data access without explicit transfers. Memory configurations typically utilize LPDDR4 or LPDDR5 variants, with capacities ranging from 4 GB in entry-level modules to 128 GB in high-end variants, supporting bandwidths up to 273 GB/s in advanced LPDDR5X implementations. This shared design optimizes resource utilization for AI and computer vision workloads. The CPU subsystems in Jetson modules are based on ARM architectures, incorporating multi-core setups such as quad-core Cortex-A57 processors in earlier generations, up to 12-core Cortex-A78AE configurations in the Orin series, and a 14-core Neoverse V3AE in the latest Thor generation. Earlier generations are complemented by dedicated accelerators, including up to two NVIDIA Deep Learning Accelerator (NVDLA) engines, which offload inference tasks from the GPU and CPU; the latest Thor generation relies on its integrated Blackwell GPU for such tasks. Connectivity options across Jetson platforms include standard interfaces like PCIe (up to x16 Gen5 lanes), USB 3.2 ports (with support for multiple high-speed devices), Ethernet for wired networking (up to 25 GbE), and M.2 slots for wireless modules enabling Wi-Fi and Bluetooth integration. These interfaces provide flexible expansion for sensors, storage, and peripherals in embedded applications. Power and thermal management in Jetson modules support configurable TDP ranges from 5 W in low-power variants to 130 W in the AGX Thor, with multiple efficiency modes (such as MAXN and MAXQ) allowing dynamic adjustment based on workload demands. Thermal throttling mechanisms automatically reduce clock speeds and power draw to prevent overheating, ensuring reliable operation in compact, fanless designs.
Select modules include onboard eMMC storage with capacities from 16 GB to 64 GB for bootable operating systems and applications, while the latest AGX Thor instead relies on external NVMe SSDs attached via its PCIe/M.2 slots, expanding storage to several terabytes for data-intensive tasks.
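
The interaction between configurable power budgets and thermal throttling described above can be illustrated with a small simulation. This is a hedged sketch, not a real Jetson API: the linear watts-to-clock model, the constants, and the 10%-per-degree derating are hypothetical stand-ins for the behavior that tools like nvpmodel and the onboard thermal governor provide.

```python
# Illustrative sketch (not a real Jetson API): a governor that caps GPU
# clocks to fit a configurable TDP budget, then derates further when the
# module runs hot, mimicking nvpmodel power modes plus thermal throttling.

def pick_clock_mhz(power_budget_w, temp_c, max_clock_mhz=1300,
                   watts_per_100mhz=2.0, throttle_temp_c=95):
    """Return a clock cap: scale with the power budget, then derate on heat."""
    # Clock allowed by the power budget (hypothetical linear power model).
    budget_clock = min(max_clock_mhz, power_budget_w / watts_per_100mhz * 100)
    # Thermal throttling: lose 10% of the cap per degree over the trip point.
    if temp_c > throttle_temp_c:
        budget_clock *= max(0.0, 1.0 - 0.10 * (temp_c - throttle_temp_c))
    return round(budget_clock)

# A 10 W low-power mode caps clocks well below a 60 W performance mode,
# and an overheating module is throttled even inside its power budget.
print(pick_clock_mhz(10, 50), pick_clock_mhz(60, 50), pick_clock_mhz(60, 98))
```

The key design point mirrored here is that power mode and thermal state are independent limits: the effective clock is the minimum the two allow.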

Compute Architectures and Performance

Maxwell and Pascal Generations

The Jetson TX1, introduced in 2015, marked the transition to the Maxwell GPU architecture with 256 CUDA cores (compute capability 5.3), 4 GB of LPDDR4 memory, and up to 1 TFLOPS of FP16 performance, emphasizing mobile AI inference and video processing capabilities. Operating within a 10-15 W TDP envelope, typically 8-10 W under load, it supported advanced features like hardware-accelerated video encoding and exceeded contemporary CPUs in performance per watt for image classification, achieving 258 images per second with Caffe compared to 242 on an Intel Core i7-6700K. Early benchmarks highlighted its suitability for embedded deep learning, with modified GoogLeNet models enabling real-time inference on half-HD (960×540) video streams after optimization with TensorRT, doubling performance over unoptimized frameworks. The Jetson Nano, released in 2019, is an entry-level Maxwell module with 128 CUDA cores, 4 GB of LPDDR4 memory, and 472 GFLOPS of FP16 performance, targeted at education and hobbyist projects due to its accessible pricing. It operates at a 5-10 W TDP and supports basic computer vision and inference tasks. Building on the TX1, the Jetson TX2 arrived in 2017 with a Pascal-family GPU retaining 256 CUDA cores but paired with an upgraded heterogeneous CPU complex of dual Denver 2 64-bit cores and quad Cortex-A57 cores, 8 GB of LPDDR4 memory (59.7 GB/s bandwidth), and 1.3 TFLOPS of FP16 performance (peaking at 1.6 TFLOPS in high-load scenarios). It supported dual 4K video decode streams and operated in configurable power modes, including MAXN (unlimited, up to 15 W+ for maximum throughput), MAXP (15 W, balanced), and MAXQ (10 W, efficiency-focused), allowing adaptation to battery-constrained environments. Representative deep learning benchmarks, such as object detection tasks, achieved around 48 FPS in FP32 precision on low-resolution inputs, underscoring its role in deploying basic inference for robotics and embedded vision.
However, both the Maxwell and Pascal generations lacked dedicated Tensor Cores, relying on packed FP16 execution on their FP32 CUDA cores, which limited efficiency for lower-precision workloads and precluded support for modern generative models requiring mixed-precision acceleration.
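
The perf-per-watt advantage cited for the TX1 can be made concrete with a short worked calculation. The throughput figures (258 vs. 242 images/s with Caffe) come from the text above; the power draws are assumptions for illustration only, using the ~10 W load figure quoted for the TX1 and the 91 W rated TDP of the i7-6700K.

```python
# Worked perf-per-watt comparison. Throughputs are from the benchmark cited
# above; the wattages (~10 W TX1 under load, 91 W i7-6700K TDP) are
# illustrative assumptions, not measured system power.

def images_per_joule(images_per_second, watts):
    # watts = joules/second, so (images/s) / (J/s) = images per joule
    return images_per_second / watts

tx1 = images_per_joule(258, 10)   # ~25.8 images per joule
cpu = images_per_joule(242, 91)   # ~2.7 images per joule
print(f"TX1 efficiency advantage: {tx1 / cpu:.1f}x")
```

Even though raw throughput was nearly identical, normalizing by power shows roughly an order-of-magnitude efficiency gap, which is the comparison edge deployments care about.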

Volta Generation

The Volta generation of Nvidia Jetson modules marked a significant advancement in edge AI computing, introducing the Xavier series designed for scalable production in automotive, robotics, and industrial applications. Built on Nvidia's Volta microarchitecture, these modules integrated high-performance GPU cores with dedicated deep learning accelerators to deliver efficient AI inference at the edge. Released starting in 2018, the series emphasized balanced power efficiency and computational density, enabling real-time processing for complex workloads like autonomous driving and visual analytics. The flagship Jetson AGX Xavier module, launched in 2018, features 512 Volta CUDA cores paired with two Deep Learning Accelerators (DLAs) for optimized AI tasks, along with 8 GB or 16 GB of LPDDR4 memory. It achieves up to 30 TOPS of INT8 performance, making it suitable for demanding applications in automotive and robotics where low-latency inference is critical. The Xavier NX, introduced in 2020 as a more compact variant, reduces the core count to 384 Volta CUDA cores while retaining two DLAs and 8 GB of LPDDR4 memory, delivering 21 TOPS of INT8 performance within a configurable 10-20 W TDP range for space-constrained deployments. These modules introduced the first integrated DLAs in Jetson, which accelerate INT8 and FP16 operations for power-efficient deep learning, facilitating real-time edge analytics in production environments. The series is supported by the JetPack 4.x SDK for seamless development. Performance benchmarks for the Volta generation highlight its capabilities in AI inference; for instance, the Xavier module can process ResNet-50 at up to 600 frames per second, while supporting multi-stream video analytics for up to 16 simultaneous 1080p streams.

Ampere Generation

The Ampere generation of Nvidia Jetson platforms, introduced with the Orin series in 2022, represents a significant advancement in edge AI computing by leveraging the Ampere GPU architecture to deliver high-performance inference for complex models. This generation emphasizes sparsity acceleration through third-generation Tensor Cores, which exploit structured sparsity patterns in neural networks to double effective throughput for supported workloads, enabling efficient deployment of large language models and other transformer-based architectures at the edge without cloud dependency. Performance figures below include sparsity acceleration where applicable. The series balances power efficiency and compute density, targeting applications in robotics, autonomous machines, and intelligent vision systems. The lineup includes the Jetson Orin Nano, launched in 2022 and updated in 2024, featuring 1024 CUDA cores and 32 Tensor Cores (without the dedicated DLAs found in the larger Orin modules). It is equipped with 8 GB of LPDDR5 memory and delivers up to 67 TOPS (with sparsity), operating within a 7-25 W TDP range to suit low-power embedded devices. The Jetson Orin NX, released in 2023, maintains 1024 CUDA cores and 16 GB of LPDDR5 memory, achieving up to 157 TOPS (with sparsity) at a 10-25 W TDP, making it ideal for mid-range deployments requiring scalable AI pipelines. At the high end, the Jetson AGX Orin, available since 2022, doubles the compute with 2048 CUDA cores, two DLAs, and a Programmable Vision Accelerator (PVA) for efficient video analytics, paired with up to 64 GB of LPDDR5 memory to provide up to 275 TOPS (with sparsity) at up to 60 W TDP for demanding, server-class tasks. Key features of the generation include enhanced sparsity support in the Tensor Cores, which accelerates pruned models by processing only non-zero weights in a 2:4 pattern, significantly boosting efficiency for large-scale inference. This enables running generative AI models, such as LLMs, directly on edge hardware with reduced latency and power draw.
Performance benchmarks demonstrate the platform's capabilities, including over 1,000 FPS for YOLO-based object detection on the AGX Orin, alongside hardware support for multi-stream video decoding and processing, facilitating advanced computer vision in real-time applications.
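
The 2:4 structured-sparsity pattern mentioned above can be shown in a few lines. This is a minimal sketch of the pruning step only (what tools in the NVIDIA ecosystem automate at training time); the hardware side, skipping the zeroed weights, is what yields the throughput doubling.

```python
# Sketch of 2:4 structured sparsity: in every contiguous group of four
# weights, only the two largest-magnitude values are kept and the rest
# are zeroed, so sparse Tensor Cores can skip the zeros at runtime.

def prune_2_of_4(weights):
    """Zero all but the 2 largest-magnitude entries in each group of 4."""
    assert len(weights) % 4 == 0
    pruned = []
    for i in range(0, len(weights), 4):
        group = weights[i:i + 4]
        # Indices of the two largest |w| within this group of four.
        keep = sorted(range(4), key=lambda j: abs(group[j]), reverse=True)[:2]
        pruned.extend(w if j in keep else 0.0 for j, w in enumerate(group))
    return pruned

print(prune_2_of_4([0.9, -0.1, 0.05, -0.7, 0.2, 0.3, -0.25, 0.01]))
# → [0.9, 0.0, 0.0, -0.7, 0.0, 0.3, -0.25, 0.0]
```

Because exactly half the weights in every group survive, the pattern is regular enough for the hardware to index cheaply, unlike unstructured pruning.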

Blackwell Generation

The NVIDIA Jetson AGX Thor, introduced in August 2025, represents the Blackwell generation of the Jetson platform, designed specifically for advanced physical AI and robotics applications. Powered by a 2560-core Blackwell GPU featuring 96 fifth-generation Tensor Cores, it includes a 14-core Neoverse V3AE CPU and delivers up to 2070 sparse FP4 TFLOPS of AI compute performance. The module integrates 128 GB of LPDDR5X unified memory with a bandwidth of 273 GB/s and supports a configurable thermal design power (TDP) range of 40–130 W, enabling deployment in compact, energy-efficient edge systems. Key innovations in the Blackwell generation include native support for FP4 quantization, which optimizes generative AI models for low-precision inference without sacrificing accuracy, and an enhanced next-generation Transformer Engine integrated into the GPU for efficient processing of transformer-based architectures common in large language and vision models. The platform also leverages Blackwell's fifth-generation Tensor Cores, which provide advanced sparsity acceleration, and a dedicated decompression engine that handles compressed data streams more effectively, reducing latency in AI pipelines. These features build on the sparsity optimizations of the prior Ampere-based generation by extending support to even lower precisions like FP4 for higher throughput in edge environments. In terms of performance, the Jetson AGX Thor achieves up to 7.5 times the AI compute capability and 3.5 times the energy efficiency of the Jetson AGX Orin, enabling real-time processing for complex tasks such as multi-agent simulations and humanoid robot control. For instance, it supports running generative models like Llama 3.3 70B with up to 7 times higher throughput on updated software stacks, facilitating agentic behaviors in physical systems.
Targeted primarily at humanoid robots and general-purpose physical AI, the platform's increased memory capacity and bandwidth (the latter approximately 1.3 times that of the AGX Orin) allow for handling larger models and datasets on-device, accelerating development in embodied AI without reliance on cloud resources.
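
The FP4 format underpinning these figures can be sketched numerically. A 4-bit E2M1 float encodes a sign plus one of eight magnitudes {0, 0.5, 1, 1.5, 2, 3, 4, 6}; real deployments pair this grid with per-block scale factors, which this simplified sketch omits by quantizing pre-scaled values directly.

```python
# Sketch of FP4 (E2M1) quantization: each 4-bit code is a sign bit plus
# one of eight magnitudes. Production pipelines add a per-block scale;
# here we round already-scaled values straight onto the FP4 grid.

FP4_MAGNITUDES = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]

def quantize_fp4(x):
    """Round x to the nearest representable E2M1 value (saturating at ±6)."""
    mag = min(FP4_MAGNITUDES, key=lambda m: abs(abs(x) - m))
    return mag if x >= 0 else -mag

print([quantize_fp4(v) for v in [0.2, 0.8, -2.4, 5.5, 9.0]])
# → [0.0, 1.0, -2.0, 6.0, 6.0]
```

With only 16 codes per value, weights occupy a quarter of the space of FP16, which is why FP4 support translates directly into higher throughput and larger models fitting in the 128 GB unified memory.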

Software and Development Tools

JetPack SDK

The NVIDIA JetPack SDK serves as the primary development environment for the Jetson platform, bundling essential libraries and tools to enable accelerated application development at the edge. It integrates core components such as CUDA for GPU-accelerated computing, cuDNN for deep neural network primitives, TensorRT for high-performance inference optimization, and DeepStream for building scalable video analytics pipelines, allowing developers to deploy models with low latency and efficient resource utilization. JetPack versions have evolved alongside Jetson hardware generations, with the JetPack 4.x series (released 2019–2021) supporting Volta-based modules like the Xavier series through runtime libraries optimized for inference, including TensorRT 8.x for model quantization. The JetPack 5.x series (introduced in 2022) targeted Ampere-based modules, incorporating updated libraries for enhanced AI workloads. Subsequent JetPack 6.x releases (2024–2025), with the latest being 6.2.1 as of November 2025, extended support for the Orin series with further optimizations. JetPack 7.0 (launched in 2025) introduced compatibility for the Blackwell-based Jetson Thor, featuring CUDA 13.x and runtime libraries tailored for advanced generative AI tasks. Key components of the JetPack SDK include CUDA 12.x (and later 13.x in recent versions) for parallel GPU programming, enabling developers to write and optimize compute-intensive applications. TensorRT 8.x through 10.x provides inference acceleration via techniques such as INT8 quantization to reduce model precision for faster execution and layer fusion to minimize memory overhead during inference. Additionally, cuDNN accelerates deep learning operations, while the Vision Programming Interface (VPI), succeeding VisionWorks, supports efficient computer vision pipelines for image and video processing tasks. Supporting tools within JetPack encompass Nsight Systems and Nsight Compute for profiling CPU and GPU applications, the TAO Toolkit for transfer learning to adapt pre-trained models with minimal data, and Riva for end-to-end speech pipelines including automatic speech recognition and text-to-speech.
These tools facilitate development and deployment of AI solutions on Jetson hardware. In 2025, updates aligned with Blackwell architecture support in JetPack 7.0 added tools for FP4 model conversion, leveraging native FP4 precision in Jetson Thor to enable higher-throughput inference for large models while maintaining accuracy. JetPack remains compatible with select Linux distributions for seamless integration.
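
The INT8 quantization that TensorRT applies during calibration can be illustrated in miniature. This is a hedged sketch of the general symmetric-quantization idea, not the TensorRT API itself: calibration derives a scale from the observed dynamic range, maps floats onto the signed 8-bit range, and the engine dequantizes (or fuses the scale) at runtime.

```python
# Minimal sketch of symmetric per-tensor INT8 quantization, the idea
# behind TensorRT-style calibration: choose scale = max|x| / 127, map
# floats to [-127, 127] integers, and dequantize to check the error.

def int8_quantize(values):
    scale = max(abs(v) for v in values) / 127.0  # one scale per tensor
    q = [max(-127, min(127, round(v / scale))) for v in values]
    dq = [qi * scale for qi in q]                # reconstruction
    return q, dq

q, dq = int8_quantize([0.0, 0.5, -1.2, 2.0])
print(q)  # e.g. 0.5 maps to round(0.5 * 127 / 2) = 32
```

The reconstruction error per value is bounded by half a quantization step (scale/2), which is why calibration data that tightens the dynamic range directly improves INT8 accuracy.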

Supported Operating Systems

The NVIDIA Jetson platform primarily supports Jetson Linux, also known as Linux for Tegra (L4T), an Ubuntu-based operating system optimized for AI and robotics applications. L4T integrates NVIDIA's drivers for the GPU, CPU, and other components, enabling full utilization of the system's compute capabilities. As of November 2025, the latest release for Jetson Thor is Jetson Linux 38.2 (with the 38.2.2 security patch from October 2025), built on Ubuntu 24.04 LTS with Linux kernel 6.8. For the Jetson Orin series, the latest is Jetson Linux 36.4.4 (updated October 2025 for security), built on Ubuntu 22.04 LTS with kernel 5.15. Earlier generations are supported as follows: the Jetson Xavier series with Ubuntu 20.04 LTS and kernel 5.10 (L4T 35.x via JetPack 5.x), and the Jetson Nano and TX2 with Ubuntu 18.04 LTS and kernel 4.9 (L4T 32.x via JetPack 4.x). Canonical provides official Ubuntu support for Jetson platforms, including optimized images for stability and security in enterprise deployments. For real-time applications, NVIDIA offers a real-time (RT) kernel patchset based on PREEMPT_RT, installable via over-the-air (OTA) updates on devices running Jetson Linux. This kernel enhances deterministic performance for time-sensitive tasks, making it suitable for robotics and industrial control. On select automotive-oriented configurations, such as those derived from Jetson Xavier for autonomous driving, QNX Neutrino RTOS is supported through NVIDIA DRIVE platforms, providing hard real-time capabilities with safety certifications. Native Windows support is unavailable on Jetson hardware; however, Windows users can develop and test applications using Windows Subsystem for Linux (WSL) on a host machine, with deployment requiring native Linux environments. Containerization is natively supported through Docker and NVIDIA GPU Cloud (NGC) containers, allowing cloud-native AI workflows on Jetson devices with GPU acceleration. Key middleware integrations include ROS and ROS 2 via the Isaac ROS framework, optimized for GPU-accelerated robotics pipelines; GStreamer for efficient media processing and streaming; and OpenCV bindings for computer vision tasks, all pre-configured in L4T.
Recent 2025 updates in L4T 38.x for Jetson Thor emphasize enhanced security features, including secure boot enforcing a chain of trust rooted in hardware fuses, firmware TPM (fTPM) for root-of-trust measurements, and DICE-based attestation for verifiable boot integrity. These OS components are deployed through the JetPack SDK, which handles flashing and configuration.
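
Identifying which L4T release a device runs is a common first step when matching JetPack, kernel, and container versions. On Jetson the file /etc/nv_tegra_release typically begins with a line like "# R36 (release), REVISION: 4.4, ..."; the exact field layout can vary between releases, so treat the parser below as an illustrative sketch rather than a definitive format specification.

```python
# Sketch of parsing an L4T release string of the form found in
# /etc/nv_tegra_release on Jetson devices. The assumed line format
# ("# R<major> (release), REVISION: <rev>, ...") may differ per release.
import re

def parse_l4t_release(line):
    """Extract an L4T version like '36.4.4' from an nv_tegra_release line."""
    m = re.search(r"# R(\d+) \(release\), REVISION: ([\d.]+)", line)
    if not m:
        return None  # unrecognized format
    return f"{m.group(1)}.{m.group(2)}"

sample = "# R36 (release), REVISION: 4.4, GCID: 12345, BOARD: generic"
print(parse_l4t_release(sample))  # → 36.4.4
```

On a real device the sample string would be replaced by the first line of /etc/nv_tegra_release.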

Applications and Ecosystem

Primary Use Cases

Nvidia Jetson platforms enable a range of primary use cases across industries by providing edge computing for real-time processing of sensor data and AI inference tasks. In robotics, Jetson Orin modules power autonomous navigation and manipulation systems, such as those in warehouse robots that employ simultaneous localization and mapping (SLAM) for mapping environments and object grasping for handling items. For instance, the Jetson AGX Orin integrates with Isaac ROS Visual SLAM and Nvblox to facilitate precise localization and 3D mapping in dynamic settings like warehouses, allowing robots to navigate cluttered spaces efficiently. Companies like Slamcore leverage Jetson Orin for real-time object awareness, enhancing safety and efficiency in robotic operations within industrial warehouses by differentiating objects and sharing spatial data. In drones and unmanned aerial vehicles (UAVs), Jetson modules support real-time perception for obstacle avoidance and flight autonomy. The Skydio 2 utilizes the Jetson TX2 to process inputs from six cameras, enabling 360° obstacle avoidance in complex environments without GPS reliance. This setup allows the drone to detect and maneuver around obstacles at high speeds, supporting applications like aerial photography and inspection. For industrial automation, Jetson platforms drive quality inspection and predictive maintenance through edge AI on factory sensors and cameras. The Jetson Orin Nano is deployed in machine vision cameras for defect detection, analyzing images in real time to identify anomalies like surface imperfections or assembly errors, thereby reducing waste and improving manufacturing yield. NVIDIA's TAO Toolkit further accelerates the development of custom vision models for such tasks, enabling efficient deployment on the Orin Nano for scalable industrial vision systems. In healthcare, Jetson systems facilitate portable diagnostics by running on-device AI for medical imaging analysis, including generative models for tasks like image enhancement or anomaly generation.
The Jetson AGX Thor platform, with its support for large transformer models, enables real-time processing of multimodal data in medical devices, such as portable ultrasound or X-ray systems, allowing for immediate diagnostic insights in remote or field settings without cloud dependency. This capability is particularly valuable for generative AI applications that simulate or augment imaging data to aid clinicians in low-resource environments. Automotive applications benefit from Jetson modules in advanced driver-assistance system (ADAS) prototypes, where they handle sensor fusion for enhanced perception and decision-making. The Jetson AGX Xavier performs multi-sensor fusion from cameras, radar, and lidar to support Level 2 autonomy features like adaptive cruise control and lane-keeping, processing fused data to detect vehicles, pedestrians, and road conditions in real time. This integration provides the computational power needed for safe, partial automation in development vehicles. For edge AI inference in smart cities, Jetson Nano modules are commonly used for traffic monitoring, processing multiple video streams from urban cameras to analyze vehicle flow, detect congestion, and enforce regulations. Deployed at intersections, a single Jetson Nano can handle feeds from several cameras, typically 4 to 10 streams at 1080p resolution using optimized pipelines like DeepStream, enabling applications such as incident detection and optimized signal timing to improve urban mobility. The Jetson AGX Thor, released in 2025, extends these capabilities to humanoid and physical AI robotics, delivering real-time inference for complex tasks like manipulation and multimodal sensor integration in embodied systems.
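
The multi-stream capacity figures above come down to pixel-rate budgeting. The following back-of-the-envelope sketch assumes, purely for illustration, an aggregate hardware decode budget equivalent to one 4K stream at 60 FPS; real capacity also depends on codec, inference load, and memory bandwidth.

```python
# Back-of-the-envelope stream budgeting: how many camera feeds fit into a
# fixed decoder budget, measured in pixels/second. The 4K60 aggregate
# budget is an illustrative assumption, not a quoted Jetson Nano spec.

def max_streams(budget_px_per_s, width, height, fps):
    """Whole streams of (width x height @ fps) that fit in the budget."""
    return budget_px_per_s // (width * height * fps)

budget_4k60 = 3840 * 2160 * 60          # ≈ 498M pixels/s aggregate decode
print(max_streams(budget_4k60, 1920, 1080, 30))  # → 8 streams of 1080p30
```

The same arithmetic explains the trade-off deployments make in practice: halving per-stream frame rate or resolution roughly doubles the number of cameras a single module can serve.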

Community and Partners

The Jetson platform fosters a vibrant developer community through official forums, project showcases, and collaborative resources on the NVIDIA Developer website. The Jetson Projects forum allows users to share detailed write-ups, code repositories, and links to external documentation, enabling peer feedback and inspiration for new applications. The community has contributed numerous projects built with Jetson developer kits, spanning diverse domains such as robotics, computer vision, healthcare, and education. Notable community projects include the SSL-Detector, a real-time system for detecting soccer balls and robots in RoboCup Small Size League matches, recognized as Project of the Month in July 2022; Bird@Edge, which uses Jetson for bird species identification in edge environments, highlighted in May 2022; and Neurorack, a deep learning-based audio synthesizer module from December 2021. These examples demonstrate the community's focus on practical, innovative applications, often integrating Jetson's capabilities with open-source tools like TensorRT and DeepStream. Participation is further supported by resources such as tutorials, code samples, and videos, encouraging contributions from hobbyists to professional developers. The Jetson AI Lab represents a specialized community hub dedicated to generative AI on Jetson platforms, offering tutorials on running large language models (LLMs) for tasks like chatbots and text generation directly on edge devices. It includes interactive channels for discussions and collaboration among developers exploring inference at the edge. NVIDIA's ecosystem for Jetson is bolstered by the NVIDIA Partner Network (NPN), which connects solution providers to accelerate development and deployment of Jetson-powered systems. Partners in this network offer hardware, software, and services tailored to Jetson, helping customers optimize AI pipelines for industries like manufacturing, healthcare, and smart cities. The program provides benefits such as access to NVIDIA's training, marketing support, and technical resources, ensuring compatibility and performance with Jetson modules.
Key partner categories include hardware providers, which develop carrier boards and full systems; camera and sensor specialists, offering imaging modules and development kits; and software firms, delivering SDKs, frameworks, and engineering services. Examples of hardware partners are AAEON and Advantech, which produce rugged, compact systems for industrial applications. In cameras and sensors, Allied Vision and Arducam provide high-resolution modules optimized for Jetson's CSI interfaces, supporting real-time video analytics. Software partners like alwaysAI, Acontis, and DeepEdge contribute edge AI tools, including model deployment frameworks and integrations. Recent additions, such as providers of connectivity solutions for autonomous machines, underscore the ecosystem's expansion into industrial IoT. Other notable partners include Seeed Studio for reimagined developer kits, Basler for elite-level camera integration, and Antmicro for custom Tegra-based engineering services.

References

  1. [1]
    Jetson Modules, Support, Ecosystem, and Lineup | NVIDIA Developer
    Jetson Nano is a small, powerful computer for embedded AI systems and IoT that delivers the power of modern AI in a low-power platform. Get started fast with ...Buy the Latest Jetson Products · Jetson Nano · Jetson Benchmarks
  2. [2]
    Jetson Software for Real-time AI and Robotics - NVIDIA Developer
    NVIDIA Jetson™ is the leading platform for real-time AI and robotics, delivering unmatched intelligence for all your edge applications. The NVIDIA Jetson ...
  3. [3]
    Jetson - Embedded AI Computing Platform | NVIDIA Developer
    NVIDIA Jetson is a platform with developer kits for creating AI products, AI learning, and more across all industries.
  4. [4]
    Embedded Systems Developer Kits & Modules from NVIDIA Jetson
    NVIDIA Jetson is the leading platform for edge AI and robotics, offering powerful, compact computers and the NVIDIA JetPack SDK for accelerated development.
  5. [5]
    Jetson Orin Nano Super Developer Kit - NVIDIA
    The Jetson Orin Nano Super Developer Kit is the most powerful and affordable embedded computer for generative AI. Now priced at $249 USD, it can be ...Transform Generative Ai... · Unlock New Physical Ai... · Advancing Generative Ai At...<|control11|><|separator|>
  6. [6]
    NVIDIA Jetson TX2 Enables AI at the Edge
    Mar 7, 2017 · It will be available in other regions in the coming weeks. The Jetson TX2 module will be available in Q2 for $399 (in quantities of 1,000 or ...
  7. [7]
    Inception Spotlight: New Skydio 2 Drone Powered by NVIDIA Jetson
    Oct 1, 2019 · Skydio 2 is capable of flying for up to 23 minutes at a time and can be piloted by either an experienced pilot or by the AI-based system.Missing: strategic context edge
  8. [8]
  9. [9]
  10. [10]
    NVIDIA Sets Path for Future of Edge AI and Autonomous Machines ...
    Nov 9, 2021 · The new Jetson platform bringing the performance and versatility of the Ampere architecture to enable even further advancements in autonomous mobile robots.
  11. [11]
    NVIDIA Announces Availability of Jetson AGX Orin Developer Kit to ...
    Mar 22, 2022 · NVIDIA Announces Availability of Jetson AGX Orin Developer Kit to Advance Robotics and Edge AI. One Million Developers Now Deploying on Jetson; ...
  12. [12]
    NVIDIA Blackwell-Powered Jetson Thor Now Available, Accelerating ...
    NVIDIA Blackwell-Powered Jetson Thor Now Available, Accelerating the Age of General Robotics. August 25, 2025. NVIDIA Blackwell-Powered Jetson ...
  13. [13]
    Jetson AGX Orin for Next-Gen Robotics - NVIDIA
    Priced at $249, it provides an affordable and accessible platform for developers, students, and makers, backed by NVIDIA AI software and a broad AI ecosystem.
  14. [14]
    NVIDIA Announces Jetson Nano: $99 Tiny, Yet Mighty NVIDIA ...
    Mar 18, 2019 · Jetson Nano joins the Jetson™ family lineup, which also includes the powerful Jetson AGX Xavier™ for fully autonomous machines and Jetson TX2 ...
  15. [15]
    NVIDIA Jetson Orin Nano Developer Kit Gets a “Super” Boost
    Dec 17, 2024 · Jetson Orin Nano Developer Kit can be upgraded to Jetson Orin Nano Super Developer Kit with just a software update. ... New reduced price of $249, ...
  16. [16]
    Jetson Nano - NVIDIA Developer
    Jetson Nano is a small, powerful computer for embedded applications and AI IoT that delivers the power of modern AI in a $99 (1KU+) module.
  17. [17]
    Jetson Ecosystem - NVIDIA Developer
    ATS provides active and passive air cooling solutions built specifically for NVIDIA Jetson based boards and integrated systems. In addition, ATS ...
  18. [18]
    Jetson FAQ | NVIDIA Developer
    What changes for industrial environments does Jetson TX2i have compared to Jetson TX2? Vibration: 10Hz–200Hz, 1g & 2g RMS; random: 5g RMS, 10 to 500Hz.
  19. [19]
    Unified Memory for CUDA Beginners | NVIDIA Technical Blog
    Jun 19, 2017 · This post introduces CUDA programming with Unified Memory, a single memory address space that is accessible from any GPU or CPU in a system.
  20. [20]
    Jetson Orin NX Series and Jetson AGX Orin Series - NVIDIA Docs
    Sep 16, 2024 · This topic describes power and performance management features of NVIDIA Jetson Orin NX series and NVIDIA Jetson AGX Orin series devices.
  21. [21]
    Jetson Thor | Advanced AI for Physical Robotics - NVIDIA
    NVIDIA Jetson Thor Series ; Memory, 128 GB 256-bit LPDDR5X 273 GB/s ; Storage, 1 TB NVMe M.2 Key M Slot, Supports NVMe through PCIe. Supports SSD through USB3.2.
  22. [22]
    Introducing NVIDIA Jetson Thor, the Ultimate Platform for Physical AI
    Aug 25, 2025 · Today, we're excited to announce that the NVIDIA Jetson ...
  23. [23]
    Unlock Faster, Smarter Edge Models with 7x Gen AI Performance on ...
    Oct 15, 2025 · NVIDIA's Jetson AGX Thor has seen a 3.5x increase in performance on certain models, such as Llama 3.3 70B and DeepSeek R1 70B, due to software ...
  24. [24]
    Jetson SDKs - NVIDIA Developer
    NVIDIA JetPack SDK is the most comprehensive solution for building AI applications. It bundles all the Jetson platform software, including TensorRT, cuDNN, CUDA ...
  25. [25]
    JetPack Software Stack for NVIDIA Jetson - NVIDIA Developer
    NVIDIA JetPack is the official software stack for NVIDIA Jetson, providing tools and libraries for building AI-powered edge applications.
  26. [26]
    JetPack Archive - NVIDIA Developer
    This page includes access to previously released versions of JetPack. The latest version of JetPack is always available under the main NVIDIA JetPack product ...
  27. [27]
    JetPack SDK 6.0 - NVIDIA Developer
    This JetPack 6.0 release includes Jetson Linux 36.3, which packs Linux Kernel 5.15 and an Ubuntu 22.04-based root file system.
  28. [28]
    NVIDIA JetPack SDK Downloads and Notes
    Aug 25, 2025 · Access the latest release notes, downloadable packages, and development and production resources for the NVIDIA JetPack SDK and Jetson ...
  29. [29]
    JetPack SDK - NVIDIA Developer
    The Jetson AI stack packaged with JetPack 6.1 includes CUDA 12.6, TensorRT 10.3, cuDNN 9.3, VPI 3.2, DLA 3.1, and DLFW 24.0.
  30. [30]
    JetPack SDK 4.6.1 - NVIDIA Developer
    JetPack SDK is for building AI apps, including L4T, CUDA, and TensorRT. JetPack 4.6.1 supports new Jetson modules and includes TensorRT 8.2.
  31. [31]
    Introduction to NVIDIA JetPack SDK
    Jun 26, 2025 · JetPack 6.2.1 includes NVIDIA Jetson Linux 36.4.4, which includes the Linux Kernel 5.15, a UEFI-based bootloader, and an Ubuntu 22.04-based root file ...
  32. [32]
    Jetson Linux - NVIDIA Developer
    NVIDIA® Jetson™ Linux Driver Package is the board support package for Jetson. It includes Linux Kernel, UEFI bootloader, NVIDIA drivers, flashing utilities, ...
  33. [33]
    Ubuntu now officially supports NVIDIA Jetson: powering the future of ...
    Mar 18, 2025 · Ubuntu now officially supports NVIDIA Jetson, bringing optimized performance, enterprise-grade stability, and enabling AI solutions across ...
  34. [34]
    Installing Real-Time Kernel — NVIDIA Jetson Linux Developer Guide
    Sep 18, 2025 · The Real-Time (RT) Kernel can be installed with a Debian package management–based OTA on Jetson devices running Jetson Linux or Jetson ...
  35. [35]
    Is there Windows 10/11 support on any of the Jetson modules?
    Jul 26, 2024 · If windows 10/11 support is now available on any of the Jetson modules (Orin, Xavier, Nano)? No, there is no Windows OS support on Jetson platform, and no plan ...
  36. [36]
    Isaac ROS (Robot Operating System) - NVIDIA Developer
    Isaac ROS delivers a rich collection of individual ROS packages (GEMs) and complete pipelines (NITROS) optimized for NVIDIA GPUs and NVIDIA Jetson™ platforms.
  37. [37]
    Firmware TPM — NVIDIA Jetson Linux Developer Guide 1 ...
    Nov 4, 2024 · The secure boot function should construct Hardware Root of Trust (HROT), Root of Trust for Reporting (RTR), and Root of Trust for Measurement ( ...
  38. [38]
    Create, Design, and Deploy Robotics Applications Using New ...
    Jun 2, 2024 · This integration is demonstrated on the NVIDIA Jetson AGX Orin using Isaac ROS Visual SLAM and Nvblox. LIPS also successfully integrated its ...
  39. [39]
    Slamcore + NVIDIA Deliver Real-Time Object Awareness for Robots
    Jan 8, 2024 · Slamcore uses NVIDIA Jetson to enable robots to see and differentiate objects, share this information, and improve safety and efficiency.
  40. [40]
    Transforming Industrial Defect Detection with NVIDIA TAO and ...
    Nov 20, 2023 · This post explores how NVIDIA TAO can be employed to design custom AI models that pinpoint defects in industrial applications, enhancing overall quality.
  41. [41]
    Top 5 Use Cases for NVIDIA® Jetson Orin™ Nano in Edge AI
    Jul 4, 2024 · The Jetson Orin Nano can inspect products on the production line in real-time, detecting defects and ensuring that only high-quality items reach ...
  42. [42]
    NVIDIA Jetson Thor: New SoC Features Guide
    Oct 2, 2025 · Thor supports decoding of up to 10× 4Kp60 or 4× 8Kp30 video streams and encoding up to 6× 4Kp60. It supports modern codecs (H.265, AV1, VP9, etc ...
  43. [43]
    Jetson AGX Xavier Series - NVIDIA
    The Jetson AGX Xavier series of modules delivers up to 32 TOPS of AI performance and NVIDIA's rich set of AI tools and workflows, letting developers train and ...
  44. [44]
    NVIDIA Jetson Modules Overview - Assured Systems
    Multi-Sensor Fusion: The Xavier AGX Series supports multi-sensor fusion, enabling devices to process data from various sensors simultaneously. This ...
  45. [45]
    Building Smart Cities With Help of AI Video Analytics - NVIDIA
    Deploy AI from the edge to the cloud with a range of appliances, from the NVIDIA® Jetson Nano™ handling a city's traffic cameras, to an entire fleet of NVIDIA ...
  46. [46]
    Multi-stream real-time predection with jetson nano
    Apr 27, 2020 · I used threads for H264 HW decoding. and I want to know how to handle this problem for 12 cameras at the same time? I want to know in Nvidia ...
  47. [47]
    Jetson Community Projects - NVIDIA Developer
    Explore and learn from Jetson projects created by us and our community. These have been created with Jetson developer kits.
  48. [48]
  49. [49]
    NVIDIA Jetson AI Lab: Home
    Generative AI at the Edge. Bring generative AI to the world with NVIDIA® Jetson™.
  50. [50]
    NVIDIA Partner Network (NPN)
    Join the NVIDIA partner program or find an experienced partner. Benefits include sales and distribution, training, marketing, service, and support.
  51. [51]
    u-blox joins NVIDIA Jetson Partner Ecosystem
    Aug 1, 2024 · u-blox this week announced that it has joined the NVIDIA Jetson Partner Ecosystem, which offers AI for autonomous machines and other industrial applications.
  52. [52]
  53. [53]
    Basler Announces Elite-Level Status in NVIDIA Partner Network to ...
    Jun 17, 2022 · Basler AG is elevated to Elite partner-level status. The collaboration provides Basler customers with the opportunity to combine the NVIDIA Jetson platform ...
  54. [54]
    Antmicro becomes Nvidia® Jetson™ Ecosystem Partner
    Feb 23, 2016 · As an NVIDIA Jetson Ecosystem Partner, Antmicro offers comprehensive services to help customers take full advantage of the NVIDIA Tegra® ...