
ARToolKit

ARToolKit is an open-source software library for developing augmented reality (AR) applications. It uses computer vision techniques to track fiducial markers—typically square black-and-white patterns—and overlay virtual 3D graphics onto a live camera feed in real time. Originally created by Hirokazu Kato at the Human Interface Technology Laboratory (HITLab) of the University of Washington, it enables precise camera pose estimation relative to these markers, facilitating applications ranging from research prototypes and education to industrial visualization. Developed initially in 1999 as part of research into video-based augmented reality systems, ARToolKit was first demonstrated at the SIGGRAPH 1999 exhibition and released under an open-source license in 2001, making it one of the earliest accessible tools for marker-based AR. Subsequent versions, supported by collaborations with HIT Lab NZ at the University of Canterbury and the company ARToolworks (incorporated in 2002), expanded its capabilities to include custom marker patterns, camera calibration tools, and support for multiple programming languages such as C and C++. By 2004, it adopted the GNU General Public License (GPL) for non-commercial use, with commercial licensing options available through ARToolworks, which pioneered camera-based AR technologies. ARToolKit's core features include real-time marker detection, 6-degree-of-freedom pose tracking, and integration with OpenGL-based rendering, allowing developers to build cross-platform applications without extensive computer-vision expertise. The library has evolved through community-driven forks and official releases, such as ARToolKit5 (version 5.4 as of recent updates), which supports modern operating systems including Windows, macOS, Linux, iOS, and Android under the GNU Lesser General Public License (LGPLv3). After ARToolworks' developments up to 2015, ongoing maintenance has been handled by initiatives such as artoolkitX, keeping the library relevant in contemporary AR ecosystems despite competition from proprietary SDKs. Its original paper remains among the most cited in AR research, underscoring its role in advancing marker-tracking techniques.

History

Origins and Early Development

ARToolKit's development commenced in 1999 at the Human Interface Technology Laboratory (HITLab) at the University of Washington, led by Hirokazu Kato. As a visiting scholar from Hiroshima City University, Kato initiated the project to address the need for accessible tools in augmented reality research. The primary objective was to develop a straightforward, open-source library that facilitated marker-based tracking for augmented reality applications, allowing the real-time superposition of virtual 3D objects onto live video streams from a camera. This approach leveraged fiducial markers—distinctive square patterns printed on paper or other surfaces—to simplify the registration of virtual elements with the real world, making AR prototyping feasible for researchers without extensive computer-vision expertise. The library's inaugural public showcase occurred at SIGGRAPH 1999, integrated into the Shared Space project, where it demonstrated collaborative interactions across networked environments. During this early phase, developers tackled significant hurdles in real-time computer vision, including robust detection of markers under varying lighting and motion conditions to enable precise camera pose estimation relative to the physical scene. Implemented primarily in C, ARToolKit prioritized cross-platform compatibility, with initial support for academic workstations running SGI IRIX alongside emerging PC-based systems. This design choice ensured low overhead and broad accessibility, establishing ARToolKit as a pivotal tool in the nascent field of augmented reality.

Major Releases and Evolution

ARToolKit's journey began with its initial development in 1999 by Hirokazu Kato at the Human Interface Technology Laboratory (HITLab) at the University of Washington, where it was first demonstrated at SIGGRAPH 1999. The software was released in 2000 under a custom license, establishing it as the pioneering open-source augmented reality (AR) tracking library that enabled marker-based video see-through AR applications. In 2002, ARToolworks was incorporated, and version 1.0 was made fully available as open source through the HITLab, fostering widespread academic and research adoption. The 2.x series followed, with version 2.72.1 released in May 2007, introducing enhanced multi-platform support for Windows, Linux, Mac OS X, and other systems, alongside improvements in marker recognition accuracy and robustness to lighting variations. ARToolworks later launched ARToolKit Professional, a proprietary edition offering enterprise-grade extensions beyond the open-source core, notably including Natural Feature Tracking (NFT) for markerless operation. This edition integrated commercial advancements while maintaining a parallel open-source track. The 2000s and early 2010s saw key updates for mobile platforms, such as early support for Symbian in 2005, iOS with the iPhone in 2008, and Android starting in 2011, adapting the library to mobile hardware constraints. The transition to the 5.x series marked a significant evolution, with version 5.2 re-released as open source in 2015 after the acquisition of ARToolworks by DAQRI, incorporating previously proprietary features. Version 5.4, released under the GNU Lesser General Public License (LGPL) version 3, further enhanced stability, expanded the API for easier integration, and included optimizations from commercial developments. Following DAQRI's shutdown in 2019, maintenance shifted to community-driven repositories under artoolkitX, ensuring ongoing updates and compatibility with modern hardware as of 2025.
Throughout its evolution, ARToolKit has been shaped by community feedback via forums and contributions, integration of academic research such as refined pose estimation algorithms, and adaptations to emerging hardware like mobile cameras and sensors.

Technical Foundations

Core Components and Architecture

ARToolKit employs a modular architecture that separates concerns into distinct components for video capture, image processing, marker recognition, and matrix transformation, enabling developers to integrate or replace modules as needed for custom applications. The core module, libAR, handles fundamental functions including marker tracking routines, calibration, and parameter collection, while the video module manages frame capture through platform-specific SDK wrappers. Image processing occurs via thresholding and feature detection within the AR module, followed by marker recognition that identifies fiducial patterns, and a transformation step that computes pose relative to the camera. The API is primarily C-based, providing a core C library with optional C++ wrappers for ease of integration in object-oriented environments, and includes key functions such as arInitCparam for camera parameter initialization, the arVideo routines for video stream handling (including arVideoOpen and arVideoGetImage), and cleanup calls like arVideoClose to release resources. Data flow follows a pipeline architecture: input frames from the camera undergo preprocessing in the video module, pass to the AR module for detection and pose estimation yielding transformation matrices, and output these matrices for external rendering use. This structure ensures efficient processing in real-time loops, with the libARgsub module offering OpenGL utilities for basic overlay tasks independent of specific windowing systems. Portability is achieved via platform-agnostic C code with conditional compilation directives to handle OS differences, supporting environments such as Windows, Linux, macOS, Android, and iOS without requiring extensive modifications. The design prioritizes real-time performance through efficient buffer handling, where video frames are processed in luminance-only formats and remain valid only until the next capture call, minimizing allocation overhead in continuous operation.
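The luminance-first buffer handling described above can be illustrated with a small sketch. The routine below converts an RGB frame to the 8-bit luma plane a tracker would then threshold; the Rec. 601 weights are a common choice for this conversion, not necessarily the exact coefficients ARToolKit's own pixel-format converters use.

```python
def rgb_to_luma(frame):
    """Convert rows of (R, G, B) pixels to 8-bit luminance values.

    Uses Rec. 601 weights (0.299, 0.587, 0.114); this is illustrative and
    may differ from ARToolKit's internal color-conversion coefficients.
    """
    return [
        [min(255, round(0.299 * r + 0.587 * g + 0.114 * b)) for (r, g, b) in row]
        for row in frame
    ]

# A tiny 1x3 "frame": pure red, pure green, pure blue.
frame = [[(255, 0, 0), (0, 255, 0), (0, 0, 255)]]
luma = rgb_to_luma(frame)
```

Working on a single-channel plane like this keeps the per-frame cost of thresholding and contour extraction low, which is why luminance-only processing suits continuous tracking loops.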

Marker Detection and Pose Estimation

ARToolKit utilizes a fiducial marker system based on square black-and-white patterns, each encoding a unique identity through a distinct inner design, enabling reliable detection and identification in video streams. These markers are typically printed on planar surfaces and can be deployed as single markers or in multi-marker configurations, where multiple markers are rigidly attached to a common object with predefined relative poses to improve overall tracking stability and accuracy. The multi-marker approach leverages the collective visibility of markers to mitigate issues like partial occlusion, as the system can derive a robust pose even if individual markers are partially obscured. The marker detection process commences with image preprocessing via adaptive thresholding to binarize the input frame, converting it into a representation that highlights potential marker regions against varying backgrounds. Contour extraction follows, identifying connected components in the binary image and approximating their boundaries with polygonal fits, specifically seeking shapes indicative of square markers by verifying four corner points and roughly parallel sides. Candidate regions are then subjected to pattern matching: the detected quadrilateral is normalized through a perspective transformation to a standard square, after which its inner pattern is correlated against a database of predefined templates to confirm the marker's identity and orientation. This multi-stage pipeline ensures efficient detection, typically operating at video frame rates on standard hardware. Pose estimation in ARToolKit computes the camera's position and orientation relative to the detected marker by solving for the rigid transformation that aligns the marker's known world coordinates with its observed image projections. Using the four corner points of the marker, the algorithm derives a homography matrix H that maps the planar marker points from world space to image space.
For a calibrated camera, this homography is decomposed to yield the extrinsic parameters. Because the marker is planar, H is a 3x3 matrix satisfying H ∝ K [r1 | r2 | T], where K is the camera intrinsic matrix, r1 and r2 are the first two columns of the 3x3 rotation matrix R, and T is the 3x1 translation vector; the third rotation column is recovered as r3 = r1 × r2. The rotation and translation are optimized iteratively from initial estimates based on the marker's edge normals, ensuring accurate recovery of the camera pose even under perspective distortion. Sub-pixel refinement of corner locations further enhances precision, achieving localization errors below one pixel on average, which propagates to improved pose accuracy in 3D space. In multi-marker scenarios, poses from multiple visible markers are combined using their known relative transformations, applying robust estimation techniques such as M-estimation to reject outliers and bolster reliability. Despite its effectiveness, the marker detection and pose estimation pipeline has inherent limitations. It presupposes planar markers lying in a known coordinate plane (typically z = 0), restricting applicability to non-planar or dynamically deforming targets without extensions. Accurate camera calibration is essential, as errors in the intrinsic parameters K directly degrade pose quality. Additionally, the thresholding step renders the system sensitive to lighting variations, where uneven illumination or shadows can lead to binarization errors, false positives, or missed detections; while adaptive thresholding mitigates this to some extent, extreme conditions still pose challenges. Partial occlusions are better tolerated in multi-marker setups but can still break single-marker tracking if more than a corner is obscured.
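The planar decomposition above can be sketched numerically. Assuming a known intrinsic matrix K and a homography H mapping marker-plane coordinates (z = 0) into the image, the following Python/NumPy routine recovers R and T. It is the textbook closed-form reconstruction, not ARToolKit's actual implementation, which additionally refines an initial estimate iteratively.

```python
import numpy as np

def pose_from_homography(K, H):
    """Recover rotation R and translation t from a planar homography.

    Assumes H ~ K [r1 r2 t] up to scale, i.e. marker points lie on z = 0.
    Textbook closed form; ARToolKit refines such an estimate iteratively.
    """
    A = np.linalg.inv(K) @ H                 # A ~ [r1 r2 t] up to scale
    lam = 1.0 / np.linalg.norm(A[:, 0])      # scale fixed by unit-norm r1
    if A[2, 2] < 0:                          # keep the marker in front of the camera
        lam = -lam
    r1, r2, t = lam * A[:, 0], lam * A[:, 1], lam * A[:, 2]
    r3 = np.cross(r1, r2)                    # complete the rotation basis
    R = np.column_stack([r1, r2, r3])
    U, _, Vt = np.linalg.svd(R)              # re-orthonormalize to absorb noise
    return U @ Vt, t

# Round-trip check with a synthetic pose and an arbitrary scale on H.
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0, 0, 1]])
t_true = np.array([0.1, -0.05, 1.5])
H = K @ np.column_stack([R_true[:, 0], R_true[:, 1], t_true])
R, t = pose_from_homography(K, 2.7 * H)
```

The scale ambiguity of H is resolved by forcing r1 to unit length, and the SVD step projects the nearly-orthonormal [r1 r2 r3] back onto the rotation group, which matters once the homography is estimated from noisy corner detections.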

Features

Tracking Capabilities

ARToolKit's core tracking functionality relies on marker-based methods, where square fiducial markers with unique black-and-white patterns are detected in video frames to compute the camera's position and orientation relative to the real world in real time. This approach enables the simultaneous detection of multiple markers, with support for hierarchical multi-marker configurations that enhance stability in complex scenes by relating individual markers to a parent structure. On modern hardware, marker-based tracking achieves real-time performance at over 30 frames per second (FPS), allowing smooth augmentation even with dozens of markers visible. Natural Feature Tracking (NFT), introduced in ARToolKit version 4 and refined in subsequent releases, extends capabilities to markerless environments by using pre-trained image templates of planar textured surfaces, such as photographs or documents. NFT employs feature point detection and matching with descriptors similar to SIFT, including the key point matching (KPM) framework in version 5.x, to initialize and maintain tracking without fiducials. This method supports robust recognition across varying scales and viewpoints, though it is computationally more intensive than marker-based tracking. For enhanced robustness, multi-camera support allows simultaneous processing from multiple camera setups, facilitating depth estimation and pose refinement from multiple views to improve accuracy in dynamic conditions. In ideal conditions with controlled lighting and minimal motion blur, ARToolKit achieves sub-millimeter to low millimeter-level pose accuracy at close ranges (under 1 m), with errors increasing to the centimeter level at distances of 1-2 m for standard 10-20 cm markers; tracking ranges typically extend up to 2-2.5 m. Advanced options include adjustable binarization thresholds to adapt to varying illumination and multi-threaded processing in version 5.x for optimized operation on multi-core systems.
These features underpin the library's pose estimation, which relies on the geometric transformations described in the marker detection process.
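The adjustable binarization threshold mentioned above can be shown with a toy example. This sketch applies a simple global threshold of the kind a developer might tune for difficult lighting; it is illustrative only and does not reproduce ARToolKit's internal binarizer.

```python
def binarize(gray, threshold=100):
    """Binarize a grayscale image: pixels darker than `threshold` become 1
    (candidate marker ink), lighter pixels become 0 (background).

    The classic ARToolKit API exposes a comparable tunable threshold
    (commonly defaulted to 100); this toy version shows the effect of
    adjusting it under dim lighting.
    """
    return [[1 if px < threshold else 0 for px in row] for px in [r for r in gray] for row in [px_row] ] if False else [
        [1 if px < threshold else 0 for px in row] for row in gray
    ]

# A dim scene: the dark center pixel (~40) is marker ink, the under-lit
# background sits around 90, below the default threshold.
gray = [[90, 90, 90],
        [90, 40, 90],
        [90, 90, 90]]

default = binarize(gray, threshold=100)   # everything < 100 -> all foreground
tuned   = binarize(gray, threshold=60)    # only the dark square survives
```

With the default threshold the whole under-lit frame binarizes to foreground and no square contour can be found; lowering the threshold isolates the marker, which is exactly the adjustment the tracking options expose.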

Rendering and Integration Tools

ARToolKit facilitates the integration of virtual content with graphics rendering systems, primarily through its support for OpenGL, enabling developers to overlay virtual 3D objects onto real-world views captured by a camera. The library's ARgsub_lite module provides essential utility functions for this purpose, including arglCameraFrustum, which computes an OpenGL perspective projection matrix from ARToolKit's camera parameters, and arglCameraView, which generates a viewing matrix based on the detected marker pose. These functions allow the projection of 3D models onto marker poses by setting up the frustum and camera position, akin to traditional methods like gluLookAt for aligning virtual content with physical markers. Additionally, arglSetupForCurrentContext initializes the OpenGL context with ARToolKit parameters, ensuring seamless rendering without extensive manual configuration. For video overlay, ARToolKit employs the libARvideo library to capture live camera feeds and composite them with rendered virtual elements in real time, supporting video see-through setups. The arglDispImage function renders the captured video frame directly via OpenGL, overlaying it as the background for virtual content while preserving the current state for efficiency. A stateful variant, arglDispImageStateful, further optimizes this by avoiding resets to the OpenGL state, which is particularly useful in complex rendering pipelines. API extensions in ARToolKit allow customization beyond basic rendering, with hooks in the ARgsub module enabling the integration of custom shaders, lighting, and animations through standard OpenGL calls. The lightweight ARgsub_lite variant supports efficient 2D and 3D drawing operations, making it suitable for resource-constrained environments while allowing developers to extend functionality for advanced graphics effects. Camera calibration is supported via the calib_camera utility, which estimates intrinsic and distortion parameters by processing images of checkerboard patterns captured under varying conditions.
This tool generates the necessary camera parameter files (e.g., camera_para.dat) required for accurate pose estimation and rendering alignment. A representative workflow in ARToolKit applications involves loading marker pattern files, detecting the marker pose in the current video frame, applying the resulting 4x4 transformation matrix to set the OpenGL view via arglCameraView, rendering the virtual scene, and finally compositing it over the video background drawn with arglDispImage. The rendering pipeline in ARToolKit is optimized for low-latency performance to support real-time AR, with the lite implementation (ARgsub_lite) tailored for embedded systems by minimizing overhead in graphics setup and video handling.
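The role of a function like arglCameraFrustum can be illustrated by building an OpenGL-style projection matrix from pinhole intrinsics. The sketch below is a generic derivation under the assumption of an undistorted pinhole camera, not ARToolKit's exact implementation, which also accounts for its own distortion model; the y convention assumes the principal point is measured from the bottom-left, matching OpenGL.

```python
import numpy as np

def frustum_from_intrinsics(fx, fy, cx, cy, width, height, near, far):
    """Build an OpenGL-style projection matrix (row-major here; OpenGL
    would upload its transpose as column-major) from pinhole intrinsics.

    Generic pinhole derivation; arglCameraFrustum performs a comparable
    conversion from ARToolKit's camera parameter structure.
    """
    return np.array([
        [2 * fx / width, 0.0,             1 - 2 * cx / width,            0.0],
        [0.0,            2 * fy / height, 2 * cy / height - 1,           0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0,            0.0,            -1.0,                           0.0],
    ])

P = frustum_from_intrinsics(fx=800, fy=800, cx=320, cy=240,
                            width=640, height=480, near=0.1, far=100.0)

# A point on the optical axis (camera looks down -Z in OpenGL) should land
# at the center of the viewport in normalized device coordinates.
p = P @ np.array([0.0, 0.0, -1.0, 1.0])
ndc = p[:3] / p[3]
```

Computing the projection from calibrated intrinsics rather than an arbitrary field of view is what keeps rendered overlays registered to the video: the virtual camera then images the world exactly as the physical one does.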

Supported Platforms

Operating Systems and Compatibility

ARToolKit provides primary support for Windows in both 32-bit and 64-bit configurations, macOS, and Linux distributions. The library is designed to compile and run on these desktop environments, leveraging platform-specific toolchains such as Visual Studio for Windows, Xcode for macOS, and GCC or Clang for Linux. For mobile deployment, ARToolKit5 and its successor artoolkitX enable integration with iOS and Android, requiring SDK-specific adaptations for camera access and rendering, with experimental support for web platforms via Emscripten. These versions facilitate cross-compilation to the ARM architectures prevalent in mobile devices, allowing developers to build AR applications that use device cameras for real-time marker tracking. Earlier iterations of ARToolKit included legacy support for SGI workstations and ports to early mobile operating systems such as Symbian, reflecting its origins in late-1990s hardware and early mobile platforms. Contemporary development emphasizes modern x86 and ARM architectures across supported systems, phasing out compatibility with obsolete environments. Building ARToolKit from source uses CMake for cross-platform configuration and compilation, streamlining adaptation across operating systems. Key dependencies include OpenGL for rendering and GLUT for the example applications, with optional libraries such as OpenCV for enhanced functionality in some configurations. The v5.x series achieves full cross-platform compatibility through abstracted video and windowing interfaces, enabling seamless video capture and tracking on diverse hosts without major code modifications. Compatibility challenges arise from variations in video backends, including DirectShow or Windows Media Capture on Windows to handle webcam streams and Video4Linux2 (V4L2) on Linux for device enumeration and capture. Developers must configure these appropriately to ensure reliable input across platforms, often via environment variables or build flags.
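A CMake-based source build of the kind described above typically follows the standard out-of-source pattern. The commands below are a hedged sketch: the repository URL matches the community-maintained ARToolKit5 mirror, but exact configure options and helper scripts vary by release, so the project's own build documentation should be consulted.

```shell
# Hypothetical out-of-source build of an ARToolKit checkout; option names
# and helper scripts differ between releases, so treat this as a sketch.
git clone https://github.com/artoolkitx/artoolkit5.git
cd artoolkit5
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release   # configure with CMake
cmake --build build --parallel                   # compile libraries and examples
```

The same two-step configure-and-build sequence applies across Windows, macOS, and Linux, which is the practical benefit of the CMake-based setup the text describes.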

Hardware and Performance Considerations

ARToolKit's minimum hardware requirements are modest, centering on a compatible camera and basic processing capabilities. A webcam or mobile camera supporting at least 640x480 resolution at 15 frames per second (FPS) is essential for reliable video input and marker detection, as lower rates restrict the rendering module's performance. The CPU must handle video acquisition and image processing, with support for SSE instructions enabling optimized operations such as color format conversions from 32-bit RGBA. Systems like an Intel Pentium 4 or equivalent from the early 2000s suffice for basic use, though modern equivalents like Intel Core i3 processors ensure smoother operation, with no specific RAM mandates beyond the 128-512 MB observed in early mobile deployments. For recommended configurations, a GPU supporting OpenGL 1.5 or later—with dedicated graphics acceleration—facilitates efficient rendering of overlaid content, particularly in complex scenes. Higher-resolution cameras, ideally with 1/2-inch sensors or larger and adjustable apertures, are advised for Natural Feature Tracking (NFT) to capture sufficient detail for feature extraction, outperforming standard consumer webcams in low-light or textured environments. Professional options like Point Grey Firefly cameras, connected via IEEE-1394 interfaces, provide enhanced imaging control for stable tracking. Performance is primarily limited by CPU demands in marker detection and pose estimation, which can cap frame rates at 2-6 FPS on low-end devices such as early smartphones with 400-1000 MHz processors and 128-512 MB of RAM. On modern desktop hardware, such as 2.4 GHz processors, single-marker tracking reaches 60 FPS, while laptops at 1.6 GHz achieve around 30 FPS. Optimizations like SSE-accelerated routines and low-frequency marker patterns help mitigate bottlenecks, though resolutions beyond 640x480 may reduce frame rates without further optimization.
On mobile platforms running iOS and Android, ARToolKit leverages ARM processors for tracking at the camera's full frame rate, but continuous operation raises concerns about battery consumption and thermal throttling due to sustained camera access and computation. Developers must implement pauses in processing to manage heat buildup on devices without dedicated cooling. Built-in example applications in ARToolKit display real-time frame-rate and tracking metrics, aiding developers in tuning configurations. Marker design guidelines recommend sizes of 10-20 cm for optimal detection ranges of 80-150 cm at handheld distances, as smaller markers lose trackability once they span fewer than roughly 40-50 pixels in the image. For scalability, ARToolKit supports multi-marker scenes, but processing additional markers trades off frame rate; for instance, tracking 5-8 markers has been reported to yield around 8 FPS on early low-power mobile devices, with proportional drops on higher-end hardware in dense configurations.
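The relationship between marker size, distance, and on-image footprint follows directly from the pinhole model: a marker of side s at distance Z spans roughly f·s/Z pixels for a focal length f expressed in pixels. The sketch below uses this to estimate where a marker drops below the roughly 40-pixel trackability floor cited above; the focal length is an assumed, typical value for a 640x480 webcam, not a measured one.

```python
def marker_pixels(focal_px, marker_m, distance_m):
    """Approximate on-image side length (pixels) of a square marker,
    using the pinhole relation px = f * s / Z."""
    return focal_px * marker_m / distance_m

def max_tracking_distance(focal_px, marker_m, min_px=40):
    """Distance at which the marker shrinks to `min_px` pixels across."""
    return focal_px * marker_m / min_px

f = 700.0        # assumed focal length in pixels for a 640x480 webcam
side = 0.10      # 10 cm marker, the low end of the recommended sizes

px_at_1m = marker_pixels(f, side, 1.0)   # ~70 px: comfortably trackable
limit = max_tracking_distance(f, side)   # ~1.75 m for the 40 px floor
```

This back-of-the-envelope calculation matches the guideline ranges in the text: a 10 cm marker is solid at handheld distances but approaches the detection limit well before 2 m, while doubling the marker side doubles the usable range.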

Applications

Research and Educational Uses

ARToolKit originated in academic research at the Human Interface Technology Laboratory (HITLab) at the University of Washington, where it was developed in 1999 by Hirokazu Kato for human-computer interaction projects involving virtual overlays on real-world environments. The library's initial demonstration occurred at SIGGRAPH 1999 as part of a shared virtual space project, enabling early explorations of collaborative interfaces that blend physical and digital elements. In educational settings, ARToolKit supports computer vision and augmented reality curricula through accessible tutorials and sample code that introduce marker-based tracking and pose estimation concepts. These resources are integrated into AR programming textbooks and courses, allowing students to build basic applications that demonstrate real-time tracking and registration. Research leveraging ARToolKit has advanced collaborative augmented reality, such as studies creating shared virtual spaces where multiple users interact with overlaid digital content on physical markers, as demonstrated in early prototypes like the Invisible Train system. In medical visualization, prototypes have used the library to project interactive anatomical models onto patient scans or surgical tools, enhancing preoperative planning and educational simulations of surgical procedures. As an open-source library, ARToolKit facilitates student-led projects by providing free access to core tracking algorithms, enabling the creation of interactive models for anatomy education—such as virtual dissections triggered by printed markers—or historical reconstructions that overlay period artifacts on modern environments. Notable research papers citing ARToolKit include IEEE publications evaluating its tracking accuracy, such as analyses showing positional errors under 1 mm at close ranges with fiducial markers, which inform improvements in pose estimation for stable AR overlays.
Integrations with flight simulation platforms have supported advanced training applications, such as multirotor flight training systems where ARToolKit handles marker detection to align virtual aircraft models with real-world controls. University-hosted resources, including workshops at conferences such as IEEE ISMAR, offer hands-on sessions with ARToolKit sample code tailored for beginners, covering setup, camera calibration, and simple scene development to lower barriers to academic experimentation.

Commercial and Industrial Implementations

ARToolKit has been integrated into various commercial products and services, particularly through its professional variant, ARToolKit Professional, which offers enhanced tracking reliability for commercial applications. This version supports scalable object tracking in augmented and mixed reality setups, enabling businesses to deploy marker-based AR without extensive custom development. Companies like ARToolworks provided licensing for these tools, facilitating integration into desktop and mobile environments for profit-driven solutions. In entertainment, the Softbank Hawks iOS app employed ARToolKit to let fans visualize a virtual mascot via stadium posters, enhancing fan engagement at baseball events. Advertising campaigns have leveraged ARToolKit for immersive print media experiences, including Boffswana's markerless AR promotion for the 2010 "Clash of the Titans" film, where users scanned images to summon a virtual Kraken. Jack Link's "Living Sasquatch" web campaign used FLARToolKit (a Flash-based derivative) to animate Sasquatch characters, attracting 100,000 unique visitors and 500,000 page views in its first month. Hotels.com's AR city views, also based on FLARToolKit, increased site traffic by 26% and transactions by 36% over three months, generating an estimated $14 million in media value. Industrial implementations include Boeing's early prototyping of AR assembly guides using ARToolKit for manufacturing tasks, which informed their Boeing Augmented Reality Kit (BARK) for wiring diagrams and hands-free instructions. In the automotive sector, ARToolKit has supported design reviews by overlaying virtual prototypes on physical models, as seen in early applications for product visualization. The Otolift Measuring Tool by TWNKLS used ARToolKit with MarkerSLAM for precise measurement of staircases, reducing measurement errors to 0.1% and earning the 2013 Auggie Award for Best Enterprise AR. Notable products incorporating ARToolKit include furniture placement apps that use its library to position virtual items in real spaces via markers, aiding visualization.
For cultural heritage tourism, 2010s mobile apps employing ARToolKit markers provided on-site historical overlays, guiding visitors through landmarks. The 3D-Live project extended ARToolKit for remote collaboration, superimposing life-sized virtual video of distant experts onto local views for maintenance training. Economically, ARToolKit's open-source model has lowered entry barriers for small and medium-sized enterprises (SMEs) to develop AR solutions, while professional licensing ensures high-reliability support for industries requiring robust performance, such as aerospace and manufacturing.

Community and Extensions

Licensing and Open-Source Model

ARToolKit was initially developed in 1999 by Hirokazu Kato at the Human Interface Technology Laboratory (HITLab) at the University of Washington and released as version 1.0 under a non-commercial license to support academic and research applications. This early licensing model restricted use to non-commercial purposes, aligning with its origins in academic research, before transitioning to more permissive open-source terms. By 2004, following the involvement of ARToolworks (incorporated in 2002), the project adopted the GNU General Public License (GPL) for broader distribution, enabling free use, modification, and redistribution while requiring derivative works to remain open source. In 2015, following the acquisition of ARToolworks by DAQRI, ARToolKit version 5.2 shifted to the GNU Lesser General Public License version 3 (LGPL v3), releasing all previously proprietary features, including Natural Feature Tracking (NFT), as open source. DAQRI ceased operations in 2019, after which maintenance has been handled by volunteers. The open-source core is hosted on platforms such as SourceForge and GitHub, where the source code is freely available for download, modification, and redistribution under LGPL v3 terms, promoting widespread adoption among developers. This model has supported over 500,000 downloads of the open-source version 2 alone, contributing to a vibrant global developer community engaged in augmented reality innovation. Advanced features like NFT for markerless applications are now included in the open-source ARToolKit5. Contributions are accepted through GitHub's issue tracker, pull requests, and community forums, ensuring ongoing development without formal corporate oversight. Compliance with the LGPL v3 mandates proper attribution to the original authors; the license disclaims warranties, so users are advised to test thoroughly before deployment, as the software is provided "as is".
This open-source framework has indirectly influenced numerous derivatives, though the core project emphasizes adherence to its licensing for sustained community collaboration.

Derivatives and Modern Forks

ARToolKitPlus represents an early open-source fork of the original ARToolKit, developed to address limitations in marker detection and pose estimation. It introduced improvements such as support for up to 4096 binary id-encoded markers, a robust planar pose algorithm to reduce jitter, automatic thresholding for better adaptability to lighting conditions, and hull-based tracking in multi-marker setups. These enhancements made it suitable for more reliable tracking in resource-constrained environments, though development ceased around 2009. Former commercial extensions such as Natural Feature Tracking (NFT) are now part of the open-source ARToolKit5, while ports such as NyARToolKit provide a Java-based implementation supporting both marker-based and image-based tracking without fiducials. ARToolKit5 also includes database management capabilities for handling multiple image targets, enabling scalable applications in professional settings. Integrations with game engines such as Unity and Unreal Engine are available through community-maintained plugins, such as arunityx for Unity. artoolkitX serves as the prominent modern fork and continuation of ARToolKit version 5.x, initiated after 2015 to revive community support as the original project's maintenance slowed, and continuing after DAQRI's shutdown. It provides multi-platform compatibility across iOS, Android, macOS, Windows, and Linux, with experimental support via Emscripten for browser-based applications. Key advancements include texture tracking for natural features, 6DoF pose estimation, and high-performance video acquisition, while maintaining backward compatibility with legacy markers. As of 2024, artoolkitX remains actively updated, with ongoing releases and community forums recommending it for new projects over the stagnant original. Community-driven forks on GitHub further extend ARToolKit for specialized uses, such as mobile optimizations in iOS and Android builds, and experimental branches integrating with AR/VR hybrid devices such as HoloLens for spatial mapping.
These efforts leverage the original's open-source LGPL license to innovate on niche requirements, though they often lack the comprehensive support of artoolkitX.

References

  1. [1]
    ARToolKit Home Page - Human Interface Technology Laboratory
    ARToolKit is a software library for building Augmented Reality (AR) applications. These are applications that involve the overlay of virtual imagery on the ...
  2. [2]
    [PDF] Marker Tracking and HMD Calibration for a Video-based ...
    Hirokazu Kato1 and Mark Billinghurst2 ... The AR user also has a set of small marked cards and a larger piece of paper with six letters on it around the outside.Missing: original | Show results with:original
  3. [3]
    Welcome to ARToolworks
    ARToolworks is a pioneer in the field of camera-based augmented reality, first showing its core product - ARToolKit - at the Siggraph Exhibition in 1999.
  4. [4]
    ARToolKit Documentation (User Introduction)
    ARToolKit is a C and C++ language software library that lets programmers easily develop Augmented Reality applications.
  5. [5]
    artoolkit/ARToolKit5 - GitHub
    ARToolKit consists of a full ecosystem of SDKs for desktop, web, mobile and in-app plugin augmented reality. Stay up to date with information and releases ...
  6. [6]
    artoolkitX
    ARToolKit was first released as an open source project in 2001. Under the ownership of ARToolworks, it continued with an open source variant until 2015 when ...
  7. [7]
    IEEE VGTC Virtual Reality Technical Achievement Award 2009
    The original paper describing the ARToolKit is currently the third most cited paper in AR. Even 10 years after its original development, the ARToolKit is ...
  8. [8]
    ARToolKit Documentation (History)
    ARToolKit was developed in 1999 when Hirokazo Kato arrived at the HITLab. The first demonstration have been at SIGGRAPH 1999 for the shared space project.Missing: Hirokazu | Show results with:Hirokazu
  9. [9]
    [PDF] ARToolKit - Tinmith
    ARToolKit version 2.33: A software library for Augmented Reality Applications. Copyright (C) 2000. Hirokazu Kato, Mark Billinghurst, Ivan Poupyrev. This program ...
  10. [10]
    A Brief History of Augmented Reality - InformIT
    Jun 10, 2016 · This situation changed when Kato and Billinghurst [1999] released ARToolKit, the first open-source software platform for AR. It featured a ...
  11. [11]
    ARToolKit download | SourceForge.net
    Rating 4.9 (18) · Free · DeveloperMay 15, 2014 · The Augmented Reality Tool Kit (ARToolKit) captures images from video sources, optically tracks markers in the images, and composites them ...
  12. [12]
    ARToolKit NFT Release Notes - ARToolworks support library
    Mar 30, 2011 · ARToolKit NFT version 3.49.0 is released to you under a proprietary license. ... - Tracking history is now handled in libAR2, rather than being ...
  13. [13]
    Augmented Reality tools, the freemium model | WIRED
    Aug 6, 2009 · "The commercially licensed ARToolKit Professional version 4 represents the continued output of ARToolKit's original authors, and incorporates ...
  14. [14]
    artoolkitx/artoolkit5: ARToolKit v5.x - GitHub
    Jul 28, 2018 · ARToolKit v5.2 was the first major release under an open source license in several years, and represented several years of commercial ...
  15. [15]
  16. [16]
    ARToolKit Framework
    The ARToolKit library consists of four modules: AR module: core module with marker tracking routines, calibration and parameter collection. Video module: a ...
  17. [17]
    ARToolKit API Documentation
    Jan 12, 2007 · ARToolKit subroutines. Core of the ARToolKit Library. This file provides image analysis and marker detection routines. Different routines ...
  18. [18]
    GitHub - artoolkit/ARToolKit5
    Summary of ARToolKit Core Libraries (libAR, libARgsub, ARICP)
  19. [19]
    Multimarker Tracking | artoolkit-docs
    Multimarker tracking has special support in the ARToolKit API and allows for a number of tracking performance and stability enhancements.
  20. [20]
    ARToolKit Documentation (Computer Vision Algorithm)
    ARToolKit is based on a basic corner detection approach with a fast pose estimation algorithm. This algorithm is discussed in more detail on this page.
  21. [21]
    Measuring ARToolKit Accuracy in Long Distance Tracking ...
    The 3D pose of the marker was detected using the ARToolKit library which provides subpixel accuracy estimation of the marker's location with an average ...
  22. [22]
    ARToolKit NFT - ARToolworks
    Robust multi-resolution tracking allows the user to view augmented materials at a variety of scales; users can get “up-close” to the tracked surface without ...
  23. [23]
    ARToolKit Feature Comparison
    Support for simultaneous tracking from multiple video sources, e.g. stereo cameras · Stereo camera calibration · Robust pose estimation from calibrated stereo ...
  24. [24]
    gsub_lite - artoolkitX
    Mar 23, 2016 · gsub_lite also provides utility functions for setting the OpenGL viewing frustum and camera position based on ARToolKit camera parameters and ...
  25. [25]
    ARToolKit Video Library Configuration documentation
    ARVideo may be configured using one or more of the following options, separated by a space: -nodialog Don't display video settings dialog. -width=w Scale camera ...
  26. [26]
    ARToolKit Documentation (Camera Calibration)
    This page presents how to use the utility programs included with ARToolKit to calibrate your video camera. ARToolKit provides two calibration approaches.
  27. [27]
    ARToolKit Documentation (Developing your First Application, Part 1)
    You initialize 3D rendering by asking ARToolKit to render the 3D object and setting up minimal OpenGL state: argDrawMode3D(); argDraw3dCamera( 0, 0 ); ...
  28. [28]
    ARToolKit for Desktop - ARToolworks
    ARToolKit Pro provides reliable, low cost, adaptable, scalable object tracking for augmented and virtual reality applications.
  29. [29]
    artoolkitX - GitHub
    Providing high-performance video acquisition, marker and texture tracking for augmented reality, in native code for iOS, Android, macOS, Windows, and Linux ...
  30. [30]
    ARToolkit on Symbian
    ARToolkit on Symbian. Hi all, I have made a first port of ARToolkit to the Symbian platform. It was presented at MUM 2004 and shown unofficially at ISMAR. I ...
  31. [31]
    Getting ARToolKit - osgART
    The recommended approach to building ARToolKit is to use CMake. CMake is a tool that creates and configures the necessary project files for your particular system ...
  32. [32]
    ARToolKit Documentation (User Setup)
    The software dependencies for each supported operating system are outlined below along with installation instructions. Building. Building on Windows.
  33. [33]
    Product Information:ARToolKit Professional - ARToolworks, Inc.
    Supported across multiple platforms and architectures, including Windows, Mac OS X, Linux, SGI Irix, Symbian, and Windows Mobile. No expensive hardware ...
  34. [34]
    [PDF] How Mobile Phones Perform in Augmented Reality Marker Tracking?
    Table II shows the performance evaluation, in terms of the average number of FPS achieved by each mobile phone, and the latency for the transmission stage of ...
  35. [35]
    Hardware Selection and Configuration | artoolkit-docs
    The key optical variables of interest to ARToolKit are the light-gathering power of the lens (primarily a factor of its size), the camera aperture, and the ...
  36. [36]
    low frame rate - Human Interface Technology Laboratory
    I am now running the 2.70.1 version of artoolkit on both my 2.4 desktop computer and my 1.6 pentium M laptop. ... FPS on the laptop against about 60 FPS on the ...
  37. [37]
    ARToolKit for Android - ARToolworks
    Works with Android Smartphones and Tablets running Android 2.2 or later. · Tracking at full camera frame rate. · Features reusable Java classes built to the ...
  38. [38]
    How to improve performance in Augmented Reality applications
    Aug 25, 2020 · This article will help to achieve AR-app performance goals, if you just want to read the tricks then navigate directly to “Key Take-Away” section below.
  39. [39]
    ARToolKit Documentation
    Development Principle · First Step: Writing your first application · Second Step: Recognized multiple patterns · ARToolKit Framework: Basic Description · API ...
  40. [40]
    About the Traditional Template Square Marker | artoolkit-docs
    Table 1 shows some typical maximum ranges for square markers of different sizes. These results were gathered by scanning square markers of increasing size, ...
  41. [41]
    simultaneous detection/solution of 4-5 markers?
    I'm using 5-8 marker tracking on the Nokia 6600 cellular phone with 109Mhz ARM CPU without floating point, with 8 frames per second. I'm not using ARToolkit ...
  42. [42]
    Developing AR Applications with ARToolKit. | Request PDF
    In 1999, Hirokazu Kato of the Nara Institute of Science and Technology developed and realized the first AR open-source framework based on the GPL ...
  43. [43]
    [PDF] Collaborative Augmented Reality
    In this paper we have provided several examples of the types of interfaces that can be produced from taking advantage of these characteristics. Despite early ...
  44. [44]
    Visual support for medical communication by using projector-based ...
    We focus on the advantages of projector-based technology and ARToolKit. Our technique, based on thermal markers (i.e., using human body temperature as a source ...
  45. [45]
    24 ARToolKit and AR. (ARToolKit.n.d.) - ResearchGate
    In this AR application, the user performs THR surgery procedures and interactively explores human anatomy. Also, the user may test their knowledge with a quiz ...
  46. [46]
    Accuracy in optical tracking with fiducial markers - IEEE Xplore
    The results show a specific distribution of tracking accuracy dependent on distance as well as angle between camera and marker.
  47. [47]
    [PDF] An Augmented Reality Visualization System for Simulated Multirotor ...
    The AR visualization module was software developed with ARToolKit, integrated into the Unity 3D game engine, which runs on a smartphone device and ...
  48. [48]
    Tutorial 2: Developing Augmented Reality Applications
    ... AR systems and hybrid AR interfaces. They will also be able to try several ... ARToolKit, a software library that enables developers to easily build ...
  49. [49]
    ARToolKit licensing
    As it has been since the first public release of version 1.0, ARToolKit is freely available for non-commercial use under the terms of the GNU General Public ...
  50. [50]
    News - ARToolworks
    Downloads of ARToolKit v.2 (open source version) pass 500,000 · ARToolKit v.4.5 – New Updates and Enhancements · ARToolworks and Inglobe Technologies sign a ...
  51. [51]
    ARToolKitPlus is a computer tracking library for creation of ... - GitHub
    ARToolKit is a software library that can be used to calculate camera position and orientation relative to physical markers in real time.
  52. [52]
    Commercial Software - ARToolworks
    ARToolKit for Desktop. ARToolKit for Desktop v5.x is the professional version of ARToolKit, the world's most widely used tracking library for AR applications.
  53. [53]
    NyARToolKit - ARToolworks
    Some of the key features of NyARToolKit include: Marker based AR tracking; Natural Feature Tracking (Professional version only); Support for desktop and mobile ...
  54. [54]
    ARToolKit for Unity - ARToolworks
    ARToolKit for Unity allows a new class of AR applications; applications that blend ARToolKit's proven AR tracking engine with Unity's high performance, award- ...
  55. [55]
    ARToolKit for Unreal Cheat Sheet | YourDevKit
    ARToolKit for Unreal is a software development kit that allows developers to create augmented reality applications using the Unreal Engine.
  56. [56]
    About ARToolkitX
    As well as artoolkitX, we will maintain a fork of ARToolKit v5.x, provide live binary builds of the software, and actively support the user community via our ...
  57. [57]
    artoolkitX, providing high-performance video acquisition ... - GitHub
    artoolkitX version 1.0 is a software development kit (SDK) consisting of libraries and utilities that help developers implement the foundation of great ...
  58. [58]
    Augmented Reality on the Web in 2019 - SitePen
    May 21, 2019 · ARToolKit is not actively maintained but a fork, artoolkitX, remains active. So the state of support is not currently intuitive or ...
  59. [59]
    qian256/HoloLensARToolKit - GitHub
    Apr 19, 2020 · With fiducial marker tracking provided by ARToolKit, plus the indoor localization of HoloLens, many Augmented Reality applications would be made ...