
Robot software

Robot software encompasses the suite of programs, libraries, tools, and frameworks designed to control and coordinate robotic systems, enabling them to perceive their environment through sensors, process data for decision-making, and execute actions via actuators in real time. This software integrates diverse functionalities such as localization, mapping, path planning, and human-robot interaction, addressing the inherent complexities of operating in dynamic and uncertain settings. At its core, robot software manages asynchronous interactions between multiple processes, ensuring robust communication and scheduling to handle varying temporal requirements, from millisecond-level control loops to long-term task planning. Key components typically include middleware for message passing (e.g., publish-subscribe mechanisms via topics and services), behavioral control layers for reactive actions, executive systems for plan execution, and higher-level planners using techniques like hierarchical task networks. These elements combine general-purpose computing with real-time constraints, demanding bounded resource usage and fault-tolerant designs to ensure physical safety and reliability, particularly in applications like surgery or autonomous driving. Prominent frameworks such as the Robot Operating System 2 (ROS 2) exemplify modern approaches by providing open-source libraries for hardware abstraction, simulation, and algorithm implementation, fostering reusability across platforms from industrial arms to autonomous vehicles. Other architectures, including layered models like Sense-Plan-Act or subsumption hierarchies, have evolved to support multi-robot coordination and adaptability, with developments emphasizing component-based designs for scalability. Despite advances, challenges persist in managing software variability, ensuring performance amid hardware diversity, and incorporating ethical considerations for safe deployment in human-centric environments.

Fundamentals

Definition and Scope

Robot software encompasses the programs, algorithms, and systems that enable robotic platforms to perceive, plan, actuate, and interact with environments, bridging the gap between physical components and task execution. It includes device drivers for low-level hardware interfacing, operating systems tailored for robotic operations, control algorithms for motion, and high-level applications for complex behaviors. Unique to robot software are components designed for real-time constraints and integration of heterogeneous elements, such as real-time operating systems (RTOS) that ensure deterministic timing for safety-critical tasks, data processing pipelines that fuse inputs from cameras, lidars, and IMUs into coherent world models, and control loops that execute precise trajectories via feedback mechanisms like PID controllers. These elements address the complexity of handling uncertain, dynamic environments by combining reactive behaviors with deliberative planning. For instance, perception modules process raw sensor streams to detect obstacles, while actuation systems translate plans into motor commands with minimal latency. The scope of robot software extends across diverse robot types, distinguishing embedded implementations for single physical units, like industrial manipulators focused on repetitive tasks in controlled settings, from cloud-based systems for swarms or teleoperated platforms that coordinate multiple agents over networks. In industrial robots, software emphasizes efficiency and safety in structured environments; service robots prioritize interaction and adaptability in human-occupied spaces; and autonomous robots require robust navigation in unstructured terrains. This breadth highlights robot software's role in enabling applications from manufacturing to healthcare. In the modern context of cloud robotics, robot software increasingly adopts hybrid edge-cloud architectures to balance computational demands, where edge devices handle low-latency processing for perception and control in mobile robots, while cloud servers manage intensive tasks like model training and multi-robot coordination. This approach mitigates latency issues in bandwidth-limited scenarios, enhancing scalability for applications in dynamic environments such as urban delivery or collaborative human-robot teams.

Historical Development

The development of robot software began in the mid-20th century with the advent of industrial automation. In the 1950s, George Devol patented the first programmable robot arm, leading to the deployment of Unimate in 1961 at General Motors' assembly lines, where its proprietary control software managed hydraulic actuators for tasks like die-casting handling and spot welding, using magnetic drum memory for sequential instructions. This marked the initial shift from manual to automated control, though software was rudimentary and hardware-tethered. By the 1970s, innovations like teach pendants (handheld devices for manually guiding robots to record positions) emerged, enabling lead-through programming for repetitive tasks, while offline programming allowed code development away from the factory floor using early computers, reducing downtime in industries like automotive manufacturing. The late 1970s through the 1990s saw the formalization of robot programming languages to enhance flexibility and precision. VAL (Variable Assembly Language) was introduced in 1979 by Unimation for its PUMA robots, providing a structured syntax for motion control and sensor integration in industrial settings, building on earlier research efforts like the Shakey project (1966–1972) at SRI International, which pioneered AI-driven planning software in LISP for mobile navigation and obstacle avoidance. Later, ABB introduced RAPID in 1994 as a high-level language for its IRB series robots, supporting modular routines for motion and I/O handling, which became an industry standard for offline simulation and error handling. These advancements facilitated the integration of sensors and feedback loops, transitioning robot software from simple sequencers to more adaptive systems. The 2000s ushered in the open-source era, democratizing access for research and development. The Player Project, launched in 2000 by Brian Gerkey and others at the University of Southern California, provided a network server interface for heterogeneous robot hardware, enabling code reuse in academic labs for mobile and multi-robot applications. This was followed by the Robot Operating System (ROS) in 2007, initiated at Stanford and developed by Willow Garage, which offered a distributed framework for task orchestration, simulation, and hardware abstraction, fostering collaboration in fields like service robotics. Key events like the DARPA Grand Challenge (2004) and Urban Challenge (2007), along with later efforts such as the DARPA Robotics Challenge (2012–2015) and Subterranean Challenge (2018–2021), accelerated autonomy software by emphasizing perception, mapping, and decision-making algorithms, with winners like Stanford's Stanley relying on integrated software stacks for real-time navigation. From the 2010s onward, robot software evolved toward reliability and intelligence. ROS 2, released in 2017, addressed real-time constraints and multi-robot coordination with DDS middleware for deterministic communication, improving safety in industrial settings and autonomous vehicles. The breakthrough of deep learning, exemplified by AlexNet in 2012, spurred integration of neural networks for vision and manipulation, enabling end-to-end learning in robots like those in Google's DeepMind projects post-2015. The COVID-19 pandemic from 2020 to 2022 accelerated service robot software, with frameworks adapted for disinfection and delivery tasks, boosting adoption of AI for dynamic environments in healthcare and logistics. As of 2025, ongoing developments preview enhancements in ROS distributions for better multi-robot orchestration, while research explores quantum-inspired algorithms for path optimization in swarm robotics, though widespread adoption remains nascent.

Software Architecture

Low-Level Control Systems

Low-level control systems form the foundational layer of robot software, directly interfacing with hardware to ensure precise, timely actuation of physical components. These systems handle essential tasks such as reading sensor data, commanding actuators, and implementing feedback mechanisms to maintain stability and accuracy in motion. Device drivers serve as the primary hardware interface, translating high-level commands into low-level signals for actuators like motors and servos, while capturing inputs from sensors such as encoders, IMUs, and force-torque sensors to enable closed-loop control. A cornerstone of these systems is the PID controller, which regulates robot motion by minimizing the error between desired and actual states through feedback loops. The controller computes the control output u(t) as follows: u(t) = K_p e(t) + K_i \int_0^t e(\tau) \, d\tau + K_d \frac{de(t)}{dt} where e(t) is the error signal, and K_p, K_i, K_d are tunable gains for the proportional, integral, and derivative terms, respectively. This scheme is widely applied in robotics for tasks like trajectory tracking in manipulators and velocity control in mobile robots, providing robust performance even in nonlinear environments. Real-time requirements are paramount in low-level control, necessitating deterministic execution to achieve sub-millisecond response times for interrupt handling and task scheduling. Real-time operating systems (RTOS) like FreeRTOS and VxWorks address this by prioritizing tasks, managing interrupts from sensors and actuators, and ensuring predictable latencies critical for safety in dynamic operations. FreeRTOS, with its lightweight kernel, supports multitasking in resource-constrained embedded robotics, while VxWorks offers certified reliability for high-stakes applications through priority-based preemptive scheduling. Hardware abstraction in these systems is achieved via firmware running on microcontrollers, which encapsulates low-level hardware specifics to simplify integration. Platforms like Arduino-based boards and ARM microcontrollers host firmware that manages peripheral I/O, PWM generation for motor control, and analog-to-digital conversion for sensor signals, enabling portable code across hardware variants. Kinematic models, parameterized using Denavit-Hartenberg (DH) conventions, further abstract joint configurations for forward and inverse kinematics calculations, defining link lengths, twists, and offsets to compute end-effector poses from joint angles. The DH parameters originated from a 1955 formulation for serial manipulators, remaining a standard for kinematic modeling in control software. As of 2025, advancements in edge computing have enhanced low-level control for autonomous robots, enabling on-device processing to minimize latency in sensing and actuation tasks. Integration of edge platforms like NVIDIA Jetson allows direct execution of control loops near actuators, reducing communication delays to microseconds for real-time adaptation in unstructured environments. Complementing this, neuromorphic chips emulate spiking neural networks to optimize power in control loops, achieving significant reductions in energy consumption, up to 25-fold for certain AI inference tasks compared to conventional GPU-based architectures, by processing only event-driven data from sensors. These chips, such as IBM's NorthPole, facilitate efficient feedback control in battery-limited humanoids, supporting prolonged operation without compromising responsiveness.
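The PID law above can be sketched as a discrete-time control loop. The class below is a minimal illustration, not a production controller; the gains, timestep, and first-order plant in the usage example are invented values, not tuned for any particular robot.

```python
class PIDController:
    """Discrete-time PID controller implementing u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0       # running approximation of the integral term
        self.prev_error = None    # needed for the finite-difference derivative

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Illustrative closed loop: drive a simple integrating plant toward setpoint 1.0.
pid = PIDController(kp=1.0, ki=0.1, kd=0.05, dt=0.01)
position = 0.0
for _ in range(2000):
    position += pid.update(1.0, position) * 0.01   # plant: velocity command integrates to position
```

On a real robot this loop would run at a fixed rate under an RTOS, with the measurement coming from an encoder driver and the output sent to a motor driver.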

Middleware and Frameworks

Middleware in robot software serves as an intermediary layer that enables communication and coordination among distributed components, abstracting complexities and facilitating modular development. It acts as a bridge for heterogeneous systems, handling tasks such as message passing, service discovery, and data serialization to support scalable robotic applications. A primary function is to provide message transport for inter-node interactions, often through publish-subscribe models that decouple producers from consumers, ensuring efficient data exchange in distributed environments. This abstraction promotes portability across platforms, allowing developers to focus on high-level behaviors rather than low-level integrations. Among the most widely adopted frameworks, the Robot Operating System (ROS) and its successor ROS 2 stand out for their comprehensive support of modular robot software. In ROS 2, the system decomposes applications into independent nodes, each handling a specific function like sensor processing or actuation, connected via a graph of topics for asynchronous messaging and services for synchronous request-response interactions. Underlying this is the Data Distribution Service (DDS) standard, which enables real-time publish-subscribe communication with quality-of-service policies for reliability and latency control, making it suitable for safety-critical applications. Yet Another Robot Platform (YARP) emphasizes peer-to-peer communication for research-oriented systems, allowing devices and modules to connect dynamically through extensible protocols that maintain loose coupling. It supports a variety of connection types, from simple name-based addressing to advanced streaming, fostering reusability in humanoid and multi-robot setups. The Orocos Real-Time Toolkit (RTT), in contrast, targets hard real-time control, providing a C++ framework for deploying distributed components with built-in support for task scheduling and execution management in time-constrained environments. Robot middleware architectures often employ layered models to organize data flow, with lower layers managing transport and discovery, and upper layers handling application-specific logic such as perception-action cycles that iteratively process sensory inputs to generate behaviors. Data flow graphs, implemented via topic-based routing in frameworks like ROS 2, enable efficient sensor fusion by directing streams from multiple sources, such as cameras and lidars, into unified processing pipelines, reducing bottlenecks in distributed setups. As of 2025, advancements include extensions in recent ROS 2 releases, such as the long-term support Jazzy Jalisco (2024) and the short-term Kilted Kaiju (2025), which incorporate federation mechanisms for cloud integration, allowing seamless scaling of robotic fleets across edge and remote resources. Emerging frameworks like Zenoh further enhance IoT-robot interoperability, leveraging pub-sub over UDP for sub-millisecond latencies in 5G-enabled networks, as demonstrated in multi-robot swarm deployments. These developments prioritize low-latency communication for resilient, decentralized operations in industrial and autonomous settings.
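The topic-based decoupling described above can be illustrated with a minimal in-process sketch. This is not ROS 2 or DDS, just a toy bus showing how publishers and subscribers interact only through topic names; the "/scan" topic and message shape are hypothetical.

```python
from collections import defaultdict


class TopicBus:
    """Toy publish-subscribe bus: publishers and subscribers share only a topic name."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # The publisher never holds a reference to its consumers (loose coupling).
        for callback in self.subscribers[topic]:
            callback(message)


bus = TopicBus()
received = []
# A hypothetical obstacle-avoidance node subscribes to laser scans.
bus.subscribe("/scan", lambda msg: received.append(msg))
# A hypothetical lidar driver publishes without knowing who listens.
bus.publish("/scan", {"ranges": [1.2, 0.8, 2.5]})
```

Real middleware adds what this sketch omits: serialization, network transport, discovery, and quality-of-service policies such as reliability and deadline enforcement.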

Programming Languages and Paradigms

Industrial Robot Languages

Industrial robot languages are proprietary or standardized programming systems developed specifically for controlling industrial manipulators, prioritizing precision, repeatability, and integration with manufacturing processes. These languages enable operators to define robot motions, sensor interactions, and task sequences tailored to high-volume production environments, often combining imperative programming with robot-specific commands for path control and synchronization. Unlike general-purpose languages, they emphasize deterministic execution to ensure predictable cycle times and accuracy in tasks like material handling. Prominent examples include RAPID, developed by ABB in the 1990s as a high-level, structured language for programming robot tasks and motions. RAPID supports modular programming with routines for manipulation and handling, facilitating complex sequences in production settings. KUKA's KRL, introduced in the 1990s, draws from Pascal and incorporates motion primitives such as point-to-point (PTP), linear (LIN), and circular (CIRC) movements to define trajectories efficiently. Fanuc's TP (Teach Pendant) programming, a teach-pendant-based system, focuses on manual teaching modes like T1 for slow-speed programming and recording of positions, allowing intuitive setup without extensive coding. Key features of these languages include integration with offline tools for virtual testing, such as ABB's RobotStudio, which allows code validation without halting production lines. Cycle time optimization is achieved through instructions that minimize idle periods and blend motions seamlessly, as in KRL's support for continuous path execution. They also enable multi-axis coordination, including path planning with interpolation to generate smooth trajectories across six or more axes, ensuring precise synchronization in coordinated systems. The evolution of industrial robot languages has shifted toward standardized, XML-based formats in the 2000s, exemplified by PLCopen's specifications, which define reusable function blocks for motion control across vendors. By 2008, PLCopen's XML formats enabled exchange of program data and libraries compliant with IEC 61131-3, reducing vendor lock-in. Recent developments include 2025 integrations with OPC UA under initiatives like VDMA's OPC UA for Robotics companion specification, enhancing Industry 4.0 interoperability by standardizing robot data exchange for real-time monitoring and control in smart factories. These languages are widely applied in arc welding, where precise arc control and seam tracking ensure consistent quality, and in assembly, supporting pick-and-place operations with high repeatability. However, their manufacturer-specific nature limits flexibility, creating challenges in code reusability and adaptation to non-standard tasks compared to more versatile general-purpose options.

General-Purpose and Specialized Languages

General-purpose programming languages form the backbone of much modern robot software development, offering flexibility for research, prototyping, and deployment across diverse robotic applications such as service robots and autonomous systems. Python stands out for its simplicity and extensive ecosystem, enabling developers to handle tasks like perception, planning, and hardware integration with minimal boilerplate code. For instance, libraries like PyBullet provide interfaces for physics-based simulation and reinforcement learning in robotics, allowing users to model robots and their interactions efficiently without deep low-level programming. This language's interpreted nature supports rapid iteration, making it ideal for academic and experimental settings where quick adjustments to algorithms for perception or control are essential. C++ complements Python by addressing performance bottlenecks in robotics, particularly in environments demanding low-latency control loops and efficient resource management. In the Robot Operating System (ROS), C++ is commonly used to implement nodes for perception, motion planning, and hardware interfaces, where its compiled efficiency ensures deterministic execution critical for safety in dynamic scenarios. MATLAB and Simulink extend this toolkit for modeling complex systems, with Stateflow enabling the design of finite state machines (FSMs) to represent behavioral logic in robot controllers, such as switching between modes based on environmental inputs. These tools integrate seamlessly with hardware-in-the-loop simulations, facilitating the verification of algorithms before physical deployment. Specialized languages and variants build on these foundations to target niche domains, enhancing safety, portability, and integration. LISP played a pivotal role in early AI-driven robotics, powering planning and reasoning in the Shakey robot project from 1966 to 1972, where it supported symbolic manipulation for goal-directed behavior in unstructured environments. Java finds application in Android-based mobile robots, leveraging its object-oriented structure for cross-platform apps that control locomotion and user interfaces, as seen in SDKs for commercial humanoid platforms. Emerging in 2025, Rust is gaining traction in robotics due to its memory safety and concurrency features, which prevent common errors in multi-threaded systems for sensor data handling and actuator control without runtime overhead. Trends like WebAssembly (WASM) are enabling browser-based robotics development, compiling ROS 2 components to run efficiently in web environments for remote simulation and teleoperation. The advantages of these languages lie in their cross-platform portability and rich ecosystems, which accelerate development in heterogeneous robotic setups. Python's NumPy library, for example, optimizes array operations for processing sensor data like point clouds, providing near-C-level performance through vectorized computations while maintaining readability. This ecosystem supports seamless integration with tools for computer vision and machine learning, reducing development time for service robot applications. In contrast to domain-specific industrial languages, general-purpose options like C++ offer hardware independence, allowing code reuse across simulators and real hardware. Examples include URScript, a lightweight scripting extension for Universal Robots cobots that embeds directly into the controller runtime for precise trajectory scripting without external compilers. Similarly, Lua serves as a scripting layer in systems like ArduPilot, enabling on-the-fly customization of flight behaviors with minimal resource footprint for embedded autopilots.
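NumPy's vectorized style can be shown with a small point-cloud filter: a boolean mask replaces an explicit Python loop over points. The four points and the 2 m range threshold below are invented for the example.

```python
import numpy as np

# Hypothetical point cloud: N x 3 array of (x, y, z) points from a depth sensor, in meters.
points = np.array([
    [0.5, 0.0, 0.2],
    [4.0, 1.0, 0.1],
    [1.0, -0.5, 0.3],
    [6.0, 2.0, 0.0],
])

# Euclidean distance of every point from the sensor origin, computed in one call.
distances = np.linalg.norm(points, axis=1)

# Vectorized range filter: keep only points within 2 m, via a boolean mask.
nearby = points[distances < 2.0]
```

The same mask-based pattern scales to clouds of millions of points, which is why NumPy (and GPU analogues of it) underpins much Python-side sensor processing.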

Visual and Alternative Paradigms

Visual programming languages enable robot developers, particularly non-experts, to construct control logic through intuitive graphical interfaces rather than traditional text-based code. Blockly, developed by Google, facilitates drag-and-drop block assembly to generate executable code for robotic applications, such as those integrated with the Robot Operating System (ROS). Similarly, Scratch, from the MIT Media Lab, supports robotics extensions like those for LEGO Mindstorms or micro:bit, allowing users to sequence robot behaviors, such as movement and sensor responses, via interlocking blocks that abstract underlying syntax. These tools democratize robot programming by emphasizing visual composition, reducing errors from syntax issues, and promoting accessibility in educational and hobbyist settings. LabVIEW, from National Instruments, extends visual paradigms through dataflow programming, where graphical nodes represent functions connected by wires to depict signal flow, particularly suited for vision-based robotic tasks like image acquisition and processing. In robotics modules, it integrates with hardware drivers to handle real-time camera feeds, enabling applications such as object detection without manual code wiring. Alternative paradigms shift from linear scripting to modular structures for handling complex, dynamic robot behaviors. Behavior trees (BTs) provide a hierarchical, reactive control framework for tasks like navigation, where nodes represent actions, conditions, or sequences that execute based on runtime priorities, offering modularity over traditional finite state machines. Finite state machines (FSMs), meanwhile, model task sequencing as discrete states with defined transitions triggered by events or sensors, commonly used in introductory robotics for predictable workflows such as patrol routes or pick-and-place operations. Parallel programming paradigms address concurrency in robot systems by enforcing structured communication. Esterel, a synchronous language, models reactive behaviors as instantaneous reactions to inputs in lockstep with a global clock, ideal for safety-critical robotic controllers such as those for navigation. Occam, based on communicating sequential processes (CSP), uses channels for point-to-point message passing to manage concurrency on multi-core processors, enabling safe parallelism in distributed robotic control without race-condition risks. By 2025, advancements in visual programming incorporate augmented reality (AR) and virtual reality (VR) for immersive robot design. No-code platforms like Node-RED further simplify robot orchestration through flow-based wiring of nodes for event-driven automation, such as integrating sensors and actuators in humanoid robots without scripting.
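An FSM of the kind described above can be written as a simple transition table: states are entries, events trigger transitions. The patrol robot, its states, and its events below are hypothetical, chosen only to mirror the patrol-route example.

```python
# Transition table for a hypothetical patrol robot:
# (current_state, event) -> next_state
TRANSITIONS = {
    ("patrol", "obstacle_detected"): "avoid",
    ("avoid", "path_clear"): "patrol",
    ("patrol", "battery_low"): "dock",
    ("avoid", "battery_low"): "dock",
}


def step(state, event):
    """Return the next state; unrecognized events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)


# A short event sequence: detour around an obstacle, then go charge.
state = "patrol"
state = step(state, "obstacle_detected")   # patrol -> avoid
state = step(state, "path_clear")          # avoid  -> patrol
state = step(state, "battery_low")         # patrol -> dock
```

The table form makes the predictability noted above concrete: every reachable behavior is enumerable, which is also why FSMs are easy to verify but grow unwieldy compared with behavior trees as tasks multiply.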

Application Software

Simulation and Development Tools

Simulation and development tools in robot software provide virtual environments for prototyping, testing, and debugging robotic systems, minimizing reliance on physical hardware and enabling rapid iteration in complex scenarios. These tools simulate physics, sensors, and actuators to mimic real-world behaviors, allowing developers to validate algorithms before deployment. Key examples include Gazebo, an open-source 3D simulator tightly integrated with the Robot Operating System (ROS) for realistic physics-based rendering and dynamic environments described in the Simulation Description Format (SDF). Webots, a multi-platform desktop application, excels in multi-robot simulations, supporting diverse robot types such as wheeled, legged, and aerial vehicles in indoor or outdoor settings. CoppeliaSim, the successor to the Virtual Robot Experimentation Platform (V-REP), offers versatile kinematics simulation for mechanisms ranging from simple arms to redundant or closed-chain systems, maintaining full compatibility with its predecessor. Core features of these tools emphasize fidelity in virtual testing through robust physics engines; for instance, Gazebo utilizes the Open Dynamics Engine (ODE) for accurate rigid body interactions and contact responses, while CoppeliaSim primarily uses Bullet but supports ODE and other engines such as Vortex and Newton. Sensor emulation is equally critical, with support for devices like lidar via ray-tracing algorithms that generate point clouds from simulated laser scans, as implemented in Gazebo's sensor models and CoppeliaSim's volumetric proximity and vision sensors for precise distance and image processing. Additionally, hardware-in-the-loop (HIL) testing bridges simulation and reality by connecting physical controllers or sensors to virtual models, enabling validation of embedded systems under simulated dynamics, as demonstrated in frameworks like NVIDIA's Jetson-integrated HIL setups for autonomous systems. Development workflows leverage standard software practices tailored to robotics, such as version control with Git for managing robot code repositories, including URDF models and launch files, to facilitate collaborative prototyping. Integrated development environments (IDEs) like Visual Studio Code enhance efficiency through extensions that automate ROS workspace setup, provide syntax support for message files (.msg) and robot descriptions (URDF/Xacro), and support debugging of C++ or Python nodes via launch configurations. By 2025, advancements have introduced AI-accelerated simulations, notably NVIDIA Isaac Sim, an open-source framework built on Omniverse that leverages GPU-accelerated physics engines like PhysX for high-fidelity testing of AI-driven robots, including reinforcement learning for autonomous behaviors. This tool supports scalable cloud-based testing through container deployments on platforms like AWS EC2 via the AWS Marketplace, allowing developers to run parallel simulations on GPU instances without local hardware constraints.

AI and Autonomy Integration

AI and autonomy integration in robot software encompasses the high-level algorithms and frameworks that enable robots to perceive their environment, make decisions, and learn from interactions, transforming them from scripted machines into intelligent agents. These components build upon lower-level control layers by incorporating machine learning and planning techniques to handle uncertainty, adapt to dynamic settings, and execute complex tasks autonomously. Key to this integration is the use of modular software stacks that process sensory data into actionable insights, supporting applications from industrial manipulation to exploratory navigation. Core modules form the foundation of intelligent behaviors, starting with computer vision for perception. Libraries like OpenCV facilitate real-time object detection in robotic systems, enabling identification of targets through techniques such as contour analysis and feature matching, as demonstrated in agricultural robots where they process camera feeds to locate manipulable items. Path planning algorithms, such as the A* search, compute optimal trajectories by evaluating nodes in a graph representation of the environment; the cost function is defined as f(n) = g(n) + h(n), where g(n) is the exact cost from the start to node n, and h(n) is a heuristic estimate of the cost to the goal, ensuring efficient navigation around obstacles. For mapping unknown spaces, simultaneous localization and mapping (SLAM) systems like Google's Cartographer fuse lidar and inertial data to construct 2D or 3D maps while localizing the robot, achieving loop closure for consistent global representations in real-time indoor environments. AI frameworks enhance decision-making and learning capabilities within these modules. Deep learning libraries such as TensorFlow and PyTorch support models for tasks like robotic grasping, where convolutional networks predict grasp poses from depth images, improving success rates in cluttered scenes through end-to-end training on synthetic datasets. Reinforcement learning tools, including Stable Baselines, implement algorithms like Proximal Policy Optimization (PPO) to develop navigation policies, allowing mobile robots to learn obstacle avoidance and goal-reaching behaviors in simulated-to-real transfer scenarios. These frameworks integrate seamlessly with robot operating systems, enabling scalable deployment of policies that adapt to varying terrains. Autonomy levels in robot software range from reactive to deliberative paradigms, with multi-agent extensions for collaborative operations. Reactive approaches, exemplified by the subsumption architecture, layer simple behaviors like collision avoidance that suppress higher-level plans when needed, prioritizing immediate responses in unpredictable settings as pioneered in early mobile robots. Deliberative methods employ symbolic planning, such as STRIPS, which represents the world as predicates and actions to generate sequences achieving goals, like assembling parts in manufacturing. For swarm robotics, multi-agent systems coordinate decentralized agents using consensus algorithms, enabling collective tasks such as search-and-rescue formations through local communication and emergent behaviors. As of 2025, emerging trends emphasize multimodal AI for enhanced interfaces and efficiency. Large language models (LLMs) like GPT-4o are integrated into robot control stacks via ROS 2 bridges, allowing natural language commands to generate motion plans or task sequences, as seen in voice-enabled industrial manipulators that interpret spoken instructions for assembly. Edge AI with TinyML deploys lightweight neural networks on microcontrollers for on-device inference, reducing latency in resource-constrained robots like drones performing real-time obstacle avoidance without cloud dependency. These advancements prioritize low-power, interpretable intelligence to broaden deployment in edge environments.
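The f(n) = g(n) + h(n) cost function can be demonstrated with a minimal grid-based A* sketch. The 4-connected occupancy grid, unit step costs, and Manhattan-distance heuristic are simplifying assumptions; real planners work over roadmaps or costmaps with richer cost models.

```python
import heapq


def astar(grid, start, goal):
    """A* on a 4-connected grid of 0 (free) / 1 (occupied) cells; returns a path or None."""

    def h(n):
        # Manhattan distance: admissible (never overestimates) for unit-cost 4-connectivity.
        return abs(n[0] - goal[0]) + abs(n[1] - goal[1])

    # Priority queue of (f, g, node, path); popping the smallest f expands the best candidate.
    open_set = [(h(start), 0, start, [start])]
    visited = set()
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == 0:
                heapq.heappush(open_set, (g + 1 + h((r, c)), g + 1, (r, c), path + [(r, c)]))
    return None


# Small map with a wall forcing a detour on the right side.
grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
path = astar(grid, (0, 0), (2, 0))
```

Because the heuristic is admissible, the first time the goal is popped the path is cost-optimal; with h(n) = 0 the same code degenerates to Dijkstra's algorithm.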

Safety and Standards

Safety-Critical Features

Safety-critical features in robot software encompass mechanisms engineered to mitigate risks, detect failures, and ensure operational integrity, particularly in environments where human interaction or high-stakes tasks are involved. These features prioritize immediate hazard prevention through hardware-software integration, such as emergency stop logic, which halts motion upon detecting unsafe conditions via dedicated input circuits wired in a normally closed configuration so that any break in the circuit triggers a stop. Collision avoidance algorithms, including velocity obstacle (VO) methods, compute safe velocity sets for robots by modeling obstacles as forbidden velocity regions, enabling dynamic path replanning to prevent impacts with static or moving objects. Fault-tolerant designs incorporate redundancy in software architectures, such as duplicate control loops or neural network-based kinematic reconfiguration, allowing continued operation despite sensor or actuator failures by redistributing tasks across backup modules. Integration of international standards like ISO 10218-1:2025 ensures these features align with verified requirements for industrial robots, mandating protective measures such as speed and separation monitoring to limit hazards during human-robot collaboration. Real-time health monitoring is facilitated by watchdog timers embedded in robot software, which reset the system if periodic "kicks" from the main program fail to arrive due to hangs or faults, thereby upholding deterministic execution in time-critical applications. Advanced techniques include formal verification using tools like UPPAAL for timed automata representations of robot behaviors, confirming properties such as collision-free trajectories against real-time constraints. For AI-integrated systems, runtime assurance employs barrier certificates to enforce safety invariants, certifying that neural controllers remain within provably safe operational envelopes during execution. Recent advancements as of 2025 emphasize machine learning-based anomaly detection in robot software, where unsupervised algorithms analyze system logs to identify deviations indicative of faults, enhancing predictive safety in autonomous operations. In collaborative robots (cobots), software like Universal Robots' PolyScope implements force/torque limiting through embedded sensors and control modes, capping interaction forces below 150 N to comply with ISO standards and prevent injury in shared workspaces. These features collectively enable scalable, reliable robot deployments by balancing performance with rigorous safeguards.
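The watchdog pattern described above can be sketched in a few lines. This is a software illustration only: real watchdogs are hardware timers or RTOS services that forcibly reset or safe-stop the system, and the 50 ms timeout here is an invented value.

```python
import time


class Watchdog:
    """Software watchdog: trips if the control loop fails to 'kick' within a deadline."""

    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.last_kick = time.monotonic()
        self.tripped = False

    def kick(self):
        # Called once per healthy control-loop iteration.
        self.last_kick = time.monotonic()

    def check(self):
        """Polled by a supervisor; latches tripped once the deadline is missed."""
        if time.monotonic() - self.last_kick > self.timeout_s:
            self.tripped = True   # a real system would cut actuator power / engage brakes here
        return self.tripped
```

The key property is that the safe action depends on the *absence* of a signal, so a hung or crashed control loop cannot prevent the stop from being triggered.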

Compliance and Ethical Considerations

Robot software development must adhere to international standards that ensure safety and reliability, particularly in collaborative and autonomous applications. The ISO/TS 15066 standard provides safety requirements for collaborative industrial robot systems, specifying limits on force, pressure, and speed to minimize risks during human-robot interaction. Similarly, the EU (Regulation (EU) 2024/1689) classifies certain robot software as high-risk if it poses significant threats to health, safety, or fundamental rights, mandating conformity assessments, risk management systems, and transparency obligations for providers. In the United States, the National Institute of Standards and Technology (NIST) offers frameworks such as the Performance Assessment Framework for Robotic Systems, which includes test methods for evaluating perception, mobility, and autonomy to support standardized validation of robot software performance. Compliance with these standards involves rigorous certification processes to verify software integrity. For instance, the standard defines Safety Integrity Levels (SIL) from 1 to 4, quantifying the reliability of safety functions in electrical, electronic, and programmable systems, including robot software, with higher levels requiring more stringent development and verification procedures. For open-source robot software, such as components in the (ROS), auditing processes identify vulnerabilities, license obligations, and security risks using tools like to ensure regulatory alignment before deployment. Ethical considerations in robot software extend beyond technical compliance to address societal impacts, particularly in AI-integrated systems. in models used for perception or can lead to unfair outcomes, such as discriminatory navigation in diverse environments; mitigation strategies employ fairness metrics like demographic parity, which measures whether predictions are independent of protected attributes like or across groups. 
In cloud-based robotic systems, where sensor data is processed remotely, compliance with the General Data Protection Regulation (GDPR) requires data minimization, pseudonymization, and explicit consent mechanisms to protect personal data collected via sensors.

As of 2025, emerging global standards are shaping the ethical deployment of advanced robot software, especially for humanoid systems. The IEEE Humanoid Study Group has outlined a framework for future standards, emphasizing safety in human-robot interaction and ethical design principles to address gaps in existing norms. Additionally, ethical guidelines advocate for transparent decision logs in robot software, enabling auditability of AI decision-making processes to build trust and accountability, as promoted by initiatives from organizations such as IEEE and aligned with broader explainable AI practices.
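The data-minimization and auditability requirements above can be combined in one pattern: log each autonomous decision with a pseudonymized subject reference instead of raw personal data. The sketch below is illustrative only; the salt handling, field names, and `log_decision` helper are assumptions, not a prescribed GDPR-compliance mechanism.

```python
import hashlib
import time

# Hypothetical per-deployment salt; under GDPR Art. 4(5) the key enabling
# re-identification must be kept separately from the pseudonymized data.
SALT = b"rotate-me-per-deployment"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a salted, truncated hash."""
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()[:16]

def log_decision(log, person_id, decision, model_version, inputs_summary):
    """Append an auditable record of an autonomous decision without
    storing raw personal data (data minimization)."""
    log.append({
        "timestamp": time.time(),
        "subject": pseudonymize(person_id),   # no raw identifier stored
        "decision": decision,
        "model_version": model_version,       # supports later audit/replay
        "inputs": inputs_summary,             # derived summary, not raw video
    })

audit_log = []
log_decision(audit_log, "badge-4712", "yield_to_pedestrian",
             "perception-v2.3", {"min_distance_m": 1.8})
```

Keeping the model version and an input summary in each record is what makes the log useful for after-the-fact explainability, while the salted hash keeps the record outside the scope of raw identifier storage.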
