
Data acquisition

Data acquisition (DAQ) is the process of measuring electrical or physical phenomena—such as voltage, current, temperature, pressure, or sound—and converting these measurements into digital form for processing, analysis, or storage by a computer. A complete DAQ system typically comprises three primary elements: sensors to detect and transduce real-world signals into electrical forms, measurement hardware for signal conditioning and analog-to-digital conversion, and software for controlling the acquisition and manipulating the data. This integration enables precise capture of analog signals from multiple sources with minimal information loss, facilitating applications in testing, monitoring, and control. In engineering and scientific contexts, DAQ systems play a critical role in interfacing with sensors to gather real-world data, often involving stages like amplification, filtering, and isolation to optimize signals for analog-to-digital conversion (ADC). The ADC component samples the conditioned analog input at specified rates and resolutions, producing digital outputs that can range from 8 bits to over 24 bits, depending on the required precision for phenomena like vibration or acoustic measurements. Modern DAQ hardware supports various interfaces, including USB, Ethernet, and PCI Express, allowing scalability from portable devices to modular systems like CompactDAQ with over 70 module options. DAQ finds widespread use in industries such as aerospace, automotive testing, medical devices, and manufacturing, where it supports real-time data processing, hardware-in-the-loop simulations, and automated test systems to validate designs and detect defects early. Key considerations in DAQ design include sampling rates (ranging from a few samples per second to millions), channel count for multichannel setups, and noise-mitigation techniques to ensure data integrity in harsh environments. Programmable software, such as LabVIEW or NI FlexLogger, enhances usability by enabling custom configurations, data visualization, and integration with analysis tools, making DAQ indispensable for research and industrial automation.

Fundamentals

Definition and principles

Data acquisition (DAQ), also known as data collection or signal acquisition, is the process of sampling signals that measure real-world physical phenomena—such as voltage, current, temperature, or motion—and converting the resulting analog signals into a format suitable for processing, storage, and analysis by computers or other digital systems. This conversion is typically achieved using analog-to-digital converters (ADCs), which discretize continuous signals in both time and amplitude domains to enable computational manipulation. The primary goal of DAQ is to faithfully capture and represent physical events with sufficient fidelity to support accurate interpretation, ensuring that the digital representation mirrors the original analog input without significant loss of information. At the core of DAQ principles lies the Nyquist-Shannon sampling theorem, which establishes the minimum sampling rate required to accurately reconstruct a continuous-time signal from its discrete samples. According to this theorem, the sampling frequency f_s must be at least twice the highest frequency component f_{\max} in the signal to prevent aliasing, a distortion where higher frequencies masquerade as lower ones. The criterion is thus given by: f_s \geq 2 f_{\max} where f_{\max} is the maximum frequency of interest in the signal. Violating this principle leads to irreversible information loss, underscoring the theorem's foundational role in designing effective DAQ systems. Complementing sampling accuracy is the signal-to-noise ratio (SNR), a key metric quantifying the strength of the desired signal relative to background noise, expressed in decibels as \text{SNR} = 10 \log_{10} \left( \frac{P_{\text{signal}}}{P_{\text{noise}}} \right). A higher SNR enhances measurement precision and reduces errors in digital representation, making it essential for reliable DAQ in noisy environments.
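The two quantities above reduce to simple arithmetic; a minimal sketch in Python (the helper names are illustrative, not part of any DAQ API):

```python
import math

def snr_db(p_signal, p_noise):
    """Signal-to-noise ratio in decibels from signal and noise power."""
    return 10 * math.log10(p_signal / p_noise)

def min_sampling_rate(f_max):
    """Minimum sampling rate per the Nyquist-Shannon theorem."""
    return 2 * f_max

# A 1 kHz signal needs at least 2 kHz sampling to avoid aliasing.
print(min_sampling_rate(1000))   # 2000
# A signal with 100x the noise power has an SNR of 20 dB.
print(snr_db(100.0, 1.0))        # 20.0
```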
Basic components of a DAQ system begin with sensors and transducers, which interface directly with the physical world by detecting phenomena and generating corresponding electrical signals. Sensors convert non-electrical quantities, like temperature or pressure, into measurable forms, while transducers broadly encompass devices that transform energy from one form to another, often producing an initial analog output. These signals then follow initial paths involving basic conditioning to amplify or filter them before digitization, ensuring compatibility with downstream ADCs without delving into specific amplification techniques. This high-level architecture forms the bridge between analog reality and digital processing, prioritizing signal integrity from the outset.
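As an illustration of a transducer model, the sketch below assumes an idealized linear thermocouple with a nominal sensitivity of roughly 41 µV/°C (an assumed type K approximation; real devices need polynomial conversion with cold-junction compensation):

```python
# Assumed nominal sensitivity for an idealized type K junction.
SEEBECK_UV_PER_C = 41.0

def thermocouple_voltage_uv(t_hot_c, t_cold_c):
    """Approximate output voltage (µV) from the junction temperature difference."""
    return SEEBECK_UV_PER_C * (t_hot_c - t_cold_c)

def temperature_from_voltage(v_uv, t_cold_c):
    """Invert the linear model to recover the hot-junction temperature."""
    return t_cold_c + v_uv / SEEBECK_UV_PER_C

v = thermocouple_voltage_uv(100.0, 25.0)   # 75 °C difference -> 3075 µV
print(v, temperature_from_voltage(v, 25.0))
```

The microvolt-level output motivates the amplification stage discussed above: the raw signal must be boosted by orders of magnitude before it spans a typical ADC input range.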

Applications

Data acquisition systems play a pivotal role in industrial settings, particularly for real-time monitoring and predictive maintenance in manufacturing processes. In machinery condition monitoring, these systems capture high-frequency vibration signals from rotating equipment to detect early faults such as imbalances or bearing wear, enabling timely interventions that reduce downtime and extend equipment life. For instance, low-cost data acquisition setups integrated with accelerometers allow for continuous vibration monitoring in production lines, where trend data helps optimize performance and prevent catastrophic failures. In chemical plants, data acquisition supports process control by collecting parameters like temperature, pressure, and flow rates, facilitating automated adjustments to maintain product quality and safety during reactions. Wireless sensor-based acquisition technologies further enhance this by providing scalable data collection for equipment status, improving overall plant reliability. In scientific research, data acquisition is essential for geophysical monitoring, where systems gather seismic data to assess geological stability and predict natural events. Integrated approaches using multi-sensor networks enable continuous surveillance of sites prone to earthquakes, collecting time-series data for correlation with environmental factors. Automated PC-based systems, for example, process seismic events in real time for ground-motion studies, supporting hazard mitigation in vulnerable areas. Biomedical applications leverage data acquisition for non-invasive signal capture, such as electrocardiogram (ECG) monitoring in medical devices, which records cardiac electrical activity to diagnose arrhythmias and support telemedicine. Portable systems that simultaneously acquire and analyze ECG alongside other vital signals exemplify this, providing high-fidelity data for clinical decision-making in wearable monitors. Beyond industry and science, data acquisition underpins testing in automotive and aerospace domains.
In automotive testing, on-vehicle systems collect engine performance metrics like fuel consumption, emissions, and temperature during road tests, ensuring compliance with efficiency standards and identifying design flaws. High-performance data loggers, for instance, enable precise synchronization of multiple parameters to evaluate fuel systems and drivetrains under real-world conditions. In aerospace, flight data recorders (FDRs) acquire critical parameters such as altitude, speed, and control inputs to reconstruct incidents and enhance safety protocols. Compact units like the Data Acquisition Flight Recorder (DAFR) combine voice and parametric recording, meeting regulatory requirements for crash survivability and post-flight analysis. Emerging applications extend data acquisition into interconnected ecosystems, notably sensor networks for smart cities, where distributed devices collect urban data on traffic, pollution, and utilities to enable responsive infrastructure management. These networks process inputs from environmental sensors to optimize city services, such as dynamic lighting or traffic-signal timing, fostering sustainable urban growth. In renewable energy, wind turbine monitoring relies on supervisory control and data acquisition (SCADA) systems to track blade vibrations, power output, and structural integrity, predicting maintenance needs in remote offshore farms. Best practices from distributed monitoring highlight how such systems integrate multi-source data to minimize operational disruptions and maximize energy yield.

Historical Development

Early innovations

The origins of data acquisition trace back to the 19th century, when manual methods dominated the recording of physical phenomena, particularly in scientific experimentation. In physiology, the kymograph, invented by physiologist Carl Ludwig in 1847, represented a pivotal mechanical instrument that automated the logging of variables such as blood pressure and muscle contractions onto rotating smoked-paper drums. This device allowed for continuous, graphical representation of time-varying signals, enabling researchers to analyze dynamic processes without constant manual intervention, and it laid the groundwork for systematic data recording in experimental sciences. Entering the early 20th century, innovations shifted toward electrical and visual recording technologies, enhancing the precision and speed of data capture. Analog oscilloscopes emerged in the 1920s, utilizing cathode-ray tubes to display electrical waveforms in real time, with early commercial models becoming available by 1931 from manufacturers like General Radio. Complementing these were strip-chart recorders, which evolved from kymograph designs to produce linear traces of analog signals on moving paper charts, facilitating the documentation of phenomena like voltage fluctuations in power systems. These tools addressed the limitations of purely mechanical systems by incorporating electromagnetic principles, allowing for broader applications in industrial monitoring and laboratory settings. Following World War II, the 1950s saw the advent of electronic data recorders driven by military and aerospace needs, particularly in telemetry for remote signal transmission. These systems enabled the wireless collection of sensor data from high-speed vehicles, with early implementations focusing on analog magnetic tape recorders to store multi-channel inputs. A notable example was the deployment of telemetry systems in 1958 for the Explorer 1 mission, the first U.S. satellite, which captured and relayed rocket performance data including radiation levels and environmental conditions during launch and orbit.
By the early 1960s, commercial data acquisition systems began to integrate sampling technology for more accessible recording, marking a transition toward standardized electronic tools. Hewlett-Packard introduced the 185B sampling oscilloscope in 1962, capable of capturing high-frequency signals up to 1 GHz and providing photographic or direct readout capabilities for persistent records, which supported applications in research laboratories beyond specialized military use. Similarly, the 180A solid-state oscilloscope of that era offered improved reliability and multi-trace displays, establishing oscilloscope-based recorders as foundational commercial DAQ instruments.

Modern advancements

The transition from analog to digital data acquisition systems in the late 20th century marked a pivotal shift, building briefly on early analog foundations by integrating computational power for more efficient signal handling and processing. In the 1970s and 1980s, the rise of microprocessor-based data acquisition (DAQ) systems revolutionized the field by enabling programmable control and automation of measurement tasks. National Instruments, founded in 1976, pioneered this era with its initial products, including the 1977 GPIB (General Purpose Interface Bus) interface card, which facilitated microprocessor-driven data collection from instruments, laying the groundwork for graphical programming environments like LabVIEW, first released in 1986. By the mid-1980s, these systems had evolved to support elemental components such as GPIB DAQ cards, allowing for scalable integration with personal computers. The 1990s saw further standardization through the introduction of the PCI (Peripheral Component Interconnect) bus, which provided high-speed data transfer rates up to 133 MB/s and became a cornerstone for DAQ hardware integration in PCs. Developed by Intel starting in 1990 and publicly proposed in 1992, the PCI bus replaced slower interfaces like ISA, enabling more reliable and performant DAQ boards for applications requiring rapid data throughput. This standard's adoption extended to modular platforms like PXI in 1997, which leveraged PCI for synchronized, high-channel-count acquisitions in test and measurement. A key milestone in the early 2000s was the emergence of USB-based DAQ, which introduced plug-and-play connectivity and simplified deployment without specialized slots. With USB 2.0's release in 2000 offering up to 480 Mbps bandwidth, manufacturers like National Instruments launched USB DAQ devices in the mid-2000s, such as the USB-6008 in 2005, enabling portable, hot-swappable systems powered directly via the interface. This shift reduced setup complexity and costs, making DAQ accessible for field and lab use. During the 2000s, wireless and networked DAQ systems emerged, incorporating standards for remote and distributed monitoring.
The IEEE 802.11b standard, ratified in 1999 with 11 Mbps speeds, was integrated into DAQ architectures by the early 2000s to support ad-hoc wireless local area networks (WLANs) for transmission from sensors in inaccessible locations, such as rotating machinery. These systems enhanced flexibility over wired setups, with prototypes demonstrating reliable vibration data acquisition over WLANs by 2003. In the 2010s and 2020s, high-speed DAQ advanced through field-programmable gate array (FPGA) integration, allowing customizable, real-time processing at rates exceeding 100 MSPS. FPGAs enabled parallel data handling and low-latency buffering, as seen in systems for high-energy physics experiments achieving error-resilient communication up to gigabit speeds by 2017. Concurrently, AI-enhanced acquisition incorporated machine learning for real-time anomaly detection, processing streaming data to identify deviations in industrial environments, such as energy systems, with frameworks achieving detection latencies under milliseconds. This adoption extended to Internet of Things contexts via 5G-enabled networks, where 5G's ultra-low latency (below 1 ms) and massive connectivity support billions of devices for continuous acquisition in smart factories and cities, projected to reach 39 billion connected devices by 2030.

System Components

Hardware

Data acquisition hardware encompasses the physical components that interface with real-world signals, converting them into digital form for processing. At the core are sensors and transducers, which detect physical phenomena and produce corresponding electrical signals. Thermocouples, for instance, generate a voltage proportional to temperature differences based on the Seebeck effect, making them suitable for measuring temperatures in industrial environments. Strain gauges, on the other hand, measure mechanical strain by detecting changes in electrical resistance when deformed, often used in pressure transducers where the gauge is bonded to a diaphragm that flexes under applied pressure. These devices output low-level analog signals, typically in the microvolt to millivolt range, requiring careful interfacing to avoid signal degradation. Central to digitizing these signals are analog-to-digital converters (ADCs), which sample and quantize analog inputs into discrete digital values. Successive approximation register (SAR) ADCs operate by iteratively comparing the input voltage to a reference using an internal digital-to-analog converter, achieving resolutions from 8 to 18 bits and sampling rates up to several MSPS, ideal for multiplexed data acquisition in multichannel systems. Sigma-delta (Σ-Δ) ADCs, conversely, employ oversampling and noise shaping through a modulator and digital filter, providing higher resolutions of 12 to 24 bits at lower effective rates (up to a few hundred Hz), excelling in precision applications like sensor digitization where rejection of power-line noise (50/60 Hz) is critical. The resolution of an ADC, determined by its bit count n, defines the smallest distinguishable voltage step, where the full-scale range equals 2^n \times \text{LSB}, with LSB being the least significant bit voltage step; for example, a 12-bit ADC divides the input range into 4096 steps, yielding finer granularity but potentially higher quantization noise if the signal does not span the full range. Signal conditioning hardware prepares these analog signals for ADC input by enhancing quality and compatibility.
Amplifiers, such as instrumentation amplifiers, boost weak sensor outputs to match the ADC's input range, improving the signal-to-noise ratio; for thermocouples, gains of 100 or more can elevate microvolt signals to volts, enhancing measurement resolution. Anti-aliasing filters, typically low-pass analog filters, attenuate frequencies above the Nyquist limit (half the sampling rate) to prevent spectral folding and aliasing, with programmable cutoffs ensuring compliance in vibration or audio acquisition. Multiplexers enable multi-channel acquisition by sequentially routing signals from multiple sensors to a single amplifier or ADC, supporting up to thousands of channels in scalable systems while minimizing hardware footprint. Data acquisition interfaces facilitate connectivity between sensors, conditioning stages, and host systems. Data acquisition boards, often in PCI or PCI Express form factors, integrate ADCs, multiplexers, and conditioning into compact cards that plug directly into a computer's bus, offering high-speed data transfer for desktop-based measurements. Modular systems like PXI chassis provide a rugged, scalable platform with slots for interchangeable modules, combining PCI electrical features with Eurocard packaging for synchronized, high-channel-count applications in automated test equipment. Hardware architectures in data acquisition systems vary between centralized and distributed designs to meet diverse deployment needs. Centralized architectures consolidate sensors, conditioning, and processing in a single location, such as a lab-based PCIe DAQ board, simplifying management but limiting scalability for remote or large-area monitoring. Distributed architectures, incorporating devices like networked CompactDAQ modules, place acquisition hardware near the signal source to reduce noise pickup and cabling, enabling processing at the edge before transmission. Power considerations involve selecting supplies (e.g., 9-30 V DC for many modular chassis) that match device ratings to avoid noise introduction or damage, often with isolation to prevent ground loops.
Synchronization ensures temporal alignment across channels or devices, achieved via shared clocks, triggers, or GPS timing in distributed setups to maintain phase coherence in multi-device acquisitions.
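The resolution arithmetic described above can be sketched numerically; the helper names below are illustrative, and an ideal ADC with no offset or gain error is assumed:

```python
def lsb_volts(full_scale_v, bits):
    """Voltage step of one least significant bit for an n-bit ADC."""
    return full_scale_v / (2 ** bits)

def quantize(v, full_scale_v, bits):
    """Ideal quantization of an in-range voltage to an integer ADC code."""
    code = int(v / lsb_volts(full_scale_v, bits))
    return min(code, 2 ** bits - 1)   # clamp to the full-scale code

# A 12-bit ADC over a 10 V range resolves ~2.44 mV per step (10 V / 4096).
print(lsb_volts(10.0, 12))       # 0.00244140625
print(quantize(5.0, 10.0, 12))   # 2048 (mid-scale)
```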

Software

Software in data acquisition (DAQ) systems encompasses the programming interfaces, libraries, and tools that enable configuration, control, and monitoring of hardware components, facilitating seamless data collection and analysis across various applications. These software elements act as intermediaries between the physical sensors and the end-user applications, handling tasks from low-level communication to high-level data handling. Device drivers form the foundational layer, providing operating system (OS) integration for DAQ hardware. For instance, NI-DAQmx drivers support Windows and Linux environments, including modules for USB DAQ devices, allowing direct access to hardware resources without custom modifications. Similarly, other vendors provide drivers compatible with Linux and Windows for their DAQ modules, ensuring portability across OS platforms. Development environments and libraries further extend this functionality, offering APIs and graphical interfaces for building DAQ applications. NI-DAQmx serves as a comprehensive driver library that communicates with NI DAQ hardware, supporting configuration of tasks, channels, and timing through its API, which is accessible in languages like Python via the nidaqmx package. The nidaqmx library, an object-oriented wrapper around the NI-DAQmx API, enables developers to create tasks for analog and digital channels, configure sampling clocks, and perform reads/writes for data acquisition in Python environments on supported OS. For graphical programming, LabVIEW provides a virtual instrument paradigm where users drag-and-drop elements to create applications for DAQ tasks, integrating hardware control with built-in analysis functions without traditional line-by-line coding. These tools streamline development, with LabVIEW emphasizing intuitive block diagrams for test and measurement systems. Core functions of DAQ software include hardware configuration, such as setting sampling rates and triggers, alongside data logging and real-time visualization.
NI-DAQmx allows precise configuration of sampling rates via functions like cfg_samp_clk_timing and trigger setups for synchronized acquisitions, ensuring accurate capture of signals at rates up to hardware limits. Data logging is facilitated through methods to write acquired data to files, such as TDMS format in nidaqmx, enabling persistent storage for post-processing. Real-time visualization is supported via integrated tools in environments like LabVIEW, which display waveforms and metrics during acquisition, or through DAQ Assistant in NI software for immediate signal monitoring. Middleware standards enhance interoperability in industrial DAQ setups, allowing diverse systems to exchange data seamlessly. OPC UA (Open Platform Communications Unified Architecture) acts as a platform-independent middleware for secure, real-time data transfer in industrial environments, supporting horizontal and vertical integration across devices and software. It enables DAQ systems to interface with enterprise-level applications, standardizing communication protocols to reduce vendor lock-in. Security features in networked DAQ software are critical to protect against unauthorized access and manipulation, particularly in distributed systems. OPC UA incorporates robust mechanisms by design, including encryption and signing to ensure confidentiality and integrity, and uses certificates for authentication with support for modes like "Sign & Encrypt". However, as of 2025, certain implementations have been found vulnerable to issues such as authentication bypass and tampering (e.g., CVE-2024-42512, CVE-2024-42513, CVE-2025-1468), which can expose systems to risks; users should apply vendor patches, use secure configurations like disabling deprecated security policies, and implement network controls to mitigate these threats. Recent updates as of April and July 2025 have introduced enhanced security policies, including elliptic-curve cryptography (ECC) support and updated SDKs, to address evolving needs.
In broader networked DAQ contexts, encryption protocols help mitigate risks in SCADA-integrated systems, where unencrypted communications could expose data to tampering.

Acquisition Process

Signal acquisition

Signal acquisition represents the foundational phase of data acquisition, where analog signals from physical phenomena are captured, conditioned, and prepared for digitization. This begins with sensor excitation, where an external stimulus—such as a constant voltage or current—is applied to the sensor to generate a measurable response proportional to the measurand. For instance, resistive sensors like strain gauges require excitation currents typically in the range of 1-10 mA to produce voltage variations that reflect mechanical changes. Following excitation, the weak sensor output undergoes amplification to boost its amplitude to levels suitable for further processing, often using differential amplifiers to reject common-mode noise. Amplification gains can range from 10 to 1000, depending on sensor sensitivity and system requirements. Subsequent filtering removes unwanted noise and interference, employing low-pass filters to attenuate high-frequency components while preserving the signal of interest; for example, filters with cutoff frequencies aligned to the signal bandwidth are essential. These steps ensure signal fidelity before sampling, minimizing distortion from noise or bandwidth limitations. Sampling techniques determine how and when the conditioned signal is converted into discrete time-domain samples. Synchronous sampling involves a fixed sample clock, where all samples are taken at uniform intervals determined by the system's master clock, ideal for periodic signals like vibration in machinery monitoring. In contrast, asynchronous sampling allows variable rates triggered by external events, enabling efficient capture of sporadic phenomena such as transient impacts. Trigger mechanisms further refine acquisition timing: edge triggers initiate sampling on a rapid signal transition (rising or falling), while level triggers activate when the signal sustains above or below a threshold, providing flexibility for diverse applications like oscilloscope-based diagnostics. In multi-channel systems, signal acquisition must handle multiple inputs simultaneously to maintain data coherence.
Scanning rates define the sequence and speed of multiplexing across channels, with throughput divided among active inputs—for example, a 1 MS/s system scanning 8 channels yields 125 kS/s per channel. Synchronization ensures temporal alignment, often achieved via a shared clock or GPS-based timing for distributed setups, preventing phase errors in applications like phased-array radar. Hardware such as multiplexers facilitates this, as detailed in system component overviews. Common error sources in signal acquisition include quantization noise, arising from the finite resolution of the ADC, which introduces a uniform noise distribution with variance \sigma_q^2 = \frac{\Delta^2}{12}, where \Delta is the quantization step size. Crosstalk occurs in multi-channel setups due to capacitive or inductive coupling between adjacent lines, manifesting as signal bleed-over that can degrade channel isolation by up to 60 dB without mitigation. To counteract these, shielding encloses cables and circuits in conductive barriers grounded to divert interference, while twisted-pair wiring and guard traces further reduce susceptibility. A critical aspect of sampling is preventing aliasing, where high-frequency components masquerade as lower frequencies in the sampled data. The Nyquist criterion stipulates that the sampling frequency f_s must exceed twice the maximum signal frequency f_{\max}, expressed as: f_s > 2 f_{\max} This ensures faithful reconstruction without distortion. In practical implementations, an oversampling factor of 2.5 to 5 is applied, increasing f_s beyond the theoretical minimum to accommodate imperfect anti-aliasing filters and enhance noise averaging.
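The per-channel throughput, quantization-noise, and oversampling rules above reduce to simple arithmetic; a minimal illustrative sketch (helper names are not from any DAQ library):

```python
def quantization_noise_variance(delta):
    """Variance of uniform quantization noise: sigma_q^2 = delta^2 / 12."""
    return delta ** 2 / 12

def per_channel_rate(total_rate, channels):
    """Per-channel rate when one ADC is multiplexed across channels."""
    return total_rate / channels

def practical_sampling_rate(f_max, oversampling=2.5):
    """Sampling rate with margin above Nyquist for realistic filters."""
    return 2 * f_max * oversampling

# The 1 MS/s, 8-channel example from the text:
print(per_channel_rate(1_000_000, 8))   # 125000.0 samples/s per channel
# A 10 kHz signal with a 2.5x oversampling factor:
print(practical_sampling_rate(10_000))  # 50000.0 samples/s
```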

Data processing and analysis

Following digitization by the analog-to-digital converter (ADC), the raw digital output is formatted into structured data streams, typically as binary values representing quantized signal levels, before being buffered in memory to handle temporary rate mismatches and prevent data loss during high-speed acquisition. Buffering often employs FIFO (first-in, first-out) queues or circular buffers to manage varying data rates from the ADC, ensuring continuous flow without overflow. Transmission of this buffered data to storage or processing units utilizes protocols such as Ethernet for high-bandwidth, networked environments or CAN bus for robust, real-time communication in embedded systems like automotive or industrial controls. Basic processing begins with digital filtering to refine the signal, where finite impulse response (FIR) filters apply a weighted sum of recent samples without feedback, providing linear phase response ideal for preserving waveform shape, as in moving-average implementations that smooth transients. Infinite impulse response (IIR) filters, conversely, incorporate feedback from prior outputs to achieve sharper responses with fewer coefficients, though they may introduce phase distortion, making them suitable for recursive averaging to reduce high-frequency noise. Averaging techniques, a subset of FIR filtering, compute the arithmetic mean over multiple samples to attenuate random noise, effectively lowering the signal's variance while minimally affecting the underlying trend. Analysis techniques transform processed data into interpretable insights, with the fast Fourier transform (FFT) enabling frequency-domain examination to identify spectral components such as dominant frequencies or harmonics in vibration or acoustic signals. The FFT computes the discrete Fourier transform efficiently via the Cooley-Tukey algorithm, given by: X_k = \sum_{n=0}^{N-1} x_n e^{-i 2\pi k n / N} where X_k is the k-th frequency bin, x_n are the N time-domain samples, and i is the imaginary unit; this allows spectral power estimation for anomaly detection in mechanical systems.
Statistical methods complement this by calculating the mean to estimate the signal's central tendency and the variance to quantify dispersion, aiding validation of data quality against expected noise levels or calibration standards. Processing modes differ based on application demands: real-time analysis streams data through processors for immediate feedback in control systems, such as filtering and FFT computation on-the-fly for adaptive machinery monitoring, constrained by latency requirements often under milliseconds. Offline processing, in contrast, involves batch handling of stored datasets for comprehensive analysis, enabling complex statistical validations or high-resolution FFT without timing pressures, as in post-experiment review of recorded acquisitions.
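The moving-average (FIR) filter and the discrete Fourier transform above can be sketched directly; for clarity the naive O(N²) DFT below mirrors the formula itself rather than the optimized Cooley-Tukey FFT:

```python
import cmath
import math

def moving_average(samples, window):
    """FIR moving-average filter: equal weights, no feedback."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

def dft(samples):
    """Direct DFT: X_k = sum_n x_n e^{-i 2 pi k n / N} (O(N^2) form)."""
    big_n = len(samples)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * n / big_n)
                for n, x in enumerate(samples))
            for k in range(big_n)]

# 16 samples of a sinusoid completing 2 cycles: the spectral peak lands in bin 2.
signal = [math.sin(2 * math.pi * 2 * n / 16) for n in range(16)]
spectrum = dft(signal)
peak_bin = max(range(1, 8), key=lambda k: abs(spectrum[k]))
print(peak_bin)  # 2
```

Locating the dominant bin this way is the essence of the spectral anomaly detection described above: a shifted or newly appearing peak flags a change in the machinery's vibration signature.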

Standards and Challenges

Key standards

Data acquisition systems rely on standardized protocols and specifications to ensure interoperability, reliability, and safety across hardware, software, and communication interfaces. These standards facilitate modular design, precise control, and seamless integration in diverse applications, from laboratory testing to industrial automation.

Hardware Standards

The PCI eXtensions for Instrumentation (PXI) standard defines a rugged, PC-based platform for modular measurement and automation systems, incorporating the PCI bus with added signals for instrumentation such as trigger lines, local buses, and a system reference clock to support high-performance data acquisition. PXI enables scalable chassis configurations with controllers and modules from multiple vendors, promoting cost-effective and synchronized data collection in test environments. Similarly, the VME eXtensions for Instrumentation (VXI) standard provides an open, modular architecture for high-density instrumentation, specifying mechanical, electrical, and protocol requirements for integrating modules into a common backplane. VXI supports precise timing and triggering through dedicated bus lines, making it suitable for demanding applications like aerospace testing where reliability and modularity are critical. For portable data acquisition, the USB Implementers Forum (USB-IF) specifications outline requirements for USB-based devices, including power delivery, data transfer rates up to 480 Mbps for USB 2.0, and plug-and-play connectivity that enables multifunction I/O without specialized interfaces. These specs ensure compatibility for entry-level DAQ systems, allowing analog and digital inputs to interface directly with host computers in field measurements.

Software Protocols

The Standard Commands for Programmable Instruments (SCPI) establishes a syntax and command set for controlling test instruments over interfaces like GPIB or USB, extending IEEE 488.2 to include device-specific hierarchies for measurements such as voltage or frequency. SCPI promotes vendor-independent programming, reducing development time by standardizing queries and responses in data acquisition software. The Interchangeable Virtual Instruments (IVI) standard defines a driver architecture for instrument classes like oscilloscopes and multimeters, enabling software to switch between compliant instruments without code changes through shared components and class specifications. IVI supports COM and .NET frameworks, ensuring compliance and extensibility in automated test systems for consistent data handling.
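A hedged illustration of SCPI's hierarchical, colon-separated syntax: the helper function below is hypothetical, though `*IDN?` and `MEAS:VOLT:DC?` are standard commands; any instrument-specific subsystem tree should be checked against the device manual.

```python
def scpi_query(subsystem, function, parameters=None):
    """Compose a hierarchical SCPI query string such as 'MEAS:VOLT:DC?'."""
    cmd = subsystem + ":" + function + "?"
    if parameters:
        # Optional parameters (e.g., expected range, resolution) follow the query.
        cmd += " " + ",".join(str(p) for p in parameters)
    return cmd

print(scpi_query("MEAS:VOLT", "DC"))            # MEAS:VOLT:DC?
print(scpi_query("MEAS", "FREQ", [1, 0.001]))   # MEAS:FREQ? 1,0.001
print("*IDN?")  # IEEE 488.2 common command: ask the instrument to identify itself
```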

Communication Standards

Modbus is a master-slave protocol for serial or Ethernet-based industrial networks, facilitating data exchange between supervisory computers and remote terminal units in SCADA systems with simple request-response mechanisms for reading registers and coils. It supports up to 247 devices on a network, providing robust, low-overhead communication for data acquisition in process control. Profibus (Process Field Bus) specifies a fieldbus standard under IEC 61158 for deterministic communication in factory automation, with variants like Profibus DP for high-speed cyclic data transfer at rates up to 12 Mbps over RS-485 cabling. Profibus enables multi-drop topologies for connecting sensors and actuators, ensuring reliable synchronization in distributed DAQ setups. IEEE 1588, known as the Precision Time Protocol (PTP), standardizes network-based clock synchronization with sub-microsecond accuracy, using hardware timestamping to align distributed devices over Ethernet for coherent data acquisition. It operates via master-slave hierarchies, compensating for propagation delays to support time-sensitive applications like phased-array testing.
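Modbus's request-response framing can be illustrated with a short sketch that builds a serial (RTU) read-holding-registers request, including the standard CRC-16/Modbus trailer:

```python
def crc16_modbus(data: bytes) -> bytes:
    """CRC-16/Modbus (polynomial 0xA001, reflected), transmitted low byte first."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return bytes([crc & 0xFF, crc >> 8])

def read_holding_registers(slave: int, start_addr: int, count: int) -> bytes:
    """Modbus RTU request frame for function 0x03 (read holding registers)."""
    pdu = bytes([slave, 0x03]) + start_addr.to_bytes(2, "big") + count.to_bytes(2, "big")
    return pdu + crc16_modbus(pdu)

# Read 2 registers starting at address 0 from slave 1.
print(read_holding_registers(1, 0, 2).hex())  # 010300000002c40b
```

The slave answers with the same function code followed by the register bytes and its own CRC, completing the request-response cycle described above.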

Safety and Quality Standards

IEC 61508 provides a framework for functional safety in electrical, electronic, and programmable electronic systems, defining safety integrity levels (SIL 1-4) based on risk reduction and failure probability metrics to guide DAQ system design and verification. It requires systematic capability assessments and lifecycle management to mitigate hazards in safety-related data acquisition, such as in automotive or process industries.

Recent Updates

As of 2025, OPC UA version 1.05 introduces enhancements for device integration and secure data exchange, including support for software updates and cloud-compatible pub-sub mechanisms that enable scalable DAQ over industrial IoT networks. This update aligns OPC UA with edge-to-cloud architectures, facilitating real-time data acquisition from field devices to analytics platforms.

Common challenges

One of the primary challenges in data acquisition systems is managing noise and interference, which can distort signals and compromise measurement accuracy. Electromagnetic interference (EMI) arises from external sources such as power lines or nearby machinery, and mitigation strategies include proper grounding to create equipotential surfaces and shielding cables to block radiative coupling. Thermal noise, inherent to electronic components due to random electron motion, can be reduced by lowering operating temperatures or employing low-pass filters to limit bandwidth, thereby minimizing the noise power proportional to the signal frequency range. These techniques are essential in high-precision environments, where unaddressed noise can degrade the signal-to-noise ratio below acceptable limits. Scalability poses significant hurdles when handling high data rates, such as those exceeding 1 GS/s in applications like high-speed digitizers or optical communication, often constrained by limitations in analog-to-digital converters (ADCs) and transmission lines. To address this, systems employ techniques like parallel data paths or FPGA-based processing to distribute the load, though these increase resource demands and potential bottlenecks in data streaming. Bandwidth extension methods, including active equalization, help sustain fidelity at elevated rates but require careful calibration to avoid aliasing per the Nyquist criterion. Synchronization challenges emerge in distributed data acquisition setups, where clock drift, caused by oscillator inaccuracies, leads to temporal misalignment across nodes, potentially skewing multi-sensor correlations. Solutions involve external references like GPS timing signals, which provide sub-microsecond precision by disseminating a global time base via satellite, enabling phase-locked loops to correct drifts in real-time networks. Standards such as IEEE 1588 (PTP) further support this by facilitating software-based adjustments in Ethernet-connected systems. Ensuring data integrity is critical, particularly against transmission errors and malicious threats in networked environments.
Cyclic redundancy checks (CRC) serve as a robust error-detection mechanism, appending a polynomial-derived checksum to packets to identify bit flips or error bursts with high probability, often achieving detection rates near 100% for errors up to the code length. In networked DAQ, cybersecurity threats like advanced persistent threats (APTs) target supervisory control and data acquisition (SCADA) protocols, exploiting unpatched vulnerabilities for unauthorized access or data manipulation; countermeasures include network segmentation and patch management aligned with NIST guidelines. Cost and complexity arise from trade-offs between resolution and speed in DAQ components, where higher bit resolutions (e.g., 16-bit vs. 8-bit) demand slower sampling to maintain accuracy, escalating expenses through specialized ADCs and cooling. Emerging trends leverage machine learning for automated troubleshooting, using anomaly detection on data streams to predict failures, thereby reducing manual intervention and operational costs in large-scale systems.
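The CRC mechanism described above can be sketched with the CRC-16 variant used by Modbus RTU frames, whose reflected polynomial (0xA001) and 0xFFFF seed are the standard Modbus parameters:

```python
# Sketch of CRC-16/MODBUS error detection, as appended to Modbus RTU frames
# to catch transmission errors; 0xA001 is the bit-reflected 0x8005 polynomial.

def crc16_modbus(data: bytes) -> int:
    """Return the 16-bit CRC a Modbus RTU sender appends to a frame."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

frame = b"123456789"             # standard CRC check-value input
print(hex(crc16_modbus(frame)))  # → 0x4b37, the published check value
# A single corrupted byte changes the CRC, so the receiver detects the error:
assert crc16_modbus(b"123456788") != crc16_modbus(frame)
```

The receiver recomputes the CRC over the received payload and rejects the frame on mismatch; any burst error up to 16 bits long is guaranteed to be caught, and longer errors slip through with probability only about 2^-16.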
