
Exposure assessment

Exposure assessment is the scientific process of estimating or measuring the magnitude, frequency, and duration of human or ecological contact with environmental agents, such as chemical, physical, or biological stressors, at boundaries like the skin, lungs, or gastrointestinal tract, to characterize potential health risks. This multidisciplinary field integrates data from environmental monitoring, human activity patterns, biomonitoring, and modeling to evaluate exposure routes (e.g., inhalation, ingestion, dermal absorption) and pathways, supporting frameworks established by organizations like the U.S. Environmental Protection Agency (EPA) and the National Research Council (NRC). Originating in the early 20th century within epidemiology and industrial hygiene, it gained formal recognition in the 1983 NRC report "Risk Assessment in the Federal Government: Managing the Process" as a core component of risk analysis, evolving with advances in analytical techniques and computational models—including recent developments as of 2025 such as new approach methodologies (NAMs) and AI-enhanced probabilistic modeling—to address uncertainties in exposure estimates. Key methods in exposure assessment include direct measurement using personal monitoring devices to capture real-time contact, indirect approaches relying on environmental concentrations combined with exposure factors (e.g., time spent indoors), and biological markers like biomarkers in blood or urine to reconstruct past exposures via reverse dosimetry. Probabilistic modeling, such as Monte Carlo simulation, accounts for variability across populations—considering factors like age, activity patterns, and physiology—and quantifies uncertainties from data limitations or assumptions. These techniques are applied in occupational settings by agencies such as the Centers for Disease Control and Prevention (CDC) to characterize hazards like silica dust or per- and polyfluoroalkyl substances (PFAS), and in broader environmental contexts to assess pollutants in air, water, and soil.
The importance of exposure assessment lies in its role in informing public health protection, regulatory decisions, and intervention strategies, such as reducing emissions from industrial sources or designing safer consumer products, by identifying vulnerable populations (e.g., children, pregnant individuals, or low-income communities) and tracking exposure trends over time. Peer review, uncertainty analysis, and transparent documentation ensure assessments are robust and reproducible, aligning with EPA guidelines updated in 2019 to incorporate probabilistic techniques and integrated exposure models like SHEDS (Stochastic Human Exposure and Dose Simulation). Ultimately, it bridges environmental science and public health, enabling evidence-based actions to minimize adverse outcomes from stressors like pesticides, carcinogens, or climate-related hazards.

Definition and Fundamentals

Definition

Exposure assessment is the process of estimating or measuring the magnitude, frequency, and duration of contact between a receptor—such as a human, animal, or ecosystem—and a chemical, biological, or physical agent. This involves evaluating the intensity and routes of such contact, often quantified as the concentration of the agent multiplied by the time of contact at the receptor's boundary (e.g., skin, lungs, or gastrointestinal tract). The approach can be quantitative, providing numerical estimates, or qualitative, describing potential exposure scenarios without precise measurements. Within the broader framework of risk assessment, exposure assessment serves as one of four key components, alongside hazard identification, dose-response assessment, and risk characterization. It provides essential estimates of exposure or dose that are integrated with data from dose-response assessments to inform risk characterization, enabling the evaluation of potential human health or ecological impacts. This integration supports regulatory decision-making, such as prioritizing site cleanups or setting exposure limits, by linking environmental contaminants to affected populations or ecosystems. Basic terminology in exposure assessment distinguishes between related concepts: exposure refers to the contact with an agent at the outer boundary of the receptor, without necessarily implying absorption. Intake describes the amount of the agent entering the body, such as through ingestion or inhalation, crossing an exposure surface that is not an absorption barrier. Dose, in contrast, represents the amount absorbed and available for interaction with target tissues, often termed absorbed dose or internal dose once uptake occurs. The term "exposure assessment" was formalized in the 1980s by the U.S. Environmental Protection Agency (EPA), particularly in response to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) of 1980, also known as Superfund legislation, which mandated systematic evaluation of site risks.
This development built on earlier foundations in epidemiology and industrial hygiene from the early 20th century but gained prominence with the 1983 National Research Council report "Risk Assessment in the Federal Government: Managing the Process," which outlined it as a core element of risk assessment.

Key Concepts

Exposure assessment relies on key quantitative metrics to characterize the magnitude, frequency, and duration of contact with environmental stressors, enabling the estimation of potential health impacts. A fundamental metric is the average daily dose (ADD), which represents the mass of a contaminant per unit body weight per day, calculated as ADD = (C × IR × EF × ED) / (BW × AT), where C is the concentration of the stressor in the medium, IR is the intake rate (amount contacted or ingested per unit time), EF is the exposure frequency, ED is the exposure duration, BW is body weight, and AT is the averaging time (e.g., lifetime in days for carcinogenic exposures). Closely related is the lifetime average daily dose (LADD), which averages intake over a lifetime (typically 70 years) to assess long-term exposures, often expressed in similar units and derived from similar parameters but normalized over an averaging time equal to the lifetime. These metrics provide a standardized basis for comparing exposures across populations and integrating with risk characterization processes. Assessors distinguish between central tendency and high-end exposure estimates to capture variability in population exposures. Central tendency estimates represent average or typical exposures, often using arithmetic means of input parameters to reflect the experience of most individuals in the exposed group. In contrast, high-end estimates focus on the upper tail of the exposure distribution, such as the 90th or 95th percentile, to evaluate risks for highly exposed subgroups by selecting upper-bound values for key variables. Point estimates, which use fixed values for deterministic analyses, can yield either category, while probabilistic approaches employ distributions (e.g., Monte Carlo simulations) to generate full ranges of possible exposures, offering greater insight into uncertainty and variability. Time-weighted average (TWA) exposure accounts for varying concentrations over time, providing a representative measure for periods with fluctuating levels.
The TWA is computed as E = \sum (C_i \times t_i) / T_{total}, where C_i is the concentration during time interval t_i, and T_{total} is the overall averaging period (e.g., an 8-hour workday or a lifetime). This approach is essential for chronic exposure assessments, as it normalizes intermittent or variable contacts into an equivalent constant level, facilitating comparisons against health-based thresholds. Bioavailability plays a critical role by quantifying the fraction of an agent that actually reaches a target site, such as systemic circulation, influencing the effective dose. It is typically expressed as a factor between 0 and 1, incorporating elements like solubility, gastrointestinal uptake, or dermal penetration, and is adjusted in dose equations (e.g., multiplying ADD by a bioavailability factor) to refine estimates. Factors affecting bioavailability include chemical speciation, the environmental matrix, and physiological conditions, with site-specific measurements often used to avoid overestimation of risks from total concentrations.
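As a worked illustration of these metrics, the ADD and TWA formulas can be sketched in a few lines of Python; the function names and all input values below are hypothetical examples chosen for demonstration, not regulatory defaults.

```python
def average_daily_dose(c, ir, ef, ed, bw, at):
    """ADD (mg/kg-day) = (C * IR * EF * ED) / (BW * AT)."""
    return (c * ir * ef * ed) / (bw * at)

def time_weighted_average(intervals):
    """TWA = sum(C_i * t_i) / T_total over (concentration, duration) pairs."""
    total_time = sum(t for _, t in intervals)
    return sum(c * t for c, t in intervals) / total_time

# Hypothetical drinking-water scenario: 0.005 mg/L contaminant, 2 L/day
# intake, 350 days/yr for 30 yr, 70 kg adult, averaging time = ED in days.
add = average_daily_dose(c=0.005, ir=2.0, ef=350, ed=30, bw=70, at=30 * 365)

# Hypothetical workday: 4 h at 2.0 ppm and 4 h at 0.5 ppm -> 8-h TWA
twa = time_weighted_average([(2.0, 4), (0.5, 4)])
print(round(add, 6), twa)  # -> 0.000137 1.25
```

Switching the averaging time to a 70-year lifetime in days converts the same calculation into a lifetime average daily dose for cancer-risk use.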

Applications and Contexts

Environmental Applications

Exposure assessment in environmental science evaluates the risks posed by pollutants in air, water, soil, and other media to ecosystems and human populations through ambient exposures. This process integrates data on contaminant sources, transport mechanisms, and receptor characteristics to quantify potential exposures, informing regulatory decisions and remediation strategies. By focusing on ecological and environmental pathways, it is distinguished from health-specific applications by its emphasis on broader ecosystem dynamics and long-term ambient risks. Multimedia exposure assessment addresses combined routes of contaminant movement across environmental compartments, such as air, water, soil, and sediment, to predict overall ecological and human health risks. For instance, multimedia models simulate the fate, transport, and bioaccumulation of chemicals like volatile methylsiloxanes, revealing how emissions partition into multiple media and accumulate in organisms. A key example is biomagnification in aquatic food chains, where persistent pollutants like polychlorinated biphenyls (PCBs) concentrate in fish tissues, leading to human exposure through consumption of contaminated fish. This approach highlights the need for integrated modeling to capture indirect exposures via ecological magnification, as outlined in frameworks for contaminant fate and transport. In Superfund site remediation under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) of 1980, exposure assessment quantifies risks from contaminated groundwater to guide cleanup actions. Site assessments evaluate contaminant migration from sources like leaking landfills into aquifers, estimating exposure concentrations for nearby populations via drinking water or irrigation pathways. For example, at sites with volatile organic compounds in groundwater, baseline risk assessments integrate hydrogeological data to set remediation goals, ensuring protectiveness of human health and the environment as required by CERCLA processes.
This application has been central to the cleanup of 459 Superfund sites since 1980 (as of March 2025), prioritizing sites based on exposure potential. The U.S. Environmental Protection Agency's Exposure Factors Handbook (2011 edition) serves as a primary framework for population-based environmental data in exposure assessments. It compiles physiological and behavioral parameters, such as inhalation rates, soil ingestion, and fish consumption patterns, tailored for U.S. populations to estimate exposures to environmental contaminants. Updated from prior versions, the handbook supports probabilistic modeling by providing distributions of factors like body weight and activity levels, enabling site-specific applications in ecological risk evaluations. Emerging concerns in environmental exposure assessment include climate change effects, which alter contaminant distribution and species habitats, potentially increasing exposure risks. Warmer temperatures and shifting precipitation patterns expand the range of vectors like mosquitoes, accelerating pathogen development and transmission of vector-borne diseases. For instance, climate-driven habitat changes have led to higher incidence of such diseases in temperate regions, necessitating adaptive exposure models that incorporate meteorological projections. Assessments now integrate these factors to predict amplified ecological exposures in vulnerable areas.

Occupational and Public Health Applications

In occupational health, exposure assessment plays a critical role in evaluating worker risks from chemical, physical, and biological hazards in the workplace. Organizations like the Occupational Safety and Health Administration (OSHA) have established permissible exposure limits (PELs) since the passage of the Occupational Safety and Health Act in 1970, which set the first federal standards in 1971 for over 400 toxic substances to protect workers from excessive exposure. For instance, monitoring for solvents such as benzene involves personal sampling devices to measure airborne concentrations against PELs, ensuring levels remain below thresholds like 1 ppm for benzene over an 8-hour time-weighted average. Similarly, noise exposure assessment uses dosimeters to quantify sound levels, with OSHA's occupational noise standard (29 CFR 1910.95) mandating monitoring when exposures approach 85 decibels (dBA) and requiring hearing conservation programs if the PEL of 90 dBA is exceeded. In public health applications, exposure assessment informs epidemiological studies by quantifying environmental contaminants' links to adverse health outcomes, enabling targeted interventions. A prominent example is the 2014 Flint water crisis in Michigan, where switching to a corrosive water source led to lead leaching from aging service lines, resulting in elevated blood lead levels (BLLs) in children; post-switch data from 2015 showed the incidence of children with BLLs ≥5 μg/dL increasing from 2.4% to 4.9%, particularly in socioeconomically disadvantaged neighborhoods. This assessment involved biomonitoring through blood tests and tap water sampling, highlighting how exposure data can trace community-wide health impacts and support public health responses, such as the establishment of a lead exposure registry. Consumer exposure assessment evaluates risks from everyday products, focusing on pathways like dermal contact with phthalates in plastics, which are used as plasticizers in items such as flooring, toys, and medical tubing.
Dermal uptake occurs through direct skin contact, with studies estimating daily exposures of 0.1–10 μg/kg body weight for di(2-ethylhexyl) phthalate (DEHP) via handling or mouthing of contaminated items, particularly in children. Regulatory bodies like the U.S. Environmental Protection Agency (EPA) use scenario-based modeling to assess these risks, incorporating factors like product usage frequency and skin absorption rates to determine if exposures exceed tolerable daily intake levels. Exposure assessment integrates with epidemiology in cohort studies to calculate attributable risk fractions, which quantify the proportion of disease incidence directly linked to a specific exposure. In prospective designs, researchers estimate the population attributable fraction (PAF) as the excess cases preventable by eliminating the exposure, using formulas like PAF = (P_e × (RR − 1)) / (1 + P_e × (RR − 1)), where P_e is the exposure prevalence and RR is the relative risk. This approach has been applied in occupational cohorts to attribute fractions of respiratory diseases to solvent exposures, informing preventive strategies and resource allocation in public health.
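The PAF formula above translates directly into a short function; the prevalence and relative-risk values below are hypothetical, chosen only to show the arithmetic rather than drawn from any real cohort.

```python
def population_attributable_fraction(prevalence, relative_risk):
    """PAF = (P_e * (RR - 1)) / (1 + P_e * (RR - 1))."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Hypothetical cohort: 20% of workers exposed to a solvent, RR = 2.5,
# so roughly 23% of disease incidence would be attributable to exposure.
paf = population_attributable_fraction(0.20, 2.5)
print(round(paf, 3))  # -> 0.231
```

A PAF of 0.231 would mean about 23% of cases in this hypothetical population could be prevented by eliminating the exposure.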

Exposure Pathways

Routes of Exposure

Routes of exposure refer to the primary biological and physical pathways by which hazardous agents, such as chemicals, particulates, or biological contaminants, come into contact with and are absorbed by receptors, typically humans or other organisms. The three main routes—inhalation, ingestion, and dermal absorption—account for the majority of exposures in environmental and occupational settings, with absorption efficiency varying significantly by route and agent properties. These pathways determine the dose received and influence the subsequent health effects, as the site of entry affects absorption and distribution within the body. Inhalation occurs when gases, vapors, or airborne particles are taken up through the respiratory tract, primarily via the alveoli, where they can be absorbed into the bloodstream. For gases and vapors, absorption is often nearly complete, approaching 100% due to the large surface area of the alveoli and efficient gas exchange mechanisms. Particulate matter uptake depends on aerodynamic diameter; for instance, particles with diameters of 2.5 micrometers or smaller (PM2.5) represent the respirable fraction capable of penetrating deep into the alveoli, while larger thoracic particles (up to 10 micrometers) may deposit in upper airways. Factors such as breathing rate, lung physiology, and particle solubility further modulate deposition and absorption efficiency. Ingestion involves the oral intake of contaminants through food, water, or inadvertent swallowing of soil, dust, or other media, leading to absorption primarily in the gastrointestinal tract. In adults, this route is common via contaminated food or drinking water, but in children, hand-to-mouth behavior significantly increases nondietary ingestion risks, as young children frequently touch surfaces and then place hands or objects in their mouths, potentially transferring soil or dust. Absorption rates for ingested agents vary by solubility and chemical form, often assumed at 50-100% for many soluble compounds, though less soluble materials like lead may have lower uptake (up to 50% in children). This route is particularly relevant for persistent environmental pollutants bioaccumulating in food chains.
Dermal absorption happens when agents penetrate the skin barrier, entering the systemic circulation through the epidermis and underlying layers, influenced by the chemical's lipid solubility, molecular weight, and contact duration. Highly lipophilic compounds, as indicated by a high octanol-water partition coefficient, absorb more readily due to the skin's lipid-rich structure, while hydrophilic substances penetrate poorly. Prolonged contact, such as from contaminated clothing or repeated handling, enhances uptake, with absorption rates typically ranging from 10-50% for many organic chemicals over typical contact periods, though this can vary widely (e.g., 1-3% for some pesticides in children). Skin integrity, hydration, and anatomical site also play key roles in this route. Other routes, such as ocular exposure or injection, are less common but can result in rapid and high-efficiency absorption in specific scenarios. Ocular exposure involves agents reaching the eyes or mucous membranes, where they may be absorbed through the thin, vascularized tissues, often leading to local or systemic effects if the agent is highly permeable. Injection, typically accidental in occupational settings like needlestick injuries, bypasses external barriers for direct entry into the bloodstream, achieving nearly 100% absorption akin to intravenous administration. These routes are rare in general assessments but critical in healthcare or laboratory environments.
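A minimal sketch of how route-specific absorption fractions convert a contacted amount into an absorbed dose, using the rough percentage ranges quoted above; the specific fractions and route names in this mapping are illustrative assumptions for demonstration, not toxicological reference values.

```python
# Illustrative absorption fractions (fraction of contacted agent absorbed),
# loosely following the ranges discussed in the text.
ABSORPTION_FRACTION = {
    "inhalation_vapor": 1.0,   # gases/vapors: near-complete alveolar uptake
    "ingestion_soluble": 0.8,  # soluble compounds: often assumed 50-100%
    "dermal_organic": 0.2,     # organic chemicals: roughly 10-50% typical
}

def absorbed_dose(contacted_mg, route):
    """Absorbed dose = contacted amount x route-specific absorption fraction."""
    return contacted_mg * ABSORPTION_FRACTION[route]

# Same 1 mg contacted via each route yields very different internal doses.
doses = {route: absorbed_dose(1.0, route) for route in ABSORPTION_FRACTION}
print(doses)
```

The comparison makes concrete why identical environmental concentrations can produce very different internal doses depending on the route of contact.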

Exposure Scenarios

Exposure scenarios in exposure assessment integrate various routes of exposure into cohesive, contextual narratives that represent realistic or hypothetical situations in which individuals or populations may come into contact with stressors. These scenarios serve as frameworks for evaluating exposures by combining assumptions about environmental conditions, human behaviors, and chemical characteristics to estimate intake or uptake. Routes such as inhalation, ingestion, and dermal contact act as building blocks within these scenarios, allowing assessors to simulate combined exposures across pathways. Scenario development typically employs either deterministic or probabilistic methods to quantify exposure. Deterministic approaches use single-point estimates, such as mean or upper-bound values for parameters like concentration and intake rates, to produce a fixed exposure estimate, often applied in initial screening to simplify calculations. In contrast, probabilistic methods, such as Monte Carlo simulations, incorporate variability and uncertainty by sampling from probability distributions of input parameters, yielding a range of possible outcomes with associated probabilities; this is particularly useful for capturing real-world heterogeneity in behaviors and environments. The choice between these methods depends on data availability and assessment goals, with probabilistic techniques recommended when sufficient distributional data exist to refine estimates beyond point values. A common example is the residential soil ingestion scenario for children, which accounts for age-specific hand-to-mouth behaviors leading to incidental ingestion of contaminated soil. In this scenario, young children aged 3 to less than 6 years are assumed to ingest an upper-percentile rate of 200 mg of soil per day, derived from tracer studies and modeling that reflect typical play activities in home environments. This rate integrates ingestion as the primary route while considering factors like soil concentration and exposure frequency to estimate potential lead or other contaminant uptake.
Assessments often follow tiered approaches, progressing from screening-level scenarios with conservative assumptions—such as maximum concentrations and high-end intake rates—to refined, site-specific evaluations using measured data and probabilistic modeling for greater accuracy. Screening tiers prioritize bounding estimates to quickly identify low-risk situations, while higher tiers incorporate detailed observations to reduce uncertainty in complex cases. Key factors influencing scenario design include exposure duration and frequency, distinguishing acute (short-term, high-intensity) from chronic (long-term, repeated) exposures, as well as sensitive subgroups like infants, who exhibit higher relative intake due to hand-to-mouth behaviors, or the elderly, who may face reduced mobility but prolonged residence in contaminated areas. These elements ensure scenarios align with the temporal and demographic aspects of potential risks.
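The deterministic and probabilistic tiers described above can be contrasted in a short sketch of the children's soil-ingestion scenario; the soil concentration, the lognormal ingestion-rate distribution, and its parameters are hypothetical choices made for illustration only.

```python
import random

C_SOIL = 0.01      # contaminant concentration in soil, mg/g (illustrative)
EF = 350 / 365     # exposure frequency as a fraction of days

# Deterministic screening tier: fixed upper-bound ingestion rate of
# 200 mg/day (0.2 g/day), as in the scenario described in the text.
deterministic_dose = 0.2 * C_SOIL * EF  # mg/day

# Probabilistic refinement: sample the ingestion rate from a hypothetical
# lognormal distribution instead of using a single point value.
random.seed(1)
doses = sorted(
    random.lognormvariate(-3.0, 0.8) * C_SOIL * EF  # rate sampled in g/day
    for _ in range(10_000)
)
p95 = doses[int(0.95 * len(doses))]  # 95th percentile dose, mg/day
print(round(deterministic_dose, 5))  # -> 0.00192
```

The probabilistic tier yields a full distribution, so assessors can report a central tendency and a high-end percentile rather than a single bounding number.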

Assessment Methods

Direct Measurement

Direct measurement in exposure assessment involves empirical techniques that quantify contact with hazardous agents through the collection of real-time or time-integrated samples from individuals or their environments. These methods provide direct evidence of exposure levels, capturing the magnitude, frequency, and duration of contact with contaminants such as chemicals, particulates, or biological agents. Unlike predictive modeling approaches, direct measurement relies on observational data to establish actual exposure scenarios, offering high specificity for validation in regulatory and research contexts. Personal monitoring represents a core component of direct measurement, utilizing portable devices worn by individuals to assess exposures in occupational or daily settings. Common tools include badges, air sampling pumps, and sensors that sample contaminants; for instance, diffusive samplers passively collect volatile organic compounds (VOCs) in air by allowing analytes to diffuse onto adsorbent materials over time, enabling lightweight and pump-free operation for extended periods. These devices, such as badge-style samplers, are particularly effective for monitoring worker exposure to gases and vapors without interfering with normal activities, providing time-integrated concentrations that reflect personal dose. Biological monitoring complements personal methods by measuring internal dose through biomarkers in bodily fluids or tissues, integrating all routes of uptake such as inhalation, ingestion, and dermal absorption. This approach analyzes samples like blood or urine to detect parent compounds or metabolites, offering a direct indicator of absorbed dose and systemic exposure. A representative example is the use of cotinine, a nicotine metabolite, in urine to quantify tobacco smoke exposure, where urinary levels above 50 ng/mL often indicate active smoking and levels between 1 and 50 ng/mL may reflect secondhand exposure. Environmental sampling employs fixed-site monitors to capture ambient concentrations of pollutants at specific locations, providing population-level data for exposure assessment in community or regional settings.
These stationary devices continuously or periodically measure air, water, or soil contaminants using standardized protocols; for example, EPA Method TO-15 involves collecting air samples in evacuated canisters followed by gas chromatography/mass spectrometry analysis to quantify up to 97 VOCs at parts-per-billion levels. Fixed-site networks, such as those operated under the Clean Air Act, track criteria pollutants like ozone and particulate matter, informing broader exposure patterns while highlighting spatial variability. Advantages of direct measurement include the provision of objective, verifiable data that minimizes reliance on assumptions, with modern equipment like wearable sensors enhancing precision. Since the 2010s, GPS-linked personal sensors have enabled spatiotemporal tracking of exposures, such as integrating air quality readings with location data to map individual mobility and pollutant hotspots. These tools offer immediate feedback for exposure mitigation, though they require calibration and can be resource-intensive compared to indirect modeling techniques.

Indirect Measurement and Modeling

Indirect measurement and modeling in exposure assessment rely on surrogate data sources, mathematical simulations, and probabilistic frameworks to estimate contaminant concentrations and exposures without direct monitoring of the target individual or environment. These techniques are essential for scenarios where direct sampling is impractical due to high costs, spatial-temporal limitations, or the need to evaluate hypothetical or future exposures, enabling predictions grounded in emission rates, transport mechanisms, and behavioral patterns. By integrating environmental physics with statistical variability, indirect methods facilitate population-level risk evaluations and support regulatory decision-making. A core component is exposure modeling, which uses simplified mass-balance representations to predict contaminant levels in defined compartments. In indoor air applications, the one-box model assumes uniform mixing within a single room and derives the steady-state concentration C as C = \frac{S}{Q + kV}, where S represents the source emission rate (e.g., in μg/h), Q the ventilation rate (m³/h), k the deposition or removal rate constant (h⁻¹), and V the room volume (m³). This balances influx from sources against losses via ventilation and surface deposition, providing a quick estimate for inhalation exposures in residential or occupational settings. The model has been foundational in occupational hygiene assessments and indoor air quality studies, often serving as a baseline for more complex refinements. Proxy data enhance model accuracy by incorporating real-world behavioral surrogates that influence exposure duration and intensity. For instance, activity pattern databases capture daily routines such as time spent in microenvironments (e.g., home, workplace, outdoors) or ingestion-related behaviors, allowing inference of contact frequencies without individual monitoring. The U.S.
EPA's Consolidated Human Activity Database (CHAD) aggregates nearly 180,000 individual study days of survey data from national and regional studies, detailing patterns like indoor occupancy (typically 80-90% of time for adults) and location-specific activities, which inform probabilistic exposure simulations for chemicals in air, water, or soil. CHAD data have been pivotal in constructing representative cohorts for U.S. populations, improving the realism of modeled exposures compared to uniform assumptions. Advanced software tools operationalize these concepts through integrated simulations. The EPA's Stochastic Human Exposure and Dose Simulation (SHEDS) model employs Monte Carlo methods to generate probabilistic distributions of exposures by stochastically sampling from activity diaries (e.g., CHAD), microenvironmental concentrations, and intake parameters across pathways like inhalation, ingestion, and dermal contact. SHEDS has been validated for applications such as pesticide residues and air toxics, producing outputs like 95th percentile exposure estimates that account for inter-individual variability, and it supports high-throughput screening for thousands of chemicals. This tool exemplifies how indirect modeling bridges deterministic equations with stochastic variability for aggregate risk assessments. Hierarchical approaches structure modeling from rudimentary to refined levels, starting with mass balance principles like the one-box model for screening and escalating to computational fluid dynamics (CFD) for precise dispersion predictions. Basic models offer computational efficiency for broad scenarios, estimating average concentrations via input-output fluxes, while CFD solves Navier-Stokes equations to simulate airflow, turbulence, and pollutant trajectories in three dimensions, capturing gradients in non-idealized spaces like ventilated rooms. This progression enhances resolution for critical cases, such as industrial emissions or urban air quality, with simpler tiers calibrated using empirical benchmarks to ensure reliability.
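The one-box steady-state formula introduced above can be sketched as a small function; the room parameters below are hypothetical, with all rates expressed per hour so the units stay consistent.

```python
def one_box_concentration(s_ug_per_h, q_m3_per_h, k_per_h, v_m3):
    """Steady-state well-mixed-room concentration: C = S / (Q + k*V).

    s_ug_per_h : source emission rate (ug/h)
    q_m3_per_h : ventilation rate (m3/h)
    k_per_h    : deposition/removal rate constant (1/h)
    v_m3       : room volume (m3)
    Returns concentration in ug/m3.
    """
    return s_ug_per_h / (q_m3_per_h + k_per_h * v_m3)

# Hypothetical room: 500 ug/h source, 50 m3/h ventilation,
# 0.2 /h deposition constant, 30 m3 volume.
c = one_box_concentration(500, 50, 0.2, 30)
print(round(c, 2))  # -> 8.93
```

Raising the ventilation rate Q or the removal constant k lowers the steady-state concentration, which is exactly the influx-versus-losses balance the text describes.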

Specialized Approaches

Receptor-Based Approach

The receptor-based approach in exposure assessment employs a bottom-up strategy that begins with the exposed individual or population, known as the receptor, to estimate contact with contaminants through their behaviors, locations, and susceptibilities. This approach integrates environmental concentrations with receptor-specific factors, such as daily activities and physiological traits, to quantify exposure via pathways like inhalation, ingestion, or dermal absorption. It is particularly suited for scenarios involving microexposure events in daily life, such as hand-to-mouth behaviors or time spent in contaminated microenvironments, and relies on tools like probabilistic modeling to account for variability across populations. The methodology follows a structured sequence of steps to build exposure estimates. First, the receptor population is identified, considering demographics like age, lifestage, health status, and genetic factors that influence susceptibility. Next, receptor activities are characterized using data from time-activity diaries, surveys, or behavioral models to map behaviors (e.g., dietary habits, occupational tasks, or subsistence practices like fishing) and locations (e.g., residential yards, workplaces, or urban versus rural settings). Finally, contact is estimated by route of exposure, combining these factors with measured or modeled contaminant concentrations in relevant media such as soil, water, air, or food. This approach often incorporates probabilistic techniques, such as Monte Carlo simulations, to represent distributions of exposure factors rather than point estimates. A representative example is the assessment of human health risks from contaminant residues in residential soil, where children may face elevated exposures through ingestion and dermal contact.
The chronic daily intake (CDI) is calculated using the formula: \text{CDI} = \frac{\text{IR} \times \text{C} \times \text{EF} \times \text{ED}}{\text{AT}} Here, IR is the ingestion rate (e.g., soil intake of 50-200 mg/day for toddlers based on hand-to-mouth activity); C is the contaminant concentration in the medium (e.g., lead level in soil, mg/g); EF is the exposure frequency (e.g., days/year spent in the yard); ED is the exposure duration (e.g., years of residence); and AT is the averaging time (typically ED for chronic non-cancer effects or lifetime for cancer risks). To derive this, start with the amount ingested per event (IR × C), multiply by the number of events over the period (EF × ED) to get the total intake, then divide by AT to obtain the average daily amount over the relevant timeframe; for a body-weight-normalized dose, divide further by average body weight (BW). This equation, adapted from standard exposure factors, allows estimation of aggregate risks from multiple microexposures, such as a child ingesting 100 mg/day of lead-laden soil at 0.01 mg/g concentration over 350 days/year for 6 years, averaged over 70 years. This approach offers advantages in capturing variability among vulnerable groups, such as children who exhibit higher soil ingestion rates (e.g., up to 10 times adults due to mouthing behaviors) or tribal populations with elevated fish consumption leading to greater bioaccumulative intake. By prioritizing receptor-specific data, it provides more tailored and realistic estimates compared to source-oriented methods, enhancing the equity of public health protections.
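The CDI derivation and the worked numbers in the text translate directly into code; this sketch reproduces the child soil-ingestion example (100 mg/day at 0.01 mg/g, 350 days/year for 6 years, averaged over a 70-year lifetime), leaving out the optional body-weight normalization.

```python
def chronic_daily_intake(ir_g_per_day, c_mg_per_g, ef_days_per_yr,
                         ed_years, at_days):
    """CDI (mg/day) = (IR * C * EF * ED) / AT."""
    return (ir_g_per_day * c_mg_per_g * ef_days_per_yr * ed_years) / at_days

cdi = chronic_daily_intake(
    ir_g_per_day=0.1,    # 100 mg/day soil ingestion
    c_mg_per_g=0.01,     # contaminant concentration in soil
    ef_days_per_yr=350,  # days/year spent in the yard
    ed_years=6,          # years of exposure
    at_days=70 * 365,    # lifetime averaging time (cancer-risk convention)
)
print(cdi)  # a small fraction of a milligram per day
```

Dividing the result by an average body weight (e.g., in kg) would yield the mg/kg-day form used for comparison against toxicity reference values.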

Source-Based Approach

The source-based approach to exposure assessment employs a top-down strategy that initiates with the quantification of contaminant releases from identified sources, followed by modeling of their transport, dispersion, and environmental fate to estimate concentrations at potential human or ecological receptors. This method is particularly suited for scenarios where source emissions can be reliably quantified, allowing predictions of exposure potential across broader areas without direct measurement at every receptor location. Unlike receptor-based tracing, it traces contaminants outward from emission points, facilitating proactive risk management in environmental and occupational settings. The process typically involves several key steps. First, source strength is quantified through emission inventories or direct measurements, capturing factors such as release rates, temperatures, and flow volumes for point sources like industrial stacks. Second, environmental fate and transport are simulated using physicochemical parameters, including partitioning coefficients that describe how contaminants distribute between media such as air, water, and soil—for instance, the octanol-water partition coefficient (Kow) influences bioaccumulation potential in aquatic systems. Finally, receptor concentrations are predicted via dispersion models that account for meteorological conditions, terrain, and chemical transformation, yielding time-averaged exposure estimates. A prominent example in atmospheric applications is the use of Gaussian plume models, which assume pollutants disperse in a bell-shaped plume under steady-state conditions, incorporating wind speed, atmospheric stability, and plume rise to calculate downwind concentrations. These models form the basis for advanced tools like AERMOD, a steady-state dispersion model developed collaboratively by the U.S. Environmental Protection Agency (EPA) and the American Meteorological Society, promulgated in 2006 as the preferred regulatory tool for evaluating pollutant impacts from industrial sources such as stack emissions.
AERMOD integrates planetary boundary layer processes for more accurate near- and far-field predictions, often applied to assess compliance with air quality standards. This approach is widely applied in regulatory permitting processes, such as Prevention of Significant Deterioration (PSD) reviews under the Clean Air Act, where it informs emission limits to protect ambient air quality, and in pollution control strategies to prioritize source reductions for persistent pollutants like volatile organic compounds. By focusing on source-receptor pathways, it supports cost-effective interventions, though it may require validation with receptor-based measurements for site-specific refinements.
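A simplified Gaussian plume calculation can illustrate the source-based chain from emission rate to downwind concentration. This sketch uses the standard ground-level, plume-centerline form with total ground reflection; the power-law dispersion coefficients are crude textbook-style assumptions for roughly neutral stability, not AERMOD's boundary-layer formulation.

```python
import math

def plume_centerline_concentration(q_g_per_s, u_m_per_s, x_m, h_m):
    """Ground-level centerline concentration (g/m3) downwind of a stack.

    C(x) = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 * sigma_z^2))
    with hypothetical power-law dispersion coefficients.
    """
    sigma_y = 0.08 * x_m ** 0.9  # lateral spread (m), assumed fit
    sigma_z = 0.06 * x_m ** 0.9  # vertical spread (m), assumed fit
    return (q_g_per_s / (math.pi * u_m_per_s * sigma_y * sigma_z)
            * math.exp(-h_m ** 2 / (2 * sigma_z ** 2)))

# Hypothetical stack: 10 g/s emission, 5 m/s wind, 50 m effective height.
c_500m = plume_centerline_concentration(10, 5.0, 500, 50)
c_2km = plume_centerline_concentration(10, 5.0, 2000, 50)
print(c_500m, c_2km)
```

Because the plume widens with distance, concentrations typically peak some distance downwind of an elevated stack and then decay, which is why regulatory models evaluate a grid of receptor distances rather than a single point.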

Regulatory Frameworks

Occupational Exposure Limits

Occupational exposure limits (OELs) establish the maximum allowable concentrations of hazardous substances in workplace air to protect workers from adverse health effects over a working lifetime. In the United States, the Occupational Safety and Health Administration (OSHA) enforces Permissible Exposure Limits (PELs), legal standards primarily adopted in 1971 from earlier federal guidelines and American Conference of Governmental Industrial Hygienists (ACGIH) recommendations. These PELs specify the highest concentrations to which workers may be exposed without expected harm, serving as enforceable benchmarks for compliance. Complementing PELs, ACGIH publishes Threshold Limit Values (TLVs), voluntary guidelines first issued in 1946 as Maximum Allowable Concentrations for 148 substances, with the TLV term introduced in 1956 to denote levels below which adverse effects are unlikely. OELs are derived from toxicological data, typically starting with the no-observed-adverse-effect level (NOAEL) identified from human or animal studies, which represents the highest exposure without detectable harm. This NOAEL is then adjusted by applying uncertainty or safety factors to account for variability in susceptibility, data quality, and extrapolation needs; for instance, a factor of 10 is commonly used for interspecies differences when animal data are extrapolated to humans. These factors ensure the limit provides a margin of safety, with the process emphasizing human studies where possible and case-by-case evaluation to balance protection and feasibility. Monitoring for compliance involves measuring airborne concentrations against specific metrics, including the 8-hour time-weighted average (TWA), which averages exposure over an 8-hour workday in a 40-hour workweek and must not be exceeded. Short-term exposure limits (STELs) complement TWAs by restricting peak exposures, typically to a 15-minute average not to be surpassed during the shift, preventing acute effects from brief high-level contacts.
OSHA requires employers to assess exposures when potential hazards exist and to maintain levels below these limits through engineering controls, administrative measures, or personal protective equipment. Globally, OELs vary by jurisdiction, with the European Union establishing them under the REACH regulation framework effective from 2007, which mandates risk assessments and derivation of derived no-effect levels (DNELs) for occupational scenarios as part of chemical registration. The EU also sets indicative OELs through directives like the Chemical Agents Directive (98/24/EC), promoting harmonization while allowing member states to adopt binding limits tailored to local needs. These EU approaches integrate with broader chemical-safety strategies but focus on worker protections distinct from ambient environmental guidelines.
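The compliance arithmetic above can be sketched as follows: the 8-hour TWA is the duration-weighted average over the shift (with unsampled time treated as zero exposure), compared against the PEL, while each short-term sample is compared against the STEL. The numeric limits in the usage line are placeholders, not values for any specific substance:

```python
def eight_hour_twa(samples):
    """8-hour TWA from (concentration mg/m^3, duration hours) pairs.

    Implements the standard formula sum(C_i * T_i) / 8; shift time not
    covered by a sample is implicitly counted as zero exposure.
    """
    return sum(c * t for c, t in samples) / 8.0

def compliant(samples, pel, stel=None):
    """True if the 8-hour TWA is at or below the PEL and, when a STEL
    is supplied, no individual short-term sample exceeds it."""
    if eight_hour_twa(samples) > pel:
        return False
    if stel is not None and any(c > stel for c, _ in samples):
        return False
    return True

# Four hours at 0.1 mg/m^3 plus four at 0.3 mg/m^3 -> TWA of 0.2 mg/m^3
twa = eight_hour_twa([(0.1, 4), (0.3, 4)])
```

Note how a shift can satisfy the TWA limit yet still fail compliance because one short-term sample exceeds the STEL, which is exactly the acute-effects gap STELs are designed to close.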

Environmental Exposure Standards

Environmental exposure standards establish thresholds for contaminants in air, water, and other media to protect the general public and ecosystems in non-occupational settings, differing from occupational limits that focus on worker protection. These standards are developed from scientific assessments of toxicity, exposure patterns, and sensitive populations, aiming to prevent adverse effects at the population level. In the United States, the Environmental Protection Agency (EPA) plays a central role in setting such standards under laws like the Clean Air Act and the Clean Water Act. The National Ambient Air Quality Standards (NAAQS), established by the EPA under the Clean Air Act of 1970 and first promulgated in 1971, set permissible levels for six criteria air pollutants: carbon monoxide, lead, nitrogen dioxide, ozone, particulate matter, and sulfur dioxide. Primary standards are designed to protect public health with an adequate margin of safety, particularly for vulnerable groups like children and those with respiratory conditions, while secondary standards safeguard public welfare, including visibility and ecosystems. For ozone, a key pollutant contributing to smog, the EPA revised the primary 8-hour standard in 2015 to 70 parts per billion, reflecting updated scientific evidence on respiratory and cardiovascular risks. NAAQS are reviewed every five years, with levels adjusted based on epidemiological and toxicological data to ensure ongoing protection. Reference doses (RfDs) provide chronic oral thresholds for non-carcinogenic chemicals, representing an estimate of the daily exposure likely to be without appreciable risk over a lifetime for the general population, including sensitive subgroups. The RfD is derived from the no-observed-adverse-effect level (NOAEL) or a similar point of departure by dividing by uncertainty factors (UFs), typically ranging from 10 to 1,000, that account for interspecies extrapolation, intraspecies variability, and data limitations. For example, the EPA's Integrated Risk Information System (IRIS) database lists an RfD for inorganic arsenic of 0.00006 mg/kg-day (0.06 µg/kg-day) as of the 2025 assessment, based on human epidemiological studies of ischemic heart disease, with UFs applied for study duration.
These values guide risk assessments for environmental contaminants in drinking water, food, and soil, ensuring exposures remain below levels associated with developmental or systemic toxicity. Under Section 304(a) of the Clean Water Act, the EPA develops national recommended water quality criteria for toxic pollutants to protect aquatic life and human health in surface waters. These criteria specify maximum concentrations of pollutants, such as metals or pesticides, that should not cause acute or chronic harm to fish, invertebrates, or other aquatic organisms, while also considering risks to humans from fish consumption. For instance, the acute criterion for cadmium in freshwater is 1.8 µg/L (at a water hardness of 100 mg/L as CaCO3), set to prevent lethality in sensitive species like salmonids and derived from toxicity tests with water-chemistry adjustments. States adopt these criteria into their water quality standards to maintain designated uses like recreation or drinking water supply, with updates reflecting new ecotoxicological data. Internationally, the World Health Organization (WHO) provides global air quality guidelines to inform national policies; a 2021 update significantly tightened limits for fine particulate matter (PM2.5) to an annual mean of 5 µg/m³ and a 24-hour mean of 15 µg/m³, based on evidence linking even low levels to morbidity and mortality. The guidelines also cover ozone (100 µg/m³ as a daily maximum 8-hour mean) and other pollutants, offering interim targets for countries with higher pollution burdens to progressively approach the strictest levels. Adopted in the policy frameworks of over 100 nations, they emphasize reducing ambient exposures to minimize the global health impacts of air pollution.
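The RfD derivation described above, and its use in non-cancer screening via a hazard quotient, can be sketched as below. The NOAEL, uncertainty factors, and intake values are hypothetical illustrations, not figures from any IRIS assessment:

```python
def reference_dose(noael, uf_interspecies=10.0, uf_intraspecies=10.0, uf_other=1.0):
    """RfD (mg/kg-day) = point of departure / product of uncertainty factors.

    The default 10x interspecies and 10x intraspecies factors give the
    common composite UF of 100; uf_other covers additional adjustments
    (e.g., database or duration limitations).
    """
    return noael / (uf_interspecies * uf_intraspecies * uf_other)

def hazard_quotient(chronic_daily_intake, rfd):
    """HQ = CDI / RfD; values above 1 flag potential non-cancer concern."""
    return chronic_daily_intake / rfd

# Hypothetical example: an animal NOAEL of 1.0 mg/kg-day with 10x10 factors
rfd = reference_dose(1.0)          # -> 0.01 mg/kg-day
hq = hazard_quotient(0.002, rfd)   # -> 0.2, below the level of concern
```

The same screening pattern applies media by media: estimate a chronic daily intake from concentration and exposure-factor data, then compare it against the RfD.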

Challenges and Limitations

Systematic Errors

Systematic errors in exposure assessment are consistent biases that distort estimates in a predictable direction, often leading to over- or underestimation of exposure across studies or populations. These errors arise from flaws in measurement techniques, modeling frameworks, or input data, and they differ from random variability by producing non-zero mean deviations from true values. Unlike random uncertainties, systematic errors can propagate through regulatory decisions and health evaluations if not identified and corrected. One primary type of systematic error is measurement bias, which occurs when sampling devices fail to accurately capture environmental contaminants because of physical or operational limitations. For instance, aerosol samplers can underestimate particle concentrations in windy conditions because wind speed alters collection efficiency, causing non-isokinetic sampling that favors larger particles and misses the finer ones most relevant to exposure. This bias is particularly pronounced in outdoor air monitoring, where unaccounted meteorological effects lead to consistently lower reported levels than actual concentrations. Another type involves errors from modeling assumptions that do not reflect real-world dynamics, such as assuming steady-state conditions in environments with transient releases or time-varying rates. In biomarker-based assessments, steady-state models unrealistically presume constant biomarker levels in the body, producing biased exposure-rate estimates when the actual kinetics involve time-dependent accumulation or elimination. These assumptions can overestimate or underestimate exposures in dynamic settings like industrial accidents or episodic pollution events. Systematic errors also stem from inadequate or outdated exposure factors, such as activity-pattern data that do not account for behavioral changes over time. For example, using pre-1990s activity data for children can lead to overestimation of soil ingestion or hand-to-mouth contact in modern settings, where reduced outdoor play and improved hygiene alter actual behaviors.
Such outdated inputs systematically inflate assessments for vulnerable populations, perpetuating conservative but inaccurate regulatory thresholds. A notable historical case is the underestimation of asbestos exposures in occupational assessments before the 1970s, when early monitoring relied on coarse gravimetric methods and ignored fiber size distributions, leading to reported levels far below those causing asbestosis and mesothelioma. This systematic underestimation delayed stringent regulation, with the U.S. OSHA limit falling from 12 fibers per cubic centimeter in 1971 to 0.1 in 1994, and contributed to thousands of preventable disease cases. To mitigate systematic errors, sensitivity analysis is employed to evaluate how variations in key parameters, such as sampler efficiency or model assumptions, affect overall estimates, allowing identification of influential biases for targeted corrections. By systematically perturbing inputs and observing output changes, this approach quantifies error propagation and supports more robust assessments, though it requires validation data to confirm directional impacts.
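The perturbation procedure just described can be sketched as a one-at-a-time (OAT) sensitivity analysis. The simple intake-dose model and the ±20% perturbation size are assumptions chosen for illustration:

```python
def dose(params):
    """Toy intake-dose model: concentration x intake rate / body weight."""
    return params["conc"] * params["intake"] / params["bw"]

def oat_sensitivity(model, baseline, rel_change=0.2):
    """One-at-a-time sensitivity: perturb each input up and down by
    rel_change and report the resulting relative swing in the output,
    normalized by the baseline output."""
    base = model(baseline)
    swings = {}
    for name in baseline:
        hi = dict(baseline, **{name: baseline[name] * (1 + rel_change)})
        lo = dict(baseline, **{name: baseline[name] * (1 - rel_change)})
        swings[name] = (model(hi) - model(lo)) / base
    return swings

swings = oat_sensitivity(dose, {"conc": 10.0, "intake": 0.05, "bw": 70.0})
# conc and intake scale the dose linearly (swing +0.4 each); bw sits in
# the denominator, so its swing is negative and slightly larger in magnitude.
```

Ranking the absolute swings identifies which input biases would most distort the estimate and therefore deserve targeted correction or better data.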

Uncertainty and Variability

In exposure assessment, uncertainty refers to a lack of knowledge about factors influencing estimates, such as incomplete data or modeling assumptions, while variability describes the inherent heterogeneity in exposures across individuals, populations, or time periods due to differences in behaviors, environments, or physiological traits. These elements are distinct yet interconnected: variability widens the range of possible exposures, while uncertainty reduces confidence in the estimates of that range. Parameter variability arises from fluctuations in key inputs, such as body-weight distributions, which vary by age, sex, and ethnicity and are derived from large-scale surveys like the Continuing Survey of Food Intakes by Individuals (CSFII). For instance, adult body weights in the U.S. follow a right-skewed distribution with means around 70-80 kg, but the upper percentiles can exceed 110 kg, directly affecting dose calculations in ingestion or dermal exposure models. Scenario uncertainty, on the other hand, stems from unpredictable future conditions, such as changes in emission rates due to regulatory shifts or technological advances, which complicate long-term projections in environmental scenarios. To propagate these sources through models, Monte Carlo simulation is a widely adopted technique that generates probability distributions of exposure by repeatedly sampling from input distributions, allowing assessors to quantify the joint effects of variability and uncertainty. In a two-stage approach, the method first accounts for population variability by sampling across individuals, then incorporates uncertainty by varying model parameters across iterations, producing outputs like cumulative distribution functions that represent the range of plausible exposures. Variability is commonly characterized using percentiles from these distributions, with the 95th percentile often employed to estimate high-end exposures for risk management, capturing elevated risks without assuming worst-case values for all factors.
This approach enables decision-makers to identify protective thresholds, such as in contaminated-site evaluations where the 95th-percentile exposure informs cleanup standards. Advances in handling uncertainty include Bayesian methods, which have gained prominence since the early 2000s for updating exposure estimates as new data become available, integrating prior knowledge with observed evidence through probabilistic frameworks like Markov chain Monte Carlo (MCMC) algorithms. These methods allow dynamic refinement of parameter distributions, enhancing the adaptability of assessments to emerging information on variability sources.
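The variability stage of the Monte Carlo procedure described above can be sketched as follows. The lognormal and truncated-normal input distributions and their parameter values are assumptions for illustration, not survey-derived figures:

```python
import random

def simulate_doses(n=20000, seed=42):
    """Sample a population of ingestion doses (mg/kg-day) by drawing each
    model input from an assumed distribution, propagating inter-individual
    variability jointly through the dose equation conc * intake / bw."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        conc = rng.lognormvariate(0.0, 0.5)             # mg/L, assumed lognormal
        intake = max(rng.normalvariate(2.0, 0.4), 0.1)  # L/day, truncated at 0.1
        bw = rng.lognormvariate(4.3, 0.2)               # kg, median ~ 74
        doses.append(conc * intake / bw)
    return sorted(doses)

doses = simulate_doses()
mean_dose = sum(doses) / len(doses)
p95 = doses[int(0.95 * len(doses))]  # high-end estimate used for screening
```

A second stage would wrap this loop in an outer loop that redraws the distribution parameters themselves from uncertainty distributions, yielding a family of exposure curves rather than a single one.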
