
Toxicity


Toxicity denotes the degree to which a substance or agent can produce harmful or adverse effects in living organisms, ranging from mild irritation to death, with the severity determined primarily by the dose administered. This concept underpins toxicology, the scientific discipline studying such effects, and is encapsulated in the foundational principle articulated by Paracelsus: "the dose makes the poison," meaning that all substances possess potential toxicity, but harm manifests only above certain exposure thresholds.
Central to understanding toxicity is the dose-response relationship, which quantifies how the magnitude of exposure correlates with the intensity and type of biological response, often plotted as a curve showing increasing effects with higher doses until a plateau or maximum is reached. Acute toxicity arises from short-term, high-level exposures and is measured by metrics like the median lethal dose (LD50), defined as the amount of a substance required to kill 50% of a test population, typically in milligrams per kilogram of body weight; lower LD50 values indicate greater toxicity. Chronic toxicity, conversely, involves prolonged low-level exposures leading to cumulative damage, such as organ injury or carcinogenesis, and is assessed through long-term studies rather than single-dose endpoints. Toxicity manifests through various routes of exposure—ingestion, inhalation, dermal contact, or injection—and depends on factors including the chemical's inherent properties, the organism's susceptibility, and environmental conditions, with selective toxicity enabling targeted effects, as in pharmaceuticals that harm pathogens more than the host. Controversies in toxicity assessment include ethical concerns over animal-based LD50 testing, which has prompted development of alternative in vitro and computational models, though these must be validated against empirical data for reliability. Regulatory frameworks, such as those from the EPA and CDC, classify substances by toxicity categories to guide exposure standards, emphasizing empirical measurement over speculative risk without dose specificity.

Fundamentals

Core Definition and Paracelsus Principle


Toxicity refers to the capacity of a substance or agent to induce adverse effects in living organisms, encompassing cellular damage, organ dysfunction, or death, with outcomes determined by exposure parameters such as dose, duration, and route. These effects arise from interactions between the toxicant and biological targets, often disrupting normal physiological processes like enzyme function or membrane integrity. In toxicology, toxicity is quantified through dose-response assessments, where the severity correlates with the amount absorbed relative to body weight and sensitivity.
The foundational Paracelsus principle, articulated by the Swiss physician and alchemist Paracelsus (Philippus Aureolus Theophrastus Bombastus von Hohenheim, 1493–1541), asserts that "Sola dosis facit venenum"—the dose alone makes the poison—meaning all substances can be toxic or therapeutic depending on quantity, as even essentials like water or oxygen become harmful in excess. Paracelsus derived this from empirical observations, including analyses of occupational exposures among miners to metals like mercury and arsenic, which informed his rejection of Galenic humoralism in favor of chemical medicine. This dose-dependent framework revolutionized toxicology by establishing that toxicity is not absolute but relational, enabling distinctions between poisons and medicines via controlled administration. Paracelsus' contributions extended to pioneering chemical assays and animal experimentation for toxicity testing, laying groundwork for modern risk assessment, where thresholds like no-observed-adverse-effect levels (NOAELs) quantify safe exposures. The principle implies a continuum of responses, from hormesis (beneficial low-dose effects) to overt toxicity, emphasizing causal links between exposure magnitude and biological perturbation over intrinsic malevolence of agents. Empirical validation persists in regulatory standards, such as those of U.S. agencies, which derive permissible exposure limits from dose-response curves.

Etymology and Conceptual Evolution

The term "toxicity" entered English in 1880, formed by adding the suffix "-ity" to "toxic," denoting the state or quality of being poisonous. The root "toxic" originates from the late Latin toxicus, borrowed from the Greek toxikon (τοξικόν), literally meaning "poison for or of arrows" or "bow poison," referring to substances applied to arrowheads for hunting or warfare. This etymon traces further to toxon (τόξον), the ancient Greek word for "bow" or "arc," highlighting the historical association of toxicity with weaponized venoms derived from plants, animals, or minerals. Conceptually, toxicity initially connoted acute lethality in targeted applications, as evidenced in Homeric epics around the 8th century BCE, where poisoned arrows symbolized swift, irreversible harm. By the 1st century CE, Greek physicians like Dioscorides (circa 40–90 CE) expanded the idea in works such as De Materia Medica, classifying substances by their poisonous potentials beyond weaponry, integrating empirical observations of dose, exposure route, and physiological effects. This marked a shift from mythic or ritualistic views of poisons—prevalent in ancient Egyptian and Mesopotamian texts dating to 3000 BCE, which framed toxicity in magical or alchemical terms—to a proto-scientific framework emphasizing causal mechanisms of harm. The modern conceptualization crystallized in the 16th century with Paracelsus (1493–1541), who asserted that "the dose makes the poison," reframing toxicity not as an intrinsic property of substances but as a quantitative relationship between exposure level and biological response, applicable to both medicinal agents and environmental hazards. This principle underpinned the coining of "toxicology" in the mid-17th century from Greek toxikon and logos (study), evolving by the 19th century into a discipline quantifying adverse effects via metrics like the LD50 (lethal dose for 50% of subjects), distinguishing acute from chronic toxicity based on temporal dynamics of exposure and latency to response. Such evolution reflects a progression from qualitative, context-specific dangers to rigorous, evidence-based assessments prioritizing dose-response over anecdotal lethality.

Historical Development

Ancient and Medieval Foundations

Concepts of toxicity emerged in ancient civilizations through observations of poisonous substances in nature and their effects on humans and animals. The Ebers Papyrus, dating to approximately 1550 BCE in Egypt, documents treatments for various disorders caused by animal, plant, and mineral toxins, including prescriptions involving incantations and herbal remedies to expel poisons. Similarly, Vedic texts composed around 1400 BCE reference poison arrows, indicating early awareness of lethal projectiles enhanced with toxic agents. In ancient Greece and Rome, systematic study advanced the understanding of poisons. Hippocrates (c. 460–370 BCE) contributed to clinical toxicology by cataloging poisons and differentiating their therapeutic from harmful doses, laying groundwork for dose-dependent effects. Dioscorides (c. 40–90 CE), a Greek physician serving in the Roman army, authored De Materia Medica around 60–70 CE, describing over 600 plants with details on their toxic properties, antidotes, and forensic implications, which served as a foundational pharmacopeia for centuries. Pliny the Elder (23–79 CE) expanded on these in his Naturalis Historia, compiling knowledge of numerous plant, animal, and mineral poisons prevalent in Roman society, where intentional poisoning was a noted method of assassination. King Mithridates VI of Pontus (r. 120–63 BCE) exemplified practical experimentation by daily self-administration of poisons to build tolerance, culminating in a universal antidote formula after consulting experts. Medieval scholarship, particularly in the Islamic world, preserved and refined ancient toxicological knowledge amid alchemical pursuits. Avicenna (Ibn Sina, 980–1037 CE) detailed clinical approaches to oral poisoning in his Canon of Medicine, recommending specific materia medica like antidotes derived from plants and minerals to counteract venom and other toxins based on observed symptoms. Arabic texts, such as those by Ibn Wahshiya (9th–10th century), classified poisons from animals, plants, and minerals, emphasizing symptom diagnosis and remedies, influencing both Eastern and Western traditions. In Europe, alchemy intertwined with toxicology, as practitioners explored poisonous metals like arsenic—widely used and feared for its subtlety—in elixirs and transmutations, though empirical testing remained limited; Pietro d'Abano (c. 1257–1316) prescribed emetic methods in his Trattati dei veleni to expel mineral poisons like litharge. Arsenic gained notoriety as a covert agent in political and social poisonings during this era.

Modern Toxicology from 19th Century to Present

The emergence of toxicology as a distinct scientific discipline occurred in the early 19th century, driven by advances in analytical chemistry and the need for forensic evidence in poisoning cases. Mathieu Orfila, a Spanish-born chemist and physician who became dean of the Paris Medical Faculty, published Traité des Poisons in 1814, the first comprehensive treatise systematically classifying poisons, detailing their detection through animal experiments, clinical observations, and post-mortem analyses, and establishing reliable methods to identify substances like arsenic in biological tissues. Orfila's work refuted prior assumptions that poisons were undetectable after assimilation, proving instead that chemical traces persisted, thereby founding modern forensic toxicology and influencing legal proceedings, such as the 1840 Lafarge trial where he testified on arsenic detection. This period also saw the invention of the Marsh test in 1836 by James Marsh, a sensitive qualitative method for detecting arsenic via hydrogen arsenide gas production, which reduced false negatives in forensic investigations and spurred further chemical assays for other metallic poisons such as antimony and mercury. By the mid-19th century, toxicology expanded beyond forensics to address industrial exposures amid the Industrial Revolution, with studies documenting occupational hazards such as metal poisoning in factory workers and aniline dye-related bladder cancers, prompting early regulatory efforts like Britain's Factory Acts of 1833 and 1844 limiting child labor in toxic environments. The late 19th century introduced quantitative approaches, including dose-response concepts refined from Paracelsus' principle but empirically tested via animal models, and the differentiation of toxicology from pharmacology, emphasizing adverse rather than therapeutic effects. The 20th century marked toxicology's maturation into a multidisciplinary field, propelled by wartime chemical agents and post-war synthetic chemicals. Fritz Haber's development of chlorine and other chemical warfare agents during World War I (1915–1918) necessitated studies on toxicity and antidotes, while the 1920s saw J.W. Trevan introduce the LD50 metric in 1927—a statistically derived median lethal dose from animal bioassays—to standardize potency assessments for pharmaceuticals and poisons. Post-World War II, the widespread use of organochlorine pesticides like DDT (introduced 1940s) revealed bioaccumulation and ecological disruptions, culminating in Rachel Carson's 1962 Silent Spring, which documented avian reproductive failures and spurred environmental regulation, leading to the U.S. ban on DDT in 1972. Regulatory frameworks solidified in this era: the U.S. Pure Food and Drug Act of 1906 required toxicity labeling, followed by the 1938 Food, Drug, and Cosmetic Act mandating safety data, and the establishment of the Environmental Protection Agency in 1970 to oversee chemical risks under laws like the Toxic Substances Control Act of 1976. Analytical techniques advanced with gas chromatography (1950s) and mass spectrometry (1960s), enabling trace-level detection and metabolite identification, while mechanistic insights grew through biochemical studies of enzyme inhibition, such as organophosphate-cholinesterase interactions. In the late 20th and early 21st centuries, toxicology integrated molecular biology, with genomics and proteomics elucidating toxicogenomics—gene expression changes from exposures—and addressing emerging threats like endocrine-disrupting chemicals (e.g., bisphenol A) and nanomaterials, whose unique size-dependent reactivity poses novel risks not captured by traditional metrics. The Society of Toxicology, founded in 1961, formalized professional standards, and computational models like physiologically based pharmacokinetic simulations (developed 1980s onward) reduced animal testing by predicting human exposures.
Despite these advances, challenges persist in extrapolating animal data to humans and evaluating low-dose chronic effects, underscoring ongoing reliance on empirical validation over assumption-driven models.

Types of Toxic Agents

Chemical Toxins

Chemical toxins, also known as toxicants, are synthetic or naturally occurring substances that exert harmful effects on biological systems through chemical interactions, distinct from biological toxins produced by living organisms. These agents include inorganic compounds like heavy metals and organic chemicals such as pesticides and solvents, with toxicity determined by factors including dose, exposure duration, route of exposure, and individual susceptibility. Unlike biological toxins, which often involve enzymatic or protein-based mechanisms, chemical toxins typically disrupt cellular processes via direct molecular binding or reactive intermediates. Chemical toxins are classified by chemical structure, target organ, or effect type, encompassing heavy metals (e.g., lead, mercury, arsenic), volatile organic compounds (VOCs), pesticides (e.g., organophosphates), and industrial pollutants like dioxins and polychlorinated biphenyls (PCBs). Heavy metals accumulate in tissues, causing neurotoxicity (e.g., lead impairs cognitive development in children through interference with neuronal function), nephrotoxicity, and carcinogenicity (e.g., arsenic induces DNA damage leading to skin cancer). Pesticides, such as organophosphates, inhibit acetylcholinesterase enzymes, resulting in acute cholinergic crises characterized by hypersecretion and convulsions. Mechanisms of chemical toxicity often involve covalent binding to biomolecules, generation of reactive oxygen species inducing oxidative stress, or disruption of endocrine signaling. For instance, PCBs persist in the environment and bioaccumulate, linked to reproductive effects like decreased fertility and developmental delays in offspring through interference with thyroid hormone signaling and immune function. VOCs, emitted from paints and cleaners, cause immediate irritant effects on eyes and airways, with chronic exposure associated with liver and kidney damage and central nervous system depression. Formaldehyde, a common indoor air pollutant, acts as a carcinogen by forming DNA adducts, increasing nasopharyngeal cancer risk at occupational levels above 1 ppm. Quantification of chemical effects relies on dose-response relationships, where low doses may elicit no observable adverse effects, but thresholds exist beyond which harm occurs, as evidenced by LD50 values for acutely toxic compounds (e.g., oral LD50 values near 15 mg/kg in rats). Environmental releases of gases like chlorine or ammonia have caused acute injuries in industrial incidents, with equipment failure contributing to 41–46% of cases per CDC surveillance data from 2000–2013. Regulatory classifications, such as those under the Globally Harmonized System of Classification and Labelling of Chemicals (GHS), categorize chemicals by hazard severity, informing safe handling based on empirical toxicity data.

Biological Toxins

Biological toxins are poisonous substances produced by living organisms, including microorganisms, plants, and animals, that exert adverse effects on other organisms through specific biochemical interactions. These toxins, often proteins or polypeptides, differ from chemical toxins in their biological origin and high target specificity, enabling potent disruption of cellular processes at low doses. For instance, botulinum toxin, produced by the bacterium Clostridium botulinum, has an estimated human lethal dose of approximately 1 ng/kg body weight via inhalation, making it one of the most toxic known substances. Microbial toxins, derived from bacteria, fungi, algae, or protozoa, represent a major category. Bacterial exotoxins, secreted proteins like tetanus toxin from Clostridium tetani or diphtheria toxin from Corynebacterium diphtheriae, typically act by interfering with host cell signaling, enzymatic activity, or membrane function; tetanus toxin, for example, blocks inhibitory neurotransmitters, causing muscle spasms. Endotoxins, such as lipopolysaccharides from Gram-negative bacteria, trigger systemic inflammatory responses upon release from dying cells. Fungal mycotoxins, including aflatoxins from Aspergillus species, contaminate food and induce liver damage through DNA adduct formation and mutagenesis. Plant-derived phytotoxins, such as ricin from Ricinus communis castor beans, inhibit ribosomal protein synthesis, leading to cell death; a dose of 22 micrograms per kilogram can be fatal in humans. Animal toxins, often delivered via venoms or secretions, include neurotoxins like tetrodotoxin from pufferfish (Tetraodontidae), which selectively blocks voltage-gated sodium channels, causing rapid paralysis and respiratory failure, with an LD50 of about 8 micrograms per kilogram in mice. Snake venoms contain enzymatic components like phospholipases that disrupt cell membranes and induce hemorrhage. These toxins' mechanisms generally involve receptor binding, enzymatic cleavage of key molecules, or ion channel modulation, underscoring their evolutionary role in defense or predation. Biological toxins pose risks in natural exposures, food contamination, and potential bioterrorism due to their stability, ease of production, and difficulty in detection. Regulatory frameworks, such as the U.S. Select Agents list, classify high-risk examples like botulinum toxin and ricin as requiring strict controls because of their low LD50 values and lack of immediate antidotes in many cases. Despite their toxicity, some, like botulinum toxin, have therapeutic applications in medicine at controlled doses for conditions such as muscle spasms.

Physical and Radiative Agents

Physical agents refer to non-chemical and non-biological environmental factors that induce adverse effects through direct thermal, mechanical, electrical, acoustic, or radiative mechanisms, distinct from the molecular-level interactions of chemical toxins. These include extreme temperatures, pressure changes, electrical currents, noise, and whole-body or localized vibration, which can cause tissue damage, physiological dysfunction, or chronic conditions depending on dose and exposure duration. Thermal agents exemplify physical toxicity via heat or cold stress. Heat stroke, where core body temperature exceeds 40°C, denatures proteins, disrupts cellular membranes, and triggers systemic inflammation, potentially leading to multi-organ failure in severe cases; occupational exposure limits are set at wet-bulb globe temperatures below 30°C for heavy work to prevent such effects. Hypothermia below 35°C impairs neuronal signaling and cardiac function, with mortality rates approaching 40% in untreated severe cases. Mechanical and pressure-related agents cause barotrauma or decompression injury; rapid pressure changes, as in diving beyond 10 meters without decompression stops, generate gas bubbles in tissues and blood, leading to emboli and neurological deficits, with incidence rates up to 2–3% in recreational divers exceeding safety protocols. Electrical agents induce toxicity through current passage, where alternating currents above 10 mA across the chest can provoke ventricular fibrillation by depolarizing myocardial cells, resulting in cardiac arrest; fatality correlates with current density exceeding 1 A/cm². Noise and vibration represent acoustic and oscillatory physical agents. Chronic exposure to noise levels above 85 dBA over 8 hours damages cochlear hair cells via mechanical overstimulation and oxidative stress, causing permanent threshold shifts and tinnitus, with occupational hearing loss affecting 16% of U.S. workers per NIOSH data. Vibration, particularly hand-arm types at frequencies of 8–16 Hz and accelerations over 2.8 m/s², induces vasospasm and neuropathy akin to Raynaud's phenomenon, with prevalence up to 20% in power-tool operators after 5–10 years. Radiative agents encompass ionizing and non-ionizing radiation across the electromagnetic spectrum, exerting toxicity primarily through energy deposition in biological tissues. Ionizing radiation—alpha particles, beta particles, gamma rays, X-rays, and neutrons—ionizes atoms, producing free radicals that cleave DNA strands and induce chromosomal aberrations; absorbed doses above 0.5 Gy acutely suppress hematopoiesis, while chronic low doses elevate lifetime cancer risk by roughly 0.5–1% per 100 mSv via stochastic mutagenesis. Acute radiation syndrome manifests in phases, with the gastrointestinal subsyndrome at 6–10 Gy causing epithelial sloughing and death within days. Non-ionizing radiative agents, including ultraviolet (UV), infrared (IR), microwaves, and radiofrequency fields, cause thermal or photochemical damage without ionization. UV-B (280–315 nm) exposure exceeding 200 J/m² induces cyclobutane pyrimidine dimers in DNA, correlating with 90% of non-melanoma skin cancers; cumulative doses over 10,000 J/m² lifetime increase odds by 1.5–2 times. IR and microwaves elevate tissue temperatures, with power densities above 10 mW/cm² inducing cataracts or burns via dielectric heating, as observed in radar operators. The dose-response principle applies: low-level exposures (e.g., background radiation at 2–3 mSv/year) pose negligible risk, while high doses deterministically overwhelm repair mechanisms.
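The stochastic risk figures above lend themselves to a one-line calculation. The minimal Python sketch below applies linear no-threshold arithmetic using the roughly 0.5% excess lifetime cancer risk per 100 mSv cited above; the coefficient is an illustrative assumption, not a regulatory value.

```python
def lnt_excess_risk(dose_msv: float, risk_per_100_msv: float = 0.005) -> float:
    """Excess lifetime cancer risk under the LNT assumption: linear in dose,
    no threshold. The 0.5%/100 mSv coefficient is illustrative only."""
    return (dose_msv / 100.0) * risk_per_100_msv

# Lifetime background exposure (~2.5 mSv/year over 70 years) vs. a 500 mSv dose.
print(f"{lnt_excess_risk(2.5 * 70):.4f}")  # ~0.0088 excess lifetime risk
print(f"{lnt_excess_risk(500):.4f}")       # ~0.0250 excess lifetime risk
```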

Measurement and Quantification

Dose-Response Frameworks

The dose-response relationship in toxicology describes the quantitative association between the administered dose of a toxic agent and the severity or incidence of an adverse effect, forming the cornerstone of risk assessment and regulatory standards. This framework posits that the magnitude of response generally increases with dose, though the shape of the curve varies by agent, endpoint, and biological context. Empirical data from controlled experiments, such as rodent bioassays, demonstrate that responses can be graded (continuous, like enzyme inhibition) or quantal (all-or-nothing, like mortality), with models fitted to data using statistical methods like probit or logit regression to estimate parameters such as the median effective dose (ED50) or median lethal dose (LD50). Threshold models assume a dose below which no adverse effect occurs, reflecting biological repair mechanisms or homeostatic adaptations that prevent harm at low exposures. For non-genotoxic agents, such as many industrial chemicals, this framework aligns with observations where cellular defenses mitigate low-level insults, supported by histopathological data showing no-observed-adverse-effect levels (NOAELs) in subchronic studies. The benchmark dose (BMD) approach refines this by statistically deriving a lower confidence limit (BMDL) for a specified benchmark response, like a 10% increase in incidence, offering a data-driven alternative to NOAELs that accounts for study design variability. Regulatory bodies like the U.S. EPA employ BMD modeling for deriving reference doses, as evidenced in analyses of over 1,000 datasets where BMDL05 values (5% response level) provided more precise potency estimates than traditional methods. In contrast, the linear no-threshold (LNT) model extrapolates a straight-line relationship from high-dose data to zero, assuming risk without a safe threshold, primarily applied to genotoxic carcinogens and ionizing radiation. Originating from atomic bomb survivor studies and supported by mutagenesis assays, LNT underpins radiation protection standards, such as the International Commission on Radiological Protection's dose limits of 1 mSv/year for the public. However, critiques highlight its failure in low-dose regimes, where epidemiological data from exposed cohorts show no elevated cancer risk below 100 mSv, and toxicological stress tests reveal overestimation of risks compared to threshold or hormetic alternatives. Peer-reviewed evaluations, including those of 1,500+ chemicals, indicate LNT's ideological origins in mid-20th-century advocacy rather than consistent empirical fit across datasets. Hormesis represents a biphasic dose-response framework where low doses stimulate adaptive responses, enhancing resistance or function, while higher doses inhibit or harm, characterized by a J- or U-shaped curve. Meta-analyses of thousands of dose-response datasets in toxicology reveal hormetic responses in approximately 30–40% of cases, particularly for growth, longevity, and stress resistance endpoints in model organisms like yeast, nematodes, and rodents. Evidence includes over 3,000 peer-reviewed studies documenting low-dose benefits from agents like radiation, heavy metals, and phytochemicals, attributed to mechanisms such as upregulated antioxidant enzymes or stress-response pathways. Despite robust preclinical support, hormesis faces regulatory resistance due to precautionary paradigms favoring LNT, though probabilistic frameworks integrating mode-of-action data increasingly incorporate it for refined risk assessments. Advanced frameworks, such as mode-of-action (MOA)-based probabilistic models, integrate toxicogenomic data and key event analysis to characterize dose-response shapes, distinguishing linear from nonlinear behaviors via biomarkers of adversity. For instance, the U.S. National Toxicology Program's genomic dose-response modeling uses Bayesian approaches to quantify uncertainty in low-dose extrapolations, applied to endpoints like neoplastic lesions in 2-year rodent bioassays. These methods emphasize causal chains—exposure leading to molecular initiating events, cellular responses, and organ-level toxicity—prioritizing empirical validation over default assumptions, as seen in evaluations where MOA evidence shifted assessments from LNT to threshold models for specific chemicals.
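To make the BMD idea concrete, the following Python sketch fits a two-parameter log-logistic model to hypothetical quantal bioassay data and inverts it for the dose producing a 10% extra risk (BMD10). The data and starting values are fabricated; a regulatory BMDL would additionally require a profile-likelihood or bootstrap lower bound.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical quantal bioassay: dose (mg/kg) vs. fraction of animals responding.
doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0, 100.0])
frac = np.array([0.02, 0.05, 0.10, 0.30, 0.62, 0.90])

def log_logistic(d, a, b):
    """P(response) = 1 / (1 + exp(-(a + b*ln(dose)))), small offset to allow d=0."""
    return 1.0 / (1.0 + np.exp(-(a + b * np.log(d + 1e-9))))

(a, b), _ = curve_fit(log_logistic, doses, frac, p0=[-2.0, 1.0])

# BMD10: dose giving 10% extra risk over the modeled background p0.
p0 = log_logistic(0.0, a, b)
target = p0 + 0.10 * (1.0 - p0)
bmd10 = np.exp((np.log(target / (1.0 - target)) - a) / b)  # invert the logistic
print(f"BMD10 ≈ {bmd10:.1f} mg/kg")
```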

Traditional Toxicity Metrics

The median lethal dose (LD50) quantifies acute toxicity as the single dose of a substance, expressed in mg/kg body weight, that causes death in 50% of a test population—typically rodents—within a defined observation period, such as 14 days. This value is derived from dose-response experiments involving graded exposures to groups of animals, followed by statistical estimation via methods like probit analysis to fit the resultant sigmoid curve of mortality probability. Lower LD50 figures indicate higher potency, enabling comparative assessments across chemicals; for instance, sodium cyanide exhibits an oral LD50 of approximately 6.4 mg/kg in rats, reflecting substantial acute toxicity. The median lethal concentration (LC50) parallels the LD50 for inhalation or aquatic exposures, representing the airborne or aqueous concentration lethal to 50% of subjects over a standard duration, often 4–96 hours depending on the endpoint. LC50 values facilitate classification of gases and vapors; hydrogen sulfide, for example, has an LC50 of 444 ppm in rats after 4 hours. Both metrics classify hazards under frameworks like the Globally Harmonized System (GHS), stratifying acute oral toxicity into the categories tabulated below, with Category 1 denoting the highest risk (LD50 ≤ 5 mg/kg) and Category 5 the lowest (2000 < LD50 ≤ 5000 mg/kg); a probit-fitting sketch follows the table.
GHS Acute Oral Toxicity Category | LD50 (mg/kg body weight)
1 (highest toxicity) | ≤ 5
2 | > 5 – ≤ 50
3 | > 50 – ≤ 300
4 | > 300 – ≤ 2000
5 (lowest toxicity) | > 2000 – ≤ 5000
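As a minimal illustration of the probit workflow and the table above, the Python sketch below fits a probit curve to hypothetical mortality data, extracts the LD50, and maps it onto the GHS oral categories; the dataset is fabricated, and real studies would also report confidence limits.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

# Hypothetical acute oral bioassay: dose (mg/kg) vs. fraction of animals killed.
doses = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
mortality = np.array([0.10, 0.30, 0.50, 0.80, 0.95])

def probit(dose, intercept, slope):
    """Mortality probability = Phi(intercept + slope * log10(dose))."""
    return norm.cdf(intercept + slope * np.log10(dose))

(intercept, slope), _ = curve_fit(probit, doses, mortality, p0=[-3.0, 2.0])
ld50 = 10 ** (-intercept / slope)  # probit argument = 0 at 50% mortality

def ghs_oral_category(ld50_mg_kg):
    """Map an oral LD50 onto the GHS acute oral toxicity categories above."""
    for upper, label in [(5, "Category 1"), (50, "Category 2"), (300, "Category 3"),
                         (2000, "Category 4"), (5000, "Category 5")]:
        if ld50_mg_kg <= upper:
            return label
    return "Not classified"

print(f"LD50 ≈ {ld50:.1f} mg/kg -> {ghs_oral_category(ld50)}")
```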
For repeated or prolonged exposures, the no observed adverse effect level (NOAEL) marks the highest dose in a study yielding no biologically or statistically significant adverse changes relative to controls, ascertained from endpoints like organ histopathology, clinical chemistry, or behavioral alterations in subchronic (e.g., 90-day) or chronic rodent bioassays. The corresponding lowest observed adverse effect level (LOAEL) identifies the minimal dose eliciting such effects. NOAELs underpin regulatory safe exposure limits, such as reference doses (RfDs), via division by uncertainty factors (typically 10–1000) to extrapolate to humans, accounting for pharmacokinetic differences and sensitive subpopulations; for instance, an NOAEL of 10 mg/kg/day might yield an RfD of 0.1 mg/kg/day after a 100-fold adjustment. These thresholds emphasize observable causality in controlled settings but require validation against human data where available, as animal-derived values incorporate inherent extrapolative uncertainties.
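The NOAEL-to-RfD arithmetic described above is a simple division by stacked uncertainty factors. A minimal sketch, assuming the standard 10-fold interspecies and intraspecies defaults and reusing the 10 mg/kg/day example from the text:

```python
def reference_dose(noael_mg_kg_day, interspecies_uf=10.0, intraspecies_uf=10.0,
                   additional_uf=1.0):
    """RfD = NOAEL / (product of uncertainty factors); factor values here are
    illustrative defaults, not a regulatory determination."""
    return noael_mg_kg_day / (interspecies_uf * intraspecies_uf * additional_uf)

print(reference_dose(10.0))  # 0.1 mg/kg/day after the 100-fold adjustment
```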

Advanced Analytical Techniques

Advanced analytical techniques in toxicology encompass high-resolution instrumental methods, omics-based approaches, and computational models that enable precise identification, quantification, and mechanistic elucidation of toxic effects, surpassing traditional bioassays in sensitivity and throughput. These methods facilitate the detection of low-level exposures and complex mixtures, integrating molecular profiling with systems biology to predict adverse outcomes from first-principles perturbations in biological pathways. For instance, liquid chromatography-mass spectrometry (LC-MS) and gas chromatography-mass spectrometry (GC-MS) are routinely employed for targeted and untargeted screening of xenobiotics in biological matrices, achieving detection limits in the parts-per-billion range for compounds like pesticides and pharmaceuticals. Omics technologies, including toxicogenomics and metabolomics, provide comprehensive snapshots of gene expression changes, protein alterations, and metabolite shifts induced by toxicants, revealing causal mechanisms of toxicity at the systems level. Toxicogenomics applies transcriptomics to identify signatures for specific toxicities, such as liver injury from oxidative stress pathways, with studies demonstrating its utility in early detection before overt histopathological changes. Metabolomics, often via nuclear magnetic resonance (NMR) or MS platforms, profiles endogenous metabolites to infer disruptions in energy metabolism or lipid homeostasis, as seen in rodent models exposed to hepatotoxins, where altered levels of acylcarnitines and bile acids correlate with dose-dependent injury. These approaches have been validated in peer-reviewed cohorts, showing superior predictive power over single-endpoint assays for chronic exposures. New approach methodologies (NAMs), including high-throughput in vitro assays and in silico quantitative structure-activity relationship (QSAR) models, integrate with empirical data to forecast toxicity without extensive animal testing. For example, EPA-endorsed NAM batteries combine cellular assays for bioactivity and read-across predictions to derive points of departure for risk assessment, reducing uncertainties in extrapolating from high-dose data to human-relevant low doses. Computational toxicokinetic models simulate absorption, distribution, metabolism, and excretion, as in physiologically based pharmacokinetic (PBPK) frameworks that predict bioaccumulation of persistent pollutants like PCBs in human tissues based on partition coefficients and clearance rates. Despite their promise, NAMs require rigorous validation against empirical outcomes to address inter-species variability, with ongoing efforts by regulatory bodies like the OECD and EPA to standardize protocols as of 2023.
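Full PBPK models chain many compartments, but the core kinetics can be sketched with a single well-mixed compartment. The Python example below (all parameter values hypothetical) shows how a long elimination half-life drives bioaccumulation toward an intake-over-clearance steady state, the qualitative behavior noted above for persistent pollutants.

```python
import numpy as np

def body_burden(intake_ug_per_day, half_life_days, days):
    """Body burden over time for dA/dt = intake - k*A with A(0) = 0
    (one-compartment model with first-order elimination)."""
    k = np.log(2) / half_life_days        # elimination rate constant (1/day)
    t = np.arange(days + 1)
    return (intake_ug_per_day / k) * (1.0 - np.exp(-k * t))

# Hypothetical persistent pollutant: 1 µg/day intake, 5-year half-life.
burden = body_burden(1.0, half_life_days=5 * 365, days=20 * 365)
steady_state = 1.0 * (5 * 365) / np.log(2)
print(f"After 20 years: {burden[-1]:.0f} µg (steady state ≈ {steady_state:.0f} µg)")
```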

Classification of Toxic Effects

Acute and Chronic Toxicity

Acute toxicity describes adverse health effects arising from a single high-dose exposure or multiple doses administered over a short period, typically up to 24 hours, with symptoms manifesting immediately or within a brief interval thereafter; these effects are often reversible upon cessation of exposure. In toxicological assessments, acute toxicity is quantified through metrics like the median lethal dose (LD50), which measures the dose required to kill 50% of a test population within a specified timeframe, often via oral, dermal, or inhalation routes in animal models. Examples include cyanide, which induces rapid cellular asphyxiation and death from even brief exposures, or high-dose solvents causing immediate neurological impairment. Chronic toxicity, by contrast, involves adverse effects from repeated low-level exposures over extended periods—often months to years—with onset delayed and outcomes typically irreversible, such as organ damage or carcinogenesis. These effects stem from cumulative or persistent physiological disruption, as seen with heavy metals like lead, where prolonged low-dose exposure leads to neurological deficits, anemia, and renal failure in humans. Chronic studies in rodents, mandated under frameworks like the Toxic Substances Control Act, expose animals to daily doses for up to two years to detect sublethal endpoints including organ damage and tumor formation. The distinction hinges on duration, dose intensity, and temporal latency of effects: acute scenarios prioritize immediate survival thresholds, while chronic ones reveal thresholds for long-term resilience, with chronic risks often harder to attribute causally due to confounding variables like age or co-exposures. Regulatory testing reflects this, with acute protocols (e.g., OECD Test Guideline 401) spanning days versus chronic ones extending over lifetimes, though ethical shifts favor alternatives for both to minimize animal use.
Aspect | Acute Toxicity | Chronic Toxicity
Exposure pattern | Single or short-term (e.g., <24 hours) high dose | Repeated low doses over months/years
Effect onset | Immediate or rapid | Delayed (weeks to years)
Reversibility | Often reversible | Generally irreversible
Key endpoints | Mortality, acute organ failure (e.g., LD50) | Cancer, reproductive harm, cumulative damage
Testing duration | Days | Up to lifetime (e.g., 2 years in rodents)

Human Health Classifications

Toxic substances are classified for human health effects through standardized systems that evaluate potential adverse outcomes based on empirical toxicity data, including lethal dose metrics, mechanistic studies, and epidemiological evidence. The Globally Harmonized System of Classification and Labelling of Chemicals (GHS), developed by the United Nations, provides an international framework for identifying health hazards, categorizing them by severity to inform risk management and labeling. GHS health hazard classes encompass acute toxicity, which measures immediate life-threatening effects via oral, dermal, or inhalation routes using LD50/LC50 values from animal tests; skin corrosion/irritation; serious eye damage/irritation; respiratory or skin sensitization; germ cell mutagenicity; carcinogenicity; reproductive toxicity; specific target organ toxicity from single or repeated exposure; and aspiration hazard. These classifications rely on dose-response data, prioritizing causal evidence over speculative risks, though animal-to-human extrapolation introduces uncertainties addressed through safety factors in regulatory applications. Acute toxicity under GHS is divided into five categories, as tabulated below, with Category 1 representing the highest hazard (e.g., oral LD50 ≤ 5 mg/kg) and Category 5 the lowest (LD50 > 2000 mg/kg but ≤ 5000 mg/kg or less severe symptoms).
GHS Acute Toxicity Category | Oral LD50 (mg/kg) | Dermal LD50 (mg/kg) | Inhalation LC50 (vapors, mg/L/4h) | Typical Effects
Category 1 | ≤ 5 | ≤ 50 | ≤ 0.5 | Fatal if swallowed/inhaled/absorbed
Category 2 | > 5 – ≤ 50 | > 50 – ≤ 200 | > 0.5 – ≤ 2.0 | Fatal if swallowed/inhaled/absorbed
Category 3 | > 50 – ≤ 300 | > 200 – ≤ 1000 | > 2.0 – ≤ 10.0 | Toxic if swallowed/inhaled/absorbed
Category 4 | > 300 – ≤ 2000 | > 1000 – ≤ 2000 | > 10.0 – ≤ 20.0 | Harmful if swallowed/inhaled/absorbed
Category 5 | > 2000 – ≤ 5000 | > 2000 – ≤ 5000 | > 20.0 (data limited) | May be harmful if swallowed/inhaled
For carcinogenicity, the International Agency for Research on Cancer (IARC), part of the World Health Organization, evaluates agents based on sufficient human evidence, mechanistic data, or animal studies, assigning groups such as Group 1 (carcinogenic to humans, e.g., benzene with epidemiological links to leukemia) or Group 2A (probably carcinogenic, requiring strong animal evidence and limited human data). These differ from regulatory assessments, as IARC focuses on hazard identification without quantitative risk, potentially overemphasizing animal data despite species differences, while agencies like the U.S. Environmental Protection Agency (EPA) integrate exposure for risk characterization. The EPA employs toxicity categories I through IV for acute hazards, with Category I (e.g., oral LD50 ≤ 50 mg/kg) indicating high danger requiring skull-and-crossbones labeling, derived from standardized animal studies to predict lethality. For pesticides specifically, the WHO classifies active ingredients by acute oral/dermal toxicity into classes Ia (extremely hazardous, LD50 ≤ 5 mg/kg), Ib (highly hazardous, > 5–50 mg/kg), II (moderately hazardous, > 50–500 mg/kg), and III (slightly hazardous, > 500–5000 mg/kg), guiding global handling and restricting highly toxic formulations in developing regions based on observed poisoning incidents. Classifications across systems emphasize verifiable causal mechanisms, such as acetylcholinesterase inhibition for organophosphates, but debates persist over endpoints where low-dose effects lack robust confirmation, underscoring the need for first-principles scrutiny of extrapolated risks.

Environmental and Ecological Classifications

Environmental toxicity classifications evaluate the potential adverse effects of substances on ecosystems, primarily through standardized hazard criteria that consider acute and chronic impacts on aquatic and, to a lesser extent, terrestrial organisms. These systems, such as the Globally Harmonized System of Classification and Labelling of Chemicals (GHS), prioritize empirical toxicity data from laboratory tests on representative species like fish, crustaceans, algae, and soil invertebrates to derive hazard categories. The GHS focuses predominantly on aquatic environments due to their vulnerability and the prevalence of water-soluble contaminants, with categories determined by median lethal or effect concentrations (LC50/EC50) for acute effects and no-observed-effect concentrations (NOECs) or similar for chronic effects. In the GHS, acute aquatic toxicity is divided into three categories: Category 1 applies to substances with LC50 or EC50 values ≤ 1 mg/L (highly toxic), Category 2 for ≤ 10 mg/L, and Category 3 for ≤ 100 mg/L, based on short-term tests (e.g., 96-hour fish LC50, 48-hour crustacean EC50, or 72-hour algal growth inhibition). Chronic aquatic toxicity includes four categories, emphasizing long-term sublethal effects; for instance, Category 1 requires a NOEC or EC10 ≤ 0.1 mg/L combined with acute Category 1 or 2 classification, while Category 4 covers substances with NOEC > 10 mg/L but potential for persistence and bioaccumulation. These criteria are harmonized in the EU's Classification, Labelling and Packaging (CLP) Regulation, which mandates labeling for substances classified as "Aquatic Acute 1" (environment pictogram with "Very toxic to aquatic life") or "Aquatic Chronic 1" ("Toxic to aquatic life with long-lasting effects"). Beyond direct toxicity, ecological classifications address persistence, bioaccumulation, and long-term ecosystem disruption through criteria for persistent, bioaccumulative, and toxic (PBT) substances under the EU REACH Regulation (Annex XIII). A substance qualifies as PBT if it meets all three: persistent (degradation half-life > 60 days in marine, freshwater, or sediment compartments), bioaccumulative (bioconcentration factor ≥ 2,000 or log Kow > 4 with supporting evidence), and toxic (chronic NOEC < 0.01 mg/L for aquatic organisms or equivalent mammalian criteria). Very persistent and very bioaccumulative (vPvB) substances have stricter thresholds, such as half-lives > 60 days in at least two environmental compartments and BCF ≥ 5,000, triggering authorization requirements due to their irreversible accumulation in food chains. Terrestrial ecotoxicity classifications remain less standardized globally, with GHS discussions ongoing but not yet formalized; assessments often rely on OECD guidelines for endpoints like earthworm reproduction NOECs or plant growth inhibition. In the U.S., the EPA integrates ecological toxicity data into risk assessments for pesticides, using acute LD50/LC50 values for birds, mammals, and bees to categorize hazards (e.g., highly toxic if avian LC50 < 10 mg/kg), alongside chronic reproductive studies to evaluate population-level effects. These frameworks emphasize causal links between exposure and outcomes, such as biomagnification in predators, but gaps persist in addressing complex mixtures or climate-influenced variability.
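The Annex XIII thresholds quoted above reduce to simple comparisons, as in this Python screening sketch; real PBT determinations use weight-of-evidence across compartments rather than single cutoffs, and the half-life handling here is deliberately simplified.

```python
def pbt_screen(half_life_days, bcf, chronic_noec_mg_l):
    """Screen against the simplified REACH Annex XIII criteria cited above."""
    persistent = half_life_days > 60        # marine/freshwater/sediment proxy
    bioaccumulative = bcf >= 2000           # bioconcentration factor threshold
    toxic = chronic_noec_mg_l < 0.01        # chronic aquatic NOEC threshold
    return {
        "P": persistent, "B": bioaccumulative, "T": toxic,
        "PBT": persistent and bioaccumulative and toxic,
        "vPvB": half_life_days > 60 and bcf >= 5000,  # simplified vPvB proxy
    }

print(pbt_screen(half_life_days=120, bcf=6000, chronic_noec_mg_l=0.002))
# -> {'P': True, 'B': True, 'T': True, 'PBT': True, 'vPvB': True}
```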

Influencing Factors

Exposure Routes and Duration

The primary routes of exposure to toxic substances in humans and other organisms are inhalation, ingestion, and dermal absorption, with parenteral routes such as injection being less common outside medical or accidental contexts. Inhalation occurs through the respiratory tract when gases, vapors, aerosols, or particulates are breathed in, enabling rapid systemic absorption due to the large surface area and thin alveolar membrane of the lungs, often leading to immediate effects on respiratory and cardiovascular systems. Ingestion involves oral uptake via contaminated food, water, soil, or dust, where absorption primarily happens in the gastrointestinal tract, influenced by factors like pH, gut motility, and substance solubility, potentially resulting in delayed systemic distribution after hepatic first-pass metabolism. Dermal exposure entails direct contact with skin or mucous membranes, where penetration depends on the substance's lipophilicity, molecular size, skin integrity, and exposure conditions such as occlusion or hydration, typically yielding slower and less complete absorption compared to other routes unless the agent is highly volatile or corrosive. The choice of route significantly modulates toxicity, as it determines the fraction of the administered dose that reaches target tissues, with inhalation often producing higher bioavailability for volatile compounds and dermal routes posing greater risk for lipophilic organics that evade skin barriers. Exposure duration further shapes toxic outcomes by altering the cumulative dose and biological response dynamics, generally categorized as acute, subchronic, or chronic based on standardized toxicological guidelines. Acute exposure refers to a single event or repeated contact lasting up to 14 days, often at high concentrations, which can trigger immediate, reversible effects like irritation or neurotoxicity by overwhelming detoxification pathways. Subchronic exposure spans several weeks to months of intermittent or continuous dosing, bridging acute and long-term patterns and revealing intermediate effects such as organ hypertrophy or early carcinogenesis precursors not evident in shorter assays. Chronic exposure involves prolonged low-level contact over months to years, promoting cumulative damage like fibrosis, neuropathy, or reproductive toxicity via mechanisms including bioaccumulation and epigenetic changes, where even subthreshold doses per event sum to exceed physiological repair capacities. The interplay between route and duration is critical, as longer exposures via inhalation may amplify pulmonary retention and translocation to extrapulmonary sites, while chronic dermal contact can lead to sensitization or percutaneous accumulation not seen acutely. Per Haber's rule, for certain time-dependent toxins, toxicity maintains a near-constant product of concentration and duration (C × t = k), implying that doubling exposure time halves the requisite concentration for equivalent lethality in gases like phosgene, though the rule holds imperfectly for non-gaseous or repairable endpoints. Route-specific durations also influence endpoint selection in risk assessment; for instance, acute oral studies prioritize LD50 metrics, whereas chronic inhalation tests emphasize no-observed-adverse-effect levels (NOAELs) for carcinogenicity.
Variability in absorption kinetics—faster for inhalation than dermal—means duration effects are route-dependent, with chronic low-dose ingestion potentially yielding higher risks from microbiome-mediated metabolism than equivalent acute boluses.
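Haber's rule is a one-equation model, sketched below in Python; the reference exposure is hypothetical, and as noted above the constant-product assumption holds only approximately and only for certain gases and endpoints.

```python
def haber_equivalent_concentration(c_ref, t_ref, t_new):
    """Solve C_new from Haber's rule: C_ref * t_ref = C_new * t_new."""
    return (c_ref * t_ref) / t_new

# If 10 ppm over 60 minutes defines the toxic load, doubling the duration
# halves the concentration needed for an equivalent effect.
print(haber_equivalent_concentration(c_ref=10.0, t_ref=60.0, t_new=120.0))  # 5.0
```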

Biological and Genetic Variability

Biological variability in toxicity encompasses physiological differences such as age, sex, and health status that modulate an organism's capacity to absorb, metabolize, and eliminate toxicants. Neonates and infants often exhibit heightened susceptibility due to immature hepatic enzyme systems and underdeveloped renal clearance mechanisms; for example, premature infants exposed to chloramphenicol in the mid-20th century suffered from gray baby syndrome, characterized by circulatory collapse and high mortality rates from inadequate glucuronidation. In adults, advanced age correlates with diminished glomerular filtration rates—declining by approximately 50% between ages 20 and 80—and reduced phase I metabolic activity, prolonging systemic exposure to lipophilic toxins. Sex-based differences arise from hormonal influences on enzyme expression; females show roughly 20–30% higher activity and faster clearance of some cytochrome P450 substrates, yet face greater risk during pregnancy due to altered pharmacokinetics. Genetic variability introduces profound interindividual and population-level differences in toxicant susceptibility through polymorphisms in xenobiotic-metabolizing enzymes (XMEs). Cytochrome P450 (CYP) enzymes, mediating phase I oxidation of over 90% of xenobiotics, display single nucleotide polymorphisms (SNPs) that classify individuals as poor, intermediate, extensive, or ultrarapid metabolizers; CYP2D6 poor metabolizers, comprising 5–10% of Caucasians and <1% of Ethiopians, exhibit 10- to 100-fold reduced activity, altering the effects of prodrugs like codeine, which accumulates unmetabolized in poor metabolizers while ultrarapid metabolizers form excess active metabolite. Similarly, CYP2C19*2 allele carriers, prevalent in 15–20% of Asians versus 2–5% of Caucasians, impair bioactivation of clopidogrel, indirectly heightening thrombotic risks akin to toxic endpoints, while ultrarapid variants increase reactive intermediate formation and hepatotoxicity. Phase II enzymes like glutathione S-transferases (GSTs) further contribute; GSTM1 null genotypes (gene deletion), found in 40–60% of individuals depending on ethnicity, reduce conjugation of electrophilic toxins such as aflatoxin B1, correlating with 3–7-fold elevated hepatocellular carcinoma risk in exposed populations. These factors interact causally: genetic polymorphisms dictate baseline metabolic capacity, while biological states like disease (e.g., cirrhosis reducing CYP expression by up to 80%) or nutritional deficiencies (e.g., selenium depletion impairing GST activity) amplify variability. Ethnic disparities in allele frequencies underscore population-specific risks; for instance, the high prevalence of NAT2 slow acetylators in Europeans (about 50%) and Egyptians (about 80%) alters isoniazid-induced hepatotoxicity profiles. Toxicogenomic studies confirm that such variants explain 20–80% of pharmacokinetic variance for many chemicals, informing precision risk assessment over uniform models.

Chemical Interactions and Mixtures

Chemical interactions occur when the toxicity of one substance modifies the effects of another, altering the overall toxicological outcome beyond what would be predicted from individual exposures alone. These interactions are classified into categories such as additivity, where the combined effect equals the sum of individual toxicities; synergism, where the mixture produces greater toxicity than the sum; antagonism, where the effect is less than the sum; and potentiation, a form of synergism where one non-toxic or low-toxicity substance enhances the effect of another toxicant. In environmental and occupational settings, mixtures often predominate, yet toxicological assessments frequently default to dose or response additivity, which may underestimate risks if non-additive interactions prevail. Synergistic interactions, though infrequent, can amplify risks significantly; meta-analyses of mixture studies indicate that synergistic deviations exceeding twofold occur in approximately 5% of tested mixtures, with antagonism similarly rare, while additivity dominates in most cases. The "funnel hypothesis" posits that synergy or antagonism becomes more likely as mixture complexity increases, with simpler binary mixtures tending toward additivity and larger mixtures (e.g., >10 components) showing greater deviation potential due to diverse mechanisms like enzyme inhibition or induction. Mechanisms underlying such interactions include metabolic potentiation, where one chemical induces enzymes that activate another's toxic metabolites, or pharmacodynamic enhancement via shared cellular targets. Specific examples illustrate these effects: ethanol and carbon tetrachloride exhibit liver toxicity synergism, with combined exposure causing enhanced hepatocellular damage compared to either alone, due to ethanol's induction of enzymes that bioactivate carbon tetrachloride. In pesticide mixtures, combinations of organophosphates such as malathion and EPN demonstrate synergistic neurotoxicity, inhibiting acetylcholinesterase more potently than predicted, as observed in rodent studies where mixture exposures exceeded dose-additive expectations by factors of 2–5. Environmental mixtures, such as urban air pollutants (e.g., ozone and particulate matter), often show additive respiratory effects but occasional synergism in inflammatory pathways, complicating risk assessment for chronic low-dose exposures. Assessing mixture toxicity remains challenging, as over 80% of studies focus on small (2–5 component) mixtures rather than the realistic complex exposures encountered in ecosystems or human diets. Regulatory frameworks like those from the U.S. EPA emphasize component-based evaluations, potentially overlooking synergies, though integrated approaches using concentration addition for baseline predictions followed by interaction screening are recommended for high-stakes scenarios like pesticide residues or industrial effluents. Empirical data underscore that while additivity suffices for many mixtures, identifying synergies requires targeted in vitro or in vivo testing, as low-dose combinations can yield disproportionate effects not captured by linear models.
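The concentration-addition baseline mentioned above is commonly computed as a sum of "toxic units." Below is a minimal Python sketch with hypothetical component names, doses, and EC50s; observed effects well above or below the additive prediction would flag synergism or antagonism, respectively.

```python
def toxic_units(doses, ec50s):
    """Concentration-addition screen: sum of dose_i / EC50_i across components
    (doses and EC50s expressed in the same units)."""
    return sum(doses[c] / ec50s[c] for c in doses)

doses = {"chem_a": 0.5, "chem_b": 2.0, "chem_c": 10.0}    # hypothetical exposures
ec50s = {"chem_a": 2.0, "chem_b": 10.0, "chem_c": 100.0}  # hypothetical potencies

tu = toxic_units(doses, ec50s)  # 0.25 + 0.20 + 0.10 = 0.55
print(f"Toxic units = {tu:.2f}; an additive mixture effect is expected at TU >= 1")
```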

Regulatory and Societal Dimensions

Major Regulatory Frameworks

The Toxic Substances Control Act (TSCA), enacted by the U.S. Congress in 1976 and administered by the Environmental Protection Agency (EPA), authorizes the regulation of chemical substances that may present an unreasonable risk of injury to human health or the environment. TSCA requires manufacturers to report data on chemical production, processing, and use, enables the EPA to require toxicity testing, and permits restrictions or bans on high-risk substances, including polychlorinated biphenyls (PCBs), phased out by 1979. Amendments via the Frank R. Lautenberg Chemical Safety for the 21st Century Act in 2016 expanded EPA authority to prioritize and evaluate over 80,000 existing chemicals in commerce, mandating risk assessments based on empirical hazard, exposure, and use data without assuming safety thresholds like de minimis risks. As of 2025, TSCA has driven evaluations of substances like per- and polyfluoroalkyl substances (PFAS), with the EPA finalizing bans or controls informed by dose-response toxicity studies. In the European Union, the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation, adopted in 2006 and managed by the European Chemicals Agency (ECHA), requires companies to register substances produced or imported in volumes exceeding 1 tonne per year, providing toxicity data from in vivo and in vitro assays to assess human and environmental hazards. REACH imposes the "no data, no market" principle, shifting the burden of proof of safety to industry, and authorizes restrictions on carcinogens, mutagens, or reproductive toxicants (CMRs) based on weight-of-evidence evaluations, with over 2,000 substances registered by 2023 including detailed dossiers on acute and chronic endpoints like LD50 values and NOAELs. By 2025, REACH updates emphasize safer alternatives and extended producer responsibility, though critiques note implementation delays due to data gaps in mixture toxicity interactions. Internationally, the Globally Harmonized System of Classification and Labelling of Chemicals (GHS), developed by the United Nations Economic Commission for Europe (UNECE) and revised periodically since its 2003 adoption, standardizes hazard classification for physical, health (including acute toxicity categories 1–5 based on oral/dermal/inhalation LD50/LC50 values), and environmental toxicity, facilitating consistent global communication via pictograms and safety data sheets. Over 70 countries, including the United States and EU member states, have integrated GHS into national systems by 2025, reducing trade barriers while enabling cross-jurisdictional toxicity comparisons, though it focuses on communication rather than mandatory mitigation. Complementary treaties like the Stockholm Convention on Persistent Organic Pollutants (POPs), effective since 2004 with 186 parties, target bioaccumulative toxins such as DDT and PCBs through elimination or reduction targets, informed by long-term exposure and ecological damage data from monitoring programs. No unified global framework exists for all toxic substances, leading to jurisdictional variances; for instance, TSCA emphasizes post-market surveillance while REACH prioritizes pre-market registration, potentially underregulating mixtures or nanomaterials lacking standardized toxicity protocols. Occupational frameworks, such as the US Occupational Safety and Health Administration (OSHA) standards under the 1970 OSH Act, set permissible exposure limits (PELs) for airborne toxins like benzene (1 ppm 8-hour TWA since 1987), derived from threshold limit value studies balancing carcinogenicity risks.
These regimes collectively rely on empirical metrics—e.g., EPA's TSCA risk evaluations integrate benchmark dose modeling for non-cancer effects—but face challenges from evolving data on endocrine disruptors and synergistic effects.

Controversies in Risk Assessment

One major controversy in toxicology risk assessment centers on the choice of dose-response models, particularly the linear no-threshold (LNT) assumption, which posits that carcinogenic risks increase proportionally with any exposure level, even at doses far below those tested experimentally. This model, originating from mid-20th-century radiation studies and extended to chemical carcinogens, underpins regulations like those from the U.S. Environmental Protection Agency (EPA), but critics argue it overestimates low-dose risks by ignoring biological repair mechanisms and empirical data showing no effects or protective responses at sub-toxic levels. For instance, analyses of over 1,000 toxicological studies indicate that LNT fails multiple empirical tests, including consistency with adaptive cellular responses observed in vitro and in vivo. An alternative framework, hormesis, proposes biphasic dose responses where low doses stimulate beneficial effects, such as enhanced cellular repair or stress resistance, before toxicity emerges at higher thresholds—a pattern documented in approximately 30–40% of toxicological endpoints across chemicals, radiation, and other stressors. Proponents, citing reviews of thousands of peer-reviewed experiments, contend that hormesis better aligns with first principles of biology, like evolutionary adaptations to mild stressors, and challenges regulatory defaults that assume harm without evidence; however, adoption remains limited due to entrenched LNT precedents in agencies like the International Agency for Research on Cancer (IARC). Skeptics within toxicology maintain that hormetic effects may not consistently generalize across populations and endpoints, though meta-analyses showing hormesis's prevalence over strict thresholds in non-cancer endpoints challenge this view. Interspecies extrapolation introduces further uncertainty, as toxicity data primarily derive from animal studies, requiring scaling factors (e.g., allometric adjustments by body weight or surface area) to estimate human risks, yet these often yield inaccuracies due to metabolic and physiological differences. For example, linear body-weight-based extrapolations overestimate human sensitivity for many compounds, while high-dose animal tests—standard in protocols like OECD guidelines—fail to mimic real-world low-dose, chronic human exposures, potentially inflating safety factors by orders of magnitude. Debates persist over default uncertainty factors (typically 10-fold each for interspecies and intraspecies variability), with evidence suggesting chemical-specific physiologically based pharmacokinetic (PBPK) models reduce but do not eliminate errors, as validated in cases where rodent-human discrepancies exceeded 100-fold. The precautionary principle, formalized in the 1992 Rio Declaration and embedded in frameworks like the EU's REACH regulation (effective 2007), exacerbates these modeling disputes by prioritizing hazard avoidance over quantitative risk probabilities when data are incomplete, often resulting in de facto bans on substances despite low-probability risks. This contrasts with evidence-based approaches in the U.S., where cost-benefit analyses under laws like the Toxic Substances Control Act weigh exposure likelihood and severity; critics of precaution argue it biases toward over-regulation, ignoring benefits like pesticide yield increases (e.g., 20–40% in some crops) and stifling innovation, while proponents cite cases like DDT's phasedown for ecological gains. Empirical comparisons reveal that precaution's implementation correlates with higher regulatory costs without proportional health improvements, as seen in divergent EU-U.S. approvals for endocrine disruptors. These controversies highlight tensions between conservative defaults, which guard against underestimation but risk economic overreach, and data-driven refinements like Bayesian probabilistic assessments, increasingly advocated for their flexibility in handling uncertainties. Regulatory bodies face pressure from stakeholders, with industry favoring threshold models to permit safe uses and advocacy groups pushing LNT for maximal protection, underscoring the need for meta-assessments of source biases in peer-reviewed literature. Ongoing shifts toward mechanistic and human-relevant data aim to resolve these debates, but as of 2023, LNT dominance persists in global standards, prompting calls for hormesis-informed revisions to avoid misallocating resources on negligible risks.
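The competing dose-response shapes in this debate can be made concrete with a toy comparison. In the Python sketch below, both the linear slope and the hormetic stimulation term are arbitrary parameters chosen only to reproduce the qualitative J-shape, not fitted to any dataset.

```python
import numpy as np

def lnt_response(dose, slope=0.01):
    """Linear no-threshold: harm proportional to dose at all levels."""
    return slope * dose

def hormetic_response(dose, stim=0.05, stim_scale=5.0, slope=0.01):
    """Biphasic (J-shaped) curve: a saturating low-dose benefit term
    superimposed on the same linear harm term."""
    benefit = -stim * (dose / stim_scale) * np.exp(1.0 - dose / stim_scale)
    return benefit + slope * dose

for d in [0.5, 2.0, 5.0, 20.0, 100.0]:
    print(f"dose {d:6.1f}: LNT {lnt_response(d):+.3f}  hormetic {hormetic_response(d):+.3f}")
# At low doses the hormetic curve dips below zero (net benefit); at high
# doses both models predict increasing harm.
```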

Economic Costs and Benefits of Regulation

Regulations aimed at controlling toxic substances, such as the U.S. Clean Air Act Amendments and the European Union's REACH framework, generate direct compliance costs for industries, including chemical testing, risk assessments, substitution of hazardous materials, and administrative reporting. For instance, the EU's REACH regulation, implemented in 2007, has imposed ongoing annual compliance costs estimated at approximately €2.5 billion on businesses, primarily through registration and authorization processes for over 23,000 substances. Similarly, updates to the U.S. Toxic Substances Control Act (TSCA) in 2016 have increased burdens for new chemical reviews, with economic analyses projecting incremental costs in the tens of millions annually for procedural changes alone, though broader industry-wide impacts remain debated amid reports of sector growth. These costs often manifest as higher production expenses passed to consumers or as incentives to relocate production to less-regulated jurisdictions, potentially reducing domestic output in chemical-intensive sectors. Benefits of such regulations are quantified primarily through avoided health and environmental damages, using metrics like the value of a statistical life (VSL) and reduced morbidity costs. The U.S. EPA's prospective analysis of the 1990 Clean Air Act Amendments, which include provisions for hazardous air pollutants like mercury and benzene, estimates total benefits from 1990 to 2020 at over $2 trillion, driven by premature mortality avoidance and respiratory illness reductions, compared to compliance costs of $65 billion—a net benefit ratio exceeding 30:1. For REACH, a 2021 European Chemicals Agency evaluation attributes €2.1 billion in annual health benefits to reduced chemical exposures, including lower cancer and reproductive disorder incidences, surpassing direct costs by a factor of four when accounting for worker and consumer protections. Broader estimates suggest EU chemical regulations yield €11–47 billion yearly in societal gains from minimized healthcare expenditures and ecosystem services. Critiques of these cost-benefit analyses highlight methodological biases that may overstate net positives, particularly from regulatory agencies incentivized to justify expansive rules. Benefit estimates often incorporate co-benefits, such as particulate matter reductions from toxics controls, inflating totals without isolating toxic-specific effects, while future health gains are discounted at low rates (e.g., 3% vs. 7%), amplifying long-term values. Independent reviews note unquantified costs, including stifled innovation from pre-market testing burdens under TSCA or REACH, and potential economic distortions where stringent rules favor large firms over small ones, though empirical data on job losses remain mixed, with no clear causal evidence of net employment decline. Agency-produced analyses, like those from the EPA, warrant scrutiny for optimistic VSL assumptions ($7–11 million per life) derived from willingness-to-pay surveys potentially skewed by contextual framing, underscoring the need for robust sensitivity testing to ensure causal claims of net benefits hold under varied assumptions.
Regulation | Period/Scope | Estimated Costs | Estimated Benefits | Source Notes
U.S. Clean Air Act (toxics provisions) | 1990–2020 | $65 billion | >$2 trillion (health, mortality avoidance) | EPA prospective study; includes co-benefits from criteria pollutants.
EU REACH | Annual (post-2007) | €2.5 billion (business compliance) | €2.1 billion health + broader societal gains | ECHA evaluation; focuses on authorization health risks avoided.
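The discounting critique above is easy to see numerically. The sketch below uses assumed round numbers (a $9 million VSL from the $7–11 million range quoted above, and a hypothetical 40-year latency for an avoided cancer death) to show how the choice of discount rate alone drives the valuation of long-deferred health benefits.

```python
# Toy present-value calculation with assumed inputs (not from the EPA or ECHA
# analyses cited above): the same avoided death is worth several times more
# under a 3% discount rate than under 7%.
def present_value(benefit, years, rate):
    """Discount a benefit realized `years` from now at annual `rate`."""
    return benefit / (1.0 + rate) ** years

vsl = 9e6    # hypothetical value of a statistical life (mid-range of $7-11M)
years = 40   # assumed latency for, e.g., an avoided chemically induced cancer

for rate in (0.03, 0.07):
    pv = present_value(vsl, years, rate)
    print(f"rate={rate:.0%}: PV of one avoided death in {years} yrs = ${pv:,.0f}")
```

With these inputs the 3% rate yields roughly $2.8 million per avoided death versus roughly $0.6 million at 7%, more than a fourfold swing from a single modeling assumption, which is why sensitivity testing over discount rates is central to evaluating claimed net benefits.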

Innovations and Alternatives

In Vitro and Computational Models

In vitro models for toxicity assessment use isolated cells, tissues, or engineered constructs cultured outside living organisms to evaluate the adverse effects of chemicals or drugs. These approaches, including two-dimensional (2D) cell monolayers and advanced three-dimensional (3D) spheroids or organoids, enable high-throughput screening for endpoints such as cytotoxicity, genotoxicity, and organ-specific damage. For instance, liver-derived HepG2 cells or induced pluripotent stem cell (iPSC)-derived cardiomyocytes are commonly employed to mimic hepatic or cardiac toxicity, respectively, providing mechanistic insights into cellular responses such as apoptosis or reactive oxygen species production. Such models have gained traction due to ethical concerns over animal testing and regulatory pushes toward alternatives, with the global in vitro toxicology market valued at approximately USD 11.92 billion in 2024.

Advanced systems, such as organ-on-a-chip (OoC) platforms, integrate microfluidics to replicate physiological microenvironments, including fluid flow and multi-cellular interactions, enhancing physiological relevance over traditional static cultures. These have shown promise in predicting drug-induced liver injury, where retrospective analyses indicate improved concordance with human outcomes compared to simple 2D assays, though overall predictivity remains limited by factors like incomplete metabolic competence. OoC models of the kidney or brain, for example, can assess glomerular filtration or blood-brain barrier permeability, but challenges persist in scaling for routine use and in validating against in vivo data.

Despite advantages in throughput and cost, in vitro models exhibit key limitations, including failure to replicate the systemic interactions, whole-body metabolism, and chronic exposures characteristic of whole-organism responses. Isolated cells often lack the extracellular matrix, immune components, and vascularization present in vivo, leading to discrepancies; for example, in vitro assays may underestimate toxicity for compounds requiring metabolic bioactivation or overestimate it where in vivo compensatory mechanisms are absent. Animal models predict human toxicity with only 40–70% accuracy, yet in vitro systems frequently underperform in bridging this gap without integration with other methods.

Computational models, or in silico approaches, employ algorithms to predict toxicity from chemical structure, physicochemical properties, or empirical data, bypassing biological experimentation. Quantitative structure-activity relationship (QSAR) models correlate molecular descriptors, such as lipophilicity and structural features, with toxicological endpoints, enabling rapid screening of large chemical libraries. Recent QSAR applications include predicting acute oral toxicity and immunotoxicity, with models trained on public datasets like ToxCast achieving accuracies up to 80–90% for specific chemical classes, though performance degrades for novel scaffolds outside training domains. Machine learning (ML) advancements have augmented traditional QSAR by incorporating deep learning and graph neural networks to handle complex datasets, including omics integration for endpoint-specific predictions such as hepatotoxicity or carcinogenicity. For reproductive toxicity, for example, deep-learning QSAR models have demonstrated superior performance over classical methods by capturing nonlinear structure-activity patterns. Tools like read-across, which infer toxicity from analogous compounds, complement QSAR for data gaps, while uncertainty quantification addresses model limitations such as dataset imbalances or applicability domain violations. However, computational predictions rely heavily on data quality, with biases in training sets (often derived from in vitro or animal studies) potentially propagating errors, and explicit validation against human outcomes remains sparse.
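The core QSAR workflow (descriptors in, toxicity call out, with held-out evaluation) can be sketched minimally as below. The descriptors and labels here are synthetic stand-ins, not ToxCast data; real pipelines derive descriptors with a cheminformatics toolkit such as RDKit and must additionally check the applicability domain noted above.

```python
# Minimal QSAR-style sketch on synthetic data (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(2.0, 1.5, n),      # lipophilicity (logP)-like descriptor
    rng.normal(350.0, 100.0, n),  # molecular-weight-like descriptor
    rng.poisson(2, n),            # a count-type structural feature
])
# Simulated "toxic" label loosely driven by lipophilicity, mimicking the kind
# of structure-activity pattern a QSAR model is meant to recover.
y = (X[:, 0] + rng.normal(0.0, 1.2, n) > 2.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"held-out ROC AUC: {auc:.2f}")
```

The held-out split stands in for the harder scaffold-level generalization problem: a model scoring well on random splits can still fail on novel chemotypes outside its training domain.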
Integration of in vitro and computational models within new approach methodologies (NAMs) aims to enhance predictive power through hybrid workflows, such as using in vitro-derived data to refine QSAR parameters or ML-driven screening to prioritize compounds for cell-based validation. These combined strategies have accelerated the identification of hepatotoxins in drug discovery, reducing animal use while improving human relevance, though regulatory acceptance lags due to the need for standardized validation and inter-laboratory reproducibility. Ongoing challenges include bridging the gap from correlation to causality and addressing endpoint-specific variabilities, underscoring the need for causal mechanistic modeling over purely correlative approaches.

Toxicogenomics and Omics Approaches

Toxicogenomics integrates toxicology with genomic sciences to elucidate how environmental toxins and chemicals perturb gene expression, protein profiles, and metabolic pathways at a molecular level. The field emerged in the early 2000s, coinciding with advances from the Human Genome Project, completed in 2003, that enabled high-throughput analysis of toxicant-induced genomic alterations. By examining genome-wide responses, toxicogenomics identifies signatures of toxicity, such as differential gene expression patterns, that precede phenotypic changes like organ damage.

Core approaches include transcriptomics, which measures mRNA levels to detect early transcriptional responses to stressors; proteomics, assessing protein modifications and abundances; and metabolomics, profiling small-molecule metabolites to capture downstream biochemical disruptions. These methods leverage technologies such as RNA sequencing and mass spectrometry to generate datasets revealing causal pathways in toxicity, such as activation of stress-response genes or inhibition of metabolic enzymes following exposure to compounds like acetaminophen. Multi-omics combines these layers for a systems-level view, improving mechanistic understanding over single-omics analyses.

In predictive toxicology, toxicogenomics facilitates the development of biomarkers for hazard identification and risk assessment, as demonstrated in resources like the Connectivity Map and the Tox21 project, which correlate expression profiles with known toxicants. Applications extend to drug development, where expression profiling helps prioritize compounds with low toxicity potential by flagging signatures linked to adverse outcomes such as hepatotoxicity. Recent advances, including single-cell omics and AI-driven analytics, enhance resolution for inter-individual variability and support regulatory shifts toward animal-free models, reducing reliance on animal testing while maintaining empirical rigor. Challenges persist in standardizing data across platforms and in validating signatures for regulatory use, which requires anchoring them to dose-response and exposure metrics.
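At its simplest, a transcriptomic toxicity signature is a set of genes whose expression shifts significantly after exposure. The toy sketch below flags such genes by log2 fold change plus Welch's t-test on simulated intensities; the gene names are merely representative stress-response and housekeeping genes, and real toxicogenomics pipelines use dedicated tools (e.g., limma, DESeq2) with multiple-testing correction.

```python
# Toy differential-expression readout on simulated data (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
genes = ["CYP1A1", "HMOX1", "GCLC", "ACTB", "GAPDH"]      # stress + housekeeping
control = rng.normal(100.0, 10.0, size=(len(genes), 4))   # 4 replicates per gene
fold = np.array([[4.0], [3.0], [2.5], [1.0], [1.0]])      # simulated induction
treated = control * fold + rng.normal(0.0, 10.0, size=control.shape)

for gene, c_row, t_row in zip(genes, control, treated):
    log2fc = np.log2(t_row.mean() / c_row.mean())
    _, p = stats.ttest_ind(t_row, c_row, equal_var=False)  # Welch's t-test
    flag = "SIGNATURE" if abs(log2fc) > 1.0 and p < 0.05 else ""
    print(f"{gene:7s} log2FC={log2fc:+.2f}  p={p:.3g}  {flag}")
```

In this simulation the induced stress genes are flagged while the housekeeping genes are not, which is the basic pattern-matching step that signature databases like the Connectivity Map then scale across thousands of compounds.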

Future Directions in Predictive Toxicology

Advancements in artificial intelligence (AI) and machine learning (ML) are poised to enhance predictive toxicology by integrating multi-omics data and high-throughput screening results, such as those from the ToxCast database, to forecast toxicity endpoints with greater accuracy than traditional quantitative structure-activity relationship (QSAR) models. Recent reviews indicate that ML algorithms, trained on diverse datasets including transcriptomics and other omics layers, can uncover nonlinear toxicity mechanisms, potentially reducing false positives in drug candidate screening by up to 20–30% in benchmark studies. These models emphasize mechanistic insight over correlative patterns, addressing limitations of older black-box approaches through techniques like explainable AI (XAI), which provide the interpretable rationales for predictions that regulatory validation requires.

New approach methodologies (NAMs), encompassing in vitro systems, adverse outcome pathways (AOPs), and computational simulations, represent a shift toward human-relevant predictions, minimizing reliance on animal testing while incorporating kinetic and dynamic exposure factors. Integration of NAMs with AI enables real-time hazard identification for chemical mixtures, as demonstrated in frameworks combining high-throughput assays with ML-driven read-across for untested compounds, which report predictive concordance rates exceeding 80% for selected endpoints. Future efforts focus on standardizing data pipelines for NAMs, with initiatives like the FDA's exploration of these tools for faster risk assessments projecting a 50% reduction in preclinical timelines by 2030.

Challenges persist in data quality, reproducibility, and regulatory acceptance, yet collaborative platforms are emerging to validate NAMs against historical data, fostering confidence in their causal predictive power. By 2025, projections suggest AI-enhanced predictive toxicology could lower drug attrition from toxicity by integrating global datasets, though empirical validation through prospective studies remains critical to counter overfitting risks in ML models. Overall, these directions prioritize mechanistic understanding over empirical correlations, aligning with first principles of dose-response causality to refine safety assessments.
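One widely used, model-agnostic XAI technique of the kind mentioned above is permutation importance: shuffle one input at a time and measure how much the trained model's accuracy degrades. The sketch below applies it to a random-forest toxicity classifier on synthetic data; the feature names and the generating rule are invented for the example.

```python
# Permutation-importance sketch for an XAI-style rationale (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 600
X = rng.normal(size=(n, 3))  # stand-in descriptor/omics features
# Invented rule: feature 0 matters most, feature 1 less, feature 2 not at all.
y = (1.5 * X[:, 0] - X[:, 1] + rng.normal(0.0, 0.5, n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

result = permutation_importance(clf, X_te, y_te, n_repeats=20, random_state=0)
for name, imp in zip(["feat_0", "feat_1", "feat_2"], result.importances_mean):
    print(f"{name}: mean accuracy drop when permuted = {imp:.3f}")
```

Recovering the planted ranking (feature 0 > feature 1 > feature 2) is the interpretable rationale regulators ask for: it ties a black-box prediction back to specific inputs that can be checked against known toxicological mechanisms.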
