
Altmetrics

Altmetrics, short for "alternative metrics," are quantitative indicators used to assess the broader societal and online engagement with scholarly outputs—such as articles, datasets, and books—by aggregating data from sources including social media mentions, news coverage, blog discussions, Wikipedia references, and policy documents, thereby complementing or contrasting traditional bibliometric measures centered on peer citations. The term was coined in September 2010 by Jason Priem, then a doctoral student in information and library science, who proposed it via Twitter as a means to harness emergent web data for evaluating research influence more rapidly and diversely than citation delays allow. This innovation stemmed from recognition that digital platforms enable real-time tracking of attention, potentially revealing impacts on public discourse, policy, and interdisciplinary audiences overlooked by citation networks. Following Priem's introduction, altmetrics gained traction through early manifestos and tools like ImpactStory (co-founded by Priem) and Altmetric.com, established in 2011 by Euan Adie to compute aggregated "Attention Scores" weighting source volume, influence, and recency. These scores, visualized via "donuts" categorizing attention sources (e.g., Twitter, Reddit, mainstream news), have been integrated into publisher platforms and institutional repositories to signal potential reach, with empirical studies showing moderate positive correlations to eventual citation counts in fields like medicine and ecology, though often lagging in predictive power for long-term influence. Adoption has accelerated amid open access movements and funder demands for demonstrable societal return on research investment, yet altmetrics remain adjunctive rather than substitutive, as they prioritize visibility over validated quality or causal effects. Criticisms highlight altmetrics' vulnerability to noise, such as inflated scores from controversial or sensational content rather than rigorous scholarship, and susceptibility to manipulation via bots or coordinated sharing campaigns, undermining claims of measuring true impact. Peer-reviewed analyses, including evaluations against UK Research Excellence Framework quality scores, reveal weak or inconsistent links to expert-assessed excellence across disciplines, suggesting altmetrics better capture attention than substantive value and risk incentivizing performative dissemination over depth. Despite these limitations, ongoing refinements—such as sentiment analysis and source weighting—aim to enhance reliability, positioning altmetrics as a partial tool in multifaceted research evaluation amid evolving scholarly communication landscapes.

Origins and History

Coining of the Term and Initial Manifesto

The term altmetrics was coined by Jason Priem, then a doctoral student at the University of North Carolina at Chapel Hill's School of Information and Library Science, in a Twitter post on September 29, 2010. In the post, Priem wrote: "I like the term #articlelevelmetrics, but it fails to imply diversity of measures. Lately, I'm liking #altmetrics," positioning the concept as an alternative to traditional article-level metrics focused primarily on citations, emphasizing instead a broader array of online indicators. Shortly thereafter, Priem co-authored the Altmetrics Manifesto with Dario Taraborelli, Paul Groth, and Cameron Neylon, published online in October 2010. The manifesto critiqued conventional citation-based evaluation as "slow" and lagging "years behind real impact," arguing that it incentivizes "conventionality" and overlooks accountability, since most papers eventually garner some citations regardless of influence. It proposed altmetrics as complementary measures drawing from social media interactions (such as tweets and shares), blog mentions, reference manager bookmarks, news coverage, downloads, and other web-based signals to quantify the "diverse, heterogeneous impacts" of scholarship that citations miss, including rapid dissemination in the "burgeoning ecosystem" of online scholarly discourse. This rationale stemmed from empirical observations of growing online scholarly activity, where web data provided near-real-time proxies for influence unavailable through delayed citation counts. The manifesto's reception was polarized: it garnered enthusiasm among open science and open access advocates for highlighting timely, non-elite forms of impact beyond journal gatekeeping, but elicited skepticism from bibliometricians, who questioned altmetrics' validity in reliably measuring substantive scholarly impact amid potential noise from transient online attention.

Early Developments and Key Milestones (2010–2020)

In 2011, the launch of dedicated altmetrics aggregators marked a pivotal step in operationalizing the concept beyond theoretical manifestos. Altmetric.com, founded by Euan Adie, began tracking scholarly mentions across social media, news outlets, and policy documents, providing researchers with aggregated attention data via embeddable widgets. Concurrently, Plum Analytics emerged, offering services to institutions for monitoring diverse impact indicators such as downloads, shares, and discussions. These tools capitalized on the expanding availability of public APIs from platforms like Twitter and Mendeley, facilitating automated collection of online interactions tied to research outputs via DOIs. By 2012, further milestones included the public launch of Impactstory, an open-source platform developed by Jason Priem and others to generate narrative profiles of research influence, integrating metrics from sources like blogs, Twitter, and academic social networks. PLOS ONE advanced practical adoption that year by announcing its Altmetrics Collection, extending its 2009 Article-Level Metrics system—which already captured views, citations, and early social signals—to highlight articles with notable online buzz. Partnerships proliferated, with journals embedding provider badges (e.g., Altmetric's colorful donuts) to display real-time scores, enabling authors and readers to gauge broader societal engagement without relying solely on delayed citation counts. From 2016 to 2020, altmetrics saw maturation through exploratory integrations into evaluation practices, driven partly by empirical critiques of traditional metrics' vulnerabilities, such as citation cartels where coordinated groups artificially boost rankings via reciprocal citing. Publishers and funders piloted altmetrics in assessments to capture faster, multifaceted impacts, though challenges in data consistency persisted due to varying source weights and platform algorithm changes. This period's growth was underpinned by widespread DOI adoption—reaching over 100 million registered DOIs by 2018—and enhanced API access, which lowered barriers to harvesting signals from expanding social media ecosystems, fostering scalability and cross-platform interoperability.

Definition and Core Concepts

Precise Definition and Scope

Altmetrics encompass quantitative indicators derived from online activities surrounding scholarly outputs, such as mentions, shares, and discussions on social media platforms (e.g., X, formerly Twitter), blogs, news sites, and reference managers, capturing post-publication attention traces that are systematically trackable via public APIs. These metrics focus exclusively on digital footprints of engagement occurring after a work's release, excluding pre-publication processes like peer review or traditional database citations, and are limited to verifiable, platform-specific counts rather than qualitative assessments of influence. For example, a paper might accrue altmetric data from 50 X posts linking to it within days of publication, reflecting immediate visibility among online audiences. The core scope delineates measurable impacts as aggregated counts of interactions from predefined sources, such as policy citations in government documents or saves in tools like Mendeley, while excluding unquantifiable or offline influences like classroom adoptions or private discussions. Altmetrics do not extend to causal evaluations of real-world application, such as policy changes or technological implementations stemming from research, as these require longitudinal tracing beyond API-accessible data. Empirical studies from the 2010s, analyzing datasets from platforms like Twitter and blogs, demonstrated that altmetrics capture visibility in non-academic spheres but exhibit low to moderate correlations with citation counts (Spearman rho often 0.1–0.3 across disciplines), indicating they delineate a separate dimension of attention with minimal overlap in identifying high-impact works. A key construct within this scope is the Altmetric Attention Score, defined as a weighted summation of attention sources tailored to individual outputs, prioritizing rapid, diverse signals over normalized benchmarks. This score quantifies exposure breadth—for instance, weighting a news outlet mention higher than a single tweet—but inherently measures raw visibility without verifying comprehension, endorsement, or quality, as sources may include automated shares, bots, or dissenting commentary. Thus, while altmetrics empirically bound scholarly impact to observable online metrics, they necessitate interpretation cautious of gaming vulnerabilities, such as coordinated amplification campaigns observed in early platform data.

Distinction from Traditional Bibliometrics

Traditional bibliometrics, exemplified by metrics such as the h-index—which quantifies a researcher's productivity and citation impact—and journal impact factors—which average citations received by articles in a journal over a two-year window—rely exclusively on formal peer citations within scholarly literature. These measures accumulate slowly, often requiring years for citations to accrue, as they depend on deliberate evaluation and incorporation by domain experts, ensuring a degree of rigor through vetted validation. In divergence, altmetrics aggregate signals from non-academic online activities, such as social media mentions and webpage views, enabling near-immediate feedback (frequently within days of publication) but introducing susceptibility to transient noise, including bots, self-promotion, or coordinated campaigns that do not necessitate substantive engagement. Causal differences arise from their foundational mechanisms: bibliometric citations represent causal chains of expert acknowledgment and knowledge building, filtered by peer review and relevance to advancing fields, whereas altmetrics track broader attention diffusion, often amplified by platform algorithms that prioritize virality and engagement over evidentiary depth, potentially fostering echo chambers where ideological alignment or sensationalism drives metrics independently of scholarly merit. Empirical analyses underscore this disconnect, with multiple studies reporting weak positive correlations between altmetric attention scores and citation counts—typically Pearson's r values under 0.3 across disciplines—suggesting altmetrics more reliably indicate early visibility or buzz than enduring intellectual influence. For instance, a 2014 comprehensive comparison of highly cited articles found altmetrics captured distinct high-visibility outliers but failed to align with citation-based rankings, highlighting their non-equivalence in assessing impact. While bibliometrics exhibit shortcomings, such as field-specific biases (e.g., faster citation rates in biomedicine versus mathematics) and insensitivity to non-citation societal effects, they mitigate superficial inflation through their lag and exclusivity to expert networks. Altmetrics, conversely, risk overemphasizing manipulable or ephemeral signals—evident in cases where retracted papers garnered sustained online buzz post-retraction—without inherent safeguards against low-quality amplification, though they can reveal impacts invisible to citation logs, like policy uptake. This empirical divergence implies neither fully supplants the other; correlations remain too modest for altmetrics to predict bibliometric success reliably, positioning them as complementary diagnostics rather than proxies.

Data Sources and Measurement Categories

Social Media Mentions and Discussions

Social media mentions and discussions form a core component of altmetrics by tracking conversational engagement with scholarly outputs on platforms including Twitter (rebranded as X in 2023), Reddit, Facebook, and blogs. These sources aggregate metrics such as shares, likes, retweets, comments, and direct mentions, which reflect public and peer discourse rather than passive consumption. Altmetric.com, a primary provider, monitors over 14,000 blogs alongside social platforms for links and references to research, emphasizing discussion volume as an indicator of broader societal attention. Despite generating high volumes of data—often in the thousands of mentions for high-profile papers—these metrics exhibit a low signal-to-noise ratio due to factors like automated bot activity, self-promotional posts, and ephemeral trends that dilute meaningful scholarly dialogue. A 2012 analysis of altmetrics highlighted persistent noise from irrelevant or superficial interactions, even after normalization efforts, underscoring that raw counts rarely equate to substantive engagement. Platforms like Reddit contribute niche discussions in subreddits, but these are prone to echo chambers and non-expert commentary, further complicating interpretability. Empirical studies from the early 2010s, such as a 2013 examination of Twitter and ten other social web services, revealed statistically significant but weak correlations between mention volumes and traditional metrics like citations (e.g., positive altmetric scores associated with higher citations across disciplines), with even looser ties to article downloads or sustained engagement. This suggests social media discussions serve more as early buzz indicators than reliable proxies for deeper influence, as short-lived conversations often fail to predict long-term scholarly uptake. In the 2020s, platform dynamics have intensified challenges; following the Twitter-to-X rebrand, overall activity declined yet remained above pre-2020 levels, while academic users reported reduced visibility and engagement due to algorithmic shifts favoring paid or high-follower accounts. Many scholars migrated to alternatives like Bluesky, eroding X's dominance in tracking research discussions and prompting altmetrics providers to adapt data collection amid fragmented audiences. Providers continue monitoring X for its residual role in academic sharing, but the shift highlights the fragility of reliance on single-platform discussions for robust altmetrics.

Views, Downloads, and Saves

Views in altmetrics encompass page views on publisher websites, captured through server logs or integrated analytics tools, providing an early indicator of accessibility shortly after publication. Downloads, similarly, track full-text retrievals such as PDF files, with counts often standardized under the COUNTER Code of Practice to enable consistent reporting and comparability across publishers and platforms. These usage metrics accumulate more rapidly than traditional citations, frequently registering within days of release, as they reflect immediate online consumption rather than delayed peer validation. Saves, or bookmarking actions, are derived from reference management tools like Mendeley and Zotero, where users archive articles for personal libraries, signaling perceived future relevance. Open access publication enhances these metrics, with studies showing open access articles in medical journals achieving significantly higher page views and PDF downloads than subscription-based equivalents, attributed to barrier-free access. This boost underscores how publication models influence visibility, though raw counts from diverse sources may vary in reliability due to differences in tracking methodologies. Despite their immediacy, views, downloads, and saves primarily trace passive exposure and do not reliably indicate comprehension or substantive engagement, as usage metrics capture access events without verifying reading or retention. For example, automated bots or cursory scans can inflate views, while saves may reflect topical interest over thorough review, limiting their interpretive value without contextual analysis. Thus, these indicators serve best as proxies for reach in altmetrics frameworks, complemented by more active measures elsewhere.
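For illustration, COUNTER's Code of Practice includes rules such as double-click filtering, under which repeat requests for the same item by the same user within 30 seconds count once. The sketch below implements only that single rule on hypothetical log events; it is not a full COUNTER implementation, and the event fields are invented for the example.

```python
# Minimal sketch of COUNTER-style "double-click filtering": repeat
# requests for the same item by the same user within 30 seconds are
# collapsed into one countable event. Fields are illustrative.
from datetime import datetime, timedelta

DOUBLE_CLICK_WINDOW = timedelta(seconds=30)

def count_downloads(events: list[tuple[str, str, datetime]]) -> int:
    """events: (user_id, item_id, timestamp) tuples, assumed sorted by time."""
    last_seen: dict[tuple[str, str], datetime] = {}
    count = 0
    for user, item, ts in events:
        key = (user, item)
        prev = last_seen.get(key)
        if prev is None or ts - prev > DOUBLE_CLICK_WINDOW:
            count += 1  # a new countable request
        last_seen[key] = ts
    return count

t0 = datetime(2025, 1, 1, 12, 0, 0)
events = [
    ("u1", "doi:10.x/abc", t0),
    ("u1", "doi:10.x/abc", t0 + timedelta(seconds=10)),  # filtered out
    ("u1", "doi:10.x/abc", t0 + timedelta(minutes=5)),   # counted
    ("u2", "doi:10.x/abc", t0 + timedelta(seconds=5)),   # different user
]
print(count_downloads(events))  # 3
```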

Recommendations, Citations, and Policy Documents

Altmetrics encompass references to scholarly outputs in policy documents, which include government guidelines, white papers, reports from policy institutes, and international organization publications such as those from the World Health Organization (WHO). These mentions typically involve explicit citations or discussions of research findings informing policy decisions, signaling potential real-world application beyond academia. Empirical studies using Altmetric.com data reveal such policy citations are infrequent, affecting a very small proportion of scientific papers—often less than 1% in large samples from databases like the Web of Science—yet they carry substantial interpretive weight as indicators of societal influence due to the authoritative nature of the citing entities. Recommendations in altmetrics capture user-endorsed highlights from platforms like Goodreads, where readers rate and suggest books, providing insight into public engagement with scholarly monographs or accessible texts. Similarly, Facebook recommendations track algorithmic or user-promoted shares that endorse content, though these are distinguished from mere mentions by their affirmative intent. These sources bridge informal public validation with structured feedback, offering a hybrid metric that complements traditional citations but remains susceptible to popularity biases rather than rigorous evaluation. Citations within altmetrics extend to non-journal contexts, such as early or alternative references in repositories like PubMed Central (PMC), clinical guidelines, and syllabi, which reflect practical adoption in education, healthcare, or preliminary scholarly discourse. Clinical guidelines, for instance, document research integration into evidence-based medical protocols, with Altmetric tracking citations from sources providing healthcare decision guidance. Government citations in policy documents overlap here, forming a hybrid category that approximates traditional citations in formality while emphasizing applied impact; however, these constitute a minor fraction of overall altmetric signals, typically under 5% of attention scores in domain-specific analyses, underscoring their rarity amid dominant social media volume.

Calculation Methods and Scoring Systems

Components of the Altmetric Attention Score

The Altmetric Attention Score (AAS) aggregates attention events from tracked online sources into a single metric through an automated, proprietary algorithm that emphasizes volume, source type, and contextual modifiers. Each unique mention—defined as one per individual per source to prevent duplication from repeated shares—is weighted according to the perceived reach of the platform or outlet, then adjusted for factors such as the poster's influence (e.g., follower count or authority) and, in some cases, recency thresholds for historical data inclusion. The core computation follows a summation structure: AAS is the rounded sum across sources of (mention count from source * source weight * applicable modifiers). Source weights assign higher values to outlets with broader audiences; for instance, mainstream news articles receive a base weight of 8, blogs 5, policy documents or Wikipedia edits 3, and X (Twitter) posts 1 (with Facebook at 0.25). Modifiers refine this: news sources are tiered by prominence (e.g., major national outlets amplified over niche sites), while social media mentions incorporate "reach" (audience size), "promiscuity" (tendency to share indiscriminately), and bias adjustments to discount low-effort or automated activity. Certain sources, like Mendeley reads or citation databases, contribute zero weight by design. Illustrative examples highlight the aggregation without implying equivalence to scholarly quality. A 2023 research article garnering 100 unique X mentions and 5 news outlet citations might yield a base of 100 (X posts at weight 1) plus 40 (news at weight 8), totaling around 140 before modifiers potentially elevating it to approximately 150 if influential accounts or high-tier outlets are involved. In practice, final scores diverge from raw tallies due to these adjustments; for example, 84 tweets combined with 2 news mentions produced an AAS of 85 rather than a simple 100, reflecting deductions for retweet discounts (retweets counted at a 0.85 multiplier) or lower-influence posters. Scores vary substantially by discipline, with biomedical outputs often registering higher due to denser news and social media ecosystems compared to fields where attention clusters in peer reviews or syllabi (both weight 1).
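The summation structure described above can be sketched as follows. The base weights match the published values cited in this section, but the modifier logic is a simplified stand-in for Altmetric's proprietary adjustments, so this is a minimal illustration rather than a reimplementation of the actual scoring.

```python
# Illustrative sketch of the AAS summation structure described above.
# Weights follow the published base values; the modifier handling is a
# simplified stand-in for Altmetric's proprietary adjustments.

SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "policy": 3.0,
    "wikipedia": 3.0,
    "x_post": 1.0,      # original posts on X (formerly Twitter)
    "facebook": 0.25,
    "mendeley": 0.0,    # readership is displayed but not scored
}

def attention_score(mentions: dict[str, int],
                    modifiers: dict[str, float] | None = None) -> int:
    """Rounded sum of (count * source weight * modifier) across sources."""
    modifiers = modifiers or {}
    total = sum(
        count * SOURCE_WEIGHTS.get(source, 0.0) * modifiers.get(source, 1.0)
        for source, count in mentions.items()
    )
    return round(total)

# The worked example from the text: 84 tweets plus 2 news mentions total
# 100 unmodified; discounts for reposts or low-influence posters can
# pull the final score down to roughly 85.
print(attention_score({"x_post": 84, "news": 2}))                    # 100
print(attention_score({"x_post": 84, "news": 2}, {"x_post": 0.82}))  # 85
```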

Algorithmic Weighting and Potential Biases in Computation

The Altmetric Attention Score assigns differential weights to mentions from various sources based on an algorithm that factors in source type, reach, and perceived influence, with policy documents and news outlets typically receiving higher multipliers than individual social media posts such as tweets. For instance, a single mention in a policy document may contribute substantially more to the score than a tweet, reflecting assumptions about the former's authoritative impact, though exact multipliers remain proprietary and vary by context. These weights derive from heuristic judgments rather than rigorous empirical testing against outcomes like long-term citation accrual or policy adoption, leading to critiques that they embed unverified assumptions about source quality. Algorithm designers prioritize factors like outlet prestige and dissemination potential, yet analyses indicate limited validation, with weights potentially overemphasizing transient visibility over substantive engagement. A key bias arises from the algorithm's sensitivity to volume-driven attention, which amplifies sensational or timely topics prone to hype, as higher-weight sources like mainstream news often prioritize novelty over depth. This can distort scores towards sensationalism or transient spikes, deviating from neutral impact measurement. For example, in 2020, COVID-19-related research outputs saw altmetric scores surge due to intensive media coverage, with top papers achieving scores far exceeding norms in comparable fields, driven by aggregated mentions rather than balanced assessment. Such computational biases risk conflating raw attention with meaningful engagement, as unweighted or low-weight sources (e.g., expert peer reviews) may signal deeper scrutiny but contribute less, while hype cycles inflate scores without corresponding evidence of causal impact. Empirical scrutiny reveals that these weighting choices, while intended to approximate societal reach, often lack transparency and validation, potentially perpetuating imbalances favoring accessible, populist channels over specialized ones.
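To make the volume and source-mix sensitivity concrete, here is a small comparison under the same illustrative weights used in the sketch above: two outputs with equal mention counts diverge sharply depending on where the attention occurs, which is exactly the distortion critics describe.

```python
# Same total of 20 mentions, different source mixes (illustrative
# weights only): news-heavy coverage dominates the score even when
# expert engagement (e.g., peer reviews at weight 1) is equal in volume.
weights = {"news": 8.0, "blog": 5.0, "x_post": 1.0, "peer_review": 1.0}

def score(mentions: dict[str, int]) -> int:
    return round(sum(n * weights[src] for src, n in mentions.items()))

viral_paper = {"news": 15, "x_post": 5}            # hyped in the media
specialist_paper = {"peer_review": 15, "blog": 5}  # expert attention
print(score(viral_paper))       # 125
print(score(specialist_paper))  # 40
```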

Adoption and Practical Implementation

Integration in Academic Institutions and Publishers

Major publishers have incorporated altmetrics into their platforms to provide authors and readers with indicators of online attention beyond traditional citations. Elsevier began displaying Altmetric Attention Scores and badges on article pages across hundreds of journals starting in 2015, integrating data into tools like Scopus to track mentions, news coverage, and policy citations. Other major publishers implemented article-level metrics, including altmetrics components such as social shares and views, as early as 2013–2014 to highlight broader dissemination of published content. These integrations allow publishers to embed altmetric widgets, keyed to digital object identifiers (DOIs), directly in journal interfaces, facilitating real-time visibility tracking for over 250 Elsevier titles by the mid-2010s. Academic institutions have adopted altmetrics for internal evaluations and reporting, often through institutional licenses to aggregators. Universities in the UK, for instance, leveraged altmetrics during preparations for the 2021 Research Excellence Framework (REF) to quantify publication trends and societal engagement, with analyses showing higher attention scores for openly accessible outputs. Adoption rates vary, but by 2020, numerous institutions subscribed to altmetrics services to benchmark departmental outputs against peers, driven by funder mandates emphasizing demonstrable public impact. Key motivations include capturing timely societal reach, particularly amplified by open access policies; studies indicate open access articles garner 20-50% higher altmetric scores due to increased online accessibility and sharing. This complements citation lags in fast-moving fields, enabling institutions to report on policy citations and media uptake in funding bids. While traditional bibliometrics remain dominant, altmetrics integration has grown empirically, with publisher dashboards reporting sustained use for author services and journal benchmarking since the 2010s.

Available Tools, Platforms, and Vendor Examples

Altmetric.com, developed by Digital Science, is a leading proprietary platform that aggregates attention data from sources including social media, news outlets, blogs, and policy documents, presenting it via embeddable badges and detailed visualizations such as the "donut" graphic representing the Attention Score. It offers free tools like a bookmarklet for browser-based querying of article metrics and API access for non-commercial research, enabling institutions and individuals to integrate attention indicators into repositories and personal profiles. However, its proprietary aggregation methods limit full transparency in source selection and scoring. PlumX Metrics, provided by Elsevier following its acquisition of Plum Analytics, categorizes interactions into five groups—captures (e.g., saves in Mendeley), citations, mentions, social media shares, and usage (views/downloads)—and displays them through icons on publisher platforms without a single weighted score. This approach allows granular analysis of engagement types but relies on Elsevier's infrastructure for comprehensive coverage, with harvesting that may prioritize affiliated content. Dimensions, also from Digital Science, supplies free badges that combine altmetrics with citation data in interactive formats, facilitating quick assessments of publication reach across academic and online channels. For transparency-focused alternatives, Impactstory stands out as an open-source platform enabling researchers to create profiles tracking diverse impacts via public APIs, integrating with ORCID for verifiable outputs and emphasizing accessible, shareable metrics without commercial gatekeeping. While proprietary vendors like Altmetric and PlumX offer broader, automated tracking suitable for institutional integration, open tools such as Impactstory promote verifiability through code availability, though they may cover fewer sources due to reliance on volunteer-maintained data feeds.
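As a concrete illustration of the API access mentioned above, the following is a minimal sketch querying Altmetric's free Details Page endpoint for a single DOI. The endpoint URL and JSON field names reflect the public documentation at the time of writing, but they should be treated as assumptions and verified against the current API docs before use.

```python
# Minimal sketch: querying Altmetric's free Details Page API for a DOI.
# Endpoint and field names follow the public docs; verify before use.
import requests

def fetch_attention(doi: str) -> dict:
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:
        return {}  # no attention recorded for this output
    resp.raise_for_status()
    return resp.json()

data = fetch_attention("10.1038/nature12373")  # example DOI
if data:
    print("Attention Score:", data.get("score"))
    print("News stories:   ", data.get("cited_by_msm_count", 0))
    print("X/Twitter posts:", data.get("cited_by_tweeters_count", 0))
```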

Applications and Interpretive Frameworks

Valid Use Cases for Broader Impact Assessment

Altmetrics enable the tracking of research dissemination into policy spheres by aggregating citations in official documents, offering quantifiable evidence of influence on policymaking processes that traditional metrics overlook. A study examining Altmetric.com data found that policy mentions effectively identify articles shaping governmental and organizational agendas, with mention and document counts corresponding to documented broader impacts in areas like public health and environmental regulation. In climate research, for example, papers achieving high Altmetric Attention Scores through news coverage—such as a 2023 analysis of ExxonMobil's warming projections with a score of 8,686 from 823 news outlets—have fueled debates on corporate accountability, demonstrating how media uptake signals potential pathways to policy adoption. These metrics complement citation-based assessments by revealing interdisciplinary and societal engagement, particularly for open access publications that amplify reach beyond academic silos. Data from Pfizer-sponsored works (2014–2019) showed open access articles receiving markedly higher mentions and contributions to policy discussions compared to closed access counterparts, with 18 of the top 20 highest-scoring outputs being openly available. Similarly, in the environment and ecology domain, altmetrics analysis of over 12,000 social media mentions and 1,500 news items for publications (2012–2021) provided dashboards of audience engagement—distinguishing researchers from public actors—and supported evaluations for funders and institutions by evidencing non-scholarly reach. Such applications succeed when altmetrics are integrated as supplementary indicators, capturing rapid-response impacts like public discourse in urgent fields. Empirical cases confirm their role in validating dissemination efficacy, where elevated scores from diverse sources predict sustained external attention without supplanting peer-reviewed rigor.

Methodological Limitations and Misinterpretation Risks

Altmetrics scores can mislead users by conflating raw attention volume with substantive validation or comprehension, as mentions and shares frequently prioritize novelty over rigorous evaluation of content quality. For example, social media interactions often capture transient interest without indicating whether the audience has read, understood, or endorsed the underlying research, leading to overinterpretation of virality as merit. A prominent illustration involves retracted publications, which retain elevated altmetric attention long after withdrawal; analysis of cross-platform data from 2010 to 2020 revealed that social media and news mentions for such papers declined minimally post-retraction, perpetuating potential misinformation through uncorrected visibility. In one dataset of over 3,000 retracted biomedical articles, 279 retracted within a year still accrued ongoing media and social engagement, underscoring how altmetrics fail to dynamically adjust for validity flags. Methodologically, altmetrics suffer from low inter-provider consistency, with scores for identical outputs varying due to disparate collection times, source coverage, and algorithmic thresholds; a 2015 comparison of 1,000 PLOS ONE articles across providers like Altmetric.com and ImpactStory found discrepancies in counts for the same metrics, such as Mendeley readers, attributable to incomplete API syncing and selective indexing. This variability precludes reproducible results, as re-queries at different intervals or via alternative platforms yield divergent figures, complicating longitudinal or cross-study assessments. Disciplinary imbalances exacerbate under-measurement risks, particularly in humanities fields where online dissemination norms favor traditional outlets over social platforms; a 2015 study of humanities publications from 2012 documented sparse altmetric coverage, with fewer than 10% registering notable scores compared to science equivalents, reflecting lower baseline engagement rather than diminished impact. Such gaps arise from humanities scholars' reliance on non-digital networks, rendering altmetrics ill-suited for equitable evaluation across domains without field-normalized adjustments.

Empirical Evidence on Effectiveness

Correlations with Citation-Based Metrics

Studies examining the relationship between altmetrics, particularly the Altmetric Attention Score (AAS), and traditional citation counts have generally reported weak to moderate positive correlations, with coefficients typically ranging from 0.19 to 0.40 across various fields. A 2021 meta-analysis of health sciences publications found a pooled correlation of r = 0.19 (95% CI: 0.12–0.25), indicating limited overlap between online attention and scholarly citations. Similarly, analyses in the clinical and translational research literature for articles published around 2016 yielded r = 0.40 for citations but only r = 0.25 with journal impact factors, highlighting discipline-specific variability and overall modest associations. Correlations tend to be stronger for recently published and open access articles, where altmetrics capture initial visibility that aligns more closely with early citation accrual. Open access publications exhibit an "altmetrics advantage," with higher AAS driven by broader online dissemination, though this does not consistently translate to sustained citation gains. However, these links weaken over time; altmetrics modestly forecast short-term citations (within 1–2 years) but show negligible predictive power for long-term trajectories, suggesting they reflect transient buzz rather than enduring scholarly influence. Inconsistencies across studies underscore that while positive correlations exist, they are insufficient to equate altmetric attention with citation-based impact, with r values rarely exceeding 0.3 in cross-disciplinary samples and often lower in more mature fields (r ≈ 0.25). This pattern implies altmetrics may proxy early diffusion but fail to capture deeper quality signals embedded in citations accumulated over 5+ years.
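To illustrate what coefficients of this magnitude look like in practice, here is a small simulation on synthetic data (not data from any cited study): skewed attention scores only loosely coupled to citations, plus heavy-tailed noise, reproduce weak positive correlations of the kind reported above.

```python
# Illustrative simulation of the weak altmetrics-citations correlation
# on synthetic data. The lognormal shapes mimic the heavy skew of both
# distributions; the 0.1 coefficient makes the link deliberately weak.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(42)
n = 500
attention = rng.lognormal(mean=2.0, sigma=1.2, size=n)        # skewed AAS
citations = 0.1 * attention + rng.lognormal(1.5, 1.0, size=n)  # weak link

r, _ = pearsonr(attention, citations)
rho, _ = spearmanr(attention, citations)
print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}")  # weak positive
```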

Longitudinal Studies and Predictive Validity Assessments

A longitudinal analysis of altmetric indicators from PlumX across publications from 2012 to 2015 revealed distinct life cycles, with social media mentions (e.g., tweets) peaking within the first year and declining sharply thereafter, while readership metrics like Mendeley reader counts showed more sustained accumulation over multiple years but still plateaued after 3–5 years. Policy document citations exhibited the slowest growth, often requiring 5–10 years to reach meaningful levels, contrasting with the rapid but ephemeral nature of news and blog coverage. These patterns persisted across fields, underscoring altmetrics' sensitivity to source-specific temporal dynamics rather than uniform predictors of ongoing attention. Over a decade (2008–2018 cohorts), coverage and trends in five altmetric sources evolved variably: Twitter coverage surged from under 30% for older papers to over 90% for recent ones, reflecting platform maturity, while blogs declined in relative prominence and policy mentions remained sparse even after 10 years. Mendeley readership correlated with later citations in some analyses, yet overall altmetric-citation associations weakened when adjusting for publication age, as early social buzz often decoupled from cumulative scholarly uptake. Assessments of predictive validity indicate limited forecasting of enduring impact, with altmetrics explaining modest variance in long-term citations (typically R² < 0.20 across fields), as immediate attention from sources like X fails to reliably anticipate sustained citation trajectories beyond initial visibility. During the COVID-19 pandemic (2020–2022), altmetric scores inflated dramatically for pandemic-related outputs due to heightened public and media engagement, yet this attention yielded only short-term citation premiums (e.g., a 4.8-fold initial boost fading by 66–77% within a year), highlighting a decoupling from persistent scholarly influence. Longitudinal tracking post-2020 confirms that such spikes do not enhance journals' or papers' long-term metrics proportionally, attributing stagnation in lasting impact to altmetrics' skew toward transient virality over rigorous validation.

Controversies and Criticisms

Vulnerability to Manipulation and Gaming

Altmetrics' dependence on unverified online signals exposes them to manipulation via automated bot networks that simulate engagement through repeated mentions, likes, or shares. These bots, often deployed in coordinated campaigns, can generate illusory spikes in attention scores by mimicking organic activity without contributing meaningful engagement. For instance, actors may employ scripts to amplify a paper's visibility on platforms like X, exploiting the public APIs that altmetrics providers track, thereby inflating scores by factors observed in empirical analyses of suspicious mention bursts. Coordinated human-driven efforts further exacerbate vulnerabilities, such as organized groups or journal-affiliated accounts systematically sharing content to boost metrics absent genuine scholarly interest. Documented cases from the 2010s onward reveal self-promotional tactics by publishers, where repetitive cross-posting across networks elevated attention without proportional evidence of broader impact. This gaming circumvents traditional peer review gates, as altmetrics aggregate raw counts rather than vetted interactions, enabling rapid, low-cost inflation that traditional citations resist due to their slower, expert-mediated accrual. Empirical scrutiny has detected anomalous patterns in 10-20% of tracked social signals attributable to non-organic sources, including bot-like accounts and echo-chamber amplification, which erode the metrics' verifiability. Providers like Altmetric acknowledge these risks but rely on post-hoc filtering, such as excluding obvious spam accounts, yet causal loopholes persist in open ecosystems where manipulators adapt faster than detection algorithms. Consequently, unmitigated gaming undermines altmetrics' role in research evaluation, as inflated scores may prioritize visibility over substantive merit, distorting incentives in open-access environments.

Prioritization of Virality Over Scholarly Rigor

Altmetrics, by aggregating mentions from social media, news outlets, and policy documents, often elevate research outputs that achieve rapid public dissemination through novelty or controversy rather than enduring scholarly validation. This mechanism inherently favors virality, as platforms like Twitter (now X) and Facebook amplify emotionally charged or timely topics, rewarding attention over methodological soundness. For instance, in 2021, 98 of the 100 scientific articles with the highest Altmetric scores focused on COVID-19, including studies on hydroxychloroquine and ivermectin—treatments that generated intense debate but were subsequently criticized for lacking robust evidence in randomized trials. Such patterns demonstrate how altmetrics can prioritize immediate buzz, incentivizing researchers to frame findings in ways that provoke outrage or hope, diverging from the slower, peer-driven scrutiny that underpins scientific merit. Empirical analyses reveal limited alignment between high altmetric attention and indicators of rigor, such as low retraction rates or prestigious awards. Retracted COVID-19 papers, particularly those on vaccines, continued to accrue altmetric mentions post-retraction, with some sustaining elevated scores due to persistent online sharing of misinformation. Studies examining broader correlations find weak or field-specific links between altmetric scores and peer-assessed quality, with no consistent signal for scholarly excellence; for example, altmetrics showed the lowest association with quality in the arts and humanities, and only modest ties in the sciences. This disconnect underscores an incentive structure where hype—often tied to public fears or trends like the 2020s pandemic—outpaces validation, normalizing media-fueled metrics as proxies for impact despite their divergence from causal evidence of scientific advancement. The resultant distortion encourages a feedback loop wherein scholars pursue popular appeal, potentially at the expense of rigorous, incremental work that may not garner immediate attention but contributes to foundational knowledge. High-profile cases, such as debated studies amassing mentions amid safety concerns, illustrate how altmetrics can validate hype dynamics over empirical falsification, eroding the first-principles emphasis on replicability and falsifiability in favor of performative dissemination. Without adjustments for these biases, altmetrics risk entrenching a system where scholarly rigor yields to the transient rewards of controversy.

Influence of Media and Ideological Biases

Altmetrics' reliance on mentions from news outlets and social media embeds ideological biases prevalent in those platforms, as providers like Altmetric.com curate lists of tracked sources that favor mainstream outlets, which empirical analyses have shown to exhibit a systemic left-leaning tilt in coverage. For instance, a study of U.S. news outlets found that all except Fox News' Special Report and the Washington Times scored to the left of the political center on a slant measure derived from program transcripts. This curation process limits altmetrics to a narrow selection of predominantly progressive-leaning sources, potentially skewing scores toward research aligning with dominant narratives in journalism and academia, where left-wing perspectives prevail due to institutional homogeneity. Critics argue that such weighting disadvantages heterodox or conservative-leaning research, as mainstream outlets often underreport or critically frame dissenting findings and policy critiques of progressive interventions, resulting in lower visibility and scores. In social psychology, for example, altmetrics reveal a highly hierarchical distribution of attention that correlates with citation patterns but likely extends to ideological filtering, suppressing minority viewpoints in a field where conservative scholars report systemic marginalization. Defenders counter that altmetrics' inclusion of social media diversifies signals beyond elite media gatekeepers, capturing grassroots engagement that can elevate contrarian work, though empirical data indicate social platforms historically amplified left-leaning content until recent platform changes. Platform data from 2023 onward suggest topics dominating altmetrics often reflect media priorities, with progressive-aligned issues like climate policy or public health garnering disproportionate mentions, while conservative-leaning analyses on economic or social policy receive muted coverage due to source selection biases. This dynamic risks conflating virality with validity, as ideological echo chambers in curated outlets prioritize narrative fit over empirical rigor, underscoring altmetrics' vulnerability to societal skews rather than neutral measurement.

Recent Developments and Future Directions

Advancements in AI-Driven Analysis (2023–2025)

In September 2025, Altmetric launched an AI-powered sentiment analysis feature in its Explorer platform, utilizing large language models to assess the emotional tone of online mentions of research outputs. This tool processes mentions from sources such as social media and news outlets, assigning scores from -3 (strongly negative) to +3 (strongly positive) based on expressed opinions toward the research itself, thereby enabling differentiation between mere attention volume and qualitative reception. The feature supports sentiment classification by generating breakdowns, visualizations, and filters within the platform, allowing users to identify trends like endorsements from key opinion leaders or emerging reputation risks. Integration of AI-driven sentiment analysis enhances contextual understanding, moving beyond raw counts in the Altmetric Attention Score to incorporate nuance, such as distinguishing advocacy from criticism. Early vendor evaluations indicate improved interpretability for research impact assessment, though independent empirical validation of accuracy enhancements, including potential reductions in interpretive noise from ambiguous mentions, awaits peer-reviewed confirmation. AI advancements have also enabled real-time processing of expansive, heterogeneous data streams in altmetrics, facilitating more responsive tracking of attention dynamics across platforms. While specific implementations for anomaly detection—such as flagging irregular spikes in mentions indicative of coordinated activity—remain undetailed in public disclosures, the underlying machine learning capabilities position altmetrics for refined filtering of non-substantive noise, pending demonstration through controlled studies. These developments, concentrated in proprietary tools like Altmetric's, reflect a shift toward combined quantitative-qualitative metrics, with ongoing refinements reported to prioritize research-specific sentiment over general virality.
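A schematic sketch of how per-mention scores on the described -3 to +3 scale could be aggregated into a summary breakdown follows. The LLM classification step used in Altmetric's product is stubbed out here with pre-assigned values, so this illustrates only the aggregation idea, not the vendor's pipeline.

```python
# Schematic sketch: aggregating per-mention sentiment on the -3..+3
# scale described above. The LLM classifier is stubbed out with
# pre-assigned scores for illustration.
from collections import Counter
from statistics import mean

# (mention_text, sentiment) pairs; sentiments would come from an LLM.
mentions = [
    ("Landmark result, changes the field", +3),
    ("Interesting, but small sample size", -1),
    ("Sharing for my students", +1),
    ("Methodology looks flawed", -2),
]

scores = [s for _, s in mentions]
labels = Counter("positive" if s > 0 else "negative" if s < 0 else "neutral"
                 for s in scores)
print(f"mean sentiment: {mean(scores):+.2f}")  # +0.25
print(dict(labels))  # {'positive': 2, 'negative': 2}
```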

Emerging Research and Potential Reforms

Recent analyses of altmetrics data from November 2021 to November 2024 indicate a temporal decline in overall Attention Scores and mentions on X (formerly Twitter), coinciding with the platform's acquisition and rebranding, as well as external events like the 2024 U.S. elections. Despite this, X remains a dominant source for research-related posts, with weekly activity exceeding pre-2019 levels and showing increased virality through higher repost ratios, though total posts have fallen from pandemic peaks of over 3 million per month to around 2 million by early 2024. In response to platform fragmentation, altmetrics providers integrated emerging networks like Bluesky in December 2024, enabling tracking across its growing user base of over 23 million accounts to capture scholarly discussions migrating from X. Emerging research advocates broadening altmetrics scopes to enhance reliability, including systematic coverage of decentralized platforms such as Mastodon and Bluesky, alongside non-English content from regions like China and India to mitigate Anglo-centric biases. Proposals emphasize adding timestamps to scores to account for content ephemerality, such as post deletions on X, and leveraging large language models for contextual sentiment analysis to discern substantive engagement from transient noise. These reforms aim to improve metric stability and interoperability via protocols like ActivityPub, fostering a more resilient framework less dependent on single platforms; an illustrative sketch of the timestamping idea appears below. Potential directions include hybrid indices combining altmetrics with traditional citation metrics, where recent studies demonstrate altmetrics' utility in early prediction of long-term citation counts, suggesting integrated models could better validate impact. Reforms prioritizing source quality—such as enhanced weighting for expert-driven outlets over general public mentions—seek to emphasize causal connections to verifiable outcomes, like policy adoption or clinical application, rather than raw volume, though empirical validation of such adjustments remains preliminary. Ongoing work calls for greater transparency in scoring algorithms to align metrics with scholarly rigor amid evolving digital landscapes.
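The timestamping proposal can be illustrated with a hypothetical decay-and-liveness scheme. All names and parameters below are invented for illustration; this is one way such a reform could work, not any provider's implementation.

```python
# Illustrative sketch of the timestamping proposal: each mention carries
# a timestamp and a liveness flag, so a score can be recomputed as posts
# are deleted and weighted for recency. All parameters are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Mention:
    seen_at: datetime
    weight: float
    still_live: bool  # False once the source post is deleted

def ephemerality_aware_score(mentions: list[Mention],
                             now: datetime,
                             half_life_days: float = 180.0) -> float:
    """Sum of weights for live mentions, decayed by age."""
    total = 0.0
    for m in mentions:
        if not m.still_live:
            continue  # deleted posts no longer count
        age_days = (now - m.seen_at).days
        total += m.weight * 0.5 ** (age_days / half_life_days)
    return round(total, 2)

now = datetime(2025, 1, 1)
mentions = [
    Mention(now - timedelta(days=10), 1.0, True),   # fresh X post
    Mention(now - timedelta(days=400), 8.0, True),  # old news story
    Mention(now - timedelta(days=5), 1.0, False),   # deleted X post
]
print(ephemerality_aware_score(mentions, now))  # ~2.68
```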

  67. [67]
    Altmetrics - an overview | ScienceDirect Topics
    Among the publishers currently using Altmetrics are PLoS, BioMed Central, Nature, and Elsevier, the latter of which has 250 journals using this tool (http ...
  68. [68]
    Uncover open access trends and prepare for REF 2021 - Altmetric
    Sep 9, 2020 · Get a picture of the trends in online mentions for your open access publications and understand how altmetrics can be used in your REF 2021 ...Missing: pilots | Show results with:pilots
  69. [69]
    [PDF] Institutional Altmetrics and Academic Libraries
    Here again, libraries have an opportunity to take advantage of their long–standing relationships with publishers and advocate on behalf of faculty authors for.
  70. [70]
    Effects of open access publishing on article metrics in ...
    Jan 11, 2024 · Previous studies suggest that OA publishing may increase both citation counts and Altmetric scores of scientific articles – an effect that has ...
  71. [71]
    Unlocking innovation: the value of Open Access in scientific ...
    Oct 21, 2024 · Overall, the percentage of OA publications increased from 53% in 2014 to 69% in 2019. We continue to see a positive trend in OA publishing ...
  72. [72]
    Altmetric: Reveal online attention to research
    Altmetric tracks and analyzes online attention markers to published research to see where research is making a difference.Altmetric data · Free Tools · Altmetric Case Studies · Attention Score
  73. [73]
    Free Tools - Altmetric
    Free tools. We offer a range of free tools for researchers and institutions to help you get started. · Altmetric Bookmarklet · Institutional repository badges.Free Badges for Individual · Bookmarklet for Researchers · Researcher data access
  74. [74]
    PlumX Metrics | Uncover and tell the stories of research - Elsevier
    PlumX Metrics provide insights into the ways people interact with individual pieces of research output (articles, conference proceedings, book chapters, and ...
  75. [75]
    Altmetric.com or PlumX: Does it matter? - Rasuli - Wiley Online Library
    Oct 15, 2024 · This study has two main objectives: first, to evaluate how the choice between Altmetric.com and PlumX affects the results of alternative metrics analysis.<|separator|>
  76. [76]
    Dimensions Badges: A new way to see citations - Altmetric
    Jan 26, 2018 · Altmetric helped create the Dimensions badges, an interactive visualization that shows citation data for individual publications.
  77. [77]
    Impactstory: Discover the online impact of your research
    Impactstory is an open-source website that helps researchers explore and share the the online impact of their research.
  78. [78]
    Tools - Altmetrics - Guides at University of Pittsburgh
    Sep 12, 2025 · ImpactStory is a free tool that allows researchers to build their own profiles and is not attached to any institution.
  79. [79]
    How can policy document mentions to scholarly papers be ...
    Oct 7, 2023 · One of them is based on altmetrics, which is seen as an interesting possibility to quantitatively measure the broader impact of publications ( ...
  80. [80]
    Analysis: The climate papers most featured in the media in 2023
    Jan 10, 2024 · Using Altmetric data, Carbon Brief has compiled a list of the 25 most talked-about climate-related papers that were published in 2023.
  81. [81]
    Evaluative altmetrics: is there evidence for its application to research ...
    Jul 25, 2023 · This paper develops a method for applying altmetrics in the evaluation of researchers, focusing on a case study of the Environment/Ecology ESI field ...
  82. [82]
    Usefulness of altmetrics for measuring the broader impact of research
    Jul 26, 2025 · Purpose – The purpose of this case study is to investigate the usefulness of altmetrics for measuring the broader impact of research. Design/ ...
  83. [83]
    Altmetrics: Alternative Metrics for Measuring the Impact of Research
    Feb 3, 2025 · Limitations · Altmetrics can be easily distorted or misinterpreted · Almetrics are attention indicators that may not be measuring scholarly ...Missing: methodological | Show results with:methodological
  84. [84]
    A critical review on altmetrics: can we measure the social impact ...
    Altmetrics measure the digital attention received by a research output. They allow us to gauge the immediate social impact of an article by taking real-time ...Missing: controversies criticisms
  85. [85]
    Dynamics of cross-platform attention to retracted papers - PMC
    Jun 14, 2022 · The spread of potentially inaccurate or misleading results from retracted papers can harm the scientific community and the public. Here, we ...
  86. [86]
    Media and social media attention to retracted articles according to ...
    May 12, 2021 · Within our sample of 3,097 articles, 279 were retracted within a year since we retrieved data from Altmetric. For these articles we could use ...
  87. [87]
    [PDF] How consistent are altmetrics providers? Study of 1000 PLOS ONE ...
    It seems that similar metrics differ across different providers due to the difference in collection time, data sources and methods of collection among ...
  88. [88]
    [PDF] Using altmetrics for assessing research impact in the humanities
    Consequently, this paper analyses the altmetric coverage and impact of humanities-oriented articles and books published by Swedish universities during 2012.Missing: underrepresentation | Show results with:underrepresentation
  89. [89]
    An extensive analysis of the presence of altmetric data for Web ... - NIH
    The majority of altmetric events go to publications in the fields of Biomedical and Health Sciences, Social Sciences and Humanities, and Life and Earth Sciences ...
  90. [90]
    Meta‐Analysis of Correlations between Altmetric Attention Score ...
    Apr 7, 2021 · In health sciences, currently altmetric score has a positive but weak correlation with number of citations (pooled correlation = 0.19, 95% CI 0.12 to 0.25).1. Introduction · 3. Results · 3.2. Meta-Analysis Results
  91. [91]
    Examining the correlation between Altmetric score and citation count ...
    Jun 20, 2020 · We found that for articles published in 2016, the Altmetric score was weakly correlated with citation count and its median score was strongly ...
  92. [92]
    five altmetric sources observed over a decade show evolving trends ...
    Five altmetric data sources were recorded (Twitter, Mendeley, News, Blogs and Policy) and analysed for temporal trends, with particular attention being paid to ...Missing: milestones | Show results with:milestones
  93. [93]
    Early indicators of scientific impact: Predicting citations with altmetrics
    In this article, we use altmetrics to predict the short-term and long-term citations that a scholarly publication could receive.
  94. [94]
    Correlation between Altmetric Attention Scores and citation scores ...
    There was a moderate positive correlation between Altmetric Attention Scores (AAS) and citation scores (CS) overall, with a weak correlation for Surgery.
  95. [95]
    (PDF) The life cycle of the altmetric impact: a longitudinal study of six ...
    Mar 24, 2017 · The main objective of this study is to describe the life cycle of altmetric indicators in a sample of publications.
  96. [96]
    [PDF] Could scientists use Altmetric.com scores to predict longer ... - arXiv
    Altmetric.com scores intuitively, in conjunction with journal impact, to get an idea of which articles are more likely to attract longer term citations.
  97. [97]
    The significant yet short-term influence of research covidization on ...
    Mar 6, 2023 · This study uses a generalized difference-in-differences approach to examine the impact of publishing COVID-19 papers on journal citations and related metrics.<|control11|><|separator|>
  98. [98]
    Gaming altmetrics
    Sep 18, 2013 · Gaming in altmetrics is any activity intended to influence article metrics, where the content adds no value to the conversation around a paper.Missing: vulnerability | Show results with:vulnerability
  99. [99]
    Chapter 3. Issues, Controversies, and Opportunities for Altmetrics
    The primary defense of altmetrics against accusations of gaming vulnerability therefore comes down to three main points. First, efforts to game the system ...
  100. [100]
    (PDF) Research functionality and academic publishing: Gaming with ...
    Aug 10, 2025 · Altmetrics also suffers from accidental manipulation as the research articles may be discussed on the social web for negative purposes, such as ...Missing: vulnerability | Show results with:vulnerability
  101. [101]
    How easy is it to game the altmetric score?
    Jun 28, 2019 · Like any metric, there's a potential for gaming of altmetrics: Anyone with enough time on their hands can artificially inflate the altmetrics ...Missing: vulnerability | Show results with:vulnerability
  102. [102]
    The "Dark Side" of Academics? Emerging Issues in the Gaming and ...
    Aug 7, 2025 · The dramatic expansion of the use of metrics in higher education institutions worldwide has brought with it gaming and manipulation ...
  103. [103]
    [PDF] Suppression - Issues in Science and Technology
    Of the 100 scientific articles with the highest. Altmetric score in 2021, 98 were about COVID-19, including papers on ivermectin and hydroxychloroquine, two ...
  104. [104]
    Sharing of retracted COVID-19 articles: an altmetric study - PMC
    The correlation between the time since retraction and Altmetric Attention Score was 0.19, indicating that articles retracted further in the past did not ...
  105. [105]
    Do altmetrics correlate with the quality of papers? A large-scale ...
    In this study, we address the question whether (and to what extent, respectively) altmetrics are related to the scientific quality of papers (as measured by ...
  106. [106]
    Mapping global public perspectives on mRNA vaccines and ... - Nature
    Nov 14, 2024 · Our findings reveal widespread negative sentiment and a global lack of confidence in the safety, effectiveness, and trustworthiness of mRNA vaccines and ...
  107. [107]
    High social media attention scores are not reflective of study quality
    This is consistent with research finding that Altmetric Attention scores are not necessarily reflective of study quality [41] . One cause of the discrepancy may ...
  108. [108]
    [PDF] A MEASURE OF MEDIA BIAS1 - Columbia University
    Our results show a strong liberal bias. All of the news outlets except Fox News'. Special Report and the Washington Times received a score to the left of the ...Missing: altmetrics | Show results with:altmetrics
  109. [109]
    A call for broadening the altmetrics tent to democratize science ... - NIH
    Feb 7, 2025 · Left and right Altmetric badges represent publications with Altmetric scores of at least 10 and 100, respectively. (C) Temporal decline in ...
  110. [110]
    [PDF] Altmetric scores in Political Science are gendered – does it matter?
    Jun 19, 2023 · Finding that altmetrics show a highly hierarchical and gendered spread of attention to work in political science, they ask how and why these.
  111. [111]
    An assessment of the precision and recall of Altmetric.com news ...
    To our knowledge, no empirical study has documented the frequency of these missed mentions. Previous work that relies on Altmetric news mention data. Altmetric ...
  112. [112]
    Altmetric adds Sentiment Analysis to social media tracking
    Sep 2, 2025 · Altmetric has introduced a new AI-powered sentiment analysis feature, providing research teams with deeper insights into the public response ...Missing: anomaly detection
  113. [113]
    Sentiment analysis - Altmetric Support
    Aug 27, 2025 · Sentiment analysis in Altmetric uses AI to understand opinions on research, assigning scores from -3 (strongly negative) to 3 (strongly ...Missing: anomaly | Show results with:anomaly
  114. [114]
    Altmetric Sentiment Analysis
    Our Sentiment Analysis tool uses AI to examine each mention of your research and assign it a sentiment score. Find out more.Missing: anomaly detection
  115. [115]
    Altmetric launches AI sentiment analysis for research impact
    Sep 3, 2025 · Altmetric has launched an AI-powered sentiment analysis feature to provide deeper insights into how research is received online.Missing: anomaly detection
  116. [116]
  117. [117]
    A call for broadening the altmetrics tent to democratize science ...
    Feb 7, 2025 · Left and right Altmetric badges represent publications with Altmetric scores of at least 10 and 100, respectively. (C) Temporal decline in ...Missing: embedding | Show results with:embedding
  118. [118]
    Track your research across emerging platforms with Bluesky on
    Dec 3, 2024 · As one of the first to integrate Bluesky, Altmetric continues to empower the research community with transparent and resilient data tracking.
  119. [119]
    Early indicators of scientific impact: Predicting citations with altmetrics
    Aug 10, 2025 · In this article, we use altmetrics to predict the short-term and long-term citations that a scholarly publication could receive. We build ...