References
- [1] Benchmarking Weak Supervision on Realistic Tasks. NeurIPS Proceedings.
- [2] The Weak Supervision Landscape. arXiv, Mar 30, 2022.
- [3] Image Annotation For Computer Vision And AI Model Training. Jul 10, 2025.
- [4] Medical Image Segmentation with Limited Supervision. arXiv:2103.00429, Feb 28, 2021.
- [5] Fine-tuning coreference resolution for different styles of clinical ...
- [6] Class-imbalanced datasets. Machine Learning, Aug 28, 2025.
- [7] The Optimal Sample Complexity of PAC Learning.
- [8] Snorkel: Rapid Training Data Creation with Weak Supervision. arXiv:1711.10160, Nov 28, 2017.
- [9] Distant supervision for relation extraction without labeled data. Mintz, M., Bills, S., Snow, R., and Jurafsky, D. Proceedings of ACL-IJCNLP, 2009.
- [10] Doubly Robust Crowdsourcing. Jan 12, 2022.
- [11] Leveraging large language models for knowledge-free weak supervision ... Mar 10, 2025.
- [12] A survey on semi-supervised learning. Machine Learning, Nov 15, 2019.
- [13] Introduction to Semi-Supervised Learning (Chapter 1). MIT Press Direct.
- [14]
- [15] arXiv:1910.13188 [cs.LG], Oct 29, 2019.
- [16]
- [17] Semi-Supervised Classification by Low Density Separation.
- [18] Semi-Supervised Learning, Explained with Examples. AltexSoft, Mar 29, 2024.
- [19] A Cluster-then-label Semi-supervised Learning Approach for Pathology Image Classification. Scientific Reports (Nature), May 8, 2018.
- [20] A survey on semi-supervised learning. ResearchGate, Nov 15, 2019.
- [21] Label Propagation with Weak Supervision. arXiv:2210.03594, Oct 7, 2022.
- [22] Laplacian Eigenmaps for Dimensionality Reduction and Data Representation. Neural Computation, Jun 1, 2003.
- [23] Data Programming: Creating Large Training Sets, Quickly. arXiv, May 25, 2016.
- [24] Transductive Inference for Text Classification using Support Vector Machines.
- [25] Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples.
- [26]
- [27] Self-training with Noisy Student improves ImageNet classification. Nov 11, 2019.
- [28] Snorkel: Rapid Training Data Creation with Weak Supervision. PMC.
- [29] Essential Guide to Weak Supervision. Snorkel AI.
- [30] Language Models in the Loop: Incorporating Prompting into Weak Supervision.
- [31] Probability of error of some adaptive pattern-recognition machines.
- [32] Combining Labeled and Unlabeled Data with Co-Training.
- [33] Unsupervised Word Sense Disambiguation Rivaling Supervised Methods.
- [34] DivideMix: Learning with Noisy Labels as Semi-supervised Learning.
- [35] Leveraging large language models for knowledge-free weak supervision ... ResearchGate.
- [36]
- [37] Weak Supervision: A Survey on Predictive Maintenance. May 11, 2025.
- [38] Weakly-Supervised Salient Object Detection via Scribble Annotations.
- [39] Unbiased Implicit Feedback via Bi-level Optimization. arXiv:2206.00147, May 31, 2022.
- [40] Statistical learning and language acquisition. PubMed Central.
- [41] Children's coding of human action: cognitive factors influencing ...
- [42] The free-energy principle: a unified brain theory? Nature Reviews Neuroscience, Jan 13, 2010.
- [43] Humans monitor learning progress in curiosity-driven exploration. Oct 13, 2021.
- [44] Weakly-Supervised Reinforcement Learning for Controllable Behavior.