References
- [1] A Survey on Self-supervised Learning: Algorithms, Applications, and Future Trends. arXiv, Jan 13, 2023.
- [2] A Cookbook of Self-Supervised Learning. arXiv:2304.12210.
- [3] A Simple Framework for Contrastive Learning of Visual Representations (SimCLR). arXiv, Feb 13, 2020.
- [4]
- [5] Survey on Self-Supervised Learning: Auxiliary Pretext Tasks and Contrastive Learning Methods in Imaging. MDPI.
- [6] Modular Learning in Neural Networks. AAAI, 1987.
- [7] Diederik P. Kingma and Max Welling. Auto-Encoding Variational Bayes. arXiv:1312.6114, Dec 20, 2013.
- [8] Extracting and Composing Robust Features with Denoising Autoencoders. ICML, 2008.
- [9] Bruno A. Olshausen and David J. Field. Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature, Jun 13, 1996.
- [10] Representation Learning with Contrastive Predictive Coding. arXiv, Jul 10, 2018.
- [11] Momentum Contrast for Unsupervised Visual Representation Learning (MoCo). arXiv.
- [12] Text Transformations in Contrastive Self-Supervised Learning. IJCAI.
- [13] Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning (BYOL). arXiv, Jun 13, 2020.
- [14] Exploring Simple Siamese Representation Learning (SimSiam). arXiv, Nov 20, 2020.
- [15] Unsupervised Learning of Visual Features by Contrasting Cluster Assignments (SwAV). arXiv, Jun 17, 2020.
- [16] Emerging Properties in Self-Supervised Vision Transformers (DINO). arXiv, Apr 29, 2021.
- [17] Self-supervised learning for medical image classification. Nature, Apr 26, 2023.
- [18] Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models.
- [19] How is self-supervised learning different from unsupervised learning? Online Q&A.
- [20] Augmentations vs Algorithms: What Works in Self-Supervised Learning. arXiv, Mar 8, 2024.
- [21] A Survey of Self-Supervised Learning from Multiple Perspectives. Jan 13, 2023.
- [22] What is the difference between self-supervised and unsupervised learning? Online Q&A, May 7, 2023.
- [23] Unsupervised Learning Evaluation Metrics Explained. Insight7.
- [24] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805, Oct 11, 2018.
- [25] XLNet: Generalized Autoregressive Pretraining for Language Understanding. arXiv, Jun 19, 2019.
- [26] RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv, Jul 26, 2019.
- [27] ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators. arXiv:2003.10555, Mar 23, 2020.
- [28] Language Models are Few-Shot Learners (GPT-3). arXiv:2005.14165, May 28, 2020.
- [29] How multilingual is Multilingual BERT? arXiv:1906.01502, Jun 4, 2019.
- [30] SSP: Self-Supervised Prompting for Cross-Lingual Transfer to Low-Resource Languages. Jun 27, 2024.
- [31] Understanding Collapse in Non-Contrastive Siamese Representation Learning. arXiv:2209.15007, Sep 29, 2022.
- [32]
- [33] Feature Normalization Prevents Collapse of Noncontrastive Learning Dynamics. Aug 14, 2025.
- [34] Towards Better Understanding of Domain Shift on Linear-Probed ...
- [35] A study on the distribution of social biases in self-supervised ...
- [36] On the Out-of-Distribution Generalization of Self-Supervised Learning. arXiv, May 22, 2025.
- [37]
- [38] Self-Supervised Learning: A Comprehensive Survey of Methods ... Aug 28, 2025.
- [39]
- [40]
- [41]
- [42] The Rise of Self-Supervised Learning in Autonomous Systems. MDPI, Aug 16, 2024.
- [43] A survey on self-supervised methods for visual representation learning. Mar 4, 2025.