
Max Welling

Max Welling is a Dutch computer scientist and prominent figure in machine learning, renowned for his foundational contributions to probabilistic deep learning, including the co-development of variational autoencoders (VAEs) that have revolutionized generative modeling in artificial intelligence. He holds a research chair in machine learning as a full professor at the University of Amsterdam's Informatics Institute, where he directs the Amsterdam Machine Learning Lab (AMLab), and is co-founder and Chief AI Officer of CuspAI, an AI startup focused on materials design. Welling's academic journey began with a PhD in theoretical high-energy physics from Utrecht University in 1998, supervised by Nobel laureate Gerard 't Hooft, focusing on gravity in 2+1 dimensions. Following his doctorate, he pursued postdoctoral research at the California Institute of Technology (1998–2000), at University College London under Geoffrey Hinton (2000–2001), and at the University of Toronto (2001–2003), transitioning from physics to machine learning. He joined the University of Amsterdam in 2013, having previously held a professorship at the University of California, Irvine, and later served as Vice President of Technologies at Qualcomm AI Research.

Welling is also a fellow of the Canadian Institute for Advanced Research (CIFAR), a founding board member of the European Laboratory for Learning and Intelligent Systems (ELLIS), a member of the Royal Netherlands Academy of Arts and Sciences (since 2025), and co-founder of Scyfer B.V., a University of Amsterdam spin-off focused on deep learning applications. His research emphasizes Bayesian methods, deep generative models, and the mathematical foundations of deep learning, with over 190,000 citations and an h-index of 117 as of 2025. Notable works include the seminal paper on VAEs, which introduced efficient variational inference for latent variable models and earned the inaugural ICLR Test of Time Award in 2024, as well as advancements in semi-supervised learning with deep generative models. Welling has received prestigious honors such as the ECCV Koenderink Prize in 2010 for contributions to computer vision and the ICML Test of Time Award in 2021, reflecting the enduring impact of his probabilistic approaches to neural networks and AI scalability.

Education

Doctoral studies

Max Welling was born in October 1968 in the Netherlands. He pursued his early academic studies in physics at Utrecht University in the Netherlands, building a strong foundation in theoretical principles that informed his later research. Welling completed his PhD in theoretical high energy physics at Utrecht University in 1998, under the supervision of Gerard 't Hooft, who was awarded the Nobel Prize in Physics in 1999 for elucidating the quantum structure of electroweak interactions. His doctoral thesis, titled Classical and Quantum Gravity in 2+1 Dimensions and defended on January 19, 1998, centered on gravitational models in lower-dimensional spacetimes. These investigations emphasized mathematical frameworks for describing classical and quantum gravitational dynamics. During his graduate studies, Welling grew interested in computational techniques for simulating physical phenomena, which motivated his eventual shift toward applying such methods beyond traditional physics. The rigorous mathematical supervision under 't Hooft fostered analytical approaches that later shaped his probabilistic modeling in machine learning.

Postdoctoral research

Following his PhD in theoretical physics from Utrecht University in 1998, Max Welling held a postdoctoral fellowship at the California Institute of Technology (Caltech) from 1998 to 2000 as a Postdoctoral Scholar, where he worked on computer vision under the supervision of Pietro Perona. This position allowed him to apply numerical and statistical methods from his physics training to complex modeling problems, laying the groundwork for his later transition to machine learning. Welling then conducted postdoctoral research under Geoffrey E. Hinton, first at University College London from 2000 to 2001, and subsequently at the University of Toronto from 2001 to 2003, marking his entry into machine learning through work on energy-based models. During this period, he collaborated closely with Hinton on probabilistic graphical models, leveraging concepts from statistical physics to address inference challenges in undirected models like Boltzmann machines. His physics background provided key analytical tools for developing Markov chain Monte Carlo (MCMC) methods, bridging the simulation of physical systems and probabilistic modeling in AI.

Welling's initial publications from the Toronto period advanced Boltzmann machines and MCMC techniques, including algorithms for efficient sampling in high-dimensional spaces. Notable contributions include the 2002 paper introducing a contrastive divergence learning algorithm for mean-field Boltzmann machines, which improved training efficiency by approximating gradients without full MCMC sampling. He also co-authored work on approximate inference in Boltzmann machines, using mean-field and Bethe approximations to handle the NP-hard problem of exact inference. Additionally, in collaboration with Hinton and others, he developed energy-based models for sparse overcomplete representations, demonstrating their application to unsupervised feature learning in high-dimensional data like natural images. These efforts established foundational techniques for generative modeling and sampling that influenced subsequent developments in deep learning.

Academic career

University of California, Irvine

Max Welling was appointed as an Assistant Professor in the Department of Computer Science at the University of California, Irvine (UCI), in 2003, and was promoted to Associate Professor in 2008. During his tenure, Welling helped build a vibrant machine learning research community at UCI, including contributions to the Center for Machine Learning and Intelligent Systems, where he focused on advancing probabilistic approaches in the field. He mentored several graduate students, guiding their work on Bayesian and probabilistic methods and related topics in statistical machine learning. Welling secured key funding to support his research, including NSF grants IIS-0447903 and IIS-0535278, which enabled investigations into probabilistic modeling techniques applicable to areas such as computer vision. These grants facilitated collaborations with other UCI faculty and external partners, fostering interdisciplinary work in computational modeling. Additionally, he received a 2006 Multidisciplinary University Research Initiative (MURI) grant, which supported advanced research in probabilistic frameworks for visual recognition and learning. After nine years at UCI, Welling departed to take up a professorship at the University of Amsterdam, drawn by enhanced opportunities for scaling his research within Europe's burgeoning AI ecosystem.

University of Amsterdam

In 2013, Max Welling was appointed as Full Professor and Research Chair in Machine Learning at the University of Amsterdam's Informatics Institute, part of the Faculty of Science. This role marked his transition from the University of California, Irvine, where his prior experience in building machine learning programs informed the establishment of collaborative research environments at the UvA. Upon joining the UvA, Welling founded the Amsterdam Machine Learning Lab (AMLab) in 2013 and has directed it continuously since then, guiding a team focused on interdisciplinary projects that bridge machine learning with applied scientific domains. Under his leadership, AMLab has fostered collaborations with industry partners through co-directed labs, including the Qualcomm-UvA Lab (QUVA) and the Bosch-UvA DELTA Lab, emphasizing practical advancements in deep learning methodologies. Welling has supervised numerous PhD students and postdoctoral researchers at AMLab, prioritizing AI applications in scientific fields like physics and chemistry to drive innovative problem-solving. As of 2025, he remains actively engaged in his professorial duties, including teaching graduate-level courses on probabilistic machine learning, while contributing to European initiatives such as the ELISE network and speaking at events such as the AIS25 Summit.

Research contributions

Probabilistic methods

Max Welling has significantly advanced variational inference methods for approximate Bayesian computation within neural networks, enabling efficient handling of uncertainty in complex models. A key contribution is the development of a scalable framework that leverages the evidence lower bound (ELBO) to optimize the variational posterior approximation. The ELBO is defined as \mathcal{L}(\theta, \phi; x) = \mathbb{E}_{q_\phi(z|x)}[\log p_\theta(x, z) - \log q_\phi(z|x)], which provides a tractable lower bound on the log marginal likelihood \log p_\theta(x), allowing gradient-based optimization even for high-dimensional latent variables in neural architectures. This approach, introduced in collaboration with Diederik Kingma, facilitates approximate inference by reparameterizing the sampling process to make it differentiable, thus scaling to large datasets without relying on expensive Markov chain Monte Carlo (MCMC) simulations.

In the realm of Bayesian neural networks (BNNs), Welling's work focuses on improving uncertainty estimation by specifying appropriate priors on network weights and deriving flexible variational posteriors. For instance, he co-developed multiplicative normalizing flows to enhance the expressiveness of approximate posteriors in BNNs, transforming simple distributions like Gaussians into more complex ones that better capture the true posterior and the associated epistemic uncertainty. These priors, often chosen as scale mixtures or hierarchical structures, allow BNNs to quantify both aleatoric and epistemic uncertainties in predictions, outperforming deterministic networks in tasks with limited data. This has proven particularly useful for reliable decision-making in safety-critical applications, where overconfident predictions are mitigated through posterior sampling.

Welling's early contributions to MCMC methods for energy-based models laid the groundwork for efficient sampling in deep architectures. In joint work with Geoffrey Hinton and others, he explored MCMC techniques to learn sparse representations in overcomplete models, using contrastive divergence as an approximation to full MCMC for tractable training on high-dimensional data. Building on this, adaptations of Hamiltonian Monte Carlo (HMC) were incorporated to explore the energy landscape more effectively, leveraging Hamiltonian dynamics to propose distant samples and reduce autocorrelation in chains for deep energy-based models. These methods improve mixing in the non-convex potentials typical of deep networks.

These probabilistic techniques have enabled scalable Bayesian inference on large datasets, with applications in computer vision, such as modeling image distributions on benchmarks like MNIST, and in physics simulations, where they support uncertainty quantification for complex dynamical systems like particle interactions. For example, stochastic gradient variants of MCMC, developed by Welling and Yee Whye Teh, allow approximate posterior sampling in scenarios with massive data volumes, as seen in simulations of physical processes. These methods also underpin generative models, where variational approximations facilitate learning latent representations.
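To make the variational objective concrete, the sketch below (Python with NumPy) forms a Monte Carlo estimate of the ELBO using the reparameterization z = \mu + \sigma \odot \epsilon with \epsilon \sim \mathcal{N}(0, I). It is an illustrative toy written for this article, not code from the original papers: the standard normal prior, diagonal Gaussian posterior, Gaussian likelihood, and the decode function are all assumptions made for this example.

    import numpy as np

    def log_diag_normal(x, mean, var):
        # Log density of a diagonal Gaussian, summed over dimensions.
        return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (x - mean) ** 2 / var)

    def elbo_estimate(x, enc_mean, enc_logvar, decode, rng, n_samples=1):
        # Monte Carlo estimate of E_q[log p(x, z) - log q(z | x)] using the
        # reparameterization trick, which keeps the estimator differentiable
        # in (enc_mean, enc_logvar) when ported to an autodiff framework.
        enc_var = np.exp(enc_logvar)
        values = []
        for _ in range(n_samples):
            eps = rng.standard_normal(enc_mean.shape)
            z = enc_mean + np.sqrt(enc_var) * eps          # z = mu + sigma * eps
            dec_mean, dec_var = decode(z)                  # parameters of p(x | z)
            log_px_given_z = log_diag_normal(x, dec_mean, dec_var)
            log_pz = log_diag_normal(z, np.zeros_like(z), np.ones_like(z))  # N(0, I) prior
            log_qz = log_diag_normal(z, enc_mean, enc_var)                  # q(z | x)
            values.append(log_px_given_z + log_pz - log_qz)
        return float(np.mean(values))

    # Toy usage with a fixed linear "decoder" and unit observation noise.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((5, 2))
    decode = lambda z: (W @ z, np.ones(5))
    x = rng.standard_normal(5)
    print(elbo_estimate(x, np.zeros(2), np.zeros(2), decode, rng, n_samples=10))

In practice the encoder and decoder are neural networks and this estimate is maximized by stochastic gradient ascent, which is the training loop of the VAE described in the Generative models section below.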
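The stochastic gradient MCMC idea mentioned above can likewise be summarized by a single update rule. The fragment below is a minimal, generic rendering of one stochastic gradient Langevin dynamics step, in which the minibatch gradient is rescaled by N/n and Gaussian noise with variance equal to the step size is injected; grad_log_prior and grad_log_lik are hypothetical callables supplied by the user, and this is a sketch rather than the authors' implementation.

    import numpy as np

    def sgld_step(theta, minibatch, grad_log_prior, grad_log_lik, n_total, step_size, rng):
        # One stochastic gradient Langevin dynamics update:
        # theta <- theta + (step_size / 2) * (grad log p(theta)
        #          + (N / n) * sum_i grad log p(x_i | theta)) + N(0, step_size) noise.
        n_batch = len(minibatch)
        grad = grad_log_prior(theta)
        grad = grad + (n_total / n_batch) * sum(grad_log_lik(theta, x_i) for x_i in minibatch)
        noise = np.sqrt(step_size) * rng.standard_normal(theta.shape)
        return theta + 0.5 * step_size * grad + noise

With a step size that decays toward zero, the iterates transition from stochastic optimization toward approximate posterior samples, which is what makes this family of methods attractive in the massive-data settings described above.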

Generative models

Max Welling has made foundational contributions to generative modeling, particularly through the development of variational autoencoders (VAEs) and extensions to other probabilistic architectures. In collaboration with Diederik P. Kingma, he co-authored the seminal paper "Auto-Encoding Variational Bayes," which introduced VAEs as a class of probabilistic models that combine deep neural networks with variational inference to learn latent representations of data. The VAE architecture consists of an encoder that maps input data to a latent distribution, typically a Gaussian parameterized by mean \mu and variance \sigma^2, and a decoder that reconstructs the data from samples in this latent space, enabling efficient generation of new samples. A key innovation was the reparameterization trick, which allows backpropagation through stochastic layers by expressing latent variables as z = \mu + \sigma \odot \epsilon, where \epsilon \sim \mathcal{N}(0, I) and \odot denotes element-wise multiplication, facilitating end-to-end training via stochastic gradient descent. This approach relies on variational inference as the optimization backbone, approximating the intractable posterior to maximize a lower bound on the likelihood.

Building on VAEs, Welling extended generative models to semi-supervised learning settings. In the 2014 NeurIPS paper "Semi-supervised Learning with Deep Generative Models," co-authored with Kingma, Danilo Rezende, and Shakir Mohamed, he proposed deep generative models that integrate labeled and unlabeled data to improve performance when labels are scarce. These models use a joint generative-discriminative framework in which the generative component learns the underlying data distribution while the discriminative part infers class labels, achieving state-of-the-art results on datasets like MNIST by leveraging unlabeled examples to refine latent representations. This work demonstrated how generative priors can enhance generalization in low-data regimes, with empirical improvements in accuracy over purely supervised baselines.

Earlier in his career, Welling advanced unsupervised feature learning using restricted Boltzmann machines (RBMs), stochastic neural networks that model data as joint distributions over visible and hidden units. In the 2003 NeurIPS paper "Wormholes Improve Contrastive Divergence," co-authored with Andriy Mnih and Geoffrey E. Hinton, he introduced techniques to accelerate training of such models by modifying the sampling process with "wormholes" that connect distant states, reducing mixing time in contrastive divergence (CD) learning. CD approximates maximum-likelihood learning by running short Markov chains initialized at the data, and Welling's improvements addressed limitations of standard CD by enhancing exploration of the model distribution, leading to better feature extraction for tasks like image denoising and pretraining deep networks. His contributions to CD-based training, including extensions to mean-field approximations in Boltzmann machines, laid the groundwork for scalable unsupervised learning in early deep learning pipelines.

More recently, Welling has focused on equivariant generative models tailored for geometric data, particularly in molecular design. In the 2022 paper "Equivariant Diffusion for Molecule Generation in 3D," co-authored with Emiel Hoogeboom, Victor Garcia Satorras, and Clément Vignac, he developed an E(3)-equivariant diffusion model that generates 3D molecular structures whose distribution is invariant to rotations and translations, by incorporating a symmetry-aware denoising network in the reverse diffusion process.
This approach outperforms non-equivariant baselines in generating valid 3D molecules, as measured by metrics like validity and novelty on benchmarks such as QM9, enabling applications in molecular design by producing geometrically feasible conformers directly in continuous space. These models extend generative paradigms to handle the symmetries inherent in physical systems, improving efficiency and realism in structure generation. In 2023, he co-introduced BFNs, a new class of generative models that perform amortized inference to sample jointly from data and latent variables, allowing for exact likelihood computation and diverse generation without autoregressive factorization. Presented at ICLR 2024, BFNs address limitations in existing diffusion and autoregressive models by iteratively updating distribution parameters via forward and backward passes, showing promise for structured data like molecules and sequences.
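As a concrete illustration of the contrastive divergence procedure described above, the sketch below performs one generic CD-1 update for a binary restricted Boltzmann machine (Python with NumPy). It is a textbook-style example written for this article, not code from the Welling, Mnih, and Hinton paper, and it omits the wormhole moves that the paper adds to improve mixing.

    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    def cd1_update(v0, W, b_vis, b_hid, lr, rng):
        # One contrastive divergence (CD-1) step for a binary RBM.
        # v0: batch of visible vectors, shape (batch, n_visible)
        # W : weights, shape (n_visible, n_hidden); b_vis, b_hid: bias vectors.

        # Positive phase: hidden units driven by the data.
        p_h0 = sigmoid(v0 @ W + b_hid)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)

        # Negative phase: a single Gibbs step (reconstruction), i.e. the short
        # chain starting from the data that CD uses instead of a full MCMC run.
        p_v1 = sigmoid(h0 @ W.T + b_vis)
        v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
        p_h1 = sigmoid(v1 @ W + b_hid)

        # Approximate gradient: <v h>_data - <v h>_reconstruction.
        batch = v0.shape[0]
        dW = (v0.T @ p_h0 - v1.T @ p_h1) / batch
        db_vis = np.mean(v0 - v1, axis=0)
        db_hid = np.mean(p_h0 - p_h1, axis=0)
        return W + lr * dW, b_vis + lr * db_vis, b_hid + lr * db_hid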

Industry and entrepreneurship

Early industry roles

In 2017, following the acquisition of his startup Scyfer B.V. by Qualcomm Technologies, Inc., Max Welling assumed the role of Vice President of Technologies at Qualcomm Netherlands, where he led AI research initiatives focused on mobile and edge computing applications. This position built on the earlier establishment of the Qualcomm-UvA Deep Vision Lab (QUVA) in 2015, a collaborative effort between Qualcomm and the University of Amsterdam that Welling co-directed to advance deep learning for embedded systems and real-time vision processing on resource-constrained devices. Welling had earlier co-founded Scyfer B.V. as a spin-off from the University of Amsterdam, specializing in deep learning solutions for practical applications such as image recognition and predictive modeling. The company's acquisition by Qualcomm in August 2017 integrated its expertise into Qualcomm's broader AI ecosystem, enabling Welling and his team to contribute to scalable machine learning deployments on mobile platforms.

During his tenure, Welling oversaw the development of probabilistic tools tailored for real-time, on-device inference, including innovations in modeling uncertainty from sensor inputs to enhance reliability in edge environments. A key outcome of this work was the pursuit of patents on techniques for sensor fusion based on probability distributions of sensor data, which addressed challenges in combining noisy inputs from multiple sensors for applications such as autonomous driving. These efforts drew on prototypes informed by research at the Amsterdam Machine Learning Lab (AMLab), adapting academic probabilistic methods to industry-scale hardware constraints. Welling maintained a hybrid role during this period, balancing his Qualcomm responsibilities with his professorship at the University of Amsterdam until 2021, fostering a bridge between academic innovation and commercial deployment.

Recent ventures

From 2021 to 2024, Max Welling served as a Distinguished Scientist at Microsoft Research AI4Science, where he led the Amsterdam laboratory focused on advancing AI-driven scientific discovery across domains such as molecular simulation. Under his leadership, the lab developed machine learning tools to accelerate simulations of physical systems, including equivariant models that generate 3D molecular structures while preserving symmetries essential for accurate physical predictions. These efforts aimed to bridge deep learning with quantum and classical simulations for applications in chemistry and physics.

In 2024, Welling co-founded CuspAI, a startup that leverages generative AI for materials design to expedite the discovery of novel molecules and materials. In September 2025, CuspAI raised $100 million in Series A funding, co-led by NEA and Northzone, to scale its platform for materials innovation addressing sustainability and clean energy challenges. The platform accelerates molecule property analysis by a factor of 10 compared to traditional methods, enabling rapid iteration in designing materials for real-world challenges. At CuspAI, key initiatives include AI systems for designing sustainable materials, such as advanced sorbents for carbon capture and efficient battery components, drawing on generative modeling techniques from his prior academic research to optimize molecular structures. Through these ventures, Welling has contributed to patents on equivariant neural architectures that enhance simulations for molecular and materials applications, including gauge-equivariant convolutional networks applicable to drug-like molecules. Collaborations stemming from Microsoft Research and CuspAI extend to AI for drug discovery, such as structure-based modeling with equivariant diffusion, and to climate-related initiatives, including a partnership with Kemira on AI-optimized chemicals for water treatment and emissions reduction.

Recognition

Awards

Max Welling has received several prestigious awards recognizing his foundational contributions to machine learning and computer vision. In 2010, he was awarded the Koenderink Prize at the European Conference on Computer Vision (ECCV) for his pioneering work applying probabilistic models to computer vision problems, including the development of efficient inference techniques for complex visual data representations. Welling's advancements in scalable Bayesian methods earned him the Best Paper Award at the 29th International Conference on Machine Learning (ICML) in 2012 for the paper "Bayesian Posterior Sampling via Stochastic Gradient Fisher Scoring," co-authored with Sungjin Ahn and Anoop Korattikara, which introduced an efficient algorithm for posterior sampling in high-dimensional spaces using stochastic gradients. This work built on his earlier innovations in Markov chain Monte Carlo (MCMC) techniques, highlighting his impact on variational and sampling-based inference. In 2018, he received the Best Paper Award at the International Conference on Learning Representations (ICLR) for "Spherical CNNs," co-authored with Taco Cohen, which extended convolutional neural networks to operate on spherical data, enabling rotation-equivariant processing for applications like climate modeling and molecular analysis.

His long-term influence on generative modeling was honored with Test of Time Awards from both ICML and ICLR. The ICML 2021 Test of Time Award recognized the 2011 paper "Bayesian Learning via Stochastic Gradient Langevin Dynamics," co-authored with Yee Whye Teh, for introducing stochastic gradient MCMC methods that facilitate scalable Bayesian inference in deep learning, demonstrating its enduring impact. Similarly, the inaugural ICLR 2024 Test of Time Award was given to the 2013 paper "Auto-Encoding Variational Bayes," co-authored with Diederik P. Kingma, for establishing the variational autoencoder (VAE) framework, a cornerstone of generative modeling that integrates probabilistic inference with neural networks and has inspired extensions like semi-supervised learning approaches. In 2025, he received an Honorable Mention for the AI 2000 Most Influential Scholar Award in Machine Learning from AMiner. These awards underscore Welling's role in bridging probabilistic methods with deep learning, a theme he has also developed in invited keynotes at conferences such as NeurIPS and ICML, where he has discussed synergies between physics-inspired models and AI.

Fellowships and memberships

Max Welling has been a fellow of the Canadian Institute for Advanced Research (CIFAR) since 2016 and is affiliated with its Learning in Machines & Brains program, which focuses on advancing the understanding of intelligence through interdisciplinary research in machine learning and neuroscience. In 2025, Welling was elected as a member of the Royal Netherlands Academy of Arts and Sciences (KNAW), recognizing his outstanding contributions to science, particularly in machine learning and artificial intelligence. Welling has served on the board of the NeurIPS Foundation since 2015, contributing to the governance and strategic direction of one of the premier conferences in machine learning. He also held the position of Associate Editor-in-Chief of IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) from 2011 to 2015, overseeing editorial processes for high-impact research in computer vision and machine learning. Additionally, Welling is a fellow of the European Laboratory for Learning and Intelligent Systems (ELLIS) and serves on its founding board, playing a key role in fostering collaborative machine learning research across Europe as part of broader initiatives to strengthen the continent's AI ecosystem.
