machine learning


Idea

Machine learning is a branch of computer science which devises algorithms that learn from data in order to perform tasks without being explicitly programmed to do so. Notable approaches include neural networks, support vector machines, and Bayesian networks.
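
As a minimal illustration of learning a task from data rather than programming it explicitly, the following Python/NumPy sketch trains a small feed-forward neural network by gradient descent to reproduce the XOR function; the architecture, learning rate and iteration count are arbitrary illustrative choices, not drawn from the references below.

    import numpy as np

    rng = np.random.default_rng(0)

    # training data: inputs and targets of the XOR function
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # randomly initialised weights of a 2-4-1 feed-forward network
    W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 1.0                                  # learning rate (illustrative)
    for step in range(10000):
        h = sigmoid(X @ W1 + b1)              # hidden layer
        p = sigmoid(h @ W2 + b2)              # network output
        # gradients of the squared-error loss, by backpropagation
        dp = (p - y) * p * (1 - p)
        dh = (dp @ W2.T) * h * (1 - h)
        # gradient-descent parameter updates
        W2 -= lr * h.T @ dp; b2 -= lr * dp.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ dh; b1 -= lr * dh.sum(axis=0, keepdims=True)

    print(np.round(p, 2))   # should be close to [[0], [1], [1], [0]]

The point is only that the input–output behaviour of the trained network is determined by the training data via the optimization, not by hand-written rules.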

References

Classical machine learning

Textbook accounts:

See also:

Expository review:

On transformers and large language models (LLM):

  • Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin, Attention is all you need, in: Advances in Neural Information Processing Systems 30 (NIPS 2017) (pdf)

  • Michael R. Douglas, Large language models, arXiv:2307.05782

With regard to kernel methods:

Proposal for applications of category theory to machine learning:

The following framework claims to relate to homotopy type theory:

On a definition of artificial general intelligence:

  • Shane Legg, Marcus Hutter, Universal intelligence: a definition of machine intelligence, Minds & Machines 17 (2007) 391–444 (doi)

  • Marcus Hutter, Universal artificial intelligence: sequential decisions based on algorithmic probability, Springer (2005); book presentation (pdf)

  • Shane Legg, Machine super intelligence, PhD thesis (2008) (pdf)

Quantum machine learning

In view of quantum computation (i.e. quantum machine learning):

  • Diego Ristè, Marcus P. da Silva, Colm A. Ryan, Andrew W. Cross, Antonio D. Córcoles, John A. Smolin, Jay M. Gambetta, Jerry M. Chow & Blake R. Johnson,

    Demonstration of quantum advantage in machine learning, npj Quantum Information 3 16 (2017) (doi:10.1038/s41534-017-0017-3)

  • Jacob Biamonte, Peter Wittek, Nicola Pancotti, Patrick Rebentrost, Nathan Wiebe, Seth Lloyd, Quantum Machine Learning, Nature 549 (2017) 195–202 (doi:10.1038/nature23474)

    EurekAlert!, Quantum Machine Learning, 14 Sep 2017

  • Iris Cong, Soonwon Choi, Mikhail D. Lukin, Quantum convolutional neural networks, Nature Physics 15 (2019) 1273–1278 (doi:10.1038/s41567-019-0648-8)

    (on quantum neural networks)

  • Yunchao Liu, Srinivasan Arunachalam, Kristan Temme, A rigorous and robust quantum speed-up in supervised machine learning (arXiv:2010.02174)

  • Melanie Swan, Renato P dos Santos, Frank Witte, Between Science and Economics, Volume 2: Quantum Computing Physics, Blockchains, and Deep Learning Smart Networks, World Scientific 2020 (doi:10.1142/q0243)

  • Stefano Mangini, Francesco Tacchino, Dario Gerace, Daniele Bajoni, Chiara Macchiavello, Quantum computing models for artificial neural networks, EPL (Europhysics Letters) 134(1), 10002 (2021) (arXiv:2102.03879)

Review:

and with emphasis on classically controlled NISQ computers:

Critical discussion:

  • Pablo Bermejo, Paolo Braccia, Manuel S. Rudolph, Zoë Holmes, Lukasz Cincio, M. Cerezo: Quantum Convolutional Neural Networks are (Effectively) Classically Simulable [arXiv:2408.12739]

    “We present this as a challenge to the community and argue that until the viability of QCNNs for [not classically simulable] datasets can be demonstrated there is no reason to believe QCNNs will be useful. […] Hence we boldly claim: There is currently no evidence that QCNNs will work on classically non-trivial tasks, and their place in the upper echelon of promising QML architectures should be seriously revised.”

Applications

There are, or will be, innumerable applications; here are some:

To mathematical structures in algebraic geometry, representation theory, number theory and combinatorics:

reviewed in:

To the conformal bootstrap:

(…)
