neural network

Idea

Neural networks are a class of functions used in both supervised and unsupervised machine learning to approximate a correspondence between samples in a dataset and their associated labels.

Definition

Let $K \subset \mathbb{R}^d$ be compact, let $\{T_\ell\}_{1 \leq \ell \leq L}$ be a finite family of affine maps with $T_\ell(x) = \langle W_\ell, x \rangle + b_\ell$, where $W_\ell$ is the $\ell$-th layer weight matrix and $b_\ell$ the $\ell$-th layer bias, and let $g \colon \mathbb{R} \to \mathbb{R}$ be a non-linear activation function. A neural network is a function $f \colon K \to \mathbb{R}^m$ which on input $x$ computes the composition

$$f(x) = (T_L \circ g \circ T_{L-1} \circ g \circ \cdots \circ T_1)(x)$$

where $g$ is applied component-wise.
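
For concreteness, the following is a minimal sketch of this definition in Python/NumPy; the choice of ReLU for $g$, the layer shapes, and the random parameters are illustrative assumptions, not part of the definition:

```python
import numpy as np

# Minimal sketch of the definition above: a neural network as an
# alternating composite of affine maps T_i(x) = W_i x + b_i and a
# component-wise non-linearity g.

def g(x):
    return np.maximum(x, 0.0)  # ReLU, applied component-wise (illustrative choice)

def neural_network(x, weights, biases):
    """Compute f(x) = (T_L . g . T_{L-1} . g . ... . T_1)(x)."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = g(W @ x + b)                      # inner layers: affine map, then g
    return weights[-1] @ x + biases[-1]       # output layer T_L: affine only

# Example with d = 3 inputs, one hidden layer of width 4, m = 2 outputs.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 3)), rng.normal(size=(2, 4))]
biases = [rng.normal(size=4), rng.normal(size=2)]
print(neural_network(np.array([0.5, -1.0, 2.0]), weights, biases))
```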

Typically, $T_1$ is called the input layer, $T_L$ the output layer, and the layers $T_2, \dots, T_{L-1}$ are the hidden layers. In particular, a real-valued 1-hidden-layer neural network computes

$$f(x) = b' + \sum_{i=1}^n a_i \, g(\langle W_i, x \rangle + b_i)$$

where $a = (a_1, \dots, a_n)$ is the output weight, $b'$ the output bias, $W_i$ the $i$-th row of the hidden weight matrix, and $b_i$ the $i$-th component of the hidden bias. Here, the hidden layer is $n$-dimensional.
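
Written out directly, again as an illustrative Python/NumPy sketch (the hidden width $n$, the choice of $\tanh$ for $g$, and the random parameters are assumptions for the example):

```python
import numpy as np

# The 1-hidden-layer formula f(x) = b' + sum_i a_i g(<W_i, x> + b_i).

def shallow_network(x, a, W, b, b_out, g=np.tanh):
    return b_out + a @ g(W @ x + b)  # a @ ... computes sum_i a_i g(<W_i,x> + b_i)

rng = np.random.default_rng(1)
n, d = 5, 3                          # hidden width n, input dimension d
a = rng.normal(size=n)               # output weight a = (a_1, ..., a_n)
W = rng.normal(size=(n, d))          # hidden weight matrix, rows W_i
b = rng.normal(size=n)               # hidden bias, components b_i
b_out = 0.1                          # output bias b'
print(shallow_network(np.array([1.0, 0.0, -1.0]), a, W, b, b_out))
```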

Relation to differential equations and dynamical systems

A relation between deep neural networks, differential equations, and dynamical systems was proposed in (CMHRBH17, LZLD17, Weinan17).
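
In the simplest version of this relation, a residual block $x \mapsto x + h \, v(x, \theta)$ is read as one forward-Euler step of size $h$ for the ODE $\dot x = v(x, \theta(t))$. A minimal Python/NumPy sketch, with an illustrative vector field $v$ and step size:

```python
import numpy as np

# Residual network as forward-Euler discretization of dx/dt = v(x, theta(t)):
# each residual block computes x_{k+1} = x_k + h * v(x_k, theta_k).

def v(x, W, b):
    return np.tanh(W @ x + b)        # an illustrative parametrized vector field

def residual_network(x, params, h=0.1):
    for W, b in params:
        x = x + h * v(x, W, b)       # one Euler step per residual block
    return x

rng = np.random.default_rng(2)
params = [(rng.normal(size=(3, 3)), rng.normal(size=3)) for _ in range(10)]
print(residual_network(np.array([1.0, 0.0, -1.0]), params))
```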

Victor Lopez-Pastor and Florian Marquardt proposed that certain time-reversible Hamiltonian systems exhibit self-learning behaviour, implementing a physical version of the backpropagation algorithm.

Relation to renormalization group flow

A relation between deep neural networks (DNNs) based on Restricted Boltzmann Machines (RBMs) and renormalization group flow in physics was proposed in (MS14).
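
On the RBM side of this proposed mapping, summing out the visible spins of an RBM with energy $E(v, h) = -a \cdot v - b \cdot h - v^\top W h$ leaves an effective ("renormalized") Hamiltonian on the hidden spins, which (MS14) compares to one variational RG coarse-graining step. A minimal Python/NumPy sketch, with illustrative $\pm 1$ spins and random couplings:

```python
import numpy as np

# Effective Hamiltonian on hidden spins h after marginalizing visible spins v:
#   H_eff(h) = -b.h - sum_j log(2 cosh(a_j + (W h)_j)),
# obtained by summing exp(-E(v, h)) over all v in {-1, +1}^n.

def effective_hamiltonian(h, a, b, W):
    field = a + W @ h
    return -b @ h - np.sum(np.log(2.0 * np.cosh(field)))

rng = np.random.default_rng(3)
n_visible, n_hidden = 6, 3
a = rng.normal(size=n_visible)               # visible biases
b = rng.normal(size=n_hidden)                # hidden biases
W = rng.normal(size=(n_visible, n_hidden))   # visible-hidden couplings
h = rng.choice([-1.0, 1.0], size=n_hidden)   # a coarse-grained spin configuration
print(effective_hamiltonian(h, a, b, W))
```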

References

General

Textbook account:

  • Daniel A. Roberts, Sho Yaida, Boris Hanin, The Principles of Deep Learning Theory, Cambridge University Press 2022 (arXiv:2106.10165)

On the learning algorithm as analogous to differential equations and dynamical systems:

  • Bo Chang, Lili Meng, Eldad Haber, Lars Ruthotto, David Begert, Elliot Holtham, Reversible Architectures for Arbitrarily Deep Residual Neural Networks, (arXiv:1709.03698)

  • Yiping Lu, Aoxiao Zhong, Quanzheng Li, Bin Dong, Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations, (arXiv:1710.10121)

  • Weinan E, A Proposal on Machine Learning via Dynamical Systems, Communications in Mathematics and Statistics 5, 1–11 (2017) (doi:10.1007/s40304-017-0103-z)

  • Victor Lopez-Pastor, Florian Marquardt, Self-learning Machines based on Hamiltonian Echo Backpropagation, (arXiv:2103.04992)

On the learning algorithm as analogous to the AdS/CFT correspondence:

  • Yi-Zhuang You, Zhao Yang, Xiao-Liang Qi, Machine Learning Spatial Geometry from Entanglement Features, Phys. Rev. B 97, 045153 (2018) (arXiv:1709.01223)

  • W. C. Gan and F. W. Shu, Holography as deep learning, Int. J. Mod. Phys. D 26, no. 12, 1743020 (2017) (arXiv:1705.05750)

  • J. W. Lee, Quantum fields as deep learning (arXiv:1708.07408)

  • Koji Hashimoto, Sotaro Sugishita, Akinori Tanaka, Akio Tomiya, Deep Learning and AdS/CFT, Phys. Rev. D 98, 046019 (2018) (arXiv:1802.08313)

Quantum neural networks (in quantum computation for quantum machine learning):

  • Iris Cong, Soonwon Choi, Mikhail D. Lukin, Quantum convolutional neural networks, Nature Physics 15, 1273–1278 (2019) (doi:10.1038/s41567-019-0648-8)

  • Andrea Mari, Thomas R. Bromley, Josh Izaac, Maria Schuld, Nathan Killoran, Transfer learning in hybrid classical-quantum neural networks, Quantum 4, 340 (2020) (arXiv:1912.08278)

  • Stefano Mangini, Francesco Tacchino, Dario Gerace, Daniele Bajoni, Chiara Macchiavello, Quantum computing models for artificial neural networks, EPL (Europhysics Letters) 134(1), 10002 (2021) (arXiv:2102.03879)

Topological deep learning for data supported on topological domains such as simplicial complexes, cell complexes, and hypergraphs:

  • Ephy R. Love, Benjamin Filippenko, Vasileios Maroulas, Gunnar Carlsson, Topological Deep Learning (arXiv:2101.05778)

  • Mathilde Papillon, Sophia Sanborn, Mustafa Hajij, Nina Miolane, Architectures of Topological Deep Learning: A Survey on Topological Neural Networks (arXiv:2304.10031)

  • Mustafa Hajij et al., Topological Deep Learning: Going Beyond Graph Data (pdf)

Relation to tensor networks

Application of tensor networks and specifically tree tensor networks:

  • Ding Liu, Shi-Ju Ran, Peter Wittek, Cheng Peng, Raul Blázquez García, Gang Su, Maciej Lewenstein, Machine Learning by Unitary Tensor Network of Hierarchical Tree Structure, New Journal of Physics, 21, 073059 (2019) (arXiv:1710.04833)

  • Song Cheng, Lei Wang, Tao Xiang, Pan Zhang, Tree Tensor Networks for Generative Modeling, Phys. Rev. B 99, 155131 (2019) (arXiv:1901.02217)

Relation to renormalization group flow

On the relation of deep learning to renormalization group flow:

  • Pankaj Mehta, David J. Schwab, An exact mapping between the Variational Renormalization Group and Deep Learning (2014) (arXiv:1410.3831)

Further discussion, in view of the relation of renormalization group flow to bulk-flow in the context of the AdS/CFT correspondence:

  • Yi-Zhuang You, Zhao Yang, Xiao-Liang Qi, Machine Learning Spatial Geometry from Entanglement Features, Phys. Rev. B 97, 045153 (2018) (arXiv:1709.01223)

  • W. C. Gan and F. W. Shu, Holography as deep learning, Int. J. Mod. Phys. D 26, no. 12, 1743020 (2017) (arXiv:1705.05750)

  • J. W. Lee, Quantum fields as deep learning (arXiv:1708.07408)

  • Koji Hashimoto, Sotaro Sugishita, Akinori Tanaka, Akio Tomiya, Deep Learning and AdS/CFT, Phys. Rev. D 98, 046019 (2018) (arXiv:1802.08313)
