function machine


Idea

A function machine is a generalization of neural networks to potentially infinite-dimensional layers, motivated by the study of universal approximation of operators and functionals on abstract Banach spaces.

Definition

A function machine is defined by the types of layers it is composed of.

Definition

Let $K \subset \mathbb{R}^d$ and $K' \subset \mathbb{R}^{d'}$ both be compact, and let $T$ be an affine map.

  • (Operator layer) When $T: \mathcal{L}^1(K, \mu) \to \mathcal{L}^1(K', \mu)$, $T = T^\mathfrak{o}$ is said to be an operator layer if there is a continuous family of measures $(W_t \ll \mu)$ indexed by $t \in K'$ and a function $b \in \mathcal{L}^1(K', \mu)$ such that:

    $$T^\mathfrak{o}: f \mapsto \left(t \mapsto \int_K f \, dW_t + b(t)\right)$$
  • (Functional layer) When $T: \mathcal{L}^1(K, \mu) \to \mathbb{R}^{d'}$, $T = T^\mathfrak{f}$ is said to be a functional layer if there is an $\mathbb{R}^{d'}$-valued measure $W \ll \mu$ and a vector $b \in \mathbb{R}^{d'}$ such that:

    $$T^\mathfrak{f}: f \mapsto \int_K f \, dW + b$$
  • (Basis layer) When $T: \mathbb{R}^d \to \mathcal{L}^1(K', \mu)$, $T = T^\mathfrak{b}$ is said to be a basis layer if there is a function $w: K' \to \mathbb{R}^d$ and $b \in \mathcal{L}^1(K', \mu)$ such that:

    $$T^\mathfrak{b}: y \mapsto \left(t \mapsto \sum_{i=1}^d y_i \, [w(t)]_i + b(t)\right)$$
  • (Fully-connected layer) When the inputs and outputs of a layer are finite dimensional, we recover the standard fully-connected layer, denoted $T^\mathfrak{n}$. (A discretized sketch of all four layer types follows this list.)
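
The following is a minimal numerical sketch of the four layer types, not code from the references: each measure is assumed to be represented by the values of its density on a uniform grid, so that the integrals above become quadrature sums, and all function and variable names are illustrative.

```python
import numpy as np

def operator_layer(f, W, b, dx):
    """T^o: f is (n,) samples of the input function on a grid over K;
    W is (m, n), row t holding the density of W_t on that grid;
    b is (m,) samples of the bias over K'.
    Returns (m,) samples of t -> int_K f dW_t + b(t)."""
    return W @ f * dx + b

def functional_layer(f, W, b, dx):
    """T^f: W is (d', n), the density of the R^{d'}-valued measure W;
    b is a vector in R^{d'}. Returns a point of R^{d'}."""
    return W @ f * dx + b

def basis_layer(y, w, b):
    """T^b: y is a vector in R^d; w is (m, d) samples of w: K' -> R^d;
    b is (m,) bias samples.
    Returns (m,) samples of t -> sum_i y_i [w(t)]_i + b(t) over K'."""
    return w @ y + b

def fully_connected_layer(y, W, b):
    """T^n: the usual affine layer between finite-dimensional spaces."""
    return W @ y + b
```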

One can now, for instance, define a one-hidden-layer operator network to be the composition of two operator layers:

$$F[f] = (T^\mathfrak{o} \circ g \circ T^\mathfrak{o})[f] = s \mapsto b'(s) + \int_{K''} g\left(\int_K f \, dW_t + b(t)\right) dW_s$$

where $(W_t \ll \mu)_{t \in K''}$ and $(W_s \ll \mu)_{s \in K'}$ are families of measures absolutely continuous with respect to the Lebesgue measure $\mu$, $b$ and $b'$ are measurable bias functions on the compact domains $K''$ and $K'$ respectively, and $g$ is an activation function applied pointwise.
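
As a concrete illustration, here is a sketch of this composite under the same grid-discretization assumption, with $g = \tanh$ and random densities standing in for the measure families (both choices are assumptions made for illustration):

```python
import numpy as np

n, m, p = 64, 32, 16            # grid sizes for K, K'' and K'
dx, dt = 1.0 / n, 1.0 / m       # quadrature weights on K and K''
rng = np.random.default_rng(0)

# densities of the measure families and samples of the bias functions
W1, b1 = rng.normal(size=(m, n)), rng.normal(size=m)  # L^1(K) -> L^1(K'')
W2, b2 = rng.normal(size=(p, m)), rng.normal(size=p)  # L^1(K'') -> L^1(K')

def F(f):
    # f: (n,) samples of the input function over K
    hidden = np.tanh(W1 @ f * dx + b1)   # g(int_K f dW_t + b(t)), t in K''
    return W2 @ hidden * dt + b2         # b'(s) + int_{K''} g(...) dW_s, s in K'

f = np.sin(2 * np.pi * np.linspace(0.0, 1.0, n))  # an example input function
print(F(f).shape)                                  # (16,): samples of F[f] over K'
```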

References

  • William Guss, Deep Function Machines: Generalized Neural Networks for Topological Layer Expression (arXiv:1612.04799)

  • William Guss, Ruslan Salakhutdinov, On Universal Approximation by Neural Networks with Uniform Guarantees on Approximation of Infinite Dimensional Maps (arXiv:1910.01545)
