The Coleman–Mandula theorem is a no-go theorem constraining the possible Lie group symmetries of quantum field theory in Minkowski spacetime.
Any Lie group containing the Poincaré group $P$ (in 4 dimensions) as a subgroup together with a maximal internal symmetry group $G$ must be the direct product $P \times G$. In addition, $G$ must be a semisimple Lie group, possibly with additional $U(1)$ (circle group) factors.
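At the level of Lie algebras, this conclusion may be summarized as follows (a standard formulation of the theorem's statement, not verbatim from this page; here $\mathfrak{iso}(3,1)$ denotes the Poincaré Lie algebra and $\mathfrak{g}_{\mathrm{int}}$ the internal symmetry algebra):

```latex
% Coleman–Mandula: the full symmetry algebra of the S-matrix
% decomposes as a direct sum, with spacetime and internal parts commuting
\mathfrak{g}_{\mathrm{sym}}
  \;\cong\;
  \mathfrak{iso}(3,1) \,\oplus\, \mathfrak{g}_{\mathrm{int}}
  \,,
\qquad
  [\,\mathfrak{iso}(3,1),\, \mathfrak{g}_{\mathrm{int}}\,] \;=\; 0
  \,,
```

so that no generator of $\mathfrak{g}_{\mathrm{int}}$ can carry Lorentz indices, and conserved charges beyond $\mathfrak{iso}(3,1)$ act as Lorentz scalars.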
The generalization of this statement to super Lie algebras is known as the Haag–Łopuszański–Sohnius theorem.
Gel’fand and Likhtman showed that with a slight extension of the concept of a Lie algebra, namely to a super Lie algebra, $P$ and $G$ can combine in a nontrivial way. This happens for example in the supersymmetric case.
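The nontrivial combination can be illustrated by the basic anticommutator of the 4d $N = 1$ super Poincaré algebra (in standard two-component spinor conventions, not taken from this page): the odd generators $Q_\alpha$ square onto the translation generators $P_\mu$, so the extension genuinely mixes with spacetime symmetries instead of splitting off as a direct factor.

```latex
% 4d N = 1 super Poincaré algebra: the supercharges Q close
% onto translations, evading the direct-product conclusion
\{ Q_\alpha,\, \bar{Q}_{\dot{\beta}} \}
  \;=\;
  2\, \sigma^\mu_{\alpha \dot{\beta}}\, P_\mu
  \,,
\qquad
\{ Q_\alpha,\, Q_\beta \} \;=\; 0
  \,.
```

This is precisely the structure admitted by the Haag–Łopuszański–Sohnius theorem as the possible super Lie algebra extension of the Coleman–Mandula setup.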
Sidney Coleman, Jeffrey Ellis Mandula, All Possible Symmetries of the S Matrix, Physical Review 159 (5): 1251–1256 (1967) (doi:10.1103/PhysRev.159.1251)
I. M. Gel'fand, E. S. Likhtman, JETP Letters 13, 323 (1971)
Last revised on May 11, 2022 at 07:35:41. See the history of this page for a list of all contributions to it.