complexity theory




Computational complexity theory is the study of how much of which resources (such as time and space) is required to perform computations. Unlike other systems theories, computational complexity theory is purely mathematical and focuses on the analysis of formal logic and abstract machines.


Historically, computational complexity theory arose from questions in computability theory and cryptography. In computability theory, one wanted to know what is uncomputable relative to certain models of computation. In cryptography, one wanted to know which ciphers can be broken with a given amount of computational resources.

Current Directions

Today, the main focus of computational complexity theory is the analysis of the complexity classes P and NP. Their relationship determines which of Impagliazzo's five worlds is physical reality.
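The distinction between P and NP can be illustrated by the defining property of NP: a yes-instance comes with a certificate that can be *checked* in polynomial time, even when no polynomial-time procedure for *finding* such a certificate is known. The following sketch (illustrative; the function name and instance are not from this page) shows a polynomial-time verifier for the NP-complete Subset Sum problem:

```python
def verify_subset_sum(numbers, target, certificate):
    """Accept iff `certificate` is a list of distinct, in-range indices
    into `numbers` whose entries sum to `target`.  This check runs in
    time linear in the input, although no polynomial-time search for a
    certificate is known unless P = NP."""
    if len(set(certificate)) != len(certificate):
        return False  # indices must be distinct
    if not all(0 <= i < len(numbers) for i in certificate):
        return False  # indices must be in range
    return sum(numbers[i] for i in certificate) == target

# A valid certificate is accepted, an invalid one rejected:
print(verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [2, 4]))  # 4 + 5 = 9  -> True
print(verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [0, 1]))  # 3 + 34 != 9 -> False
```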

Additionally, descriptive complexity theory is an active research programme which characterizes complexity classes via formal logic rather than abstract machines.
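A paradigmatic result in this direction is Fagin's theorem, which characterizes NP with no reference to machines or resource bounds:

$$ \mathrm{NP} \;=\; \exists \mathrm{SO}, $$

the class of properties of finite structures definable in existential second-order logic. For example, 3-colorability of a graph with edge relation $E$ is expressed by existentially quantifying over three color predicates:

$$ \exists R\, \exists G\, \exists B\;\; \forall x\, \big( R(x) \vee G(x) \vee B(x) \big) \;\wedge\; \forall x\, \forall y\, \Big( E(x,y) \Rightarrow \neg\big(R(x) \wedge R(y)\big) \wedge \neg\big(G(x) \wedge G(y)\big) \wedge \neg\big(B(x) \wedge B(y)\big) \Big) . $$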


Textbook accounts:

  • Christos Papadimitriou (1994), Computational Complexity (1st ed.), Addison Wesley.

  • Michael Sipser, Part III of: Introduction to the Theory of Computation, 3rd ed., Cengage Learning (2012) [ISBN:978-1-133-18779-0]


Discussion of complexity classes via linear logic:

  • Pierre Boudes, Damiano Mazza, Lorenzo Tortora de Falco, An Abstract Approach to Stratification in Linear Logic (arXiv:1206.6504)


Last revised on November 27, 2022 at 17:36:53. See the history of this page for a list of all contributions to it.