Spahn information theory


Overview

There are many theories explicating specific features of the naive or intuitive notion of information.

These theories are rooted in a range of academic disciplines: philosophy, mathematics (type theory, measure theory / probability theory / statistics), computer science, physics, communication science, psychology / sociology, economics, semiotics, cybernetics, etc.

Information theories in methodological proximity to mathematics, and related notions, include:

  • statistical information theory

  • semantic information theory

  • algorithmic information theory

  • constructive-type-theoretical information theory

  • analytic principle

  • coding theory

  • computational complexity theory

  • infon

  • meaning

  • knowledge

  • structure
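To make the first entry concrete: statistical information theory in Shannon's sense quantifies information as entropy, the expected number of bits needed to encode a symbol drawn from a probability distribution. A minimal sketch (the function name and the use of a string's empirical symbol frequencies are illustrative choices, not from the source):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical
    symbol distribution of `message`."""
    counts = Counter(message)
    n = len(message)
    # H = -sum_i p_i * log2(p_i), summed over the observed symbols
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# Four equiprobable symbols carry log2(4) = 2 bits per symbol:
shannon_entropy("abcd")  # → 2.0
# A constant message carries no information: shannon_entropy("aaaa") is 0.
```

The endpoints illustrate the intuition the overview gestures at: information measures reduction of uncertainty, so a perfectly predictable source has zero entropy.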

Contributors to information theory

  • Statistical information theory: Claude Shannon, Warren Weaver, Ronald Fisher

  • Semantic information theory: Yehoshua Bar-Hillel, Rudolf Carnap

  • Algorithmic information theory: Ray Solomonoff, Andrey Kolmogorov, Gregory Chaitin

  • Constructive-type-theoretical information theory: Giuseppe Primiero, Tijn Borghuis, Fairouz Kamareddine, Rob Nederpelt

  • Miscellaneous: Fred Dretske, Keith Devlin, Jon Barwise, Jeremy Seligman, Jaakko Hintikka
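By contrast, the algorithmic information theory of Solomonoff, Kolmogorov, and Chaitin measures the information in an individual string by the length of its shortest description. Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a crude computable upper bound; the following sketch uses zlib purely as such a stand-in (this illustrative device is not from the source):

```python
import os
import zlib

def compressed_length(data: bytes) -> int:
    """Length in bytes of the zlib-compressed form of `data`:
    a computable upper-bound proxy for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

# A highly regular string has a short description...
regular = b"ab" * 500
# ...while random bytes are essentially incompressible.
random_data = os.urandom(1000)

# Both inputs are 1000 bytes, but the regular one compresses far better:
compressed_length(regular) < compressed_length(random_data)
```

The contrast mirrors the Shannon picture from a different angle: regularity means a short program (or code) suffices, while randomness forces a description about as long as the data itself.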


Revision on August 4, 2012 at 18:56:01 by Stephan Alexander Spahn.