There are many theories explicating specific features of the naive or intuitive notion of information.
The academic homes of these theories range over philosophy, mathematics (type theory, measure theory / probability theory / statistics), computer science, physics, the communication sciences, psychology / sociology, economics, semiotics, cybernetics, etc.
Information theories in methodological proximity to mathematics are e.g.
statistical information theory (see the entropy formula after this list)
semantic information theory
algorithmic information theory
constructive-type-theoretical information theory
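As a brief illustration of the statistical notion just listed, the basic quantity of Shannon's theory is the entropy of a finite probability distribution $p = (p_1, \ldots, p_n)$,

$$ H(p) \;=\; - \sum_{i=1}^{n} p_i \, \log_2 p_i \,, $$

measured in bits. For instance, a fair coin, $p = (1/2, 1/2)$, carries $H(p) = 1$ bit of information per toss, while a deterministic outcome, $p = (1, 0)$, carries none.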
Related notions and entries include e.g.
analytic principle
coding theory
computational complexity theory
infon
meaning
knowledge
structure
Statistical information theory: Claude Shannon, Warren Weaver, Ronald Fisher
Semantic information theory: Yehoshua Bar-Hillel, Rudolf Carnap
Algorithmic information theory: Ray Solomonoff, Andrey Kolmogorov, Gregory Chaitin (a compression-based sketch of this notion follows this list)
Constructive-type-theoretical information theory: Giuseppe Primiero, Tijn Borghuis, Fairouz Kamareddine, Rob Nederpelt
Miscellaneous: Fred Dretske, Keith Devlin, Jon Barwise, Jeremy Seligman, Jaakko Hintikka
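To convey the flavour of the algorithmic notion attributed above to Solomonoff, Kolmogorov and Chaitin: the Kolmogorov complexity of a string is the length of its shortest description, and while this quantity is uncomputable, the output length of any lossless compressor gives an upper bound on it. The following minimal Python sketch (the example data are hypothetical, and the choice of zlib is merely for illustration) compares a highly regular string with a random-looking one.

```python
# A rough sketch of the algorithmic notion of information content:
# Kolmogorov complexity (shortest-description length) is uncomputable,
# but any lossless compressor yields an upper bound on it.
import os
import zlib


def compressed_length(data: bytes) -> int:
    """Upper bound, in bytes, on the description length of `data`."""
    return len(zlib.compress(data, 9))


if __name__ == "__main__":
    regular = b"ab" * 500          # highly regular: admits a short description
    random_ish = os.urandom(1000)  # random-looking: essentially incompressible

    print(compressed_length(regular))     # small: a few dozen bytes at most
    print(compressed_length(random_ish))  # about as long as the input itself
```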
Giuseppe Primiero, Information and Knowledge: A Constructive Type-Theoretical Approach, Springer, 2008
Mark Burgin, Theory of Information, World Scientific, 2010