There are many theories explicating specific features of naive or intuitive notions of information.
The academic homes of these theories range over philosophy, mathematics (type theory, measure theory / probability theory / statistics), computer science, physics, the communication sciences, psychology and sociology, economics, semiotics, cybernetics, etc.
Information theories in methodological proximity to mathematics include, e.g.,
statistical information theory
semantic information theory
algorithmic information theory
constructive-type-theoretical information theory?
computational complexity theory?
Statistical information theory: Claude Shannon, Warren Weaver, Ronald Fisher
Semantic information theory: Yehoshua Bar-Hillel, Rudolf Carnap
Algorithmic information theory: Ray Solomonoff, Andrey Kolmogorov, Gregory Chaitin
Miscellaneous: Fred Dretske, Keith Devlin, Jon Barwise, Jeremy Seligman, Jaakko Hintikka
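As a concrete illustration of the statistical notion listed above (not part of the original note): Shannon's entropy measures the average information content of a source in bits. A minimal sketch in Python, where `shannon_entropy` is a name chosen here for illustration:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits,
    for a discrete probability distribution given as a list."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

By contrast, algorithmic information theory (Solomonoff, Kolmogorov, Chaitin) measures the information in an individual object by the length of its shortest description, a quantity that is not computable in general.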
Mark Burgin, Theory of Information: Fundamentality, Diversity and Unification, World Scientific, 2010