User:Jkindler1/Books/Information Theory

Information Theory

Understanding Information Structures

Information theory
Entropy (information theory)
Independent and identically distributed random variables
A Mathematical Theory of Communication
Lossless compression
Statistical inference
Natural language processing
Cryptography
Neuroscience
Data compression
Ecology
Quantum computer
Linguistics
Pattern recognition
Anomaly detection
Data analysis
Random variable
Kolmogorov complexity
Algorithmic information theory
Information-theoretic security
Lossy compression
Adaptive system
Artificial intelligence
Complex system
Complex systems
Informatics (academic field)
Machine learning
Systems science
Cryptanalysis
Entropy in thermodynamics and information theory
Probability theory
Common logarithm
Mutual information
Binary logarithm
Natural logarithm
Algorithmic probability
Bayesian inference
Pigeonhole principle
Conditional entropy
Cross entropy
Entropy encoding
Entropy estimation
History of entropy