User:Waterbug89/Books/Entropy
Entropy
An information-theoretic view
Introduction
- Introduction to entropy
- Entropy (order and disorder)
- Entropy (arrow of time)
- History of entropy
- Entropy
- Gibbs' inequality
- Tsallis entropy
- Entropy (statistical thermodynamics)
- Nonextensive entropy
Information theory
- Information
- Information theory
- Entropy in thermodynamics and information theory
- Kullback–Leibler divergence
- Information gain in decision trees
- Differential entropy
- Limiting density of discrete points
- Joint entropy
- Self-information
Mutual Information
- Mutual information
- Multivariate mutual information
- Conditional mutual information
- Pointwise mutual information
Compression
- Shannon's source coding theorem
- Coding theory
- Entropy encoding
- Data compression
- Lossless compression
- Move-to-front transform
- Burrows–Wheeler transform
General topics
- Information gain ratio
- Binary entropy function
- Measure-preserving dynamical system
- Variation of information
- Conditional quantum entropy
- Hartley function
- Fisher information metric
- Entropy rate
- Entropy power inequality
- Conditional entropy
- Rényi entropy
- Generalized entropy index
- Volume entropy
- Von Neumann entropy
- Cross entropy
- Gibbs algorithm
- Topological entropy
- Exformation
- Lyapunov exponent
- Recurrence quantification analysis
- Entropic uncertainty
- Coherent information
- Negentropy
- Inequalities in information theory
- Transfer entropy
Estimation
- Entropy estimation
- Approximate entropy
Maximum Entropy
- Principle of maximum entropy
- Maximum entropy probability distribution
- Maximum-entropy Markov model
Markov chain
- Markov model
- Markov information source
- Information source (mathematics)
- Markov chain
Measures
- Quantities of information
- Bit
- Nat (information)
- Dit (information)
- Ban (information)
Applications
- Entropy monitoring
- Ascendency
- Predictability
- Entropy (computing)
- Perplexity
- Information retrieval
- Latent semantic indexing
- Social entropy
- Entropy and life
- Variety (cybernetics)
Complexity
- Complexity
- Computational complexity theory
- Kolmogorov complexity
- Chain rule for Kolmogorov complexity
- Communication complexity
Mathematical prerequisites
- Random variable
- Bernoulli process
- Randomness
- Independence (probability theory)
- Partition of a set
- Landauer's principle
- Asymptotic equipartition property
- Expected value
- Probability space
- Probability density function
- Partition function (mathematics)
- Logarithm
- Natural logarithm
- Dependent and independent variables
- Pseudocount
- Additive smoothing