User:K Smeltz/Books/Information Statistics

Information

Statistics

Algorithmic information theory
Algorithmic probability
Alternating decision tree
Approximate entropy
Ascendency
Binary combinatory logic
Binary entropy function
Binary lambda calculus
C4.5 algorithm
CHAID
Chain rule for Kolmogorov complexity
Chaitin's constant
Cheung–Marks theorem
Computational indistinguishability
Conditional entropy
Conditional mutual information
Cramér–Rao bound
Cross entropy
Decision rules
Decision stump
Decision tree
Decision tree learning
Decision tree model
Differential entropy
Entropy (information theory)
Entropy encoding
Entropy estimation
Entropy in thermodynamics and information theory
Exformation
Fisher information
Fisher information metric
Gene expression programming
Gibbs algorithm
Gradient boosting
Grafting (decision trees)
ID3 algorithm
Immerman–Szelepcsényi theorem
Incremental decision tree
Inequalities in information theory
Information gain in decision trees
Information gain ratio
Information theory
Jeffreys prior
Joint entropy
Kolmogorov complexity
Kolmogorov structure function
Kullback–Leibler divergence
Landauer's principle
Linear partial information
Logistic model tree
Maximum entropy probability distribution
Measure-preserving dynamical system
Minimum description length
Minimum message length
Multivariate mutual information
Mutual information
Negentropy
Nonextensive entropy
Nyquist–Shannon sampling theorem
Observed information
Partition function (mathematics)
Perplexity
Pointwise mutual information
Principle of maximum entropy
Pruning (decision trees)
Pseudorandom ensemble
Pseudorandom generator
Queap
Random forest
Randomness tests
Rényi entropy
Schwartz–Zippel lemma
Self-information
Shannon's source coding theorem
Shannon–Hartley theorem
Topological entropy
Transfer entropy
Tsallis entropy
Variation of information