# Logical depth

Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett, based on the computational complexity of an algorithm that can recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the running time of a near-minimal-length program, rather than the length of the minimal program.

Formally, in the context of some universal computer ${\displaystyle U}$ the logical depth of a string ${\displaystyle x}$ to significance level ${\displaystyle s}$ is given by ${\displaystyle \min\{T(p):(|p|-|p^{*}|<s)\wedge (U(p)=x)\}}$, the running time of the fastest program that produces ${\displaystyle x}$ and is no more than ${\displaystyle s}$ longer than the minimal program.
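Logical depth is uncomputable for a genuine universal machine, but the definition can be illustrated on a toy program space. The sketch below (the micro-language, its opcodes, and the step costs are all invented for illustration) enumerates every valid toy program, finds the minimal program length ${\displaystyle |p^{*}|}$ for a target string, and then minimizes running time over the programs within the significance level ${\displaystyle s}$ of that length:

```python
from itertools import product

ALPHABET = "ab"

def run(program):
    """Toy interpreter standing in for the universal computer U.
    Opcodes (invented for this sketch):
      'L' + w      -> output w verbatim; runs in len(w) steps
      'R' + d + c  -> output character c repeated d times; the loop is
                      charged 2 steps per character, so this program is
                      shorter to write but slower to run
    Returns (output, steps), or None for a malformed program."""
    if not program:
        return None
    op, rest = program[0], program[1:]
    if op == "L":
        return rest, len(rest)
    if op == "R" and len(rest) == 2 and rest[0].isdigit() and rest[1] in ALPHABET:
        n = int(rest[0])
        return rest[1] * n, 2 * n
    return None

def all_programs(max_body=5):
    """Enumerate every syntactically valid toy program."""
    for length in range(max_body + 1):
        for chars in product(ALPHABET, repeat=length):
            yield "L" + "".join(chars)
    for d in "0123456789":
        for c in ALPHABET:
            yield "R" + d + c

def logical_depth(x, s):
    """min{ T(p) : |p| - |p*| <= s  and  U(p) = x }  on the toy machine."""
    hits = []
    for p in all_programs():
        result = run(p)
        if result is not None and result[0] == x:
            hits.append((len(p), result[1]))
    if not hits:
        return None
    shortest = min(length for length, _ in hits)          # |p*|
    return min(t for length, t in hits if length - shortest <= s)
```

For `x = "aaaa"` the minimal program is the three-symbol loop `"R4a"`, so at significance level 0 the depth is its running time, 8 steps; at significance level 2 the five-symbol literal `"Laaaa"` also qualifies, and the depth drops to 4. This shows how the significance level guards the depth against fast but non-minimal programs.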