Computation time

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Pascal.Tesson at 03:56, 26 September 2006 (cat). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In computational complexity theory, computation time is a measure of how many steps some abstract machine uses in a particular computation. For any given model of abstract machine, the computation time used by that machine is a computational resource that can be spent to solve computational problems. Many important complexity classes are defined in terms of a certain amount of computation time on a certain abstract machine. These time classes share many characteristics, but their relationships to one another, and to complexity classes defined in terms of other resources, remain largely open.
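As an illustration of such a definition (standard notation, not specific to this revision), the deterministic time classes and the class P can be written as:

```latex
\mathrm{DTIME}(f(n)) = \{\, L \mid L \text{ is decided by some deterministic Turing machine in } O(f(n)) \text{ steps}\,\},
\qquad
\mathrm{P} = \bigcup_{k \geq 1} \mathrm{DTIME}(n^{k}).
```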

The most common model of abstract machine used to count computation time is the Turing machine. Any Turing-machine-like abstract machine which has a state control and a tape head can measure computation time, in terms of the number of steps made by the state control and tape head. Thus, for various abstract models, we can define different computational resources: deterministic time on a deterministic Turing machine, nondeterministic time on a nondeterministic Turing machine, quantum time on a quantum Turing machine, etc. For a nondeterministic machine, the computation time on an input is the depth of the computation tree on that input; for a deterministic machine, the tree is a single path, so this is simply the number of steps taken.
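The step-counting idea above can be sketched concretely. The following is a minimal illustration (not from this article): a one-tape deterministic Turing machine simulator that charges one unit of computation time per transition of the state control and tape head. The machine, transition encoding, and state names are assumptions chosen for the example.

```python
def run_tm(transitions, input_str, start='q0', accept='qa', reject='qr', blank='_'):
    """Run a one-tape deterministic Turing machine.

    transitions maps (state, symbol) -> (new_state, written_symbol, move),
    where move is 'L' or 'R'. Returns (halting_state, steps), where steps
    is the machine's computation time on this input.
    """
    tape = dict(enumerate(input_str))  # sparse tape; absent cells are blank
    head, state, steps = 0, start, 0
    while state not in (accept, reject):
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == 'R' else -1
        steps += 1  # each transition costs one unit of computation time
    return state, steps

# Example machine: scan right over the input and accept at the first blank.
scan_right = {
    ('q0', '0'): ('q0', '0', 'R'),
    ('q0', '1'): ('q0', '1', 'R'),
    ('q0', '_'): ('qa', '_', 'R'),
}

state, steps = run_tm(scan_right, '10110')
# On an input of length n this machine takes n + 1 steps (one per input
# cell, plus one to read the blank), so its computation time is n + 1.
```

Measuring nondeterministic time the same way would require following every branch of the transition relation and taking the depth of the resulting computation tree rather than the length of a single run.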

Computation time measures satisfy time hierarchy theorems, meaning that a sufficiently greater (asymptotically) amount of computation time always allows the solution of strictly more problems, yielding a strictly larger complexity class.
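For deterministic time, the hierarchy theorem is commonly stated as follows (standard form for multitape Turing machines, not quoted from this revision): for any time-constructible function f,

```latex
\mathrm{DTIME}\!\left(o\!\left(\frac{f(n)}{\log f(n)}\right)\right) \subsetneq \mathrm{DTIME}\big(f(n)\big).
```

For example, this gives the strict inclusion $\mathrm{DTIME}(n) \subsetneq \mathrm{DTIME}(n^{2})$.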