State space

[Figure: Vacuum World, a shortest path problem with a finite state space]

In the theory of discrete dynamical systems, a state space is the set of all possible configurations of a system.[1] For example, a system in queueing theory that records the number of customers in a line would have state space {0, 1, 2, 3, ...}. State spaces can be either infinite or finite. An example of a finite state space is that of the toy problem Vacuum World, in which there is a limited set of configurations that the vacuum and dirt can be in.
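To make the finiteness concrete, the full state space of Vacuum World can be enumerated directly; the sketch below assumes the common two-square formulation (the vacuum occupies one of two squares, and each square is independently dirty or clean), which is an assumption rather than a detail fixed by this article:

```python
from itertools import product

# Enumerate the finite state space of a two-square Vacuum World
# (an assumed formulation: vacuum on one of two squares, each square
# independently dirty or clean). A state is a tuple
# (vacuum_position, left_square_dirty, right_square_dirty).
states = list(product(("L", "R"), (False, True), (False, True)))
print(len(states))  # 2 positions * 2 * 2 dirt flags = 8 states
```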

Definition

The state space of a dynamical system can be modeled as a directed graph in which each possible state of the system is represented by a vertex, and there is a directed edge from a to b if and only if f(a) = b, where the function f defines the dynamical system.
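A minimal sketch of this construction, where the dynamical system f(x) = x² mod 11 and its state set {0, ..., 10} are made-up examples chosen only for illustration:

```python
# Build the state-space graph of a discrete dynamical system: one vertex
# per state and a directed edge a -> b exactly when f(a) = b.
# The system f(x) = x**2 % 11 on the states {0, ..., 10} is a made-up example.
def f(x):
    return x ** 2 % 11

states = range(11)
edges = [(a, f(a)) for a in states]  # each vertex has exactly one outgoing edge
print(edges)  # [(0, 0), (1, 1), (2, 4), (3, 9), (4, 5), ...]
```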

State spaces are useful in computer science as a simple model of machines. Formally, a state space can be defined as a tuple [N, A, S, G] (a data-structure sketch follows the list below) where:

  • N is a set of states
  • A is a set of arcs connecting the states
  • S is a nonempty subset of N that contains the start states
  • G is a nonempty subset of N that contains the goal states
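
A minimal sketch of the [N, A, S, G] tuple as a data structure; the concrete three-state instance below is made up for illustration:

```python
from dataclasses import dataclass

# A sketch of the [N, A, S, G] tuple as a data structure; the concrete
# three-state instance below is made up for illustration.
@dataclass(frozen=True)
class StateSpace:
    N: frozenset  # set of states
    A: frozenset  # set of arcs, as (state, state) pairs
    S: frozenset  # nonempty subset of N containing the start states
    G: frozenset  # nonempty subset of N containing the goal states

space = StateSpace(
    N=frozenset({1, 2, 3}),
    A=frozenset({(1, 2), (2, 3)}),
    S=frozenset({1}),
    G=frozenset({3}),
)
assert space.S and space.G                        # both must be nonempty
assert space.S <= space.N and space.G <= space.N  # and subsets of N
```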

Properties

A state space has some common properties. In many games, for instance, the effective state space is small compared to the set of all possible states: in chess, the effective state space is the set of positions that can be reached by game-legal moves, which is far smaller than the set of positions that can be achieved by placing combinations of the available chess pieces directly on the board.

State space search explores a state space.
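As an illustration of state space search, the sketch below runs breadth-first search over the two-square Vacuum World state space assumed earlier; the state encoding and action names are assumptions, and breadth-first search is just one of several strategies such a search could use:

```python
from collections import deque

# Breadth-first state space search for the two-square Vacuum World
# (state encoding and action names are assumptions for illustration).
# A state is (vacuum_position, left_square_dirty, right_square_dirty);
# the goal is any state in which both squares are clean.

def successors(state):
    pos, left, right = state
    yield "Left", ("L", left, right)
    yield "Right", ("R", left, right)
    if pos == "L":
        yield "Suck", (pos, False, right)  # clean the left square
    else:
        yield "Suck", (pos, left, False)   # clean the right square

def bfs(start):
    frontier = deque([(start, [])])  # (state, action path from start)
    visited = {start}
    while frontier:
        state, path = frontier.popleft()
        if not state[1] and not state[2]:  # goal test: no dirt anywhere
            return path
        for action, nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, path + [action]))
    return None  # no goal state is reachable

print(bfs(("L", True, True)))  # ['Suck', 'Right', 'Suck']
```

Because breadth-first search expands states in order of path length, the first goal state it dequeues yields a shortest action sequence, matching the shortest path framing in the figure caption above.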


References