Dependency relation
In computer science, in particular in concurrency theory, a dependency relation is a binary relation that is finite,[1]: 4 symmetric, and reflexive;[1]: 6 i.e. a finite tolerance relation. That is, it is a finite set of ordered pairs $D$, such that
- If $(a, b) \in D$ then $(b, a) \in D$ (symmetric)
- If $a$ is an element of the set on which the relation is defined, then $(a, a) \in D$ (reflexive)
In general, dependency relations are not transitive; thus, they generalize the notion of an equivalence relation by discarding transitivity.
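For illustration only, the two defining properties can be checked mechanically. The following Python sketch assumes the relation is given as a set of pairs; the function name `is_dependency_relation` is ours, not from the cited literature:

```python
def is_dependency_relation(alphabet, D):
    """Test whether D is a dependency relation over `alphabet`:
    a finite, symmetric, reflexive set of ordered pairs."""
    symmetric = all((b, a) in D for (a, b) in D)    # (a,b) in D implies (b,a) in D
    reflexive = all((a, a) in D for a in alphabet)  # every letter depends on itself
    return symmetric and reflexive
```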
If $\Sigma$ (also called the alphabet) denotes the set on which $D$ is defined, then the independency induced by $D$ is the binary relation
$$I = (\Sigma \times \Sigma) \setminus D$$
That is, the independency is the set of all ordered pairs that are not in $D$. The independency relation is symmetric and irreflexive. Conversely, given any symmetric and irreflexive relation $I$ on a finite alphabet $\Sigma$, the relation
$$D = (\Sigma \times \Sigma) \setminus I$$
is a dependency relation.
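A minimal sketch of this duality, assuming relations are represented as Python sets of pairs (the names `independency` and `dependency` are ours):

```python
from itertools import product

def independency(alphabet, D):
    """Independency induced by D: all ordered pairs of letters not in D."""
    return set(product(alphabet, repeat=2)) - D

def dependency(alphabet, I):
    """Dependency induced by a symmetric, irreflexive I: its complement."""
    return set(product(alphabet, repeat=2)) - I
```

Since each construction is the set complement of the other within $\Sigma \times \Sigma$, applying them in succession recovers the original relation.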
The pairs $(\Sigma, D)$ and $(\Sigma, I)$,[citation needed] or the triple $(\Sigma, D, I)$ (with $I$ induced by $D$), are sometimes called the concurrent alphabet[citation needed] or the reliance alphabet. In this case, elements $a, b \in \Sigma$ are called dependent if $(a, b) \in D$ holds, and independent otherwise (i.e. if $(a, b) \in I$ holds).[1]: 6
Given a reliance alphabet $(\Sigma, D, I)$, a symmetric and irreflexive relation $\doteq$ can be defined on the free monoid $\Sigma^*$ of all possible strings of finite length by: $x a b y \doteq x b a y$ for all strings $x, y \in \Sigma^*$ and all independent symbols $a, b$ (i.e. $(a, b) \in I$). The equivalence closure of $\doteq$ is denoted $\equiv$ or $\equiv_{(\Sigma, D, I)}$ and called $(\Sigma, D, I)$-equivalence. Informally, $p \equiv q$ holds if the string $p$ can be transformed into $q$ by a finite sequence of swaps of adjacent independent symbols. The equivalence classes of $\equiv$ are called traces,[1]: 7–8 and are studied in trace theory.
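The swap-based definition can be made concrete by enumerating a trace explicitly. The sketch below performs a breadth-first search over single swaps of adjacent independent symbols; `trace_of` is a hypothetical name, and strings are plain Python strings over single-character symbols:

```python
from collections import deque

def trace_of(word, I):
    """Equivalence class (trace) of `word`: all strings reachable by
    repeatedly swapping adjacent independent symbols, per I."""
    seen = {word}
    queue = deque([word])
    while queue:
        w = queue.popleft()
        for i in range(len(w) - 1):
            if (w[i], w[i + 1]) in I:                     # adjacent independent pair
                s = w[:i] + w[i + 1] + w[i] + w[i + 2:]   # swap it
                if s not in seen:
                    seen.add(s)
                    queue.append(s)
    return seen
```

The search terminates because every string in the class is a permutation of `word`, so the class is finite.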
Examples
Given the alphabet $\Sigma = \{a, b, c\}$, a possible dependency relation is $D = \{a, b\} \times \{a, b\} \;\cup\; \{a, c\} \times \{a, c\} = \{(a,b), (b,a), (a,c), (c,a), (a,a), (b,b), (c,c)\}$.
The corresponding independency is $I = \{(b, c), (c, b)\}$. Then e.g. the symbols $b$ and $c$ are independent of one another, while e.g. $a$ and $b$ are dependent. The string $acbba$ is equivalent to $abcba$ and to $abbca$, but to no other string.
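Running the `trace_of` sketch from above on this example (under the same assumptions) reproduces the three-element trace:

```python
I = {("b", "c"), ("c", "b")}
print(sorted(trace_of("acbba", I)))  # ['abbca', 'abcba', 'acbba']
```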
References