Round-trip delay time
In telecommunications, the round-trip delay time (RTD) or round-trip time (RTT) is the length of time it takes for a signal to be sent plus the length of time it takes for an acknowledgment of that signal to be received. This time delay therefore includes the propagation times for the paths between the two endpoints of the communication.
Network links with both a high bandwidth and a high RTT can have a very large amount of data (the bandwidth-delay product) "in flight" at any given time. Such "long fat pipes" require a special protocol design. One example is the TCP window scale option.
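The bandwidth-delay product mentioned above is simply bandwidth multiplied by RTT. A minimal sketch, using illustrative link parameters that are not from this article:

```python
def bandwidth_delay_product(bandwidth_bps, rtt_seconds):
    """Return the bandwidth-delay product in bits: the maximum
    amount of data that can be "in flight" on the link at once."""
    return bandwidth_bps * rtt_seconds

# Hypothetical "long fat pipe": 100 Mbit/s link with a 100 ms RTT.
bdp_bits = bandwidth_delay_product(100e6, 0.100)
print(bdp_bits / 8 / 1024)  # in-flight data expressed in KiB
```

For this example the pipe holds roughly 1.2 MiB in flight, far more than the 64 KiB maximum window of classic TCP, which is why options such as TCP window scaling exist.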
The RTT was originally estimated in TCP by:
- RTT = (α · Old_RTT) + ((1 − α) · New_Round_Trip_Sample)
Where α is a constant weighting factor (0 ≤ α < 1). Choosing a value of α close to 1 makes the weighted average immune to changes that last a short time (e.g., a single segment that encounters a long delay). Choosing a value of α close to 0 makes the weighted average respond to changes in delay very quickly.
This was improved by the Jacobson/Karels algorithm, which takes standard deviation into account as well.
Once a new RTT is calculated, it is entered into the equation above to obtain an average RTT for that connection, and the procedure continues for every new calculation.
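The exponentially weighted moving average above can be sketched as follows; the value of α and the sample sequence are illustrative assumptions, not part of the original specification:

```python
# Sketch of the original TCP smoothed-RTT estimator.
ALPHA = 0.875  # assumed weighting factor; close to 1 => slow to react

def update_rtt(old_rtt, new_sample, alpha=ALPHA):
    """RTT = (alpha * Old_RTT) + ((1 - alpha) * New_Round_Trip_Sample)"""
    return alpha * old_rtt + (1 - alpha) * new_sample

rtt = 0.100  # initial estimate: 100 ms
# Hypothetical samples in seconds; one segment sees a long delay.
for sample in [0.100, 0.105, 0.300, 0.102]:
    rtt = update_rtt(rtt, sample)
print(round(rtt, 4))
```

With α = 0.875 the single 300 ms outlier nudges the estimate only modestly, illustrating why values of α near 1 damp short-lived changes. The Jacobson/Karels refinement additionally tracks the deviation of the samples from this mean when computing the retransmission timeout.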
- Definition by the National Telecommunications and Information Administration's Institute for Telecommunication Sciences in Boulder, Colorado