Delay-gradient congestion control

In computer networking, delay-gradient congestion control refers to a class of congestion control algorithms that react to changes (the gradient) in round-trip time (RTT), as opposed to classical methods that react to packet loss[1] or to an RTT threshold being exceeded.[2][3] Such algorithms include CAIA Delay-Gradient (CDG)[2][4] and TIMELY.[3]
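
The core mechanism is easy to illustrate: the sender reduces its congestion window when RTT is trending upward (queues growing), rather than waiting for loss. The following minimal sketch is loosely modeled on CDG's probabilistic backoff, in which the backoff probability grows with the smoothed RTT gradient;[2] the parameter values (G, BETA, SMOOTH_N) and all names here are illustrative assumptions, not constants taken from CDG or TIMELY.

```python
import math
import random

# Illustrative parameters (assumptions, not values from any standard):
G = 3.0          # gradient scale for the backoff probability (ms)
BETA = 0.7       # multiplicative backoff factor applied to cwnd
SMOOTH_N = 8     # window length for the moving average of gradients

class DelayGradientSketch:
    """Minimal sketch of delay-gradient congestion control.

    The sender tracks how measured RTT changes between intervals and
    backs off with a probability that grows with the smoothed RTT
    gradient, instead of waiting for packet loss.
    """

    def __init__(self, initial_cwnd=10.0):
        self.cwnd = initial_cwnd   # congestion window, in packets
        self.prev_rtt = None       # RTT from the previous interval
        self.gradients = []        # recent raw gradients, for smoothing

    def on_rtt_sample(self, rtt):
        """Update cwnd from one per-interval RTT sample (milliseconds)."""
        if self.prev_rtt is None:
            self.prev_rtt = rtt
            return self.cwnd

        # Raw gradient: change in RTT since the previous interval.
        g = rtt - self.prev_rtt
        self.prev_rtt = rtt

        # Smooth with a short moving average to damp measurement noise.
        self.gradients.append(g)
        if len(self.gradients) > SMOOTH_N:
            self.gradients.pop(0)
        g_smooth = sum(self.gradients) / len(self.gradients)

        if g_smooth > 0:
            # Rising RTT suggests queue growth: back off probabilistically,
            # with probability approaching 1 as the gradient grows.
            p_backoff = 1.0 - math.exp(-g_smooth / G)
            if random.random() < p_backoff:
                self.cwnd = max(1.0, self.cwnd * BETA)
                return self.cwnd

        # Otherwise grow the window additively, as in classic AIMD.
        self.cwnd += 1.0
        return self.cwnd
```

TIMELY applies the same idea in rate-based form, adjusting the sending rate according to a normalized RTT gradient rather than adjusting a window.[3]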

References

  1. Jonathan Corbet (20 May 2015). "Delay-gradient congestion control". LWN.net.
  2. David A. Hayes; Grenville Armitage (May 2011). "Revisiting TCP Congestion Control Using Delay Gradients" (PDF). 10th International IFIP TC 6 Networking Conference (NETWORKING 2011).
  3. Radhika Mittal; Vinh The Lam; Nandita Dukkipati; Emily Blem; Hassan Wassel; Monia Ghobadi; Amin Vahdat; Yaogong Wang; David Wetherall; David Zats (2015). "TIMELY: RTT-based Congestion Control for the Datacenter". SIGCOMM 2015.
  4. Grenville Armitage; Naeem Khademi (2013). "Using Delay-Gradient TCP for Multimedia-Friendly 'Background' Transport in Home Networks". IEEE 38th Conference on Local Computer Networks (LCN 2013).