Question: Suppose that TCP is measuring RTTs of 1.0 second, with a mean deviation of 0.1 second. Suddenly the RTT jumps to 5.0 seconds, with no deviation. Compare the behaviors of the original and Jacobson/Karels algorithms for computing TimeOut. Specifically, how many timeouts are encountered with each algorithm? What is the largest TimeOut calculated? Use δ = 1/8.
Step by Step Solution
Okay, let's discuss this by first stating what the two algorithms are. The original algorithm estimates the RTT (Round Trip Time) using an exponentially weighted moving average, EstimatedRTT = α × EstimatedRTT + (1 − α) × SampleRTT, and sets TimeOut = 2 × EstimatedRTT. The Jacobson/Karels algorithm additionally tracks the mean deviation: Difference = SampleRTT − EstimatedRTT, EstimatedRTT = EstimatedRTT + δ × Difference, Deviation = Deviation + δ × (|Difference| − Deviation), and TimeOut = EstimatedRTT + 4 × Deviation. With δ = 1/8, each new 5.0-second sample is blended into the estimates, and a segment times out whenever the TimeOut in force when it was sent is smaller than its actual 5.0-second RTT.
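To see how the two rules behave once every new sample is 5.0 seconds, here is a small Python sketch. It assumes the textbook forms of both rules (TimeOut = 2 × EstimatedRTT for the original algorithm and TimeOut = EstimatedRTT + 4 × Deviation for Jacobson/Karels, both with gain δ = 1/8), that every segment after the jump still produces a 5.0-second measurement even if it had to be retransmitted, and that Karn/Partridge sampling rules and exponential backoff are ignored; the counts it prints follow from those simplifications, not from the book's official solution.

```python
# A minimal simulation sketch, not the official textbook solution. Assumptions
# beyond the problem statement:
#   * Original algorithm: EstimatedRTT is an exponentially weighted average
#     with gain delta = 1/8, and TimeOut = 2 * EstimatedRTT.
#   * Jacobson/Karels: TimeOut = EstimatedRTT + 4 * Deviation, delta = 1/8.
#   * Every segment after the jump yields a 5.0 s SampleRTT, even if it timed
#     out and was retransmitted (Karn/Partridge and backoff are ignored).
#   * A timeout is counted whenever the TimeOut in force when the segment was
#     sent is below its actual 5.0 s RTT.

DELTA = 1 / 8
NEW_RTT = 5.0
ROUNDS = 10          # enough samples to watch both estimators adapt

def original(est=1.0):
    """Original TCP: EWMA of SampleRTT, TimeOut = 2 * EstimatedRTT."""
    timeouts, largest = 0, 0.0
    for r in range(1, ROUNDS + 1):
        timeout = 2 * est
        largest = max(largest, timeout)
        if timeout < NEW_RTT:                       # timer fires before the ack
            timeouts += 1
        print(f"  round {r:2d}: TimeOut = {timeout:5.2f} s")
        est = (1 - DELTA) * est + DELTA * NEW_RTT   # fold in the 5.0 s sample
    return timeouts, largest

def jacobson_karels(est=1.0, dev=0.1):
    """Jacobson/Karels: TimeOut = EstimatedRTT + 4 * Deviation."""
    timeouts, largest = 0, 0.0
    for r in range(1, ROUNDS + 1):
        timeout = est + 4 * dev
        largest = max(largest, timeout)
        if timeout < NEW_RTT:
            timeouts += 1
        print(f"  round {r:2d}: TimeOut = {timeout:5.2f} s")
        diff = NEW_RTT - est                        # fold in the 5.0 s sample
        est += DELTA * diff
        dev += DELTA * (abs(diff) - dev)
    return timeouts, largest

print("Original algorithm")
print("  timeouts = %d, largest TimeOut = %.2f s" % original())
print("Jacobson/Karels algorithm")
print("  timeouts = %d, largest TimeOut = %.2f s" % jacobson_karels())
```

Under these simplifications the original algorithm keeps timing out until 2 × EstimatedRTT climbs past 5.0 s, while Jacobson/Karels stops timing out after only a couple of samples because the rapidly growing Deviation term inflates its TimeOut. Note that the "largest TimeOut" each run reports depends on how many 5.0 s samples you let it process.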
