Suppose Host A wants to send a large file to Host B. The path from Host A to Host B has three links, of rates R1 = 500 kbps, R2 = 2 Mbps, and R3 = 1 Mbps.

a. Assuming no other traffic in the network, what is the throughput for the file transfer?
b. Suppose the file is 4 million bytes. Roughly how long will it take to transfer the file to Host B?
c. Repeat (a) and (b), but now with R2 reduced to 100 kbps.

 

While calculating the transfer time, why are we not using $4 \text{ million bytes} \times \left(\frac{1}{500\text{ kbps}} + \frac{1}{2\text{ Mbps}} + \frac{1}{1\text{ Mbps}}\right)$ instead of $\frac{4\text{ million bytes}}{500\text{ kbps}}$?
1 comment

Is this question from Kurose?

1 Answer

(a) Throughput is limited by the minimum of the link capacities. Here the minimum is R1, so the throughput is 500 kbps.

(b) Divide the file size by the throughput to get the approximate time to transfer the file to Host B.

$t = \frac{4 \times 10^6 \times 8}{500 \times 10^3} = 64$ sec.
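
If you want to check the arithmetic, here is a minimal Python sketch of the same computation. The helper names are illustrative, not from any library:

```python
# Bottleneck-throughput calculation for links traversed in sequence.
# Helper names here are illustrative, not from any library.

def throughput_bps(link_rates_bps):
    """End-to-end throughput of sequential links is the slowest link's rate."""
    return min(link_rates_bps)

def transfer_time_sec(file_size_bytes, link_rates_bps):
    """Approximate transfer time: file size in bits over the bottleneck rate."""
    return file_size_bytes * 8 / throughput_bps(link_rates_bps)

# Parts (a) and (b): R1 = 500 kbps, R2 = 2 Mbps, R3 = 1 Mbps
rates = [500e3, 2e6, 1e6]
print(throughput_bps(rates))          # 500000.0 bps = 500 kbps
print(transfer_time_sec(4e6, rates))  # 64.0 seconds
```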

(c) With R2 reduced to 100 kbps, the bottleneck is now R2, so the throughput is 100 kbps.

Time to transfer: $t = \frac{4 \times 10^6 \times 8}{100 \times 10^3} = 320$ sec.
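
Reusing the helpers sketched above, part (c) is just the same call with R2 lowered:

```python
# Part (c): R2 reduced to 100 kbps makes it the new bottleneck
rates_c = [500e3, 100e3, 1e6]
print(throughput_bps(rates_c))          # 100000.0 bps = 100 kbps
print(transfer_time_sec(4e6, rates_c))  # 320.0 seconds
```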

The links R1, R2, and R3 are traversed in sequence, and the file is pipelined through them packet by packet: while one packet crosses R2, the next is already being transmitted on R1. Because the transmissions overlap in time, the whole transfer proceeds at the pace of the slowest link, which is why we do not add up the individual per-link transfer times.
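
To see concretely why the per-link times are not summed (the formula suggested in the comment above), compare the two calculations, assuming negligible propagation delay:

$t_{sum} = 4 \times 10^6 \times 8 \left(\frac{1}{500 \times 10^3} + \frac{1}{2 \times 10^6} + \frac{1}{1 \times 10^6}\right) = 64 + 16 + 32 = 112$ sec.

That sum would be the answer only if each router had to receive the entire file before forwarding any of it. Since the file is broken into packets and the three links transmit simultaneously, the end-to-end time collapses to the bottleneck term alone: roughly 64 sec, as computed in (b).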
