Question
Consider two computers, A and B, connected by a wire that transmits at a rate of 1 Mbps (1 million bits/second). Suppose that A sends 200 bits to B and that the distance between A and B is 50 km.
a) How long is it from when A transmits the first bit until B receives it, assuming that data travels at the speed of light (3 x 10^8 meters/second)?
b) How long does it take A to put all 200 bits onto the wire?
c) How long is it from when A starts sending the bits until B has completely received them?
Please show all work.
Explanation / Answer
Light travels at 3 x 10^8 meters per second = 300,000 km per second.

a) Propagation delay: to travel 50 km, a bit needs 50 / 300,000 s = 1.667 x 10^-4 s ≈ 166.7 microseconds.

b) Transmission delay: bits are transmitted at 1 million bits per second, so they are put onto the wire at that same rate. Putting out all 200 bits takes 200 / 1,000,000 s = 200 microseconds.

c) Say A starts sending at time T1. It needs 200 microseconds to put all the bits onto the wire. After the last bit leaves A, that bit still needs 166.7 microseconds to reach B, and all earlier bits arrive in the meantime. So the total is 200 + 166.7 ≈ 366.7 microseconds.
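As a quick sanity check, here is a minimal Python sketch (not part of the original answer; all variable names are my own) that computes the three delays from the parameters given in the problem:

```python
# Illustrative sketch: propagation, transmission, and total delay
# for the stated link, assuming the problem's parameters.

LINK_RATE_BPS = 1_000_000    # 1 Mbps link
MESSAGE_BITS = 200           # bits A sends to B
DISTANCE_M = 50_000          # 50 km between A and B
SIGNAL_SPEED_MPS = 3e8       # speed of light, per the problem statement

# a) Propagation delay: time for one bit to cross the 50 km wire.
propagation_s = DISTANCE_M / SIGNAL_SPEED_MPS

# b) Transmission delay: time to push all 200 bits onto the wire.
transmission_s = MESSAGE_BITS / LINK_RATE_BPS

# c) Total delay: the last bit leaves A after the transmission delay,
#    then still needs the propagation delay to reach B.
total_s = transmission_s + propagation_s

print(f"a) propagation  = {propagation_s * 1e6:.1f} microseconds")   # ~166.7
print(f"b) transmission = {transmission_s * 1e6:.1f} microseconds")  # 200.0
print(f"c) total        = {total_s * 1e6:.1f} microseconds")         # ~366.7
```

Changing DISTANCE_M or LINK_RATE_BPS lets you re-check the same arithmetic for similar problems.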