Question:
Suppose that a synchronous serial data transmission is clocked by two clocks (one at the sender and one at the receiver) that each has a drift of 1 minute in one year. How long a sequence of bits can be sent before possible clock drift could cause a problem?
Assume that a bit waveform will be good if it is sampled within 40% of its center and that the sender and receiver are resynchronized at the beginning of each frame. Note that the transmission rate is not a factor, as both the bit period and the absolute timing error decrease proportionately at higher transmission rates.
Step by Step Answer: