As I've been reviewing what I've done so far, I started thinking about a way to test the claims in my tutorial about the need for accurate clocks in UART. I recently acquired a logic analyzer, which is fast proving to be an extremely useful tool! I'll probably use it quite a bit in the coming months and years as I try to understand and debug the designs I'm working through.
In any case, I thought it might be interesting to use it to show what happens when clocks are off slightly. The four channels shown here are all measuring the same signal, with the Red channel clocking at the correct rate (9600 baud), Yellow at 3% error (effectively 9888 baud), Green at 6% error (the theoretical worst case from using two +/- 3% clocks, effectively 10,176 baud), and Blue at 10% error (effectively 10,560 baud).
The second image zooms in on the first characters so you can see the framing errors more clearly. The dots placed on each plot show where each channel is sampling the data stream. At 6% error, the software is smart enough to decode the characters, but the stop bit on each is missed, making communication unreliable. In practice, you need better than 5% combined error to have reliable UART communication. (Using identical systems, each needs a clock that is accurate to better than 2.5%.)
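A quick back-of-envelope check of why the stop bit gets missed (my own sketch, not anything the analyzer computes): if the receiver samples at the center of each bit and only resynchronizes on the start edge, then a receiver clock that is a fraction `e` too fast takes its sample for bit `k` at `(k + 0.5) / (1 + e)` transmitter bit periods after the start edge. In 8N1 the stop bit is bit index 9, so we can see which of the four error rates still lands inside it:

```python
def sample_position(bit_index, error):
    """Where the receiver's mid-bit sample for `bit_index` lands,
    measured in transmitter bit periods, when the receiver clock is
    `error` fraction too fast (e.g. 0.06 for 6%)."""
    return (bit_index + 0.5) / (1.0 + error)

STOP_BIT = 9  # 8N1 frame: start (0), data (1-8), stop (9)

for error in (0.00, 0.03, 0.06, 0.10):
    pos = sample_position(STOP_BIT, error)
    ok = STOP_BIT <= pos < STOP_BIT + 1
    print(f"{error:4.0%} error: stop-bit sample at {pos:.2f} bit periods"
          f" -> {'OK' if ok else 'MISSED'}")
```

At 3% the stop-bit sample lands at about 9.22 bit periods, still inside the stop bit; at 6% it lands at about 8.96, back inside the last data bit, which matches what the Green channel shows.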
Note that the 5% figure only holds for the 8N1 encoding I'm using here, with its 10-bit frames. With longer frames, even smaller clock errors have more bits over which to accumulate, so a protocol that sends more than 10 bits per frame needs correspondingly tighter clock accuracy.
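Under the same mid-bit-sampling assumption as above, this generalizes to a simple rule of thumb (my own derivation, not a spec requirement): the sample for the final bit of an N-bit frame stays inside that bit as long as the combined error is below 0.5 / (N - 1).

```python
def max_combined_error(frame_bits):
    """Largest fractional clock mismatch for which the mid-bit sample
    of the final bit (index frame_bits - 1) still lands inside that
    bit, assuming resynchronization only on the start edge."""
    return 0.5 / (frame_bits - 1)

print(f"10-bit 8N1 frame: {max_combined_error(10):.1%}")  # ~5.6%
print(f"12-bit frame:     {max_combined_error(12):.1%}")  # ~4.5%
```

For 8N1 that gives about 5.6%, consistent with the 6% channel failing while 3% succeeds; a 12-bit frame already tightens the budget to about 4.5%.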
Just an interesting little test I put together to try out my fancy logic analyzer.