Graham Davies wrote:
> The main limiting factor for speed is the settling time of the D-to-A and
> comparator. If you terminate a bit period too soon by clocking too fast,
> you may not get a true comparison of the current guess and the value to be
> converted because the D-to-A output and the comparator are still changing.
> This effect is at its worst when you're working on the most significant bits
> because then the D-to-A output swings are largest.
Thank you for the clarification; that more or less confirms my initial
assumptions.

> If you incorrectly guess the most significant bit, it doesn't mean
> that's how far off the result will be because the rest of the bits
> will make up for the bad guess as far as possible. So, you might get
> something like 01111111 when the correct result would be 10000110.

That's interesting. So one could say that the probability of getting a
bit wrong correlates "somehow" with how near the actual value is to that
bit's transition. That makes sense, considering a typical transient
response curve... This was the point I was missing: I thought that if
the MSB was wrong, I couldn't trust the sampled value at all.

Has anybody done any measurements on this topic? It would be interesting
to know how the error increases with clock frequency.

Thank you,
Andreas

_______________________________________________
AVR-chat mailing list
[email protected]
http://lists.nongnu.org/mailman/listinfo/avr-chat
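To make the 01111111-versus-10000110 example above concrete, here is a
minimal sketch of the successive-approximation loop in plain C. The names,
the 8-bit width, and the forced-error switch are all illustrative, not taken
from any particular converter; the comparator is ideal except that its MSB
decision can be deliberately inverted, standing in for a comparison made
before the D-to-A and comparator have settled.

#include <stdio.h>

/* 8-bit successive-approximation loop with an ideal comparator.  If
 * force_msb_error is set, the answer for the first (most significant)
 * bit is inverted, mimicking a decision taken too early. */
static unsigned char sar_convert(unsigned char input, int force_msb_error)
{
    unsigned char code = 0;
    for (int bit = 7; bit >= 0; bit--) {
        unsigned char guess = code | (unsigned char)(1u << bit);
        int keep = (input >= guess);     /* ideal comparator decision  */
        if (force_msb_error && bit == 7)
            keep = !keep;                /* botch only the MSB decision */
        if (keep)
            code = guess;
    }
    return code;
}

int main(void)
{
    unsigned char in = 0x86;             /* 10000110, Graham's example */
    printf("true = 0x%02X  good = 0x%02X  bad MSB = 0x%02X\n",
           (unsigned)in, (unsigned)sar_convert(in, 0),
           (unsigned)sar_convert(in, 1));
    /* prints: true = 0x86  good = 0x86  bad MSB = 0x7F (01111111) --
     * the lower bits pull the result as close as they can. */
    return 0;
}

Every guess after the bad one still steers toward the input, so the damage
from a single wrongly rejected MSB is bounded to roughly the size of the
settling error rather than half of full scale.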

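As for how the error grows with clock frequency: lacking measurements, a
crude simulation can at least show the trend. The sketch below assumes,
purely for illustration, that the D-to-A settles like a single-pole RC
network with time constant TAU, so within one bit period T its output covers
only the fraction 1 - exp(-T/TAU) of each step. Real converters settle in
more complicated ways, so only the shape of the result is meaningful, not
the exact numbers.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define TAU 1.0  /* D-to-A settling time constant, arbitrary units */

/* One 8-bit conversion where the D-to-A output v only gets the fraction
 * (1 - exp(-T/TAU)) of the way to each new guess per bit period T. */
static int sar_convert_settling(int input, double T)
{
    double v = 0.0;                       /* D-to-A output, in LSB units  */
    double frac = 1.0 - exp(-T / TAU);
    int code = 0;
    for (int bit = 7; bit >= 0; bit--) {
        int guess = code | (1 << bit);
        v += ((double)guess - v) * frac;  /* partial settling toward guess */
        if ((double)input >= v)           /* comparator sees the lagging v */
            code = guess;
    }
    return code;
}

int main(void)
{
    /* Sweep the bit period from generous to much too short and report
     * the worst-case conversion error over all 8-bit input codes. */
    for (double T = 5.0; T >= 0.5; T -= 0.5) {
        int worst = 0;
        for (int in = 0; in < 256; in++) {
            int err = abs(sar_convert_settling(in, T) - in);
            if (err > worst)
                worst = err;
        }
        printf("T/tau = %.1f  worst-case error = %3d LSB\n", T / TAU, worst);
    }
    return 0;
}

Compile with e.g. gcc -std=c99 sar_sim.c -lm. In this toy model the
worst-case error climbs steadily as the bit period shrinks relative to the
time constant, which is the qualitative behavior Andreas is asking about;
where the knee sits for real hardware is exactly what a measurement would
pin down.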