Dave Wreski <[EMAIL PROTECTED]> wrote:
> The conversation stemmed from the fact that both implement the V.90
> standard, and how there could be a difference between one vendor's
> implementation and another.
> 
> Does anyone know of a doc on the web that explains the difference in more
> technical terms?

Two modems with correct implementations of V.90 will interoperate with each
other under ideal conditions.  However, under less-than-ideal conditions (i.e.,
real-world conditions), performance may be reduced, or in the worst case, the
connection may fail entirely.

This is because V.90 (like all V-series modem specifications) dictates the
correct modulation of any given data, but does not dictate any specific
algorithms for the demodulator.  All the vendors tend to use the same general
(well-known) demodulation techniques, but the specific implementations may be
better or worse in terms of how much distortion and what types of distortion
they can deal with.  For example, brand A might handle phase jitter better
than brand B, but brand B might be better at dealing with a frequency offset.
Those are only two of the many parameters that come into play.
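
To make that concrete, here's a rough numerical sketch I threw together (just
an illustration, not code from any real modem; the QPSK constellation, the
impairment levels, and the loop gain are all arbitrary) of how much a
demodulator's carrier tracking matters once a frequency offset shows up:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy QPSK link: the standard fixes the transmit constellation, but each
    # receiver is free to track (or not track) carrier impairments as it likes.
    N = 20000
    syms = rng.integers(0, 4, N)
    constellation = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))
    tx = constellation[syms]

    # Made-up channel: small frequency offset (rad/symbol), phase jitter, noise.
    freq_offset = 0.01
    jitter = 0.05 * rng.standard_normal(N)
    noise = 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    rx = tx * np.exp(1j * (freq_offset * np.arange(N) + jitter)) + noise

    # Receiver A: nearest-point decisions, no carrier tracking at all.
    dec_a = np.argmin(np.abs(rx[:, None] - constellation[None, :]), axis=1)
    ser_a = np.mean(dec_a != syms)

    # Receiver B: same decisions plus a crude decision-directed phase tracker.
    phase = 0.0
    dec_b = np.empty(N, dtype=int)
    for n in range(N):
        z = rx[n] * np.exp(-1j * phase)
        d = int(np.argmin(np.abs(z - constellation)))
        dec_b[n] = d
        phase += 0.1 * np.angle(z * np.conj(constellation[d]))  # arbitrary gain
    ser_b = np.mean(dec_b != syms)

    print(f"symbol error rate, no tracking:   {ser_a:.3f}")
    print(f"symbol error rate, with tracking: {ser_b:.3f}")

Both "receivers" see the same standard-conformant signal; only the one that
bothers to track the carrier survives the offset.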

There exist standard models of line impairments for modems to be tested
against, and equipment to simulate arbitrary line impairments, so in practice
this variation does not tend to cause huge problems.  Of course, the major
modem vendors can afford to spend more time and money fine-tuning their
algorithms and filter coefficients than could Fred's Discount Modem Company.
In practice that gap is not as large as it might be, because Fred's will
generally buy a modem chip set that one of the large companies has designed.  On the
other hand, the analog front end design of modems is non-trivial and can
have substantial effects on performance, so the use of a good chip set is
not sufficient to guarantee that the finished product is reliable.
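
For flavor, here's the sort of thing a line simulator does (a minimal sketch
with invented parameter values, not any of the standardized impairment
models); a test harness would sweep these knobs and measure the error rate of
a candidate demodulator at each setting:

    import numpy as np

    def impair(signal, snr_db=30.0, freq_offset=0.005, jitter_rms=0.02,
               clip_level=2.0, rng=None):
        # Apply a made-up set of line impairments to a complex baseband signal.
        rng = rng or np.random.default_rng()
        n = np.arange(len(signal))
        # Carrier frequency offset plus random phase jitter.
        out = signal * np.exp(1j * (freq_offset * n +
                                    jitter_rms * rng.standard_normal(len(signal))))
        # Crude clipping nonlinearity, standing in for analog front end limits.
        mag = np.abs(out)
        out = out * np.minimum(1.0, clip_level / np.maximum(mag, 1e-12))
        # Additive white Gaussian noise at the requested SNR.
        p_sig = np.mean(np.abs(signal) ** 2)
        p_noise = p_sig / 10 ** (snr_db / 10)
        noise = np.sqrt(p_noise / 2) * (rng.standard_normal(len(signal)) +
                                        1j * rng.standard_normal(len(signal)))
        return out + noise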

This sort of thing is not specific to modems; it happens with a lot of
other standards as well.  For instance, the MPEG video and audio compression
standards define what a given bit stream will decompress into, but don't
define a specific compression algorithm.  It's sort of the opposite of the
modem situation, because the constraints are different.
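
A toy way to see that "decoder-defined" flavor (using run-length coding
instead of MPEG, since a real video encoder won't fit in an email): the "spec"
below is only the decode() function, and both encoders are conformant even
though they produce different streams.

    def decode(stream):
        # The "spec": any encoder whose output this reconstructs is conformant.
        out = []
        for count, value in stream:
            out.extend([value] * count)
        return out

    def encode_a(data):
        # Sensible encoder: merge runs of identical values.
        runs = []
        for v in data:
            if runs and runs[-1][1] == v:
                runs[-1][0] += 1
            else:
                runs.append([1, v])
        return [tuple(r) for r in runs]

    def encode_b(data):
        # Lazy encoder: never merges anything.  Bigger output, still conformant.
        return [(1, v) for v in data]

    data = list("aaabccccd")
    for enc in (encode_a, encode_b):
        stream = enc(data)
        assert decode(stream) == data
        print(enc.__name__, "->", len(stream), "tokens")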

For really detailed information, see the book "The Theory and Practice of
Modem Design" by John A.C. Bingham, published by John Wiley & Sons in 1988,
ISBN 0471851086.  It doesn't specifically cover V.34 and V.90, since it
predates them, but it has quite good coverage of the general technology.
I'm glad I bought my copy years ago; it's up to $120 now!  Too bad I didn't
get John to autograph it for me back when we both worked at Telebit.

Cheers,
Eric

