Hi Austin.

Austin Franklin wrote:

> > > If you do the math, you'll find that using a 14-bit
> > > A/D on most CCD scanners is kind of silly; in such
> > > cases, one LSB generally equates to about 10-50
> > > microvolts of signal.
>
> > How do you work out this figure?
> > I make it more like 170 microvolts, since most CCDs have a saturation
> > voltage in the region of 2.8 volts.
>
> I believe both are wrong.  The voltage output range of the CCD has to be
> matched to the input range of the A/D.

Of course it does, but the voltage needed to toggle the LSB of the A/D,
*relative to the maximum voltage from the CCD*, is one sixteen-thousandth
(1/2^14 = 1/16384) of the CCD's maximum voltage, which is about 170 microvolts.
It would be wasteful of dynamic range for a scanner designer not to use the CCD
close to its saturation voltage.
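
For what it's worth, here's that back-of-the-envelope arithmetic as a quick
Python sketch (the 2.8 volt saturation figure is my assumption, taken from the
data sheets I mention below):

    # Rough LSB step for a 14-bit A/D, referenced to a typical CCD
    # saturation voltage (assumed ~2.8 V from the data sheets).
    ccd_vsat = 2.8                  # CCD saturation voltage, volts (assumed)
    bits = 14                       # A/D resolution
    lsb = ccd_vsat / 2**bits        # one LSB step relative to CCD full scale
    print(f"1 LSB = {lsb * 1e6:.0f} microvolts")   # prints ~171 microvolts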

> You can NOT associate the volts/bit
> of the A/D without knowing the input voltage range of the A/D.  And you
> especially can not associate the volts/bit of the output voltage of the CCD
> without knowing the circuitry between the CCD and the A/D, and then the A/D
> capture range.

I wasn't even thinking of the input of the A/D, just the voltage relative to the
CCD's maximum output.

I've looked at the data sheets of nearly all the currently available CCD linear
array sensors, and they really don't vary that much. They saturate at around 2.8
volts and have a dynamic range of around 5000 to 1, whether they're made by
Kodak, Sony or NEC.

A/D converters vary more, of course, but I don't see what the input voltage has
to do with the usability of a 14-bit output. Any A/D converter that couldn't
'see' the voltage required to toggle its LSB wouldn't be of much use.
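
As a rough side calculation (again assuming that 5000:1 figure), the sensor's
dynamic range converts into equivalent bits like this:

    import math

    # Equivalent bit depth of a sensor with a given dynamic range
    # (full-scale signal divided by noise floor).
    dynamic_range = 5000            # assumed, from the data sheets above
    equivalent_bits = math.log2(dynamic_range)
    print(f"~{equivalent_bits:.1f} equivalent bits")   # prints ~12.3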

> > The extra bits also give room for the scanner hardware to take advantage
> > of any improvement in sensor technology that may come along, without a
> > major re-build. A bit of 'future proofing' by the circuit designers.
>
> Again, I disagree completely, that is not how the system is designed.  The
> CCD has an output voltage range that is then amplified or attenuated and
> also voltage shifted to match the input voltage range of the A/D.

Yes, that's true, but most A/D converters made specifically for scanner use have
programmable gain and offset amplifiers built in.
Ref: the Analog Devices AD9816 and AD9814, and the Texas Instruments range of
scanner A/Ds.

Anyway, adding a bit of gain is hardly the basis for an entire design
philosophy.
"Output of CCD = 2.8 volts, input of A/D converter = 4 volts. Ugh! Add gain of
about 1.43."
That's a no-brainer.
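
The arithmetic, for the record (same assumed figures as in the example above):

    # Gain needed to match an assumed 2.8 V CCD output range to an
    # assumed 4 V A/D full-scale input.
    ccd_vmax = 2.8                  # CCD saturation voltage, volts (assumed)
    adc_vfs = 4.0                   # A/D full-scale input, volts (assumed)
    print(f"required gain = {adc_vfs / ccd_vmax:.2f}")   # prints 1.43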
And gain-matching alone doesn't explain why all the scanner manufacturers are
now moving to 14 bits.

I don't think it's entirely a marketing numbers game.

Regards,     Pete.

