Not just that. The common type of CCD/CMOS array currently in use is front
illuminated: the readout wiring and circuitry that serve each pixel sit on
the front surface, so incoming light must pass through and around this
hardware to reach the photosensitive area. As a result, I'm pretty sure that
with front illumination the angle of incidence (the angle at which light
strikes the array) is critical to minimizing distortion from the hardware in
front.
This hardware also forms a layer in front of the actual pixel array itself,
and it is this layer that any dust settles on. So blowing it off with air
will not damage the array itself, since the array sits underneath.
The reason manufacturers are transitioning to rear illumination is that the
definition of each pixel improves once this front hardware is out of the
light path. To work effectively, though, a rear-illuminated design needs the
distance between the illuminated surface and the pixels themselves to be
even smaller than in a front-illuminated one. As a result, rear-illuminated
arrays have to be polished down to a thickness of only about 10 microns,
which makes them more delicate right now.
I'm also of the opinion that, because of the differences between how film
lies and how the array lies, lenses for digital cameras need a flatter field
than is necessary with film. The only cameras in which the film lies
absolutely flat are those with vacuum backs, so there is likely a little
more leeway for field curvature in film cameras than in digital ones, where
the array is absolutely flat to some fraction of a wave of sodium light.
Kent Gittings

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]]On Behalf Of Peifer, William
[OCDUS]
Sent: Monday, October 29, 2001 11:42 AM
To: '[EMAIL PROTECTED]'
Subject: RE: 'analogical' lenses coating and CCD, not fully compatible?


On the subject of "analog" lenses, Tom C. (aimcompute) wrote:
> If any lens of sufficient quality attached to the camera body achieves
> a critical focus on the focal plane, and the image transmitted to the
> focal plane covers the entire sensor area, be it CCD or film, that's all
> that matters....

and Rob Studdert replied:
> Maybe but there is possibly an angle of incidence factor ie film is
> not as sensitive to the angle of the light hitting its surface whereas
> the CCD cells really function optimally when hit by perpendicular
> rays....

Hi Tom, Rob, et al.,

All this talk about "analog" vs. "digital" lenses has got me wondering a
bit.  I'm curious where this whole idea of CCD sensors requiring (or
preferring) perpendicular rays originated.  I'm pretty convinced that it
must have originated because somewhere along the line, something got taken
out of context, and a fundamentally incorrect idea grew from there.  From
the standpoint of the underlying physics, Tom is absolutely right -- the
purpose of a lens is to bring an image to critical focus at the focal plane,
and the nature of the sensor (film, CCD, CMOS, or other) isn't particularly
relevant.  After all, if all the light rays strike the sensor
perpendicularly, then they are necessarily parallel and thus cannot form an
image at the focal plane!
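The geometric point can be quantified: rays converging on an off-axis image point necessarily arrive tilted, and the tilt grows with image height. A back-of-envelope sketch, where the exit-pupil distance is an assumed, illustrative value rather than data for any particular lens:

```python
import math

def incidence_angle_deg(image_height_mm: float, exit_pupil_dist_mm: float) -> float:
    """Chief-ray angle at the focal plane for an off-axis image point.

    The chief ray runs from the center of the exit pupil to the image
    point, so its tilt from the sensor normal is atan(height / distance).
    """
    return math.degrees(math.atan2(image_height_mm, exit_pupil_dist_mm))

# Illustrative: 35mm-frame corner (21.6 mm off axis), exit pupil assumed 60 mm ahead
print(incidence_angle_deg(0.0, 60.0))    # on-axis point: 0 degrees
print(incidence_angle_deg(21.6, 60.0))   # frame corner: roughly 20 degrees
```

So even a well-corrected lens delivers distinctly non-perpendicular rays toward the edges of the frame; perfectly perpendicular rays everywhere would mean no convergence and no image.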

I suspect that this perpendicular-ray story -- dare I say "legend"? -- may
have originated from a misinterpretation of the characteristic behavior of
CCD sensors.  We all know that in single-chip color CCD sensors, some of the
pixels are sensitive to red, others to green, and still others to blue.  For
the case of color cameras with single CCD sensors, color sensitivity is
imparted to a particular pixel by incorporating a microscopic optic -- a
lenslet and filter -- in front of that pixel, which I believe is
accomplished as part of the manufacturing process for the sensor chip.  I
can imagine that the numerical aperture of this microscopic optic may not be
terribly large, and it might very well constrain the field of view of its
corresponding pixel.  Maybe someone that knows more about chip fab can
comment on this.  Anyway, although each individual pixel may very well be
"looking" through an optic with small numerical aperture, it's only
"looking" a very short distance (microns?  tenths of microns?) to the
illuminated spot on the focal plane directly in front of it.  In fact, this
is precisely what you want.  If each pixel had a more "wide-angle" view, it
would not only register the intensity of light directly in front of it, but
it would also register the intensity of light from immediately adjacent

pixels (perhaps pixels intended to sense a different color), resulting in a
spatially and chromatically degraded image.  The characteristics of the
macroscopic, "analog" lens mounted onto the front of the camera -- focal
length, f-number, etc. -- aren't particularly relevant, except that a faster
"analog" lens will make each pixel-size spot of light at the focal plane
correspondingly brighter.
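One way to put rough numbers on the "faster lens means a brighter pixel-size spot" relation: the half-angle of the converging cone at the focal plane is set by the lens's working f-number, and a lenslet's acceptance can be stated the same way through its numerical aperture. The f-numbers below are arbitrary illustrative choices:

```python
import math

def cone_half_angle_deg(f_number: float) -> float:
    """Half-angle of the marginal-ray cone at the image plane: atan(1 / (2N))."""
    return math.degrees(math.atan(1.0 / (2.0 * f_number)))

def numerical_aperture(f_number: float) -> float:
    """Image-side numerical aperture in air: sin of the cone half-angle."""
    return math.sin(math.atan(1.0 / (2.0 * f_number)))

# Illustrative f-numbers: each stop roughly halves the cone angle for slow lenses
for n in (1.4, 2.8, 5.6):
    print(f"f/{n}: half-angle {cone_half_angle_deg(n):.1f} deg, NA {numerical_aperture(n):.2f}")
```

If the lenslet's acceptance cone is at least as wide as the taking lens's marginal-ray cone, the pixel collects the full spot; a faster lens widens that cone and raises the demand on the lenslet, which is consistent with the small-numerical-aperture speculation above.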

Jaume's original question about spectral characteristics of particular
lenses and lens coatings is interesting as well.  The general strategy in
designing the ~lens~ is, among other things, to reduce chromatic aberration;
that is, to get red, green, and blue rays from a single object point to
focus at a single point on the same focal plane.  I think lens ~coatings~
are generally optimized to match the response of the human eye, rather than
the film emulsion.  (Likewise, most film emulsions -- excluding infrared, of
course -- are designed to match the human eye.)  I believe that the general
strategy in designing antireflection coatings (like SMC) is to minimize the
reflective loss of green light, since green is the color our eyes are most
sensitive to.  This doesn't mean that the coated lens passes primarily green
light; rather, it means that for the 1% or 2% of light that would otherwise
be lost at each air-glass interface of an uncoated lens element, the lens
designers try to "rescue" the green component by applying a green-optimized
antireflection coating.  CCDs are more sensitive to the red end of the
spectrum than the human eye.  You might imagine that in order to maximize
the signal level at the focal plane of the CCD, a lens designer might
consider using antireflection coatings optimized for passing red light.
However, this would yield an image with what we would perceive as a highly
perturbed color balance.  In fact, for consumer imaging applications,
designers use filters that ~decrease~ the intensity of far red and near
infrared light impinging on the sensor.  Thus, I can't imagine that consumer
digital camera designers would go to the expense of new lens designs, or of
bodies specific to old vs. new lenses.  (Although that would certainly be
an interesting marketing gimmick....)
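The green-optimized coating idea can be illustrated with the textbook single-layer quarter-wave case: at the design wavelength, a layer of index n1 on a substrate of index ns reflects ((n0*ns - n1^2) / (n0*ns + n1^2))^2 at normal incidence. The sketch below uses assumed, typical textbook indices for magnesium fluoride on crown glass; real multicoatings like SMC are multilayer stacks whose exact recipes are proprietary.

```python
def fresnel_reflectance(n0: float, ns: float) -> float:
    """Normal-incidence reflectance of a bare n0 -> ns interface."""
    return ((n0 - ns) / (n0 + ns)) ** 2

def quarter_wave_reflectance(n0: float, n1: float, ns: float) -> float:
    """Normal-incidence reflectance of a quarter-wave coating (index n1)
    on a substrate (index ns). Valid only at the design wavelength,
    which the designer would place in the green (~550 nm)."""
    return ((n0 * ns - n1 ** 2) / (n0 * ns + n1 ** 2)) ** 2

n_air, n_mgf2, n_glass = 1.0, 1.38, 1.52   # assumed textbook values
print(f"uncoated surface:        {fresnel_reflectance(n_air, n_glass):.1%}")   # ~4.3%
print(f"coated, at design wavel: {quarter_wave_reflectance(n_air, n_mgf2, n_glass):.1%}")  # ~1.3%
```

The loss per surface drops from roughly 4% to about 1% at the design wavelength and climbs back up away from it, which is exactly the "rescue the green component" behavior described above.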

Just as a final aside, I'll mention a pet peeve of mine.  It seems that in
many discussions, we refer to film-based and CCD-based imaging as "analog"
and "digital".  This is really an artificial distinction.  CCDs, after all,
~are~ analog sensors, and the readout electronics for CCDs are analog
circuits.  The only thing that makes "digital" cameras digital is the way
the analog signal array is stored after being read off the CCD sensor.  A
minor point, but a pet peeve nonetheless.
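The aside boils down to a single step: the CCD well and readout chain are analog, and only the final analog-to-digital conversion produces the stored "digital" value. A toy sketch of that quantization step, where the bit depth and full-scale voltage are arbitrary illustrative choices:

```python
def adc(voltage: float, full_scale: float = 1.0, bits: int = 12) -> int:
    """Quantize an analog sensor voltage into an integer code, as a camera's
    A/D converter does after the (entirely analog) CCD readout."""
    levels = 2 ** bits
    code = int(voltage / full_scale * (levels - 1) + 0.5)  # round to nearest
    return max(0, min(levels - 1, code))                   # clamp to valid range

print(adc(0.0))   # 0     (black)
print(adc(0.5))   # 2048  (mid-scale)
print(adc(1.0))   # 4095  (full scale)
```

Everything upstream of this function, in a real camera as in the sketch, is a continuous voltage; only the stored integer is digital.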

Bill Peifer
Rochester, NY
-
This message is from the Pentax-Discuss Mail List.  To unsubscribe,
go to http://www.pdml.net and follow the directions. Don't forget to
visit the Pentax Users' Gallery at http://pug.komkon.org .