on 10/23/01 1:36 AM, Julian Robinson at [EMAIL PROTECTED] wrote:

> And I don't understand the advantage in differentiating between scanner
> pixels and screen pixels or any other pixel - just makes things more complex?
> 
> Julian
> 
> At 15:37 23/10/01, you wrote:
>>
>> I use these terms:
>> Scanner - spi - (scan) samples per inch
>> Monitor - ppi - pixels per inch
>> Printer - dpi - dots (of ink) per inch
>> I think this came from Dan Margulis's "Professional Photoshop"
>> Maris
>> 
>> ----- Original Message -----
>> From: "Rob Geraghty" <[EMAIL PROTECTED]>
>> To: <[EMAIL PROTECTED]>
>> Sent: Monday, October 22, 2001 8:45 PM
>> Subject: filmscanners: Pixels per inch vs DPI
> 

I like Maris' terms.

Differentiation is important at least because a 1440 dpi printer doesn't
print 1440 pixels per inch. It prints 1440 ink dots per inch, and a mosaic
of dots is required to render each image pixel.
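
To make the arithmetic concrete, here's a rough sketch in Python (the
4-dot cell width is a made-up figure for illustration, not the screening
of any particular printer):

    # Made-up numbers for illustration, not any particular printer's screening.
    # A dithering printer approximates one image pixel with a cell of dots,
    # so the pixel resolution is the dot resolution divided by the cell width.

    def effective_ppi(printer_dpi, cell_width):
        """Pixels per inch actually rendered, given ink dots per inch."""
        return printer_dpi / cell_width

    print(effective_ppi(1440, 4))  # -> 360.0 ppi from a 1440 dpi printer

So the dpi figure on the box overstates the pixel resolution by whatever
factor the dither cell consumes.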

With scanners, saying samples per inch tends to suggest samples taken
within the optical resolution of the scanner. (Strictly speaking,
'oversampling' in digital signal processing means sampling above the
minimum required rate; manufacturing artificial samples by interpolating
actual ones is interpolation, or upsampling, and that is what resolutions
quoted beyond the optical limit really are.)
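
If it helps to see the distinction, here's a toy Python sketch of
interpolation (everything in it is my own illustration, nothing from a
real scanner SDK):

    # Toy example only; the function name and method are mine, not from
    # any scanner driver. Doubling the sample count by linear
    # interpolation: the in-between values are computed, not measured.

    def interpolate_2x(samples):
        """Return a list twice as dense, inserting averaged midpoints."""
        out = []
        for a, b in zip(samples, samples[1:]):
            out.append(a)
            out.append((a + b) / 2.0)  # artificial sample
        out.append(samples[-1])
        return out

    print(interpolate_2x([10, 20, 40]))  # -> [10, 15.0, 20, 30.0, 40]

The interpolated values add no new information about the film; only the
measured samples do.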

Raster displays have always been described in terms of pixels, as have
raster imaging applications, such as Photoshop.

Wire Moore
