> Hm. I don't think that this is correct:
> With 1000 dpi in the test backend I get an image with 7874*7874 pixels.
> At 16 bits/sample (RGB) this comes to 354 MB. Xsane stores both the
> original image (at full depth) and the enhanced image at 8 bits/sample,
> which adds another 177 MB in this case. The sum is 531 MB.

Still a hell of a lot of memory...
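
The quoted figures do add up if the data is RGB at 16 bits/sample (three
samples per pixel) - that is the only reading that matches the numbers.
A quick throwaway check, purely for illustration, not xsane code:

/* Back-of-the-envelope check of the figures quoted above.  Assumes
   3-sample RGB data at 16 bits/sample; purely an illustration. */
#include <stdio.h>

int main (void)
{
    double pixels = 7874.0 * 7874.0;    /* 1000 dpi preview         */
    double raw    = pixels * 3 * 2;     /* 3 samples x 2 bytes each */
    double enh    = pixels * 3 * 1;     /* enhanced copy, 8 bits    */

    printf ("raw image: %.0f MB\n", raw / (1024 * 1024));         /* ~355 MB */
    printf ("enhanced:  %.0f MB\n", enh / (1024 * 1024));         /* ~177 MB */
    printf ("total:     %.0f MB\n", (raw + enh) / (1024 * 1024)); /* ~532 MB */
    return 0;
}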

> Maybe a downsampling algorithm would be a good idea, but this is the
> first time I have heard of a backend that produces such large preview
> images. If this is a general problem (and not just one of a single
> backend), then I will think about a downsampling algorithm in xsane.

There is currently at least one scanner on the market (LS-8000ED)
which, in hardware, doesn't offer any resolution low enough to be
useful for previews, and I don't expect the rumoured upcoming Nikon
4x5 scanner to be any different. So the downsampling has to happen in
software. Doing it in the frontend would work, but it would still mean
transferring the entire image to the frontend first. It might therefore
be better to do it in the backend, which in turn means that the backend
will require large amounts of memory (unless the downsampling is done
intelligently, e.g. one scan line at a time). Maybe doing both would be
best...
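
Just to illustrate the "intelligently" part: if the downsampling is a
simple box average done one scan line at a time, the backend only needs
one row of accumulators at the output width, never the whole image. A
rough sketch for 8-bit grayscale data - all names are made up here and
not tied to any existing backend API:

#include <stdlib.h>
#include <string.h>

typedef struct {
    int in_w;                 /* input line width in samples             */
    int factor;               /* integer downsampling factor             */
    int lines_in_acc;         /* input lines accumulated so far          */
    unsigned long *acc;       /* one row of accumulators (in_w / factor) */
} downsampler;

static int downsampler_init (downsampler *d, int in_w, int factor)
{
    d->in_w = in_w;
    d->factor = factor;
    d->lines_in_acc = 0;
    d->acc = calloc (in_w / factor, sizeof (*d->acc));
    return d->acc != NULL;
}

/* Feed one 8-bit grayscale input line.  Returns 1 and fills out_line
   (in_w / factor samples) each time 'factor' input lines are complete,
   0 otherwise.  Trailing pixels that don't fill a full box are dropped. */
static int downsample_line (downsampler *d, const unsigned char *in_line,
                            unsigned char *out_line)
{
    int out_w = d->in_w / d->factor;
    int x;

    for (x = 0; x < out_w * d->factor; x++)
        d->acc[x / d->factor] += in_line[x];

    if (++d->lines_in_acc < d->factor)
        return 0;                       /* box not complete yet */

    for (x = 0; x < out_w; x++)
        out_line[x] = (unsigned char) (d->acc[x] / (d->factor * d->factor));

    memset (d->acc, 0, out_w * sizeof (*d->acc));
    d->lines_in_acc = 0;
    return 1;                           /* one reduced line ready */
}

A backend could push each scan line it reads from the scanner through
something like this and hand only the reduced lines to the frontend, so
peak memory stays at a few kilobytes per row instead of hundreds of MB.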

Anyway, I'm planning to do downsampling in the backend in the
not-too-distant future.

  Andras

===========================================================================
Major Andras
    e-mail: and...@users.sourceforge.net
    www:    http://andras.webhop.org/
===========================================================================
