Stef,

I haven't had a chance to spend any significant amount of time with it. 
I'm hoping to get to it in August when I have some vacation time. There 
is quite a bit of work yet to do on it.

I'll start by stubbing out all the calibration routines and just make 
sure it still scans as well as the old backend. Next I'll slowly add 
back the various calibrations and get them working for X1100 scanners. 
The amount of time a scan takes due to backtracking is unacceptable on 
some of the scans I tried - we're talking about several minutes to do a 
small scan at 150dpi! You thought this was due to the shading 
calibration, but I have my doubts. I'll take it out to see if the 
performance improves. The calibration scans seem to take a lot of time 
as well, but that could be due to the new method for finding the home 
position.

Where did you get the calibration algorithms from (gain, offset, and 
shading)? Do you know if the algorithms will work for scanners with a 
CIS head like the X1100's? Some of the hard-coded ranges look "funny" for 
my scanner...

Fred.


Stéphane VOLTZ wrote:
> On Tuesday 25 July 2006 at 01:00, Fred Odendaal wrote:
>   
>> The X1270 is not supported by the X1100 backend. You are somewhat
>> correct in your assumption of what is happening. The scanner is trying
>> to find the home position by scanning in the reverse direction - looking
>> for a set pattern off the glass. If it can't find it, it will make a
>> loud noise as the scan head bangs against the end. This is bad for your
>> scanner, so I'd advise against continuing to try it.
>>
>> regards,
>> Fred Odendaal
>>
>>     
>
>       Hello Fred,
>
>       By the way, what are your plans regarding the experimental version of 
> the lexmark backend? It has the initial groundwork needed for multi-model 
> support. It also has some improvements, such as calibration and arbitrary 
> scan area sizes.
>
>       As far as I understood, the blocking point you hit was calibration 
> tuning for your model. Have you had some time to take a look?
>
> Regards,
>       Stef
>
>   
From j...@jon.demon.co.uk  Sun Jul 30 10:15:00 2006
From: j...@jon.demon.co.uk (Jon Chambers)
Date: Sun Jul 30 10:15:56 2006
Subject: [sane-devel] Re: dell1600n_net.c in sane-backends-1.0.18
In-Reply-To: <200607300051.taa23...@nts.nts.umn.edu>
References: <200607300051.taa23...@nts.nts.umn.edu>
Message-ID: <pine.lnx.4.64.0607301111390.6...@vadim1.home>


Hi Scott,

Oops!  Well spotted: I will amend this for the next release!

By the way, do you have a Dell 1600n and if so does the driver work ok for 
you (aside from its monstrous size!)?

cheers,
Jon

====================== Jon Chambers =====================
  http://www.jon.demon.co.uk, 020 8575 7097, 07931 961669
=========================================================


On Sat, 29 Jul 2006, Scott S. Bertilson wrote:

>  At line 1765 the code allocates a char array:
>       char tempFilename[TMP_MAX] = "scan.dat";
> I just tried to build it on FreeBSD-6.1 and noticed that
> while installing it took a _very_ long time to strip
> the shared object for dell1600n_net.  Out of curiosity,
> I looked at the size of the shared object and found that
> it was about 300 MB (yes, megabytes).  After poking around,
> I narrowed it down to the line referenced above.  After
> a little more looking around, I found out that
>       TMP_MAX
> is the maximum number of unique _names_ generated by
> the algorithm used whereas "L_tmpnam" is the symbol
> defined to specify the maximum _length_ of the string
> generated.  This is documented on both FreeBSD-6.1
> and Gentoo Linux.  I get a very reasonable 24.5 KB
> object file after changing this.
>  I'm guessing that there must be something interesting
> about the gcc-3.4.4 compiler on my FreeBSD-6.1 machine
> because I don't get a preposterously large object file
> on Gentoo with gcc-3.3.6.
>                               Regards, Scott
>
