Hi Folks,

    Another breakthrough in my OS X/SANE project yesterday: I can now 
display a scanned image in my Cocoa-based interface.  Unfortunately the 
image is distorted and I'm not sure how to go about figuring out what's 
going wrong.  It looks as if each line of pixels is shifted one pixel to 
the left relative to the line above it, until the accumulated shift 
becomes too great and the remainder of the image turns solid black.  
I'm using the default settings (i.e., no calls to sane_control_option), 
so I can make a direct comparison with the output of scanimage run with 
no options set.  You can find the two images side-by-side here:

http://www.pemburnia.com/images/output.jpg
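    One thing I plan to check next is whether the scan geometry itself 
explains the shift: if bytes_per_line is larger than the bytes actually 
needed for pixels_per_line, each row carries padding that my naive PNM 
header wouldn't account for.  Here's the quick console dump I have in 
mind (just a sketch; the field names come from SANE_Parameters in 
sane.h, the rest is my own scaffolding around the existing saneHandle):

    SANE_Parameters parameters;
    SANE_Status     status = sane_get_parameters (saneHandle, &parameters);

    if (status == SANE_STATUS_GOOD) {
        //*** Log the geometry SANE reports for the default settings
        fprintf (stderr, "format = %d, depth = %d\n",
                 parameters.format, parameters.depth);
        fprintf (stderr, "pixels_per_line = %d, bytes_per_line = %d, lines = %d\n",
                 parameters.pixels_per_line, parameters.bytes_per_line,
                 parameters.lines);
        //*** If bytes_per_line exceeds the bytes needed for pixels_per_line,
        //*** each row ends with padding that has to be skipped (or declared
        //*** in the header) before handing the data to Cocoa
    }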

    Here's the relevant code (some details, like adding the PNM header, 
omitted for clarity):

    //*** Create an NSBitmapImageRep to handle conversion from PNM to TIFF
    NSBitmapImageRep * pnm = [NSBitmapImageRep alloc];

    status = sane_get_parameters (saneHandle, &parameters);
    maxlength = (parameters.bytes_per_line * parameters.lines);
    status = sane_start (saneHandle);

    //*** Point the SANE_Byte * imageBuffer at the contents of the NSMutableData * dataBuffer
    imageBuffer = [dataBuffer mutableBytes];

    while (status == SANE_STATUS_GOOD) {
        //*** Read the data from the scanner
        status = sane_read (saneHandle, imageBuffer, maxlength, &bytesReturned);
    }

    //*** Translate the PNM file format into a bitmap representation
    [pnm _initBitmapFromPNM: (NSData *) headerData errorMessage: NULL];
    //*** Write the data into the image object
    [image initWithData: (NSData *) [pnm TIFFRepresentation]];

    return image;
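
    One detail I'm not sure about: as written, every pass through the 
read loop hands sane_read the same starting address, so later chunks 
would overwrite earlier ones rather than follow them.  If the pointer 
does need to advance, this is roughly what I had in mind (a sketch only; 
it assumes dataBuffer was already sized to maxlength, e.g. with 
[NSMutableData dataWithLength: maxlength]):

    SANE_Byte * writePointer = [dataBuffer mutableBytes];
    SANE_Int    totalBytes   = 0;

    while (totalBytes < maxlength) {
        //*** Append each chunk after the data already received
        status = sane_read (saneHandle,
                            writePointer + totalBytes,
                            maxlength - totalBytes,
                            &bytesReturned);
        if (status == SANE_STATUS_EOF)
            break;                    //*** end of the frame -- normal exit
        if (status != SANE_STATUS_GOOD)
            break;                    //*** real error; see sane_strstatus (status)
        totalBytes += bytesReturned;
    }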


Regards,

Mark


