>       When I pluck this document with maxdepth=30 and maxwidth=300, 
> maxdepth gets cut off at about 20, and only the first two albums are 
> shown with usable links (albums 3 and 4 are "offsite", not fetched 
> by Plucker).

        I can confirm this bug is also present in the C++ distiller in 
CVS, though it behaves quite differently while producing similar 
results (failure to fetch all 121 images and related links):

        With _any_ maxdepth setting, I only get the first album's 
links fetched, skipping albums 2 through 4. Tapping the very first 
link in Plucker (a 100x100 thumbnail image of a house) throws this 
error: 

        "Insufficient memory for uncompressing the 
         page. You will need to free up RAM memory."

        I've got plenty of free RAM, and this happens after a soft or 
hard reset, with and without zlib compression enabled.

        With --maxdepth=30 removed from the cplucker build command, 
cplucker fetches 47 links from the first album, then stops fetching, 
completes the document creation, and produces what appears to be a 
corrupted document.

        If I add --tables, keeping the same maxdepth setting (30) as 
in the earlier tests, cplucker fetches only 24 links from the first 
album.

        If I use --tables --maxdepth=300 --maxwidth=300, cplucker 
fetches only 11 links from the first album, and again creates a 
corrupted document.

        In all cases, the documents created are trashed, and I am 
unable to tap on or open even the first link.
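
        For reference, the variants I've been testing look roughly 
like this. The album URL is elided as <url>, and the URL-as-argument 
form is my assumption about the invocation; the flags themselves are 
exactly the ones described above:

    cplucker --maxdepth=30 <url>
        # album 1 only; "Insufficient memory" error on the first link
    cplucker <url>
        # 47 links from album 1, corrupted document
    cplucker --tables --maxdepth=30 <url>
        # 24 links from album 1
    cplucker --tables --maxdepth=300 --maxwidth=300 <url>
        # 11 links from album 1, corrupted document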

        This is just bizarre... 


David A. Desrosiers
[EMAIL PROTECTED]
http://gnu-designs.com
