I am trying to plucker-build a set of HTML documents from my local hard drive. Plucker collects all the documents (4416 of them), but before it starts converting them it says:
    ---- all 4416 pages retrieved and parsed ----
    Writing out collected data...
    Writing document 'LDS Website' to file /home/nathan/.plucker/ldsgetn.pdb
    Killed

I have tried twice and am not sure what is happening here. If I remove a bunch of the documents, it works. Any ideas? Could it be that there is not enough memory on my machine? Plucker is using a very significant amount of memory by this point. Or is there just a limit to how many documents plucker-build will handle? There are a LOT of links between these documents; can there be too many links?

I could split the documents into two groups, but I would prefer not to because of the number of links between the pages. Any suggestions or ideas would be welcome.

Nathan Bullock

=====
Visit my website at http://www.nathanbullock.org

_______________________________________________
plucker-list mailing list
[email protected]
http://lists.rubberchicken.org/mailman/listinfo/plucker-list
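In case it helps with diagnosis: a bare "Killed" with no error from the program itself usually means the kernel sent SIGKILL, most often via the out-of-memory (OOM) killer. One way to check is the shell's exit status right after the process dies (a sketch; the dying plucker-build run is stood in for here by a throwaway subshell that kills itself):

```shell
# A process terminated by signal 9 (SIGKILL) exits with status 128 + 9 = 137.
sh -c 'kill -9 $$'           # stand-in for the plucker-build process being killed
status=$?
echo "exit status: $status"  # 137 would confirm SIGKILL

# On Linux, the OOM killer also logs to the kernel ring buffer, e.g.:
#   dmesg | grep -i 'out of memory'
```

If the status is 137 and dmesg shows an out-of-memory message, it is the machine's memory rather than a document-count limit in plucker-build.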

