On Thu, 03 Aug 2006 19:04:12 -0500
"J. Alejandro Ceballos Z. -JOAL-" <[EMAIL PROTECTED]> wrote:

> 
>  I've created a Perl routine that reads an RTF file and, while
> interpreting it, stores the information in a MySQL database.
> It is like: while not EOF (read line, validate, insert it into the db)
> 
> The problem is that after a number of records (around 1,500,000) an
> "out of memory" message appears and the program stops.
> 
> I think this is because the program runs faster than the information
> is stored, so the queue of "insertions" grows and leaves less memory
> for the process.
> 
> I tried using the finish statement, but if it is right after the
> insert, I get a "nothing to finish" message. If it is at the end...
> the program runs out of memory first. I also tried using a sleep
> command, but it does not help.
> 
> Any idea how to prevent that "out of memory"? Or is it maybe another
> kind of error?


Hello,

I suspect that you read the file, store the lines in an array, and that
array is then queued for insertion but never cleared. You might also
keep opening new files without closing them, although I can't see how
that would cause this particular problem.
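
Just to illustrate what I mean by not keeping anything around: a minimal
sketch of a streaming loop. The DBI calls (connect, prepare, execute) are
real, but the connection details, table, columns and the parse_line helper
are all made-up placeholders:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connection details and table - adjust to your own setup.
my $dbh = DBI->connect('dbi:mysql:database=mydb', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 1 });

# Prepare the statement once, outside the loop.
my $sth = $dbh->prepare('INSERT INTO records (field1, field2) VALUES (?, ?)');

open my $fh, '<', 'input.rtf' or die "Cannot open input.rtf: $!";

while (my $line = <$fh>) {        # read one line at a time, never slurp the file
    chomp $line;
    my ($f1, $f2) = parse_line($line)
        or next;                  # skip lines that do not validate
    $sth->execute($f1, $f2);      # insert immediately; nothing piles up in memory
}

close $fh;
$dbh->disconnect;

# Stand-in for your real RTF parsing/validation.
sub parse_line {
    my ($line) = @_;
    my ($f1, $f2) = split /\s+/, $line, 2;
    return defined $f2 ? ($f1, $f2) : ();
}

The point is that the prepared handle is reused and each line is thrown
away as soon as it is inserted - if your version pushes rows onto an array
"to insert later", that array is exactly the queue that eats your memory.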
One thing I do with PostgreSQL is start a transaction and commit it at
the end. This allows really fast inserts (many thousand per second) -
if something goes wrong, then all uncommitted input is lost... BUT it
goes in really fast, and no long queues are created.
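
In DBI terms that pattern looks roughly like this. AutoCommit, commit and
disconnect are standard DBI; the driver string, table and batch size are
only placeholders:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# AutoCommit => 0 makes DBI open a transaction and keep it open until commit().
# Placeholder connection; use dbi:mysql:... instead if you stay on MySQL
# (the table must be InnoDB for transactions to do anything there).
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 0 });

my $sth = $dbh->prepare('INSERT INTO records (field1, field2) VALUES (?, ?)');

my $count = 0;
while (my $line = <DATA>) {               # stand-in for your read/parse loop
    chomp $line;
    my ($f1, $f2) = split /\s+/, $line, 2;
    next unless defined $f2;
    $sth->execute($f1, $f2);

    # Middle ground: commit every N rows, so a crash loses at most one batch
    # instead of the whole run, and the open transaction never grows huge.
    $dbh->commit if ++$count % 10_000 == 0;
}

$dbh->commit;        # final commit at the end, as described above
$dbh->disconnect;

__DATA__
first record
second record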

But all in all, your error report is too vague. Is there a way of
finding out WHAT exactly produces that error - Perl itself, MySQL, or
the RTF parsing libraries? No one knows your code better than you do,
so try adding debug messages, counters and the like.
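
For instance, a progress counter around the main loop costs almost nothing
and tells you whether the failure always happens around the same line and
how big the process has grown by then (the loop body is just a placeholder):

#!/usr/bin/perl
use strict;
use warnings;

my $count = 0;
while (my $line = <STDIN>) {    # stand-in for your read/validate/insert loop
    $count++;
    if ($count % 100_000 == 0) {
        # On Linux/Unix, ask ps how big this process ($$) currently is, in kB.
        my ($vsz) = `ps -o vsz= -p $$` =~ /(\d+)/;
        warn "processed $count lines, process size: ",
             (defined $vsz ? "$vsz kB" : "unknown"), "\n";
    }
}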
Sorry if this isn't much help to you.

HTH


Cheers,
  Alex

-- 

shakespeare:
/(bb|[^b]{2})/
