On Tue, 2004-06-15 at 08:08, J S wrote:
> Hi,
> 
> I have a perl script which runs a bulk insert. When I run an insert with 
> about 100,000 lines it keels over with the following message:
> 
> DBD::mysql::st execute failed: Out of memory (Needed 6100848 bytes) at 
> ./parse.pl line 227, <> line 100005.
> 
> There is 8GB of memory on the box so I'm sure there is enough memory there. 
> Is there a setting in my.cnf which I need to tweak?

How large is the data?  How much of that 8GB is already used by other
processes?  Have you watched the output of 'top' while the script is
running?  MySQL has tweakable limits on how large a single insert can
be, but this error looks like perl itself is truly running out of
memory rather than being denied by MySQL.
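
If it is the script, one thing that keeps memory flat is to insert row
by row with a prepared statement and commit in chunks, instead of
building one giant multi-row INSERT string for all 100,000 lines.  A
rough sketch (connection details, table, and columns are made up here,
and I'm assuming tab-delimited input -- adjust to whatever parse.pl
actually reads):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Assumed connection details -- substitute your own.
my $dbh = DBI->connect('DBI:mysql:database=test;host=localhost',
                       'user', 'password',
                       { RaiseError => 1, AutoCommit => 0 });

# Prepare once; placeholders keep each row small instead of
# accumulating one huge INSERT statement in memory.
my $sth = $dbh->prepare('INSERT INTO mytable (col1, col2) VALUES (?, ?)');

my $count = 0;
while (<>) {
    chomp;
    my ($col1, $col2) = split /\t/;   # assumed tab-delimited input
    $sth->execute($col1, $col2);

    # Commit every 10,000 rows so neither perl nor the server
    # has to hold the whole batch at once.
    $dbh->commit if ++$count % 10_000 == 0;
}

$dbh->commit;
$dbh->disconnect;

If it does turn out to be MySQL rejecting the statement rather than
perl, max_allowed_packet in my.cnf is the limit I had in mind, but with
this error I'd look at the script's memory use first.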

-- 
. Garth Webb
. [EMAIL PROTECTED]
.

