I did 'handler table_name read limit large_number'. Is there a way I can use a
lower number, but automatically loop through the table so that all of the
records are displayed? Thanks.

-----Original Message-----
From: mos [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 02, 2004 10:37 AM
To: [EMAIL PROTECTED]
Subject: RE: tuning suggestion for large query


At 04:13 PM 9/1/2004, Sun, Jennifer wrote:
>Thanks Mike.
>Seems like even with handler, the big query process is still consuming all 
>my RAM and swap and being killed with error
>'VM: killing process mysql
>__alloc_pages: 0-order allocation failed (gfp=0x1d2/0)'
>
>I would like to find a startup parameter either for client or server to 
>limit per thread memory usage.

Can't help you there, sorry. I don't think that would work anyway, because 
MySQL is pulling the retrieved rows into memory to get a snapshot of the 
table, so memory consumption has to grow with the result set. If you cap the 
memory for the thread, it will simply hit a wall.

I thought Handler would work because it uses a cursor into the table and 
shouldn't be consuming much memory. You fetch 10,000 rows at a time, then 
fetch the next 10,000 rows, and so on; it should only need enough memory to 
hold 10,000 rows. Can you post the code showing how you're using Handler?
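
As a sketch, the fetch loop I have in mind looks like this (the table name
is a placeholder, and the READ NEXT would be repeated until it returns no
rows):

```sql
HANDLER big_table OPEN;
HANDLER big_table READ FIRST LIMIT 10000;  -- first batch, natural row order
HANDLER big_table READ NEXT LIMIT 10000;   -- repeat until no rows come back
HANDLER big_table CLOSE;
```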

The only other thing I can suggest is to put a select statement inside a 
loop and select on the primary key: say rcd_id 1 to 9999, export those rows, 
then select 10000 to 19999, and so on until you reach EOF. This should be 
quite fast and use little memory. I'd recommend the primary key rather than 
"select ... LIMIT <offset>,<limit>" because the offset forces MySQL to read 
past the first 'n' rows on every query, which gets slower as the offset 
grows. Because you're exporting only part of the table at a time, you won't 
get an exact snapshot if people are updating the table while you export, but 
memory use stays very low.
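
The range-export loop above can be sketched in Python. The `key_ranges`
helper and the in-memory `table` dict are illustrative stand-ins; in a real
export, each range would drive one `SELECT ... WHERE rcd_id BETWEEN lo AND hi`
against the server:

```python
def key_ranges(min_id, max_id, batch):
    """Yield inclusive (start, end) primary-key ranges covering min_id..max_id."""
    start = min_id
    while start <= max_id:
        yield start, min(start + batch - 1, max_id)
        start += batch

# Hypothetical stand-in for the staging table: rcd_id -> row data.
# A real export would run one SELECT per range instead of reading this dict.
table = {i: "row-%d" % i for i in range(1, 25001)}

exported = []
for lo, hi in key_ranges(1, max(table), 10000):
    # Stands in for: SELECT * FROM big_table WHERE rcd_id BETWEEN lo AND hi
    batch = [table[k] for k in range(lo, hi + 1) if k in table]
    exported.extend(batch)  # in practice: append the batch to the export file

print(len(exported))  # 25000
```

Because each query is bounded by the primary key, every iteration is an
index range scan and memory stays proportional to one batch, not the table.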

Mike


>-----Original Message-----
>From: mos [mailto:[EMAIL PROTECTED]
>Sent: Wednesday, September 01, 2004 4:39 PM
>To: [EMAIL PROTECTED]
>Subject: Re: tuning suggestion for large query
>
>
>At 10:40 AM 9/1/2004, you wrote:
> >Hi,
> >
> >We have a job that does 'select * from big-table' on a staging mysql
> >database and then dumps the data to a data warehouse. It is scheduled to
> >run once a day, but may also be run manually. We also have several other
> >small OLTP databases on the same server.
> >When the big job runs, it uses all of the physical memory and swap, and
> >all other processes slow down because of it.
> >
> >I would like to limit the resource usage of each mysql client so that it
> >can only use a certain maximum amount of RAM, and doesn't select everything
> >into memory before displaying it to users. However, I couldn't find any
> >parameter that would let me implement this.
> >Has anybody encountered the same issue before? Please share your experience.
> >
> >Thanks.
> >
> >-
>
>
>Try the Handler command with an Index if you need the results sorted. It
>may do the trick. :)
>
>Mike
>
>
>--
>MySQL General Mailing List
>For list archives: http://lists.mysql.com/mysql
>To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]




