Best way to purge old records from a huge table?

2010-06-04 Thread Brian Dunning
Hey all - I have a table with 12,000,000 records spread over about 6 years. I'm trying to delete all but the last 2 years, but no matter how small a group I try to delete at a time, it keeps hanging up the server and I eventually have to restart MySQL. The table looks like this:

Re: Best way to purge old records from a huge table?

2010-06-04 Thread Ananda Kumar
Don't use a single delete statement. Use a stored proc: loop through and delete the rows, committing every 10k. That way your MySQL will not hang, and if you have replication set up, the slave will not lag behind either. regards anandkl On Fri, Jun 4, 2010 at 8:40 PM, Brian Dunning
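
A minimal sketch of that chunked approach, assuming MySQL 5.0+ for stored procedures and a table named stats with an indexed created column (both names hypothetical):

DELIMITER //
CREATE PROCEDURE purge_old_rows()
BEGIN
  DECLARE done INT DEFAULT 0;
  REPEAT
    -- Delete one small chunk per pass so locks stay short-lived
    DELETE FROM stats WHERE created < NOW() - INTERVAL 2 YEAR LIMIT 10000;
    -- Capture the row count before COMMIT resets it
    SET done = (ROW_COUNT() = 0);
    COMMIT;
  UNTIL done END REPEAT;
END //
DELIMITER ;

CALL purge_old_rows();

With autocommit on, each DELETE commits on its own anyway; the explicit COMMIT matters when autocommit is off.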

Re: Best way to purge old records from a huge table?

2010-06-04 Thread Krishna Chandra Prajapati
Hi Brian, I would suggest you use mk-archiver (from the Maatkit tools) for this activity. http://www.percona.com/files/presentations/Make_Life_Easier_Maatkit_v2.pdf Regards, Krishna On Fri, Jun 4, 2010 at 8:40 PM, Brian Dunning br...@briandunning.com wrote: Hey all - I have a table with

RE: Best way to purge old records from a huge table?

2010-06-04 Thread Martin Gainty
nor any legally binding effect. Since emails can easily be subject to manipulation, we cannot accept any responsibility for the content provided. From: br...@briandunning.com Subject: Best way to purge old records from a huge table? Date

Re: Best way to purge old records from a huge table?

2010-06-04 Thread Johan De Meersman
subject to manipulation, we cannot accept any responsibility for the content provided. From: br...@briandunning.com Subject: Best way to purge old records from a huge table? Date: Fri, 4 Jun 2010 08:10:07 -0700 To: mysql@lists.mysql.com Hey all - I have a table with 12,000,000

Re: Best way to purge old records from a huge table?

2010-06-04 Thread Shawn Green
Brian Dunning wrote: Hey all - I have a table with 12,000,000 records spread over about 6 years. I'm trying to delete all but the last 2 years, but no matter how small of a group I try to delete at a time, it keeps hanging up the server and I eventually have to restart MySQL. The table looks

mysqlimport huge table

2009-02-24 Thread Shuly Avraham
Hi, I have a huge table with about 50 million rows. I dumped the table using mysqldump -T as a text dump, and now I am trying to import it into a database on another server, but it keeps hanging. Are there any options or server variables I can set to help out with it? Thanks, Shuly.

Re: mysqlimport huge table

2009-02-24 Thread Baron Schwartz
Hello Shuly, Try this. http://www.mysqlperformanceblog.com/2008/07/03/how-to-load-large-files-safely-into-innodb-with-load-data-infile/ On Tue, Feb 24, 2009 at 1:08 PM, Shuly Avraham sh...@cshl.edu wrote: Hi, I have a huge table with about 50 million rows. I dumped the table using
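
A rough sketch of the usual recipe for this kind of bulk load (turn off redundant checking, load inside one controlled transaction), assuming an InnoDB target and a file /tmp/mytable.txt produced by mysqldump -T, whose format matches the LOAD DATA defaults (path and table name hypothetical):

SET autocommit = 0;
SET unique_checks = 0;       -- skip secondary unique-key checks during the load
SET foreign_key_checks = 0;  -- skip FK validation during the load
LOAD DATA INFILE '/tmp/mytable.txt' INTO TABLE mytable;
COMMIT;
SET unique_checks = 1;
SET foreign_key_checks = 1;

For very large files, splitting the dump into pieces and loading them one transaction at a time keeps any single transaction from growing unmanageable.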

Re: mysqlimport huge table

2009-02-24 Thread Shuly Avraham
-large-files-safely-into-innodb-with-load-data-infile/ On Tue, Feb 24, 2009 at 1:08 PM, Shuly Avraham sh...@cshl.edu wrote: Hi, I have a huge table with about 50 million rows. I dumped the table using mysqldump -T as a text dump, and now I am trying to import it into a database on another server

[X-POST] Fastest way to dump this huge table

2007-05-02 Thread Brian Dunning
I have a huge MySQL table, 2.1 million records, 200MB. Once a week I need to dump it in CSV format and zip the file. This is not on my server, and it's in production, so I don't want to risk testing different methods and possibly hanging up their server for a period of time, so I wanted to

RE: [X-POST] Fastest way to dump this huge table

2007-05-02 Thread Jay Blanchard
[snip] I have a huge MySQL table, 2.1 million records, 200MB. Once a week I need to dump it in CSV format and zip the file. This is not on my server, and it's in production, so I don't want to risk testing different methods and possibly hanging up their server for a period of time, so I

Re: [X-POST] Fastest way to dump this huge table

2007-05-02 Thread Ryan Stille
I use a script like this:

#!/bin/bash
DATE=`date +%A`
DESTFILE=/home/mysql-backups/mysql-dump-$DATE
/usr/bin/mysqldump --skip-extended-insert -uroot -ppassword mydatabase > $DESTFILE.sql
/usr/bin/zip -qjTm $DESTFILE.zip $DESTFILE.sql

I end up with: mysql-dump-Friday.zip

Re: [X-POST] Fastest way to dump this huge table

2007-05-02 Thread Dan Buettner
A few observations: 1 - if the table is in the InnoDB format, you aren't going to lock up their server, as InnoDB doesn't do table locking. SHOW TABLE STATUS LIKE 'tablename' will tell you. 2 - Ryan's mysqldump script looks useful - also, there's a little-used option with mysqldump that lets
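
Since the weekly job only needs CSV, SELECT ... INTO OUTFILE writes the file server-side in a single pass and avoids post-processing a mysqldump; a sketch, assuming a table named hits (hypothetical) and a MySQL user with the FILE privilege:

SELECT *
INTO OUTFILE '/tmp/hits.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM hits;

Note the file is created on the database server, not the client, and on MyISAM the table stays read-locked for the duration of the query, which is exactly the locking concern raised here.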

Re: [X-POST] Fastest way to dump this huge table

2007-05-02 Thread Brian Dunning
The table is MyISAM, does that matter? On May 2, 2007, at 7:28 AM, Dan Buettner wrote: A few observations: 1 - if the table is in the InnoDB format, you aren't going to lock up their server, as InnoDB doesn't do table locking. SHOW TABLE STATUS LIKE 'tablename' will tell you. 2 -

Re: [X-POST] Fastest way to dump this huge table

2007-05-02 Thread Dan Buettner
MyISAM does table level locking, which is to say that read (select) and write (insert/update/delete) cannot happen at the same time. One will wait for the other. If your select takes 10 seconds, then any write operations will block for those 10 seconds. Other read processes should be
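
Both the engine check and the blocking behaviour are easy to see from the mysql client (table name hypothetical):

-- The Engine column shows whether the table is MyISAM or InnoDB
SHOW TABLE STATUS LIKE 'hits';
-- While a long SELECT runs against a MyISAM table, blocked writers
-- appear here with State = 'Locked'
SHOW FULL PROCESSLIST;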

Re: [PHP] [X-POST] Fastest way to dump this huge table

2007-05-02 Thread Brian Dunning
Thanks to everyone who answered, think I've got enough info now to handle it. :)

Huge table

2005-10-25 Thread Luke Vanderfluit
Hi. I have upgraded RT (Request Tracker). I'm using MySQL 4.0.18 and Apache 1.3.27 (with a static mod_perl module). The database we are using is quite large. The main problem we are having is that certain processes can take quite a long time. This is due to MySQL churning through data. I'd like to

Re: Huge table

2005-10-25 Thread Brent Baisley
I would check the slow query log. Queries that take a long time are logged there. You can check your MySQL configuration file to find out how long a query must run before it's considered slow, and where the log is written. On Oct 25, 2005, at 3:16 AM, Luke

Re: Huge table

2005-10-25 Thread Gleb Paharenko
Hello. MySQL lets you track down the slow queries. Have a look here: http://dev.mysql.com/doc/mysql/en/Slow_query_log.html Luke Vanderfluit wrote: Hi. I have upgraded RT (Request Tracker). I'm using MySQL 4.0.18 and Apache 1.3.27 (with a static mod_perl module). The database we are
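
On the 4.0 series the slow log is configured at server startup, via log-slow-queries and long_query_time in the [mysqld] section of my.cnf; what is currently in effect can be checked from SQL:

-- Threshold in seconds beyond which a query is logged as slow
SHOW VARIABLES LIKE 'long_query_time';
-- Whether slow query logging is enabled at all
SHOW VARIABLES LIKE 'log_slow_queries';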

R: R: connectorJ huge table problem

2004-06-24 Thread Leonardo Francalanci
Leonardo, Earlier you said that a statement without a parameter ('?') works, but one with a parameter doesn't. In both cases, are you still using a PreparedStatement? -Mark Yes, it is always a PreparedStatement. If you need more details (or you want me to try something else)

connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
I have a large table (64,000,000 rows). Everything was fine when the table was 16,000,000 rows; now Connector/J crashes: java.sql.SQLException: java.lang.ArrayIndexOutOfBoundsException: 6 at com.mysql.jdbc.ServerPreparedStatement.executeInternal(ServerPreparedStatement.java:908)

RE: connectorJ huge table problem

2004-06-22 Thread Victor Pendleton
Are you using a streaming connection? -Original Message- From: Leonardo Francalanci To: Mysql Sent: 6/22/04 7:43 AM Subject: connectorJ huge table problem I have a large table (64,000,000 rows). Everything was fine when the table was 16,000,000 rows, now connectorJ crashes

R: connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
Are you using a streaming connection? ehm... how do I know? My conn string is jdbc:mysql://192.168.0.253:3000/LETSPARTY1?autoReconnect=yes where 192.168.0.253 is a machine on my LAN...

R: connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
] Sent: Tuesday, 22 June 2004 14:43 To: Mysql Subject: connectorJ huge table problem I have a large table (64,000,000 rows). Everything was fine when the table was 16,000,000 rows; now Connector/J crashes: java.sql.SQLException: java.lang.ArrayIndexOutOfBoundsException: 6

RE: connectorJ huge table problem

2004-06-22 Thread Paul McNeil
8:55 AM To: 'Mysql ' Subject: R: connectorJ huge table problem Are you using a streaming connection? ehm... how do I know? My conn string is jdbc:mysql://192.168.0.253:3000/LETSPARTY1?autoReconnect=yes where 192.168.0.253 is a machine on my LAN...

R: connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
Dude. I wouldn't post your server ip here. ouch. Well, it's a LAN address!

RE: connectorJ huge table problem

2004-06-22 Thread Victor Pendleton
was non streaming. -Original Message- From: Leonardo Francalanci To: 'Mysql ' Sent: 6/22/04 7:55 AM Subject: R: connectorJ huge table problem Are you using a streaming connection? ehm... how do I know? My conn string is jdbc:mysql://192.168.0.253:3000/LETSPARTY1?autoReconnect=yes

R: connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
A non-streaming connection will place the resultset in memory. With large tables you can run out of memory and the application will stop. I fetch only one record at a time (accessing by key), so I don't think it could be an issue. Thank you anyway!

R: connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
of days... -Original Message- From: Leonardo Francalanci [mailto:[EMAIL PROTECTED] Sent: Tuesday, 22 June 2004 15:09 To: Mysql Subject: R: connectorJ huge table problem I also get weird results: I access data by ID, which is a primary key in the table. Sometimes (again, very

Re: connectorJ huge table problem

2004-06-22 Thread Mark Matthews
Leonardo Francalanci wrote: I have a large table (64,000,000 rows). Everything was fine when the table was 16,000,000 rows; now Connector/J crashes: java.sql.SQLException: java.lang.ArrayIndexOutOfBoundsException: 6 at

Re: R: connectorJ huge table problem

2004-06-22 Thread Alec . Cawley
Leonardo Francalanci [EMAIL PROTECTED] wrote on 22/06/2004 14:26:54: A non-streaming connection will place the resultset in memory. With large tables you can run out of memory and the application will stop. I fetch only one record at a time (accessing by key), so I don't think it could be

R: R: connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
It looks as if you are overcomplicating this anyway. Isn't what you want SELECT * FROM PARTITIONED_B ORDER BY RAND() LIMIT 1; which picks a single random record? No, I have to know which ID I will pick before doing the select (because I split my data, and when I access the split
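
For the record, the usual index-friendly way to pick a random row from a big table, which also yields the ID before the final select, is to dice against the key range instead of sorting; a sketch assuming a numeric primary key ID with few gaps:

-- Find the top of the key range, pick a random point in it,
-- then fetch the first existing row at or after that point.
SELECT @max := MAX(ID) FROM PARTITIONED_B;
SET @r = FLOOR(1 + RAND() * @max);
SELECT * FROM PARTITIONED_B WHERE ID >= @r ORDER BY ID LIMIT 1;

Rows that follow gaps in the ID sequence are picked slightly more often, which is usually an acceptable trade for avoiding the full-table sort that ORDER BY RAND() forces.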

R: connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
What happens if you use a nightly snapshot of Connector/J from http://downloads.mysql.com/snapshots.php? I tried, but I still have the same problem: java.sql.SQLException: java.lang.ArrayIndexOutOfBoundsException: 6 at

Re: R: R: connectorJ huge table problem

2004-06-22 Thread Richard F. Rebel
Next time you come over, maybe you can help hold him or take the shot. He's such a pain to take pictures of, he walks right up to the camera. On Tue, 2004-06-22 at 09:50, Leonardo Francalanci wrote: It looks as if you are being over-complex anyway. Isn't what you want SELECT * FROM

R: R: R: connectorJ huge table problem

2004-06-22 Thread Leonardo Francalanci
Next time you come over, maybe you can help hold him or take the shot. He's such a pain to take pictures of, he walks right up to the camera. Mmmh... my English is very bad... I can't understand... is it a joke or just a mistake?

Re: R: R: R: connectorJ huge table problem

2004-06-22 Thread SGreen

Re: R: R: connectorJ huge table problem

2004-06-22 Thread Fagyal, Csongor
Richard F. Rebel wrote: Next time you come over, maybe you can help hold him or take the shot. He's such a pain to take pictures of, he walks right up to the camera. Yeah, I also have problems taking snapshots of my cat... use FEL to pre-flash so he won't blink when you take the actual

Re: R: connectorJ huge table problem

2004-06-22 Thread Mark Matthews
Leonardo Francalanci wrote: What happens if you use a nightly snapshot of Connector/J from http://downloads.mysql.com/snapshots.php? I tried, but I still have the same problem: java.sql.SQLException: java.lang.ArrayIndexOutOfBoundsException: 6

DELETE on a huge table; how long *should* it take?

2003-12-05 Thread Chris Elsworth
Hello, I have quite a large table, 45 million rows, which has 3 indexes on it. The rows are evenly distributed across one particular index, which records the time the row was inserted. At any given time there are between 20 and 21 days' worth of rows, and every night I delete anything over 20 days old.
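
A sketch of the nightly purge with the delete chunked so no single statement holds locks for long, assuming the indexed timestamp column is named inserted (table and column names hypothetical):

-- Run repeatedly (loop or cron) until it affects 0 rows
DELETE FROM big_table
WHERE inserted < DATE_SUB(NOW(), INTERVAL 20 DAY)
LIMIT 10000;

One caution: DELETE ... LIMIT without ORDER BY removes an arbitrary matching chunk, which is fine here since every matching row is going away anyway.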

Re: DELETE on a huge table; how long *should* it take?

2003-12-05 Thread Dobromir Velev
PROTECTED] Sent: Friday, December 05, 2003 12:42 Subject: DELETE on a huge table; how long *should* it take? Hello, I have quite a large table, 45 million rows, which has 3 indexes on it. The rows are evenly distributed across one particular index, which records the time the row was inserted

CREATE INDEX problems on sparse column of huge table

2003-03-23 Thread Steve Allaway
I am trying to create indexes on several columns of a huge table in MySQL 4.0.12 running on Windows 2000 Server. The table contains 42 million rows and occupies about 7G of disk space. The server is a 1.5GHz P4 with 1G RAM, and performs fairly well - the LOAD DATA INFILE takes from 22 - 28
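
Two things tend to dominate index builds on a MyISAM table this size. First, MyISAM builds an index by filesort only when myisam_sort_buffer_size (and myisam_max_sort_file_size) allow it, and falls back to the far slower key-cache method otherwise; on 4.0.3 or later the buffer can be raised for the session. Second, each ALTER TABLE rewrites the whole table, so adding all the indexes in one statement avoids repeated rewrites. A sketch, with illustrative sizes and hypothetical names:

SET myisam_sort_buffer_size = 268435456;  -- 256M, for this session's index build
ALTER TABLE big_table
  ADD INDEX idx_a (col_a),
  ADD INDEX idx_b (col_b),
  ADD INDEX idx_c (col_c);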

HUGE Table Help

2001-11-21 Thread Kevin Smith
Hi All, I'm considering making a huge table, by that I mean approx. 26 million records, which will hold static data; no updates or deletes will be performed on this table, only SELECT statements. The schema of the table will be as follows.. The address fields will be 56 characters long

Performance Problems with Huge Table

2001-05-05 Thread Carlos Fernando Scheidecker Antunes
I am having performance problems on my server with larger tables. I have a 512MB Pentium III Red Hat Linux 7 server running MySQL version 3.23.32 as packaged by Red Hat. This server's /etc/my.cnf is as follows: [mysqld] pid-file=/var/run/mysqld/mysqld.pid datadir=/var/lib/mysql port=3306
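
On a 3.23-era MyISAM server with 512MB of RAM, key_buffer (the index-block cache) is usually the first setting to look at; in my.cnf of that vintage it is raised with a line like set-variable = key_buffer=128M under [mysqld] (size illustrative). Whether the current buffer is big enough shows up in the key cache counters:

SHOW VARIABLES LIKE 'key_buffer_size';
-- A high ratio of Key_reads to Key_read_requests means many
-- index lookups are going to disk: the buffer is too small
SHOW STATUS LIKE 'Key_read%';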

Re: Performance Problems with Huge Table

2001-05-05 Thread ryc
you want info from the vendor table included in queries involving the parts table. ryan - Original Message - From: Carlos Fernando Scheidecker Antunes [EMAIL PROTECTED] To: MySQL List [EMAIL PROTECTED] Sent: Saturday, May 05, 2001 4:53 PM Subject: Performance Problems with Huge Table I am