Re: mysqldump on large database..

2001-06-16 Thread SDiZ Cheng

- Original Message -
> In the last episode (Jun 15), SDiZ Cheng said:
> > When I use mysqldump, it seems that MySQL will open all the tables in
> > the database. But what I have is a database with over 10,000 tables.
> > I get a "Too many open files" error on this. I think increasing
> > open_file_limit is not possible, because of the limit of the OS itself.
>
> Which OS are you running?  You should be able to open 2 files per table
> (index+data) on most Unixes with a little tuning.

I am using Linux. Yes, I can do that with a simple
  echo 4294967295 > /proc/sys/fs/file-max
but I don't want to see my system run out of memory.
As the kernel docs say, the memory used for file handles is never freed
again; the handles can be reused, but the problem is that I won't be
doing this more than once.
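
For reference, a more conservative tuning sketch (the numbers, and the
assumption that mysqld is started from the same shell, are only examples):

  # see how many file handles are currently allocated / free / allowed
  cat /proc/sys/fs/file-nr

  # raise the system-wide ceiling only as far as the dump needs
  # (10,000 MyISAM tables x 2 open files each, plus headroom)
  echo 65536 > /proc/sys/fs/file-max

  # raise the per-process descriptor limit in the shell that starts mysqld
  ulimit -n 65536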

> It may be slow, but it should work.  Alternatively, you can temporarily
> move tables 1000 at a time into another directory, mysqldump them, and
> move them back.

Thanks for your suggestion.. I will try it =)






Re: mysqldump on large database..

2001-06-15 Thread Dan Nelson

In the last episode (Jun 15), SDiZ Cheng said:
> When I use mysqldump, it seems that MySQL will open all the tables in
> the database. But what I have is a database with over 10,000 tables.
> I get a "Too many open files" error on this. I think increasing
> open_file_limit is not possible, because of the limit of the OS itself.

Which OS are you running?  You should be able to open 2 files per table
(index+data) on most Unixes with a little tuning.  It may be slow, but
it should work.  Alternatively, you can temporarily move tables 1000 at
a time into another directory, mysqldump them, and move them back.
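
A rough shell sketch of that batch approach (the datadir path, the
database names, and the MyISAM .frm/.MYD/.MYI file layout are
assumptions; the tables should be flushed or the server idle while the
files are moved):

  #!/bin/sh
  # Move table files into a scratch database 1000 at a time, dump that
  # database, then move the files back.
  DATADIR=/var/lib/mysql
  DB=bigdb
  mysql -e "CREATE DATABASE IF NOT EXISTS scratch"
  cd $DATADIR/$DB || exit 1
  ls | sed -n 's/\.frm$//p' | split -l 1000 - /tmp/batch.
  for list in /tmp/batch.*; do
      while read t; do
          mv "$t.frm" "$t.MYD" "$t.MYI" $DATADIR/scratch/
      done < $list
      mysqldump scratch >> /tmp/$DB-dump.sql
      while read t; do
          mv $DATADIR/scratch/"$t."* .
      done < $list
  done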

-- 
Dan Nelson
[EMAIL PROTECTED]





Re: mysqldump on large database..

2001-06-15 Thread John Cichy

Hello

I created a Perl script that might help. Because I like to have both a full
backup of my databases and per-table backups (I know I can split the full
backup, but I'm too lazy ;), I wrote a Perl script that queries MySQL for
the table names in the database, creates a shell script with an individual
mysqldump command for each table, and then executes that script. I think
this will help in this case because it only asks MySQL to dump one table
at a time. With 10,000 tables the script will run for a long time, but at
least it will be automated. Contact me off the list and I will be glad to
send it to you.
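
In shell terms the idea is roughly this (not John's actual script; the
database name and output path are placeholders):

  #!/bin/sh
  # Dump each table on its own, so mysqldump only opens one table at a time.
  DB=mydb
  OUT=/backup/$DB-tables.sql
  : > $OUT
  mysql -B -e "SHOW TABLES" $DB | sed 1d | while read t; do
      mysqldump $DB "$t" >> $OUT
  done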

Have a great day...
John

Someone please correct me if my idea of how mysqldump runs (i.e. it will 
still attempt to open all files) is incorrect.


On Friday 15 June 2001 13:36, Dan Nelson wrote:
> In the last episode (Jun 15), SDiZ Cheng said:
> > When I use mysqldump, it seems that MySQL will open all the tables in
> > the database. But what I have is a database with over 10,000 tables.
> > I get a "Too many open files" error on this. I think increasing
> > open_file_limit is not possible, because of the limit of the OS itself.
>
> Which OS are you running?  You should be able to open 2 files per table
> (index+data) on most Unixes with a little tuning.  It may be slow, but
> it should work.  Alternatively, you can temporarily move tables 1000 at
> a time into another directory, mysqldump them, and move them back.





mysqldump on large database..

2001-06-14 Thread SDiZ Cheng

When I use mysqldump, it seems that MySQL will open all the tables in
the database. But what I have is a database with over 10,000 tables.
I get a "Too many open files" error on this. I think increasing
open_file_limit is not possible, because of the limit of the OS itself.

Can anyone suggest an alternative to mysqldump?

--
Cheng Yuk Pong (SDiZ) 
4096/1024 DH/DSS   0xA4C6FAD3
FE28 E6D4 AD21 5D4F F07B  EEA6 3C88 5DBB A4C6 FAD3







Re: mysqldump on large database..

2001-06-14 Thread Seung-woo Nam

Wow! What kind of database would have 10,000 tables? Wouldn't you think
the database design had a serious flaw in the first place?

Seung-woo Nam

SDiZ Cheng wrote:

> When I use mysqldump, it seems that MySQL will open all the tables in
> the database. But what I have is a database with over 10,000 tables.
> I get a "Too many open files" error on this. I think increasing
> open_file_limit is not possible, because of the limit of the OS itself.
>
> Can anyone suggest an alternative to mysqldump?
>
> --
> Cheng Yuk Pong (SDiZ)
> 4096/1024 DH/DSS   0xA4C6FAD3
> FE28 E6D4 AD21 5D4F F07B  EEA6 3C88 5DBB A4C6 FAD3










Re: mysqldump on large database..

2001-06-14 Thread SDiZ Cheng

Yes, I know, but that's not my design.
That will be changed in a few months.
My job now is to dump all of them out.

- Original Message - 
> Wow! What kind of database would have 10,000 tables? Wouldn't you think
> the database design had a serious flaw in the first place?
>
> Seung-woo Nam
>
> SDiZ Cheng wrote:
>
> > When I use mysqldump, it seems that MySQL will open all the tables in
> > the database. But what I have is a database with over 10,000 tables.
> > I get a "Too many open files" error on this. I think increasing
> > open_file_limit is not possible, because of the limit of the OS itself.
> >
> > Can anyone suggest an alternative to mysqldump?
> >
> > --
> > Cheng Yuk Pong (SDiZ)
> > 4096/1024 DH/DSS   0xA4C6FAD3
> > FE28 E6D4 AD21 5D4F F07B  EEA6 3C88 5DBB A4C6 FAD3

