Hello

I created a Perl script that might help. Because I like to have a full backup
of my databases as well as individual table backups (I know I can split the
full backup, but I'm too lazy ;), I wrote a Perl script that queries MySQL for
the table names in the database and then generates a shell script with a
separate mysqldump command for each table; once the script is generated, it
executes it. I think this will help in this case because it only asks MySQL to
dump one table at a time. With 10,000 tables the script will run for a long
time, but at least it will be automated. Contact me off-list and I will be
glad to send it to you.
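
In case it helps to see the idea, here is a minimal sketch of what such a
script can look like (not my actual script; it assumes the DBI and
DBD::mysql modules, and the database name, credentials, and dump_tables.sh
filename are all placeholders):

#!/usr/bin/perl -w
use strict;
use DBI;

# Placeholder connection details -- change these for your setup.
my $db   = "mydb";
my $user = "backup";
my $pass = "secret";

# Ask MySQL for every table name in the database.
my $dbh = DBI->connect("DBI:mysql:database=$db", $user, $pass,
                       { RaiseError => 1 });
my $tables = $dbh->selectcol_arrayref("SHOW TABLES");
$dbh->disconnect;

# Generate a shell script with one mysqldump command per table,
# so mysqldump only has one table's files open at a time.
open(SH, "> dump_tables.sh") or die "can't write dump_tables.sh: $!";
print SH "#!/bin/sh\n";
foreach my $table (@$tables) {
    print SH "mysqldump -u$user -p$pass $db $table > $table.sql\n";
}
close SH;

# Make the generated script executable and run it.
chmod 0755, "dump_tables.sh";
system("./dump_tables.sh") == 0 or die "dump script failed: $?";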

Have a great day...
John

Someone please correct me if my idea of how mysqldump runs (i.e., that it
will still try to open all the table files at once) is incorrect.


On Friday 15 June 2001 13:36, Dan Nelson wrote:
> In the last episode (Jun 15), SDiZ Cheng said:
> > When I use mysqldump, it seems that MySQL will open all the tables
> > in the database. But what I have is a database with over 10,000
> > tables, and I get a "Too many open files" error. I think increasing
> > open_files_limit is not possible, because of the limit of the OS
> > itself.
>
> Which OS are you running?  You should be able to open 20000 files
> (index+data) on most Unixes with a little tuning.  It may be slow, but
> it should work.  Alternatively, you can temporarily move tables 1000 at
> a time into another directory, mysqldump them, and move them back.
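
For anyone tempted to try that move-tables-around route, a rough Perl
sketch of it might look like the following (the paths, batch size, and
batch_N.sql filenames are made up; it assumes MyISAM tables with
.frm/.MYD/.MYI files, and flushes tables so mysqld isn't holding the old
file handles -- treat it as an outline, not a tested tool):

#!/usr/bin/perl -w
use strict;
use File::Copy qw(move);

# Made-up paths and batch size -- adjust for your installation.
my $datadir = "/var/lib/mysql/mydb";    # live database directory
my $parked  = "/var/lib/mysql/parked";  # temporary holding area
my $batch   = 1000;

mkdir $parked, 0700 unless -d $parked;

# Collect table names from the .frm files in the data directory.
opendir(DIR, $datadir) or die "can't read $datadir: $!";
my @tables = sort map { /^(.+)\.frm$/ ? $1 : () } readdir(DIR);
closedir DIR;

# Park all table files, then bring them back $batch at a time,
# dump the database (only that batch is visible), and park them again.
system("mysqladmin flush-tables");   # make mysqld let go of the files
foreach my $t (@tables) {
    move("$datadir/$t.$_", "$parked/$t.$_") for qw(frm MYD MYI);
}

my $n = 0;
while (my @chunk = splice(@tables, 0, $batch)) {
    foreach my $t (@chunk) {
        move("$parked/$t.$_", "$datadir/$t.$_") for qw(frm MYD MYI);
    }
    system("mysqldump mydb > batch_" . $n++ . ".sql") == 0
        or die "mysqldump failed: $?";
    system("mysqladmin flush-tables");
    foreach my $t (@chunk) {
        move("$datadir/$t.$_", "$parked/$t.$_") for qw(frm MYD MYI);
    }
}

# Put everything back when done.
opendir(DIR, $parked) or die "can't read $parked: $!";
move("$parked/$_", "$datadir/$_") for grep { -f "$parked/$_" } readdir(DIR);
closedir DIR;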
