Richard, there's no inherent problem at around 60 MB; I routinely dump data ranging from a few KB to a few GB.
One thing I notice is that you are using the mysql command, which is the interactive database client. You want mysqldump, the client program that dumps data from the database in SQL format.

Also, why are you sure that a 46 MB dump doesn't contain everything? Granted, your last dump was larger, but perhaps some unneeded data has since been removed?

Dan

On 4/11/07, Richard <[EMAIL PROTECTED]> wrote:
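A minimal sketch of what the backup command should look like, assuming placeholder credentials and database name (substitute your own; the --single-transaction flag only gives a consistent snapshot for InnoDB tables, use --lock-tables for MyISAM):

```shell
# Dump schema + data as SQL with mysqldump (NOT the interactive mysql client).
# "user", "password", and "databasename" are placeholders for your own values.
mysqldump -u user -p'password' --single-transaction databasename > backup_$(date +%F).sql

# mysqldump appends a "-- Dump completed" comment on success, so a quick
# sanity check that the dump ran to the end is:
tail -n 1 backup_$(date +%F).sql
```

Redirecting the output of the plain mysql client, as in your command, does not produce a dump at all: with no SQL statements on stdin it simply waits for input, which would explain the 0 MB file that never finishes.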
Hello,

I've got a problem with MySQL 5 on my Debian server. I host a forum on this server, and until the database reached about 60 MB I could dump it with either phpMyAdmin or with the command:

mysql -u user -p'password' databasename > backup_date.sql

My last backup that worked was about 56 MB, but since the database has grown past 60 MB my backup files from phpMyAdmin and mysqldump are only around 46 MB and therefore don't contain everything. Also, when I run the mysql command above it never finishes; even after waiting two hours the backup_date.sql file is still 0 MB. The forum itself runs fine. I use no compression and save the file in plain .sql text format.

Any ideas as to why this happens or how I can fix it would be great! I've gone through my my.cnf file and I can't see any setting that would block this. If you need any further information, please let me know.

Thanks in advance,
Richard

--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]