On 17/10/17 08:27, Adam Weremczuk wrote:
> On 16/10/2017 16:23, Alan Brown wrote:
>>
>> use the dbcheck utility to clean up the database. That's what it's
>> there for (make sure nothing else is running!)
>
> Hi Alan,
>
> I have no problem being patient and I can leave the job running for
> days over Xmas.
> My concern is that, no matter how much time I give it, it's going to
> terminate/hang/crash because of the 300k hard limit.
> The way it's constructed - is it guaranteed to eventually succeed no
> matter what the number of zombie records is?
Yes, it will. Don't try to raise the limit by messing with the source code; MySQL will bloat badly during the process.

That brings up a companion issue: the reason it's taking so long is that you have not tuned MySQL properly. Out of the box it is configured for systems with only 32MB (yes, MB) of RAM, so it ends up writing hundreds of temporary files when deleting records. Take the time to tune it and things will run more smoothly, but PostgreSQL will work even better and is almost entirely self-tuning.

More importantly, PostgreSQL doesn't suffer from MySQL's bloated memory footprint: it only takes as much memory as it actually needs, and on large databases it can be 20-30 times faster on the same system (particularly on inserts; for busy fileset restores we found it was the difference between 15-20 minutes consulting the catalog and 10 seconds).

MySQL is good at what it's designed for (small databases), but pushing it beyond its design criteria is unwise, as the maintenance effort and system load go up dramatically. If your DB is 13GB, you're well beyond those criteria.

Alan

> Regards
> Adam

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
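To make the dbcheck advice above concrete, a run of the kind discussed might look like the sketch below. The config path and the service name are assumptions, not taken from this thread; check your own installation, and note Alan's warning that nothing else should be touching the catalog while it runs.

```shell
# Stop the Director so nothing else uses the catalog while dbcheck runs
# (the unit name 'bacula-director' is an assumption; it varies by distro).
systemctl stop bacula-director

# -c : read the catalog credentials from the Director config
# -b : batch mode, no interactive prompts
# -f : actually fix/prune the orphaned records rather than just report them
dbcheck -b -f -c /etc/bacula/bacula-dir.conf

systemctl start bacula-director
```

On a badly bloated catalog this can take a long time per pass, which is where the tuning advice above comes in.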
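Alan's tuning point can also be sketched as a my.cnf fragment. The variable names are real MySQL/InnoDB settings, but the values are illustrative assumptions for a catalog server with a few GB of RAM, not recommendations from this thread; size them to your own hardware.

```ini
# /etc/mysql/my.cnf -- illustrative values only, assuming ~4GB RAM
[mysqld]
innodb_buffer_pool_size = 2G     # the main knob; the default is far too small
innodb_log_file_size    = 256M   # larger redo logs help bulk deletes/inserts
tmp_table_size          = 256M   # cut down on-disk temporary tables
max_heap_table_size     = 256M   # must be raised together with tmp_table_size
sort_buffer_size        = 4M     # per-connection; keep this one modest
```

Raising the buffer pool alone usually eliminates most of the temporary-file churn described above.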