On Wed, 04 May 2016 11:44:17 +0100
"Rob Willett" <rob.sqlite at robertwillett.com> wrote:

> Hi,
> 
> We think we know the answer to this, but we'll ask the question 
> anyway.
> 
> We're trying to back up a 10GB live running database 
> "as-fast-as-we-possibly-can" without stopping updates coming in. The 
> updates come every 2-3 minutes and write a chunk of data each time. We 
> can't really stop the database updates; well, we can, but we don't want to.
> 
> 1. We had a quick look to see if we could copy the sqlite file over
> in the short interval between updates, but sadly cp simply wasn't fast 
> enough. We get around 3GB copied before an update happens, which 
> basically renders the cp useless.
> 
> 2. If we use the command line sqlite3 <filename> .dump >
> <backupfilename> it works, but it's very slow.
> 
> 3. Using the Sqlite C API works but is also very slow.
> 
> 4. We don't have the option of an LVM snapshot as the file system is 
> in a Container <sigh>.
> 
> So is there any other method of doing a quick snapshot? Failing that, 
> our solution will be to stop any updates for the duration of the cp 
> command, and then restart the process afterwards. It's not the end of
> the world, but it would have to be done outside normal working hours.
> 
> This is going to become a bigger problem for us as the database will 
> only get bigger, so any advice is welcome.
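[For what it's worth, point 3 presumably refers to the SQLite Online Backup API (sqlite3_backup_*). One thing worth checking is whether the backup is being done in a single step: copying the whole 10GB in one step holds a read lock on the source for the entire copy, whereas stepping through a limited number of pages at a time releases the lock between steps, so writers are not blocked for the duration (at the cost of the backup restarting if the source changes mid-copy). A minimal sketch using Python's sqlite3 wrapper around that API; the database contents and page count here are made up for illustration:]

```python
import os
import sqlite3
import tempfile

# Build a small throwaway source database (stands in for the live 10GB one).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE t (x)")
src.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])
src.commit()

# Copy 100 pages per step; between steps the read lock on the source is
# released, so writers on a live database are not blocked for the whole
# copy (the backup restarts automatically if the source changes meanwhile).
snapshot_path = os.path.join(tempfile.mkdtemp(), "snapshot.db")
dst = sqlite3.connect(snapshot_path)
src.backup(dst, pages=100, sleep=0.05)

print(dst.execute("SELECT count(*) FROM t").fetchone()[0])  # -> 1000
```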

If you only want the data, you can attach/open a new db file, create the schema 
without indexes, select all data from the tables and insert it into the new db's 
tables. Because you don't write the indexes, it should be faster. If you need 
the indexes, you can create them later.
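[The attach-and-copy approach above can be sketched with Python's sqlite3 module; the table name, schema, and paths here are invented for illustration:]

```python
import os
import sqlite3
import tempfile

# Throwaway "live" database with one indexed table (stands in for the real one).
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT);
    CREATE INDEX idx_events_payload ON events(payload);
    INSERT INTO events (payload) VALUES ('a'), ('b'), ('c');
""")

# Attach an empty target file, recreate the table WITHOUT its index,
# and bulk-copy the rows in a single INSERT ... SELECT.
backup_path = os.path.join(tempfile.mkdtemp(), "backup.db")
src.execute(f"ATTACH DATABASE '{backup_path}' AS bak")
src.execute("CREATE TABLE bak.events (id INTEGER PRIMARY KEY, payload TEXT)")
src.execute("INSERT INTO bak.events SELECT * FROM events")
src.commit()

# The index can be rebuilt later, once the fast data copy is done.
src.execute("CREATE INDEX bak.idx_events_payload ON events(payload)")
src.commit()

print(src.execute("SELECT count(*) FROM bak.events").fetchone()[0])  # -> 3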


> Thanks
> 
> Rob
---   ---
Eduardo Morras <emorrasg at yahoo.es>
