*src* can be an in-memory database too :)
For the backup, I just use:

    destdb = apsw.Connection(__DATABASE_FILE__)
    with destdb.backup("main", _connection, "main") as backup:
        while not backup.done:
            backup.step(100)

It's a bit faster.
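For reference, Python's standard-library sqlite3 module gained an equivalent online-backup API in Python 3.7, so the same incremental pattern works without APSW. A minimal sketch (the table, rows, and destination path below are made up for illustration):

```python
import os
import sqlite3
import tempfile

# A throwaway in-memory database standing in for the real source.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO items (name) VALUES (?)", [("a",), ("b",)])
src.commit()

# Dump it to a file on disk, 100 pages per step, mirroring backup.step(100).
dest_path = os.path.join(tempfile.gettempdir(), "backup-demo.sqlite")
if os.path.exists(dest_path):
    os.remove(dest_path)
dest = sqlite3.connect(dest_path)
src.backup(dest, pages=100)
dest.close()
```

With `pages=100`, sqlite3 copies the database in chunks and briefly releases the source lock between chunks, which is the same trade-off the APSW loop above makes.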
Hi everyone,
First, I would like to thank you for your comments.
Then, for those who run Linux, the following works:
1- copy your db.sqlite to /dev/shm/db.sqlite (i.e., into a RAM fs)
2- play with it through SQLAlchemy
3- back it up to disk when needed
src = apsw.Connection("/dev/shm/db.sqlite")
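The three steps above can be sketched with only the standard-library sqlite3 module (file names are hypothetical, and the tmpfs path falls back to a plain temp directory where /dev/shm doesn't exist):

```python
import os
import shutil
import sqlite3
import tempfile

# Step 0: a hypothetical on-disk database to start from.
disk_path = os.path.join(tempfile.gettempdir(), "db.sqlite")
if os.path.exists(disk_path):
    os.remove(disk_path)
boot = sqlite3.connect(disk_path)
boot.execute("CREATE TABLE counters (n INTEGER)")
boot.commit()
boot.close()

# Step 1: copy the db into a RAM filesystem (Linux tmpfs).
ram_dir = "/dev/shm" if os.path.isdir("/dev/shm") else tempfile.gettempdir()
ram_path = os.path.join(ram_dir, "db-ram.sqlite")
shutil.copyfile(disk_path, ram_path)

# Step 2: work against the fast RAM-backed copy
# (SQLAlchemy would simply point its engine at ram_path).
ram = sqlite3.connect(ram_path)
ram.execute("INSERT INTO counters VALUES (1)")
ram.commit()

# Step 3: dump the RAM copy back to disk with the online backup API.
disk = sqlite3.connect(disk_path)
ram.backup(disk)
disk.close()
ram.close()
```

The backup API in step 3 is safer than a plain file copy because it takes a consistent snapshot even if other connections are writing.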
Well, what Jeff wrote is also true.
I do hot copies of databases because we have a set of products with
fully automated builds, and, to increase performance, I made the build
generate the SQLite database in memory and then dump it to the filesystem.
Cheers,
Richard.
Hi,
I'm currently running several Python applications (each using
SQLAlchemy) that access (read/write) a single SQLite database stored on disk.
For performance reasons, I would like to store this db file in RAM
(i.e., in my /dev/shm).
The applications would then access a shared in-memory
Hi Pierre!
SQLAlchemy doesn't do that, because it depends on the underlying
connection layer, but I have already had this question and made a solution.
You simply have to use another SQLite library, APSW:
https://code.google.com/p/apsw/
There is some material on the web (Stack Exchange) that