Hi,

For an application I use SQLite as the data file. I have written a "compiler" 
(a chain of scripts on Linux) for creating my database file. There are 
dependencies between the tables, and "compiling" my single database takes 
about 1-2 hours. When there is an error, I have to restart the whole 
procedure. Very bad.

In order to overcome this problem, I divided my script into small chunks 
and use "make". Each script now takes only a few minutes and creates its own 
SQLite data file. When one script needs data from another file, it 
just uses "ATTACH DATABASE". Works fine.

BUT: I end up with 10 files instead of one, each with its own 
indices. For my application, however, I need one file.

My question now is: Is there a simple, fast and efficient way to just 
merge these databases into a single file?

The one solution I have is to recreate all tables (CREATE TABLE) in a 
new file, copy the data over with INSERT INTO ... SELECT (again using 
ATTACH DATABASE), and after that recreate every single index (a rough 
sketch follows after the list below).

But:
a) This takes very long.
b) I have to write the CREATE TABLE statements twice.
c) I have to write the CREATE INDEX statements twice.
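
For concreteness, a minimal sketch of that merge script (again with 
placeholder names; every CREATE statement duplicates one that already 
exists in a build script):

-- merge.sql, run as: sqlite3 final.dat < merge.sql
ATTACH DATABASE 'db1.dat' AS src1;
ATTACH DATABASE 'db2.dat' AS src2;

-- table definitions written a second time
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders    (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);

-- copy the data from the attached files
INSERT INTO customers SELECT * FROM src1.customers;
INSERT INTO orders    SELECT * FROM src2.orders;

-- index definitions written a second time
CREATE INDEX idx_orders_customer ON orders(customer_id);

DETACH DATABASE src1;
DETACH DATABASE src2;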

Really cool would be something like:

cat db1.dat db2.dat db3.dat > final.dat

;-)

Thank you in advance,
Luke
