> >  It's not important that the 2 db files are exactly the same all the time
> >  that people are editing them, but only when they 'finalise' a 'package'.
> >  So what if some code in the 'packaging' process performed a sequence of
> >  queries that read all the data from the db, table by table, and inserted
> >  it into a new db.
> >
> >  I don't mind the extra coding, and reluctantly can put up with the extra
> >  time taken to package at the end if need be.
>
> If that's the case, can't you just dump the database, say, as a text
> file containing a sequence of SQL statements? Sort by table name, and
> within each table by rowid.


Hi Igor,

Thanks for your response.

That is another option that I may have to pursue. Although if some of the data
is images, sounds, other programs etc., it would all have to be re-encoded as
text, which would nearly double the file size (a hex-encoded BLOB literal takes
two characters per byte). The file would then have to be re-parsed and
re-inserted into an SQLite db by each person who receives it, not just the
person creating it.

It is another thought I could look into: some other way of exporting the data
that would come out the same on any system. It just seems a shame, because a
large collection of data in one file (searchable and ready to go) is exactly
what databases, and especially SQLite, do so well.
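The table-by-table copy from the earlier mail can also be sketched without a
text intermediate, by attaching a fresh database file and copying each table
across in rowid order. This is only an illustration: the file names are
invented, ':memory:' stands in for the working db, and CREATE TABLE ... AS
SELECT keeps the data but drops indexes and constraints (a fuller version
would recreate the schema from sqlite_master first).

```python
import os
import sqlite3
import tempfile

# Hypothetical destination for the finalised package file.
pkg_path = os.path.join(tempfile.mkdtemp(), "final.db")

src = sqlite3.connect(":memory:")  # stands in for the working database
src.execute("CREATE TABLE media(id INTEGER PRIMARY KEY, payload BLOB)")
src.execute("INSERT INTO media(payload) VALUES (X'00FF')")
src.commit()

# Attach the (empty) package file and copy every table across,
# rowid order, so the packaged file is built the same way each time.
src.execute("ATTACH DATABASE ? AS pkg", (pkg_path,))
tables = [r[0] for r in src.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
for name in tables:
    # Copies data only; indexes/constraints would need recreating.
    src.execute(f"CREATE TABLE pkg.{name} AS "
                f"SELECT * FROM main.{name} ORDER BY rowid")
src.commit()
src.execute("DETACH DATABASE pkg")
src.close()
```

The result is a compact binary SQLite file, with no text round trip on the
recipient's side, at the cost of a little extra packaging code.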


Cheers,

David.

_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users