At 15:51 16/03/2007, you wrote:
Dennis,

  Yes, the data will be read later by downstream processing.

I do have the option of either putting the data into SQLite at the start (when it's read), or putting it into a flat file and then later loading it into a SQLite DB via a downstream job.

Many of the data columns are simple numeric values, and that's where SQLite really shines: its file format is portable between systems with differing endianness.

Here is a summary of the entire processing, where a1/b1 are different processes and probably run on different host platforms.

  a1. Read from the producing system and generate the output data (flat file or SQLite).
  a2. Compress the data file.
  a3. Transfer the compressed data file to the target system.
  a4. Go to a1.
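
The producer loop above might be sketched like this in Python (the table name, paths, and the use of gzip are illustrative assumptions, not from the original message):

```python
import gzip
import shutil
import sqlite3

def produce_batch(rows, out_path="batch.db"):
    """One iteration of steps a1-a2: write a batch to SQLite, then compress it.
    The 'samples' table layout is hypothetical."""
    # a1: generate the output data as a fresh SQLite file
    con = sqlite3.connect(out_path)
    con.execute("CREATE TABLE samples (id INTEGER PRIMARY KEY, value REAL)")
    con.executemany("INSERT INTO samples (value) VALUES (?)", rows)
    con.commit()
    con.close()

    # a2: compress the data file
    gz_path = out_path + ".gz"
    with open(out_path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    return gz_path

# a3 (transfer) would typically be an scp/rsync of the returned .gz file,
# and a4 just loops back to produce the next batch.
```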

  b1. Receive the data file and uncompress it.
  b2. Read the data file and load it into the Master DB. (This might just be a simple attach.)
  b3. Massage the data in the Master DB (???).
  b4. Read and process data from the Master DB; delete it or mark it as deletable.
  b5. Delete processed data from the Master DB. (This could be in a separate thread.)
  b6. Go to step b1.
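
On the receiving side, step b2's "simple attach" could look like this (again a sketch; the schema and file names are assumptions):

```python
import sqlite3

def load_batch(master_path, batch_path):
    """Step b2: attach a received batch file and copy its rows into the master.
    Alternatively, downstream steps could read straight from batch.samples
    while it stays attached, skipping the copy entirely."""
    con = sqlite3.connect(master_path)
    con.execute("ATTACH DATABASE ? AS batch", (batch_path,))
    con.execute(
        "CREATE TABLE IF NOT EXISTS samples (id INTEGER PRIMARY KEY, value REAL)"
    )
    con.execute("INSERT INTO samples (value) SELECT value FROM batch.samples")
    con.commit()
    con.execute("DETACH DATABASE batch")
    con.close()
```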

The nice thing about simply attaching the file as a database in step b2 is that once all of its data has been processed in step b4, the purge in step b5 becomes a simple detach plus an operating-system unlink of the underlying data file, which I suspect will be vastly faster than a SQL DELETE command.
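
That detach-and-unlink purge is a short sketch (it assumes the batch was only ever attached and processed in place, never merged into the master):

```python
import os
import sqlite3

def purge_batch(con, alias, batch_path):
    """Step b5 as detach + unlink: drops the whole batch with O(1) file
    operations instead of a row-by-row SQL DELETE."""
    con.execute(f"DETACH DATABASE {alias}")
    os.unlink(batch_path)
```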

  Thanks,
  Ken

Ken, I have the same scenario: the producing system generates data, and it is read by an SBC card. But, as Dennis said in his last message, you can have SQLite running on that card system and, from time to time, dump the database to the B system for backup. That is a lot easier than read/compress/transfer/insert across different machines.




-------------------------------------------------------------------------------------------
Useful Acronym : DMCA = Don't Make Content Accessible
