Howdy,

I'm aggregating data from several SQLite files into a Postgres db.
The SQLite files are the storage for several apps I use: Shotwell,
Firefox, Zotero, Banshee ... I only read from them, never write.

So far I've been using Python's sqlite3 module: I dump the SQL schema
from each SQLite file and use it to create the Postgres tables, then
add columns to meet my own needs. I can now diff two SQLite files, so
I know which rows need updating or adding in the Postgres tables.
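
Here's roughly what the pull side looks like today (heavily
simplified; the table, column, and connection names below are made up
for illustration, and the upsert assumes a Postgres new enough for
ON CONFLICT):

    import sqlite3
    import psycopg2

    src = sqlite3.connect("/path/to/photo.db")            # one of the app files
    dst = psycopg2.connect("dbname=aggregate user=kent")  # hypothetical DSN

    # Pull every row from the SQLite table and upsert it into the
    # matching Postgres table, so changed rows get updated and new
    # rows get added.
    with dst, dst.cursor() as cur:
        for row in src.execute("SELECT id, filename, timestamp FROM PhotoTable"):
            cur.execute(
                """
                INSERT INTO photo (id, filename, timestamp)
                VALUES (%s, %s, %s)
                ON CONFLICT (id) DO UPDATE
                    SET filename = EXCLUDED.filename,
                        timestamp = EXCLUDED.timestamp
                """,
                row,
            )

    src.close()
    dst.close()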

I feel I should be using SQLAlchemy, but I've been intimidated by the
wealth of choices SA offers. I don't want to start down the wrong
road.

However, as I look toward writing the change-merging code, and the
new level of complexity it brings, I think it's time to take the
plunge.
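
As a starting point I imagine something like reflecting the SQLite
schema and recreating it in Postgres with my own columns added. An
untested sketch (the path, connection details, table name, and extra
column are all made up):

    from sqlalchemy import create_engine, MetaData, Column, Text

    src_engine = create_engine("sqlite:////path/to/zotero.sqlite")
    dst_engine = create_engine("postgresql://kent@localhost/aggregate")

    # Load the table definitions from the SQLite file.
    meta = MetaData()
    meta.reflect(bind=src_engine)

    # Add one of my own columns to a reflected table before creating it.
    meta.tables["items"].append_column(Column("my_notes", Text))

    # Create the same tables in Postgres.  (Some SQLite column types may
    # not reflect cleanly and could need fixing up by hand.)
    meta.create_all(bind=dst_engine)

I don't know whether reflect-and-create_all is the right road, or
whether I should be declaring the models myself.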

Data specs:

- source data lives in other-owned files
- replicate source data tables in Postgres
- add columns to Postgres tables
- keep Postgres synced with the SQLite sources

My proclivities:

- comfortable in Python, SQL not so much
- roadmap
  - pull into the Postgres db from other sources
    - file system content
    - email
    - other DBs: MySQL, RDF, ...
  - feed Sphinxsearch from the Postgres db

I would greatly appreciate any suggestions
on how to proceed.

Thanks,
Kent
