I'm trying to load data into a database from a JSON data source. In theory, 
my data (mostly) matches my database table schema - the data is a list of 
dictionaries, with keys for the various columns. But, as the source data is 
JSON, I can't be 100% certain it doesn't contain oddities. Also, some 
fields arrive as string representations of values I actually want as 
richer types (datetime values stored as ISO-format strings, for example).
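For concreteness, a record might look something like this (the field names 
are made up for illustration, not my real schema):

    records = [
        {"id": 1, "name": "launch", "created": "2023-05-01T09:30:00"},
        {"id": 2, "name": "review", "created": "2023-05-02T14:00:00",
         "unexpected_field": "???"},  # the sort of oddity I worry about
    ]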

My current prototype code simply picks out the various fields by hand, 
using rec.get(name, default) to handle possibly missing fields, and then 
converts values where needed. But it all seems very repetitive, given 
that I have a database schema with all the column names and types available 
to me. I could write code to introspect the table definition, strip out 
unneeded fields, apply type conversions and default values, and so on, but 
it will be fiddly to get right, and it seems like a fairly common 
requirement when loading data into databases. So I wondered if I'd missed 
something built into SQLAlchemy that already does this.
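To show what I mean, here's a rough sketch of the kind of thing I'd end up 
writing myself, using column.type.python_type to drive the conversions 
(again, the table and field names are invented, and I'm aware python_type 
raises NotImplementedError for some column types):

    from datetime import date, datetime
    from sqlalchemy import (
        Column, DateTime, Integer, MetaData, String, Table,
    )

    metadata = MetaData()
    events = Table(
        "events", metadata,
        Column("id", Integer, primary_key=True),
        Column("name", String),
        Column("created", DateTime),
    )

    def clean_record(rec, table):
        """Keep only known columns, parsing ISO strings into Python types."""
        row = {}
        for col in table.columns:
            if col.name not in rec:
                continue  # missing field: let the column default apply
            value = rec[col.name]
            # JSON can only give us strings, so parse temporal values back
            if isinstance(value, str):
                py_type = col.type.python_type
                if py_type is datetime:
                    value = datetime.fromisoformat(value)
                elif py_type is date:
                    value = date.fromisoformat(value)
            row[col.name] = value
        return row

    # Unknown fields are dropped, and the ISO string becomes a datetime:
    print(clean_record(
        {"id": 1, "name": "launch", "created": "2023-05-01T09:30:00",
         "unexpected_field": "???"},
        events,
    ))

I'd then run every record through this before handing the list to an 
insert. That's workable, but it's exactly the fiddly boilerplate I'd hoped 
the library might already provide.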

Does anyone know of an existing solution, before I end up writing one of my 
own?

Thanks,
Paul
