I've looked all over the web, past mailing list archives, etc., but I can't seem to
find help with what seems to me to be a relatively simple problem. I'm new to Python,
so please take mercy on my troubles!

I'm trying to create an insert statement into tables like the following using 
parameter bindings:

create table load_data(
            load_data_id    SERIAL,
            user_data       VARCHAR(32),
            user_number     INTEGER,
            
            PRIMARY KEY(load_data_id) )

Now, I want to do the following:

        cursor.execute("INSERT INTO load_data(user_data,user_number) values(?,?)", 
[val1,val2]) 
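
For reference, here is a stripped-down, runnable version of that call. It works fine
once val1 and val2 already have the right Python types; sqlite3 is only a stand-in here
so the snippet runs on its own (my real database isn't SQLite, and the SERIAL column
becomes a plain INTEGER PRIMARY KEY for the sketch):

    import sqlite3

    # Stand-in schema: SERIAL from the real table becomes INTEGER PRIMARY KEY.
    conn = sqlite3.connect(":memory:")
    cursor = conn.cursor()
    cursor.execute("""
        CREATE TABLE load_data(
            load_data_id    INTEGER PRIMARY KEY,
            user_data       VARCHAR(32),
            user_number     INTEGER )""")

    # The bound insert works when the values already have the right types...
    val1, val2 = "some text", 42
    cursor.execute("INSERT INTO load_data(user_data, user_number) VALUES (?, ?)",
                   [val1, val2])
    conn.commit()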

The problem is that I have to set val1 and val2 to the correct datatypes first. There
are a bunch of people talking about "setTypeTranslation", and "setinputsizes(sizes)" is
mentioned in the DB-API PEP at www.python.org as well as in some sapdb documentation I
found, but I could not find any examples of how to actually use them.

Now, this is really easy in Java, and even easier in Perl.  Someone please tell me 
it's easy in Python as well!

I'm reading my data from comma-separated files, so every field comes in as a string.
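
To make that concrete, this is the kind of per-column hand conversion I end up doing
now (continuing the sqlite3 sketch above; the csv module, the file name, and the column
order are just assumptions for the example). It works, but it is exactly the per-table
code I'd like to stop writing:

    import csv

    # Every field csv hands back is a string, so I convert each column by
    # hand before binding it.  "load_data.csv" and the (user_data, user_number)
    # column order are made up for this example.
    for row in csv.reader(open("load_data.csv")):
        user_data, user_number = row[0], int(row[1])    # the hand conversion
        cursor.execute("INSERT INTO load_data(user_data, user_number) VALUES (?, ?)",
                       [user_data, user_number])
    conn.commit()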

Please don't tell me I should use:

        cursor.execute("INSERT INTO load_data(user_data,user_number) values(%s,%s)" % 
(val1,val2)) 

or something like that. Doing it that way doesn't use parameter binding at all, and it's
a potential security hazard besides; just google for 'SQL injection attack' and you'll
see what I mean.

Also, since I'll be loading lots of tables with data this way, it's worth my time to
write a generic function that reads the column type information and creates the correct
binds for any given table. I have way too many tables to load (and constantly changing
ones, too) to write a custom cursor.execute for each one!
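
Something along these lines is what I have in mind, except that I'd want the converters
to come from the database's column type information instead of being passed in by hand
(catalog queries seem to differ per database, so this rough sketch just takes a
column-to-converter mapping as an argument). The table and column names get pasted into
the SQL string, but the values themselves still go through parameter binding:

    import csv

    def load_csv(cursor, table, columns, csv_path):
        """Rough sketch of the generic loader I'm after.

        columns is an ordered list of (column_name, converter) pairs, e.g.
        [("user_data", str), ("user_number", int)].  Ideally the converters
        would be derived from the table's column types automatically.
        """
        names = ", ".join(name for name, _ in columns)
        marks = ", ".join("?" for _ in columns)
        sql = "INSERT INTO %s(%s) VALUES (%s)" % (table, names, marks)
        for row in csv.reader(open(csv_path)):
            values = [convert(field)
                      for (_, convert), field in zip(columns, row)]
            cursor.execute(sql, values)    # values are bound, not pasted in

    # e.g. load_csv(cursor, "load_data",
    #               [("user_data", str), ("user_number", int)],
    #               "load_data.csv")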

Any help at all will be greatly appreciated!

__John Napiorkowski
