> I am building a query to hit a Postgres (8.0.1) database
> from Python (4.2.1) on Linux. Here's how I've been doing
> it for the past year or so:
> ...
> query = '''
> SELECT *
> FROM my_table
> ...
It's a really bad idea to use SELECT * FROM in production code. There are two main reasons:

1) If the database structure changes, your code is likely to break, since SELECT * does not normally guarantee anything about the order of the fields returned. If the table gets an extra field added, you might find the order changing; at the very least there will be an extra item in your tuple of values returned, which may well break your code.

2) Using * also prevents the database from precompiling your query and caching it, so you will slow down processing by forcing a SQL compile step each time. (This is true on Oracle, DB2 and Interbase; I don't know about Postgres but assume it is similar.)

Of course, if you are the only user and the database is small, these are not likely to be major issues, but if two or more apps use the same database, or if many users are hitting it, they could be. SELECT * is great for experimenting, but in production code it's much safer to explicitly list the fields that you want back.

> query = '''
> UPDATE my_table
> SET state = 'processed'
> WHERE id IN ids_to_process
> '''
>
> This would, of course, set the 'state' column to 'processed'
> for all of the ids in the list, but can not figure out how
> to get this into a query to pass to the database.

What have you tried? What happened? It should just be a case of using variable interpolation as you did for the SELECT.

Alan G.

_______________________________________________
Tutor maillist  -  Tutor@python.org
http://mail.python.org/mailman/listinfo/tutor
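The advice above about listing fields explicitly can be sketched as follows. This is a minimal illustration using the stdlib sqlite3 module standing in for Postgres (so it runs anywhere without a server); the table and data are made up, and a Postgres driver such as psycopg would use the %s placeholder style instead of ?.

```python
import sqlite3

# In-memory SQLite database as a stand-in for Postgres, since sqlite3
# ships with Python.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (id INTEGER, state TEXT)")
conn.execute("INSERT INTO my_table VALUES (1, 'new')")

# Explicit field list: the tuple order is fixed by the query itself,
# so adding a column to the table later cannot reshuffle the results.
query = """
    SELECT id, state
    FROM my_table
"""
for id_, state in conn.execute(query):
    print(id_, state)  # prints: 1 new
```

With SELECT * the unpacking `for id_, state in ...` would break the moment a third column were added to the table; with the explicit list it keeps working.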
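On the UPDATE question, the one wrinkle with variable interpolation is that a Python list does not format directly into a SQL IN clause; you need one placeholder per id. A sketch, again using the stdlib sqlite3 module so it is runnable as-is (with a Postgres driver like psycopg the placeholders would be %s rather than ?; the table and ids are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (id INTEGER, state TEXT)")
conn.executemany("INSERT INTO my_table VALUES (?, ?)",
                 [(1, "new"), (2, "new"), (3, "new")])

ids_to_process = [1, 3]

# Build one placeholder per id, then pass the ids as parameters so the
# driver quotes them -- safer than pasting the values into the SQL string.
placeholders = ", ".join("?" for _ in ids_to_process)
query = ("UPDATE my_table SET state = 'processed' "
         "WHERE id IN (%s)" % placeholders)
conn.execute(query, ids_to_process)

print(sorted(conn.execute("SELECT id, state FROM my_table")))
# prints: [(1, 'processed'), (2, 'new'), (3, 'processed')]
```

Only the placeholder string is interpolated into the query text; the id values themselves always travel through the driver's parameter mechanism.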