There's a bulk_insert method too: http://web2py.com/books/default/chapter/29/06?search=bulk
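A minimal sketch of how the loop below could be rewritten to use bulk_insert: build the full list of row dicts first, then hand it to the DAL in one call. The `stationid` and `data_matrix` values here are sample stand-ins for the `stationid` and `cnv.data_matrix` from the original post, and the final `db.CTD_DATA.bulk_insert(...)` line assumes the table definition from that post.

```python
# Sample stand-ins for the poster's stationid and cnv.data_matrix
stationid = 1
data_matrix = [[10.5, 3.2], [11.0, 3.4]]  # each inner list = one CTD row of sensor readings

# Build every record up front instead of calling insert() per value
rows = [
    dict(CTD_STATION_ID=stationid, SENSOR=sensor_n, VALUE=float(element))
    for row in data_matrix
    for sensor_n, element in enumerate(row)
]

# One DAL call instead of ~2K individual inserts (uncomment inside web2py):
# db.CTD_DATA.bulk_insert(rows)
```

Note that bulk_insert only gives a real speedup on adapters that support it; wrapping the loop in a single transaction (web2py commits once per request by default) is often the bigger win.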
On Monday, May 20, 2013 at 10:36:32 AM UTC+2, Rocco wrote:
>
> Dear all,
>
> I have a loop that inserts about 2K records into a PostgreSQL database
> (running on the same host). This is the code used:
>
> for row in cnv.data_matrix:
>     sensor_n = 0
>     for element in row:
>         db.CTD_DATA.insert(CTD_STATION_ID=stationid, SENSOR=sensor_n, VALUE=float(element))
>         sensor_n += 1
>
> It takes more than 20 seconds, sometimes even more. I could change the
> database structure to reduce the number of inserts, but is there a way to
> aggregate multiple inserts or otherwise improve performance?
> Thanks in advance for any suggestion.