[web2py] Re: Bulk insert: how to improve performances?

2013-05-20 Thread Niphlod
there's a bulk_insert method too.

http://web2py.com/books/default/chapter/29/06?search=bulk
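As a hypothetical rewrite of the loop from the quoted post below (cnv, 
stationid and the CTD_DATA table are the names used there):

rows = [dict(CTD_STATION_ID=stationid, SENSOR=sensor_n, VALUE=float(element))
        for row in cnv.data_matrix
        for sensor_n, element in enumerate(row)]
ids = db.CTD_DATA.bulk_insert(rows)  # returns the list of new record ids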

On Monday, May 20, 2013 at 10:36:32 AM UTC+2, Rocco wrote:

 Dear all,

 I have a loop that inserts about 2K records into a PostgreSQL database 
 (running on the same host).

 This is the code I'm using:

 for row in cnv.data_matrix:
     sensor_n = 0
     for element in row:
         db.CTD_DATA.insert(CTD_STATION_ID=stationid,
                            SENSOR=sensor_n,
                            VALUE=float(element))
         sensor_n += 1


 It takes more than 20 seconds, sometimes longer.
 I could change the database structure to reduce the number of inserts, but 
 is there a way to aggregate multiple inserts or otherwise improve 
 performance?
 Thanks in advance for any suggestion.






[web2py] Re: Bulk insert: how to improve performances?

2013-05-20 Thread Rocco
Many thanks, Niphlod!





[web2py] Re: Bulk insert: how to improve performances?

2013-05-20 Thread Paolo valleri
I didn't know about that either. Is there a limit on the number of dicts 
passed to bulk_insert?

Paolo








[web2py] Re: Bulk insert: how to improve performances?

2013-05-20 Thread Niphlod
This is the implementation:

def bulk_insert(self, table, items):
    return [self.insert(table, item) for item in items]

No limit there, but it doesn't leverage any native BULK operation, so don't 
pass it a list of a zillion dicts.
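If the per-insert overhead still dominates, a possible escape hatch is one 
hand-built multi-row INSERT through db.executesql. A sketch against the 
table from this thread; note it bypasses DAL defaults and validators, and 
the %s placeholder style assumes the psycopg2 driver:

values, args = [], []
for row in cnv.data_matrix:
    for sensor_n, element in enumerate(row):
        values.append("(%s, %s, %s)")  # one placeholder group per record
        args += [stationid, sensor_n, float(element)]
db.executesql(
    "INSERT INTO CTD_DATA (CTD_STATION_ID, SENSOR, VALUE) VALUES "
    + ", ".join(values),
    args)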





[web2py] Re: Bulk insert: how to improve performances?

2013-05-20 Thread Paolo valleri
At least with SQLite, wrapping a list of inserts in BEGIN TRANSACTION / END 
TRANSACTION (at the beginning and at the end, respectively) speeds things 
up considerably, see: 
http://stackoverflow.com/questions/1711631/how-do-i-improve-the-performance-of-sqlite
Could we do something like that in our bulk_insert?
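
A minimal standalone sqlite3 sketch of that effect (file and table names 
are made up): one explicit transaction means a single fsync at COMMIT 
instead of one per INSERT.

import sqlite3

con = sqlite3.connect("demo.db")
con.execute("CREATE TABLE IF NOT EXISTS readings (v REAL)")
with con:  # one BEGIN ... COMMIT around all 2000 inserts
    con.executemany("INSERT INTO readings (v) VALUES (?)",
                    ((float(i),) for i in range(2000)))
con.close()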

Paolo








[web2py] Re: Bulk insert: how to improve performances?

2013-05-20 Thread Niphlod
web2py already executes all commands within a transaction.
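
For the loop in the original post that means the ~2K inserts already share 
a single transaction; web2py commits it automatically when the request 
ends. A standalone script (names again from Rocco's post) has to commit 
explicitly:

for row in cnv.data_matrix:
    for sensor_n, element in enumerate(row):
        db.CTD_DATA.insert(CTD_STATION_ID=stationid,
                           SENSOR=sensor_n,
                           VALUE=float(element))
db.commit()  # one COMMIT for all inserts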



