At least with SQLite, wrapping a list of inserts between BEGIN TRANSACTION 
at the start and END TRANSACTION at the end speeds things up considerably, 
see: 
http://stackoverflow.com/questions/1711631/how-do-i-improve-the-performance-of-sqlite
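
For illustration, a quick standalone test of the trick with the sqlite3 
module from the standard library (file and table names are made up, 
nothing web2py-specific):

import sqlite3

# isolation_level=None puts the sqlite3 module in autocommit mode,
# so the explicit BEGIN/END below control the transaction ourselves.
conn = sqlite3.connect("bulk_test.db", isolation_level=None)
conn.execute("CREATE TABLE IF NOT EXISTS item (name TEXT)")

rows = [("row %d" % i,) for i in range(100000)]

# one transaction around all the inserts: one commit/fsync
# instead of one per statement
conn.execute("BEGIN TRANSACTION")
conn.executemany("INSERT INTO item (name) VALUES (?)", rows)
conn.execute("END TRANSACTION")
conn.close()
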
Could we do something like that in our bulk_insert?
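
Just to sketch what I mean (assuming the adapter's execute() can issue raw 
SQL and that self.insert stays as it is; this also sets aside web2py's own 
per-request transaction handling, which might make the explicit BEGIN 
redundant or conflict with it):

def bulk_insert(self, table, items):
    # sketch only: run the per-row inserts inside one explicit
    # transaction so SQLite commits once at the end
    self.execute("BEGIN TRANSACTION")
    try:
        ids = [self.insert(table, item) for item in items]
        self.execute("END TRANSACTION")
    except Exception:
        # undo the partial batch if any insert fails
        self.execute("ROLLBACK")
        raise
    return ids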

Paolo

On Monday, May 20, 2013 2:57:30 PM UTC+2, Niphlod wrote:
>
> this is the implementation
>
> def bulk_insert(self, table, items):
>     return [self.insert(table, item) for item in items]
>
> no limits there, but it doesn't leverage any native BULK operation, so 
> don't pass it a list of 1 zillion dicts.
>
>
