If it is a one-time data load and I have the proper constraints set in 
SQLite, I would use raw SQL.
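Something like this (a minimal sketch; storage.sqlite and the movie 
table are hypothetical stand-ins for your own database and schema):

import sqlite3

# really parsed from the XML file; inlined here to keep the sketch short
rows = [("The Matrix", 1999), ("Alien", 1979)]

conn = sqlite3.connect("storage.sqlite")
try:
    with conn:  # single transaction; rolls back if a constraint fails
        # the table's constraints (UNIQUE, NOT NULL, ...) still apply,
        # which is why having them set properly matters on this route
        conn.executemany(
            "INSERT INTO movie (title, year) VALUES (?, ?)", rows)
finally:
    conn.close()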

If this were repetitive use and I wanted web2py validation, I would use 
the DAL with bulk_insert (making sure to validate the data before adding 
it to the insert queue).
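For example (again a sketch: the movie table, its validators, and 
parsed_rows are placeholders for your own model and XML parser):

# assumes a model along the lines of:
#   db.define_table('movie',
#       Field('title', requires=IS_NOT_EMPTY()),
#       Field('year', 'integer', requires=IS_INT_IN_RANGE(1880, 2100)))

queue = []
for title, year in parsed_rows:  # rows coming out of your XML parser
    row = {}
    for name, value in (('title', title), ('year', year)):
        value, error = db.movie[name].validate(value)  # run the field's requires=
        if error:
            break  # log and skip the bad row instead of queuing it
        row[name] = value
    else:
        queue.append(row)

db.movie.bulk_insert(queue)  # one batched call instead of per-row inserts
db.commit()                  # needed when running outside a request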

On Friday, August 17, 2012 12:38:56 PM UTC-7, Mike Girard wrote:
>
> 10 minutes is all right. Still, if there are two methods and they are 
> qualitatively equal apart from speed, I'd prefer to use the faster one.
>
> So, posing the question once again: is there a compelling reason to use 
> the DAL for bulk uploads? More specifically, what extras, if any, does 
> the DAL add for inserts that a bulk insert using SQLite directly won't?
>
>
> On Friday, August 17, 2012 12:58:01 PM UTC-4, nick name wrote:
>>
>> On Friday, August 17, 2012 8:29:12 AM UTC-4, Mike Girard wrote:
>>>
>>> The data will be coming from a large XML file, so my script will parse 
>>> that and make inserts into several different tables. It's fairly 
>>> straightforward.
>>>
>>> So is it correct to say that -
>>>
>>> 1. There is no compelling reason to do this without the DAL 
>>> 2. My options in the DAL are bulk_insert, looping db.query, and csv 
>>> import, and that performance-wise they're similar? 
>>>
>> 2 is correct (as long as you are going through the DAL; db.executesql 
>> would be the non-DAL way from within web2py - and of course, you could use 
>> your DB's native facilities).
>>
>> 1 is correct if you are not doing this often - e.g., it might take 10 
>> minutes with the DAL as opposed to 1 minute without it (just assuming, not 
>> based on any actual measurement). So what?
>>
>> If you do this once an hour, then DAL processing and the individual 
>> record insertion (even if you use bulk_insert or csv) might make it too 
>> slow for you, and you would be better off looking at your database's native 
>> bulk loading facilities.
>>
>
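For what it's worth, the db.executesql route nick mentions would look 
roughly like this from within web2py (same hypothetical table; raw SQL, 
so the SQLite constraints still fire but no web2py validators run):

sql = "INSERT INTO movie (title, year) VALUES (?, ?)"  # SQLite placeholders
for title, year in parsed_rows:
    db.executesql(sql, (title, year))
db.commit()

And the fully native route would be something like .import in the 
sqlite3 shell, which skips web2py entirely.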
