"bulk insert is a way faster than regular insert when you have many rows"

I think we need to clarify terms. By Massimo's own account in the web2py 
book, the DAL's bulk_insert is no faster than db.table.insert unless you 
are running on Google App Engine. So are you talking about your database's 
native bulk methods, or is the book wrong?

Could someone please just answer: what, if anything, is sacrificed when you 
use your database's own bulk-loading methods instead of going through the 
DAL? Why the DAL religion about this?
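
To make the comparison concrete, here is roughly what I am comparing (the 
table and variable names are made up for illustration):

    # one row at a time through the DAL
    for row in rows:   # rows is a list of dicts, e.g. dict(name='x')
        db.mytable.insert(**row)

    # DAL bulk_insert -- per the book, only a real win on GAE
    db.mytable.bulk_insert(rows)

Both still push every row through the adapter, which is exactly the part a 
native loader would skip.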

On Sunday, August 19, 2012 5:09:43 PM UTC-4, Martin.Mulone wrote:
>
> bulk insert is way faster than regular insert when you have many rows. If 
> you are on MySQL you can use LOAD DATA INFILE, which is incredibly fast, 
> but it requires special MySQL privileges.
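
(For anyone who finds this in the archives: as I understand it, the MySQL 
route Martin describes looks roughly like this through the existing DAL 
connection. The file path, table name, and CSV layout are placeholders; 
LOCAL needs local_infile enabled, and without LOCAL you need the FILE 
privilege.)

    # raw SQL through the existing DAL connection; commit like any insert
    db.executesql(r"""
        LOAD DATA LOCAL INFILE '/tmp/mytable.csv'
        INTO TABLE mytable
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\n'
        IGNORE 1 LINES
    """)
    db.commit()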
>
> 2012/8/19 Andrew <awill...@gmail.com>
>
>> Is it possible that we add a "native bulk insert" function which is coded 
>> up in each adapter? Even bulk_insert goes through ODBC one row at a time, 
>> which is slow for big files. I need to load huge files all the time, and 
>> I am writing custom modules to do this with a native loader. Should this 
>> be a DAL option? Worth noting that this type of operation is a batch, 
>> back-end thing; I wouldn't do this for an end-user web app.
>>
>> I would expect that each DBMS needs different info to start a bulk load, 
>> so the interface may be tricky, or we could just pass a dict and let the 
>> adapter work it out.
>> What do you think?
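
Just to make the "pass a dict" idea concrete, a strawman from the caller's 
side -- native_bulk_insert is purely hypothetical, nothing like it exists 
in the DAL today:

    # hypothetical API: each adapter interprets the dict however it needs
    db.mytable.native_bulk_insert(dict(
        filename='/tmp/mytable.csv',
        delimiter=',',
        skip_header=True,
    ))

On MySQL the adapter could translate that into LOAD DATA INFILE; others 
could map it to COPY, bcp, or sqlldr, or raise NotImplementedError.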
> -- 
>  http://www.tecnodoc.com.ar
>
>
