PS: that said, it would be better to do this with an executesql
statement and let the backend do the job:

update table_a
set field_a1 = table_b.field_b1,
    ....
from table_a
inner join table_b
on table_b.id = table_a.item_id

unfortunately SQLite doesn't allow UPDATE statements with joins (though a
serious db will); on SQLite you need a correlated subquery to get the same
effect.
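If you're stuck on SQLite, the same bulk update can be expressed with a correlated subquery and fed to db.executesql(). Here is a minimal sketch using plain sqlite3 and made-up table/field names matching the example above; in web2py you would just pass the same UPDATE string to db.executesql() and then db.commit():

```python
import sqlite3

# Hypothetical schema, mirroring table_a/table_b from the example above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE table_b (id INTEGER PRIMARY KEY, field_b1 TEXT);
CREATE TABLE table_a (id INTEGER PRIMARY KEY, item_id INTEGER, field_a1 TEXT);
INSERT INTO table_b VALUES (1, 'new value');
INSERT INTO table_a VALUES (10, 1, 'old value');
""")

# SQLite-compatible equivalent of the UPDATE ... JOIN: a correlated
# subquery pulls the value from table_b for each row of table_a.
sql = """
UPDATE table_a
SET field_a1 = (SELECT field_b1 FROM table_b
                WHERE table_b.id = table_a.item_id)
WHERE EXISTS (SELECT 1 FROM table_b
              WHERE table_b.id = table_a.item_id);
"""
conn.execute(sql)
conn.commit()
```

The WHERE EXISTS clause matters: without it, rows of table_a with no match in table_b would be overwritten with NULL.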

On Wednesday, May 22, 2013 11:30:04 AM UTC+2, Niphlod wrote:
>
> np, confirmed that there are no leaks involved, the second point was more 
> or less "am I doing it right?"
>
> my issue with the tests not being viable is that if speed matters, the 
> example needs to be as close to reality as possible to help you figure out 
> "the best way to do it". For example, an index on the ITEM_ID where you join 
> the tables is the first thing that comes to mind to speed up retrieval of 
> the resultset.
>
> As for update_or_insert vs row.update_record, I don't see the point: do 
> you have two tables that need to be merged, or do you have one master 
> table that needs to override values on the second table?
>
> Seems to me the second environment is the one you're in: if that's the 
> case, ditch update_or_insert().
> instead of
>
> row = db(query).select().first()
> row.update_record(**arguments)
>
> do
>
> db(query).update(**arguments)
>
>
> On Wednesday, May 22, 2013 11:18:27 AM UTC+2, Simon Ashley wrote:
>>
>> Ok, here's the reality.
>> Benchmarking on 10k records (either method), you get a through put of 
>> approximately 100 records a second (should complete in 1.5 hours).   
>>
>> The row.update completes in 3.5 hours. 
>> The update_or_insert takes > 7 hours.
>> (with available memory maxed out and no caching involved; 
>> ditto for the former with/without caching)
>>
>> Performance of either method is significantly slower than expected. 
>> This was an attempt to explore alternative ways of doing simple batch 
>> updates within the DAL.
>>
>> With update_or_insert, my assumption was that it was not possible to do a 
>> row.update where the row was the subject of a join 
>> (correct me if I'm wrong).
>> I also didn't like the additional query needed to get the source data in 
>> the row.update method 
>> (that was the only data that was cached).
>>
>> Point taken with the post example (close to reality but trashed for 
>> security reasons). Sorry. 
>> Will do an example on hash files at the first opportunity.
>>
>> Point understood on SQLite vs production databases, but the focus was on 
>> simple updates / data prep.
>> Be gentle, I haven't had a lot of sleep in the last 3 weeks :)
>>
>
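To make the quoted advice concrete, here is a minimal sketch (plain sqlite3, hypothetical table) of the difference between the per-row pattern that a select + row.update_record loop amounts to, and the single set-based UPDATE that db(query).update(**arguments) boils down to:

```python
import sqlite3

# Hypothetical table; plain sqlite3 used for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO items (status) VALUES (?)",
                 [("old",), ("old",), ("done",)])

# Per-row pattern (roughly what select + row.update_record does):
# one round trip per matching record.
for (row_id,) in conn.execute(
        "SELECT id FROM items WHERE status = 'old'").fetchall():
    conn.execute("UPDATE items SET status = 'new' WHERE id = ?", (row_id,))

# Set-based pattern (what db(query).update(**arguments) compiles to):
# one statement covering the whole set.
cur = conn.execute("UPDATE items SET status = 'newer' WHERE status = 'new'")
conn.commit()
```

On 10k+ records the difference is dramatic: the per-row form pays query parsing and round-trip overhead once per record, while the set-based form pays it once in total.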
