Updating 40,000 records should take no longer than a couple of minutes.
I think you should optimise your query before going any further.
You have an inner SELECT statement that executes before anything else. It
joins EVERY row in your table (1,000,000+) with at most 3 other rows in
the same table, so the whole join is computed even when only a handful of
rows actually need updating.
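
To make that concrete, here is a sketch of the restricted form I mean; the
table name "measures" is invented and the date arithmetic assumes a real
date column, since your schema isn't shown:

  UPDATE measures m
     SET data_sys = (SELECT avg(m2.data_raw)
                       FROM measures m2
                      WHERE m2.date BETWEEN m.date - 1 AND m.date + 1)
   WHERE m.date IN ('2005-01-12', '2005-01-13', '2005-01-14');  -- only the changed days

With a WHERE clause like that, the subselect runs only for the handful of
rows you actually touch instead of joining the whole table.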
Thanks a lot, that is what I was looking for. In fact your query works very
well for small changes, but I will have to use another method when updating
all my rows, because the performance is unfortunately not very good.
My data set contains something like 4 rows to update in 1+ million records,
and data_raw ...
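
For the bulk case, maybe rebuilding the whole column in one statement would
work; this is just a sketch, with "measures" and "measures_new" as invented
names:

  -- Recompute data_sys for every row in one pass into a fresh table,
  -- then swap it in for the old one.
  CREATE TABLE measures_new AS
      SELECT m.date, m.data_raw,
             (SELECT avg(m2.data_raw)
                FROM measures m2
               WHERE m2.date BETWEEN m.date - 1 AND m.date + 1) AS data_sys
        FROM measures m;
  DROP TABLE measures;
  ALTER TABLE measures_new RENAME TO measures;

Note that indexes and constraints have to be recreated on the new table
afterwards.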
I understand data_sys is the average of data_raw over 3 days, from the
day before to the day after.
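For example, for 13-01 that gives (5 + 6 + 7) / 3 = 6, which matches the
sample data.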
This should do what you want, in one pass. Check the average function in
the subselect. If what you want is to divide by 3 no matter how many
records were found, enable the commented line.
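
Concretely, something along these lines; "measures" stands in for your table
name, which wasn't given, and the date arithmetic assumes a date column:

  UPDATE measures m
     SET data_sys = (SELECT avg(m2.data_raw)
                 -- SET data_sys = (SELECT sum(m2.data_raw) / 3.0  -- always divide by 3
                       FROM measures m2
                      WHERE m2.date BETWEEN m.date - 1 AND m.date + 1);

Swapping the avg line for the commented sum line divides by 3 regardless of
how many neighbouring rows exist.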
On Sun, Jan 23, 2005 at 11:36:11AM, adam etienne wrote:
> In fact the computation is somewhat more complex than an average, and the
> data set is quite large... I did some tests with views & triggers, but it's
> too slow.
Can you provide any more detail about the algorithm and the number
of rows involved?
Thanks for your answer.
In fact the computation is somewhat more complex than an average, and the
data set is quite large... I did some tests with views & triggers, but it's
too slow.
Moreover, sometimes I need to do big insertions or updates, and at other
times just small updates.
On Sat, Jan 22, 2005 at 12:51:20PM, adam etienne wrote:
>
> I have some trouble updating a table like this one:
> date  | data_raw | data_sys
> 12-01 |        5 |      4.5
> 13-01 |        6 |        6
> 14-01 |        7 |
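
For reference, the sample reads like a table along these lines; this is only
a sketch, since no DDL was posted, so the name "measures" and the column
types are assumed:

  CREATE TABLE measures (
      date     date PRIMARY KEY,
      data_raw numeric,
      data_sys numeric    -- 3-day centred average of data_raw, recomputed as needed
  );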