Hi,

I have this sample code (see below) and I was wondering whether it is
possible to speed things up.



What this code does is the following:

x is a 4D array (you can imagine it as x, y, z coordinates plus a time
coordinate).

So x contains a 50x50x50 data array for each of 90 time points.

Now I want to reduce the 90 time points: I want to merge every three
consecutive time points into one time point by calculating the mean of
those three time points for every x, y, z coordinate.

The reduce sequence defines the first time point of each group that
should get merged, and the apply call inside the for loop calculates the
voxel-wise mean of the three 3D arrays and puts it into the new 4D array
(data_reduced).
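
To make the merging concrete, here is a toy 1-D illustration along the
time axis only (six made-up values merged into two; ts is just a name I
picked for this example):

        ## Toy illustration: average each group of three consecutive
        ## values, so six time points collapse to two.
        ts <- c(1, 2, 3, 10, 20, 30)
        colMeans(matrix(ts, nrow = 3))   # gives 2 and 20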



The problem is that even this small example takes really long.
I thought apply would already vectorize rather than loop over every
coordinate.

But for my actual data set it takes a really long time … so I would be
really grateful for any suggestions on how to speed this up (for
comparison, I have put a vectorized sketch after the sample code below).




## a 50 x 50 x 50 volume at each of 90 time points
x <- array(rnorm(50 * 50 * 50 * 90, 0, 2), dim = c(50, 50, 50, 90))



## result: the 90 time points merged down to 30
data_reduced <- array(0, dim = c(50, 50, 50, 90 / 3))

## first time point of each group of three: 1, 4, 7, ..., 88
reduce <- seq(1, 90, 3)



for (i in seq_along(reduce)) {
        ## voxel-wise mean over the three consecutive time points
        ## reduce[i], reduce[i] + 1 and reduce[i] + 2
        data_reduced[, , , i] <- apply(x[, , , reduce[i]:(reduce[i] + 2)], 1:3, mean)
}
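
For comparison, here is a vectorized sketch of the same reduction that
avoids apply entirely; it just adds the three staggered sub-arrays along
the time dimension and divides by three (this assumes the time points
split evenly into groups of three, as in the sample above, and
data_reduced_v is just a name I picked):

        ## Vectorized sketch: pick out the first, second and third frame
        ## of every group of three and average them element-wise.
        data_reduced_v <- (x[, , , seq(1, 90, 3)] +
                           x[, , , seq(2, 90, 3)] +
                           x[, , , seq(3, 90, 3)]) / 3

        ## should agree with the loop version
        all.equal(data_reduced, data_reduced_v)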
