Hey,
I just read another post about calling R from C. Someone on Stack Overflow (the handle DWin makes me suspect it's David W.?) referenced this: http://www.math.univ-montp2.fr/~pudlo/R_files/call_R.pdf
That made me wonder: why are loops considered bad in R but not in C?

And where exactly does looping cost the most? For my bachelor's thesis I wrote a piece of code that loops from 1 to 500 and estimates a boosted model in each iteration; the whole procedure takes 2-6 minutes. In that case the loop itself (as opposed to some kind of apply()) shouldn't add much time, right? I suspect it would be much worse to loop from 1 to 10000 and do only a small task (a mean(), for example) in each iteration. Can someone confirm this?
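
To make the question concrete, here is a rough timing sketch of what I mean (the vector x, the counts, and the Sys.sleep() stand-in for model fitting are made up, purely for illustration):

## Small task per iteration: the interpreter overhead of the loop
## is comparable to the work done inside it.
x <- rnorm(1000)
n <- 10000
system.time({
  res1 <- numeric(n)
  for (i in seq_len(n)) res1[i] <- mean(x)
})

## Same small task with vapply(): the looping happens at C level,
## so the per-iteration overhead should be smaller.
system.time(
  res2 <- vapply(seq_len(n), function(i) mean(x), numeric(1))
)

## Heavy task per iteration (Sys.sleep() standing in for fitting a
## boosted model): the loop overhead is negligible next to the work.
system.time({
  for (i in 1:5) Sys.sleep(0.2)
})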

Regards,
 Alex
