On Mon, Apr 20, 2009 at 10:37 AM, Stefan Grosse <singularit...@gmx.net> wrote:
> I know of people doing optimization stuff which needs a lot of
> computational power. They use Matlab since it is easy for them to use
> multiple processors (+ multiple PCs). R at the moment only uses one
> processor and also does not yet have something like the just-in-time
> compilation which appears to be in Matlab (though there is a project
> working on it).
R can already use multiple processors. Some builds of R, and all builds of
REvolution R, use mathematical libraries which will run many computations in
parallel on multiprocessor/multicore machines. It's equally possible to run
computations on multiple machines (e.g. clusters) with packages like ParallelR
or snow. (I don't know how easy this is in Matlab, but it's pretty easy with R.)

Parallel and distributed computing can often help where the computation runs
in R but takes a long time to complete. A separate issue is one where the data
size is very large, in which case you may want to consider a 64-bit build of R
(we just released one for Windows - see http://tinyurl.com/cuec9g
[blog.revolution-computing.com]) or using a package like bigmemory or biglm.

# David Smith

--
David M Smith <da...@revolution-computing.com>
Director of Community, REvolution Computing
www.revolution-computing.com
Tel: +1 (206) 577-4778 x3203 (San Francisco, USA)
Check out our upcoming events schedule at www.revolution-computing.com/events

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
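[Follow-up] In the spirit of the posting guide, here is a minimal,
self-contained sketch of the kind of multi-core use mentioned above, using the
snow package. The cluster size, task, and function body here are purely
illustrative examples (not from the original post); adjust them to your
machine and workload.

```r
library(snow)  # install.packages("snow") if not already installed

# Start a socket cluster with 2 local worker processes
# (use as many workers as you have cores available)
cl <- makeCluster(2, type = "SOCK")

# Distribute a CPU-bound task across the workers:
# each worker computes the mean of a simulated sample
results <- parLapply(cl, 1:100, function(i) {
  set.seed(i)          # reproducible per-task seed
  mean(rnorm(1000))    # stand-in for a real expensive computation
})

# Always release the worker processes when done
stopCluster(cl)

length(results)  # one result per task
```

The same parLapply() call scales from a few local cores to a cluster of
machines simply by changing the cluster specification passed to makeCluster().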