I looked at snow and makeSOCKcluster(c("localhost", "localhost")), and it
suits my needs perfectly. In addition, I want to work on a "real
cluster" that uses MPI, so the two will not interfere: I plan to
use snow's makeSOCKcluster(...) for the cores on one machine, and MPI (at
the level of the whole simulation) to run one complete simulation per
node.
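A minimal sketch of that plan on the dual-core machine (snow only, no MPI
involved); run.one.simulation and the parameter list are placeholders, not
part of any package:

library(snow)

## one worker per core on the local machine, connected via sockets
cl <- makeSOCKcluster(c("localhost", "localhost"))

## placeholder for one complete simulation run on a given parameter set
run.one.simulation <- function(p) {
    sum(rnorm(1000, mean = p))
}

## spread the parameter sets over the two cores
params  <- 1:10
results <- parLapply(cl, params, run.one.simulation)

stopCluster(cl)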

Thanks

Rainer


On Mon, Sep 29, 2008 at 3:46 PM, Rainer M Krug <[EMAIL PROTECTED]> wrote:
> On Mon, Sep 29, 2008 at 3:05 PM, Luke Tierney <[EMAIL PROTECTED]> wrote:
>> Look at the configure output below: configure is finding mpi.h in
>> /usr/lib/mpich/include, so it is using MPICH, not LAM.  You may be able
>> to tell Rmpi specifically where to look for the LAM headers and
>> libraries, or you may need to uninstall MPICH (almost surely that can be
>> avoided, but it may be easier if you don't really need MPICH).
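As a sketch of the first option (assuming Rmpi's configure accepts a
--with-mpi argument pointing at the MPI installation -- please check Rmpi's
INSTALL notes for the exact argument name), rebuilding against LAM could look
like this, with /usr/lib/lam taken from the lamboot -V output quoted below:

## rebuild Rmpi against LAM instead of MPICH; the configure argument name
## should be verified against Rmpi's INSTALL file, and /usr/lib/lam is
## LAM's prefix as reported by lamboot -V
install.packages("Rmpi",
                 repos          = "http://cbio.uct.ac.za/CRAN/",
                 configure.args = "--with-mpi=/usr/lib/lam")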
>
> OK - I don't mind whether I use LAM or MPICH, so which one should I use?
> I am confused.
>
> If it helps, I am using Ubuntu Hardy
>
> Rainer
>
>>
>> Best,
>>
>> luke
>>
>> On Mon, 29 Sep 2008, Rainer M Krug wrote:
>>
>>> On Mon, Sep 29, 2008 at 12:19 AM, Martin Morgan <[EMAIL PROTECTED]>
>>> wrote:
>>>>
>>>> "Rainer M Krug" <[EMAIL PROTECTED]> writes:
>>>>
>>>>> Hi
>>>>>
>>>>> I am trying to utilize my dual-core processor (and later a
>>>>> high-performance cluster (HPC)) by using the Rmpi, snow, snowfall,
>>>>> ... packages, but I am struggling at the very beginning, i.e. with
>>>>> initialising the "cluster" on my dual-core computer. Whenever I try to
>>>>> initialise it (via sfInit(parallel=TRUE, cpus=2) or
>>>>> mpi.spawn.Rslaves(nslaves=2)), I get an error message:
>>>>>
>>>>>> sfInit(parallel=TRUE, cpus=2)
>>>>>
>>>>> Forced parallel. Using session: XXXXXXXXR_rkrug_143706_092708
>>>>>
>>>>> Error in mpi.comm.spawn(slave = mpitask, slavearg = args, nslaves =
>>>>> count,  :
>>>>>  MPI_Comm_spawn is not supported.
>>>>> Error in sfInit(parallel = TRUE, cpus = 2) :
>>>>>  Starting of snow cluster failed! Error in mpi.comm.spawn(slave =
>>>>> mpitask, slavearg = args, nslaves = count,  :
>>>>>  MPI_Comm_spawn is not supported.
>>>>>
>>>>> and
>>>>>
>>>>>> mpi.spawn.Rslaves(nslaves=2)
>>>>>
>>>>> Error in mpi.spawn.Rslaves(nslaves = 2) :
>>>>>  You cannot use MPI_Comm_spawn API
>>>>
>>>> This error comes from Rmpi,
>>>>
>>>>> head(mpi.spawn.Rslaves, 6)
>>>>
>>>> 1 function (Rscript = system.file("slavedaemon.R", package = "Rmpi"),
>>>> 2     nslaves = mpi.universe.size(), root = 0, intercomm = 2, comm = 1,
>>>> 3     hosts = NULL, needlog = TRUE, mapdrive = TRUE)
>>>> 4 {
>>>> 5     if (!is.loaded("mpi_comm_spawn"))
>>>> 6         stop("You cannot use MPI_Comm_spawn API")
>>>>
>>>> and occurs when the compiler variable MPI2 is undefined when your
>>>> package is installed. Likely this means that your mpi installation is
>>>> either old (unlikely?) or that your Rmpi installation failed to
>>>> detect the installed mpi version properly. It's difficult to know
>>>> which without more information on how the mpi and Rmpi installations
>>>> went: at a minimum the result of R's sessionInfo() command and of
>>>> mpirun --version, but ideally also the output of Rmpi's installation.
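A quick way to confirm this diagnosis from a running R session (a sketch
using only base R's is.loaded() and exists()):

library(Rmpi)

## if the MPI-2 spawn entry point was not compiled into Rmpi.so,
## mpi.spawn.Rslaves() can only fail with the error shown above
is.loaded("mpi_comm_spawn")   # FALSE here
exists("mpi.comm.spawn")      # the R-level wrapper is defined regardless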
>>>
>>> Version:
>>> lamboot -V
>>>
>>> LAM 7.1.2/MPI 2 C++/ROMIO - Indiana University
>>>
>>>       Arch:           i486-pc-linux-gnu
>>>       Prefix:         /usr/lib/lam
>>>       Configured by:  buildd
>>>       Configured on:  Sun Mar 23 08:07:16 UTC 2008
>>>       Configure host: rothera
>>>       SSI rpi:        crtcp lamd sysv tcp usysv
>>>
>>> Below you will find the session info and the output from the installation of Rmpi.
>>>
>>> In the install log from Rmpi, it says:
>>> checking whether MPICH2 is declared... no
>>> checking whether MPICH2 is declared... (cached) no
>>>
>>> So what went wrong? Do I have to start the "cluster" on the dual-core
>>> machine?
>>> Can I set the compiler variable MPI2 manually?
>>>
>>>> version
>>>
>>>              _
>>> platform       i486-pc-linux-gnu
>>> arch           i486
>>> os             linux-gnu
>>> system         i486, linux-gnu
>>> status
>>> major          2
>>> minor          7.2
>>> year           2008
>>> month          08
>>> day            25
>>> svn rev        46428
>>> language       R
>>> version.string R version 2.7.2 (2008-08-25)
>>>>
>>>> sessionInfo()
>>>
>>> R version 2.7.2 (2008-08-25)
>>> i486-pc-linux-gnu
>>>
>>> locale:
>>>
>>> LC_CTYPE=en_ZA.UTF-8;LC_NUMERIC=C;LC_TIME=en_ZA.UTF-8;LC_COLLATE=en_ZA.UTF-8;LC_MONETARY=C;LC_MESSAGES=en_ZA.UTF-8;LC_PAPER=en_ZA.UTF-8;LC_NAME=C;LC_ADDRESS=C;LC_TELEPHONE=C;LC_MEASUREMENT=en_ZA.UTF-8;LC_IDENTIFICATION=C
>>>
>>> attached base packages:
>>> [1] stats     graphics  grDevices utils     datasets  methods   base
>>>
>>> other attached packages:
>>> [1] snow_0.3-3    Rmpi_0.5-5    snowfall_1.53
>>>
>>> loaded via a namespace (and not attached):
>>> [1] tools_2.7.2
>>>>
>>>> install.packages(c("Rmpi"), dep=TRUE, repo="http://cbio.uct.ac.za/CRAN/")
>>>
>>> Warning in install.packages(c("Rmpi"), dep = TRUE, repo =
>>> "http://cbio.uct.ac.za/CRAN/") :
>>>  argument 'lib' is missing: using '/usr/local/lib/R/site-library'
>>> trying URL 'http://cbio.uct.ac.za/CRAN/src/contrib/Rmpi_0.5-5.tar.gz'
>>> Content type 'application/x-gzip' length 94643 bytes (92 Kb)
>>> opened URL
>>> ==================================================
>>> downloaded 92 Kb
>>>
>>> * Installing *source* package 'Rmpi' ...
>>> checking for gcc... gcc
>>> checking for C compiler default output file name... a.out
>>> checking whether the C compiler works... yes
>>> checking whether we are cross compiling... no
>>> checking for suffix of executables...
>>> checking for suffix of object files... o
>>> checking whether we are using the GNU C compiler... yes
>>> checking whether gcc accepts -g... yes
>>> checking for gcc option to accept ISO C89... none needed
>>> I am here /usr/lib/mpich and it is MPICH
>>> Try to find mpi.h ...
>>> Found in /usr/lib/mpich/include
>>> Try to find libmpi.so or libmpich.a
>>> Found libmpich in /usr/lib/mpich/lib
>>> ##########################################
>>> checking whether MPICH2 is declared... no
>>> checking whether MPICH2 is declared... (cached) no
>>> ##########################################
>>> checking for openpty in -lutil... yes
>>> checking for main in -lpthread... yes
>>> configure: creating ./config.status
>>> config.status: creating src/Makevars
>>> ** libs
>>> gcc -std=gnu99 -I/usr/share/R/include -DPACKAGE_NAME=\"\"
>>> -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\"
>>> -DPACKAGE_BUGREPORT=\"\" -DHAVE_DECL_MPICH2=0 -DHAVE_DECL_MPICH2=0
>>> -I/usr/lib/mpich/include  -DMPICH -fPIC     -fpic  -g -O2 -c
>>> conversion.c -o conversion.o
>>> gcc -std=gnu99 -I/usr/share/R/include -DPACKAGE_NAME=\"\"
>>> -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\"
>>> -DPACKAGE_BUGREPORT=\"\" -DHAVE_DECL_MPICH2=0 -DHAVE_DECL_MPICH2=0
>>> -I/usr/lib/mpich/include  -DMPICH -fPIC     -fpic  -g -O2 -c
>>> internal.c -o internal.o
>>> gcc -std=gnu99 -I/usr/share/R/include -DPACKAGE_NAME=\"\"
>>> -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\"
>>> -DPACKAGE_BUGREPORT=\"\" -DHAVE_DECL_MPICH2=0 -DHAVE_DECL_MPICH2=0
>>> -I/usr/lib/mpich/include  -DMPICH -fPIC     -fpic  -g -O2 -c
>>> RegQuery.c -o RegQuery.o
>>> gcc -std=gnu99 -I/usr/share/R/include -DPACKAGE_NAME=\"\"
>>> -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\"
>>> -DPACKAGE_BUGREPORT=\"\" -DHAVE_DECL_MPICH2=0 -DHAVE_DECL_MPICH2=0
>>> -I/usr/lib/mpich/include  -DMPICH -fPIC     -fpic  -g -O2 -c Rmpi.c -o
>>> Rmpi.o
>>> gcc -std=gnu99 -shared  -o Rmpi.so conversion.o internal.o RegQuery.o
>>> Rmpi.o -L/usr/lib/mpich/lib -lmpich -lutil -lpthread -fPIC
>>> -L/usr/lib/R/lib -lR
>>> ** R
>>> ** demo
>>> ** inst
>>> ** preparing package for lazy loading
>>> ** help
>>> >>> Building/Updating help pages for package 'Rmpi'
>>>    Formats: text html latex example
>>>  hosts                             text    html    latex
>>>  internal                          text    html    latex
>>>  mpi.abort                         text    html    latex
>>>  mpi.apply                         text    html    latex   example
>>>  mpi.barrier                       text    html    latex
>>>  mpi.bcast                         text    html    latex
>>>  mpi.bcast.Robj                    text    html    latex
>>>  mpi.bcast.cmd                     text    html    latex
>>>  mpi.cart.coords                   text    html    latex   example
>>>  mpi.cart.create                   text    html    latex   example
>>>  mpi.cart.get                      text    html    latex   example
>>>  mpi.cart.rank                     text    html    latex   example
>>>  mpi.cart.shift                    text    html    latex   example
>>>  mpi.cartdim.get                   text    html    latex   example
>>>  mpi.comm                          text    html    latex   example
>>>  mpi.comm.disconnect               text    html    latex
>>>  mpi.comm.free                     text    html    latex
>>>  mpi.comm.inter                    text    html    latex
>>>  mpi.comm.set.errhandler           text    html    latex
>>>  mpi.comm.spawn                    text    html    latex
>>>  mpi.const                         text    html    latex
>>>  mpi.dims.create                   text    html    latex   example
>>>  mpi.exit                          text    html    latex
>>>  mpi.finalize                      text    html    latex
>>>  mpi.gather                        text    html    latex   example
>>>  mpi.gather.Robj                   text    html    latex   example
>>>  mpi.get.count                     text    html    latex
>>>  mpi.get.processor.name            text    html    latex
>>>  mpi.get.sourcetag                 text    html    latex
>>>  mpi.info                          text    html    latex
>>>  mpi.init.sprng                    text    html    latex
>>>  mpi.intercomm.merge               text    html    latex
>>>  mpi.parSim                        text    html    latex
>>>  mpi.parapply                      text    html    latex   example
>>>  mpi.probe                         text    html    latex
>>>  mpi.realloc                       text    html    latex
>>>  mpi.reduce                        text    html    latex
>>>  mpi.remote.exec                   text    html    latex   example
>>>  mpi.scatter                       text    html    latex   example
>>>  mpi.scatter.Robj                  text    html    latex   example
>>>  mpi.send                          text    html    latex   example
>>>  mpi.send.Robj                     text    html    latex
>>>  mpi.sendrecv                      text    html    latex   example
>>>  mpi.setup.rng                     text    html    latex
>>>  mpi.spawn.Rslaves                 text    html    latex   example
>>>  mpi.universe.size                 text    html    latex
>>>  mpi.wait                          text    html    latex
>>> ** building package indices ...
>>> * DONE (Rmpi)
>>>
>>> The downloaded packages are in
>>>       /tmp/RtmpdX6zlZ/downloaded_packages
>>>>
>>>
>>>>
>>>> Dirk Eddelbuettel
>>>> (https://stat.ethz.ch/pipermail/r-devel/2008-September/050665.html)
>>>> suggested that snow's makeSOCKcluster is an easier starting point for
>>>> single computer 'clusters' or other configurations where significant
>>>> system administration is not desired -- these should work without
>>>> additional software on most systems, even if more limiting in the long
>>>> term (in my opinion). See, e.g., the examples on the help page for
>>>> clusterApply for basic operation. A bit oddly, apparently the
>>>> 'snowfall' package restricts snow's functionality to mpi clusters. So
>>>> you might start with snow directly.
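For reference, a minimal sketch of that basic operation with snow alone,
loosely following the clusterApply help page (the exact examples there may
differ):

library(snow)

## two socket workers on the local machine
cl <- makeSOCKcluster(c("localhost", "localhost"))

## hand one element of the list to each worker
clusterApply(cl, 1:2, function(i) Sys.getpid())

## a tiny computation spread over the two workers
clusterApply(cl, 1:2, function(x) x + 3)

stopCluster(cl)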
>>>>
>>>> Martin
>>>>
>>>>>
>>>>> I followed the PDF "Developing parallel programs using snowfall" by
>>>>> Jochen Knaus and installed the relevant libraries and programs, but it
>>>>> does not work.
>>>>>
>>>>> I am stuck.
>>>>>
>>>>> Any help appreciated,
>>>>>
>>>>> Rainer
>>>>>
>>>>>
>>>>
>>>> --
>>>> Martin Morgan
>>>> Computational Biology / Fred Hutchinson Cancer Research Center
>>>> 1100 Fairview Ave. N.
>>>> PO Box 19024 Seattle, WA 98109
>>>>
>>>> Location: Arnold Building M2 B169
>>>> Phone: (206) 667-2793
>>>>
>>>
>>>
>>>
>>>
>>
>> --
>> Luke Tierney
>> Chair, Statistics and Actuarial Science
>> Ralph E. Wareham Professor of Mathematical Sciences
>> University of Iowa                  Phone:             319-335-3386
>> Department of Statistics and        Fax:               319-335-3017
>>   Actuarial Science
>> 241 Schaeffer Hall                  email:      [EMAIL PROTECTED]
>> Iowa City, IA 52242                 WWW:  http://www.stat.uiowa.edu
>>
>
>
>
>



-- 
Rainer M. Krug, PhD (Conservation Ecology, SUN), MSc (Conservation
Biology, UCT), Dipl. Phys. (Germany)

Centre of Excellence for Invasion Biology
Faculty of Science
Natural Sciences Building
Private Bag X1
University of Stellenbosch
Matieland 7602
South Africa

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
