Hope I didn't misunderstand your question. If you implement
your profiling library in C, where you do your real instrumentation,
you don't need to implement the Fortran layer; you can simply link
with the Fortran-to-C MPI wrapper library, -lmpi_f77, i.e.
<OMPI>/bin/mpif77 -o foo foo.f -L<OMPI>/lib -lmpi_f77 -lYourProfClib
where libYourProfClib.a is your profiling tool written in C. If you
don't want to intercept each MPI call twice for a Fortran program,
you need to implement the Fortran layer yourself. In that case, I
would think you can just call the C version of PMPI_xxx directly
from your Fortran layer, e.g.
void mpi_comm_rank_(MPI_Fint *comm, MPI_Fint *rank, MPI_Fint *info) {
    int c_rank;
    printf("mpi_comm_rank call successfully intercepted\n");
    /* Fortran passes an integer handle; convert it to a C MPI_Comm */
    *info = PMPI_Comm_rank(MPI_Comm_f2c(*comm), &c_rank);
    *rank = c_rank;
}
A.Chan
----- "Nick Wright" <nwri...@sdsc.edu> wrote:
Hi
I am trying to use the PMPI interface with Open MPI to profile a
Fortran program.
I have tried with 1.2.8 and 1.3rc1 with --enable-mpi-profile switched
on.
The problem seems to be that if one intercepts the call to
mpi_comm_rank_ (the Fortran hook) and then calls pmpi_comm_rank_,
this in turn calls MPI_Comm_rank (the C hook), not PMPI_Comm_rank as
it should. So if one wants to create a library that can profile C and
Fortran codes at the same time, one ends up intercepting the MPI call
twice, which is not desirable and not what should happen (and indeed
doesn't happen in other MPI implementations).
A simple example to illustrate this is below. If somebody knows of a
fix to avoid this issue, that would be great!
Thanks
Nick.
pmpi_test.c: mpicc pmpi_test.c -c

#include <stdio.h>
#include "mpi.h"

/* prototype for the Fortran PMPI entry point we expect to exist */
void pmpi_comm_rank_(MPI_Comm *comm, int *rank, int *info);

void mpi_comm_rank_(MPI_Comm *comm, int *rank, int *info) {
    printf("mpi_comm_rank call successfully intercepted\n");
    pmpi_comm_rank_(comm, rank, info);
}

int MPI_Comm_rank(MPI_Comm comm, int *rank) {
    printf("MPI_Comm_rank call successfully intercepted\n");
    return PMPI_Comm_rank(comm, rank);
}
hello_mpi.f: mpif77 hello_mpi.f pmpi_test.o

      program hello
      implicit none
      include 'mpif.h'
      integer ierr
      integer myid, nprocs
      character*24 fdate, host
      call MPI_Init(ierr)
      myid = 0
      call mpi_comm_rank(MPI_COMM_WORLD, myid, ierr)
      call mpi_comm_size(MPI_COMM_WORLD, nprocs, ierr)
      call getenv('HOST', host)
      write (*,*) 'Hello World from proc',myid,' out of',nprocs,host
      call mpi_finalize(ierr)
      end
_______________________________________________
users mailing list
us...@open-mpi.org
http://www.open-mpi.org/mailman/listinfo.cgi/users