Re: [OMPI devel] complete newbie question regarding --enable-mpi-profile option

2009-06-16 Thread Nifty Tom Mitchell
On Tue, Jun 16, 2009 at 12:49:52AM +0530, Leo P. wrote:
> Hi Eugene,
> Thanks for the information. I had already clicked on the "Show All" button in the profiler before I sent an email to the group, but it did not work :(
> Also Eugene, can you please help me understand...
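
For context, the --enable-mpi-profile configure option controls whether Open MPI builds the standard MPI profiling (PMPI) layer, i.e. whether each MPI_* routine can be intercepted by a tool and forwarded to the underlying PMPI_* entry point. A minimal sketch of how that layer is typically used (this example is not from the thread; the counter and the choice of MPI_Send are arbitrary):

    #include <stdio.h>
    #include <mpi.h>

    /* Hypothetical PMPI wrapper: count MPI_Send calls, then fall
     * through to the real implementation via its PMPI_ name. */
    static int send_count = 0;

    int MPI_Send(void *buf, int count, MPI_Datatype type,
                 int dest, int tag, MPI_Comm comm)
    {
        send_count++;
        return PMPI_Send(buf, count, type, dest, tag, comm);
    }

    int MPI_Finalize(void)
    {
        printf("MPI_Send was called %d times\n", send_count);
        return PMPI_Finalize();
    }

Compile the wrapper with mpicc and link it ahead of the MPI library (e.g. "mpicc app.c wrapper.o"); profiling GUIs generally build on this same mechanism.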

Re: [OMPI devel] Hang in collectives involving shared memory

2009-06-16 Thread Bryan Lally
Ashley Pittman wrote:
> Whilst the fact that it appears to only happen on your machine implies it's not a general problem with OpenMPI, the fact that it happens in the same location/rep count every time does swing the blame back the other way.
This sounds a _lot_ like the problem I was seeing, my...

Re: [OMPI devel] Hang in collectives involving shared memory

2009-06-16 Thread Ashley Pittman
On Tue, 2009-06-16 at 13:39 -0600, Bryan Lally wrote:
> Ashley Pittman wrote:
> > Whilst the fact that it appears to only happen on your machine implies it's not a general problem with OpenMPI the fact that it happens in the same location/rep count every time does swing the blame back th...

Re: [OMPI devel] Hang in collectives involving shared memory

2009-06-16 Thread Bryan Lally
Ashley Pittman wrote:
> Do you have a stack trace of your hung application to hand? In particular, when you say "All processes have made the same call to MPI_Allreduce. The processes are all in opal_progress, called (with intervening calls) by MPI_Allreduce." do the intervening calls include mc...
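
The symptom described here, every rank sitting inside MPI_Allreduce, spinning in opal_progress, and stopping at a reproducible repetition count, is the kind of behaviour a small driver loop can demonstrate. A minimal sketch (my own, not code from the thread; the iteration count and reduction are arbitrary):

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, rep;
        double in = 1.0, out = 0.0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Repeat the collective many times.  If the hang is deterministic,
         * the last value printed tells you which iteration the job
         * reached before blocking. */
        for (rep = 0; rep < 100000; rep++) {
            MPI_Allreduce(&in, &out, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
            if (rank == 0 && rep % 1000 == 0) {
                printf("completed rep %d\n", rep);
                fflush(stdout);
            }
        }

        MPI_Finalize();
        return 0;
    }

Once it wedges, attaching gdb to one of the hung ranks ("gdb -p <pid>" followed by "thread apply all bt") produces the kind of stack trace Ashley is asking for.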

Re: [OMPI devel] Enabling debugging and profiling in openMPI (make "CFLAGS=-pg -g")

2009-06-16 Thread Jeff Squyres
On Jun 12, 2009, at 3:56 PM, Ralph Castain wrote:
The firewall should already be solved. Basically, you have to define a set of ports in your firewall that will let TCP messages pass through, and then tell OMPI to use those ports for both the TCP BTL and the OOB. "ompi_info --params btl t...
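
For reference, MCA parameters of the kind Ralph describes can be set in a per-user parameter file instead of on every mpirun line. The snippet below is a sketch only: the btl_tcp_port_min_v4 / btl_tcp_port_range_v4 names and the 10000-10099 range are my assumptions based on the 1.3-era FAQ, and should be confirmed against your install with "ompi_info --param btl tcp" and "ompi_info --param oob tcp".

    # $HOME/.openmpi/mca-params.conf  (hypothetical values)
    # Restrict the TCP BTL to ports 10000-10099, which the firewall allows.
    btl_tcp_port_min_v4   = 10000
    btl_tcp_port_range_v4 = 100
    # The TCP OOB component has analogous port parameters; their exact
    # names vary by release, so list them with "ompi_info --param oob tcp".

The same settings can also be passed directly, e.g. "mpirun --mca btl_tcp_port_min_v4 10000 --mca btl_tcp_port_range_v4 100 ...".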

[OMPI devel] 1.3.3 Release Schedule

2009-06-16 Thread Brad Benton
All: We are close to releasing 1.3.3. This is the current plan:
- Evening of 6/16: collect MTT runs on the current branch w/the current 1.3.3 features & fixes
- If all goes well with the overnight MTT runs, roll a release candidate on 6/17
- Put 1.3.3rc1 through its paces over the next coupl...