I am seeing several Cart Fortran tests (like MPI_Cart_coords_f) segv in
opal_memory_ptmalloc2_int_free when OMPI trunk is compiled with icc
12.1.0 for 64 bit on linux. Just wondering if anyone has seen anything
similar to this with a different version of icc. Other non-Intel
compilers seem to be fine.
Per Brian's recent MTL updates, the PSM MTL is busted. I notice the following
when I run on a machine that has the PSM software stack installed:
[ompi_r00lez:19108] mca: base: component_find: unable to open
/scratch/local/jsquyres/bogus/lib/openmpi/mca_mtl_psm:
/scratch/local/jsquyres/bogus/li
On 5/24/12 8:55 AM, "Jeff Squyres" wrote:
>Per Brian's recent MTL updates, the PSM MTL is busted. I notice the
>following when I run on a machine that has the PSM software stack
>installed:
>
>[ompi_r00lez:19108] mca: base: component_find: unable to open
>/scratch/local/jsquyres/bogus/lib/openmp
On May 24, 2012, at 12:07 PM, Barrett, Brian W wrote:
>> I'll file a bug about this; I'm assuming this is an issue the 1.7 RMs
>> will care about.
>
> Did you file a bug? Ralph fixed this one and I fixed its sister in probe,
> so all should work now...
Yep -- Ralph closed it.
Thanks!
--
Jeff
Terry,
What you are seeing is a bug in the vectorizer in the Intel 2011.6.233 release.
We've talked about this before. You should probably remove that compiler from
your system(s). I think the new release of Open MPI describes this problem, but
does not stop it from occurring. I wrote a patch
Actually, I don't think the below is the issue. I think the
OMPI_ARRAY_INT_2_LOGICAL macro is doing a free on line 193 when it
shouldn't, because the OMPI_ARRAY_LOGICAL_2_INT macro calls an empty
OMPI_ARRAY_LOGICAL_2_INT_ALLOC macro, whereas in the other case that
macro actually does a malloc.
Forgot to add that the date of my compiler was 2011.10.11, so I wonder if it
might not have the issue you mentioned below. Anyway, I'll keep the
below in mind as I try to run more tests.
thanks,
--td
On 5/24/2012 2:06 PM, Larry Baker wrote:
Terry,
What you are seeing is a bug in the vectorizer
FYI.
I think I have fixes ready, but I am bummed that we didn't fix the whole
paffinity mess properly in 1.6. :-(
Begin forwarded message:
> From: Open MPI
> Subject: [Open MPI] #3108: Affinity still busted in v1.6
> Date: May 24, 2012 2:59:42 PM EDT
> Cc:
>
> #3108: Affinity still busted i
FWIW, I think we imported your patch a while ago. Here it is on the trunk:
https://svn.open-mpi.org/trac/ompi/browser/trunk/opal/mca/memory/linux/malloc.c#L3933
And here it is on v1.6:
https://svn.open-mpi.org/trac/ompi/browser/branches/v1.6/opal/mca/memory/linux/malloc.c#L3933
On May 24, 20
Dang! You're absolutely right! Mea culpa.
Thanks,
Larry Baker
US Geological Survey
650-329-5608
ba...@usgs.gov
On 24 May 2012, at 1:19 PM, Jeff Squyres wrote:
> FWIW, I think we imported your patch a while ago. Here it is on the trunk:
>
> https://svn.open-mpi.org/trac/ompi/browser/trunk/