Re: [OMPI users] Configure fails with icc 10.1.008

2007-12-07 Thread de Almeida, Valmor F.
Eric, I see you are using a Gentoo distro like me. My version uses the vanilla kernel 2.6.22.9 and gcc-4.1.2. I have the following Intel compiler versions installed: 10.0.026, 10.1.008, 10.1.011, 9.1.052. None of them is able to build a functional version of openmpi-1.2.4. I've been posting thi

Re: [OMPI users] Configure fails with icc 10.1.008

2007-12-07 Thread Eric Thibodeau
Jeff, Thanks...at 23h30 coffee is far off... I saw the proper section of the config.log showing exactly that (hello world not working). For everyone else's benefit, ICC (up to 10.1.008) is _not_ compatible with GCC 4.2... (guess I'll have to retro back to 4.1 series...) Eric Jeff Squyres

Re: [OMPI users] Configure fails with icc 10.1.008

2007-12-07 Thread Chris Slaughter
I've been using Open MPI 1.2.4 with Intel 10.1 for about a month now with no problems. Can you compile a simple C++ hello world type program? I would try this to verify the compiler installation... On Dec 7, 2007 7:58 AM, Jeff Squyres wrote: > This is not an Open MPI problem; Open MPI is simply

Re: [OMPI users] Using mtrace with openmpi segfaults

2007-12-07 Thread Jeff Squyres
I'm not sure what using mallopt would do when combined with Open MPI's ptmalloc, but I can't imagine that it would be anything good. If you want users to be able to use mallopt, you should probably disable Open MPI's ptmalloc. On Dec 6, 2007, at 10:16 AM, Jeffrey M Ceason wrote: Is ther

Re: [OMPI users] Configure fails with icc 10.1.008

2007-12-07 Thread Jeff Squyres
This is not an Open MPI problem; Open MPI is simply reporting that your C++ compiler is not working. OMPI tests a trivial C++ program that uses the STL to ensure that your C++ compiler is working. It's essentially: #include <string> int main() { std::string foo = "Hello, world"; return 0; }

Re: [OMPI users] MPI_Bcast not broadcast to all processes

2007-12-07 Thread alireza ghahremanian
Dear Jeff, I want to send an integer vector of size 4000. It is a very confusing problem. --- Jeff Squyres wrote: > If you're seeing the same error from 2 entirely > different MPI > implementations, it is possible that it is an error > in your code. > > Ensure that all processes are calling MP
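Jeff's advice (make sure every rank calls the collective with matching arguments) for a 4000-element integer broadcast can be sketched as follows; this is an illustrative program, not code from the thread, and it requires an MPI installation and mpirun to build and run:

```c
/* bcast4000.c -- every rank must call MPI_Bcast with the same
 * buffer count, datatype, root, and communicator.
 * Build/run (illustrative): mpicc bcast4000.c -o bcast4000
 *                           mpirun -np 4 ./bcast4000
 */
#include <mpi.h>

int main(int argc, char **argv) {
    int rank, i;
    int vec[4000];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Only the root fills the buffer... */
    if (rank == 0)
        for (i = 0; i < 4000; i++)
            vec[i] = i;

    /* ...but ALL ranks make this identical call; a mismatch in
     * count, type, or root on any rank causes errors or hangs. */
    MPI_Bcast(vec, 4000, MPI_INT, 0, MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}
```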