Jed, sorry, I overlooked this mail. My other option is to use MPICH. Do you have any idea whether this will work?
Thomas

On 17.10.2012 22:21, Jed Brown wrote:
> As I said in the last mail, for the time being you have to use a
> non-broken MPI. I listed the tickets. Open MPI has had multiple
> reduced test cases for several years now, but they haven't done
> anything to fix it yet.
>
> I will write an alternate implementation of PetscSF that does not use
> one-sided, but until then, you need a working MPI for the new
> MatTranspose().
>
> On Wed, Oct 17, 2012 at 2:57 PM, Thomas Witkowski
> <thomas.witkowski at tu-dresden.de> wrote:
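For reference, one common way to switch to MPICH is to let PETSc's configure script download and build it itself via the `--download-mpich` option. This is only a sketch: the `PETSC_ARCH` name and the `--with-debugging` flag below are illustrative choices, not something prescribed in this thread.

```shell
# Sketch: build PETSc against MPICH instead of the system Open MPI.
# --download-mpich is a standard PETSc configure option; the arch name
# and debugging flag are illustrative assumptions.
cd "$PETSC_DIR"
./configure PETSC_ARCH=arch-mpich-dbg \
    --download-mpich \
    --with-debugging=1
make PETSC_ARCH=arch-mpich-dbg all
```

After this, building and running your application with `PETSC_ARCH=arch-mpich-dbg` uses the freshly built MPICH (including its `mpiexec` under `$PETSC_DIR/arch-mpich-dbg/bin`) rather than the installed Open MPI.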
