Hi, I built PETSc 3.7.1 and petsc4py 3.7.0 (with Open MPI 1.10.2) and ran the examples in the demo directory.
$ python test_mat_ksp.py
=> runs as expected (serial)

$ mpiexec -np 2 python test_mat_ksp.py
=> fails with the following output:

Traceback (most recent call last):
  File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in <module>
    execfile('petsc-ksp.py')
  File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
    try: exec(fh.read()+"\n", globals, locals)
  File "<string>", line 15, in <module>
  File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve (src/petsc4py.PETSc.c:153555)
petsc4py.PETSc.Error: error code 92
[0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
[0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
[0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
[0] MatGetFactor() line 4240 in <...>/src/mat/interface/matrix.c
[0] You cannot overwrite this option since that will conflict with other previously set options
[0] Could not locate solver package (null). Perhaps you must ./configure with --download-(null)
<...>
-------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [[23110,1],0]
  Exit code:    1
--------------------------------------------------------------------------

What have I done wrong?