Ahah! That did do the trick. Many thanks!

On Wed, Oct 8, 2014 at 11:57 PM, Dmitry Karpeyev <[email protected]>
wrote:

> libMesh isn't finding the MPI that PETSc installed. Earlier this fact was
> a bit obscured (to me) by your use of --with-mpi=0, I think.  In fact, it
> looks like libMesh isn't really finding the PETSc you built -- just the
> source tree.  Giving the explicit PETSC_ARCH to libMesh's configure should
> do the trick.  In your case it is PETSC_ARCH=arch-linux2-c-debug.
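>
> For reference, that would be the same configure line you used before with the arch added, e.g. something like
>
>   ./configure --prefix=/home/kameeko/software/libmesh_build3 \
>       --enable-everything --enable-petsc --enable-mpi \
>       PETSC_DIR=/home/kameeko/software/petsc \
>       PETSC_ARCH=arch-linux2-c-debug
>
> which should let libMesh find both the PETSc build and the MPI it installed.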
>
> Dmitry
> On Oct 8, 2014 10:00 PM, "Kameeko Kiwi" <[email protected]> wrote:
>
>> I reinstalled PETSc with MPI, using the suggested flag. I now get
>> different errors when I try to compile libMesh with:
>>
>> ./configure --prefix=/home/kameeko/software/libmesh_build3
>> --enable-everything --enable-petsc --enable-mpi
>> PETSC_DIR=/home/kameeko/software/petsc
>>
>> the first of which is:
>>
>> In file included from
>> /home/kameeko/software/petsc/arch-linux2-c-debug/include/mpi.h:2175:0,
>>                  from ./contrib/parmetis/include/parmetis.h:17,
>>                  from src/partitioning/parmetis_partitioner.C:40:
>> /home/kameeko/software/petsc/arch-linux2-c-debug/include/mpicxx.h:2737:34:
>> error: declaration of C function 'void Parmetis::MPI::Init(int&, char**&)'
>> conflicts with
>> /home/kameeko/software/petsc/arch-linux2-c-debug/include/mpicxx.h:2736:13:
>> error: previous declaration 'void Parmetis::MPI::Init()' here
>>
>> I attached the new config logs and the end of the terminal output when I
>> run make for libMesh...
>>
>> On Wed, Oct 8, 2014 at 9:47 PM, Dmitry Karpeyev <[email protected]>
>> wrote:
>>
>>> I should add that unless there is a good reason for using --with-mpi=0,
>>> simply replacing it with --download-mpich should fix this problem.
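>>>
>>> For example, reconfiguring PETSc with something like (keeping whatever other options you already pass)
>>>
>>>   ./configure --download-mpich
>>>
>>> and then rebuilding with the make command configure prints at the end.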
>>>
>>> Dmitry.
>>>
>>> On Wed, Oct 8, 2014 at 8:34 PM, Dmitry Karpeyev <[email protected]>
>>> wrote:
>>>
>>>> The problem seems to stem from the fact that you are configuring PETSc
>>>> with --with-mpi=0.
>>>> Is this what you want?
>>>>
>>>> This means no MPI is used by PETSc.  libMesh detects that fact, and
>>>> configures itself without MPI.
>>>> However, ParMETIS (distributed with libMesh) doesn't seem to know this
>>>> and attempts to use the
>>>> mpiuni headers from PETSc, as if they were the real MPI headers (I'm
>>>> not sure exactly how ParMETIS
>>>> is configured in this case and how it locates the mpiuni mpi.h
>>>> header).  In any event, the mpiuni mpi.h isn't
>>>> meant to be included by external libraries (e.g., ParMETIS) as an
>>>> "implementation" of MPI.
>>>>
>>>> Should mpiuni/mpi.h include petscsys.h so that PETSC_EXTERN is
>>>> defined?  That's a side question for
>>>> PETSc, but I don't see how ParMETIS would work without a real MPI, so
>>>> your solution may be to turn off ParMETIS (switch to METIS?) if you
>>>> want to continue using --with-mpi=0.
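>>>>
>>>> If libMesh's configure takes the usual autoconf-style switch for this (check ./configure --help; I'd guess --disable-parmetis), that would look roughly like
>>>>
>>>>   ./configure --prefix=/home/kameeko/software/libmesh_build3 \
>>>>       --enable-everything --disable-parmetis --disable-laspack --enable-petsc \
>>>>       PETSC_DIR=/home/kameeko/software/petsc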
>>>>
>>>> Dmitry.
>>>>
>>>> On Wed, Oct 8, 2014 at 7:57 PM, Kameeko Kiwi <[email protected]> wrote:
>>>>
>>>>> I attached the configure logs and terminal output.
>>>>>
>>>>> The error I gave was the first error. I didn't see any complaints
>>>>> about petscconf.h, so I'm guessing it was included successfully. The PETSc
>>>>> "make test" was successful. I don't believe I have multiple versions of
>>>>> PETSc on my path, since this was my first time installing it.
>>>>>
>>>>> On Wed, Oct 8, 2014 at 7:30 PM, Dmitry Karpeyev <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Can you send PETSc's configure.log and libMesh's config.log? Even
>>>>>> better if you could send the terminal output of your libMesh configure
>>>>>> run (redirected to something like libmesh_configure.log).
>>>>>>
>>>>>> It looks like you are configuring PETSc or libMesh (or both) in a way
>>>>>> that no MPI is found and PETSc's internal "fake MPI" (aka mpiuni) is 
>>>>>> being
>>>>>> used. That could be causing problems for libMesh, at least the way you 
>>>>>> got
>>>>>> it configured.
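>>>>>>
>>>>>> One quick way to check is to look at the petscconf.h that PETSc's configure generated, e.g. something like
>>>>>>
>>>>>>   grep MPIUNI $PETSC_DIR/$PETSC_ARCH/include/petscconf.h
>>>>>>
>>>>>> (adjust the path to wherever your petscconf.h actually lives). If it shows PETSC_HAVE_MPIUNI, the fake MPI is what got built.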
>>>>>>
>>>>>> Dmitry.
>>>>>> On Oct 8, 2014 6:03 PM, "Kameeko Kiwi" <[email protected]> wrote:
>>>>>>
>>>>>>> Hello,
>>>>>>>
>>>>>>> I'm trying to compile libMesh with the latest version of PETSc
>>>>>>> (3.5.2), using the following command:
>>>>>>>
>>>>>>> ./configure --prefix=/home/kameeko/software/libmesh_build3
>>>>>>> --enable-everything --enable-petsc --disable-laspack
>>>>>>> PETSC_DIR=/home/kameeko/software/petsc
>>>>>>>
>>>>>>> Running make afterwards gives many errors, all similar to
>>>>>>>
>>>>>>> /home/kameeko/software/petsc/include/mpiuni/mpi.h:120:14: error:
>>>>>>> expected
>>>>>>> '=', ',', ';', 'asm' or '__attribute__' before 'void'
>>>>>>>
>>>>>>> Thanks.
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>