Hello,
I am now able to simulate a two-field nonlinear system with FEM, and I
get a good speedup with Dirichlet BC. Thanks for your help.
About Neumann BC: I need to add "-ksp_type fgmres", otherwise the line
search fails (even with -snes_mf).
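For instance, an ex12-style run line combining those options (the
monitor flags are only for illustration):

  ./ex12 -bc_type neumann -interpolate 1 -ksp_type fgmres -snes_monitor -snes_converged_reason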
Moreover, the Neumann version works only in sequential. In parallel,
the boundary conditions are applied inside the domain. I guess this is
because the labels are added automatically after the mesh
partitioning. This problem doesn't appear in ex12 because there is a
test in the plugin function, but it is not possible to do the same thing
with a complex shape. I don't know what should be done.
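One possible workaround is sketched below, assuming the DMPlex label
API of that period (the exact names and signatures of
DMPlexCreateLabel, DMPlexGetLabel, and DMPlexMarkBoundaryFaces should
be checked against your PETSc version): mark the boundary faces on the
serial mesh before DMPlexDistribute(), so the label is migrated
together with the partitioned mesh instead of being rebuilt afterwards.

  DM      dmDist = NULL;
  DMLabel label;

  ierr = DMPlexCreateLabel(dm, "boundary");CHKERRQ(ierr);
  ierr = DMPlexGetLabel(dm, "boundary", &label);CHKERRQ(ierr);
  /* Mark the boundary faces while the mesh is still serial */
  ierr = DMPlexMarkBoundaryFaces(dm, label);CHKERRQ(ierr);
  /* "chaco" matches the --download-chaco configure option used here */
  ierr = DMPlexDistribute(dm, "chaco", 0, &dmDist);CHKERRQ(ierr);
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}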
In the Neumann case, the 1D integral appears only in the residual
(i.e., the right-hand side of the system). Is it done in the
FEMIntegrateBdResidualBatch function?
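(For reference, the standard weak-form bookkeeping behind this
question, nothing PETSc-specific: multiplying -\Delta u + u = 0 by a
test function v and integrating by parts gives

  \int_\Omega (\nabla u \cdot \nabla v + u v) dx - \int_{\partial\Omega} (\nabla u \cdot n) v ds = 0.

If the prescribed flux \nabla u \cdot n = g does not depend on u, the
boundary term \int_{\partial\Omega} g v ds is a fixed load and enters
only the residual; if g depends on u, a matching boundary integral is
also needed in the Jacobian.)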
Thanks
Olivier B
On 10/10/2013 02:37 AM, Matthew Knepley wrote:
On Thu, Oct 3, 2013 at 4:48 AM, Olivier Bonnefon
<olivier.bonne...@avignon.inra.fr> wrote:
Hello,
Thank you for your answer. I'm now able to run ex12.c with
Neumann BC (options -bc_type neumann -interpolate 1).
I have adapted ex12.c to the 2D system:
-\Delta u + u = 0
This consists of adapting the fem->f0Funcs[0] function and adding
the Jacobian function fem->g0Funcs[0].
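For concreteness, a minimal sketch of what such an adaptation could
look like, using the pointwise-function prototypes of the 3.4-era
ex12.c (the exact signatures and the SPATIAL_DIM_0 macro come from the
generated FIAT header, so treat this as an assumption to check against
your copy, not Olivier's actual code):

  /* f0 carries the zeroth-order term (+u) of -\Delta u + u = 0 */
  void f0_u(const PetscScalar u[], const PetscScalar gradU[], const PetscReal x[], PetscScalar f0[])
  {
    f0[0] = u[0];
  }

  /* f1 carries the -\Delta u term, f1_d = (grad u)_d, as in the original ex12 */
  void f1_u(const PetscScalar u[], const PetscScalar gradU[], const PetscReal x[], PetscScalar f1[])
  {
    PetscInt d;
    for (d = 0; d < SPATIAL_DIM_0; ++d) f1[d] = gradU[d];
  }

  /* g0 = d(f0)/d(u) = 1 is the new Jacobian term the +u reaction adds */
  void g0_uu(const PetscScalar u[], const PetscScalar gradU[], const PetscReal x[], PetscScalar g0[])
  {
    g0[0] = 1.0;
  }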
My implementation works for Dirichlet BC. With Neumann BC (options
-bc_type neumann -interpolate 1), the line search fails. I think my
Jacobian functions are correct, because the option "-snes_mf_operator"
leads to the same behavior.
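(One standard petsc-3.4 way to compare a hand-coded Jacobian against a
finite-difference one, independently of the line search; a suggestion,
not part of the original exchange:

  ./ex12 -bc_type neumann -interpolate 1 -snes_type test -snes_test_display
)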
Sorry this took me a long time. Do you mean that it converges with
-snes_mf?
Do you know what I have missed?
In the Neumann case, where is the 1D integral along \partial\Omega added?
You are correct that I am not doing the correct integral for the
Jacobian. I will put it in soon. That
is why it should work with the FD approximation since the residual is
correct.
Thanks,
Matt
Thanks,
Olivier Bonnefon
On 09/26/2013 06:54 PM, Matthew Knepley wrote:
On Thu, Sep 26, 2013 at 6:04 AM, Olivier Bonnefon
<olivier.bonne...@avignon.inra.fr> wrote:
Hello,
I have implemented my own system from ex12.c. It works with
Dirichlet BC, but fails with Neumann BC.
So I went back to the example
src/snes/examples/tutorials/ex12.c and tried it with Neumann BC:
./ex12 -bc_type NEUMANN
Here is the full list of tests I run (just checked that it passes
in 'next'):
https://bitbucket.org/petsc/petsc/src/f34a81fe8510aa025c9247a5b14f0fe30e3c0bed/config/builder.py?at=master#cl-175
Make sure you use an interpolated mesh with Neumann conditions
since you need faces.
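If the mesh is created in code rather than via command-line options,
the interpolation flag can be requested at creation time; a sketch
using the petsc-3.4 signature of DMPlexCreateBoxMesh (an illustration,
not from the original thread):

  /* interpolate = PETSC_TRUE builds the faces and edges that Neumann BC needs */
  ierr = DMPlexCreateBoxMesh(comm, 2, PETSC_TRUE, &dm);CHKERRQ(ierr);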
Matt
This leads to the following crash:
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: No support for this operation for this object type!
[0]PETSC ERROR: Unsupported number of vertices 0 in cell 8 for element geometry computation!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./ex12 on a arch-linux2-c-debug named pcbiom38 by olivierb Thu Sep 26 14:53:32 2013
[0]PETSC ERROR: Libraries linked from /home/olivierb/SOFT/petsc-3.4.2/arch-linux2-c-debug/lib
[0]PETSC ERROR: Configure run at Thu Sep 26 14:44:42 2013
[0]PETSC ERROR: Configure options --with-debugging=1 --download-fiat --download-scientificpython --download-generator --download-triangle --download-ctetgen --download-chaco --download-netcdf --download-hdf5
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: DMPlexComputeCellGeometry() line 732 in /home/olivierb/SOFT/petsc-3.4.2/src/dm/impls/plex/plexgeometry.c
[0]PETSC ERROR: DMPlexComputeResidualFEM() line 558 in /home/olivierb/SOFT/petsc-3.4.2/src/dm/impls/plex/plexfem.c
[0]PETSC ERROR: SNESComputeFunction_DMLocal() line 75 in /home/olivierb/SOFT/petsc-3.4.2/src/snes/utils/dmlocalsnes.c
[0]PETSC ERROR: SNESComputeFunction() line 1988 in /home/olivierb/SOFT/petsc-3.4.2/src/snes/interface/snes.c
[0]PETSC ERROR: SNESSolve_NEWTONLS() line 162 in /home/olivierb/SOFT/petsc-3.4.2/src/snes/impls/ls/ls.c
[0]PETSC ERROR: SNESSolve() line 3636 in /home/olivierb/SOFT/petsc-3.4.2/src/snes/interface/snes.c
[0]PETSC ERROR: main() line 582 in "unknowndirectory/"/home/olivierb/solvers/trunk/SandBox/PETSC/LANDSCAPE/REF/ex12.c
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 56.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
With gdb, I saw that DMPlexGetConeSize returns 0 for the last point.
Have I forgotten a step needed to use Neumann BC?
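A small diagnostic of the same kind can be run without gdb (a sketch;
DMPlexGetHeightStratum and DMPlexGetConeSize are the documented DMPlex
calls):

  PetscInt cStart, cEnd, c, coneSize;

  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells */
  for (c = cStart; c < cEnd; ++c) {
    ierr = DMPlexGetConeSize(dm, c, &coneSize);CHKERRQ(ierr);
    if (!coneSize) {ierr = PetscPrintf(PETSC_COMM_SELF, "cell %D has an empty cone\n", c);CHKERRQ(ierr);}
  }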
Thanks
Olivier Bonnefon
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to
which their experiments lead.
-- Norbert Wiener
--
Olivier Bonnefon
INRA PACA-Avignon, Unité BioSP
Tel: +33 (0)4 32 72 21 58