Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Lucas Campos
Dear Wolfgang,

Thank you for your explanation. Currently I am using a code that was not
written by me and that uses the MatrixTools::apply_boundary_values() approach.
I will try to change it to use the ConstraintMatrix approach instead. For that,
step-40 seems to be the best starting point, as Mark did (see the sketch after
the quoted message below).

Thanks again,
Lucas



On 13 October 2017 at 16:43, Wolfgang Bangerth wrote:

> On 10/13/2017 08:39 AM, Lucas Campos wrote:
>
>>
>> In general, using MatrixTools::apply_boundary_values() is not the
>> way to go
>> with MPI programs. Rather, use a ConstraintMatrix and incorporate the
>> boundary
>> values into the same object as you do with hanging node constraints.
>>
>>
>> This is the way to go due to correctness, or in the sense of scalability?
>>
>
> MatrixTools::apply_boundary_values() needs access to the elements of the
> matrix because it wants to modify elements after they have already been
> written into the matrix. That is already difficult if the matrix is owned
> by PETSc or Trilinos -- we can get access to these elements, but it is not
> efficient to do so.
>
> But the bigger issue is that the function wants to access elements not
> only for the rows of constrained DoFs, but also for the columns. That means
> that you may have to access elements that are actually stored on other
> processors -- something that cannot be done efficiently. Consequently,
> MatrixTools::apply_boundary_values() does not attempt to eliminate
> columns of the matrix, and you will end up with a non-symmetric matrix even
> if your problem is symmetric.
>
> It is better to use the approach via ConstraintMatrix that deals with
> entries before they even get into the matrix (and therefore in particular
> before matrix entries are sent to other processors).
>
> Best
>  W.
>
> --
> 
> Wolfgang Bangerth  email: bange...@colostate.edu
>
>www: http://www.math.colostate.edu/~bangerth/
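For concreteness, here is a minimal sketch of the ConstraintMatrix-based
pattern described in the quoted message, assuming a scalar problem with
Dirichlet data on boundary id 0 and step-40-style variable names
(BoundaryValues<dim> is a placeholder class, not code from this thread):

    ConstraintMatrix constraints;
    constraints.clear ();
    constraints.reinit (locally_relevant_dofs);
    DoFTools::make_hanging_node_constraints (dof_handler, constraints);
    VectorTools::interpolate_boundary_values (dof_handler,
                                              0,                      // boundary id
                                              BoundaryValues<dim>(),  // Dirichlet data
                                              constraints);
    constraints.close ();

    // In the assembly loop, constrained rows and columns are eliminated
    // before the entries ever reach the distributed matrix:
    constraints.distribute_local_to_global (cell_matrix, cell_rhs,
                                            local_dof_indices,
                                            system_matrix, system_rhs);

    // compress() is called once, after the whole assembly loop:
    system_matrix.compress (VectorOperation::add);
    system_rhs.compress (VectorOperation::add);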



Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Lucas Campos
Dear Bangerth,

When you mention 

> In general, using MatrixTools::apply_boundary_values() is not the way to go
> with MPI programs. Rather, use a ConstraintMatrix and incorporate the boundary
> values into the same object as you do with hanging node constraints.


Is this the way to go for correctness reasons, or for scalability?

Best,
Lucas

On Tuesday, 10 October 2017 17:48:57 UTC+2, Wolfgang Bangerth wrote:
>
> On 10/10/2017 08:40 AM, Mark Ma wrote: 
> > 
> > I want to solve a heat equation in the time domain with distributed memory
> > using MPI, but the results are incorrect. In order to do so, I referenced
> > tutorial step-23 for the time-updating method and step-40 for implementing
> > MPI. May I ask whether my boundary condition is right or not? Should we do
> > compress() after apply_boundary_values()? Thanks in advance!
>
> Jack -- how exactly is your solution wrong when you look at it? Do the 
> boundary values look wrong? Are they correct if you run your MPI program 
> with 
> just one MPI process? 
>
> In general, using MatrixTools::apply_boundary_values() is not the way to 
> go 
> with MPI programs. Rather, use a ConstraintMatrix and incorporate the 
> boundary 
> values into the same object as you do with hanging node constraints. 
> That's 
> what all of the parallel programs do, if I recall correctly. 
>
> Best 
>   W. 
>
> -- 
>  
> Wolfgang Bangerth  email: bang...@colostate.edu 
>  
> www: http://www.math.colostate.edu/~bangerth/ 
>
>



Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Wolfgang Bangerth

On 10/13/2017 02:06 AM, Mark Ma wrote:


later, I changed the control into

    SolverControl solver_control (5*system_rhs.size(), 1e-12*system_rhs.l2_norm());

this works well for structure sizes of um or nm. I think the previous setting may
lead to a loss of precision so that the results are always incorrect.


Yes, indeed -- using a tolerance relative to the size of the rhs vector is 
important.
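
For reference, a minimal sketch of the full solve step using such a relative
tolerance, in step-17/step-40 style (names such as mpi_communicator,
completely_distributed_solution, and the choice of preconditioner are
assumptions, not taken from the code discussed here):

    SolverControl solver_control (5*system_rhs.size(),
                                  1e-12*system_rhs.l2_norm());
    PETScWrappers::SolverCG solver (solver_control, mpi_communicator);
    PETScWrappers::PreconditionBlockJacobi preconditioner (system_matrix);

    solver.solve (system_matrix, completely_distributed_solution,
                  system_rhs, preconditioner);
    constraints.distribute (completely_distributed_solution);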


Best
 W.

--

Wolfgang Bangerth  email: bange...@colostate.edu
   www: http://www.math.colostate.edu/~bangerth/



[deal.II] Re: Error when applying initial values to MPI::Vector in multiple dimensions

2017-10-13 Thread 'Maxi Miller' via deal.II User Group
Additionally: even though it compiles in release mode, as soon as I run it
with multiple nodes I get a segfault at the same place:

mpirun noticed that process rank 0 with PID 0 on node linux-lb8c exited on
signal 11 (Segmentation fault).




On Friday, 13 October 2017 at 11:05:01 UTC+2, Maxi Miller wrote:
>
> I try to apply initial values to a vector defined as
> LinearAlgebraTrilinos::MPI::Vector using
>
> VectorTools::project (dof_handler, hanging_node_constraints,
>                       QGauss<dim>(fe.degree+1),
>                       InitialValues<dim>(),
>                       local_solution);
>
>
>
> When initializing the variable fe (as FESystem) with one or two 
> components, it works fine. For more than two components I get the error
>
>  
> An error occurred in line <1366> of file 
> <~/Downloads/dealii/include/deal.II/numerics/vector_tools.templates.h> in 
> function 
> void dealii::VectorTools::{anonymous}::project(const 
> dealii::Mapping&, const dealii::DoFHandler&, const 
> dealii::ConstraintMatrix&, const dealii::Quadrature&, const 
> dealii::Function&, VectorType&, bool, 
> const dealii
> ::Quadrature<(dim - 1)>&, bool) [with VectorType = 
> dealii::TrilinosWrappers::MPI::Vector; int dim = 2; typename 
> VectorType::value_type = double] 
> The violated condition was:  
> (dynamic_cast* > 
> (&(dof.get_triangulation()))==nullptr) 
> Additional information:
> You are trying to use functionality in deal.II that is currently not
> implemented. In many cases, this indicates that there simply didn't appear
> much of a need for it, or that the author of the original code did not have
> the time to implement a particular case. If you hit this exception, it is
> therefore worth the time to look into the code to find out whether you may be
> able to implement the missing functionality. If you do, please consider
> providing a patch to the deal.II development sources (see the deal.II website
> on how to contribute).
>  
> Stacktrace: 
> --- 
> #0  /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre:  
> #1  /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre: void 
> dealii::VectorTools::project<2, dealii::TrilinosWrappers::MPI::Vector, 
> 2>(dealii::DoFHandler<2, 2> const&, dealii::ConstraintMatrix const&, 
> dealii::Quadrature<2> const&, dealii::Function<2, 
> dealii::TrilinosWrappers::MPI
> ::Vector::value_type> const&, dealii::TrilinosWrappers::MPI::Vector&, bool, 
> dealii::Quadrature<(2)-(1)> const&, bool) 
> #2  ./main: Step15::MinimalSurfaceProblem<2>::run() 
> #3  ./main: main 
>  
>  
> [linux-lb8c:15830] *** Process received signal *** 
> [linux-lb8c:15830] Signal: Aborted (6) 
> [linux-lb8c:15830] Signal code:  (-6) 
> [linux-lb8c:15830] [ 0] /lib64/libpthread.so.0(+0x12270)[0x7f294a477270] 
> [linux-lb8c:15830] [ 1] /lib64/libc.so.6(gsignal+0x110)[0x7f2946c1f0d0] 
> [linux-lb8c:15830] [ 2] /lib64/libc.so.6(abort+0x151)[0x7f2946c206b1] 
> [linux-lb8c:15830] [ 3] 
> /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(+0x6b9e5d1)[0x7f295b49e5d1] 
> [linux-lb8c:15830] [ 4] 
> /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(_ZN6dealii18deal_II_exceptions9internals5abortERKNS_13ExceptionBaseE+0x1a)[0x7f295b49edaf]
>  
> [linux-lb8c:15830] [ 5] 
> /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(_ZN6dealii18deal_II_exceptions9internals11issue_errorINS_18StandardExceptions17ExcNotImplementedEEEvNS1_17ExceptionHandlingEPKciS7_S7_S7_T_+0x98)[0x7f2957373ea1]
>  
> [linux-lb8c:15830] [ 6] 
> /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(+0x3f38e23)[0x7f2958838e23] 
> [linux-lb8c:15830] [ 7] 
> /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(_ZN6dealii11VectorTools7projectILi2ENS_16TrilinosWrappers3MPI6VectorELi2EEEvRKNS_10DoFHandlerIXT_EXT1_EEERKNS_16ConstraintMatrixERKNS_10QuadratureIXT_EEERKNS_8FunctionIXT1_ENT0_10value_typeEEERSH_bRKNSC_IX
> miT_Li1b+0x2f)[0x7f295894906e] 
> [linux-lb8c:15830] [ 8] 
> ./main(_ZN6Step1521MinimalSurfaceProblemILi2EE3runEv+0xc08)[0x420d08] 
> [linux-lb8c:15830] [ 9] ./main(main+0x3c)[0x414ad0] 
> [linux-lb8c:15830] [10] 
> /lib64/libc.so.6(__libc_start_main+0xea)[0x7f2946c09f4a] 
> [linux-lb8c:15830] [11] ./main(_start+0x2a)[0x41477a] 
> [linux-lb8c:15830] *** End of error message *** 
> Aborted (core dumped)
>
> when running in debug mode. It runs fine in release mode. Why does that 
> happen for more than two components, and how can I fix/circumvent that? Or 
> did I (again) forget something? 
>
> My minimal example is attached; the behaviour appears when NUM_COMPONENTS,
> set via
>
> #define NUM_COMPONENTS 100
>
> is given a value larger than 2.
>
>
> Thank you!
>
>


[deal.II] Error when applying initial values to MPI::Vector in multiple dimensions

2017-10-13 Thread 'Maxi Miller' via deal.II User Group
I try to apply initial values to a vector defined as
LinearAlgebraTrilinos::MPI::Vector using

VectorTools::project (dof_handler, hanging_node_constraints,
                      QGauss<dim>(fe.degree+1),
                      InitialValues<dim>(),
                      local_solution);



When initializing the variable fe (as FESystem) with one or two 
components, it works fine. For more than two components I get the error

 
An error occurred in line <1366> of file 
<~/Downloads/dealii/include/deal.II/numerics/vector_tools.templates.h> in 
function 
void dealii::VectorTools::{anonymous}::project(const dealii::Mapping&, 
const dealii::DoFHandler&, const dealii::ConstraintMatrix&, const 
dealii::Quadrature&, const dealii::Function&, VectorType&, bool, const dealii
::Quadrature<(dim - 1)>&, bool) [with VectorType = 
dealii::TrilinosWrappers::MPI::Vector; int dim = 2; typename 
VectorType::value_type = double] 
The violated condition was:  
(dynamic_cast* > 
(&(dof.get_triangulation()))==nullptr) 
Additional information:
You are trying to use functionality in deal.II that is currently not
implemented. In many cases, this indicates that there simply didn't appear much
of a need for it, or that the author of the original code did not have the time
to implement a particular case. If you hit this exception, it is therefore worth
the time to look into the code to find out whether you may be able to implement
the missing functionality. If you do, please consider providing a patch to the
deal.II development sources (see the deal.II website on how to contribute).
 
Stacktrace: 
--- 
#0  /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre:  
#1  /opt/dealII/lib/libdeal_II.g.so.9.0.0-pre: void 
dealii::VectorTools::project<2, dealii::TrilinosWrappers::MPI::Vector, 
2>(dealii::DoFHandler<2, 2> const&, dealii::ConstraintMatrix const&, 
dealii::Quadrature<2> const&, dealii::Function<2, dealii::TrilinosWrappers::MPI
::Vector::value_type> const&, dealii::TrilinosWrappers::MPI::Vector&, bool, 
dealii::Quadrature<(2)-(1)> const&, bool) 
#2  ./main: Step15::MinimalSurfaceProblem<2>::run() 
#3  ./main: main 
 
 
[linux-lb8c:15830] *** Process received signal *** 
[linux-lb8c:15830] Signal: Aborted (6) 
[linux-lb8c:15830] Signal code:  (-6) 
[linux-lb8c:15830] [ 0] /lib64/libpthread.so.0(+0x12270)[0x7f294a477270] 
[linux-lb8c:15830] [ 1] /lib64/libc.so.6(gsignal+0x110)[0x7f2946c1f0d0] 
[linux-lb8c:15830] [ 2] /lib64/libc.so.6(abort+0x151)[0x7f2946c206b1] 
[linux-lb8c:15830] [ 3] 
/opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(+0x6b9e5d1)[0x7f295b49e5d1] 
[linux-lb8c:15830] [ 4] 
/opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(_ZN6dealii18deal_II_exceptions9internals5abortERKNS_13ExceptionBaseE+0x1a)[0x7f295b49edaf]
 
[linux-lb8c:15830] [ 5] 
/opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(_ZN6dealii18deal_II_exceptions9internals11issue_errorINS_18StandardExceptions17ExcNotImplementedEEEvNS1_17ExceptionHandlingEPKciS7_S7_S7_T_+0x98)[0x7f2957373ea1]
 
[linux-lb8c:15830] [ 6] 
/opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(+0x3f38e23)[0x7f2958838e23] 
[linux-lb8c:15830] [ 7] 
/opt/dealII/lib/libdeal_II.g.so.9.0.0-pre(_ZN6dealii11VectorTools7projectILi2ENS_16TrilinosWrappers3MPI6VectorELi2EEEvRKNS_10DoFHandlerIXT_EXT1_EEERKNS_16ConstraintMatrixERKNS_10QuadratureIXT_EEERKNS_8FunctionIXT1_ENT0_10value_typeEEERSH_bRKNSC_IX
miT_Li1b+0x2f)[0x7f295894906e] 
[linux-lb8c:15830] [ 8] 
./main(_ZN6Step1521MinimalSurfaceProblemILi2EE3runEv+0xc08)[0x420d08] 
[linux-lb8c:15830] [ 9] ./main(main+0x3c)[0x414ad0] 
[linux-lb8c:15830] [10] 
/lib64/libc.so.6(__libc_start_main+0xea)[0x7f2946c09f4a] 
[linux-lb8c:15830] [11] ./main(_start+0x2a)[0x41477a] 
[linux-lb8c:15830] *** End of error message *** 
Aborted (core dumped)

when running in debug mode. It runs fine in release mode. Why does that happen 
for more than two components, and how can I fix/circumvent that? Or did I 
(again) forget something? 

My minimal example is attached; the behaviour appears when NUM_COMPONENTS,
set via

#define NUM_COMPONENTS 100

is given a value larger than 2.


Thank you!
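
A sketch of one common alternative, in case it is useful: on a
parallel::distributed::Triangulation, initial values can often be set with
VectorTools::interpolate on a non-ghosted vector instead of
VectorTools::project, which sidesteps the code path that triggers the
assertion above. The variable names below follow step-40 and are assumptions,
not taken from the attached example:

    // 'fe' has NUM_COMPONENTS components, so the InitialValues function
    // must report the same number of components.
    FESystem<dim> fe (FE_Q<dim>(2), NUM_COMPONENTS);

    LinearAlgebraTrilinos::MPI::Vector distributed_solution (locally_owned_dofs,
                                                             mpi_communicator);
    VectorTools::interpolate (dof_handler,
                              InitialValues<dim>(),
                              distributed_solution);
    hanging_node_constraints.distribute (distributed_solution);
    local_solution = distributed_solution;   // copy into the ghosted vector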

[Attachment: minimal example source file; its #include directives were stripped by the mailing-list archive.]

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Mark Ma
It was actually my stupid mistake in the SolverControl settings.

I used

    SolverControl solver_control (dof_handler.n_dofs(), 1e-12);

This works when the geometry size is of the order of 1, but fails at 1e-6 or
even 1e-9. Here I actually want to model a micrometer- or nanometer-sized
structure.

Later, I changed the control into

    SolverControl solver_control (5*system_rhs.size(), 1e-12*system_rhs.l2_norm());

This works well for structure sizes of um or nm. I think the previous setting
may lead to a loss of precision, so that the results were always incorrect.

Thanks for your ideas and have a nice day,
Mark

2017-10-12 14:29 GMT+02:00 Wolfgang Bangerth:

> On 10/12/2017 03:02 AM, Mark Ma wrote:
>
>>
>> Thanks for your useful advice. Finally I solved this problem.
>>
>
> So what was the problem? Maybe we can learn from your example?
>
>
> Best
>  W.
>
> --
> 
> Wolfgang Bangerth  email: bange...@colostate.edu
>www: http://www.math.colostate.edu/~bangerth/
>
> --
> The deal.II project is located at http://www.dealii.org/
> For mailing list/forum options, see https://groups.google.com/d/fo
> rum/dealii?hl=en
> --- You received this message because you are subscribed to a topic in the
> Google Groups "deal.II User Group" group.
> To unsubscribe from this topic, visit https://groups.google.com/d/to
> pic/dealii/5hC7jODg-7k/unsubscribe.
> To unsubscribe from this group and all its topics, send an email to
> dealii+unsubscr...@googlegroups.com.
> For more options, visit https://groups.google.com/d/optout.
>
