Re: [deal.II] Copy blocks of BlockSparseMatrix

2017-10-13 Thread Alex Jarauta
Hi all, I am having a similar problem. I want to take a 3x3 BlockSparseMatrix m33 and copy four of its blocks into a 2x2 BlockSparseMatrix m22. For instance: m33 = [ A B C; D E F; G H I ]. From this matrix, I would like to obtain the following: m22 = [ E F
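[One possible approach, sketched under assumptions not stated in the thread: build a BlockSparsityPattern for m22 from the corresponding blocks of m33's pattern, then copy each block. The choice of the lower-right 2x2 group of blocks is inferred from the truncated question, and sp22/m22 are illustrative names; the pattern must outlive the matrix, since deal.II sparse matrices only store a pointer to their sparsity pattern.]

    // Copy the sparsity structure of the four lower-right blocks of m33.
    BlockSparsityPattern sp22(2, 2);
    for (unsigned int i = 0; i < 2; ++i)
      for (unsigned int j = 0; j < 2; ++j)
        sp22.block(i, j).copy_from(
          m33.get_sparsity_pattern().block(i + 1, j + 1));
    sp22.collect_sizes();

    // Initialize m22 on that pattern and copy the matrix entries block by block.
    BlockSparseMatrix<double> m22;
    m22.reinit(sp22);
    for (unsigned int i = 0; i < 2; ++i)
      for (unsigned int j = 0; j < 2; ++j)
        m22.block(i, j).copy_from(m33.block(i + 1, j + 1));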

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Lucas Campos
Dear Wolfgang, Thank you for your explanation. Currently, I am using a code that was not written by me and that uses the MatrixTools::apply_boundary_values() approach. I will try to change it to use the ConstraintMatrix one. For that, step-40 seems to be the best starting point, as Mark did. Thanks again

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Wolfgang Bangerth
On 10/13/2017 08:39 AM, Lucas Campos wrote: In general, using MatrixTools::apply_boundary_values() is not the way to go with MPI programs. Rather, use a ConstraintMatrix and incorporate the boundary values into the same object as you do with hanging node constraints. This is th
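[A minimal sketch of the constraint-based setup described above, along the lines of step-40. The boundary id 0, ZeroFunction boundary values, and the locally_relevant_dofs index set are placeholders/assumptions, not taken from this thread.]

    // Collect hanging-node constraints and boundary values in one object.
    ConstraintMatrix constraints;
    constraints.reinit(locally_relevant_dofs);   // needed for MPI runs
    DoFTools::make_hanging_node_constraints(dof_handler, constraints);
    VectorTools::interpolate_boundary_values(dof_handler,
                                             0,                   // boundary id
                                             ZeroFunction<dim>(), // boundary values
                                             constraints);
    constraints.close();

    // During assembly, instead of MatrixTools::apply_boundary_values():
    constraints.distribute_local_to_global(cell_matrix, cell_rhs,
                                           local_dof_indices,
                                           system_matrix, system_rhs);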

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Lucas Campos
Dear Bangerth, When you mention "In general, using MatrixTools::apply_boundary_values() is not the way to go with MPI programs. Rather, use a ConstraintMatrix and incorporate the boundary values into the same object as you do with hanging node constraints." This is the way to go due to

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Wolfgang Bangerth
On 10/13/2017 02:06 AM, Mark Ma wrote: later, I changed the control into SolverControl solver_control (5*system_rhs.size(), 1e-12*system_rhs.l2_norm()); this works well for structure sizes of um or nm. I think the previous setting may lead to a loss of precision, so that the results are always i

[deal.II] Re: Error when applying initial values to MPI::Vector in multiple dimensions

2017-10-13 Thread 'Maxi Miller' via deal.II User Group
In addition: Even though it compiles in release mode, as soon as I run it with multiple nodes, I get a segfault at that place: mpirun noticed that process rank 0 with PID 0 on node linux-lb8c exited on signal 11 (Segmentation fault). On Friday, October 13, 2017 at 11:05:01 UTC+2, Maxi Mil

[deal.II] Error when applying initial values to MPI::Vector in multiple dimensions

2017-10-13 Thread 'Maxi Miller' via deal.II User Group
I am trying to apply initial values to a vector defined as LinearAlgebraTrilinos::MPI::Vector using VectorTools::project(dof_handler, hanging_node_constraints, QGauss<dim>(fe.degree+1), InitialValues<dim>(), local_solution); When initializing the variable fe (as FESystem) with one or two components, i
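[For reference, a hedged sketch of one common way to set this up, not taken from the post: the initial-values function declares as many components as the FESystem, and the projection writes into a non-ghosted vector that is then copied into the ghosted local_solution. locally_owned_dofs and mpi_communicator are assumed to exist; if VectorTools::project is not supported for this vector type in your deal.II version, VectorTools::interpolate is a common fallback.]

    // Multi-component initial values: the component count must match the FESystem.
    template <int dim>
    class InitialValues : public Function<dim>
    {
    public:
      InitialValues(const unsigned int n_components)
        : Function<dim>(n_components)
      {}

      virtual double value(const Point<dim> & /*p*/,
                           const unsigned int /*component*/) const override
      {
        return 0.;   // placeholder values
      }
    };

    // Project into a writable, non-ghosted vector, then copy into the
    // ghosted local_solution.
    LinearAlgebraTrilinos::MPI::Vector tmp(locally_owned_dofs, mpi_communicator);
    VectorTools::project(dof_handler,
                         hanging_node_constraints,
                         QGauss<dim>(fe.degree + 1),
                         InitialValues<dim>(fe.n_components()),
                         tmp);
    local_solution = tmp;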

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-13 Thread Mark Ma
It is actually my stupid mistake in the SolverControl: I used SolverControl solver_control (dof_handler.n_dofs(), 1e-12); This works when the geometry size is on the order of 1, but fails at 1e-6 or even 1e-9. Here I actually want to make a micrometer- or nanometer-sized structure. Later, I changed the con
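[In short, the fix described in this thread replaces the absolute solver tolerance with one scaled by the right-hand-side norm, so the stopping criterion no longer depends on the physical size of the geometry. A sketch of the two variants; the variable names are illustrative.]

    // Absolute tolerance (the original setting): too strict, or meaningless,
    // once the geometry and hence the rhs norm are of order 1e-6 .. 1e-9.
    SolverControl absolute_control(dof_handler.n_dofs(), 1e-12);

    // Tolerance scaled by the right-hand-side norm (the corrected setting).
    SolverControl relative_control(5 * system_rhs.size(),
                                   1e-12 * system_rhs.l2_norm());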