Hi friends, I have parallelized a code for a thermoelastic problem based on step-40. Although the non-parallel code works for all initial values and boundary conditions, the parallel one fails with this error:

  PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
From the debugger backtrace, the error seems to come from PETScWrappers::SolverCG. Surprisingly, when I change the preconditioner from PreconditionAMG to PreconditionJacobi, the problem converges for a few initial values or boundary conditions (but not for all of them). I would appreciate any clue as to why I get this error. The following is my solve() function:

template <int dim>
unsigned int Solid<dim>::solve()
{
  LA::MPI::Vector completely_distributed_solution(locally_owned_dofs,
                                                  mpi_communicator);

  SolverControl solver_control(dof_handler.n_dofs(), 1e-12);

#ifdef USE_PETSC_LA
  LA::SolverCG solver(solver_control, mpi_communicator);
#else
  LA::SolverCG solver(solver_control);
#endif

  LA::MPI::PreconditionAMG                 preconditioner;
  LA::MPI::PreconditionAMG::AdditionalData data;

#ifdef USE_PETSC_LA
  data.symmetric_operator = true;
  // data.strong_threshold = 0.5;
#else
  /* Trilinos defaults are good */
#endif

  preconditioner.initialize(tangent_matrix, data);
  // LA::MPI::PreconditionJacobi preconditioner(tangent_matrix);

  solver.solve(tangent_matrix,
               completely_distributed_solution,
               system_rhs,
               preconditioner);

  constraints.distribute(completely_distributed_solution);
  solution_update = completely_distributed_solution;

  return solver_control.last_step();
}

Thanks,

--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see https://groups.google.com/d/forum/dealii?hl=en
---
You received this message because you are subscribed to the Google Groups "deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email to dealii+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.