Re: [deal.II] Re: Transferring solutions in distributed computing

2016-07-21 Thread Daniel Arndt
Junchao, It seems that the documentation is outdated for this piece of information. In fact, neither PETScWrappers::MPI::Vector nor TrilinosWrappers::MPI::Vector has update_ghost_values. What you should do is exactly what is done in the few lines of step-42 you referenced: "solution = distributed_solution".
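
For reference, a minimal sketch of the pattern discussed here, assuming Trilinos vectors as in step-42; the names dof_handler, mpi_communicator, solution_transfer and constraints_hanging_nodes are placeholders rather than quotes from the thread:

  #include <deal.II/dofs/dof_tools.h>
  #include <deal.II/lac/trilinos_vector.h>

  IndexSet locally_owned_dofs = dof_handler.locally_owned_dofs();
  IndexSet locally_relevant_dofs;
  DoFTools::extract_locally_relevant_dofs(dof_handler, locally_relevant_dofs);

  // Fully distributed vector: only locally owned entries, no ghosts.
  TrilinosWrappers::MPI::Vector distributed_solution(locally_owned_dofs,
                                                     mpi_communicator);

  // Ghosted vector: locally owned plus locally relevant (ghost) entries.
  TrilinosWrappers::MPI::Vector solution(locally_owned_dofs,
                                         locally_relevant_dofs,
                                         mpi_communicator);

  solution_transfer.interpolate(distributed_solution);
  constraints_hanging_nodes.distribute(distributed_solution);

  // Assigning the non-ghosted vector to the ghosted one copies the locally
  // owned values and also imports the ghost values, which is why no explicit
  // update_ghost_values() call is needed here.
  solution = distributed_solution;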

Re: [deal.II] Re: Transferring solutions in distributed computing

2016-07-21 Thread Junchao Zhang
Daniel, The link you provided is very helpful, thanks. In the code, I see

  solution_transfer.interpolate(distributed_solution);
  constraints_hanging_nodes.distribute(distributed_solution);
  solution = distributed_solution;

I am confused by the postprocessing: I think distributed_solution does not have ghost values.

[deal.II] Re: Transferring solutions in distributed computing

2016-07-21 Thread Daniel Arndt
Junchao, You want to use parallel::distributed::SolutionTransfer instead if you are on a parallel::distributed::Triangulation. Executing $ grep -r "parallel::distributed::SolutionTransfer" . in the examples folder tells me that this class is used in step-32, step-42 and step-48. Have, for example, a look at step-42.
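
A minimal sketch of how such a transfer is typically set up on a parallel::distributed::Triangulation, loosely following the step-42 pattern; dim, triangulation, dof_handler, fe, solution and mpi_communicator are placeholders, not names taken from the thread:

  #include <deal.II/distributed/solution_transfer.h>
  #include <deal.II/lac/trilinos_vector.h>

  // Attach the transfer object to the DoFHandler of the old mesh and register
  // the (ghosted) solution vector before the mesh changes.
  parallel::distributed::SolutionTransfer<dim, TrilinosWrappers::MPI::Vector>
    solution_transfer(dof_handler);
  solution_transfer.prepare_for_coarsening_and_refinement(solution);

  triangulation.execute_coarsening_and_refinement();

  // Re-enumerate DoFs on the new mesh and interpolate the stored data into a
  // fully distributed (non-ghosted) vector of the new size.
  dof_handler.distribute_dofs(fe);
  TrilinosWrappers::MPI::Vector distributed_solution(
    dof_handler.locally_owned_dofs(), mpi_communicator);
  solution_transfer.interpolate(distributed_solution);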