Junchao,
It seems the documentation is outdated for this piece of information.
In fact, neither PETScWrappers::MPI::Vector nor TrilinosWrappers::MPI::Vector
has an update_ghost_values() function.
What you should do is exactly what is done in the few lines of step-42 you
referenced.
"solution = distributed_solution;"
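To illustrate the point, here is a minimal sketch of that pattern, assuming the Trilinos wrappers and following the structure of step-42; the IndexSet and communicator names are illustrative, not from this thread:

```cpp
#include <deal.II/lac/trilinos_vector.h>

// A fully distributed vector (no ghost entries) for writing:
TrilinosWrappers::MPI::Vector distributed_solution(locally_owned_dofs,
                                                   mpi_communicator);
// A ghosted vector (read-only) for operations that need ghost entries:
TrilinosWrappers::MPI::Vector solution(locally_owned_dofs,
                                       locally_relevant_dofs,
                                       mpi_communicator);

// ... modify distributed_solution ...

// The assignment copies the locally owned values and updates the ghost
// entries of 'solution' through MPI communication -- this replaces an
// explicit update_ghost_values() call:
solution = distributed_solution;
```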
Daniel,
The link you provided is very helpful. Thanks. In the code, I see
solution_transfer.interpolate(distributed_solution);
constraints_hanging_nodes.distribute(distributed_solution);
solution = distributed_solution;
I am confused by the postprocessing. I think distributed_solution does not
have ghost entries.
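For context on why that matters, here is a minimal postprocessing sketch assuming deal.II's DataOut class (variable names are illustrative): routines like this read degrees of freedom on cells at subdomain boundaries that are owned by neighboring processes, so they need the ghosted vector, not the fully distributed one.

```cpp
#include <deal.II/numerics/data_out.h>

DataOut<dim> data_out;
data_out.attach_dof_handler(dof_handler);
// Must be a vector with ghost entries ('solution'), not the
// ghost-free 'distributed_solution':
data_out.add_data_vector(solution, "solution");
data_out.build_patches();
```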
Junchao,
You want to use parallel::distributed::SolutionTransfer instead if you are
working on a parallel::distributed::Triangulation.
Executing
$ grep -r "parallel::distributed::SolutionTransfer" .
in the examples folder tells me that this object is used in step-32,
step-42, and step-48.
Have for exampl
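Putting the pieces of this thread together, the usual transfer pattern can be sketched as follows (deal.II, modeled on step-42; the variable names and the setup_system() helper are illustrative assumptions):

```cpp
#include <deal.II/distributed/solution_transfer.h>

parallel::distributed::SolutionTransfer<dim, TrilinosWrappers::MPI::Vector>
  solution_transfer(dof_handler);

// Before refinement: register the ghosted vector to be transferred.
solution_transfer.prepare_for_coarsening_and_refinement(solution);
triangulation.execute_coarsening_and_refinement();
setup_system(); // redistribute DoFs, reinitialize the vectors

// After refinement: interpolate into a ghost-free vector, apply the
// hanging-node constraints, then copy back (this updates ghost entries):
solution_transfer.interpolate(distributed_solution);
constraints_hanging_nodes.distribute(distributed_solution);
solution = distributed_solution;
```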