[deal.II] Re: Error when writing files using write_vtu_header

2017-10-16 Thread Denis Davydov
AFAIK there is no known problem. Make sure you write this only from a single MPI core, though. Otherwise you will have multiple MPI cores trying to write into the same file. Regards, Denis. On Tuesday, October 17, 2017 at 8:43:55 AM UTC+2, Maxi Miller wrote: > > > Sometimes when running my progr
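A minimal sketch of the single-writer pattern described above, assuming a DataOut object named data_out, an MPI communicator mpi_communicator, and the file name solution.vtu (all of these names are assumptions, not taken from the original post):

    #include <deal.II/base/mpi.h>
    #include <fstream>

    // Only one rank (here rank 0) opens and writes the shared file;
    // every other rank skips the output entirely.
    if (dealii::Utilities::MPI::this_mpi_process(mpi_communicator) == 0)
      {
        std::ofstream output("solution.vtu");
        data_out.write_vtu(output);
      }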

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Wolfgang Bangerth
On 10/16/2017 01:53 PM, Mark Ma wrote: I think this problem lies in the time update of the solution using old_solution: since the mass_matrix and laplace_matrix have already had the constrained nodes eliminated, mass_matrix_T.vmult(system_rhs, old_solution_T_cal); is no longer valid for this
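For context, the line in question is part of a theta-scheme update of the heat equation; the following is a sketch under assumed names (time_step, theta, and tmp are not from the original code):

    // Right-hand side of the theta-scheme,
    //   rhs = (M - k*(1 - theta)*A) T^{n-1},
    // written with matrix-vector products. This is only correct if M and A
    // are the *unconstrained* mass and Laplace matrices; if the Dirichlet
    // rows/columns were already eliminated during assembly, these products
    // no longer carry the boundary information.
    mass_matrix_T.vmult(system_rhs, old_solution_T_cal);
    laplace_matrix_T.vmult(tmp, old_solution_T_cal);
    system_rhs.add(-time_step * (1. - theta), tmp);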

Re: [deal.II] Copy blocks of BlockSparseMatrix

2017-10-16 Thread Alex Jarauta
Hi Daniel, thanks for your kind reply. I apologize for not giving more details about my code. In the file where the global matrix (m33) and its sparsity pattern are created, I also include the declaration and initialization of the new matrix (m22). After that, the new matrix is of size 2x2 bloc
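A minimal sketch of copying a group of blocks out of the 3x3 block matrix (m33) into the 2x2 one (m22); it assumes m22 has been reinit-ed on sparsity patterns matching the corresponding blocks of m33, and the block indices below are only an example:

    // Copy the lower-right 2x2 group of blocks of m33 into m22.
    // SparseMatrix::copy_from requires that the sparsity patterns match.
    for (unsigned int i = 0; i < 2; ++i)
      for (unsigned int j = 0; j < 2; ++j)
        m22.block(i, j).copy_from(m33.block(i + 1, j + 1));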

Re: [deal.II] Need for simple applied examples

2017-10-16 Thread Mark Ma
Hi Konstantin, It has been a very long time since we were last in contact; how are you doing? I am sorry, but I really do not have much time to spend on the deal.II project, the Maxwell equation problem, and so on. What you propose here I think is right, at least from the point of view of a very beginner of deal.II like m

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Mark Ma
Source code is attached. On Monday, October 16, 2017 at 9:14:13 PM UTC+2, Wolfgang Bangerth wrote: > > On 10/16/2017 09:23 AM, Mark Ma wrote: > > It almost looks to me like you're applying both Dirichlet values (via the ConstraintMatrix) and Neumann boundary values (via a boundary integral). But then

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Mark Ma
> On 10/16/2017 09:23 AM, Mark Ma wrote: > > It almost looks to me like you're applying both Dirichlet values (via the ConstraintMatrix) and Neumann boundary values (via a boundary integral). But then I haven't taken a look at the code, so I can't really say for sure.

Re: [deal.II] Re: Error when applying initial values to MPI::Vector in multiple dimensions

2017-10-16 Thread Denis Davydov
interpolate just evaluates the function at the support points of the FE basis (assuming that you have one with support points) and sets those values to DoFs, whereas project (as the name implies) does an L2 projection. Thus, as others have mentioned, you are solving Mx = U where M is the mass matrix. Regar
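As an illustration of the two calls being contrasted (a sketch only; dof_handler, fe, constraints, initial_values and the destination vectors are assumed names, and project additionally needs a quadrature rule):

    // Interpolation: evaluate initial_values at the support points of the
    // finite element and copy those values into the DoF vector.
    VectorTools::interpolate(dof_handler, initial_values, interpolated_solution);

    // L2 projection: assemble and solve M x = b, where M is the mass matrix
    // and b_i = (phi_i, initial_values) -- the "Mx = U" system mentioned above.
    VectorTools::project(dof_handler,
                         constraints,
                         QGauss<dim>(fe.degree + 1),
                         initial_values,
                         projected_solution);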

[deal.II] Re: Error when applying initial values to MPI::Vector in multiple dimensions

2017-10-16 Thread 'Maxi Miller' via deal.II User Group
Yes, .interpolate() works fine. But what is the difference between interpolate() and project()? On Monday, October 16, 2017 at 08:54:02 UTC+2, Denis Davydov wrote: > > Or you may want to use interpolate if this is enough > > https://www.dealii.org/developer/doxygen/deal.II/namespaceVectorTools.html

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Wolfgang Bangerth
On 10/16/2017 09:23 AM, Mark Ma wrote: It almost looks to me like you're applying both Dirichlet values (via the ConstraintMatrix) and Neumann boundary values (via a boundary integral). But then I haven't taken a look at the code, so I can't really say for sure. I think I only applied Dirich
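For reference, imposing Dirichlet values through a ConstraintMatrix (as opposed to a boundary integral, which would correspond to Neumann data) typically looks like this sketch; BoundaryValues and the boundary id 0 are assumptions, not taken from the code under discussion:

    // Dirichlet data enters through the constraints object ...
    ConstraintMatrix constraints;
    DoFTools::make_hanging_node_constraints(dof_handler, constraints);
    VectorTools::interpolate_boundary_values(dof_handler,
                                             0,               // boundary id
                                             BoundaryValues<dim>(),
                                             constraints);
    constraints.close();
    // ... and is applied while copying local contributions into the
    // global system:
    //   constraints.distribute_local_to_global(cell_matrix, cell_rhs,
    //                                          local_dof_indices,
    //                                          system_matrix, system_rhs);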

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Mark Ma
On Monday, October 16, 2017 at 5:05:20 PM UTC+2, Wolfgang Bangerth wrote: > > On 10/16/2017 08:55 AM, Mark Ma wrote: > > > So when you visualize the solution, the error is at the boundary but it looks correct in the interior? > > > Yes, it is. Thereafter, t

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Wolfgang Bangerth
On 10/16/2017 08:55 AM, Mark Ma wrote: > So when you visualize the solution, the error is at the boundary but it looks correct in the interior? > Yes, it is. Thereafter, the error at boundaries does propagate and then interfere with the interior.

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Mark Ma
On Monday, October 16, 2017 at 4:31:25 PM UTC+2, Wolfgang Bangerth wrote: > > So when you visualize the solution, the error is at the boundary but it looks correct in the interior? > > Yes, it is. Thereafter, the error at boundaries does propagate and then interfere with the interior.

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Wolfgang Bangerth
> So when you visualize the solution, the error is at the boundary but it looks correct in the interior? > Yes, it is. Thereafter, the error at boundaries does propagate and then interfere with the interior. From the pictures you posted in a follow-up, it looks like the boundary values

[deal.II] Re: Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Mark Ma

Re: [deal.II] Solving time dependent heat equation with MPI (PETsc)

2017-10-16 Thread Mark Ma
> On 10/15/2017 12:41 PM, Mark Ma wrote: > > Now the projection of initial values (after rewriting the code to manually assemble the matrix and system_rhs and compute the projection) runs OK, but the time update of T is not correct; the same phenomenon appears. I believe this may arise from the