Thanks, Barry and Changqing! That seems reasonable to me, so I'll make an MR with that change.
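For anyone who finds this thread later, here is a minimal sketch of where Barry's workaround goes. The grid sizes and variable names are illustrative (they mirror Changqing's snippets, not the actual BugReport.c), and the value-insertion loop is elided; the one essential line is the MatSetOption() call immediately after DMCreateMatrix().

```c
#include <petscdmstag.h>

int main(int argc, char **argv)
{
  DM  dm;
  Mat A;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* One cell-centered DOF, box stencil of width 1, as in Changqing's
     DMStagCreate2d() call; 8x8 is an arbitrary illustrative grid size */
  PetscCall(DMStagCreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                           8, 8, PETSC_DECIDE, PETSC_DECIDE, 0, 0, 1,
                           DMSTAG_STENCIL_BOX, 1, NULL, NULL, &dm));
  PetscCall(DMSetUp(dm));

  PetscCall(DMCreateMatrix(dm, &A));

  /* Workaround: the preallocator path may leave MAT_NO_OFF_PROC_ENTRIES
     set to PETSC_TRUE on A; clear it so that later insertions targeting
     rows owned by другие -> other ranks do not error out */
  PetscCall(MatSetOption(A, MAT_NO_OFF_PROC_ENTRIES, PETSC_FALSE));

  /* ... DMStagMatSetValuesStencil() insertions, possibly off-rank ... */

  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatDestroy(&A));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```

Once the MR lands (clearing the option at the end of MatPreallocatorPreallocate_Preallocator()), the explicit MatSetOption() call should no longer be necessary.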
On Wed, 1 June 2022 at 20:06, Barry Smith <bsm...@petsc.dev> wrote:
>
> This appears to be a bug in the DMStag/Mat preallocator code. If you add,
> after the DMCreateMatrix() line in your code,
>
>   PetscCall(MatSetOption(A, MAT_NO_OFF_PROC_ENTRIES, PETSC_FALSE));
>
> your code will run correctly.
>
> Patrick and Matt,
>
> MatPreallocatorPreallocate_Preallocator() has
>
>   PetscCall(MatSetOption(A, MAT_NO_OFF_PROC_ENTRIES, p->nooffproc));
>
> to make the assembly of the stag matrix from the preallocator matrix a
> little faster, but then it never "undoes" this call. Hence the matrix is
> left in a state where it will error if someone sets values from a
> different rank (which they certainly can, using
> DMStagMatSetValuesStencil()).
>
> I think you need to clear MAT_NO_OFF_PROC_ENTRIES at the end of
> MatPreallocatorPreallocate_Preallocator(): just because the preallocation
> process never needed communication does not mean that when someone puts
> real values in the matrix they will never use communication; they can put
> in values any way they please.
>
> I don't know why this bug has not come up before.
>
>   Barry
>
> On May 31, 2022, at 11:08 PM, Ye Changqing <ye_changq...@outlook.com> wrote:
>
> Dear all,
>
> [BugReport.c] is a sample code, [BugReportParallel.output] is the output
> when executing BugReport with mpiexec, and [BugReportSerial.output] is
> the output of serial execution.
>
> Best,
> Changqing
>
> ------------------------------
> *From:* Dave May <dave.mayhe...@gmail.com>
> *Sent:* May 31, 2022 22:55
> *To:* Ye Changqing <ye_changq...@outlook.com>
> *Cc:* petsc-users@mcs.anl.gov <petsc-users@mcs.anl.gov>
> *Subject:* Re: [petsc-users] Mat created by DMStag cannot access ghost points
>
> On Tue 31. May 2022 at 16:28, Ye Changqing <ye_changq...@outlook.com> wrote:
>
> Dear developers of PETSc,
>
> I encountered a problem when using the DMStag module.
> The program runs perfectly in serial, while errors are thrown in parallel
> (using mpiexec): some rows in the Mat cannot be accessed by local
> processes when looping over all elements in the DMStag. The DM object I
> used has only one DOF per element, so I could switch to the DMDA module
> easily, and the program is now back to normal.
>
> Some snippets are below.
>
> Initialise a DMStag object:
>   PetscCall(DMStagCreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE,
>     DM_BOUNDARY_NONE, M, N, PETSC_DECIDE, PETSC_DECIDE, 0, 0, 1,
>     DMSTAG_STENCIL_BOX, 1, NULL, NULL, &(s_ctx->dm_P)));
> Create a Mat:
>   PetscCall(DMCreateMatrix(s_ctx->dm_P, A));
> Loop:
>   PetscCall(DMStagGetCorners(s_ctx->dm_V, &startx, &starty, &startz,
>     &nx, &ny, &nz, &extrax, &extray, &extraz));
>   for (ey = starty; ey < starty + ny; ++ey)
>     for (ex = startx; ex < startx + nx; ++ex)
>     {
>       ...
>       PetscCall(DMStagMatSetValuesStencil(s_ctx->dm_P, *A, 2, &row[0],
>         2, &col[0], &val_A[0][0], ADD_VALUES));
>       // The traceback shows the problem is here.
>     }
>
> In addition to the code or MWE, please forward us the complete stack
> trace / error thrown to stdout.
>
> Thanks,
> Dave
>
> Best,
> Changqing
>
> <BugReport.c><BugReportParallel.output><BugReportSerial.output>