> On May 30, 2019, at 11:08 PM, Manav Bhatia wrote:
I managed to get this to work.
I defined a larger matrix with the dense blocks appended to the end of the
matrix on the last processor. Currently, I am only running with one extra
unknown, so this should not be a significant penalty for load balancing.
Since the larger matrix has the same I-j
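The enlarged-matrix sizing described above might look like the following sketch. The function name, the `n_local`/`n_extra_global` parameters, and the preallocation counts are all assumptions for illustration, not the author's actual code:

```c
#include <petscmat.h>

/* Sketch (assumed names and preallocation): create the enlarged MPIAIJ
   matrix with the extra unknown(s) appended as rows/columns owned by the
   last rank, so the finite-element partitioning elsewhere is unchanged. */
PetscErrorCode create_enlarged_matrix(MPI_Comm comm, PetscInt n_local,
                                      PetscInt n_extra_global, Mat *A)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank, size;
  PetscInt       n_extra;

  ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(comm, &size);CHKERRQ(ierr);
  n_extra = (rank == size - 1) ? n_extra_global : 0;  /* appended on last rank */
  ierr = MatCreateAIJ(comm,
                      n_local + n_extra, n_local + n_extra, /* local rows/cols */
                      PETSC_DETERMINE, PETSC_DETERMINE,     /* global sizes */
                      30, NULL,  /* d_nz: assumed diagonal-block preallocation */
                      10, NULL,  /* o_nz: assumed off-diagonal preallocation */
                      A);CHKERRQ(ierr);
  return 0;
}
```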
"Smith, Barry F." writes:
Sorry, my mistake. I assumed that the naming would follow PETSc convention
and there would be MatGetLocalSubMatrix_something() as there is
MatGetLocalSubMatrix_IS() and MatGetLocalSubMatrix_Nest(). Instead
MatGetLocalSubMatrix() is hardwired to call MatCreateLocalRef() if the
method is not
"Smith, Barry F. via petsc-users" writes:
>This is an interesting idea, but unfortunately not directly compatible
> with libMesh filling up the finite element part of the matrix. Plus it
> appears MatGetLocalSubMatrix() is only implemented for IS and Nest matrices
> :-(
Maybe I'm missing
You could create a MATNEST reusing exactly the matrix from libMesh
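The MATNEST suggestion might be sketched as below, wrapping the libMesh-assembled matrix without copying it. The names `A_fe`, `B`, `C`, and `D` are hypothetical; B, C, D would be separately created (e.g. dense) Mats with compatible row/column layouts:

```c
#include <petscmat.h>

/* Sketch (hypothetical names): assemble a 2x2 block matrix
   [ A_fe  B ]
   [ C     D ]
   as a MATNEST that reuses A_fe directly rather than copying it. */
PetscErrorCode build_nest(MPI_Comm comm, Mat A_fe, Mat B, Mat C, Mat D,
                          Mat *A_full)
{
  PetscErrorCode ierr;
  Mat            blocks[4] = {A_fe, B, C, D};  /* row-major 2x2 layout */

  ierr = MatCreateNest(comm, 2, NULL, 2, NULL, blocks, A_full);CHKERRQ(ierr);
  return 0;
}
```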
Understood. Where are you putting the "few extra unknowns" in the vector and
matrix? On the first process, on the last process, some places in the middle of
the matrix?
We don't have any trivial code for copying a big matrix into an even larger
matrix directly because we frown on doing that.
Manav,
For parallel sparse matrices using the standard PETSc formats the matrix is
stored in two parts on each process (see the details in MatCreateAIJ()) thus
there is no inexpensive way to access directly the IJ locations as a single
local matrix. What are you hoping to use the information for?
Yes, see MatGetRow
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetRow.html
--Junchao Zhang
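A MatGetRow() traversal of the locally owned rows might look like this sketch (function name assumed); note each row must be returned with MatRestoreRow() before requesting the next:

```c
#include <petscmat.h>

/* Sketch: visit the nonzeros of the locally owned rows of an assembled
   matrix.  MatGetRow() gives read-only access to one row at a time and
   must be paired with MatRestoreRow(). */
PetscErrorCode print_local_nonzeros(Mat A)
{
  PetscErrorCode     ierr;
  PetscInt           rstart, rend, ncols;
  const PetscInt    *cols;
  const PetscScalar *vals;

  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (PetscInt row = rstart; row < rend; row++) {
    ierr = MatGetRow(A, row, &ncols, &cols, &vals);CHKERRQ(ierr);
    for (PetscInt j = 0; j < ncols; j++) {
      ierr = PetscPrintf(PETSC_COMM_SELF, "(%D, %D) = %g\n",
                         row, cols[j], (double)PetscRealPart(vals[j]));CHKERRQ(ierr);
    }
    ierr = MatRestoreRow(A, row, &ncols, &cols, &vals);CHKERRQ(ierr);
  }
  return 0;
}
```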
On Wed, May 29, 2019 at 2:28 PM Manav Bhatia via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Hi,
Once a MPI-AIJ matrix has been assembled, is there a method to get the