Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-13 Thread Barry Smith
Sorry, I didn't notice these emails for a long time. PETSc does provide a "simple" mechanism to redistribute your matrix that does not require you to explicitly do the redistribution. You must create an MPIAIJ matrix over all the MPI ranks, but simply provide all the rows on the first r…
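Barry's mechanism can be sketched roughly as follows: create a parallel MPIAIJ matrix whose local row count is the full global size on rank 0 and zero everywhere else, assemble it on rank 0, and let PETSc redistribute it at solve time (e.g. with `-pc_type redistribute`). This is an untested sketch, not Barry's literal code; the global size `N`, the error-checking style, and the redistribute option are assumptions.

```c
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscMPIInt rank;
  PetscInt    N = 100;  /* illustrative global size */
  PetscInt    nlocal;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  nlocal = (rank == 0) ? N : 0;   /* all rows owned by rank 0 */

  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, nlocal, nlocal, N, N));
  PetscCall(MatSetType(A, MATMPIAIJ));
  PetscCall(MatSetUp(A));

  /* rank 0 calls MatSetValues() for every nonzero here;
     the other ranks contribute nothing */

  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  /* run with -pc_type redistribute so the KSP solve spreads the
     rows across all ranks instead of solving everything on rank 0 */

  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}
```

The point of the sketch is that no explicit send/receive code is needed: the matrix is legal even though its ownership is lopsided, and PCREDISTRIBUTE moves the rows before the actual solve.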

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Junchao Zhang
On Tue, Dec 7, 2021 at 10:04 PM Faraz Hussain wrote: > The matrix in memory is in IJV (Spooles) or CSR3 (Pardiso). The > application was written to use a variety of different direct solvers but > Spooles and Pardiso are what I am most familiar with. > I assume the CSR3 has the a, i, j arrays u…
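If the matrix already sits in memory as CSR arrays (the a, i, j of CSR3), it can be handed to PETSc without copying via MatCreateSeqAIJWithArrays(). A minimal sketch, assuming 0-based indexing (Pardiso's CSR3 is often 1-based, so the i/j arrays may need shifting first):

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  /* A 3x3 example matrix in 0-based CSR form:
     row pointers i, column indices j, values a. */
  PetscInt    i[] = {0, 2, 3, 5};
  PetscInt    j[] = {0, 2, 1, 0, 2};
  PetscScalar a[] = {4.0, 1.0, 3.0, 1.0, 5.0};
  Mat         A;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Wrap the existing arrays without copying; the arrays must
     remain valid for the lifetime of A. */
  PetscCall(MatCreateSeqAIJWithArrays(PETSC_COMM_SELF, 3, 3, i, j, a, &A));
  PetscCall(MatView(A, PETSC_VIEWER_STDOUT_SELF));

  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}
```

This builds a sequential matrix on one rank; for a matrix already distributed in parallel CSR form there is an analogous MatCreateMPIAIJWithArrays().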

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Faraz Hussain via petsc-users
The matrix in memory is in IJV (Spooles) or CSR3 (Pardiso). The application was written to use a variety of different direct solvers but Spooles and Pardiso are what I am most familiar with. On Tuesday, December 7, 2021, 10:33:24 PM EST, Junchao Zhang wrote: On Tue, Dec 7, 202…

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Matthew Knepley
On Tue, Dec 7, 2021 at 10:25 PM Faraz Hussain wrote: > Thanks, that makes sense. I guess I was hoping PETSc KSP is like Intel's > cluster sparse solver, where it handles distributing the matrix to the other > ranks for you. > > It sounds like that is not the case and I need to manually distribute…

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Junchao Zhang
On Tue, Dec 7, 2021 at 9:06 PM Faraz Hussain via petsc-users < petsc-users@mcs.anl.gov> wrote: > Thanks, I took a look at ex10.c in ksp/tutorials. It seems to do as you > wrote, "it efficiently gets the matrix from the file spread out over all > the ranks." > > However, in my application I only…

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Faraz Hussain via petsc-users
Thanks, that makes sense. I guess I was hoping PETSc KSP is like Intel's cluster sparse solver, where it handles distributing the matrix to the other ranks for you. It sounds like that is not the case and I need to manually distribute the matrix to the ranks? On Tuesday, December 7, 2021…

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Matthew Knepley
On Tue, Dec 7, 2021 at 10:06 PM Faraz Hussain via petsc-users < petsc-users@mcs.anl.gov> wrote: > Thanks, I took a look at ex10.c in ksp/tutorials. It seems to do as you > wrote, "it efficiently gets the matrix from the file spread out over all > the ranks." > > However, in my application I only…

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Faraz Hussain via petsc-users
Thanks, I took a look at ex10.c in ksp/tutorials. It seems to do as you wrote, "it efficiently gets the matrix from the file spread out over all the ranks." However, in my application I only want rank 0 to read and assemble the matrix. I do not want other ranks trying to get the matrix data. T…

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Barry Smith
If you use MatLoad() it never has the entire matrix on a single rank at the same time; it efficiently gets the matrix from the file spread out over all the ranks. > On Dec 6, 2021, at 11:04 PM, Faraz Hussain via petsc-users > wrote: > > I am studying the examples but it seems all ranks r…
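The MatLoad() path Barry describes looks roughly like this in code. A minimal sketch, assuming a matrix previously written in PETSc binary format; the filename "matrix.dat" is purely illustrative:

```c
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscViewer viewer;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Every rank opens the same binary file, but MatLoad() reads and
     keeps only each rank's own block of rows, so the full matrix is
     never held on any single rank. */
  PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat",
                                  FILE_MODE_READ, &viewer));
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetType(A, MATMPIAIJ));
  PetscCall(MatLoad(A, viewer));
  PetscCall(PetscViewerDestroy(&viewer));

  /* A is now distributed and ready to hand to a KSP */

  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}
```

This is the same pattern used by ksp/tutorials/ex10.c mentioned later in the thread, which is why it does not exhibit the "all ranks read the full matrix" behavior the original question worries about.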

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-07 Thread Mark Adams
I assume you are using PETSc to load matrices. What example are you looking at? On Mon, Dec 6, 2021 at 11:04 PM Faraz Hussain via petsc-users < petsc-users@mcs.anl.gov> wrote: > I am studying the examples but it seems all ranks read the full matrix. Is > there an MPI example where only rank 0 read…

[petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-06 Thread Faraz Hussain via petsc-users
I am studying the examples but it seems all ranks read the full matrix. Is there an MPI example where only rank 0 reads the matrix? I don't want all ranks to read my input matrix and consume a lot of memory allocating data for the arrays. I have worked with Intel's cluster sparse solver and t…