> On Feb 12, 2019, at 8:29 PM, Maahi Talukder wrote:
>
> Thank you for your suggestions. I will go through them.
>
> But can't I do it any other way, like using MatMPIAIJSetPreallocation? Because I
> have already calculated the elements of the matrix, and all I have to do is
> to efficiently
Sounds like you have a single structured grid, so you should use
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMDA/DMDACreate2d.html
and start with something like src/ksp/ksp/examples/tutorials/ex29.c and ex46.c.
The DMDA manages dividing up the grid among processes and
With MPI parallelism you need to do a decomposition of your data across the
processes. Thus, for example, each process will generate a subset of the matrix
entries. In addition, for large problems (which is what parallelism is for) you
cannot use "dense" data structures, like your Q(), to
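For illustration, a minimal petsc4py sketch of that decomposition idea (the codes in this thread are Fortran/C, and ex29.c/ex46.c are the real references; the grid size, the stencil value, and the petsc4py calls below are placeholders of mine, not anything from the discussion):

from petsc4py import PETSc

nx, ny = 64, 64
da = PETSc.DMDA().create([nx, ny], dof=1, stencil_width=1)  # DMDA splits the grid over the ranks
A = da.createMat()   # matrix whose parallel layout/preallocation comes from the DMDA
                     # (spelled createMatrix() in some petsc4py versions)

(xs, xe), (ys, ye) = da.getRanges()   # index range of the locally owned grid points
row = PETSc.Mat.Stencil()
for j in range(ys, ye):
    for i in range(xs, xe):
        row.index = (i, j)
        A.setValueStencil(row, row, 4.0)  # placeholder: only the diagonal of a 5-point stencil
A.assemble()

Each rank touches only the rows for the grid points it owns, which is the decomposition being described above.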
Dear All,
I am trying to solve a linear system using KSP solvers. I have managed to
solve the system with a sequential code. The part of my sequential code
that deals with creating the matrix and setting values is the following -
call MatCreate(PETSC_COMM_WORLD,Mp,ierr)
call
We should make the (two-line) functionality a command-line feature of
PetscBinaryIO.py. Then a user could do
python -m PetscBinaryIO matrix.mm matrix.petsc
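The two lines in question would be essentially the following (a sketch of the proposed feature, not something PetscBinaryIO.py does today; it assumes SciPy and that PETSc's lib/petsc/bin directory, which contains PetscBinaryIO.py, is on PYTHONPATH, with the same file names used as placeholders):

import sys
import scipy.io
import PetscBinaryIO

infile, outfile = sys.argv[1], sys.argv[2]   # e.g. matrix.mm matrix.petsc
A = scipy.io.mmread(infile).tocsr()          # read Matrix Market into a SciPy CSR matrix

io = PetscBinaryIO.PetscBinaryIO()
with open(outfile, 'wb') as fh:
    io.writeMatSciPy(fh, A)                  # write the PETSc binary sparse matrix format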
Matthew Knepley via petsc-users writes:
> It definitely should not be there under 'datafiles'. We should put it in an
> example, as
We have /home/petsc/datafiles/matrices/MtxMarket/mm2petsc.c
Hong
On Tue, Feb 12, 2019 at 9:52 AM Zhang, Junchao via petsc-users <petsc-users@mcs.anl.gov> wrote:
Sure.
--Junchao Zhang
On Tue, Feb 12, 2019 at 9:47 AM Matthew Knepley <knep...@gmail.com> wrote:
Hi Junchao,
Could you fix the MM example in PETSc to have this full support? That way we
will always have it.
Thanks,
Matt
On Tue, Feb 12, 2019 at 10:27 AM Zhang, Junchao via petsc-users <petsc-users@mcs.anl.gov> wrote:
Eda,
I have a code that can read in Matrix Market and write out PETSc binary
files. Usage: mpirun -n 1 ./mm2petsc -fin <input file> -fout <output file>. You can
give it a try.
--Junchao Zhang
On Tue, Feb 12, 2019 at 1:50 AM Eda Oktay via petsc-users <petsc-users@mcs.anl.gov> wrote:
Hello,
I am trying to
It is better to convert the matrices to PETSc binary format first. One easy way
is to read them into Matlab with mmread.m and write them out with PETSc's
PetscBinaryWrite.m. This can be done similarly in Python.
Jose
> On 12 Feb 2019, at 8:50, Eda Oktay via petsc-users
> wrote:
>
> Hello,
>
>