Re: [petsc-users] [EXTERNAL] Re: Call to DMSetMatrixPreallocateSkip not changing allocation behavior

2023-12-18 Thread Fackler, Philip via petsc-users
LLSetPreallocation_C", &sell)); > + if (!sell) PetscCall(PetscObjectQueryFunction((PetscObject)A, > "MatSeqSELLSetPreallocation_C", &sell)); > +} > +if (!sell) PetscCall(PetscObjectQueryFunction((PetscObject)A, > "MatISSetPreallocation_C", &
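
For context, the quoted patch relies on PETSc's composed-function query pattern: ask a Mat whether it implements a type-specific preallocation routine, and call it only if present. A minimal sketch of that pattern (the queried name is real PETSc convention; the nz estimate of 5 is a placeholder, and this is not the patch itself):

    #include <petscmat.h>

    /* Query whether A composes MatSeqAIJSetPreallocation; if so, invoke it.
       The "_C" suffix is PETSc's convention for composed methods. */
    static PetscErrorCode PreallocateIfSupported(Mat A)
    {
      PetscErrorCode (*f)(Mat, PetscInt, const PetscInt[]) = NULL;

      PetscFunctionBeginUser;
      PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqAIJSetPreallocation_C", &f));
      if (f) PetscCall((*f)(A, 5, NULL)); /* 5 nonzeros/row: placeholder estimate */
      PetscFunctionReturn(PETSC_SUCCESS);
    }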

[petsc-users] Call to DMSetMatrixPreallocateSkip not changing allocation behavior

2023-12-14 Thread Fackler, Philip via petsc-users
I'm using the following sequence of functions related to the Jacobian matrix: DMDACreate1d(..., &da); DMSetFromOptions(da); DMSetUp(da); DMSetMatType(da, MATAIJKOKKOS); DMSetMatrixPreallocateSkip(da, PETSC_TRUE); Mat J; DMCreateMatrix(da, &J); MatSetPreallocationCOO(J, ...); I recently added the call
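
A self-contained sketch of that sequence (the grid size, dof, and stencil width below are placeholders, and the COO arrays would come from the application's stencil):

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM  da;
      Mat J;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      PetscCall(DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 64, 1, 1, NULL, &da));
      PetscCall(DMSetFromOptions(da));
      PetscCall(DMSetUp(da));
      PetscCall(DMSetMatType(da, MATAIJKOKKOS));
      PetscCall(DMSetMatrixPreallocateSkip(da, PETSC_TRUE)); /* expect DMCreateMatrix to skip preallocation */
      PetscCall(DMCreateMatrix(da, &J));
      /* MatSetPreallocationCOO(J, ncoo, coo_i, coo_j) would follow here,
         with ncoo/coo_i/coo_j supplied by the application */
      PetscCall(MatDestroy(&J));
      PetscCall(DMDestroy(&da));
      PetscCall(PetscFinalize());
      return 0;
    }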

Re: [petsc-users] [Xolotl-psi-development] [EXTERNAL] Re: Unexpected performance losses switching to COO interface

2023-11-29 Thread Fackler, Philip via petsc-users
"Fackler, Philip via petsc-users" writes: > That makes sense. Here are the arguments that I think are relevant: > > -fieldsplit_1_pc_type redundant -fieldsplit_0_pc_typ

Re: [petsc-users] [EXTERNAL] Re: Unexpected performance losses switching to COO interface

2023-11-22 Thread Fackler, Philip via petsc-users
wrote: Hi, Philip, I will look into the tarballs and get back to you. Thanks. --Junchao Zhang On Mon, Oct 2, 2023 at 9:41 AM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: We finally have xolotl ported to use the new COO interface and the aijkokkos implementation f

Re: [petsc-users] [EXTERNAL] Re: Unexpected performance losses switching to COO interface

2023-10-16 Thread Fackler, Philip via petsc-users
with COO enabled? [Screenshot 2023-10-05 at 10.55.29 AM.png] --Junchao Zhang On Mon, Oct 2, 2023 at 9:52 AM Junchao Zhang <junchao.zh...@gmail.com> wrote: Hi, Philip, I will look into the tarballs and get back to you. Thanks. --Junchao Zhang On Mon, Oct 2, 2023 at

Re: [petsc-users] [EXTERNAL] Re: Unexpected performance losses switching to COO interface

2023-10-11 Thread Fackler, Philip via petsc-users
--Junchao Zhang On Mon, Oct 2, 2023 at 9:52 AM Junchao Zhang <junchao.zh...@gmail.com> wrote: Hi, Philip, I will look into the tarballs and get back to you. Thanks. --Junchao Zhang On Mon, Oct 2, 2023 at 9:41 AM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov>

[petsc-users] Unexpected performance losses switching to COO interface

2023-10-02 Thread Fackler, Philip via petsc-users
We finally have xolotl ported to use the new COO interface and the aijkokkos implementation for Mat (and kokkos for Vec). Comparing this port to our previous version (using MatSetValuesStencil and the default Mat and Vec implementations), we expected to see an improvement in performance for
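
For readers unfamiliar with the interface under discussion: the COO path splits assembly into a one-time MatSetPreallocationCOO call that fixes the nonzero pattern and a per-step MatSetValuesCOO call that supplies values (which, with the Kokkos backend, can run without host round-trips). A minimal sketch, not Xolotl's actual code; the 2x2 pattern is a placeholder:

    #include <petscmat.h>

    static PetscErrorCode AssembleViaCOO(void)
    {
      Mat         A;
      PetscInt    coo_i[] = {0, 0, 1, 1};
      PetscInt    coo_j[] = {0, 1, 0, 1};
      PetscScalar v[]     = {2.0, -1.0, -1.0, 2.0};

      PetscFunctionBeginUser;
      PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
      PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 2, 2));
      PetscCall(MatSetType(A, MATAIJKOKKOS));
      PetscCall(MatSetPreallocationCOO(A, 4, coo_i, coo_j)); /* pattern: set once */
      PetscCall(MatSetValuesCOO(A, v, INSERT_VALUES));       /* values: set each assembly */
      PetscCall(MatDestroy(&A));
      PetscFunctionReturn(PETSC_SUCCESS);
    }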

Re: [petsc-users] [EXTERNAL] Re: Initializing kokkos before petsc causes a problem

2023-07-21 Thread Fackler, Philip via petsc-users
look at the issue. --Junchao Zhang On Wed, Jun 7, 2023 at 9:30 AM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: I'm encountering a problem in xolotl. We initialize kokkos before initializing petsc. Therefore... The pointer referenced here: https://gitlab.

Re: [petsc-users] [EXTERNAL] Re: Initializing kokkos before petsc causes a problem

2023-06-27 Thread Fackler, Philip via petsc-users
Thanks for reporting. I will have a look at the issue. --Junchao Zhang On Wed, Jun 7, 2023 at 9:30 AM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: I'm encountering a problem in xolotl. We initialize kokkos before initializing petsc. Therefore... The pointer referen

Re: [petsc-users] [EXTERNAL] Re: Initializing kokkos before petsc causes a problem

2023-06-27 Thread Fackler, Philip via petsc-users
Initializing kokkos before petsc causes a problem Hi, Philip, Thanks for reporting. I will have a look at the issue. --Junchao Zhang On Wed, Jun 7, 2023 at 9:30 AM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: I'm encountering a problem in xolotl. We init

[petsc-users] Initializing kokkos before petsc causes a problem

2023-06-07 Thread Fackler, Philip via petsc-users
I'm encountering a problem in xolotl. We initialize kokkos before initializing petsc. Therefore... The pointer referenced here: https://gitlab.com/petsc/petsc/-/blob/main/src/vec/is/sf/impls/basic/kokkos/sfkok.kokkos.cxx#L363
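
A sketch of the initialization order that triggers the issue, assuming a PETSc build configured with Kokkos; PetscInitialize is expected to detect an already-initialized Kokkos and not take ownership of it:

    #include <Kokkos_Core.hpp>
    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      Kokkos::initialize(argc, argv);  // the application initializes Kokkos first
      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL)); // PETSc should adopt, not re-own, Kokkos
      /* ... solver work ... */
      PetscCall(PetscFinalize());
      Kokkos::finalize();              // the application remains responsible for finalizing Kokkos
      return 0;
    }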

Re: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device.

2023-03-27 Thread Fackler, Philip via petsc-users
Zhang, Junchao <jczh...@mcs.anl.gov>; Roth, Philip <rot...@ornl.gov> Subject: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. Hi, Philip, Sorry to hear that. It seems you could run the same code on CPUs but not on GP

Re: [petsc-users] [EXTERNAL] Re: Performance problem using COO interface

2023-01-23 Thread Fackler, Philip via petsc-users
Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: The following is the log_view output for the ported case using 4 MPI

Re: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device.

2023-01-20 Thread Fackler, Philip via petsc-users
<petsc-users@mcs.anl.gov>; Blondel, Sophie <sblon...@utk.edu>; Zhang, Junchao <jczh...@mcs.anl.gov>; Roth, Philip <rot...@ornl.gov> Subject: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. Hi, Philip, Sorr

Re: [petsc-users] Performance problem using COO interface

2023-01-20 Thread Fackler, Philip via petsc-users
The following is the log_view output for the ported case using 4 MPI tasks. ***WIDEN YOUR WINDOW TO 160
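
(The preview above is the start of PETSc's standard -log_view banner. For reference, output like this is produced by appending the option to the run; the executable and parameter file in this invocation are placeholders:)

    mpiexec -n 4 ./xolotl params.txt -log_view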

[petsc-users] Performance problem using COO interface

2023-01-17 Thread Fackler, Philip via petsc-users
In Xolotl's feature-petsc-kokkos branch I have ported the code to use petsc's COO interface for creating the Jacobian matrix (and the Kokkos interface for interacting with Vec entries). As the attached plots show for one case, while the code for computing the RHSFunction and RHSJacobian perform

Re: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device.

2022-12-08 Thread Fackler, Philip via petsc-users
<petsc-users@mcs.anl.gov>; Blondel, Sophie <sblon...@utk.edu>; Zhang, Junchao <jczh...@mcs.anl.gov>; Roth, Philip <rot...@ornl.gov> Subject: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. Hi, Ph

Re: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device.

2022-12-06 Thread Fackler, Philip via petsc-users
<jczh...@mcs.anl.gov>; Roth, Philip <rot...@ornl.gov> Subject: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. Hi, Philip, Sorry to hear that. It seems you could run the same code on CPUs but not on GPUs (wi

Re: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device.

2022-12-05 Thread Fackler, Philip via petsc-users
but not on GPUs (with either the petsc/Kokkos backend or the petsc/cuda backend), is that right? --Junchao Zhang On Mon, Nov 14, 2022 at 12:13 PM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: This is an issue I've brought up before (and discussed in-person with Richard).

Re: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases

2022-11-22 Thread Fackler, Philip via petsc-users
[EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases Hi, Philip, Can you give me instructions to build Xolotl to reproduce the error? --Junchao Zhang On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrot

Re: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases

2022-11-22 Thread Fackler, Philip via petsc-users
Blondel, Sophie <sblon...@utk.edu> Subject: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases Hi, Philip, Can you give me instructions to build Xolotl to reproduce the error? --Junchao Zhang On Mon, Nov 14, 2022 at 12:24 PM Fac

Re: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases

2022-11-21 Thread Fackler, Philip via petsc-users
Subject: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases Hi, Philip, Can you give me instructions to build Xolotl to reproduce the error? --Junchao Zhang On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users <petsc-users@mcs.anl

Re: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device.

2022-11-16 Thread Fackler, Philip via petsc-users
backend or petsc/cuda backend, is that right? --Junchao Zhang On Mon, Nov 14, 2022 at 12:13 PM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: This is an issue I've brought up before (and discussed in-person with Richard). I wanted to bring it up again because I'm hit

Re: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases

2022-11-15 Thread Fackler, Philip via petsc-users
Hi, Philip, Can you give me instructions to build Xolotl to reproduce the error? --Junchao Zhang On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use

Re: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device.

2022-11-15 Thread Fackler, Philip via petsc-users
On Mon, Nov 14, 2022 at 12:13 PM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: This is an issue I've brought up before (and discussed in-person with Richard). I wanted to bring it up again because I'm hitting the limits of what I know to do, and I need help figuring this out. The probl

[petsc-users] Using multiple MPI ranks with COO interface crashes in some cases

2022-11-14 Thread Fackler, Philip via petsc-users
In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use the COO interface for preallocating and setting values in the Jacobian matrix. I have found that with some of our test cases, using more than one MPI rank results in a crash. Way down in the preconditioner code in petsc a

[petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device.

2022-11-14 Thread Fackler, Philip via petsc-users
This is an issue I've brought up before (and discussed in-person with Richard). I wanted to bring it up again because I'm hitting the limits of what I know to do, and I need help figuring this out. The problem can be reproduced using Xolotl's "develop" branch built against a petsc build with
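
The backend pairing being compared here can be selected at runtime through PETSc options; a hedged example (the executable and parameter file are placeholders for however Xolotl is launched):

    # Kokkos backend for Mat and Vec
    ./xolotl params.txt -dm_mat_type aijkokkos -dm_vec_type kokkos
    # CUDA backend for comparison
    ./xolotl params.txt -dm_mat_type aijcusparse -dm_vec_type cuda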

Re: [petsc-users] [EXTERNAL] Re: Kokkos Interface for PETSc

2022-02-23 Thread Fackler, Philip via petsc-users
device and it'll use GPU-aware MPI. There are a few examples of residual evaluation and matrix assembly on the device using Kokkos. You can also see libCEED examples for assembly on the device into Kokkos matrices and vectors without touching host memory. "Fackler, Philip via petsc-users"

[petsc-users] Kokkos Interface for PETSc

2022-02-15 Thread Fackler, Philip via petsc-users
We're intending to transition the Xolotl interfaces with PETSc. I am hoping someone can point us to some documentation (and examples) for using PETSc's Kokkos-based interface. If this does not yet exist, then perhaps some slides (like the ones Richard Mills showed at the NE-SciDAC
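
One entry point into that interface, as a hedged sketch (assumes petscvec_kokkos.hpp from a Kokkos-enabled PETSc build): obtain a Kokkos view of a Vec's device data and operate on it in a parallel_for without touching host memory.

    #include <petscvec_kokkos.hpp>

    /* Scale a Vec on the device through its Kokkos view. */
    static PetscErrorCode ScaleOnDevice(Vec x)
    {
      Kokkos::View<PetscScalar *, Kokkos::DefaultExecutionSpace::memory_space> xv;

      PetscFunctionBeginUser;
      PetscCall(VecGetKokkosView(x, &xv));
      Kokkos::parallel_for("scale", xv.extent(0), KOKKOS_LAMBDA(const int i) { xv(i) *= 2.0; });
      PetscCall(VecRestoreKokkosView(x, &xv));
      PetscFunctionReturn(PETSC_SUCCESS);
    }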

Re: [petsc-users] [EXTERNAL] Re: Redirecting petsc output

2021-09-15 Thread Fackler, Philip via petsc-users
Research Section, Computer Science and Mathematics Division, Oak Ridge National Laboratory From: petsc-users on behalf of Fackler, Philip via petsc-users Sent: Wednesday, September 8, 2021 11:24 To: Barry Smith Cc: petsc-users@mcs.anl.gov; xolotl-psi-developm

Re: [petsc-users] [EXTERNAL] Re: Redirecting petsc output

2021-09-08 Thread Fackler, Philip via petsc-users
by setting PETSC_STDOUT = fopen(...) Barry On Sep 8, 2021, at 10:59 AM, Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: Is there a way to customize how petsc writes information? Instead of writing to stdout (for example: 0 TS dt 0.1 time 0.), what if we want
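
A minimal sketch of Barry's suggestion (the filename is a placeholder; PETSC_STDOUT is PETSc's global output FILE*, which defaults to stdout):

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      PETSC_STDOUT = fopen("petsc.log", "w"); /* placeholder filename; monitor output now lands here */
      /* ... run the solver; lines like "0 TS dt 0.1 time 0." go to petsc.log ... */
      PetscCall(PetscFinalize());
      fclose(PETSC_STDOUT);
      return 0;
    }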

[petsc-users] Redirecting petsc output

2021-09-08 Thread Fackler, Philip via petsc-users
Is there a way to customize how petsc writes information? Instead of writing to stdout (for example: 0 TS dt 0.1 time 0.), what if we want to log that message to a file with other output from Xolotl? I'm assuming there are multiple ways of getting this result. What's common practice with petsc