Re: [petsc-users] Ghost particles for DMSWARM (or similar)

2024-10-01 Thread Dave May
On Tue, 1 Oct 2024 at 08:56, MIGUEL MOLINOS PEREZ wrote: > Hi Dave, > > Would something like that work? > > Yes, this should work! Any idea on where to look so I can try to implement > it myself? > I am adding support for this right now. > > Best, > Miguel

Re: [petsc-users] Ghost particles for DMSWARM (or similar)

2024-10-01 Thread Dave May
Cheers, Dave > Thanks, > Miguel > > On Oct 1, 2024, at 5:56 PM, MIGUEL MOLINOS PEREZ wrote: > > Hi Dave, > > Would something like that work? > > Yes, this should work! Any idea on where to look so I can try to implement > it myself? > > Best, > Miguel

Re: [petsc-users] Ghost particles for DMSWARM (or similar)

2024-10-01 Thread Dave May
Hi Miguel, On Tue 1. Oct 2024 at 07:56, MIGUEL MOLINOS PEREZ wrote: > Thank you Matt, it works! > > The implementation is straightforward: > - 1º Define the paddle regions using DMGetLocalBoundingBox with the > background DMDA mesh as an auxiliary mesh for the domain-partitioning. > - 2º Create

Re: [petsc-users] Ghost particles for DMSWARM (or similar)

2024-09-27 Thread Dave May
Hi all, Sorry to be very late coming to this party. I had always intended for swarm to support ghost particles. The way I wanted to do this was via a new migrate type (DMSWARM_MIGRATE_GHOST) (or a better name - suggestions welcome) and a new migrate function. The new migrate function would call a

Re: [petsc-users] (no subject)

2024-07-25 Thread Dave May
I think to achieve what you want you will need to use a KSP which supports right preconditioning, that is they monitor || Ax-b || rather than || P^{-1} ( Ax - b ) ||, where P^{-1} is the application of the preconditioner. Try running with -ksp_type fgmres or -ksp_type gcr. These Krylov methods sup
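
One hedged way to set this up programmatically (a minimal sketch; the names A, b, x are placeholders for an already-assembled operator and vectors):

  #include <petscksp.h>
  /* Sketch: choose a Krylov method that supports right preconditioning,
     so the monitored quantity is the true residual || Ax - b ||. */
  KSP ksp;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPFGMRES));   /* or KSPGCR */
  PetscCall(KSPSetPCSide(ksp, PC_RIGHT));  /* right preconditioning */
  PetscCall(KSPSetFromOptions(ksp));       /* -ksp_type fgmres / -ksp_type gcr also work */
  PetscCall(KSPSolve(ksp, b, x));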

Re: [petsc-users] Trying to understand -log_view when using HIP kernels (ex34)

2024-01-19 Thread Dave May
/rich.c:106 > > [0]PETSC ERROR: #12 KSPSolve_Private() at > /scratch/bsmith/petsc/src/ksp/ksp/interface/itfunc.c:906 > > [0]PETSC ERROR: #13 KSPSolve() at > /scratch/bsmith/petsc/src/ksp/ksp/interface/itfunc.c:1079 > > [0]PETSC ERROR: #14 main() at ex34.c:52 > > [0]PET

Re: [petsc-users] Trying to understand -log_view when using HIP kernels (ex34)

2024-01-19 Thread Dave May
lpages/Profiling/PetscLogGpuTime/ > > --Junchao Zhang > > > On Fri, Jan 19, 2024 at 11:35 AM Dave May wrote: > >> Hi all, >> >> I am trying to understand the logging information associated with the >> %flops-performed-on-the-gpu reported by -log_view when run

[petsc-users] Trying to understand -log_view when using HIP kernels (ex34)

2024-01-19 Thread Dave May
Hi all, I am trying to understand the logging information associated with the %flops-performed-on-the-gpu reported by -log_view when running src/ksp/ksp/tutorials/ex34 with the following options -da_grid_x 192 -da_grid_y 192 -da_grid_z 192 -dm_mat_type seqaijhipsparse -dm_vec_type seqhip -ksp_ma

Re: [petsc-users] sources of floating point randomness in JFNK in serial

2023-05-04 Thread Dave May
Is your code valgrind clean? On Thu 4. May 2023 at 05:54, Mark Lohry wrote: > Try -pc_type none. >> > > With -pc_type none the 0 KSP residual looks identical. But *sometimes* > it's producing exactly the same history and others it's gradually > changing. I'm reasonably confident my residual eva

Re: [petsc-users] DMSWARM with DMDA and KSP

2023-05-01 Thread Dave May
On Mon 1. May 2023 at 18:57, Matthew Young wrote: > Thanks for the suggestion to keep DMs separate, and for pointing me toward > that example. I now have a DM for the particle quantities (i.e., density > and flux) and another for the potential. I'm hoping to use > KSPSetComputeOperators with PCGA
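
For reference, a hedged sketch of the KSPSetComputeOperators pattern with a DM attached to the KSP and GAMG as the preconditioner (the callback name and its body are illustrative assumptions, not taken from the thread):

  #include <petscksp.h>
  #include <petscdmda.h>

  /* Illustrative callback: assemble the operator for the potential solve. */
  static PetscErrorCode ComputePotentialOperator(KSP ksp, Mat J, Mat Jpre, void *ctx)
  {
    DM dm;
    PetscFunctionBeginUser;
    PetscCall(KSPGetDM(ksp, &dm));
    /* ... insert stencil entries into Jpre, e.g. with MatSetValuesStencil() ... */
    PetscCall(MatAssemblyBegin(Jpre, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(Jpre, MAT_FINAL_ASSEMBLY));
    PetscFunctionReturn(0);
  }

  /* ... with 'da' being the DMDA used for the potential ... */
  KSP ksp;
  PC  pc;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetDM(ksp, da));
  PetscCall(KSPSetComputeOperators(ksp, ComputePotentialOperator, NULL));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCGAMG));
  PetscCall(KSPSetFromOptions(ksp));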

Re: [petsc-users] MPI+OpenMP+MKL

2023-04-07 Thread Dave May
On Fri 7. Apr 2023 at 07:06, Astor Piaz wrote: > Hello petsc-users, > I am trying to use a code that is parallelized with a combination of > OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI > processes. > Is this really the correct way to go? Would it not be more suitabl

Re: [petsc-users] Memory Usage in Matrix Assembly.

2023-03-14 Thread Dave May
ave > > Thank you very much, > Pantelis > -- > *From:* Dave May > *Sent:* Tuesday, March 14, 2023 4:40 PM > *To:* Pantelis Moschopoulos > *Cc:* petsc-users@mcs.anl.gov > *Subject:* Re: [petsc-users] Memory Usage in Matrix Assembly. > > > > On Tue 1

Re: [petsc-users] Memory Usage in Matrix Assembly.

2023-03-14 Thread Dave May
On Tue 14. Mar 2023 at 07:15, Pantelis Moschopoulos < pmoschopou...@outlook.com> wrote: > Hi everyone, > > I am a new Petsc user that incorporates Petsc for FEM in a Fortran code. > My question concerns the sudden increase of the memory that Petsc needs > during the assembly of the jacobian matrix

Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Dave May
On Tue 14. Feb 2023 at 21:27, Jed Brown wrote: > Dave May writes: > > > On Tue 14. Feb 2023 at 17:17, Jed Brown wrote: > > > >> Can you share a reproducer? I think I recall the format requiring > certain > >> things to be Int32. > > > > >

Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Dave May
On Tue 14. Feb 2023 at 21:03, Dave May wrote: > > > On Tue 14. Feb 2023 at 17:17, Jed Brown wrote: > >> Can you share a reproducer? I think I recall the format requiring certain >> things to be Int32. > > > By default, the byte offset used with the appended d

Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Dave May
On Tue 14. Feb 2023 at 17:17, Jed Brown wrote: > Can you share a reproducer? I think I recall the format requiring certain > things to be Int32. By default, the byte offset used with the appended data format is UInt32. I believe that’s where the sizeof(int) is coming from. This default is annoy

Re: [petsc-users] coordinate degrees of freedom for 2nd-order gmsh mesh

2023-01-12 Thread Dave May
On Thu 12. Jan 2023 at 17:58, Blaise Bourdin wrote: > Out of curiosity, what is the rationale for _reading_ high order gmsh > meshes? > GMSH can use a CAD engine like OpenCascade. This provides geometric representations via things like BSplines. Such geometric representations are not exposed to t

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-22 Thread Dave May
On Thu, 22 Dec 2022 at 12:08, Matteo Semplice wrote: > > Il 22/12/22 20:06, Dave May ha scritto: > > > > On Thu 22. Dec 2022 at 10:27, Matteo Semplice < > matteo.sempl...@uninsubria.it> wrote: > >> Dear Dave and Matt, >> >> I am really dealing

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-22 Thread Dave May
interact with the DM. Based on what I see in the code, switching migrate modes between basic and dmneighbourscatter should be safe. If you are fine calling the point location from your side then what you propose should work. Cheers Dave > Thanks > > Matteo > Il 22/12/22 18:40, Dave May

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-22 Thread Dave May
Hey Matt, On Thu 22. Dec 2022 at 05:02, Matthew Knepley wrote: > On Thu, Dec 22, 2022 at 6:28 AM Matteo Semplice < > matteo.sempl...@uninsubria.it> wrote: > >> Dear all >> >> please ignore my previous email and read this one: I have better >> localized the problem. Maybe DMSwarmMigrate is de

Re: [petsc-users] Efficiently build a matrix from two asymmetric diagonal block matrices

2022-07-21 Thread Dave May
On Thu 21. Jul 2022 at 14:06, Matthew Knepley wrote: > On Thu, Jul 21, 2022 at 6:28 AM Emile Soutter > wrote: > >> Dear all, >> >> I am struggling with the simple following problem : Having a first matrix >> B1 of size n1xm1, a second matrix B2 of size n2 x m2, build a matrix M of >> size (n1+n2

Re: [petsc-users] Mat created by DMStag cannot access ghost points

2022-05-31 Thread Dave May
On Tue 31. May 2022 at 16:28, Ye Changqing wrote: > Dear developers of PETSc, > > I encountered a problem when using the DMStag module. The program could be > executed perfectly in serial, while errors are thrown out in parallel > (using mpiexec). Some rows in Mat cannot be accessed in local proc

Re: [petsc-users] MatColoring

2022-05-10 Thread Dave May
On Tue 10. May 2022 at 18:51, Tang, Qi wrote: > We are using SNES + TS + dmstag. The current bottleneck is the number of > residual evaluation (more than 300 per Jacobian building using the default > coloring from dmstag). > I suspect that this high count stems from the fact that non zero patter

Re: [petsc-users] GMRES for outer solver

2022-05-01 Thread Dave May
On Sun 1. May 2022 at 07:03, Amneet Bhalla wrote: > How about using a fixed number of Richardson iterations as a Krylov > preconditioner to a GMRES solver? > That is fine. Would that lead to a linear operation? > Yes. > On Sat, Apr 30, 2022 at 8:21 PM Jed Brown wrote: > >> In general, no.
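
A minimal sketch of that construction (assuming an assembled operator A; the inner iteration count of 5 is just an example), using PCKSP so the preconditioner is itself a fixed number of Richardson iterations:

  #include <petscksp.h>
  KSP ksp, inner;
  PC  pc;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPGMRES));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCKSP));                 /* the preconditioner is itself a KSP */
  PetscCall(PCKSPGetKSP(pc, &inner));
  PetscCall(KSPSetType(inner, KSPRICHARDSON));
  PetscCall(KSPSetNormType(inner, KSP_NORM_NONE)); /* no convergence test: fixed iteration count */
  PetscCall(KSPSetTolerances(inner, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT, 5));
  PetscCall(KSPSetFromOptions(ksp));

With the norm type set to none and a fixed maximum iteration count, the inner sweep is a fixed linear operation, which is what makes it admissible inside plain GMRES.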

Re: [petsc-users] DMSwarm losing particles with a non-uniform mesh

2022-04-04 Thread Dave May
On Mon, 4 Apr 2022 at 12:07, Joauma Marichal wrote: > Hello, > > I have written before as I am trying use the DMSwarm library to track > particles over a collocated non-uniform mesh with ghost cells. > I have been able to deal with the collocated and ghost cell issues by > creating an intermediat

Re: [petsc-users] DMSwarm

2022-03-25 Thread Dave May
Hi, On Wed 23. Mar 2022 at 18:52, Matthew Knepley wrote: > On Wed, Mar 23, 2022 at 11:09 AM Joauma Marichal < > joauma.maric...@uclouvain.be> wrote: > >> Hello, >> >> I sent an email last week about an issue I had with DMSwarm but did not >> get an answer yet. If there is any other information n

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Dave May
On Mon, 13 Dec 2021 at 20:13, Matthew Knepley wrote: > On Mon, Dec 13, 2021 at 1:52 PM Dave May wrote: > >> On Mon, 13 Dec 2021 at 19:29, Matthew Knepley wrote: >> >>> On Mon, Dec 13, 2021 at 1:16 PM Dave May >>> wrote: >>> >>>>

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Dave May
ructure of the matrix. So actually colouring is supported. The only thing missing for you is that the matrix returned from DMCreateMatrix for DMSTAG does not have a defined non-zero structure. Once that is set / defined, colouring will just work. Qi > > > > On Dec 13, 2021, at 11:52 AM

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Dave May
On Mon, 13 Dec 2021 at 19:29, Matthew Knepley wrote: > On Mon, Dec 13, 2021 at 1:16 PM Dave May wrote: > >> >> >> On Sat 11. Dec 2021 at 22:28, Matthew Knepley wrote: >> >>> On Sat, Dec 11, 2021 at 1:58 PM Tang, Qi wrote: >>> >>>

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Dave May
On Sat 11. Dec 2021 at 22:28, Matthew Knepley wrote: > On Sat, Dec 11, 2021 at 1:58 PM Tang, Qi wrote: > >> Hi, >> Does anyone have comment on finite difference coloring with DMStag? We >> are using DMStag and TS to evolve some nonlinear equations implicitly. It >> would be helpful to have the c

Re: [petsc-users] GAMG memory consumption

2021-11-24 Thread Dave May
I think your run with -pc_type mg is defining a multigrid hierarchy with only a single level. (A single level mg PC would also explain the 100+ iterations required to converge.) The gamg configuration is definitely coarsening your problem and has a deeper hierarchy. A single level hierarchy will r

Re: [petsc-users] Scaling of the Petsc Binary Viewer

2021-07-07 Thread Dave May
On Wed 7. Jul 2021 at 20:41, Thibault Bridel-Bertomeu < thibault.bridelberto...@gmail.com> wrote: > Dear all, > > I have been having issues with large Vec (based on DMPLex) and massive MPI > I/O ... it looks like the data that is written by the Petsc Binary Viewer > is gibberish for large meshes

Re: [petsc-users] Change Amat in FormJacobian

2021-06-14 Thread Dave May
On Mon 14. Jun 2021 at 17:27, Anton Popov wrote: > > On 14.06.21 15:04, Dave May wrote: > > > Hi Anton, > > Hi Dave, > > > On Mon, 14 Jun 2021 at 14:47, Anton Popov wrote: > >> Hi Barry & Matt, >> >> thanks for your quick response. T

Re: [petsc-users] Change Amat in FormJacobian

2021-06-14 Thread Dave May
Hi Anton, On Mon, 14 Jun 2021 at 14:47, Anton Popov wrote: > Hi Barry & Matt, > > thanks for your quick response. These options were exactly what I needed > and expected: > > -pc_mg_galerkin pmat > -pc_use_amat false > > I just assumed that it’s a default behavior of the PC object. > > So to cla

Re: [petsc-users] Data transfer between DMDA-managed Vecs

2021-04-19 Thread Dave May
On Tue, 20 Apr 2021 at 01:06, Constantine Khrulev wrote: > Hi, > > I would like to transfer values from one DMDA-managed Vec (i.e. created > using DMCreateGlobalVector() or equivalent) to a Vec managed using a > different DMDA instance (same number of elements, same number of degrees > of freedom
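
One hedged way to do the copy (a sketch assuming both DMDAs are 2D with 1 DOF and share the same global size and parallel decomposition; da_src/da_dst and v_src/v_dst are placeholder names):

  #include <petscdmda.h>
  PetscScalar **src, **dst;
  PetscInt     xs, ys, xm, ym, i, j;
  PetscCall(DMDAVecGetArrayRead(da_src, v_src, &src));
  PetscCall(DMDAVecGetArray(da_dst, v_dst, &dst));
  PetscCall(DMDAGetCorners(da_dst, &xs, &ys, NULL, &xm, &ym, NULL)); /* owned (non-ghost) range */
  for (j = ys; j < ys + ym; j++)
    for (i = xs; i < xs + xm; i++) dst[j][i] = src[j][i];
  PetscCall(DMDAVecRestoreArrayRead(da_src, v_src, &src));
  PetscCall(DMDAVecRestoreArray(da_dst, v_dst, &dst));

If the two DMDAs really are identical in layout, a plain VecCopy(v_src, v_dst) would also suffice.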

Re: [petsc-users] Code speedup after upgrading

2021-03-23 Thread Dave May
Nice to hear! The answer is simple, PETSc is awesome :) Jokes aside, assuming both petsc builds were configured with --with-debugging=0, I don’t think there is a definitive answer to your question with the information you provided. It could be as simple as one specific implementation you use was i

Re: [petsc-users] error message

2021-03-16 Thread Dave May
On Tue, 16 Mar 2021 at 19:50, Sam Guo wrote: > Dear PETSc dev team, >When there is a PETSc error, I get the following overly verbose error > message. Is it possible to get a simple error message like "Initial vector > is zero or belongs to the deflation space"? > > When an error occurs and the e

Re: [petsc-users] Block unstructured grid

2021-03-11 Thread Dave May
On Thu, 11 Mar 2021 at 09:27, Mathieu Dutour wrote: > Dear all, > > I would like to work with a special kind of linear system that ought to be > very common but I am not sure that it is possible in PETSC. > > What we have is an unstructured grid with say 3.10^5 nodes in it. > At each node, we hav

Re: [petsc-users] using preconditioner with SLEPc

2021-02-08 Thread Dave May
On Mon 8. Feb 2021 at 17:40, Dave May wrote: > > > On Mon 8. Feb 2021 at 15:49, Matthew Knepley wrote: > >> On Mon, Feb 8, 2021 at 9:37 AM Jose E. Roman wrote: >> >>> The problem can be written as A0*v=omega*B0*v and you want the >>> eigenvalues o

Re: [petsc-users] using preconditioner with SLEPc

2021-02-08 Thread Dave May
On Mon 8. Feb 2021 at 15:49, Matthew Knepley wrote: > On Mon, Feb 8, 2021 at 9:37 AM Jose E. Roman wrote: > >> The problem can be written as A0*v=omega*B0*v and you want the >> eigenvalues omega closest to zero. If the matrices were explicitly >> available, you would do shift-and-invert with tar

Re: [petsc-users] Enhancing MatScale computing time

2020-10-23 Thread Dave May
On Thu 22. Oct 2020 at 21:23, Antoine Côté wrote: > Hi, > > I'm working with a 3D DMDA, with 3 dof per "node", used to create a sparse > matrix Mat K. The Mat is modified repeatedly by the program, using the > commands (in that order) : > > MatZeroEntries(K) > In a for loop : MatSetValuesLocal(K,

Re: [petsc-users] Test convergence with non linear preconditioners

2020-08-07 Thread Dave May
On Fri 7. Aug 2020 at 18:21, Adolfo Rodriguez wrote: > Great, that works. What would be the way to change the ilu level, I need > to use ilu(1). I > You want to use the option -xxx_pc_factor_levels 1 Where xxx is the appropriate FAS prefix level. See here https://www.mcs.anl.gov/petsc/petsc-

Re: [petsc-users] Error on INTEGER SIZE using DMDACreate3d

2020-07-21 Thread Dave May
On Tue, 21 Jul 2020 at 12:32, Pierpaolo Minelli wrote: > Hi, > > I have asked to compile a Petsc Version updated and with 64bit indices. > Now I have Version 3.13.3 and these are the configure options used: > > #!/bin/python > if __name__ == '__main__': > import sys > import os > sys.path.i

Re: [petsc-users] [Ext] Re: Question on SLEPc + computing SVD with a "matrix free" matrix

2020-06-25 Thread Dave May
On Thu 25. Jun 2020 at 08:23, Ernesto Prudencio via petsc-users < petsc-users@mcs.anl.gov> wrote: > Thank you, Jose. > > However, in the case of a "matrix free" matrix, the APIs on PETSc seem to > allow just the implementation of A.v, not of A' . w > > One could create another "matrix free" matrix
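
For context, a sketch of how both actions would be attached to a shell Mat (MyMult/MyMultTranspose are placeholder user callbacks; SVD solvers generally need the action of A and of A'):

  #include <petscmat.h>
  /* Placeholder matrix-free callbacks: y = A x and y = A' x. */
  static PetscErrorCode MyMult(Mat A, Vec x, Vec y)          { /* ... */ return 0; }
  static PetscErrorCode MyMultTranspose(Mat A, Vec x, Vec y) { /* ... */ return 0; }

  Mat A;
  PetscCall(MatCreateShell(PETSC_COMM_WORLD, m, n, M, N, myctx, &A));
  PetscCall(MatShellSetOperation(A, MATOP_MULT,           (void (*)(void))MyMult));
  PetscCall(MatShellSetOperation(A, MATOP_MULT_TRANSPOSE, (void (*)(void))MyMultTranspose));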

Re: [petsc-users] Regarding P4est

2020-06-17 Thread Dave May
2020 at 6:47 PM MUKKUND SUNJII > wrote: > >> No, I have not checked it using Valgrind. Perhaps it will help me trace >> the problem. >> >> Regards, >> >> Mukkund >> >> On 18 Jun 2020, at 00:43, Dave May wrote: >> >> Is the code v

Re: [petsc-users] Regarding P4est

2020-06-17 Thread Dave May
interface. > > Nevertheless it is indeed strange that the problem disappears when I use a > PLEX dm. > > Regards, > > Mukkund > > On 17 Jun 2020, at 22:53, Dave May wrote: > > > > On Wed 17. Jun 2020 at 21:21, MUKKUND SUNJII > wrote: > >> Yes, precisely! I am n

Re: [petsc-users] Question on reverse scatters from VecScatterCreateToAll

2020-06-05 Thread Dave May
On Fri 5. Jun 2020 at 13:52, Fabian Jakub < fabian.ja...@physik.uni-muenchen.de> wrote: > Dear Petsc list, > > I have a question regarding reverse vec-scatters: > > I have a shared memory solver that I want to use on a distributed DMDA and > average its results. > > The shared mem solver needs som

Re: [petsc-users] Running example problem

2020-06-04 Thread Dave May
On Thu, 4 Jun 2020 at 14:17, Dave May wrote: > > > On Thu, 4 Jun 2020 at 14:15, Matthew Knepley wrote: > >> On Thu, Jun 4, 2020 at 9:12 AM Fazlul Huq wrote: >> >>> Somehow, make is not working. >>> Please find the attachment herewith for the termin

Re: [petsc-users] Running example problem

2020-06-04 Thread Dave May
On Thu, 4 Jun 2020 at 14:15, Matthew Knepley wrote: > On Thu, Jun 4, 2020 at 9:12 AM Fazlul Huq wrote: > >> Somehow, make is not working. >> Please find the attachment herewith for the terminal readout. >> > > Since you built with PETSC_ARCH=linux-gnu, you need that in your > environment. > Or

Re: [petsc-users] Agglomeration for Multigrid on Unstructured Meshes

2020-06-01 Thread Dave May
On Tue 2. Jun 2020 at 03:30, Matthew Knepley wrote: > On Mon, Jun 1, 2020 at 7:03 PM Danyang Su wrote: > >> Thanks Jed for the quick response. Yes I am asking about the >> repartitioning of coarse grids in geometric multigrid for unstructured >> mesh. I am happy with AMG. Thanks for letting me k

Re: [petsc-users] a question about MatSetValue

2020-05-21 Thread Dave May
implementation specific setters. Thanks Dave > Thanks! > > Cheers, > > Yang Bo > > > On 21 May 2020, at 5:42 PM, Dave May wrote: > > > > On Thu 21. May 2020 at 10:49, Yang Bo (Asst Prof) > wrote: > >> Hi Dave, >> >> Thank you very much

Re: [petsc-users] a question about MatSetValue

2020-05-21 Thread Dave May
> This looks fine. However MatSeqAIJSetPreallocation() has no effect if the Mat type is not SEQAIJ. Are you running in parallel? If yes then the Mat type will be MATMPIAIJ and you either have to call the MPI specific preallocator or use the generic one I pointed you to. Thanks Dave > Cheers

Re: [petsc-users] a question about MatSetValue

2020-05-21 Thread Dave May
On Thu, 21 May 2020 at 08:55, Yang Bo (Asst Prof) wrote: > Hi Everyone, > > I have a question about adding values to the matrix. The code I have is > > > for (int i=0;i MatSetValue(A,row[i],column[i],h[i],INSERT_VALUES); > } > > where row.size() is a large number. It seems the running time of thi
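
A hedged sketch of the preallocation pattern that usually fixes this (nlocal_rows and the per-row counts d_nnz/o_nnz are assumptions the application must compute from its own connectivity):

  #include <petscmat.h>
  Mat A;
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, nlocal_rows, nlocal_rows, PETSC_DETERMINE, PETSC_DETERMINE));
  PetscCall(MatSetFromOptions(A));
  /* Generic preallocation: works for both SEQAIJ and MPIAIJ. */
  PetscCall(MatXAIJSetPreallocation(A, 1, d_nnz, o_nnz, NULL, NULL));
  for (size_t i = 0; i < row.size(); i++)
    PetscCall(MatSetValue(A, row[i], column[i], h[i], INSERT_VALUES));
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

With correct per-row counts the insertion loop performs no mallocs, which is typically where the long running time comes from.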

Re: [petsc-users] PetscObjectGetComm

2020-04-21 Thread Dave May
On Wed 22. Apr 2020 at 07:11, Marius Buerkle wrote: > Hi, > > What is PetscObjectGetComm expected to return? As Patrick said, it returns the communicator associated with the petsc object. I thought it would give the MPI communicator the object lives on. So if I > create A matrix on PETSC_COMM_

Re: [petsc-users] petsc error disappears when I print something in the function

2020-04-21 Thread Dave May
On Tue 21. Apr 2020 at 16:47, Matthew Knepley wrote: > You are overwriting memory somewhere. The prints just move it around. I > suggest running with valgrind. > Matt is right. However, judging by the code snippet I bet all the arrays in question are statically allocated, thus valgrind may be of

Re: [petsc-users] error: too few arguments to function call (PetscOptionsHasName)

2020-04-17 Thread Dave May
report. > > > On Apr 17, 2020, at 16:10, Dave May wrote: > > Old versions of petsc had 3 args for this function, latest version expects > 4 (as the compiler error indicates). > > When in doubt as to what these args are, please refer to the extensive man > pages.

Re: [petsc-users] error: too few arguments to function call (PetscOptionsHasName)

2020-04-17 Thread Dave May
Old versions of petsc had 3 args for this function, latest version expects 4 (as the compiler error indicates). When in doubt as to what these args are, please refer to the extensive man pages. You can find them all here https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/singleindex.htm
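
A sketch of the current 4-argument form (the option name "-my_flag" is only an example):

  #include <petscsys.h>
  PetscBool flg;
  /* args: options database (NULL = global default), optional prefix, option name, result */
  PetscCall(PetscOptionsHasName(NULL, NULL, "-my_flag", &flg));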

Re: [petsc-users] Inquiry about the setup for multigrid as a preconditioner in Petsc.

2020-03-12 Thread Dave May
You want to look at the bottom of each of these web pages https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCreateInjection.html https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCreateInterpolation.html https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages
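
For reference, a sketch of the two calls (assuming a coarse DM dmc and a fine DM dmf that are already set up; the interpolation maps coarse to fine, the injection maps fine to coarse):

  #include <petscdm.h>
  Mat interp, inject;
  Vec scale;
  PetscCall(DMCreateInterpolation(dmc, dmf, &interp, &scale));
  PetscCall(DMCreateInjection(dmc, dmf, &inject));

These are the operators one would hand to PCMG (e.g. via PCMGSetInterpolation / PCMGSetInjection) when building a multigrid hierarchy manually.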

Re: [petsc-users] Choosing VecScatter Method in Matrix-Vector Product

2020-01-22 Thread Dave May
On Wed 22. Jan 2020 at 16:12, Felix Huber wrote: > Hello, > > I currently investigate why our code does not show the expected weak > scaling behaviour in a CG solver. Can you please send representative log files which characterize the lack of scaling (include the full log_view)? Are you using

Re: [petsc-users] DMDA Error

2020-01-21 Thread Dave May
Hi Anthony, On Tue, 21 Jan 2020 at 08:25, Anthony Jourdon wrote: > Hello, > > I made a test to try to reproduce the error. > To do so I modified the file $PETSC_DIR/src/dm/examples/tests/ex35.c > I attach the file in case of need. > > The same error is reproduced for 1024 mpi ranks. I tested two

Re: [petsc-users] error handling

2020-01-20 Thread Dave May
he first few lines of SLEPc example. What about following >> ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); >> ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);CHKERRQ(ierr); >> Is there any memory lost? >> >> On Mon, Jan 20, 2020 at 10:41 AM Dave May >

Re: [petsc-users] error handling

2020-01-20 Thread Dave May
-n",&n,NULL);CHKERRQ(ierr); >ierr = PetscPrintf(PETSC_COMM_WORLD,"\n1-D Laplacian Eigenproblem, > n=%D\n\n",n);CHKERRQ(ierr); > > I am wondering if the memory is lost by calling CHKERRQ. > No. > On Mon, Jan 20, 2020 at 10:14 AM Dave May wrote: > >>

Re: [petsc-users] error handling

2020-01-20 Thread Dave May
On Mon 20. Jan 2020 at 19:11, Sam Guo wrote: > Dear PETSc dev team, >If PETSc function returns an error, what's the correct way to clean > PETSc? > The answer depends on the error message reported. Send the complete error message and a better answer can be provided. Particularly how to clea

Re: [petsc-users] killed 9 signal after upgrade from petsc 3.9.4 to 3.12.2

2020-01-10 Thread Dave May
sed is: >> > mpiexec -n 24 valgrind --tool=massif --num-callers=20 >> --log-file=valgrind.log.%p ./ex7 -f1 A.petsc -f2 B.petsc -eps_nev 1 $opts >> -eps_target -4.008e-3+1.57142i -eps_target_magnitude -eps_tol 1e-14 >> > >> > Is there any possibility to install a versi

Re: [petsc-users] killed 9 signal after upgrade from petsc 3.9.4 to 3.12.2

2020-01-09 Thread Dave May
This kind of issue is difficult to untangle because you have potentially three pieces of software which might have changed between v3.9 and v3.12, namely PETSc, SLEPC and SuperLU_dist. You need to isolate which software component is responsible for the 2x increase in memory. When I look at the mem

Re: [petsc-users] Changing nonzero structure and Jacobian coloring

2019-10-16 Thread Dave May via petsc-users
What Ellen wants to do seems exactly the same use case as required by dynamic AMR. Some thoughts: * If the target problem is nonlinear, then you will need to evaluate the Jacobian more than once (with the same nonzero pattern) per time step. You would also have to solve a linear problem at each Ne

Re: [petsc-users] DMDAGetElements and global/local element number

2019-09-12 Thread Dave May via petsc-users
nswered in private side-conversations. You'll likely get an answer faster that way too. On Thu, 12 Sep 2019 at 22:26, Emmanuel Ayala wrote: > Thank you for the answer. > > El jue., 12 de sep. de 2019 a la(s) 15:21, Dave May ( > dave.mayhe...@gmail.com) escribió: > >>

Re: [petsc-users] DMDAGetElements and global/local element number

2019-09-12 Thread Dave May via petsc-users
On Thu, 12 Sep 2019 at 20:21, Emmanuel Ayala via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi everyone, it would be great if someone can give me a hint for this > issue, i have been trying to figure out how to solve it, but i did not > succeed > > I'm using DMDA to generate a 3D mesh (DMDA_E

Re: [petsc-users] [petsc-dev] Working Group Beginners: Feedback On Layout

2019-08-16 Thread Dave May via petsc-users
I think it would useful to have links to all the man pages in the table of contents. I also think it would be useful to have links to the man pages for specific key functions which are fundamental to the objectives of the tutorial. These could appear at the end of the tutorial under a new section

Re: [petsc-users] strange error using fgmres

2019-05-05 Thread Dave May via petsc-users
On Mon, 6 May 2019 at 02:18, Smith, Barry F. via petsc-users < petsc-users@mcs.anl.gov> wrote: > > > Even if you don't get failures on the smaller version of a code it can > still be worth running with valgrind (when you can't run valgrind on the > massive problem) because often the problem is s

Re: [petsc-users] Confusing Schur preconditioner behaviour

2019-03-19 Thread Dave May via petsc-users
Hi Colin, On Tue, 19 Mar 2019 at 09:33, Cotter, Colin J wrote: > Hi Dave, > > >If you are doing that, then you need to tell fieldsplit to use the Amat > to define the splits otherwise it will define the Schur complement as > >S = B22 - B21 inv(B11) B12 > >preconditioned with B22, whereas what y

Re: [petsc-users] PETSC address vector c++ access

2018-11-30 Thread Dave May via petsc-users
On Fri, 30 Nov 2018 at 14:50, RAELI ALICE via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi All, > My team is working on a PETSC version of an existent code. > In order to convert the main part of this work retaining the c++ levels of > abstraction, > we would access to c++ vector data struct

Re: [petsc-users] [SLEPc] ex5 fails, error in lapack

2018-10-28 Thread Dave May
> Ok. > Thanks! > > Santiago > > On Sun, Oct 28, 2018 at 10:31 AM Dave May wrote: > >> >> >> On Sun, 28 Oct 2018 at 09:37, Santiago Andres Triana >> wrote: > >>> Hi petsc-users, >>> >>> I am experiencing problems

Re: [petsc-users] [SLEPc] ex5 fails, error in lapack

2018-10-28 Thread Dave May
On Sun, 28 Oct 2018 at 09:37, Santiago Andres Triana wrote: > Hi petsc-users, > > I am experiencing problems running ex5 and ex7 from the slepc tutorial. > This is after upgrade to petsc-3.10.2 and slepc-3.10.1. Has anyone run into > this problem? see the error message below. Any help or advice w

Re: [petsc-users] Shell Matrix Operations required for KSP solvers?

2018-10-23 Thread Dave May
On Tue, 23 Oct 2018 at 02:24, Matthew Knepley wrote: > On Mon, Oct 22, 2018 at 7:44 PM Andrew Ho wrote: > >> I have a specialized matrix structure I'm trying to take advantage of for >> solving large scale (non)linear systems. I think for this purpose using a >> Shell matrix is sufficient for in

Re: [petsc-users] KSP and matrix-free matrix (shell)

2018-10-18 Thread Dave May
On Thu, 18 Oct 2018 at 17:57, Florian Lindner wrote: > Hello, > > I try to use the KSP solver package together with a shell matrix: > > > MyContext mycontext; // an empty struct, not sure if it's needed? > Mat s; > ierr = MatCreateShell(PETSC_COMM_WORLD, size, size, PETSC_DECIDE, > PETSC_DE
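
A hedged sketch of the pieces a KSP needs from a shell matrix (MyContext and MyMatMult are placeholders; at minimum MATOP_MULT must be provided, and most preconditioners cannot work with a shell Mat, so -pc_type none or a user-defined PC is the usual companion):

  #include <petscksp.h>

  typedef struct { PetscInt n; /* user data */ } MyContext;

  static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
  {
    MyContext *ctx;
    PetscFunctionBeginUser;
    PetscCall(MatShellGetContext(A, &ctx));
    /* ... compute y = A x using ctx ... */
    PetscFunctionReturn(0);
  }

  Mat A;
  KSP ksp;
  MyContext mycontext;
  PetscCall(MatCreateShell(PETSC_COMM_WORLD, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE, &mycontext, &A));
  PetscCall(MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MyMatMult));
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));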

Re: [petsc-users] Increasing norm with finer mesh

2018-10-16 Thread Dave May
On Wed, 17 Oct 2018 at 03:15, Weizhuo Wang wrote: > I just tried both, neither of them make a difference. I got exactly the > same curve with either combination. > Try using right preconditioning. https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPSetPCSide.html Use the optio

Re: [petsc-users] Some Problems in Modifying Parallel Programs

2018-10-15 Thread Dave May
On Mon, 15 Oct 2018 at 16:54, Matthew Knepley wrote: > On Mon, Oct 15, 2018 at 10:42 AM Yingjie Wu wrote: > >> Dear Petsc developer: >> Hi, >> Thank you very much for your previous reply. >> I recently wanted to modify my program to parallel version, and >> encountered some problems in modifying

Re: [petsc-users] Failure of MUMPS

2018-10-12 Thread Dave May
use their approach is dependent on the nature of the problem you are solving. Thanks, Dave > > Mike > > Dave May wrote on Thu, 11 Oct 2018 at 1:50 AM: > >> >> >> On Sat, 6 Oct 2018 at 12:42, Matthew Knepley wrote: >> >>> On Fri, Oct 5, 2018 at 9:08 PM Mi

Re: [petsc-users] Failure of MUMPS

2018-10-11 Thread Dave May
On Sat, 6 Oct 2018 at 12:42, Matthew Knepley wrote: > On Fri, Oct 5, 2018 at 9:08 PM Mike Wick > wrote: > >> Hello PETSc team: >> >> I am trying to solve a PDE problem with high-order finite elements. The >> matrix is getting denser and my experience is that MUMPS just outperforms >> iterative s

Re: [petsc-users] Fwd: Implementing a homotopy solver

2018-09-29 Thread Dave May
On Sat, 29 Sep 2018 at 16:09, Matthew Knepley wrote: > On Sat, Sep 29, 2018 at 9:47 AM zakaryah wrote: > >> Hi Matt - thanks for all your help. >> >> Let's say I want exactly the same solver for the tangent vector and the >> SNES update, so I should reuse the KSP. >> > If you want to do this, th

Re: [petsc-users] Checking if a vector is a localvector of a given DMDA

2018-09-25 Thread Dave May
l and local sizes based on the DMDA properties > instead, for now I want to get to an alpha version I can let people play > with. > > Phil > > On 25/09/18 13:07, Dave May wrote: > > > > On Tue, 25 Sep 2018 at 13:20, Matthew Knepley wrote: > >> On Tue, Sep 25,

Re: [petsc-users] Checking if a vector is a localvector of a given DMDA

2018-09-25 Thread Dave May
On Tue, 25 Sep 2018 at 13:20, Matthew Knepley wrote: > On Tue, Sep 25, 2018 at 7:03 AM Dave May wrote: > >> On Tue, 25 Sep 2018 at 11:49, Phil Tooley >> wrote: >> >>> Hi all, >>> >>> Given a vector I know I can get an associated DM (if there

Re: [petsc-users] Checking if a vector is a localvector of a given DMDA

2018-09-25 Thread Dave May
On Tue, 25 Sep 2018 at 11:49, Phil Tooley wrote: > Hi all, > > Given a vector I know I can get an associated DM (if there is one) by > calling VecGetDM, but I need to also be able to check that > > a) the vector is the localvector of that DM rather than the global > Given the vector, you can che
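
One hedged way to implement check (a), assuming a DMDA (v and dm are placeholders; this distinguishes local-shaped from global-shaped vectors, though it cannot prove v was created from this particular DM):

  #include <petscdmda.h>
  Vec       vloc, vglob;
  PetscInt  n, nl, ng;
  PetscBool looks_local, looks_global;
  PetscCall(VecGetLocalSize(v, &n));
  PetscCall(DMGetLocalVector(dm, &vloc));
  PetscCall(DMGetGlobalVector(dm, &vglob));
  PetscCall(VecGetLocalSize(vloc, &nl));   /* includes ghost points */
  PetscCall(VecGetLocalSize(vglob, &ng));  /* owned points only */
  looks_local  = (PetscBool)(n == nl);
  looks_global = (PetscBool)(n == ng);
  PetscCall(DMRestoreLocalVector(dm, &vloc));
  PetscCall(DMRestoreGlobalVector(dm, &vglob));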

Re: [petsc-users] Use block Jacobi preconditioner with SNES

2018-08-27 Thread Dave May
On Mon, 27 Aug 2018 at 10:12, Ali Reza Khaz'ali wrote: > > Okay, interesting. I take it you either are not running in parallel > or need to have several subdomains (of varying size) per process. > > One approach would be to use PCASM (with zero overlap, it is > equivalent to Block Jacobi) and
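
A sketch of the PCASM-with-zero-overlap suggestion (nblocks and the index sets is[] describing each variable-sized block are assumed to be built by the application; with zero overlap this is algebraically block Jacobi):

  #include <petscksp.h>
  PC pc;
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCASM));
  PetscCall(PCASMSetOverlap(pc, 0));                       /* zero overlap: block Jacobi */
  PetscCall(PCASMSetLocalSubdomains(pc, nblocks, is, NULL));
  PetscCall(KSPSetFromOptions(ksp));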

Re: [petsc-users] Use block Jacobi preconditioner with SNES

2018-08-26 Thread Dave May
If none of the suggestions provided is to your taste, why not just build the preconditioner matrix yourself? Seems you have precise requirements and the relevant info of the individual blocks, so you should be able to construct the preconditioner, either using A (original operator) or directly fro

Re: [petsc-users] Question about I/O in PETSc

2018-08-25 Thread Dave May
On Sun, 26 Aug 2018 at 03:54, Yingjie Wu wrote: > Dear PETSc developer: > Hello, > I am a student of nuclear energy science from Tsinghua University. I want > to do some work of neutron numerical simulation based on PETSc. At present, > some examples of learning and testing PETSc have the followi

Re: [petsc-users] MatSetValues error with ViennaCL types

2018-08-15 Thread Dave May
On Thu, 16 Aug 2018 at 04:44, Manuel Valera wrote: > Thanks Matthew and Barry, > > Now my code looks like: > > call DMSetMatrixPreallocateOnly(daDummy,PETSC_TRUE,ierr) > > call DMSetMatType(daDummy,MATMPIAIJVIENNACL,ierr) >> call DMSetVecType(daDummy,VECMPIVIENNACL,ierr) >> > call DMCreateMatrix(

Re: [petsc-users] memory corruption when using harmonic extraction with SLEPc

2018-08-06 Thread Dave May
an when I only use a single process, eventually allocating all of the > available memory > > > Do you know if this behavior of the harmonic extraction routine is > intended/necessary? > > > Regards, > Moritz

Re: [petsc-users] memory corruption when using harmonic extraction with SLEPc

2018-08-02 Thread Dave May
On Thu, 2 Aug 2018 at 21:32, Moritz Cygorek wrote: > Hi, > > > I want to diagonalize a huge sparse matrix and I'm using the Krylov-Schur > method with harmonic extraction (command line option -eps_harmonic ) > implemented in SLEPc. > > > I manually distribute a sparse matrix across several CPUs an

Re: [petsc-users] Fieldsplit - Schur Complement Reduction - Efficient Preconditioner for Schur Complement

2018-07-27 Thread Dave May
> > RWTH Aachen University > > > > Mathieustr. 10| Tel +49 (0)241 80 49907 > > 52074 Aachen, Germany | Fax +49 (0)241 80 49889 > > > > http://www.eonerc.rwth-aachen.de/GGE > > hbues...@eonerc.rwth-aachen.de > > > > *Von:* Dave May

Re: [petsc-users] Fieldsplit - Schur Complement Reduction - Efficient Preconditioner for Schur Complement

2018-07-25 Thread Dave May
ied Geophysics and Geothermal Energy > > E.ON Energy Research Center > > RWTH Aachen University > > > > Mathieustr. 10 | Tel +49 (0)241 80 49907 > > 52074 Aachen, Germany | Fax +4

Re: [petsc-users] Fieldsplit - Schur Complement Reduction - Efficient Preconditioner for Schur Complement

2018-07-25 Thread Dave May
On 25 July 2018 at 09:48, Matthew Knepley wrote: > On Wed, Jul 25, 2018 at 4:24 AM Buesing, Henrik < > hbues...@eonerc.rwth-aachen.de> wrote: > >> Dear all, >> >> I would like to improve the iterative solver [1]. As I understand it I >> would need to improve the preconditioner for the Schur compl

Re: [petsc-users] petsc4py: parallel matrix-vector multiplication

2018-05-06 Thread Dave May
ector. See > > petsc4py/demo/bratu3d/bratu3d.py for examples of efficiently setting > > values (and computing residuals) using Global vectors. It should be > > simpler/cleaner code than you currently have. > > > >> > >> Best > >> -Robert- > >

Re: [petsc-users] petsc4py: parallel matrix-vector multiplication

2018-05-06 Thread Dave May
On Sun, 6 May 2018 at 10:40, Robert Speck wrote: > Hi! > > I would like to do a matrix-vector multiplication (besides using linear > solvers and so on) with petsc4py. I took the matrix from this example > (https://bitbucket.org/petsc/petsc4py/src/master/demo/kspsolve/petsc-mat.py) This example

Re: [petsc-users] Problems with VecGetArray under sub-communicators

2018-04-22 Thread Dave May
On Sun, 22 Apr 2018 at 20:13, Zin Lin wrote: > Hi > I am experiencing possible memory issues with VecGetArray when it is used > under sub-communicators (when I split the PETSC_COMM_WORLD to multiple > subcomms). The following is the minimal code. Basically, you can test that > if you parallelize

Re: [petsc-users] error running parallel on cluster

2018-04-18 Thread Dave May
ETSc release as a > contributor, you > can make a PR with this change :) > Here is a URL describing the PR's protocol for PETSc contribs: https://bitbucket.org/petsc/petsc/wiki/pull-request-instructions-git > > Thanks, > > Matt > > >> Sepideh >&

Re: [petsc-users] error running parallel on cluster

2018-04-18 Thread Dave May
On 18 April 2018 at 21:06, Sepideh Kavousi wrote: > Mathew > > I added the lines and I still have the same issue. It may be a silly > question but should I configure and install petsc again using this new > lines added? or changing the line is enough? the highlighted lines are the > lines I modif

Re: [petsc-users] petsc4py: reuse setup for multiple solver calls?

2018-04-05 Thread Dave May
On Fri, 6 Apr 2018 at 07:48, Robert Speck wrote: > Thank you for your answer! Please see below for comments/questions. > > On 05.04.18 12:53, Matthew Knepley wrote: > > On Thu, Apr 5, 2018 at 6:39 AM, Robert Speck > > wrote: > > > > Hi! > > > > I would like

Re: [petsc-users] Obtaining compiling and building information from a.out

2018-03-27 Thread Dave May
On 27 March 2018 at 10:16, TAY wee-beng wrote: > Hi, > > I have been compiling and building different version of my CFD with the > intel 2016, 2018 compilers, and also different compiling options. > > I tested a version of my a.out and it is much faster than the other a.out, > using only 3 min in
