On Fri, Jun 9, 2023 at 12:04 PM neil liu wrote:
> Dear Petsc developers,
>
> I am using valgrind to check the memory leak. It shows,
> [image: image.png]
> Finally, I found that DMPlexRestoreTransitiveClosure resolves this
> memory leak.
>
> My question is from the above screen shot, it seems
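For reference, the leak-free pattern pairs every get with a restore; a minimal
sketch, assuming dm is the plex and p one of its points:

PetscInt  npoints, *points = NULL;
PetscCall(DMPlexGetTransitiveClosure(dm, p, PETSC_TRUE, &npoints, &points));
/* ... use the (point, orientation) pairs stored in points ... */
PetscCall(DMPlexRestoreTransitiveClosure(dm, p, PETSC_TRUE, &npoints, &points));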
_type lu
Thanks,
Matt
> ------
> *From:* Matthew Knepley
> *Sent:* Friday, June 9, 2023 4:13:35
> *To:* Nicolas Garcia Guzman
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] Behavior of KSP iterations when using Restart
>
On Thu, Jun 8, 2023 at 9:13 PM Nicolas Garcia Guzman
wrote:
> Hello,
>
>
> I am solving a linear system using petsc4py, with the following command:
>
>
> python main.py -ksp_type gmres -ksp_gmres_restart 16 -ksp_max_it 18
> -ksp_monitor -ksp_converged_reason -ksp_rtol 1e-15 -pc_type asm
>
On Thu, Jun 1, 2023 at 1:46 AM Duan Junming via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Dear all,
>
>
> I have a simple demo code attached below, which gives a segmentation
> violation error.
>
> Can you help me with this problem? I think the problem is due to the
> destroy function.
>
>
On Wed, May 31, 2023 at 6:37 PM Ferrand, Jesus A.
wrote:
> Dear PETSc team:
>
> For one of my applications, I need to know which owned DAG points in a
> (DMPlex) are other ranks' ghosts.
> Say, rank-0 has some point "x" (which it owns) and it shows up in, say,
> rank-1 as a ghost numbered "y".
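One way to compute this (a sketch, not from the thread: it assumes the point SF
encodes the ghosting and uses root degrees to flag owned points that other
ranks see):

PetscSF         sf;
const PetscInt *degree;
PetscInt        pStart, pEnd, p;

PetscCall(DMGetPointSF(dm, &sf));
PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
PetscCall(PetscSFComputeDegreeBegin(sf, &degree));
PetscCall(PetscSFComputeDegreeEnd(sf, &degree));
for (p = pStart; p < pEnd; ++p) {
  if (degree[p - pStart] > 0) {
    /* owned point p is a ghost on degree[p - pStart] other ranks */
  }
}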
d for consistency and comparison to these other (more appropriate!) uses
> of PETSc.
>
> Thanks.
> Kenneth
>
>
>
> *From: *Matthew Knepley
> *Date: *Wednesday, May 31, 2023 at 3:48 PM
> *To: *Kenneth C Hall
> *Cc: *petsc-users@mcs.anl.gov
> *Subject: *Re: [pe
On Wed, May 31, 2023 at 3:21 PM Kenneth C Hall
wrote:
> Hi,
>
>
>
> I am doing a number of problems using PETSc/SLEPc, but I also work on some
> non-PETSc/SLEPc flow solvers. I would like to use PETSc as a wrapper for
> this non-PETSc flow solver for compatibility, so I can use the tolerance
On Wed, May 31, 2023 at 2:43 PM YuSh Lo wrote:
> Matthew Knepley wrote on Wed, May 31, 2023 at 1:02 PM:
>
>> On Wed, May 31, 2023 at 1:53 PM YuSh Lo wrote:
>>
>>> Hi Matthew,
>>>
>>> Matthew Knepley wrote on Wed, May 31, 2023 at 5:08 AM:
>>>
>>>
On Wed, May 31, 2023 at 1:53 PM YuSh Lo wrote:
> Hi Matthew,
>
> Matthew Knepley wrote on Wed, May 31, 2023 at 5:08 AM:
>
>> On Wed, May 31, 2023 at 1:25 AM YuSh Lo wrote:
>>
>>> Hi,
>>>
>>> I have some multiple points constraint input as follows,
th exit code 1 (use -v to see
> invocation)
>
>
>
>
>
> I attach the complete log file.
>
>
>
> Thanks a lot for your help.
>
>
>
> Best regards,
>
>
>
> Joauma
>
>
>
> *From:* Matthew Knepley
> *Date:* Wednesday, May 31, 2023 at 12:03
>
On Wed, May 31, 2023 at 1:25 AM YuSh Lo wrote:
> Hi,
>
> I have some multiple points constraint input as follows,
>
> A_1 a_4
> B_2 b_5
> C_3 c_6
>
> each columns are stored in different IS.
>
So one IS lists the capital letter and one lists the lowercase?
> After dmplex distribute, they will
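A common way to carry such point sets through distribution (a sketch, assuming
the IS entries are plex points) is to mark them in a DMLabel before
distributing, since labels migrate with the mesh:

DMLabel label;
PetscCall(DMCreateLabel(dm, "constraints"));
PetscCall(DMGetLabel(dm, "constraints", &label));
PetscCall(DMLabelSetValue(label, point, 1)); /* e.g. 1 = capital column, 2 = lowercase */
PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist)); /* label travels with the mesh */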
On Wed, May 31, 2023 at 5:25 AM Joauma Marichal <
joauma.maric...@uclouvain.be> wrote:
> Hello,
>
>
>
> I am writing to you as I am trying to compile petsc on my mac.
>
>
>
> I used:
>
> $ export
>
On Mon, May 29, 2023 at 5:47 PM YuSh Lo wrote:
> Hi,
>
> How to get the offset in global vector for the points not owned by this
> processor?
> I have a parallel DMPlex and a section assigned to it.
> PetscSectionGetOffset with a global section returns -1 for the points not
> owned by this
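For reference, PETSc global sections encode unowned points as -(offset+1)
rather than a plain -1 (worth verifying for your version); a sketch of
recovering the owner's offset:

PetscInt off;
PetscCall(PetscSectionGetOffset(globalSection, point, &off));
if (off < 0) off = -(off + 1); /* unowned point: owner's global offset */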
Checking back. What does not work?
Thanks,
Matt
On Tue, Jan 24, 2023 at 11:26 AM Matthew Knepley wrote:
> On Tue, Jan 24, 2023 at 10:39 AM Berend van Wachem <
> berend.vanwac...@ovgu.de> wrote:
>
>> Dear Matt,
>>
>> I have been working on this now
On Tue, May 23, 2023 at 8:44 PM Ferrand, Jesus A.
wrote:
> Dear PETSc team:
>
> I am trying to use DMPlex and DMLabel to develop an API to write plexes to
> .cgns format in parallel.
> To that end, I need a way to extract the height-0 points and sort them by
> topological type (i.e., chunk of
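A sketch of one way to do that bucketing, assuming DMPlexGetCellType is
available in your version:

PetscInt cStart, cEnd, c;
PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
for (c = cStart; c < cEnd; ++c) {
  DMPolytopeType ct;
  PetscCall(DMPlexGetCellType(dm, c, &ct));
  /* bucket c by ct: DM_POLYTOPE_TRIANGLE, DM_POLYTOPE_TETRAHEDRON, ... */
}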
y. If this
>> amount of memory is sufficient for solving the matrix with approximately 3
>> million degrees of freedom?
>>
>> Thanks!
>> Zongze
>>
>> On Mon, May 22, 2023 at 20:03 Zongze Yang <
>>> yangzon...@gmail.com> wrote:
ees of freedom?
>
It really depends on the fill. Suppose that you get 1% fill. Then
(3e6)^2 * 0.01 * 8 ≈ 7e11 B
and you have 1.5e12 B, so I could easily see running out of memory.
Thanks,
Matt
> Thanks!
> Zongze
>
> On Mon, May 22, 2023 at 20:03 Zongze Yan
On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote:
> Hi,
>
> I hope this letter finds you well. I am writing to seek guidance regarding
> an error I encountered while solving a matrix using MUMPS on multiple nodes:
>
Iprobe is buggy on several MPI implementations. PETSc has an option for
CellCoordinatesLocal and CoordinatesLocal works.
>
> Best regards, Berend.
>
> Many thanks and best regards, Berend.
>
> On 5/17/23 23:04, Matthew Knepley wrote:
> > On Wed, May 17, 2023 at 2:01 PM Berend van Wachem
> > <berend.vanwac...@ovgu.de> wrote:
> >
> >
> 2 13 0 0
> 2 21 0 0
> 2 26 0 0
> 3 1 0 0
> $EndNodes
> $Elements
> 5 24 1 24
> 2 1 2 2
> 1 1 2 4
> 2 4 2 3
> 2 13 2 4
> 3 9 1 2
> 4 9 2 10
> 5 5 9 10
> 6 5 10 6
> 2 21 2 4
> 7 11 3 4
> 8 11 4 12
> 9 7 11 12
> 10 7 12 8
> 2 26 2 2
>
On Wed, May 17, 2023 at 6:58 PM neil liu wrote:
> Dear Petsc developers,
>
> I am writing my own code to calculate the FEM matrix. The following is my
> general framework,
>
> DMPlexCreateGmsh();
> MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
> DMPlexDistribute(.., .., &dmDist);
>
> dm = dmDist;
> //This can
should be 1.0.
>
> Am I doing something wrong?
>
Quickly, I see that
a *= 10.0 + 1.0;
is the same as
a *= 11.0;
not multiply by 10 and add 1. I will send it back when I get everything the
way I want.
Thanks,
Matt
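A one-line illustration of the intended fix, assuming the goal was
multiply-then-add:

a = a * 10.0 + 1.0; /* *= applies the whole right-hand side, so a *= 10.0 + 1.0 means a *= 11.0 */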
> Thanks and best regards,
>
> Berend.
>
> On 5/
On Wed, May 17, 2023 at 3:23 PM Matthew Knepley wrote:
> On Wed, May 17, 2023 at 2:59 PM Barry Smith wrote:
>
>>
>> Absolutely, that is fundamental to the design.
>>
>> In the simple case where all the degrees of freedom exist at the same
>> grid po
On Wed, May 17, 2023 at 2:59 PM Barry Smith wrote:
>
> Absolutely, that is fundamental to the design.
>
> In the simple case where all the degrees of freedom exist at the same
> grid points, hence storage is like u,v,t,p in the vector the nesting is
> trivial. You indicate the fields
DMSetCoordinatesLocal(dm, xl);
DMGetCellCoordinatesLocal(dm, &xl);
VecScale(xl, scale);
DMSetCellCoordinatesLocal(dm, xl);
Does this not work?
Thanks,
Matt
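Spelled out for both coordinate vectors (a sketch; the cell-coordinate vector
is only present for periodic meshes, hence the NULL check):

Vec xl;
PetscCall(DMGetCoordinatesLocal(dm, &xl));
PetscCall(VecScale(xl, scale));
PetscCall(DMSetCoordinatesLocal(dm, xl));
PetscCall(DMGetCellCoordinatesLocal(dm, &xl));
if (xl) {
  PetscCall(VecScale(xl, scale));
  PetscCall(DMSetCellCoordinatesLocal(dm, xl));
}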
Best regards, Berend.
>
> On 5/17/23 16:35, Matthew Knepley wrote:
> > On Wed, May 17, 2023 at 10:21 AM Berend van Wachem <
> berend.va
to push in an API for "just" scaling, but I could
be convinced
the other way.
Thanks,
Matt
> Thanks, Berend.
>
> On 5/17/23 16:10, Matthew Knepley wrote:
> > On Wed, May 17, 2023 at 10:02 AM Berend van Wachem
> > <berend.vanwac...@ovgu.de> wrote:
On Wed, May 17, 2023 at 10:02 AM Berend van Wachem
wrote:
> Dear PETSc Team,
>
> We are using DMPlex, and we create a mesh using
>
> DMPlexCreateBoxMesh ( );
>
> and get a uniform mesh. The mesh is periodic.
>
> We typically want to "scale" the coordinates (vertices) of the mesh, and
> to
On Wed, May 17, 2023 at 9:02 AM Fleischli Benno HSLU T <
benno.fleisc...@hslu.ch> wrote:
> Dear PETSc developers
>
> I am creating a very large parallel sparse matrix (MATMPIAIJ) with PETSc.
> I write this matrix to disk.
> The number of non-zeros exceeds the maximum number a 32-bit integer can
>
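The usual remedy (a suggestion; the reply is not shown in this snippet) is to
build PETSc with 64-bit indices, so that PetscInt, and hence nonzero counts,
is 64-bit:

./configure --with-64-bit-indices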
s,
Matt
> Thank you for your patience! I am still new to PETSc and learning how to
> use it.
>
>
>
> *From:* Matthew Knepley
> *Sent:* Sunday, May 14, 2023 12:24 PM
> *To:* Khaled Nabil Shar Abdelaziz
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-
in each direction for cell (i, j, k).
Thanks,
Matt
>
>
> Hope I make it clear, thanks!
>
>
>
> Regards,
>
> Kai
>
> Matthew Knepley wrote on Tue, May 16, 2023 at 16:29:
>
>> On Tue, May 16, 2023 at 10:27 AM K. Wu wrote:
>>
>>> Hi all,
On Tue, May 16, 2023 at 10:27 AM K. Wu wrote:
> Hi all,
>
> Good day!
>
> I am currently working on interpolating the nodal field vector I obtained
> to its corresponding elemental field vector. I am doing it in a simple way
> using a structured mesh; the element value is just the average of
ther, broken build.
I would completely clean out your PETSc installation and start from scratch.
Thanks,
Matt
> Thanks!
> Marcos
> ----------
> *From:* Matthew Knepley
> *Sent:* Monday, May 15, 2023 12:53 PM
> *To:* Vanella, Marcos (Fed)
> *Cc
sing
> the job to be terminated. The first process to do so was:
>
> Process name: [[48108,1],0]
> Exit code:174
> --
> Completed test examples
> Error while running make check
> make[1]: *** [c
On Mon, May 15, 2023 at 11:19 AM Vanella, Marcos (Fed) via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI
> 4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX
> Ventura 13.3.1.
> I can compile PETSc
On Mon, May 15, 2023 at 9:55 AM Zongze Yang wrote:
> On Mon, 15 May 2023 at 17:24, Matthew Knepley wrote:
>
>> On Sun, May 14, 2023 at 7:23 PM Zongze Yang wrote:
>>
>>> Could you try to project the coordinates into the continuity space
On Mon, May 15, 2023 at 9:30 AM Jed Brown wrote:
> Matthew Knepley writes:
>
> > On Fri, May 5, 2023 at 10:55 AM Vilmer Dahlberg via petsc-users <
> > petsc-users@mcs.anl.gov> wrote:
> >
> >> Hi.
> >>
> >>
> >> I'm tryin
On Fri, May 5, 2023 at 10:55 AM Vilmer Dahlberg via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hi.
>
>
> I'm trying to read a mesh of higher element order, in this example a mesh
> consisting of 10-node tetrahedral elements, from gmsh, into PETSC. But It
> looks like the mesh is not properly
>
> On Mon, 15 May 2023 at 04:24, Matthew Knepley wrote:
>
>> On Sun, May 14, 2023 at 12:27 PM Zongze Yang
>> wrote:
>>
>>>
>>>
>>>
>>> On Sun, 14 May 2023 at 23:54, Matthew Knepley wrote:
>>>
>>>> On Sun, May
On Sun, May 14, 2023 at 12:27 PM Zongze Yang wrote:
>
>
>
> On Sun, 14 May 2023 at 23:54, Matthew Knepley wrote:
>
>> On Sun, May 14, 2023 at 9:21 AM Zongze Yang wrote:
>>
>>> Hi, Matt,
>>>
>>> The issue has been resolved whil
On Sun, May 14, 2023 at 12:06 PM Khaled Nabil Shar Abdelaziz <
kabde...@purdue.edu> wrote:
> Hey there,
>
>
>
> I'm having a problem with the DMDASNESSetFunctionLocal() function in C and
> its Fortran counterpart. The thing is, in C, you can pass a bunch of
> variables using the ctx parameter,
> my previous oversight.
>
Great! If you make an MR for this, you will be included on the next list of
PETSc contributors. Otherwise, I can do it.
Thanks,
Matt
> Best wishes,
> Zongze
>
>
> On Sun, 14 May 2023 at 16:44, Matthew Knepley wrote:
>
>> On Sat, May 13, 2
for some references on the order of
>> the dofs on PETSc's FE Space (especially high order elements)?
>>
>> Thanks,
>>
>> Zongze
>>
>> Matthew Knepley wrote on Sat, Jun 18, 2022 at 20:02:
>>
>>> On Sat, Jun 18, 2022 at 2:16 AM Zongze Yang
>>> w
f New Hampshire
> matthew.yo...@unh.edu
> ==
>
>
> On Fri, May 12, 2023 at 10:05 AM Matthew Knepley
> wrote:
>
>> On Fri, May 12, 2023 at 9:40 AM Matthew Young <
>> myoung.space.scie...@gmail.com> wrote:
>>
>>> Got it.
> --Matt
> ==
> Matthew Young, PhD (he/him)
> Research Scientist II
> Space Science Center
> University of New Hampshire
> matthew.yo...@unh.edu
> ==========
>
>
> On Fri, May 12, 2023 at 5:15 AM Matthew Knepley wrote:
>
>
On Thu, May 11, 2023 at 9:15 PM Matthew Young <
myoung.space.scie...@gmail.com> wrote:
> Does setting up a PIC-type DMSWARM with an associated cell DM guarantee
> that each MPI rank will own the particles with coordinates inside the
> bounds of the portion of the grid it owns?
>
There is a
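The mechanism in question (paraphrasing the DMSwarm docs, since the reply is
cut off here) is migration: after particle coordinates change, particles are
shipped to the ranks owning their cells. A sketch:

/* after updating particle coordinates in swarm sw */
PetscCall(DMSwarmMigrate(sw, PETSC_TRUE)); /* PETSC_TRUE removes the sent copies locally */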
On Tue, May 9, 2023 at 3:12 PM neil liu wrote:
> Hello, Petsc Developers,
> I am trying to compile ksp/tutorial/ex36.cxx like make ex36,
>
> it shows an error
> " Documents/petsc-3.19.1/include/petscdmmoab.h:10:10: fatal error:
> moab/Core.hpp: No such file or directory
> #include <moab/Core.hpp> /*I
On Tue, May 9, 2023 at 10:05 AM LEONARDO MUTTI <
leonardo.mutt...@universitadipavia.it> wrote:
> Great thanks! I can now successfully run
> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/ex71f.F90.
>
> Going forward with my experiments, let me post a new code snippet (very
> similar
On Tue, May 9, 2023 at 4:15 AM Stephan Köhler <
stephan.koeh...@math.tu-freiberg.de> wrote:
> Dear PETSc/Tao team,
>
> it seems to be that there is a bug in the LMVM matrix class:
>
> The function MatMultAdd_LMVM, see, e.g.,
> https://petsc.org/release/src/ksp/ksp/utils/lmvm/lmvmimpl.c.html at
rk
is patch smoothing (I gave a paper reference). It could
be that we have a bug in LSC, but I thought we verified it with the
Shuttleworth paper.
Thanks,
Matt
> Mark
>
>
>>
>> Thanks a lot,
>> Sebastian
>>
>> On 03.05.2023 09:07, Sebastian Blauth wrote:
>> &
it is doing something morally similar.
Thanks,
Matt
> Qi
>
> On May 6, 2023, at 7:35 PM, Matthew Knepley wrote:
>
>
> On Sat, May 6, 2023 at 7:25 PM Jorti, Zakariae via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hello,
>>
>>
>&g
On Sun, May 7, 2023 at 10:02 AM Edoardo alinovi
wrote:
> Thanks,
>
> Is this a reasonable thing to do if I want to replicate what KSP is doing
> by default?
>
Yes. The other option is to pass along 'dummy'
Thanks,
Matt
--
What most experimenters take for granted before they begin
On Sun, May 7, 2023 at 9:42 AM Edoardo alinovi
wrote:
> Hi Matt,
>
> Ah, what if I do:
>
> KSPConvergedDefault(ksp, n, rnorm, flag, PETSC_NULL_FUNCTION, ierr)
>
> That looks to behave OK, but I am not sure about what I am doing -.-
>
You are saying that no convergence context was passed in
On Sun, May 7, 2023 at 9:21 AM Edoardo alinovi
wrote:
> Hello guys,
>
> Today I am about to write a custom convergence test for KSP doing the
> following job:
>
> - if the number of ksp iterations is less than a given threshold, iterate
> until that threshold is met
> - if the number of ksp
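A minimal C sketch of such a test (the thread itself is Fortran; the threshold
of 10 and the registration below are illustrative), deferring to
KSPConvergedDefault once the threshold is met:

static PetscErrorCode MyConvergedTest(KSP ksp, PetscInt n, PetscReal rnorm,
                                      KSPConvergedReason *reason, void *ctx)
{
  PetscFunctionBeginUser;
  if (n < 10) { /* illustrative threshold */
    *reason = KSP_CONVERGED_ITERATING;
    PetscFunctionReturn(PETSC_SUCCESS);
  }
  PetscCall(KSPConvergedDefault(ksp, n, rnorm, reason, ctx));
  PetscFunctionReturn(PETSC_SUCCESS);
}

/* registration: the context must come from KSPConvergedDefaultCreate()
   void *ctx;
   KSPConvergedDefaultCreate(&ctx);
   KSPSetConvergenceTest(ksp, MyConvergedTest, ctx, KSPConvergedDefaultDestroy); */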
On Sat, May 6, 2023 at 7:25 PM Jorti, Zakariae via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hello,
>
>
> I have a time-dependent model that I solve using TSSolve.
>
> And I am trying to adaptively change the step size (dt).
>
> I found that there are some TSAdapt schemes already available.
On Sat, May 6, 2023 at 11:47 AM Huidong Yang
wrote:
> Hi Petsc developer.
>
> may I ask if there is any available implementations in petsc
> using leap-frog scheme?
>
I don't think we have leapfrog, but we do have Stormer-Verlet, which is
also a 2nd order symplectic method.
Thanks,
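If it helps, the symplectic integrators can be selected from the command line
(a sketch; check the TSBASICSYMPLECTIC docs for your version):

-ts_type basicsymplectic -ts_basicsymplectic_type 2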
time index on the inside.
>> Then the blocks would be over all time, but limited space, which is more
>> the spirit of ASM I think.
>>
>> Have you considered waveform relaxation for this problem?
>>
>>Thanks,
>>
>> Matt
>>
>>
>>
On Fri, May 5, 2023 at 5:13 AM Edoardo alinovi
wrote:
> Hi Matt,
>
> I have some more questions on the fieldsplit saga :)
>
> I am running a 1M cell ahmed body case using the following options:
>
> "solver": "fgmres",
> "preconditioner": "fieldsplit",
> "absTol": 1e-6,
>
(Thu) at 12:08 PM, Barry Smith wrote:
>>
>>>
>>> You can configure with MUMPS ./configure --download-mumps
>>> --download-scalapack --download-ptscotch --download-metis
>>> --download-parmetis
>>>
>>> And then use MatMatSolve() as in src/m
ing the time index on the inside.
Then the blocks would be over all time, but limited space, which is more
the spirit of ASM I think.
Have you considered waveform relaxation for this problem?
Thanks,
Matt
> Hope this helps.
> Best,
> Leonardo
>
> Il giorno gio 4 mag 202
ng that
>> solver with a sparse matrix. This would give me confidence
>> that nothing in the solver is variable.
>>
>> I could do the sparse finite difference jacobian once, save it to disk,
> and then use that system each time.
>
Yes. That would work.
Thanks,
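A sketch of that save/load cycle with the standard binary viewer:

PetscViewer v;
/* save once */
PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "jac.dat", FILE_MODE_WRITE, &v));
PetscCall(MatView(J, v));
PetscCall(PetscViewerDestroy(&v));
/* reload in later runs */
PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "jac.dat", FILE_MODE_READ, &v));
PetscCall(MatLoad(J, v));
PetscCall(PetscViewerDestroy(&v));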
stem matrix followed by preconditioner matrix:
> Mat Object: 1 MPI process
> type: mffd
> rows=16384, cols=16384
> Matrix-free approximation:
> err=1.49012e-08 (relative error in function evaluation)
> Using wp compute h routine
>
On Thu, May 4, 2023 at 11:24 AM LEONARDO MUTTI <
leonardo.mutt...@universitadipavia.it> wrote:
> Thank you for the help.
> Adding to my example:
>
>
> call PCGASMSetSubdomains(pc, NSub, subdomains_IS, inflated_IS, ierr)
> call PCGASMDestroySubdomains(NSub, subdomains_IS, inflated_IS, ierr)
block size is 16
>>> linear system matrix = precond matrix:
>>> Mat Object: (sub_) 1 MPI process
>>> type: seqbaij
>>> rows=16384, cols=16384, bs=16
>>> total: nonzeros=1277952, allocated
rows=16384, cols=16384, bs=16
> total: nonzeros=1277952, allocated nonzeros=1277952
> total number of mallocs used during MatSetValues calls=0
> block size is 16
>
> On Thu, May 4, 2023 at 8:30 AM Mark Adams wrote:
>
>> If you are using MG what i
On Thu, May 4, 2023 at 8:21 AM Mark Lohry wrote:
> Do they start very similarly and then slowly drift further apart?
>
>
> Yes, this. I take it this sounds familiar?
>
> See these two examples with 20 fixed iterations pasted at the end. The
> difference for one solve is slight (final SNES norm
On Wed, May 3, 2023 at 6:05 AM Seung Lee Kwon (student, Aerospace Engineering)
wrote:
> Dear developers
>
> I'm trying to use parallel computing and I ran the command 'mpirun -np 4
> ./app'
>
> In this case, there are two problems.
>
> *First,* I encountered error message
> ///
> [0]PETSC ERROR: [1]PETSC ERROR:
On Tue, May 2, 2023 at 2:29 PM Jed Brown wrote:
> Sebastian Blauth writes:
>
> > I agree with your comment for the Stokes equations - for these, I have
> > already tried and used the pressure mass matrix as part of a (additive)
> > block preconditioner and it gave mesh independent results.
> >
blem accessing those nodes
> from the wrong partition unless those nodes are ghosted? Maybe I am not
> thinking about it correctly.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
>
>
> *From: *Matthew Knepley
> *Date: *Tuesday, 2 May 2023 at 13:35
> *To: *Cho
On Tue, May 2, 2023 at 9:07 AM Blauth, Sebastian <
sebastian.bla...@itwm.fraunhofer.de> wrote:
> Hello,
>
>
>
> I am having a problem using / configuring PETSc to obtain a scalable
> solver for the incompressible Navier Stokes equations. I am discretizing
> the equations using FEM (with the
On Tue, May 2, 2023 at 8:25 AM Karthikeyan Chockalingam - STFC UKRI via
petsc-users wrote:
> Hello,
>
>
>
> This is not exactly a PETSc question. I have a parallel partitioned finite
> element mesh. What are the steps involved in having a contiguous but unique
> set of node numbering from one
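For what it's worth, the usual recipe (a sketch, not from the reply, which is
cut off): count owned nodes, prefix-scan for a global offset, then push the new
numbers to the ghost copies:

PetscInt nOwned = /* locally owned nodes */ 0, end = 0, offset;
MPI_Scan(&nOwned, &end, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
offset = end - nOwned; /* owned node i gets global number offset + i */
/* then communicate the new numbers to ghosts, e.g. PetscSFBcast over the node SF */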
On Sun, Apr 30, 2023 at 1:12 PM Matthew Young <
myoung.space.scie...@gmail.com> wrote:
> Hi all,
>
> I am developing a particle-in-cell code that models ions as particles and
> electrons as an inertialess fluid. I use a PIC DMSWARM for the ions, which
> I gather into density and flux before
reporting that.
Matt
>
>
> Regards,
>
>
>
> Danyang
>
>
>
> *From: *
> *Date: *Friday, March 17, 2023 at 11:02 AM
> *To: *'Matthew Knepley'
> *Cc: *
> *Subject: *RE: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox?
>
>
>
> Hi Matt,
o there is some reason it is
not there.
> Is there any way to solve this problem? Or Do I have to reinstall PETSc?
>
Just rebuild
cd $PETSC_DIR
make
Thanks,
Matt
> Best regards
> Seung Lee Kwon
>
> On Wed, Apr 26, 2023 at 7:05 PM, Matthew Knepley wrote:
>
>> On Wed, Apr 26, 2023 at 6
On Wed, Apr 26, 2023 at 6:02 AM Seung Lee Kwon (student, Aerospace Engineering)
wrote:
> Dear developers
>
> Could you recommend the error messages below?
>
> /home/ksl/petsc/arch-linux-c-debug/bin/mpicxx -Wall -Wwrite-strings
> -Wno-strict-aliasing -Wno-unknown-pragmas -Wno-lto-type-mismatch
> -fstack-protector
On Tue, Apr 25, 2023 at 2:21 PM Suh, Hansol via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Actually, let me take that back , about inability to access LMVM.
>
> You can access LMVM Hessian mat after you are done with TaoSolve, but not
> within the iteration.
>
Good catch. You could do this
On Tue, Apr 25, 2023 at 8:53 AM Stefano Carli
wrote:
> Dear PETSc developers,
>
>
>
> I’m using PETSc version 3.14.1 coupled to a Fortran code, and I was
> wondering if there is a way of obtaining in output, possibly at each
> iteration, the estimated Hessian matrix for the BQNLS method.
>
It
> Alternatively, I want to know how to link the LAPACK library.
>
> best,
>
> Seung Lee Kwon
>
> On Tue, Apr 25, 2023 at 6:44 PM, Matthew Knepley wrote:
>
>> On Mon, Apr 24, 2023 at 11:47 PM Seung Lee Kwon (student, Aerospace Engineering)
>> wrote:
>>
>>> Dear all
>>
erShipRange of SubP; I will be able to
> parallelize the loop. Is MatGetOwnerShipRange also available for
> submatrices as well?
>
It is a local submatrix, so I would only run over local things
(parallelization is implicit).
I think I have shown all these operations in the example.
Thanks,
Matt
On Mon, Apr 24, 2023 at 11:47 PM Seung Lee Kwon (student, Aerospace Engineering)
wrote:
> Dear all
>
> It depends on the problem. It can have hundreds of thousands of degrees of
> freedom.
>
Suppose your matrix was dense and had 1e6 dofs. The work to invert a matrix
is O(N^3) with a small constant, so it would take (1e6)^3 = 1e18 flops; even
at 1e9 flops/s that is about 1e9 seconds, i.e. decades.
n the PetscSpaceEvaluate() function:
https://petsc.org/main/manualpages/SPACE/PetscSpaceEvaluate/
You can see pointers to the implementations at the bottom of that page.
Thanks,
Matt
> On Fri, Apr 21, 2023 at 12:37 PM neil liu wrote:
>
>> Thanks a lot. Very helpful.
>>
Solve_Private() at
> /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:824*
>
> *[0]PETSC ERROR: #7 KSPSolve() at
> /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:1070*
>
> *End of program *
>
> *so
CHKERRQ(ierr);
>
> ierr = PetscOptionsSetValue(NULL,"-fieldsplit_1_ksp_type",
> "preonly"); CHKERRQ(ierr);
>
> ierr = PetscOptionsSetValue(NULL,"-fieldsplit_1_pc_type", "lu");
> CHKERRQ(ierr);*/
>
>
>
> ierr = Pe
> Karthik.
>
>
>
>
>
> *From: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalin...@stfc.ac.uk>
> *Date: *Wednesday, 19 April 2023 at 17:52
> *To: *Matthew Knepley
> *Cc: *petsc-users@mcs.anl.gov
> *Subject: *Re: [petsc-users] Settin
On Mon, Apr 24, 2023 at 7:33 AM 吉兴洲 wrote:
> Dear all,
>
> I'm solving a fluid problem and it has multiple square cylinders in the
> flow area. Unfortunately, I have to use a Cartesian grid (nonuniform) which
> can't generated by Gmsh.
>
> I have noticed that the object *DMDA* and *DMFOREST*
/ 1 \ / 0 \ / x \ / 0 \ / y \ / 0 \
\ 0 / \ 1 / \ 0 / \ x / \ 0 / \ y /
six vectors with 2 components each.
Thanks,
Matt
> Thanks,
>
> Xiaodong
>
>
>
>
>
>
> On Fri, Apr 21, 2023 at 10:05 AM Matthew Knepley
> wrote:
>
>> On Fri, Apr 21, 2023 at 10:0
On Fri, Apr 21, 2023 at 10:02 AM neil liu wrote:
> Hello, Petsc group,
>
> I am learning the FE structure in Petsc by running case
> https://petsc.org/main/src/snes/tutorials/ex12.c.html with -run_type test
> -bc_type dirichlet -dm_plex_interpolate 0 -petscspace_degree 1
> -show_initial
work for this
system.
> I did some blind search using gamg/hypre and they look terrible. I guess
> I am missing a trick; probably they are not the way to go?
>
I believe that trick is that the patches you use have to be very specific.
Thanks,
Matt
> Thanks!
>
> On Mon, Apr 17
On Thu, Apr 20, 2023 at 6:13 AM Karthikeyan Chockalingam - STFC UKRI via
petsc-users wrote:
> Hello,
>
>
>
> I created a new thread, thought would it be more appropriate (and is a
> continuation of my previous post). I want to construct the below K matrix
> (which is composed of submatrices)
>
>
If this is a problem, let's make an example and I can debug it because I
thought that this worked.
I might have only tested with Plex.
Thanks,
Matt
On Wed, Apr 12, 2023 at 11:32 AM Mark Adams wrote:
> First, you don't want a DMShell. Just use da_swarm.
> See src/dm/tutorials/ex20.c
>
>
On Wed, Apr 19, 2023 at 8:40 AM Joauma Marichal <
joauma.maric...@uclouvain.be> wrote:
> Hello,
>
>
>
> I am using the DMSwarm library in some Eulerian-Lagrangian approach to
> have vapor bubbles in water.
>
> I would like the bubbles to have an impact on the water fields of the same
> cells and
this case.
Thanks,
Matt
> Best regards,
>
> Karthik.
>
>
>
>
>
>
>
> *From: *Matthew Knepley
> *Date: *Tuesday, 18 April 2023 at 11:08
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalin...@stfc.ac.uk>
> *Cc: *petsc-user
On Tue, Apr 18, 2023 at 5:24 AM Karthikeyan Chockalingam - STFC UKRI via
petsc-users wrote:
> Hello,
>
>
>
> I'm solving a problem using the Lagrange multiplier, the matrix has the
> form
>
>
>
> K = [A P^T
>      P 0]
>
>
>
> I am familiar with constructing K using MATMPIAIJ. However, I
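One natural fit for this block structure (a sketch; MATNEST composes K from
the submatrices without copying, and the transpose can stay implicit):

Mat PT, K;
Mat blocks[4];
PetscCall(MatCreateTranspose(P, &PT)); /* implicit P^T */
blocks[0] = A;  blocks[1] = PT;
blocks[2] = P;  blocks[3] = NULL;     /* NULL = zero block */
PetscCall(MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &K));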
penmp=1 --with-cxx-dialect=C++11
> --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices
> --with-make-np=256 --download-hpddm
> [1]PETSC ERROR: #1 buildTwo() at
> /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
>
> On Mon, Apr 17, 2023 at 4:55 PM Ma
I don't think so. Can you show the whole stack?
Thanks,
Matt
On Mon, Apr 17, 2023 at 6:24 PM Alexander Lindsay
wrote:
> If it helps: if I use those exact same options in serial, then no errors
> and the linear solve is beautiful :-)
>
> On Mon, Apr 17, 2023 at 4:22 PM Alexander Lindsay
On Mon, Apr 17, 2023 at 6:37 AM Edoardo alinovi
wrote:
> Sure thing, the solver I am working on is this one:
> https://gitlab.com/alie89/flubio-code-fvm.
>
> It is a 3D, collocated, unstructured, finite volume solver for the
> incompressible NS equations. I can run steady, unsteady and I can use SIMPLE,
On Mon, Apr 17, 2023 at 6:16 AM Edoardo alinovi
wrote:
> Do you mean the solver I am messing around? XD
>
Yes, and what physics it is targeting.
Thanks,
Matt
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results
On Mon, Apr 17, 2023 at 6:09 AM Edoardo alinovi
wrote:
> Thanks Matt, you're always there when you need <3
>
Glad it's working! Sometime you have to tell me what it is solving.
Thanks,
Matt
--
What most experimenters take for granted before they begin their
experiments is infinitely
On Mon, Apr 17, 2023 at 6:00 AM Edoardo alinovi
wrote:
> Hey Matt,
>
> Thanks for the help. Here is the error:
>
> [0]PETSC ERROR: - Error Message
> --
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: Not
On Mon, Apr 17, 2023 at 5:36 AM Edoardo alinovi
wrote:
> Hello Barry, Matt, Jed,
>
> I have just installed the latest and greatest version of petsc and I am
> hitting a problem I did not have in previous releases.
>
> Here is the error:
>
>
>
>
> *[1]PETSC ERROR: - Error
line is wrong. Let's start with a PETSc example. Can you build the Vec
tutorial ex1f90.F90?
cd $PETSC_DIR
cd src/vec/vec/tutorials/
make ex1f90
That will also show the correct link line.
Thanks,
Matt
> Danny.
>
> On 15 Apr 2023, at 13:41, Matthew Knepley wrote:
>
> On Sat, Apr 15, 2023 a