Actually, given the subject of this email, "Finite difference approximation of 
Jacobian", what I suggested is a complete story anyway for Patrick, since the 
user cannot provide numerical values (of course Patrick could write a general 
new hash matrix type and then use it with zeros, but that is asking a lot of 
him).

  Regarding Fande's needs: one could rig things so that later the matrix could 
be "flipped" back to using the hasher for setting values when the contacts 
change. 
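To make the "flip" idea concrete, here is a minimal Python sketch (purely illustrative; the class and method names are hypothetical and none of this is PETSc API) of a matrix object that hashes entries until the first assembly, freezes the pattern, and can later be flipped back into hash mode when the sparsity changes:

```python
# Hypothetical sketch of a "flippable" matrix: hash-based insertion while the
# pattern is unknown, frozen pattern afterwards, with the option to flip back.

class FlippableMatrix:
    def __init__(self):
        self.mode = "hash"      # start by hashing entries (pattern unknown)
        self.entries = {}       # (row, col) -> accumulated value
        self.pattern = None     # frozen sparsity after first assembly

    def set_value(self, i, j, v):
        if self.mode == "hash":
            # Any (i, j) is allowed; duplicates accumulate.
            self.entries[(i, j)] = self.entries.get((i, j), 0.0) + v
        else:
            # Pattern is frozen; a new nonzero requires flipping back first.
            if (i, j) not in self.pattern:
                raise KeyError("new nonzero outside frozen pattern; flip first")
            self.entries[(i, j)] += v

    def assembly_end(self):
        # Freeze the pattern discovered by hashing; later insertions are cheap.
        self.pattern = frozenset(self.entries)
        self.mode = "assembled"

    def flip_to_hash(self):
        # Contacts changed: allow new nonzeros again.
        self.mode = "hash"
```

A real implementation would of course convert to a compressed format at assembly end rather than keep the dictionary; the point is only the mode toggle.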

> On Jan 12, 2022, at 2:48 AM, Barry Smith <bsm...@petsc.dev> wrote:
> 
> 
> 
>> On Jan 12, 2022, at 2:22 AM, Jed Brown <j...@jedbrown.org> wrote:
>> 
>> Because if a user jumps right into MatSetValues() and then wants to use the 
>> result, it better have the values. It's true that DM preallocation normally 
>> just "inserts zeros" and thus matches what MATPREALLOCATOR does.
> 
>  I am not saying the user jumps right into MatSetValues(). I am saying they 
> do a "value-free" assembly to set up the preallocation and then do the first 
> real assembly; hence very much JUST a refactorization of MATPREALLOCATOR. So 
> the user is "preallocation-aware" but does not have to do anything difficult 
> to determine their pattern analytically.
> 
>  Of course handling the values also is fine and simplifies user code and the 
> conceptual ideas (since preallocation ceases to exist as a concept for 
> users). I have no problem with skipping the "JUST a refactorization of 
> MATPREALLOCATOR" code, but for Patrick's current pressing need I think a 
> refactorization of MATPREALLOCATOR would be fastest to develop, hence I 
> suggested it.
> 
>  I did not look at Jed's branch, but one way to eliminate the preallocation 
> part completely would be to have something like MATHASH and then, when the 
> first assembly is complete, do a MatConvert to AIJ or whatever. The 
> conversion could be hidden inside the assembly end of the hash so the user 
> need not be aware that something different was done the first time through. 
> Doing it this way would easily allow multiple MATHASH* implementations 
> (written by different people) that would compete on speed and memory usage. 
> So there could be MatSetType() and a new MatSetInitialType(), where the 
> initial type would be automatically set to some MATHASH if preallocation 
> info is not provided.
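The MATHASH idea described above can be modeled in a few lines. This hypothetical Python sketch (illustrative only, not PETSc code; the function name is made up) accumulates entries in a hash table and performs the hidden hash-to-CSR "conversion" at assembly end:

```python
# Hypothetical model of the MATHASH idea: accumulate entries in a hash table,
# then convert to CSR inside "assembly end" so the user never sees
# preallocation at all.

def assemble_via_hash(triples, nrows):
    entries = {}                              # (row, col) -> accumulated value
    for i, j, v in triples:
        entries[(i, j)] = entries.get((i, j), 0.0) + v

    # The "MatConvert" step hidden at assembly end: hash -> CSR arrays.
    rowptr = [0] * (nrows + 1)
    cols, vals = [], []
    for (i, j) in sorted(entries):            # CSR wants row-major order
        rowptr[i + 1] += 1
        cols.append(j)
        vals.append(entries[(i, j)])
    for r in range(nrows):                    # prefix-sum the row counts
        rowptr[r + 1] += rowptr[r]
    return rowptr, cols, vals
```

Competing MATHASH* implementations would differ only in the hash-table data structure and the conversion step, which is exactly why they could be swapped freely behind a MatSetInitialType()-style interface.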
> 
>> 
>> Barry Smith <bsm...@petsc.dev> writes:
>> 
>>> Why does it need to handle values? 
>>> 
>>>> On Jan 12, 2022, at 12:43 AM, Jed Brown <j...@jedbrown.org> wrote:
>>>> 
>>>> I agree with this and even started a branch jed/mat-hash in (yikes!) 2017. 
>>>> I think it should be the default if no preallocation functions are called. 
>>>> But it isn't exactly the MATPREALLOCATOR code because it needs to handle 
>>>> values too. It should not be a lot of code and will essentially remove this 
>>>> FAQ and one of the most irritating subtle aspects of new codes using PETSc 
>>>> matrices.
>>>> 
>>>> https://petsc.org/release/faq/#assembling-large-sparse-matrices-takes-a-long-time-what-can-i-do-to-make-this-process-faster-or-matsetvalues-is-so-slow-what-can-i-do-to-speed-it-up
>>>> 
>>>> Barry Smith <bsm...@petsc.dev> writes:
>>>> 
>>>>> I think MATPREALLOCATOR as a MatType is a cumbersome, strange thing 
>>>>> and would prefer it were just functionality that Mat provided directly; 
>>>>> for example MatSetOption(mat, preallocator_mode, true); MatSetValues(), 
>>>>> ...; MatAssemblyBegin/End(). Now the matrix has its proper nonzero 
>>>>> structure of whatever type the user set initially (aij, baij, sbaij, 
>>>>> ...), and the user can use it efficiently.
>>>>> 
>>>>> Barry
>>>>> 
>>>>> So turning on the option just temporarily swaps out the operations for 
>>>>> MatSetValues() and MatAssemblyBegin/End() to be essentially those in 
>>>>> MATPREALLOCATOR. The refactorization should take almost no time and would 
>>>>> be faster than trying to rig DMStag to use MATPREALLOCATOR as is.
>>>>> 
>>>>> 
>>>>>> On Jan 11, 2022, at 9:43 PM, Matthew Knepley <knep...@gmail.com> wrote:
>>>>>> 
>>>>>> On Tue, Jan 11, 2022 at 12:09 PM Patrick Sanan <patrick.sa...@gmail.com> wrote:
>>>>>> Working on doing this incrementally, in progress here: 
>>>>>> https://gitlab.com/petsc/petsc/-/merge_requests/4712
>>>>>> 
>>>>>> This works in 1D for AIJ matrices, assembling a matrix with a maximal 
>>>>>> number of zero entries as dictated by the stencil width (which is 
>>>>>> intended to be very close to what DMDA would do if you associated all 
>>>>>> the unknowns with a particular grid point, which is the way DMStag 
>>>>>> largely works under the hood).
>>>>>> 
>>>>>> Dave, before I get into it, am I correct in my understanding that 
>>>>>> MATPREALLOCATOR would be better here because you would avoid superfluous 
>>>>>> zeros in the sparsity pattern, since this routine wouldn't have to 
>>>>>> assemble the Mat returned by DMCreateMatrix()?
>>>>>> 
>>>>>> Yes, here is how it works. You throw in all the nonzeros you come 
>>>>>> across. Preallocator is a hash table that can check for duplicates. At 
>>>>>> the end, it returns the sparsity pattern.
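Matt's description can be modeled directly. This hypothetical Python sketch (not the actual MATPREALLOCATOR code; the function name is made up) shows a hash table deduplicating the inserted positions and yielding per-row nonzero counts suitable for preallocation:

```python
# Hypothetical model of what MATPREALLOCATOR does: a set (hash table) collapses
# duplicate (row, col) insertions; at assembly end it yields the per-row
# nonzero counts that a real AIJ matrix could be preallocated with.

def count_nonzeros(insertions, nrows):
    pattern = set()                 # duplicates collapse automatically
    for i, j in insertions:
        pattern.add((i, j))
    nnz = [0] * nrows               # nonzeros per row
    for i, _ in pattern:
        nnz[i] += 1
    return nnz
```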
>>>>>> 
>>>>>> Thanks,
>>>>>> 
>>>>>>   Matt
>>>>>> 
>>>>>> If this seems like a sane way to go, I will continue to add some more 
>>>>>> tests (in particular, periodic BCs are not tested yet) and add the code 
>>>>>> for 2D and 3D. 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> On Mon, 13 Dec 2021 at 20:17, Dave May <dave.mayhe...@gmail.com> wrote:
>>>>>> 
>>>>>> 
>>>>>> On Mon, 13 Dec 2021 at 20:13, Matthew Knepley <knep...@gmail.com> wrote:
>>>>>> On Mon, Dec 13, 2021 at 1:52 PM Dave May <dave.mayhe...@gmail.com> wrote:
>>>>>> On Mon, 13 Dec 2021 at 19:29, Matthew Knepley <knep...@gmail.com> wrote:
>>>>>> On Mon, Dec 13, 2021 at 1:16 PM Dave May <dave.mayhe...@gmail.com> wrote:
>>>>>> 
>>>>>> 
>>>>>> On Sat, 11 Dec 2021 at 22:28, Matthew Knepley <knep...@gmail.com> wrote:
>>>>>> On Sat, Dec 11, 2021 at 1:58 PM Tang, Qi <tan...@msu.edu> wrote:
>>>>>> Hi,
>>>>>> Does anyone have comment on finite difference coloring with DMStag? We 
>>>>>> are using DMStag and TS to evolve some nonlinear equations implicitly. 
>>>>>> It would be helpful to have the coloring Jacobian option with that.
>>>>>> 
>>>>>> Since DMStag produces the Jacobian connectivity,
>>>>>> 
>>>>>> This is incorrect.
>>>>>> The DMCreateMatrix implementation for DMSTAG only sets the number of 
>>>>>> nonzeros (very inaccurately). It does not insert any zero values and 
>>>>>> thus the nonzero structure is actually not defined. 
>>>>>> That is why coloring doesn’t work.
>>>>>> 
>>>>>> Ah, thanks Dave.
>>>>>> 
>>>>>> Okay, we should fix that. It is perfectly possible to compute the 
>>>>>> nonzero pattern from the DMStag information.
>>>>>> 
>>>>>> Agreed. The API for DMSTAG is complete enough to enable one to loop 
>>>>>> over the cells and, for all quantities defined on the cell (centre, 
>>>>>> face, vertex), insert values into the appropriate slot in the matrix. 
>>>>>> Combined with MATPREALLOCATOR, I believe compact and readable 
>>>>>> preallocation code should be possible to write (cf. DMDA).
>>>>>> 
>>>>>> I think the only caveat with the approach of using all quantities 
>>>>>> defined on the cell is that it may slightly over-allocate, depending on 
>>>>>> how the user wishes to impose the boundary conditions, or slightly 
>>>>>> over-allocate for, say, Stokes, where there is no pressure-pressure 
>>>>>> coupling term.
>>>>>> 
>>>>>> Yes, and it would not handle higher-order stencils. I think the 
>>>>>> over-allocating is livable for the first implementation.
>>>>>> 
>>>>>> 
>>>>>> Sure, but neither does DMDA. 
>>>>>> 
>>>>>> The user always has to know what they are doing and set the stencil 
>>>>>> width accordingly.
>>>>>> I actually had this point listed in my initial email (along with the 
>>>>>> stencil-growth issue when using FD for nonlinear problems); however, I 
>>>>>> deleted it as all the same issues exist in DMDA and no one complains 
>>>>>> (at least not loudly) :D
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> Thanks,
>>>>>> 
>>>>>>   Matt
>>>>>> 
>>>>>> Thanks,
>>>>>> Dave
>>>>>> 
>>>>>> 
>>>>>> Paging Patrick :)
>>>>>> 
>>>>>> Thanks,
>>>>>> 
>>>>>>  Matt
>>>>>> 
>>>>>> Thanks,
>>>>>> Dave
>>>>>> 
>>>>>> 
>>>>>> you can use -snes_fd_color_use_mat. It has many options. Here is an 
>>>>>> example of us using that:
>>>>>> 
>>>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/snes/tutorials/ex19.c#L898
>>>>>> 
>>>>>> Thanks,
>>>>>> 
>>>>>>   Matt
>>>>>> 
>>>>>> Thanks,
>>>>>> Qi
>>>>>> 
>>>>>> 
>>>>>>> On Oct 15, 2021, at 3:07 PM, Jorti, Zakariae via petsc-users <petsc-users@mcs.anl.gov> wrote:
>>>>>>> 
>>>>>>> Hello,
>>>>>>> 
>>>>>>> Does the Jacobian approximation using coloring and finite differencing 
>>>>>>> of the function evaluation work in DMStag?   
>>>>>>> Thank you.
>>>>>>> Best regards,
>>>>>>> 
>>>>>>> Zakariae 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> -- 
>>>>>> What most experimenters take for granted before they begin their 
>>>>>> experiments is infinitely more interesting than any results to which 
>>>>>> their experiments lead.
>>>>>> -- Norbert Wiener
>>>>>> 
>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>> 
>>>>>> 
>>>>>> 
> 
