Re: [petsc-users] DMShellSetCreateRestriction

2016-03-11 Thread Dave May
> Other suggestions on how to best integrate staggered finite differences
> within the current PETSc framework are of course also highly welcome.
> Our current thinking was to pack it into a DMSHELL (which has the problem
> of not having a restriction interface).

Using DMShell is the cleanest ap

Re: [petsc-users] DMShellSetCreateRestriction

2016-03-11 Thread Boris Kaus
Thanks - that would be great!

> On Mar 11, 2016, at 10:25 PM, Barry Smith wrote:
>
> Boris,
>
> We will add this support to the DMShell and its usage from PCMG within a
> few days.
>
> Barry
>
>> On Mar 11, 2016, at 3:39 PM, Boris Kaus wrote:
>>
>>> On Mar 11, 2016, at 8:53 PM

Re: [petsc-users] CPU vs GPU for PETSc applications

2016-03-11 Thread Jed Brown
Justin Chang writes:

> Matt,
>
> So what's an example of "doing a bunch of iterations to make sending the
> initial data down worth it"?

CG/Jacobi for a high resolution problem. You pretty much have to have thrown in the towel on finding a good preconditioner, otherwise you'd be at risk of solv
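The CG/Jacobi combination Jed mentions does very little work per iteration but takes many iterations, so the one-time cost of moving the data onto the GPU gets amortized. A minimal NumPy sketch of preconditioned CG on a 1D Laplacian (illustrative only, not PETSc code) shows the iteration count growing with resolution:

```python
import numpy as np

def laplacian_1d(n):
    """Dense 1D Laplacian with the standard (-1, 2, -1) stencil on n points."""
    return (np.diag(2.0 * np.ones(n))
            - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1))

def cg_jacobi(A, b, rtol=1e-8, maxit=10000):
    """Preconditioned CG with Jacobi (diagonal) preconditioning."""
    x = np.zeros_like(b)
    d_inv = 1.0 / np.diag(A)          # Jacobi: apply diag(A)^{-1}
    r = b - A @ x
    z = d_inv * r
    p = z.copy()
    rz = r @ z
    for it in range(1, maxit + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < rtol * np.linalg.norm(b):
            return x, it
        z = d_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit

# Iteration count grows with resolution: many cheap, bandwidth-bound
# matrix-vector products, which is the regime where GPUs pay off.
for n in (50, 200):
    x, iters = cg_jacobi(laplacian_1d(n), np.ones(n))
    print(f"n = {n:4d}: {iters} iterations")
```

Each iteration is essentially one matrix-vector product plus a few vector updates; once the matrix is resident on the device, that repeated cheap work is what makes the initial transfer worthwhile.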

Re: [petsc-users] DMShellSetCreateRestriction

2016-03-11 Thread Barry Smith
Boris,

We will add this support to the DMShell and its usage from PCMG within a few days.

Barry

> On Mar 11, 2016, at 3:39 PM, Boris Kaus wrote:
>
>> On Mar 11, 2016, at 8:53 PM, Matthew Knepley wrote:
>>
>> On Fri, Mar 11, 2016 at 12:26 PM, Dave May wrote:
>> On 11 March 2016

Re: [petsc-users] DMShellSetCreateRestriction

2016-03-11 Thread Boris Kaus
> On Mar 11, 2016, at 8:53 PM, Matthew Knepley wrote:
>
> On Fri, Mar 11, 2016 at 12:26 PM, Dave May wrote:
> On 11 March 2016 at 18:11, anton wrote:
> Hi team,
>
> I'm implementing staggered grid in a PETSc-canonical way, trying

Re: [petsc-users] DMShellSetCreateRestriction

2016-03-11 Thread Matthew Knepley
On Fri, Mar 11, 2016 at 12:26 PM, Dave May wrote:

> On 11 March 2016 at 18:11, anton wrote:
>
>> Hi team,
>>
>> I'm implementing a staggered grid in a PETSc-canonical way, trying to build
>> a custom DM object and attach it to SNES, which should later be transferred
>> further to KSP and PC.
>>
>> Yet,

Re: [petsc-users] DMShellSetCreateRestriction

2016-03-11 Thread Dave May
On 11 March 2016 at 18:11, anton wrote:

> Hi team,
>
> I'm implementing a staggered grid in a PETSc-canonical way, trying to build
> a custom DM object and attach it to SNES, which should later be transferred
> further to KSP and PC.
>
> Yet, the Galerkin coarsening for a staggered grid is non-symmetric.

[petsc-users] DMShellSetCreateRestriction

2016-03-11 Thread anton
Hi team,

I'm implementing a staggered grid in a PETSc-canonical way, trying to build a custom DM object and attach it to SNES, which should later be transferred further to KSP and PC.

Yet, the Galerkin coarsening for a staggered grid is non-symmetric. The question is how possible is it that DMShellS
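The issue behind this question can be seen in a small NumPy sketch (illustrative only, not PETSc code; the 1D cell-centered grid and the particular stencil weights are assumptions for the example). When the restriction R is not a scalar multiple of the interpolation transpose P^T, the Galerkin coarse operator has to be formed as A_c = R A P rather than P^T A P, so the multigrid framework needs R supplied as a separate operator:

```python
import numpy as np

n_f, n_c = 6, 3  # fine cells, coarse cells (2:1 coarsening)

# Fine-grid 1D Laplacian, stencil (-1, 2, -1)
A = (np.diag(2.0 * np.ones(n_f))
     - np.diag(np.ones(n_f - 1), 1)
     - np.diag(np.ones(n_f - 1), -1))

# Interpolation P (fine x coarse): linear, cell-centered 3/4-1/4 weights,
# with first-order treatment at the boundaries.
P = np.zeros((n_f, n_c))
P[0, 0] = 1.0
P[1, 0], P[1, 1] = 0.75, 0.25
P[2, 1], P[2, 0] = 0.75, 0.25
P[3, 1], P[3, 2] = 0.75, 0.25
P[4, 2], P[4, 1] = 0.75, 0.25
P[5, 2] = 1.0

# Restriction R (coarse x fine): plain averaging of each fine-cell pair.
# R is NOT a scalar multiple of P.T -- the two operators are independent.
R = np.zeros((n_c, n_f))
for j in range(n_c):
    R[j, 2 * j] = R[j, 2 * j + 1] = 0.5

# Galerkin coarse operator: must be formed with the separate R,
# which is why a restriction hook on the DM (not just interpolation,
# from which PCMG would otherwise take the transpose) is needed.
A_c = R @ A @ P
print(A_c)
```

Here R and P even have different sparsity patterns, so no rescaling of the interpolation transpose can reproduce the restriction; this is the situation a staggered-grid discretization runs into.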

Re: [petsc-users] CPU vs GPU for PETSc applications

2016-03-11 Thread Matthew Knepley
On Thu, Mar 10, 2016 at 4:48 PM, Justin Chang wrote:

> Matt,
>
> So what's an example of "doing a bunch of iterations to make sending the
> initial data down worth it"? Is there a correlation between that and
> arithmetic intensity, where an application is likely to be more
> compute-bound and mem

Re: [petsc-users] asynchronous solve

2016-03-11 Thread Matthew Knepley
On Fri, Mar 11, 2016 at 12:48 AM, peter tutuk wrote:

> I am developing my own nonlinear solver and I would like to achieve
> asynchronous solve for subdomains.
>
> Is there any example around how to use PETSc in such a case? Or in
> general, is there a possibility to achieve desired behavior, whi