[petsc-dev] Using PCFieldSplitSetIS

2011-03-18 Thread Thomas Witkowski
Barry Smith wrote: >Did you get a chance to run as I requested with "Please run with -ksp_view > and send the output." > > Yes, the problem with the second iteration occurs only when I make use of bcgs for the outer iteration (which is set in my code as the standard solver). When using ri

[petsc-dev] Using PCFieldSplitSetIS

2011-03-18 Thread Thomas Witkowski
Jed, thanks for the clarification. I think I now understand what's going on when using the fieldsplit preconditioner. Thomas Jed Brown wrote: > On Wed, Mar 16, 2011 at 14:27, Thomas Witkowski > > wrote: > > > I'm going to guess that you still have

[petsc-dev] Using PCFieldSplitSetIS

2011-03-18 Thread Barry Smith
On Mar 18, 2011, at 6:23 AM, Thomas Witkowski wrote: > Barry Smith wrote: >> Did you get a chance to run as I requested with "Please run with -ksp_view >> and send the output." >> >> > Yes, the problem with the second iteration occurs only when I make use of > bcgs for the outer iteration

[petsc-dev] Using PCFieldSplitSetIS

2011-03-17 Thread Barry Smith
Did you get a chance to run as I requested with "Please run with -ksp_view and send the output." We are eager to determine the problem. Barry On Mar 16, 2011, at 11:03 AM, Barry Smith wrote: > > On Mar 16, 2011, at 8:57 AM, Thomas Witkowski wrote: > > -pc_fieldsplit_type schur -fi

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Jed Brown
On Wed, Mar 16, 2011 at 14:27, Thomas Witkowski < thomas.witkowski at tu-dresden.de> wrote: > >> I'm going to guess that you still have an outer KSP that (in the global >> norm, rather than the partitioned norm used inside of splits) has a tighter >> tolerance, therefore it takes a few outer itera

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Thomas Witkowski
Matthew Knepley wrote: > On Wed, Mar 16, 2011 at 8:57 AM, Thomas Witkowski > > wrote: > > Matt, it makes the output nicer, but does not help me to > understand what's going on inside of PETSc: > > > Residual norms for fieldsplit_boundary_ so

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Thomas Witkowski
Matt, it makes the output nicer, but does not help me to understand what's going on inside of PETSc: Residual norms for fieldsplit_boundary_ solve. 0 KSP Residual norm 1.790059331071e-04 1 KSP Residual norm 1.237356212928e-04 2 KSP Residual norm 7.952220245101e-05 3 KSP Residu

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Thomas Witkowski
Jed Brown wrote: > On Wed, Mar 16, 2011 at 07:37, Thomas Witkowski > > wrote: > > Thanks for explanations! It works fine in my code. But I have two > questions about it, maybe you can help me with them: > - To the first, is the LU factorizatio

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Jed Brown
On Wed, Mar 16, 2011 at 07:37, Thomas Witkowski < thomas.witkowski at tu-dresden.de> wrote: > Thanks for explanations! It works fine in my code. But I have two questions > about it, maybe you can help me with them: > - To the first, is the LU factorization on block A_00 done only once? > Yes, one

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Barry Smith
On Mar 16, 2011, at 8:57 AM, Thomas Witkowski wrote: -pc_fieldsplit_type schur -fieldsplit_interior_ksp_type preonly -fieldsplit_interior_pc_type bjacobi -fieldsplit_interior_sub_pc_type lu -fieldsplit_boundary_ksp_monitor -ksp_monitor_true_residual > Residual norms for fieldsplit_boundary
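The options quoted in this message form a complete Schur-complement fieldsplit configuration. Reconstructed as a single command line (the executable name `./app` and the leading `-pc_type fieldsplit` are assumptions; the split names `interior` and `boundary` match the monitor output quoted in the thread):

```shell
# Schur fieldsplit: exact (block-Jacobi + LU) solve on the interior block,
# monitored Krylov solve on the boundary block, true residual on the outer KSP.
./app -pc_type fieldsplit \
      -pc_fieldsplit_type schur \
      -fieldsplit_interior_ksp_type preonly \
      -fieldsplit_interior_pc_type bjacobi \
      -fieldsplit_interior_sub_pc_type lu \
      -fieldsplit_boundary_ksp_monitor \
      -ksp_monitor_true_residual
```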

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Matthew Knepley
On Wed, Mar 16, 2011 at 8:57 AM, Thomas Witkowski < thomas.witkowski at tu-dresden.de> wrote: > Matt, it makes the output nicer, but does not help me to understand what's > going on inside of PETSc: > > > Residual norms for fieldsplit_boundary_ solve. > 0 KSP Residual norm 1.790059331071e-04 >

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Matthew Knepley
On Wed, Mar 16, 2011 at 8:27 AM, Thomas Witkowski < thomas.witkowski at tu-dresden.de> wrote: > Jed Brown wrote: > > On Wed, Mar 16, 2011 at 07:37, Thomas Witkowski < >> thomas.witkowski at tu-dresden.de > >> wrote: >> >>Thanks for explanations! It wo

[petsc-dev] Using PCFieldSplitSetIS

2011-03-16 Thread Thomas Witkowski
Jed Brown wrote: > On Mon, Mar 14, 2011 at 12:32, Thomas Witkowski > > wrote: > > Should I define blocks or splits for the subdomains and the > interior nodes? And what is the best way to force PETSc to make > some LU factorization on each sub

[petsc-dev] Using PCFieldSplitSetIS

2011-03-14 Thread Jed Brown
On Mon, Mar 14, 2011 at 12:32, Thomas Witkowski < thomas.witkowski at tu-dresden.de> wrote: > Should I define blocks or splits for the subdomains and the interior nodes? > And what is the best way to force PETSc to make some LU factorization on > each subdomain and to store it (it is needed to cre

[petsc-dev] Using PCFieldSplitSetIS

2011-03-14 Thread Thomas Witkowski
Jed, I'm a little bit confused about your and Matt's answers. I played a little bit with PCFieldSplit, and I think that I got some basic understanding of this concept. My code now creates two splits. One for the unknowns of all subdomain interior nodes, and one for all unknowns on the subdomai

[petsc-dev] Using PCFieldSplitSetIS

2011-03-10 Thread Jed Brown
Thomas, I think Matt may have misunderstood what you wanted. PCFieldSplit does not solve multiple splits concurrently. You can emulate the first solve you need by defining a block diagonal system with one block per process. Then solving it with block Jacobi is the same as a direct solve. To expose
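Jed's observation, that for a matrix which is block diagonal with one block per process, block Jacobi with an exact subdomain solve is itself a direct solve, corresponds to a standard PETSc option combination (the executable name is an assumption):

```shell
# For a block-diagonal matrix with one block per process, block Jacobi
# with LU on each local block solves the system exactly in one application,
# so the outer KSP can be preonly.
./app -ksp_type preonly -pc_type bjacobi -sub_pc_type lu
```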

[petsc-dev] Using PCFieldSplitSetIS

2011-03-09 Thread Dave May
Assuming you have three IS's containing those indices (they will need to be sorted), then you just need to call PCFieldSplitSetIS() three times, passing in each IS in the order you want the blocks to be defined. Cheers, Dave On 9 March 2011 20:50, Thomas Witkowski wrote: > As I already aske
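The three-call pattern Dave describes can be sketched as a minimal C program. The index values and split names are illustrative assumptions, the signature follows the current PETSc API, and error checking (CHKERRQ) is omitted for brevity:

```c
/* Sketch: defining three named splits with PCFieldSplitSetIS().
 * Index values and split names are illustrative assumptions. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  KSP      ksp;
  PC       pc;
  IS       isInterior, isBoundary, isCoarse;
  PetscInt interiorIdx[] = {0, 1, 2, 3}; /* indices must be sorted within each IS */
  PetscInt boundaryIdx[] = {4, 5};
  PetscInt coarseIdx[]   = {6, 7};

  PetscInitialize(&argc, &argv, NULL, NULL);

  ISCreateGeneral(PETSC_COMM_WORLD, 4, interiorIdx, PETSC_COPY_VALUES, &isInterior);
  ISCreateGeneral(PETSC_COMM_WORLD, 2, boundaryIdx, PETSC_COPY_VALUES, &isBoundary);
  ISCreateGeneral(PETSC_COMM_WORLD, 2, coarseIdx,   PETSC_COPY_VALUES, &isCoarse);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);

  /* The call order fixes the block ordering; each name becomes the
   * -fieldsplit_<name>_ option prefix for that split's inner solver. */
  PCFieldSplitSetIS(pc, "interior", isInterior);
  PCFieldSplitSetIS(pc, "boundary", isBoundary);
  PCFieldSplitSetIS(pc, "coarse",   isCoarse);

  /* KSPSetOperators(ksp, A, A) and KSPSolve(ksp, b, x) would follow here. */

  ISDestroy(&isInterior);
  ISDestroy(&isBoundary);
  ISDestroy(&isCoarse);
  KSPDestroy(&ksp);
  PetscFinalize();
  return 0;
}
```

With splits named this way, options such as `-fieldsplit_interior_pc_type lu` address the corresponding inner solver from the command line.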

[petsc-dev] Using PCFieldSplitSetIS

2011-03-09 Thread Thomas Witkowski
As I already asked on petsc-users, I want to implement some kind of iterative substructuring algorithm in my fem code. It was suggested to me to switch to the dev version of petsc and to make use of PCFieldSplit. So far I have installed petsc-dev and read a little bit about the PCFieldSplit

[petsc-dev] Using PCFieldSplitSetIS

2011-03-09 Thread Matthew Knepley
Each process calls PCFieldSplitSetIS() with the indices for THAT FIELD that are owned by that process. Each process calls PCFieldSplitSetIS() n+1 times if you have n+1 fields. Identifying fields with processes is a mistake. Matt On Wed, Mar 9, 2011 at 1:50 PM, Thomas Witkowski < Thomas.Witkowski at
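The parallel pattern Matt describes can be sketched as a fragment, assuming `pc` is already configured as PCFIELDSPLIT and each rank holds hypothetical arrays of its locally owned indices for two fields:

```c
/* Fragment: every rank passes ONLY its locally owned indices for each
 * field, and every rank makes the same sequence of calls, one per field.
 * nLocalU, localU, nLocalP, localP are hypothetical per-rank data. */
IS isU, isP;
ISCreateGeneral(PETSC_COMM_WORLD, nLocalU, localU, PETSC_COPY_VALUES, &isU);
ISCreateGeneral(PETSC_COMM_WORLD, nLocalP, localP, PETSC_COPY_VALUES, &isP);
PCFieldSplitSetIS(pc, "u", isU); /* called collectively on every rank */
PCFieldSplitSetIS(pc, "p", isP);
```

The key point is that the splits are global objects assembled from per-rank contributions; no rank owns a whole field by itself.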