> On Feb 8, 2017, at 11:39 PM, Dave May <dave.mayhe...@gmail.com> wrote:
> 
> Any time you modify one of the submats, you need to call assembly begin/end 
> on that submatrix AND on the outer MATNEST object.
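
For concreteness, a minimal sketch of the pattern described above, assuming a
2x2 MATNEST whose (0,0) block has just been refilled; the names A and A00 are
illustrative placeholders, and error handling is abbreviated. A sketch of
constructing such a nest in the first place appears after the thread.

    /* Sketch only: update one submatrix of a MATNEST, then mark both
       the block and the outer nest matrix as assembled. */
    Mat            A;    /* outer MATNEST, created elsewhere */
    Mat            A00;  /* its (0,0) AIJ block */
    PetscErrorCode ierr;

    ierr = MatNestGetSubMat(A, 0, 0, &A00);CHKERRQ(ierr);
    /* ... MatSetValues() calls on A00 ... */
    ierr = MatAssemblyBegin(A00, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A00, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

    /* Without the same two calls on the outer matrix, MatMult() on A
       fails with "Object is in wrong state: Not for unassembled matrix". */
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);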

   Weird, and prone to errors, it seems to me. Perhaps this needs to be rethought.

> 
> Thanks,
>   Dave
> 
> 
> On Wed, 8 Feb 2017 at 22:51, Manav Bhatia <bhatiama...@gmail.com> wrote:
> aha.. that might be it.
> 
> Does that need to be called for the global matrix after each assembly of the 
> Jacobian blocks, or just once for the whole matrix?
> 
> -Manav
> 
> > On Feb 8, 2017, at 3:47 PM, Barry Smith <bsm...@mcs.anl.gov> wrote:
> >
> >
> >> On Feb 8, 2017, at 3:40 PM, Manav Bhatia <bhatiama...@gmail.com> wrote:
> >>
> >> Hi,
> >>
> >>   I have a nested matrix with 2x2 blocks. The blocks (1,1) and (2,2) are 
> >> AIJ matrices, and blocks (1,2) and (2,1) are shell matrices. I am running 
> >> the code with the argument -pc_type fieldsplit, and get the error shown 
> >> below.
> >>
> >>   I see that the error is complaining about the matrix being in an 
> >> unassembled state, but I believe I am initializing both diagonal blocks 
> >> and calling the assembly end routines on them. Still, I must be missing 
> >> something obvious.
> >
> >  It is complaining about the outermost matrix, not the blocks. Perhaps you 
> > haven't finished setting up your nest matrix?
> >
> >>
> >>   I would appreciate any guidance on this.
> >>
> >> Regards,
> >> Manav
> >>
> >>
> >>
> >> [0]PETSC ERROR: --------------------- Error Message 
> >> --------------------------------------------------------------
> >> [0]PETSC ERROR: Object is in wrong state
> >> [0]PETSC ERROR: Not for unassembled matrix
> >> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
> >> for trouble shooting.
> >> [0]PETSC ERROR: Petsc Release Version 3.7.4, Oct, 02, 2016
> >> [0]PETSC ERROR: 
> >> /Users/manav/Library/Developer/Xcode/DerivedData/MAST-crggwcqrouiyeucduvscdahjauvx/Build/Products/Debug/examples
> >>  on a arch-darwin-cxx-opt named Dhcp-90-250.HPC.MsState.Edu by manav Wed 
> >> Feb  8 15:28:04 2017
> >> [0]PETSC ERROR: Configure options 
> >> --prefix=/Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/../ 
> >> --CC=mpicc-openmpi-clang38 --CXX=mpicxx-openmpi-clang38 
> >> --FC=mpif90-openmpi-clang38 --with-clanguage=c++ --with-fortran=0 
> >> --with-mpiexec=/opt/local/bin/mpiexec-openmpi-clang38 
> >> --with-shared-libraries=1 --with-x=1 --with-x-dir=/opt/X11 
> >> --with-debugging=0 --with-lapack-lib=/usr/lib/liblapack.dylib 
> >> --with-blas-lib=/usr/lib/libblas.dylib --download-superlu=yes 
> >> --download-superlu_dist=yes --download-suitesparse=yes 
> >> --download-mumps=yes --download-scalapack=yes --download-parmetis=yes 
> >> --download-metis=yes --download-hypre=yes --download-ml=yes
> >> [0]PETSC ERROR: #1 MatMult() line 2248 in 
> >> /Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/src/mat/interface/matrix.c
> >> [0]PETSC ERROR: #2 PCApplyBAorAB() line 717 in 
> >> /Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/src/ksp/pc/interface/precon.c
> >> [0]PETSC ERROR: #3 KSP_PCApplyBAorAB() line 274 in 
> >> /Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/include/petsc/private/kspimpl.h
> >> [0]PETSC ERROR: #4 KSPGMRESCycle() line 156 in 
> >> /Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/src/ksp/ksp/impls/gmres/gmres.c
> >> [0]PETSC ERROR: #5 KSPSolve_GMRES() line 240 in 
> >> /Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/src/ksp/ksp/impls/gmres/gmres.c
> >> [0]PETSC ERROR: #6 KSPSolve() line 656 in 
> >> /Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/src/ksp/ksp/interface/itfunc.c
> >> [0]PETSC ERROR: #7 SNESSolve_NEWTONLS() line 230 in 
> >> /Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/src/snes/impls/ls/ls.c
> >> [0]PETSC ERROR: #8 SNESSolve() line 4005 in 
> >> /Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.7.4/src/snes/interface/snes.c
> >>
> >
> 
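
For reference, a minimal sketch of constructing the 2x2 nest described in the
thread (AIJ diagonal blocks, shell off-diagonal blocks); all block names,
sizes, and the MyMult12 callback are illustrative placeholders, not code from
the original poster.

    /* Sketch only: assemble a 2x2 MATNEST with AIJ diagonals and
       shell off-diagonals, then mark the outer matrix assembled. */
    Mat            A11, A22;   /* AIJ blocks, created and assembled elsewhere */
    Mat            B12, B21;   /* shell blocks with a user-defined MatMult, e.g.
                                  MatShellSetOperation(B12, MATOP_MULT,
                                                       (void (*)(void))MyMult12); */
    Mat            blocks[4], A;
    PetscErrorCode ierr;

    blocks[0] = A11;  blocks[1] = B12;   /* blocks in row-major order */
    blocks[2] = B21;  blocks[3] = A22;

    ierr = MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &A);CHKERRQ(ierr);
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);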
