Cool. Which machine had the error you sent? I can't find it.

   Matt
On Oct 30, 2015 2:47 PM, "Barry Smith" <bsm...@mcs.anl.gov> wrote:

>
>   Satish can provide more details on how you can easily run in the EXACT
> environment where something broke, which makes debugging much faster. The
> model is (a condensed session sketch follows the steps):
>
> 1)  ssh pe...@login.mcs.anl.gov
>
> 2) ssh testmachine  (testmachine is always the last part of the log
> filename)
>
> 3) cd to either  /sandbox/petsc/petsc.test  or /home/petsc/petsc.test
> depending on the machine
>
> 4) git fetch
>
> 5) git checkout the broken branch
>
> 6) set PETSC_ARCH to the arch of the broken machine; it appears in the log
> filename
>
> 7) ./config/examples/${PETSC_ARCH}.py
>
> 8) build PETSc and debug away.
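>
>   For concreteness, here is a condensed sketch of such a session; the
> branch name and arch below are hypothetical placeholders (the real ones
> come from the failing log's filename):
>
>     ssh pe...@login.mcs.anl.gov
>     ssh testmachine                      # last part of the log filename
>     cd /sandbox/petsc/petsc.test         # or /home/petsc/petsc.test
>     git fetch
>     git checkout broken-branch           # hypothetical: the branch that failed
>     export PETSC_DIR=$PWD
>     export PETSC_ARCH=arch-testmachine   # hypothetical: from the log filename
>     ./config/examples/${PETSC_ARCH}.py
>     make all                             # then rerun the failing test under a debugger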
>
>   If you have any trouble with this, let Satish and me know. The intention
> is that debugging on the test machines should be straightforward, without
> requiring help from anyone or convoluted instructions.
>
>   Barry
>
> > On Oct 30, 2015, at 2:15 PM, Matthew Knepley <knep...@gmail.com> wrote:
> >
> > I ran it through valgrind on my machine with no problems. Checking logs
> >
> >    Matt
> >
> > On Thu, Oct 29, 2015 at 5:09 PM, Barry Smith <bsm...@mcs.anl.gov> wrote:
> >
> > <   [3] Roots referenced by my leaves, by rank
> > < Symmetric gradient null space: PASS
> > < Function tests pass for order 0 at tolerance 1e-10
> > < Function tests pass for order 0 derivatives at tolerance 1e-10
> > ---
> > > [2]PETSC ERROR: #1 PetscCommDuplicate() line 178 in /usr/home/balay/petsc.clone-3/src/sys/objects/tagm.c
> > > [2]PETSC ERROR: #2 PetscHeaderCreate_Private() line 60 in /usr/home/balay/petsc.clone-3/src/sys/objects/inherit.c
> > > [2]PETSC ERROR: #3 PetscSFCreate() line 44 in /usr/home/balay/petsc.clone-3/src/vec/is/sf/interface/sf.c
> > > [2]PETSC ERROR: #4 DMPlexDistribute() line 1562 in /usr/home/balay/petsc.clone-3/src/dm/impls/plex/plexdistribute.c
> > > [2]PETSC ERROR: #5 CreateMesh() line 232 in /usr/home/balay/petsc.clone-3/src/dm/impls/plex/examples/tests/ex3.c
> > > [2]PETSC ERROR: #6 main() line 911 in /usr/home/balay/petsc.clone-3/src/dm/impls/plex/examples/tests/ex3.c
> > > [2]PETSC ERROR: PETSc Option Table entries:
> > > [2]PETSC ERROR: -dim 3
> > > [2]PETSC ERROR: -dm_plex_max_projection_height 2
> > > [2]PETSC ERROR: -dm_view ascii::ASCII_INFO_DETAIL
> > > [2]PETSC ERROR: -malloc_dump
> > > [2]PETSC ERROR: -nox
> > > [2]PETSC ERROR: -nox_warning
> > > [2]PETSC ERROR: -num_comp 3
> > > [2]PETSC ERROR: -petscpartitioner_type simple
> > > [2]PETSC ERROR: -petscspace_order 1
> > > [2]PETSC ERROR: -petscspace_poly_tensor
> > > [2]PETSC ERROR: -qorder 1
> > > [2]PETSC ERROR: -simplex 0
> > > [2]PETSC ERROR: -test_fe_jacobian
> > > [2]PETSC ERROR: -tree
> > > [2]PETSC ERROR: ----------------End of Error Message -------send
> entire error message to petsc-ma...@mcs.anl.gov----------
> > > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 2
> > > [cli_2]: aborting job:
> > > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 2
> > >
> > > ===================================================================================
> > > =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> > > =   PID 15231 RUNNING AT wii
> > > =   EXIT CODE: 1
> > > =   CLEANING UP REMAINING PROCESSES
> > > =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> > > ===================================================================================
> > /usr/home/balay/petsc.clone-3/src/dm/impls/plex/examples/tests
> > Possible problem with runex3_nonconforming_tensor_3; diffs above
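> >
> > To rerun the failing case by hand on that machine, the option table above
> > suggests a command along these lines (a sketch; the process count is a
> > guess from the rank-2 failure, and the authoritative command line is in
> > the runex3_nonconforming_tensor_3 makefile target):
> >
> >     cd src/dm/impls/plex/examples/tests
> >     make ex3
> >     mpiexec -n 3 ./ex3 -dim 3 -dm_plex_max_projection_height 2 \
> >       -dm_view ascii::ASCII_INFO_DETAIL -malloc_dump -nox -nox_warning \
> >       -num_comp 3 -petscpartitioner_type simple -petscspace_order 1 \
> >       -petscspace_poly_tensor -qorder 1 -simplex 0 -test_fe_jacobian -tree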
> >
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > -- Norbert Wiener
>
>
