>> > application called MPI_Abort(MPI_COMM_WORLD, 0) - process 0
>> > _pmii_daemon(SIGCHLD): [NID 10649] [c23-3c0s6n1] [Mon Apr 2 13:06:48
>> 2012] PE 0 exit signal Aborted
>> > Application 133198 exit codes: 134
>> > Application 133198 resources: utime ~1s, stime ~0s
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
-- next part --
An HTML attachment was scrubbed...
URL:
<http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20120402/93e0a49c/attachment-0001.htm>
Satish
Things work fine on my linux machine (and other linux clusters) and
valgrind shows no error. Unfortunately Totalview (GUI starts fine on the
node) gives me a licensing error on the Cray.
I will continue to explore.
Thanks
Tabrez
On 04/02/2012 09:05 PM, Satish Balay wrote:
> Sounds like a Cray machine.
> > _pmii_daemon(SIGCHLD): [NID 10649] [c23-3c0s6n1] [Mon Apr 2 13:06:48
> 2012] PE 0 exit signal Aborted
> > Application 133198 exit codes: 134
> > Application 133198 resources: utime ~1s, stime ~0s
>>>>>>>>> SCRGP2$ ./ex52 -dim 2 -compute_function -show_residual -batch
>>>>>>>>> Residual:
>>>>>>>>> Vector Object: 1 MPI processes
>>>>>>>>> type: seq
>>>>>>>>> -0.25
>>>>>>>>> -0.5
>>>>>>>>> 0.25
>>>>>>>>> -0.5
>>>>>>>>> -1
>>>>>>>>> 0.5
>>>>>>>>> 0.25
>>>>>>>>> 0.5
>>>>>>>>> 0.75
>>>>>>>>> SCRGP2$ ./ex52 -dim 2 -compute_function -show_residual -batch -gpu
>>>>>>>>> [0]PETSC ERROR: IntegrateElementBatchGPU() line 323 in
>>>>>>>>> src/snes/examples/tutorials/ex52_integrateElement.cu
>>>>>>>>> [0]PETSC ERROR: FormFunctionLocalBatch() line 679 in
>>>>>>>>> src/snes/examples/tutorials/ex52.c
>>>>>>>>> [0]PETSC ERROR: SNESDMComplexComputeFunction() line 431 in
>>>>>>>>> src/snes/utils/damgsnes.c
>>>>>>>>> [0]PETSC ERROR: main() line 1021 in
>>>>>>>>> src/snes/examples/tutorials/ex52.c
>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 35) - process 0
>>>>>>>>>
>>>>>>>>
>>>>>>>> This is failing on cudaMalloc(), which means your card is not
>>>>>>>> available for running. Are you trying to run on your laptop?
>>>>>>>> If so, applications like Preview can lock up the GPU. I know of no
>>>>>>>> way to test this in CUDA while running. I just close
>>>>>>>> apps until it runs.
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>>
>>>>>>>> Matt
>>>>>>>>
>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Mar 27, 2012 at 8:37 PM, Matthew Knepley <
>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> On Tue, Mar 27, 2012 at 2:10 PM, Blaise Bourdin wrote:
>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Mar 27, 2012, at 1:23 PM, Matthew Knepley wrote:
>>>>>>>>>>>
>>>>>>>>>>> On Tue, Mar 27, 2012 at 12:58 PM, David Fuentes <
>>>>>>>>>>> fuentesdt at gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi,
>>>>>>>>>>>>
>>>>>>>>>>>> I had a question about the status of example 52.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> http://petsc.cs.iit.edu/petsc/petsc-dev/file/a8e2f2c19319/src/snes/examples/tutorials/ex52.c
>>>>>>>>>>>>
>>>>>>>>>>>> http://petsc.cs.iit.edu/petsc/petsc-dev/file/a8e2f2c19319/src/snes/examples/tutorials/ex52_integrateElement.cu
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Can this example be used with a DM object created from an
>>>>>>>>>>>> unstructured exodusII mesh, DMMeshCreateExodus, And the FEM
>>>>>>>>>>>> assembly done
>>>>>>>>>>>> on GPU ?
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> 1) I have pushed many more tests for it now. They can be run
>>>>>>>>>>> using the Python build system
>>>>>>>>>>>
>>>>>>>>>>> ./config/builder2.py check src/snes/examples/tutorials/ex52.c
>>>>>>>>>>>
>>>>>>>>>>> In fact, you can build any set of files this way.
>>>>>>>>>>>
>>>>>>>>>>> 2) The Exodus creation has to be converted to DMComplex from
>>>>>>>>>>> DMMesh. That should not take me very long. Blaise maintains that
>>>>>>>>>>> so maybe there will be help :) You will just replace
>>>>>>>>>>> DMComplexCreateBoxMesh() with DMComplexCreateExodus(). If you
>>>>>>>>>>> request
>>>>>>>>>>> it, I will bump it up the list.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> DMMeshCreateExodusNG is much more flexible
>>>>>>>>>>> than DMMeshCreateExodus in that it can read meshes with multiple
>>>>>>>>>>> element types and should have a much lower memory footprint. The
>>>>>>>>>>> code should be fairly easy to read. You can email me directly if
>>>>>>>>>>> you have specific questions. I had looked at creating a DMComplex
>>>>>>>>>>> and it did not look too difficult, as long as interpolation is not
>>>>>>>>>>> needed. I have plans to write DMComplexCreateExodus, but haven't
>>>>>>>>>>> had time to so far. Updating the Vec viewers and readers may be a
>>>>>>>>>>> bit more involved. In a perfect world, one would write an EXODUS
>>>>>>>>>>> viewer following the lines of the VTK and HDF5 ones.
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> David and Blaise, I have converted this function, now
>>>>>>>>>> DMComplexCreateExodus(). It's not tested, but I think
>>>>>>>>>> Blaise has some stuff we can use to test it.
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>>
>>>>>>>>>> Matt
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>> Blaise
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Let me know if you can run the tests.
>>>>>>>>>>>
>>>>>>>>>>> Thanks
>>>>>>>>>>>
>>>>>>>>>>> Matt
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>> Thanks,
>>>>>>>>>>>> David
>>>>>>>>>>>>
>>>>>>>>>>> --
>>>>>>>>>>> Department of Mathematics and Center for Computation & Technology
>>>>>>>>>>> Louisiana State University, Baton Rouge, LA 70803, USA
>>>>>>>>>>> Tel. +1 (225) 578 1612, Fax +1 (225) 578 4276
>>>>>>>>>>> http://www.math.lsu.edu/~bourdin
Sounds like a Cray machine.
-start_in_debugger is useful for debugging on workstations [or
clusters] etc. where there is some control over X11 tunnels. Also
'xterm', 'gdb' or a similar debugger should be available on the compute
nodes [along with an X/ssh tunnel].
On a cray - you are better off looking f
ng job:
> > application called MPI_Abort(MPI_COMM_WORLD, 0) - process 0
> > _pmii_daemon(SIGCHLD): [NID 10649] [c23-3c0s6n1] [Mon Apr 2 13:06:48
> 2012] PE 0 exit signal Aborted
> > Application 133198 exit codes: 134
> > Application 133198 resources: utime ~1s, stime ~0s
On Apr 2, 2012, at 8:10 PM, Tabrez Ali wrote:
> Hello
>
> I am trying to debug a program using the switch '-on_error_attach_debugger'
> but the vendor/sysadmin built PETSc 3.2.00 is unable to start the debugger
> in xterm (see text below). But xterm is installed. What am I doing wrong?
>
> Btw the segfault happens during a call to MatMult but only wit
> _pmii_daemon(SIGCHLD): [NID 10649] [c23-3c0s6n1] [Mon Apr 2 13:06:48
> 2012] PE 0 exit signal Aborted
> Application 133198 exit codes: 134
> Application 133198 resources: utime ~1s, stime ~0s
>
Hello
I am trying to debug a program using the switch
'-on_error_attach_debugger' but the vendor/sysadmin built PETSc 3.2.00
is unable to start the debugger in xterm (see text below). But xterm is
installed. What am I doing wrong?
Btw the segfault happens during a call to MatMult but only wit
Oh, OK, so you do not explicitly scale the matrix. Good. We had a user (John)
recently who was using diagonal scaling and getting failures with GAMG; then
GAMG worked once he was no longer using diagonal scaling, so I was thinking
this was the problem.
Mark
On Apr 2, 2012, at 7:40 PM, Barry wrote:
> scaling? If it is Mat then we might want MatSetNearNullSpace to do this,
> otherwise we should think of a good way to deal with this. It is very error
> prone to not do the right thing here, we should at least throw an error.
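The scaling issue discussed above can be seen in a few lines. This is a pure-Python illustration (not PETSc code; the matrix and scaling are made up): if B = D^{-1/2} A D^{-1/2} is the symmetrically scaled operator, a null vector v of A is no longer a null vector of B, but D^{1/2} v is — which is exactly why the near-null modes must be rescaled whenever the matrix is diagonally scaled.

```python
# Tiny illustration: symmetric diagonal scaling B = D^{-1/2} A D^{-1/2}
# changes the null space. A null vector v of A is NOT a null vector of B;
# D^{1/2} v is. (Pure-Python sketch; A and D are hypothetical.)

def matvec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

A = [[1.0, -1.0],
     [-1.0, 1.0]]           # singular: v = (1, 1) spans the null space
v = [1.0, 1.0]

d = [4.0, 1.0]              # diagonal of D (hypothetical scaling)
B = [[A[i][j] / (d[i] ** 0.5 * d[j] ** 0.5) for j in range(2)]
     for i in range(2)]     # B = D^{-1/2} A D^{-1/2}

print(matvec(A, v))                          # [0.0, 0.0]: v is null for A
print(matvec(B, v))                          # [-0.25, 0.5]: v is NOT null for B
w = [d[i] ** 0.5 * v[i] for i in range(2)]   # w = D^{1/2} v
print(matvec(B, w))                          # [0.0, 0.0]: w is null for B
```

So whichever component owns the diagonal scaling must also apply D^{1/2} to the near-null-space vectors, or the AMG setup silently sees the wrong kernel.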
Mark,
Who would be doing the "diagonal scaling"? Under normal conditions we never
want to be passing to any preconditioner a diagonally scaled matrix (for
exactly the issue you point out), so why are you worried about that case?
Barry
On Apr 2, 2012, at 6:40 PM, Mark F. Adams wrote:
> (I'll make sure this is currently working with ML and reply to this message.)
cal (DMGlobalToLocalBegin/End) using a BOX stencil
> with width at least 1, get the global array u_new[][][] from UGlobalNew and
> the local arrays u[][][] from Ulocal, then assign u_new[k][j][i] =
> u[k-1][j-1][i-1].
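The update-and-shift recipe quoted above can be sketched without PETSc. This is a plain-Python illustration of the indexing only (array names and the field f() are hypothetical; in PETSc the halo would be filled by DMGlobalToLocalBegin/End with a BOX stencil of width >= 1):

```python
# Sketch of the shift u_new[k][j][i] = u[k-1][j-1][i-1] on a local array
# with a 1-wide ghost halo. Local index l maps to global index l - 1, so
# the halo covers global indices -1 .. N, exactly what makes the k-1
# index legal at k = 0. (Not PETSc code; the halo exchange is faked.)

N = 4                       # owned grid points per dimension (hypothetical)

def f(k, j, i):             # stand-in for the field values, incl. ghosts
    return 100 * k + 10 * j + i

# Ghosted "local" array: u[lk][lj][li] holds the value at global point
# (lk - 1, lj - 1, li - 1), as a width-1 ghost update would provide.
u = [[[f(k - 1, j - 1, i - 1) for i in range(N + 2)]
      for j in range(N + 2)] for k in range(N + 2)]

# Owned-only result: u_new[k][j][i] = u[(k-1)+1][(j-1)+1][(i-1)+1]
u_new = [[[u[k][j][i] for i in range(N)]
          for j in range(N)] for k in range(N)]

print(u_new[1][1][1])       # → 0    (value that lived at global (0, 0, 0))
print(u_new[0][0][0])       # → -111 (comes from the ghost corner (-1,-1,-1))
```

The point of the BOX stencil with width at least 1 is that the corner neighbor (k-1, j-1, i-1) is present in the local array even on subdomain boundaries; a STAR stencil would not carry the diagonal ghost corners.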
modes have to be scaled appropriately). Who owns
> the diagonal scaling? If it is Mat then we might want MatSetNearNullSpace
> to do this, otherwise we should think of a good way to deal with this. It
> is very error prone to not do the right thing here, we should at least
> throw an error.
KSP tutorials - KSP Laplacian, 2d (ex2.c) for mesh dim of
n = 15
m = 100
Thanks
Bibrak
use ML in an optimal manner for vector problems.
Regards,
Nicolas
> Like flowing water, the spirit stays free
> http://www.loyno.edu/~li/home
> New Orleans, Louisiana (504)865-2051(fax)
> in order to use ML in an optimal manner for vector problems.
>>
>> The block size is used if you don't provide any other information. The
>> preferred approach with petsc-dev is to use MatSetNearNullSpace(). (I'll
>> make sure this is currently working with ML and reply to this message.)
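To make the "provide other information" alternative concrete: for a vector problem such as 2D elasticity, the near-null space handed to MatSetNearNullSpace() is typically the rigid body modes. This is a pure-Python sketch of what those vectors look like (node coordinates and interleaved dof layout are assumptions for illustration, not from the original code):

```python
# Sketch of the near-null-space vectors an AMG like ML wants for 2D
# elasticity: two translations and one in-plane rotation, stored as one
# (dx, dy) pair per node, interleaved. (Illustration only; in PETSc these
# would be packed into Vecs and passed via MatSetNearNullSpace.)

coords = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # hypothetical nodes

trans_x, trans_y, rot = [], [], []
for (x, y) in coords:
    trans_x += [1.0, 0.0]   # rigid translation in x
    trans_y += [0.0, 1.0]   # rigid translation in y
    rot     += [-y, x]      # infinitesimal rigid rotation about the origin

modes = [trans_x, trans_y, rot]
print(len(modes), len(modes[0]))   # → 3 8  (3 modes, 2 dofs x 4 nodes)
```

With only a block size to go on, the solver can guess the translations but not the rotation, which is why supplying the modes explicitly tends to matter for vector problems.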
On Mon, 2 Apr 2012, Matthew Knepley wrote:
> On Mon, Apr 2, 2012 at 10:11 AM, Xuefeng Li wrote:
>
>> Hi, all.
>>
>> I am using multigrid with DMMG (5 levels) in my petsc program.
>> Turning on the dmmg_grid_sequence option seems
>> to slow down the program dramatically. For instance,
>> the progr
Hi, all.
I am using multigrid with DMMG (5 levels) in my petsc program.
Turning on the dmmg_grid_sequence option seems
to slow down the program dramatically. For instance,
the program converges in 3 iterations in 6.270e+01 sec
when dmmg_grid_sequence is OFF; it also converges
in 3 iterations but i
Bibrak,
Start with http://www.mcs.anl.gov/petsc/documentation/faq.html#computers
Then http://www.mcs.anl.gov/petsc/documentation/faq.html#log-summary
and http://www.mcs.anl.gov/petsc/documentation/faq.html#slowerparallel
Barry
On Apr 2, 2012, at 7:37 AM, Bibrak Qamar wrote:
at least 1, get the global array u_new[][][] from UGlobalNew and the
local arrays u[][][] from Ulocal, then assign u_new[k][j][i] = u[k-1][j-1][i-1].