Barry,
Thanks.
Sam
On Tuesday, March 16, 2021, Barry Smith wrote:
>
> Sam,
>
> You can pass a simple C function to PetscPushErrorHandler() that prints
> the top message and then immediately aborts to get the effect you want. But
> I agree with Dave that you lose a lot of useful information by producing
> such a simple error message.
Dave,
You made a very good point.
Thanks,
Sam
On Tuesday, March 16, 2021, Dave May wrote:
>
>
> On Tue, 16 Mar 2021 at 19:50, Sam Guo wrote:
>
>> Dear PETSc dev team,
>> When there is a PETSc error, I got the following overly verbose error
>> message. Is it possible to get a simple error message like "Initial vector
>> is zero or belongs to the deflection space"?
Sam,
You can pass a simple C function to PetscPushErrorHandler() that prints the
top message and then immediately aborts to get the effect you want. But I agree
with Dave that you lose a lot of useful information by producing such a simple error
message.
Barry
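For reference, a minimal sketch of the kind of handler Barry describes (the
handler signature below matches recent PETSc releases; check petscsys.h for
your version, and the function name is illustrative):

  #include <petscsys.h>
  #include <stdio.h>

  /* Minimal sketch: print only the error message, then abort at once.
     This loses the stack trace, as discussed above. */
  static PetscErrorCode SimpleErrorHandler(MPI_Comm comm, int line, const char *fun,
                                           const char *file, PetscErrorCode n,
                                           PetscErrorType p, const char *mess, void *ctx)
  {
    (void)line; (void)fun; (void)file; (void)p; (void)ctx;
    fprintf(stderr, "PETSc error: %s\n", mess ? mess : "(no message)");
    MPI_Abort(comm, (int)n);  /* abort immediately */
    return n;                 /* not reached */
  }

  /* Install once, after PetscInitialize():
       PetscPushErrorHandler(SimpleErrorHandler, NULL);  */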
> On Mar 16, 2021, at 5:43 PM,
On Tue, 16 Mar 2021 at 19:50, Sam Guo wrote:
> Dear PETSc dev team,
> When there is a PETSc error, I got the following overly verbose error
> message. Is it possible to get a simple error message like "Initial vector
> is zero or belongs to the deflection space"?
>
>
When an error occurs and the e
Dear PETSc dev team,
When there is a PETSc error, I got the following overly verbose error
message. Is it possible to get a simple error message like "Initial vector
is zero or belongs to the deflection space"?
Thanks,
Sam
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
I am employing PETSc for DD in an existing serial Fortran code. The program
ran for a few seconds and showed "segmentation fault (core dumped)". Would
anyone suggest how to fix this?
Error message from GDB:
Program received signal SIGSEGV, Segmentation fault.
0x764f7cc0 in PetscCheckPointer (ptr=0x1
Evan Um writes:
> Dear Jed,
>
> In my problem, each process has a part of contiguous rows of matrix B. In
> this partitioned matrix, locations of non-zero elements are irregular
> because it is part of an unstructured matrix. It is hard to define the
> distribution of a vector that the partitioned matrix B is multiplied with.
Dear Jed,
In my problem, each process has a part of contiguous rows of matrix B. In
this partitioned matrix, locations of non-zero elements are irregular
because it is part of an unstructured matrix. It is hard to define the
distribution of a vector that the partitioned matrix B is multiplied with.
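One way to avoid working out the vector distribution by hand is to let the
matrix supply compatible vectors; a minimal sketch, assuming B is an assembled
MPIAIJ matrix (the call is MatCreateVecs in current PETSc, MatGetVecs in older
releases):

  Vec x, y;
  MatCreateVecs(B, &x, &y);  /* x matches B's column layout, y its row layout */
  MatMult(B, x, y);          /* y = B*x with consistent parallel layouts */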
Evan Um writes:
> Dear Jed,
>
> Thanks for all your help. These numbers mean the total number of
> elements to be added using MatSetValues(). For example, at process 0,
> 148767+5821 elements are added to matrix B. In other words, the length of
> the arrays (i.e. mat_b_i_partitioned, mat_b_j_partitioned and
> mat_b_val_partitioned) at process 0 is 148767+5821.
Dear Jed,
Thanks for all your help. These numbers mean the total number of
elements to be added using MatSetValues(). For example, at process 0,
148767+5821 elements are added to matrix B. In other words, the length of
the arrays (i.e. mat_b_i_partitioned, mat_b_j_partitioned and
mat_b_val_partitioned) at process 0 is 148767+5821.
Evan Um writes:
> Dear PETSc users,
>
> I hope I can get a comment on the errors I got during sparse symmetric
> matrix construction. In this example, I used three processes. The size of the
> test matrix is 52105-by-52105. The length of arrays d_nnz and o_nnz is 17461
> at rank 0, 17111 at rank 1 and 17535 at rank 2.
Dear PETSc users,
I hope I can get a comment on the errors I got during sparse symmetric
matrix construction. In this example, I used three processes. The size of the
test matrix is 52105-by-52105. The length of arrays d_nnz and o_nnz is 17461
at rank 0, 17111 at rank 1 and 17535 at rank 2. The
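A sketch of the setup described above, assuming an MPIAIJ matrix with per-row
preallocation; note that PETSc requires the local row counts to sum to the
global row count, and the three lengths quoted (17461 + 17111 + 17535 = 52107)
do not sum to 52105, which may be related to the error:

  Mat             B;
  PetscInt        mlocal;        /* local rows: 17461, 17111 or 17535 by rank */
  const PetscInt *d_nnz, *o_nnz; /* per-local-row counts, each of length mlocal */
  MatCreate(PETSC_COMM_WORLD, &B);
  MatSetSizes(B, mlocal, PETSC_DECIDE, 52105, 52105);
  MatSetType(B, MATMPIAIJ);
  MatMPIAIJSetPreallocation(B, 0, d_nnz, 0, o_nnz);
  /* then MatSetValues() for the local entries, followed by
     MatAssemblyBegin/End(B, MAT_FINAL_ASSEMBLY) */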
On Wed, May 21, 2014 at 2:09 PM, Luc Berger-Vergiat wrote:
> So I just pulled an updated version of petsc-dev today (I switched from
> the *next* branch to the *master* branch due to a compilation error
> in the last commit on *next*).
> I still have the same error and I believe this is the whole error message I have.
So I just pulled an updated version of petsc-dev today (I switched from
the *next* branch to the *master* branch due to a compilation error
in the last commit on *next*).
I still have the same error and I believe this is the whole error
message I have.
I mean I am running multiple
On Tue, May 20, 2014 at 4:33 PM, Luc Berger-Vergiat wrote:
> Hi all,
> I am running an FEM simulation that uses PETSc as a linear solver.
> I am setting up ISs and passing them to a DMShell in order to use the
> FieldSplit capabilities of PETSc.
>
> When I pass the following options to Petsc:
>
> "
I saw a similar error some time back while fooling around with
fieldsplit. Can you update petsc-dev (git pull), rebuild and try again?
T
On 05/20/2014 04:33 PM, Luc Berger-Vergiat wrote:
Hi all,
I am running an FEM simulation that uses PETSc as a linear solver.
I am setting up ISs and passing them
Hi all,
I am running an FEM simulation that uses PETSc as a linear solver.
I am setting up ISs and passing them to a DMShell in order to use the
FieldSplit capabilities of PETSc.
When I pass the following options to PETSc:
" -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur
-pc_
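For readers of the archive, a minimal sketch of attaching ISs to a fieldsplit
PC directly with PCFieldSplitSetIS(), an alternative to routing them through a
DMShell (the split names "0"/"1" and the index sets is_u, is_p are
hypothetical):

  KSP ksp;
  PC  pc;
  IS  is_u, is_p;  /* hypothetical: create with e.g. ISCreateGeneral() first */
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);
  PCFieldSplitSetIS(pc, "0", is_u);  /* first split  */
  PCFieldSplitSetIS(pc, "1", is_p);  /* second split */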
Hi Jed,
The problem is being caused by an out-of-memory error.
So I am going to stick to the smaller problems.
Thanks a lot.
-
Garnet Vaz
On Tue, Jul 9, 2013 at 5:25 PM, Jed Brown wrote:
> Garnet Vaz writes:
>
> > Hi Jed,
> >
> > Yes. It has been running fine for problems up to 3M triangles.
Garnet Vaz writes:
> Hi Jed,
>
> Yes. It has been running fine for problems up to 3M triangles.
> /var/log/messages does say that the process is killed.
>
> Changed the oom options to allow over-committing. I think it
> should work now.
Hmm, normally when over-commit is turned off, malloc will return NULL
when memory runs out, rather than the OOM killer killing the process later.
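To illustrate the distinction, a standalone sketch (not from the thread): with
over-commit disabled the failure is visible at the allocation site instead of
as a later kill.

  #include <stdio.h>
  #include <stdlib.h>

  int main(void)
  {
    size_t huge = (size_t)64 * 1024 * 1024 * 1024;  /* 64 GB, more than the box has */
    void  *p    = malloc(huge);
    if (!p) {  /* with over-commit off, malloc fails here... */
      fprintf(stderr, "malloc of %zu bytes failed\n", huge);
      return 1;
    }          /* ...with over-commit on, it may "succeed" and the OOM killer
                  strikes later, as in Garnet's /var/log/messages */
    free(p);
    return 0;
  }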
Hi Jed,
Yes. It has been running fine for problems up to 3M triangles.
/var/log/messages does say that the process is killed.
Changed the oom options to allow over-committing. I think it
should work now.
Thanks.
-
Garnet
On Tue, Jul 9, 2013 at 3:57 PM, Jed Brown wrote:
> Garnet Vaz writes:
Garnet Vaz writes:
> Hi Jed,
>
> Thanks. The output of quota reads "unlimited".
> The system memory is 16GB.
>
> Doing "ulimit -m" gives 13938212
> in kilobytes, which corresponds to about 13 GB. I think this means
> that I should be able to use most of it. I am the only person
> running jobs on this machine right now.
Hi Jed,
Thanks. The output of quota reads "unlimited".
The system memory is 16GB.
Doing "ulimit -m" gives 13938212
in kilobytes, which corresponds to about 13 GB. I think this means
that I should be able to use most of it. I am the only person
running jobs on this machine right now.
I can run with -memo
Garnet Vaz writes:
> Dear all,
>
> My PETSc code crashes with the output
>
> "
> Number of lines in file is 5349000 #<- Number of points
> Number of lines in file is 10695950 #<- Number of triangles
> reading cell list successful
> reading vertex list successful
> Mesh distribution successful
Dear all,
My PETSc code crashes with the output
"
Number of lines in file is 5349000 #<- Number of points
Number of lines in file is 10695950 #<- Number of triangles
reading cell list successful
reading vertex list successful
Mesh distribution successful
===
Hi,
In case of a SeqDense matrix the message is very useful since it prints the
maximum and actual indices:
MatSetValues_SeqDense() line 750 in
/lib/petsc-dev1/src/mat/impls/dense/seq/dense.c
if (indexn[j] >= A->cmap->n)
SETERRQ2(PETSC_COMM_SELF,PETSC_ERR_ARG_OUTOFRANGE,"Column too large: col
%D max %D",indexn[j],A->cmap->n-1);
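To see that message in action, a small sketch (the matrix size and indices are
arbitrary):

  Mat         A;
  PetscScalar v = 1.0;
  PetscInt    i = 0, j = 5;  /* column 5 is out of range for a 3x3 matrix */
  MatCreateSeqDense(PETSC_COMM_SELF, 3, 3, NULL, &A);
  MatSetValues(A, 1, &i, 1, &j, &v, INSERT_VALUES);
  /* fails with "Column too large: col 5 max 2" */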
On Jun 6, 2012, at 7:27 AM, Alexander Grayver wrote:
> Hi,
>
> In case of a SeqDense matrix the message is very useful since it prints the
> maximum and actual indices:
>
> MatSetValues_SeqDense() line 750 in
> /lib/petsc-dev1/src/mat/impls/dense/seq/dense.c
> if (indexn[j] >= A->cmap->n)
> SETERRQ2(PETSC_COMM_SELF,PETSC_ERR_ARG_OUTOFRANGE,"Column too large: col %D max %D",indexn[j],A->cmap->n-1);
The error message says it all. You must have calls to VecSetValues() before
the VecSet() but no VecAssemblyBegin/End(). After your VecSetValues() calls
you always need to have VecAssemblyBegin/End().
Barry
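A minimal sketch of the ordering Barry describes (the vector, indices and
values are placeholders):

  Vec         x;
  PetscInt    ix[2]   = {0, 1};
  PetscScalar vals[2] = {1.0, 2.0};
  VecCreateSeq(PETSC_COMM_SELF, 4, &x);
  VecSetValues(x, 2, ix, vals, INSERT_VALUES);
  VecAssemblyBegin(x);   /* required after VecSetValues() ... */
  VecAssemblyEnd(x);     /* ... before any other use of x */
  VecSet(x, 0.0);        /* legal now; before assembly this raises
                            "Object is in wrong state", as below */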
On Sep 7, 2010, at 5:56 PM, NAN ZHAO wrote:
> Dear all,
>
> I got a strange p
Dear all,
I got a strange PETSc error when running my code with PETSc as a linear
solver:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state!
[0]PETSC ERROR: You cannot call this after you have called VecSetValues()