I notice that you are using cmake from /opt/local/bin, but spack would have
installed its own cmake. Could you try using cmake from spack to compile and
then run the example?
best
praveen
> On 29-Sep-2020, at 7:35 PM, 'Alexander Greiner' via deal.II User Group
> wrote:
>
> Hi Luca,
>
> I ra
Gabriel,
1. Regarding the initialization of PETSc::MPI::BlockSparseMatrix:
I have used the IndexSet::split_by_block() function and this indeed works
well. Thanks for the suggestion!
Unfortunately, I have encountered another issue. The
PETSc::MPI::BlockSparseMatrix must be partitioned
Hello everyone!
This is deal.II newsletter #136.
It automatically reports recently merged features and discussions about the
deal.II finite element library.
## Below you find a list of recently proposed or merged features:
#10981: Deprecate QTrapez and rename it to QTrapezoid (proposed by bang
Dear fellow community members,
Hi! I want to
simulate some cases based on the elastoplastic material model.
They also involve loading and then unloading of the material. I tried
step-42 but apparently it is for
Thanks a lot Prof. Wolfgang. Your guidance was very fruitful and right on
point. Solved my problem! :)
On Thursday, August 6, 2020 at 1:53:48 AM UTC+2 Wolfgang Bangerth wrote:
>
> > Thanks for the guidance. I tried
> > replacing the " source/particles/partic
Dear Alex,
after loading deal.II with `spack load dealii`, what happens if you try to
build `step-40` from scratch?
Can you send us the output of cmake and make?
Luca.
On Mon, Sep 28, 2020 at 4:46 PM 'Alexander Greiner' via deal.II User Group <
dealii@googlegroups.com> wrote:
> Hi Luca,
>
>
Thank you for your input!
1. Regarding the initialization of PETSc::MPI::BlockSparseMatrix:
I have used the IndexSet::split_by_block() function and this indeed
works well. Thanks for the suggestion!
Unfortunately, I have encountered another issue. The
PETSc::MPI::BlockSparseMatrix must be
Hi Bruno and Jean,
Thanks a lot for your help with the eigenvalue problem. I am able to solve
the standard eigenvalue problem, and have tested my code on simple
problems. The results match (I compared them with MATLAB's
eigenvalue analysis).
Thanks again!
Animesh
On Tuesday, September
Dear Timo, Dear Wolfgang,
Thank you both for your input!
1. Regarding the initialization of PETSc::MPI::BlockSparseMatrix:
I have used the IndexSet::split_by_block() function and this indeed
works well. Thanks for the suggestion!
Unfortunately, I have encountered another issue. The
PETSc: