I was able to reproduce this behaviour with the following code (attached,
together with the CMakeLists file). The code hangs after
printing 'Scaled variable 0'.
Let me mention that I have used a different algorithm to obtain locally
relevant dofs, rather than directly using the function
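For context, the function presumably meant here is
DoFTools::extract_locally_relevant_dofs(); a minimal sketch of the standard
call (the helper and variable names are illustrative, not taken from the
poster's code):

    #include <deal.II/base/index_set.h>
    #include <deal.II/dofs/dof_handler.h>
    #include <deal.II/dofs/dof_tools.h>

    using namespace dealii;

    // Standard way to obtain the locally relevant dofs on each rank;
    // the poster used a custom algorithm instead of this call.
    template <int dim>
    IndexSet get_locally_relevant_dofs(const DoFHandler<dim> &dof_handler)
    {
      IndexSet locally_relevant_dofs;
      DoFTools::extract_locally_relevant_dofs(dof_handler,
                                              locally_relevant_dofs);
      return locally_relevant_dofs;
    }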
Thank you for the reply, Dr. Arndt. Let me give some more detail about the
output with and without compress.
The code snippet in question is the following. For ease of
understanding, 'g' stands for global, 'crk' for current RK (for RK-type
time stepping), 'avar' for auxiliary variables (three
Hi all,
I am having a little problem with projecting a function onto (parts of) FE
spaces. I am getting the error
The violated condition was:
(dynamic_cast<const parallel::Triangulation<dim, spacedim> *>(
&(dof.get_triangulation())) == nullptr)
Additional information:
You are trying to use functionality in deal.II that is currently not implemented.
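For reference, a minimal sketch of the kind of call that can trigger this
assertion; the names are illustrative, and it assumes the entry point is
VectorTools::project with a real-valued PETSc build, since some of project's
code paths are not implemented for parallel triangulations:

    #include <deal.II/base/function.h>
    #include <deal.II/base/quadrature_lib.h>
    #include <deal.II/dofs/dof_handler.h>
    #include <deal.II/lac/affine_constraints.h>
    #include <deal.II/lac/petsc_vector.h>
    #include <deal.II/numerics/vector_tools.h>

    using namespace dealii;

    // Calling this overload with a DoFHandler attached to a
    // parallel::distributed::Triangulation can hit the
    // ExcNotImplemented assertion quoted above.
    template <int dim>
    void project_function(const DoFHandler<dim>           &dof_handler,
                          const AffineConstraints<double> &constraints,
                          const Function<dim>             &f,
                          PETScWrappers::MPI::Vector      &result)
    {
      VectorTools::project(dof_handler,
                           constraints,
                           QGauss<dim>(dof_handler.get_fe().degree + 1),
                           f,
                           result);
    }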
Hello everyone!
This is deal.II newsletter #102.
It automatically reports recently merged features and discussions about the
deal.II finite element library.
## Below you find a list of recently proposed or merged features:
#9088: Suppress warning in python-bindings (proposed by masterleinad)
Hi Andrew! I would not say it is giving wrong results, because it fulfills
the objective for which it was written. But for my
application, I need the output of plastic strains, plastic flow and
hardening coefficient values along with the isotropic plastic hardening.
As well as stori
Vachan,
No, there should be no need to call PETScWrappers::VectorBase::compress()
after PETScWrappers::VectorBase::scale() (otherwise that would clearly be a
bug in deal.II) since
scale() calls VecPointwiseMult which is already collective. Does adding
compress() in fact change anything for you?
Oth
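To illustrate the point being made (the variable names are invented for
this sketch): scale() performs a pointwise multiplication through PETSc's
VecPointwiseMult, which is collective, so a subsequent compress() should be
redundant:

    #include <deal.II/lac/petsc_vector.h>

    using namespace dealii;

    // Pointwise scaling of a distributed PETSc vector. Every MPI rank
    // must enter this call, since VecPointwiseMult is collective.
    void pointwise_scale(PETScWrappers::MPI::Vector       &v,
                         const PETScWrappers::MPI::Vector &factors)
    {
      v.scale(factors);
      // No v.compress(...) should be needed here: scale() neither adds
      // nor inserts individual elements, it only multiplies existing
      // local entries.
    }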
Great idea, Prof. Bangerth. Thanks!
On Thursday, November 21, 2019 at 7:02:48 PM UTC+1, Wolfgang Bangerth wrote:
>
> On 11/21/19 10:41 AM, Muhammad Mashhood wrote:
> > Hi! I am trying to set up a quasi-static
> > thermoelastoplastic code using the step-26 (thermal analy
Hello,
I am facing a weird problem. At a point in the code, I
call PETScWrappers::VectorBase::scale() on a few distributed vectors.
Subsequently, I assign these vectors to their ghosted versions
for parallel communication. When I launch the code with 2 or 4
processes, it works
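A minimal sketch of the pattern described (the vector names are invented
for illustration): a non-ghosted vector is scaled, then assigned to its
ghosted counterpart, which is the step that performs the parallel
communication:

    #include <deal.II/lac/petsc_vector.h>

    using namespace dealii;

    // 'owned' holds only locally owned entries; 'ghosted' was created
    // with ghost indices. The assignment updates the ghost values and
    // involves communication, so all ranks must reach it together.
    void scale_and_refresh_ghosts(PETScWrappers::MPI::Vector       &owned,
                                  const PETScWrappers::MPI::Vector &factors,
                                  PETScWrappers::MPI::Vector       &ghosted)
    {
      owned.scale(factors);  // collective VecPointwiseMult
      ghosted = owned;       // copies values and updates ghost entries
    }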