On Thu, Aug 27, 2015 at 9:53 AM, Michael Povolotskyi wrote:

Another thought.
In your error handler you call MPI_Abort.
I think this is causing a problem for my application.
In my application I use both the real and complex versions of PETSc, so I
initialize MPI myself, then initialize PETSc, then initialize libmesh.
Can I still have libmesh throw exceptions instead of calling MPI_Abort?
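
For reference, the ordering I mean is roughly the following (simplified to a
single PETSc build, error checking omitted):

#include <mpi.h>
#include <petscsys.h>
#include "libmesh/libmesh.h"

int main (int argc, char ** argv)
{
  // MPI first, then PETSc, then libmesh; tear down in reverse order.
  MPI_Init(&argc, &argv);
  PetscInitialize(&argc, &argv, NULL, NULL);

  {
    // Scoped so the LibMeshInit destructor runs before PetscFinalize()
    // and MPI_Finalize().  Since MPI and PETSc are already initialized,
    // libmesh should leave their finalization to us.
    libMesh::LibMeshInit init(argc, argv, MPI_COMM_WORLD);

    // ... application code ...
  }

  PetscFinalize();
  MPI_Finalize();
  return 0;
}
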
>
> On Wed, 26 Aug 2015, namu patel wrote:
>
> If you're using a SerialMesh, and updating based on some
> already-serialized information, then it's probably most efficient to
> iterate over all nodes.
>
Thanks, this seems to be the way to go. Everything works out nicely when
iterating over all nodes.
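
For the archives, the loop boils down to something like this (a sketch; the
actual per-node update is omitted):

#include "libmesh/mesh_base.h"
#include "libmesh/node.h"

using namespace libMesh;

void update_all_nodes (MeshBase & mesh)
{
  // With a SerialMesh every processor sees every node, so a plain loop
  // over the node iterators visits them all.
  MeshBase::node_iterator       it  = mesh.nodes_begin();
  const MeshBase::node_iterator end = mesh.nodes_end();

  for (; it != end; ++it)
    {
      Node & node = **it;

      // Apply the already-serialized update here, e.g. adjust the
      // node's coordinates via node(0), node(1), node(2).
      (void) node;
    }
}
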
On Thu, Aug 27, 2015 at 6:38 AM, wrote:
>
> When using the FemSystem class, what is the best way to implement
> Neumann and Robin boundary conditions?
For "standard" boundary Neumann/Robin terms, override side_time_derivative
in your FEMSystem subclass. That function is only called for elements
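
Roughly along these lines, for a Robin condition du/dn + alpha*u = g (untested
sketch; the variable name, boundary id, and coefficients are placeholders):

#include "libmesh/fem_system.h"
#include "libmesh/fem_context.h"
#include "libmesh/fe_base.h"
#include "libmesh/dense_subvector.h"
#include "libmesh/dense_submatrix.h"
#include "libmesh/equation_systems.h"

using namespace libMesh;

class RobinSystem : public FEMSystem
{
public:
  RobinSystem (EquationSystems & es, const std::string & name, const unsigned int number)
    : FEMSystem(es, name, number), u_var(0) {}

protected:
  unsigned int u_var;

  virtual void init_data ()
  {
    u_var = this->add_variable("u", FIRST);
    FEMSystem::init_data();
  }

  virtual bool side_time_derivative (bool request_jacobian, DiffContext & context)
  {
    FEMContext & c = cast_ref<FEMContext &>(context);

    // Only contribute on the boundary id we care about (made up here).
    if (!c.has_side_boundary_id(1))
      return request_jacobian;

    FEBase * side_fe = NULL;
    c.get_side_fe(u_var, side_fe);

    const std::vector<Real> & JxW = side_fe->get_JxW();
    const std::vector<std::vector<Real> > & phi = side_fe->get_phi();

    const unsigned int n_dofs = c.get_dof_indices(u_var).size();

    DenseSubVector<Number> & F = c.get_elem_residual(u_var);
    DenseSubMatrix<Number> & K = c.get_elem_jacobian(u_var, u_var);

    const Real alpha = 1.0, g = 0.0; // illustrative coefficients

    for (unsigned int qp = 0; qp != JxW.size(); ++qp)
      {
        const Number u = c.side_value(u_var, qp);

        for (unsigned int i = 0; i != n_dofs; ++i)
          {
            // Residual contribution of the surface term (alpha*u - g).
            F(i) += JxW[qp] * (alpha*u - g) * phi[i][qp];

            if (request_jacobian)
              for (unsigned int j = 0; j != n_dofs; ++j)
                K(i,j) += JxW[qp] * alpha * phi[i][qp] * phi[j][qp];
          }
      }

    return request_jacobian;
  }
};
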
Hi all,
When using the FemSystem class, what is the best way to implement
Neumann and Robin boundary conditions? Is this still by adding penalty
terms to the matrix and RHS?
If yes, does this collide with using the DirichletBoundary class (I
think it shouldn't)? Or should all BCs be enforced with penalty terms?