Re: [deal.II] Nonhomogeneous Dirichlet Boundary conditions using a Dirichlet lift

2017-02-09 Thread Giulia Deolmi
Hi Praveen, as far as I have understood (but I might be wrong), the functions VectorTools::interpolate_boundary_values and MatrixTools::apply_boundary_values …
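A minimal sketch of the idiom these two functions implement together, assuming a dim-dimensional problem, boundary id 0, and a hypothetical Function<dim> subclass BoundaryValues describing the boundary data g(x) (none of these names are from the thread):

    // Interpolate g(x) onto the boundary degrees of freedom ...
    std::map<types::global_dof_index, double> boundary_values;
    VectorTools::interpolate_boundary_values(dof_handler,
                                             0,                     // boundary id (assumed)
                                             BoundaryValues<dim>(), // hypothetical g(x)
                                             boundary_values);
    // ... then modify the already-assembled linear system so that those
    // degrees of freedom are fixed to the interpolated values:
    MatrixTools::apply_boundary_values(boundary_values,
                                       system_matrix,
                                       solution,
                                       system_rhs);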

Re: [deal.II] Assemble Righthand Side for vector-valued problem

2017-02-09 Thread Jaekwang Kim
Thank you for your advice. I will make the problem simpler.
> Do you get the right solution if you have a constant viscosity? If you
> iterate, do you get the solution after one iteration?
Yes, I checked this. I get the solution after one iteration when the viscosity is constant …
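The check being proposed: if the viscosity does not depend on the solution, the problem is linear, so a fixed-point iteration must converge in a single step. A sketch under assumed names (assemble_system, solve, max_iterations, and tolerance are hypothetical, not taken from the thread):

    // Hypothetical Picard (fixed-point) loop: the viscosity inside
    // assemble_system is evaluated at the previous iterate. With a
    // constant viscosity the first solve is already the exact solution.
    for (unsigned int iter = 0; iter < max_iterations; ++iter)
      {
        assemble_system(old_solution); // viscosity frozen at old iterate
        solve();                       // one linear solve
        Vector<double> diff = solution;
        diff -= old_solution;
        old_solution = solution;
        if (diff.l2_norm() < tolerance)
          break;
      }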

Re: [deal.II] Assemble Righthand Side for vector-valued problem

2017-02-09 Thread Jaekwang Kim
Dr. Bangerth, thank you, I just fixed what was wrong... as in most cases, it was an easy problem. The manufactured solution for the velocities did not satisfy the continuity equation (i.e., the manufactured solution did not satisfy div u = 0)... Jaekwang Kim

Re: [deal.II] Assemble Righthand Side for vector-valued problem

2017-02-09 Thread Wolfgang Bangerth
On 02/09/2017 04:45 AM, Jaekwang Kim wrote:
> The manufactured solution for the velocities did not satisfy the
> continuity equation (i.e., the manufactured solution did not satisfy
> div u = 0)...
Yes, that would do it :-) Best, W.
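For reference, a standard way to avoid this pitfall is to derive the manufactured velocity from a stream function, which makes it divergence-free by construction. A minimal 2d sketch; the particular stream function psi(x,y) = sin(x) sin(y) is an illustrative assumption:

    // u = d(psi)/dy, v = -d(psi)/dx, so that
    // div u = cos(x)cos(y) - cos(x)cos(y) = 0 identically.
    #include <cmath>

    void exact_velocity(const double x, const double y,
                        double &u, double &v)
    {
      u =  std::sin(x) * std::cos(y);
      v = -std::cos(x) * std::sin(y);
    }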

Re: [deal.II] Nonhomogeneous Dirichlet Boundary conditions using a Dirichlet lift

2017-02-09 Thread Wolfgang Bangerth
> as far as I have understood (but I might be wrong), the functions
> VectorTools::interpolate_boundary_values and
> MatrixTools::apply_boundary_values …

Re: [deal.II] Nonhomogeneous Dirichlet Boundary conditions using a Dirichlet lift

2017-02-09 Thread Giulia Deolmi
Thanks a lot! I will have a look at it. Kind regards, Giulia

On Thursday, February 9, 2017 at 15:19:07 UTC+1, Wolfgang Bangerth wrote:
> > as far as I have understood (but I might be wrong), the functions
> > VectorTools::interpolate_boundary_values
> > <https://www.dealii.org/8.4. …

[deal.II] Renumbering dofs with petsc + block + MPI + Direct solver work around

2017-02-09 Thread Spencer Patty
A problem I am working on results in a nonsymmetric 4x4 block matrix system, with the first block representing a vector-valued velocity and the remaining 3 blocks scalar quantities that are all coupled. The FE system is represented as FESystem(FESystem(FE_Q(parameters.degree_of_fe_veloci …
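A sketch of how such a 4-block element might be declared; since the message is truncated, the polynomial degrees below are placeholders rather than the poster's actual parameters:

    // Vector-valued velocity (dim components) plus three coupled scalars.
    FESystem<dim> fe(FESystem<dim>(FE_Q<dim>(2), dim), 1, // velocity block
                     FE_Q<dim>(1), 1,                     // scalar block 1
                     FE_Q<dim>(1), 1,                     // scalar block 2
                     FE_Q<dim>(1), 1);                    // scalar block 3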

[deal.II] Re: Renumbering dofs with petsc + block + MPI + Direct solver work around

2017-02-09 Thread Bruno Turcksin
Hi, this is not the answer to your question, but if I understand correctly, everything works fine with Trilinos and the only reason why you need PETSc is to use MUMPS. If that's the case, instead of using Amesos_KLU with Trilinos, you can use SuperLU_dist (http://crd-legacy.lbl.gov/~xiaoye/Supe …
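On the deal.II side, switching direct solvers through Amesos is presumably just a matter of the solver name passed to TrilinosWrappers::SolverDirect; a sketch, assuming Trilinos was configured with SuperLU_dist support:

    SolverControl solver_control;
    TrilinosWrappers::SolverDirect::AdditionalData data(
      /*output_solver_details=*/false,
      /*solver_type=*/"Amesos_Superludist"); // instead of the default "Amesos_Klu"
    TrilinosWrappers::SolverDirect solver(solver_control, data);
    solver.solve(system_matrix, solution, system_rhs);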

[deal.II] Re: Renumbering dofs with petsc + block + MPI + Direct solver work around

2017-02-09 Thread Spencer Patty
Interesting, I wondered if SuperLU_dist might be parallel, but I hadn't looked into it yet. If it does work, then that certainly makes things much simpler, since I have Trilinos integrated well. I will look into installing it and see if it will work. I see what you mean by it not being the easi…

Re: [deal.II] Re: Renumbering dofs with petsc + block + MPI + Direct solver work around

2017-02-09 Thread Bruno Turcksin
2017-02-09 14:33 GMT-05:00 Spencer Patty:
> Interesting, I wondered if SuperLU_dist might be parallel but I hadn't
> looked into it yet. If it does work, then that certainly makes things much
> simpler since I have trilinos integrated well. I will look into installing
> it and see if it will wo…

Re: [deal.II] Fully distributed triangulation (level 0)

2017-02-09 Thread Timo Heister
see https://github.com/dealii/dealii/pull/3956 for the …

Re: [deal.II] Re: Renumbering dofs with petsc + block + MPI + Direct solver work around

2017-02-09 Thread Daniel Jodlbauer
Actually MUMPS is included in the Amesos solvers used by TrilinosWrappers::SolverDirect("Amesos_Mumps"). You may have to recompile Trilinos with the corresponding flags to enable it (and probably deal.II as well).

On Thursday, February 9, 2017 at 20:52:16 UTC+1, Bruno Turcksin wrote:
> 2017-02- …
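Note that the solver name actually goes through AdditionalData rather than the SolverDirect constructor; a sketch of the usage the above refers to, assuming the system matrix and vectors already exist:

    SolverControl control;
    TrilinosWrappers::SolverDirect::AdditionalData data(false, "Amesos_Mumps");
    TrilinosWrappers::SolverDirect direct(control, data);
    direct.solve(system_matrix, solution, system_rhs);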