> First, I'm wondering if the DoF mapping in deal.II is flexible enough to
> handle different variables, or different numbers of variables, in
> different elements.  For example, if one region has elastic deformation
> and the other has Navier-Stokes flow, then the latter will need a field
> for pressure while the former won't.
>
> Is this currently possible in deal.II?

I think I know how this should be implemented but it isn't currently. My 
general idea on how this can be achieved is through the hp framework. 
There, you would, for example, use the following element in the elasticity 
part of your domain:
   Q_p^dim \times {0}^{dim+1}
and for the Navier-Stokes part
   {0}^dim \times Q_{p+1}^dim \times Q_p

The element I use for the respectively unused components (what I indicated 
by {0}) is what I have mentally always called the FENothing: it has no 
basis functions, because we don't need any to describe the zero function, 
but it has a single component. The two different finite elements above can 
be composed using the FESystem class, and one would then use an 
hp::DoFHandler that allows one to use different elements in different 
parts of the domain.
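Schematically, the setup might look like the following (a sketch only: it 
assumes a hypothetical FE_Nothing class with the properties described 
above, which does not exist yet, and placeholder values for dim and p):

```cpp
// Sketch -- requires deal.II and a not-yet-existing FE_Nothing element
// (zero basis functions, one vector component). 'dim' and 'p' are
// placeholders.
#include <deal.II/fe/fe_q.h>
#include <deal.II/fe/fe_system.h>
#include <deal.II/hp/fe_collection.h>

const unsigned int p = 1;

// Elasticity region: Q_p^dim displacements, zero for the dim+1
// unused flow variables:
FESystem<dim> elasticity_fe (FE_Q<dim>(p),      dim,
                             FE_Nothing<dim>(), dim+1);

// Navier-Stokes region: zero for the dim displacement components,
// Q_{p+1}^dim velocities, and a Q_p pressure:
FESystem<dim> stokes_fe (FE_Nothing<dim>(), dim,
                         FE_Q<dim>(p+1),    dim,
                         FE_Q<dim>(p),      1);

// Collect both elements; an hp::DoFHandler would then assign one of
// the two to each cell via its active FE index.
hp::FECollection<dim> fe_collection;
fe_collection.push_back (elasticity_fe);
fe_collection.push_back (stokes_fe);
```

The point is that both composed elements have the same number of vector 
components (2*dim+1), which is what lets the hp machinery treat them 
uniformly.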

FENothing is currently not implemented. I believe it shouldn't be too 
complicated to do -- there are no basis functions, no hanging node 
constraints, etc. -- but I would anticipate that things can go wrong in 
various places of the library where *something* is expected.


> Second, what is the advantage to using deal.II "native" solvers vs.
> PETSc solvers?  One disadvantage I've noticed is that there doesn't seem
> to be an option to print the residual during solution, equivalent to
> PETSc's -ksp_monitor command-line option -- or am I missing something?

You can pass options to the SolverControl object to print the history of 
residual reduction, etc. This information is written to the deallog output 
stream, so you'll have to enable that (it is enabled by default, but all 
tutorial programs disable it as their first action).
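Concretely, something like this should get you per-iteration residuals, 
roughly analogous to PETSc's -ksp_monitor (a sketch inside a deal.II 
program; the numerical values are arbitrary):

```cpp
// Sketch -- requires deal.II. Re-enable console logging and ask
// SolverControl to log the residual history.
#include <deal.II/base/logstream.h>
#include <deal.II/lac/solver_control.h>
#include <deal.II/lac/solver_cg.h>

deallog.depth_console (2);            // tutorial programs set this to 0

SolverControl control (1000, 1e-12);  // max iterations, tolerance
control.log_history (true);           // residual at every iteration
control.log_result  (true);           // final convergence summary

SolverCG<> solver (control);
// solver.solve (system_matrix, solution, rhs, preconditioner);
```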


> Third, the Debian PETSc package includes wrappers for UMFPack/AMD,
> SuperLU and SPOOLES solvers and hypre preconditioners.  Are there any
> plans to wrap the relevant PETSc interfaces to take advantage of those
> PETSc features?

It shouldn't actually be very hard to do so -- we already have interfaces 
to UMFPACK/AMD, and I've tried SuperLU in the past (it worked very 
similarly to UMFPACK). If hypre has a sufficiently easy-to-use interface, 
it shouldn't be too hard to write a connection to it as well. Note that we 
already have an interface to the Trilinos equivalent of hypre, namely ML, 
in the current svn repository.
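For reference, the existing UMFPACK interface is used roughly like this (a 
sketch; system_matrix, solution, and rhs stand for your assembled objects):

```cpp
// Sketch -- requires deal.II built with UMFPACK support.
#include <deal.II/lac/sparse_direct.h>

SparseDirectUMFPACK direct_solver;
direct_solver.initialize (system_matrix);  // compute the factorization
direct_solver.vmult (solution, rhs);       // apply it, i.e. solve Ax=b
```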

As for many other things, the reason why we don't currently have these 
interfaces is that nobody has implemented them so far. We're happy to 
accept patches!


> As for the one issue, see the attached tiny patch against 6.1.0 which
> fixes documentation bugs (may be obsolete for 6.2 pre).

Awesome -- Guido and I always wondered whether there would be a way to use 
latex packages when running doxygen. It's great to see that you've found 
what we've wanted!


> The Debian 
> package also includes patches for:
>       * Using the METIS compatibility layer of Scotch (I think linking
>         with METIS itself may violate the QPL, unless there's an
>         exception I didn't see)
>       * Eliminating rpath which is unnecessary if libs are in /usr/lib
>       * Shared library versioning
>       * Relocating from the build directory to the install directory (a
>         bit of a hack, not clean enough to be useful to upstream)
>       * Using the Debian UMFPack package properly
>
> If any of these interest you, see the .diff.gz files in
> http://lyre.mit.edu/~powell/deal.ii/ - these are in debian/patches.
> (Everything there is signed by my key which is in the Debian keyring for
> verification.)

I would be interested in seeing the deal.II-specific parts of these patches 
(i.e. METIS, rpath, versioning, umfpack). Can you send them to me in a 
private mail? In general, feel free to discuss these patches with us 
directly in the future; we'd be quite willing to help keep your and our 
versions in sync.

Best
 Wolfgang

-------------------------------------------------------------------------
Wolfgang Bangerth                email:            [EMAIL PROTECTED]
                                 www: http://www.math.tamu.edu/~bangerth/

