Hello everyone,
I got the package into a reasonably working state and set up Travis testing, so I am putting it up on GitHub.

        https://github.com/JaredCrean2/PETSc.jl

        There is still a lot more work to do, but it's a start.

        A couple questions:
When looking through the code, I noticed the MPI communicator is being passed as a 64-bit integer. mpi.h typedefs it as an int, so shouldn't it be a 32-bit integer?
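For reference, a minimal C check of this against whichever MPI the wrapper is built on (the MPICH vs. Open MPI sizes in the comment reflect typical builds; the MPI standard itself only guarantees an opaque handle):

    #include <stdio.h>
    #include <mpi.h>

    int main(void)
    {
        /* MPICH-derived implementations typedef MPI_Comm as an int (4 bytes),
           while Open MPI typedefs it as a struct pointer (8 bytes on 64-bit
           systems), so a wrapper cannot hard-code either width. */
        printf("sizeof(MPI_Comm) = %zu bytes\n", sizeof(MPI_Comm));
        return 0;
    }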

Also, is there a way to find out at runtime what datatype a PetscScalar is? It appears PetscDataTypeGetSize does not accept PetscScalar as an argument.
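One option might be a small C helper compiled against the same PETSc build the wrapper loads; a minimal sketch using the PETSC_USE_COMPLEX configure macro:

    #include <stdio.h>
    #include <petscsys.h>

    int main(int argc, char **argv)
    {
        PetscInitialize(&argc, &argv, NULL, NULL);
        /* The width and realness of PetscScalar are fixed when PETSc is
           configured (real vs. complex, single vs. double precision), so
           this helper must be built against the library the wrapper uses. */
        printf("sizeof(PetscScalar) = %zu bytes\n", sizeof(PetscScalar));
    #if defined(PETSC_USE_COMPLEX)
        printf("PetscScalar is complex\n");
    #else
        printf("PetscScalar is real\n");
    #endif
        PetscFinalize();
        return 0;
    }

The wrapper could then select the corresponding Julia element type based on what the helper reports.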

    Jared Crean


On 07/06/2015 09:02 AM, Matthew Knepley wrote:
On Mon, Jul 6, 2015 at 4:59 AM, Patrick Sanan <patrick.sa...@gmail.com> wrote:

    I had a couple of brief discussions about this at JuliaCon as
    well. I think it would be useful, but there are a couple of things
    to think about from the start of any new attempt to do this:
    1. As Jack pointed out, one issue is that the PETSc library must
    be compiled for a particular precision. This raises some questions
    - should several versions of the library be built to allow for
    flexibility?
    2. An issue with wrapping PETSc is always that the flexibility of
    using the PETSc options paradigm is reduced - how can this be
    addressed? Could/should an expert user be able to access the
    options database directly, or would that do too much violence to
    the wrapper abstraction?


I have never understood why this is an issue. Can't you just wrap our interface level and use the options just as we do? That is essentially what petsc4py does. What is limiting about this methodology? On the other hand, requiring specific types, à la FEniCS, is very limiting.
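For concreteness, the interface-level pattern being referred to looks roughly like this in C (a minimal runnable sketch with a trivial diagonal system; error checking omitted). A wrapper that exposes these same calls leaves the options database fully usable, e.g. ./app -ksp_type gmres -pc_type jacobi -ksp_monitor:

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
        Mat         A;
        Vec         b, x;
        KSP         ksp;
        PetscInt    i, n = 10, lo, hi;
        PetscScalar one = 1.0, two = 2.0;

        PetscInitialize(&argc, &argv, NULL, NULL);

        /* Assemble a trivial diagonal system so the example runs as-is. */
        MatCreate(PETSC_COMM_WORLD, &A);
        MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
        MatSetFromOptions(A);
        MatSetUp(A);
        MatGetOwnershipRange(A, &lo, &hi);
        for (i = lo; i < hi; i++)
            MatSetValues(A, 1, &i, 1, &i, &two, INSERT_VALUES);
        MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
        MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

        VecCreate(PETSC_COMM_WORLD, &b);
        VecSetSizes(b, PETSC_DECIDE, n);
        VecSetFromOptions(b);
        VecSet(b, one);
        VecDuplicate(b, &x);

        /* The solver is configured entirely from the options database. */
        KSPCreate(PETSC_COMM_WORLD, &ksp);
        KSPSetOperators(ksp, A, A);
        KSPSetFromOptions(ksp);
        KSPSolve(ksp, b, x);

        KSPDestroy(&ksp);
        MatDestroy(&A);
        VecDestroy(&b);
        VecDestroy(&x);
        PetscFinalize();
        return 0;
    }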

   Matt

    On Sat, Jul 4, 2015 at 11:00 PM, Jared Crean <jcrea...@gmail.com> wrote:

        Hello,
             I am a graduate student working on a CFD code written in
         Julia, and I am interested in using PETSc as a linear solver
         (and possibly for the non-linear solves as well) for the
         code.  I discovered the Julia wrapper file Petsc.jl in PETSc
         and have updated it to work with the current version of Julia
         and the MPI.jl package, using only MPI for communication (I
         don't think Julia's internal parallelism will scale well
         enough, at least not in the near future).

             I read the discussion on GitHub
         [https://github.com/JuliaLang/julia/issues/2645], and it looks
         like there currently is not a complete package for accessing
         PETSc from Julia.  With your permission, I would like to use
         the Petsc.jl file as the basis for developing a package.  My
         plan is to create a lower-level interface that exactly wraps
         PETSc functions, and then construct a higher-level interface,
         probably an object that is a subtype of Julia's AbstractArray,
         that allows users to store values into PETSc vectors and
         matrices.  I am less interested in integrating tightly with
         Julia's existing linear algebra capabilities than in ensuring
         good scalability.  The purpose of the high-level interface is
         simply to make it easy to populate the vector or matrix.
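         For reference, the underlying PETSc calls such a high-level
         interface would wrap when populating a vector look roughly
         like this in C (a minimal sketch, error checking omitted):

             #include <petscvec.h>

             int main(int argc, char **argv)
             {
                 Vec         v;
                 PetscInt    i, n = 10, lo, hi;
                 PetscScalar val;

                 PetscInitialize(&argc, &argv, NULL, NULL);
                 VecCreate(PETSC_COMM_WORLD, &v);
                 VecSetSizes(v, PETSC_DECIDE, n);
                 VecSetFromOptions(v);
                 /* Each process inserts only the entries it owns; the
                    assembly calls then make the vector consistent. */
                 VecGetOwnershipRange(v, &lo, &hi);
                 for (i = lo; i < hi; i++) {
                     val = (PetscScalar)i;
                     VecSetValues(v, 1, &i, &val, INSERT_VALUES);
                 }
                 VecAssemblyBegin(v);
                 VecAssemblyEnd(v);
                 VecView(v, PETSC_VIEWER_STDOUT_WORLD);
                 VecDestroy(&v);
                 PetscFinalize();
                 return 0;
             }

         A setindex! method on the high-level Julia type could then map
         onto VecSetValues, with assembly performed before the values
         are used.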

             What do you think, both about using the Petsc.jl file and
         the overall approach?

             Jared Crean





--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
