Obviously, holistic support for OpenMP is critical to the future of PETSc
:-D

On a more serious note, Matt and I have discussed the use of PETSc for
sparse multidimensional array computations for dimensions greater than 2,
also known as tensor computations. The associated paper describing previous
work with dense arrays is
http://www.eecs.berkeley.edu/Pubs/TechRpts/2012/EECS-2012-210.html.  There
was even an unsuccessful SciDAC application proposal that described how
PETSc could be used for that domain when sparsity is important.  To start,
all we'd need is sparse matrix x sparse matrix multiplication, which I hear
the multigrid folks also need.  Sparse times dense is also important.
Sparse tensor factorization would also help, but I get that there are
enough open math questions there that it might be impractical to try to
implement something in PETSc in the near future.
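
To be concrete about the first ask: the core operation is just a general sparse-sparse product (what PETSc exposes as MatMatMult). A minimal sketch of the operation itself, using SciPy as a stand-in since a runnable PETSc C example requires an installed PETSc:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Two small sparse matrices in CSR format (PETSc's default AIJ layout).
A = csr_matrix(np.array([[1.0, 0.0],
                         [0.0, 2.0]]))
B = csr_matrix(np.array([[0.0, 3.0],
                         [4.0, 0.0]]))

# Sparse x sparse multiplication; the result stays sparse.
C = A @ B
print(C.toarray())  # [[0. 3.], [8. 0.]]
```

For higher-order tensors the same kernel shows up after "matricizing" the tensor along one mode, which is why sparse matmat is the natural starting point.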

Maybe I am just biased because I spend all of my time reading
www.nextplatform.com, but I hear machine learning is becoming an important
HPC workload.  While the most hyped efforts focus on running inaccurate
(the technical term is half-precision) dense matrix multiplication as fast
as possible, I suspect that more elegant approaches will prevail.
Presumably there is something that PETSc can do to enable machine learning
algorithms.  As most of the existing approaches use silly programming
models based on MapReduce, it can't be too hard for PETSc to do better.
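
In case "inaccurate" sounds like hyperbole: IEEE half precision carries only 11 significand bits, so its machine epsilon is about 1e-3 versus 2.2e-16 for double. A quick NumPy illustration (nothing PETSc-specific here):

```python
import numpy as np

# Machine epsilon: gap between 1.0 and the next representable value.
eps16 = np.finfo(np.float16).eps   # 2**-10, about 9.8e-4
eps64 = np.finfo(np.float64).eps   # 2**-52, about 2.2e-16
print(eps16, eps64)

# Adding 1e-4 to 1.0 is a no-op in half precision...
assert np.float16(1.0) + np.float16(1e-4) == np.float16(1.0)

# ...but of course not in double.
assert 1.0 + 1e-4 != 1.0
```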

Jeff

On Fri, Jul 1, 2016 at 2:32 PM, Barry Smith <bsm...@mcs.anl.gov> wrote:

>
>    The DOE SciDAC institutes have supported PETSc linear solver
> research/code development for the past fifteen years.
>
>     This email is to solicit ideas for linear solver research/code
> development work for the next round of SciDAC institutes (which will be a 4
> year period) in PETSc. Please send me any ideas, no matter how crazy, on
> things you feel are missing, broken, or incomplete in PETSc with regard to
> linear solvers that we should propose to work on. In particular, issues
> coming from particular classes of applications would be good. Generic
> "multi physics" coupling types of things are too general (and old :-))
> while  work for extreme large scale is also out since that is covered under
> another call (ECP). But particular types of optimizations etc for existing
> or new codes could be in, just not for the very large scale.
>
>     Rough ideas and pointers to publications are all useful. There is an
> extremely short fuse so the sooner the better,
>
>     Thanks
>
>       Barry
>
>
>
>


-- 
Jeff Hammond
jeff.scie...@gmail.com
http://jeffhammond.github.io/
