Thanks for the kind words, Martin.

I think Pysparse on Py3k is unlikely. That project has been dormant for many 
years. While Daniel does have commit rights to the code, modernizing that 
codebase is really not something we have the time or expertise for.

My hope is that we will be able to improve the performance of PETSc. I have 
some ideas on this and just need to find the time to do some troubleshooting. 
Finishing the integration of PETSc and getting the conda feedstocks we depend 
on working took a *long* time and I've got a lot of other things I need to 
catch up on. This is a priority, though, as we hadn't realized how bad the 
situation was. I thought the cross-over was at 2-4 CPUs, not 10-30!
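
If anyone wants to see where the cross-over sits on their own hardware, a rough probe along the following lines is enough (save it as, say, probe.py; the grid size, time step, and number of sweeps are arbitrary, so scale them up until a run takes at least a few seconds):

    # run serially, e.g.      python probe.py --pysparse   (Python 2.7 only)
    #                         python probe.py --petsc
    # and in parallel, e.g.   mpirun -np 4 python probe.py --petsc
    # then compare the reported wall times
    from time import time
    from fipy import CellVariable, Grid2D, TransientTerm, DiffusionTerm
    from fipy.tools import parallelComm

    mesh = Grid2D(nx=400, ny=400, dx=1., dy=1.)     # arbitrary problem size
    phi = CellVariable(name="phi", mesh=mesh, value=0.)
    phi.constrain(1., mesh.facesLeft)               # fix one boundary

    eq = TransientTerm() == DiffusionTerm(coeff=1.)

    start = time()
    for _ in range(10):                             # a handful of implicit steps
        eq.solve(var=phi, dt=100.)

    if parallelComm.procID == 0:                    # report once, not per process
        print("wall time: %.2f s" % (time() - start))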

- Jon

> On Feb 6, 2020, at 11:37 AM, Martinus WERTS <martinus.we...@ens-rennes.fr> 
> wrote:
> 
> Dear FiPy developers,
> 
> Many thanks for FiPy, and congratulations on the new version.
> 
> I do not get the chance to work with FiPy as often as I would like, but 
> last Thursday, two students started a small numerical research project with 
> FiPy. The new FiPy release happened just before, which I consider a sign of 
> good fortune. 
> 
> I have had students working with FiPy before, but for the first time, these 
> students were able to install FiPy just by installing Anaconda and 
> clicking somewhere in the supplied "Navigator", instead of following some 
> obscure build recipe concocted by me. They now have 3.4 on their computers 
> and have started running some simple (cylindrical grid) diffusion/advection 
> problems. I know (through painful experience) that conda is far from 
> perfect, but it did the job this time.
> 
> Many thanks again for FiPy (and for any future work you might do, in 
> particular on fixing the issues with cylindrical grids and implementing a 1D 
> spherical grid... simple geometries are very good for getting started with 
> FiPy, and also helpful for building simple models).
> 
> Best wishes,
> 
> Martin
> 
> 
> P.S. I inadvertently did a small benchmark with FiPy 3.4 on my own Linux 
> laptop computer (I ran an old script that I used to run on FiPy 3.1.x, Python 
> 2.7, Pysparse)... now I understand what "lagging by a considerable margin" 
> means when you talk about the different solvers (SciPy, but also 
> single-threaded PETSc)! Wouldn't it be nice if Pysparse worked with Py3k? 
> Meanwhile, I'll keep a Python 2.7 + Pysparse environment around somewhere in 
> the lab... 
> 
> 
> On 29/01/2020 15:01, Guyer, Jonathan E. Dr. (Fed) via fipy wrote:
>> Some notes about this release:
>> 
>> - Because conda is a dreary pain in the behind, it appears to be necessary 
>>   to specify
>>   `conda install --channel conda-forge fipy=3.4`
>>   otherwise, FiPy 3.3 takes precedence for some reason.
>> - PETSc supports Py3k in parallel (as well as Python 2.7). 
>> - PyTrilinos is available from conda-forge for Python 2.7. 
>>   In principle, it works with Py3k as well, but we won't be 
>>   attempting to build it to find out.
>> - On Python 2.7, PETSc and PyTrilinos have comparable performance.
>>   Unfortunately, this performance lags serial PySparse by a 
>>   considerable margin. Serial SciPy lags them all.
>>   See 
>>   https://www.ctcms.nist.gov/fipy/documentation/USAGE.html#solving-in-parallel
>>   for discussion. 
>>   Until we can sort this out, we won't be (willingly) dropping support for 
>>   Python 2.
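
For anyone who wants to pin down which solver a script uses, beyond setting the FIPY_SOLVERS environment variable or passing --petsc/--trilinos/--scipy/--pysparse on the command line, an explicit solver object can also be handed to an individual solve. A rough sketch (the keyword names are from memory, so check the USAGE page linked above for the exact spellings):

    from fipy import CellVariable, Grid1D, DiffusionTerm, LinearGMRESSolver

    mesh = Grid1D(nx=100, dx=1.)
    phi = CellVariable(mesh=mesh, value=0.)
    phi.constrain(1., mesh.facesLeft)      # fixed values at both ends
    phi.constrain(0., mesh.facesRight)

    # steady-state diffusion, solved with a GMRES solver from whichever
    # suite is active (PETSc, Trilinos, SciPy, or PySparse)
    DiffusionTerm(coeff=1.).solve(var=phi,
                                  solver=LinearGMRESSolver(tolerance=1e-10,
                                                           iterations=1000))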
>> 
>> 
>>> On Jan 29, 2020, at 8:51 AM, Guyer, Jonathan E. Dr. (Fed) via fipy 
>>> <FIPY@nist.gov> wrote:
>>> 
>>> We are pleased to announce the release of FiPy 3.4.
>>> 
>>> 
>>> http://www.ctcms.nist.gov/fipy
>>> 
>>> 
>>> This release adds support for the PETSc solvers for solving in parallel.
>>> 
>>> Pulls
>>> -----
>>> 
>>> - Add support for PETSc solvers (#701)
>>> - Assorted fixes while supporting PETSc (#700)
>>>  - Fix print statements for Py3k
>>>  - Resolve Gmsh issues
>>>  - Dump only on processor 0
>>>  - Only write `timetests` on processor 0
>>>  - Fix conda-forge link
>>>  - Upload PDF
>>>  - Document `print` option of `FIPY_DISPLAY_MATRIX`
>>>  - Use legacy numpy formatting when testing individual modules
>>>  - Switch to matplotlib's built-in symlog scaling
>>>  - Clean up tests
>>> - Assorted fixes for benchmark 8 (#699)
>>>  - Stipulate `--force` option for `conda remove fipy`
>>>  - Update Miniconda installation url
>>>  - Replace `_CellVolumeAverageVariable` class with `Variable` expression
>>>  - Fix output for bad call stack
>>> - Make CircleCI build docs on Py3k (#698)
>>> - Fix link to Nick Croft's thesis (#681)
>>> - Fix NIST header footer (#680)
>>> - Use Nixpkgs version of FiPy expression (#661)
>>> - Update the Nix recipe (#658)
>>> 
>>> Fixes
>>> -----
>>> 
>>> - #692: Can't copy example scripts with the command line
>>> - #669: input() deadlock on parallel runs
>>> - #643: Automate release process
>>> 
>>> 
>>> ========================================================================
>>> 
>>> FiPy is an object oriented, partial differential equation (PDE) solver,
>>> written in Python, based on a standard finite volume (FV) approach. The
>>> framework has been developed in the Metallurgy Division and Center for
>>> Theoretical and Computational Materials Science (CTCMS), in the Material
>>> Measurement Laboratory (MML) at the National Institute of Standards and
>>> Technology (NIST).
>>> 
>>> The solution of coupled sets of PDEs is ubiquitous in the numerical
>>> simulation of science problems. Numerous PDE solvers exist, using a variety
>>> of languages and numerical approaches. Many are proprietary, expensive and
>>> difficult to customize. As a result, scientists spend considerable
>>> resources repeatedly developing limited tools for specific problems. Our
>>> approach, combining the FV method and Python, provides a tool that is
>>> extensible, powerful and freely available. A significant advantage to
>>> Python is the existing suite of tools for array calculations, sparse
>>> matrices and data rendering.
>>> 
>>> The FiPy framework includes terms for transient diffusion, convection and
>>> standard sources, enabling the solution of arbitrary combinations of
>>> coupled elliptic, hyperbolic and parabolic PDEs. Currently implemented
>>> models include phase field treatments of polycrystalline, dendritic, and
>>> electrochemical phase transformations as well as a level set treatment of
>>> the electrodeposition process.
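
For anyone new to FiPy reading along, composing those terms into an equation looks roughly like the following minimal (and entirely arbitrary) 1D transient convection-diffusion sketch; an explicit source could be added to the right-hand side in the same way:

    from fipy import (CellVariable, Grid1D, TransientTerm,
                      ExponentialConvectionTerm, DiffusionTerm)

    mesh = Grid1D(nx=100, dx=0.01)
    phi = CellVariable(name="phi", mesh=mesh, value=0.)
    phi.constrain(1., mesh.facesLeft)     # inflow boundary value

    D = 0.01                              # arbitrary diffusivity
    u = (1.,)                             # arbitrary 1D convection velocity

    # d(phi)/dt + div(u*phi) = div(D*grad(phi))
    eq = TransientTerm() + ExponentialConvectionTerm(coeff=u) == DiffusionTerm(coeff=D)

    for step in range(100):
        eq.solve(var=phi, dt=1e-3)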


_______________________________________________
fipy mailing list
fipy@nist.gov
http://www.ctcms.nist.gov/fipy
  [ NIST internal ONLY: https://email.nist.gov/mailman/listinfo/fipy ]
