Dario,

Thanks. I was able to install the PyAMG package and use it by adding --pyamg on 
the command line. The DefaultSolver on my system appears to be PyTrilinos. 
Sadly, with the --pyamg flag, the script took significantly longer to run 
… and I haven’t yet been able to get Pysparse to work, or to make the --inline 
flag go, for that matter. I’ll check on another machine later. 
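
For anyone searching the archives later, this is roughly how I’ve been checking 
which suite gets picked; a minimal sketch, assuming FiPy 3.x, where fipy.solvers 
exposes DefaultSolver and the FIPY_SOLVERS environment variable overrides the 
suite choice:

import os
os.environ["FIPY_SOLVERS"] = "pyamg"   # must be set before importing fipy

from fipy.solvers import DefaultSolver
print(DefaultSolver)   # shows which solver class the active suite made default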


Thanks for the help,
Carsten


_____________________________________
Dipl.-Phys. Carsten Langrock, Ph.D.

Senior Research Scientist
Edward L. Ginzton Laboratory, Rm. 202
Stanford University

348 Via Pueblo Mall
Stanford, CA 94305

Tel. (650) 723-0464
Fax (650) 723-2666

Ginzton Lab Shipping Address:
James and Anna Marie Spilker Engineering and Applied Sciences Building
04-040

348 Via Pueblo Mall
Stanford, CA 94305
_____________________________________





> On Jul 24, 2018, at 2:32 AM, Dario Panada <dario.pan...@gmail.com> wrote:
> 
> 
> Hi Carsten,
> 
> I'll start by saying I'm not a FiPy expert, but I have been playing around 
> with it for a few months as part of my PhD project.
> 
> 
> Regarding your second question.
> 
> 
> Performance can be improved by switching from the default (SciPy) solver to 
> one of the others. I use the PyAMG solver, which can solve a 100x100x100 3D 
> mesh with multiple sources and sinks in about a minute. I am not running a 
> particularly powerful computer, just my laptop with 8 GB of RAM (about half 
> of which is taken up by the OS, browser, and IntelliJ), so you could probably 
> get even better performance.
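> 
> Besides the command-line flag, a solver instance can also be handed straight 
> to solve() or sweep(); a rough sketch, with the solver class and keyword 
> names as in the FiPy 3.x docs and the tolerance/iterations values purely 
> illustrative:
> 
> from fipy import LinearGMRESSolver
> 
> mySolver = LinearGMRESSolver(tolerance=1e-10, iterations=1000)
> eq.solve(var=phi, dt=dt, solver=mySolver)   # eq, phi, dt from your script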
> 
> 
> Hope this is at least somewhat helpful. :)
> 
> 
> Kind Regards,
> Dario
> 
> 
> On Tue, Jul 24, 2018 at 12:44 AM Carsten Langrock <langr...@stanford.edu> wrote:
> 
>> 
>> Hi,
>> 
>> 
>> Thanks for the help with getting FiPy running under Linux! I am trying to 
>> re-create a 1D nonlinear diffusion problem for which we have C++ code that 
>> uses the implicit Thomas algorithm based on 
>> 
>> 
>> J. Weickert, B. M. ter Haar Romeny, and M. A. Viergever, “Efficient and 
>> Reliable Schemes for Nonlinear Diffusion Filtering,” IEEE Transactions on 
>> Image Processing, vol. 7, no. 3, pp. 398-410, March 1998.
>> 
>> 
>> 
>> I have been able to get results in FiPy that match this code very closely, 
>> which was a great start. Our C++ code uses a fixed number of spatial points 
>> and a fixed time step, but re-meshes space to use the size of the array most 
>> efficiently; it doubles the spatial step size whenever the concentration at 
>> a particular point reaches a set threshold. I tried implementing this in 
>> FiPy as well, but haven’t had much luck so far. I saw an old mailing-list 
>> entry from 2011 where a user was told that FiPy wasn’t meant to do 
>> remeshing. Is that still the case?
>> 
>> 
>> I’d imagine one would somehow need to update the Grid1D object with the new 
>> ‘dx’, but since the CellVariable that holds the solution was initialized 
>> with that mesh object, I am not sure that such a change would propagate in a 
>> sensible fashion. I think I know how to map the value of the CellVariable to 
>> account for the change in ‘dx’ by 
>> 
>> 
>> import numpy
>> 
>> array_size = 2000
>> # keep every other cell of the first half (500 values), pad back to 2000
>> phi.value = numpy.concatenate((phi.value[1:array_size // 2:2],
>>                                numpy.zeros(1500)))
>> 
>> 
>> for the case when the initial variable holds 2000 spatial points. Maybe 
>> there’s a more elegant way, but I think this works in principle.
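>> 
>> For concreteness, the full re-mesh step I have in mind would look roughly 
>> like the sketch below. This is only a sketch: Grid1D and CellVariable are 
>> used as in the FiPy docs, the coarsen() helper is my own invention rather 
>> than FiPy API, and I assume the equation has to be rebuilt against the new 
>> variable afterwards, since its terms were bound to the old mesh.
>> 
>> from fipy import Grid1D, CellVariable
>> import numpy
>> 
>> def coarsen(phi, old_dx, nx=2000):
>>     """Double dx at fixed nx; pad the freed part of the array with zeros."""
>>     new_mesh = Grid1D(nx=nx, dx=2.0 * old_dx)
>>     values = numpy.concatenate((phi.value[1:nx // 2:2],   # as above
>>                                 numpy.zeros(nx - nx // 4)))
>>     return CellVariable(mesh=new_mesh, value=values), new_mesh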
>> 
>> 
>> Another question would be execution speed. Right now, even when not 
>> plotting the intermediate solutions, it takes many seconds on a very 
>> powerful computer to run a simple diffusion problem. I am probably doing 
>> something really wrong. I wasn’t expecting the code to perform as well as 
>> the C++ code, but I had hoped to come within an order of magnitude. Are 
>> there ways to optimize the performance? Maybe select a particularly clever 
>> solver? If someone could point me in the right direction, that’d be great. 
>> In the end, I would like to expand the code to 2D, but given the poor 1D 
>> performance, I don’t think that this would be feasible at this point.
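>> 
>> For what it’s worth, my current loop looks roughly like the following, in 
>> case someone can spot something obviously wrong. It’s a stripped-down 
>> sketch: the 1D setup, hasOld/updateOld, and passing an explicit solver are 
>> as in the FiPy examples, while the sizes and coefficients are placeholders.
>> 
>> from fipy import (Grid1D, CellVariable, TransientTerm, DiffusionTerm,
>>                   LinearLUSolver)
>> 
>> nx, dx, dt = 2000, 1.0, 0.1
>> mesh = Grid1D(nx=nx, dx=dx)
>> phi = CellVariable(mesh=mesh, value=0.0, hasOld=True)
>> D = CellVariable(mesh=mesh, value=1.0)     # nonlinear D(phi) updated here
>> eq = TransientTerm() == DiffusionTerm(coeff=D.harmonicFaceValue)
>> solver = LinearLUSolver()                  # built once, reused every step
>> 
>> for step in range(100):
>>     phi.updateOld()
>>     eq.solve(var=phi, dt=dt, solver=solver)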
>> 
>> 
>> Thanks,
>> Carsten
>> 
>> 
_______________________________________________
fipy mailing list
fipy@nist.gov
http://www.ctcms.nist.gov/fipy
  [ NIST internal ONLY: https://email.nist.gov/mailman/listinfo/fipy ]
