my previous message

2014-10-08 Thread olivier atteia
Hello all, in my previous message I had too coarse a grid and bad time steps; now that these are fixed it seems to work. Sorry to bother you. olivier

a problem of dispersion

2014-10-08 Thread olivier atteia
Hello, I am working on advection-dispersion in a porous medium. I previously used FiPy for some complex boundary conditions and it worked nicely, but now I was trying a 2D example with dispersion in two directions, which gave strange results. So I came back to the simplest problem in 1D (a fixed …
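For reference, a minimal sketch of what the "simplest problem in 1D" with a fixed inlet concentration might look like in FiPy; the grid size, velocity, dispersion coefficient, and time step below are illustrative assumptions, not values from the original message.

from fipy import (Grid1D, CellVariable, TransientTerm, DiffusionTerm,
                  PowerLawConvectionTerm)

# illustrative parameters (assumed, not from the post)
L, nx = 1.0, 400          # domain length and number of cells
u, D = 1.0, 1e-3          # pore velocity and dispersion coefficient

mesh = Grid1D(dx=L / nx, nx=nx)
c = CellVariable(name="concentration", mesh=mesh, value=0.)
c.constrain(1., mesh.facesLeft)              # fixed concentration at the inlet
c.faceGrad.constrain([0.], mesh.facesRight)  # zero-gradient outlet

# transient advection-dispersion equation
eq = TransientTerm() == DiffusionTerm(coeff=D) - PowerLawConvectionTerm(coeff=(u,))

dt = 0.5 * (L / nx) / u   # keep the Courant number around 0.5
for step in range(200):
    eq.solve(var=c, dt=dt)

A too coarse grid or too large a time step is a common source of spurious numerical dispersion in this kind of problem, which is consistent with the fix reported in the previous thread.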

Re: Parallelizing does not have any benefit, because Trilinos is slower than the standard solver (Simple Version)

2014-10-08 Thread Daniel Wheeler
On Wed, Oct 8, 2014 at 3:52 AM, Serbulent UNSAL wrote: > There should be some communication overhead, but this couldn't explain a 2.5 times slower solution. > > So maybe it is a good idea to forward the problem to Trilinos upstream if you also confirm the results with 40,000 cells vs. 160,000 …
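A hedged sketch of the kind of serial-versus-parallel timing comparison under discussion; the two mesh sizes (40,000 and 160,000 cells) come from the thread, but the diffusion test problem, the number of sweeps, and the timing harness are assumptions for illustration. Running it as plain "python bench.py" should use the serial solvers, while something like "mpirun -np 4 python bench.py --trilinos" (or FIPY_SOLVERS=trilinos) should exercise the parallel Trilinos path.

import time
from fipy import Grid2D, CellVariable, TransientTerm, DiffusionTerm, parallelComm

for nx in (200, 400):     # 200*200 = 40,000 and 400*400 = 160,000 cells
    mesh = Grid2D(nx=nx, ny=nx, dx=1.0 / nx, dy=1.0 / nx)
    phi = CellVariable(mesh=mesh, value=0.)
    phi.constrain(1., mesh.facesLeft)

    # simple transient diffusion problem, solved implicitly
    eq = TransientTerm() == DiffusionTerm(coeff=1.0)

    start = time.time()
    for _ in range(10):
        eq.solve(var=phi, dt=1e-4)
    elapsed = time.time() - start

    if parallelComm.procID == 0:   # only rank 0 reports
        print("%d cells on %d processes: %.2f s" % (nx * nx, parallelComm.Nproc, elapsed))

Comparing the two cell counts helps separate fixed per-solve communication overhead from work that actually scales with problem size.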

Re: Parallelizing does not have any benefit, because Trilinos is slower than the standard solver (Simple Version)

2014-10-08 Thread Serbulent UNSAL
Hello, thanks for the detailed info. Confirming the parallel run was my first action, both with parallel.py and by printing "fipy.parallelComm.procID"; I also printed the mpi4py version of procID and Epetra's version, "Epetra.PyComm().MyPID()". So I'm definitely sure :) Your tests are consistent with my result …
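The checks mentioned above, collected into a short script; fipy.parallelComm.procID and Epetra.PyComm().MyPID() are the calls named in the message, while the mpi4py line and the overall structure are a hedged illustration.

# sanity check that the script is really running under MPI on every rank
from fipy import parallelComm

print("FiPy procID: %d of %d" % (parallelComm.procID, parallelComm.Nproc))

try:
    from mpi4py import MPI
    print("mpi4py rank: %d" % MPI.COMM_WORLD.Get_rank())
except ImportError:
    pass

try:
    from PyTrilinos import Epetra
    print("Epetra MyPID: %d" % Epetra.PyComm().MyPID())
except ImportError:
    pass

Launched as, say, "mpirun -np 4 python check_parallel.py --trilinos", each of the four ranks should print a distinct ID; identical IDs on every rank would mean the copies are not actually communicating.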