On Mon, Oct 21, 2013 at 3:48 PM, James Snyder <jbsny...@fanplastic.org> wrote:

> I've started by putting time.time() calls before and after the call out to
> Gmsh and around the equation.solve() calls.  Should that be adequate for
> measuring the time spent meshing and solving?

That's right. The time per step for large numbers of steps is
obviously the most important quantity.
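
For reference, here is a minimal sketch of that kind of timing, assuming a
Gmsh3D mesh and a generic transient diffusion problem (the .geo file, the
variable, the coefficient, and the time step are just placeholders):

    import time
    from fipy import Gmsh3D, CellVariable, TransientTerm, DiffusionTerm

    t0 = time.time()
    mesh = Gmsh3D("geometry.geo")  # placeholder geometry file
    mesh_time = time.time() - t0

    phi = CellVariable(mesh=mesh, value=0.)
    eq = TransientTerm() == DiffusionTerm(coeff=1.)

    t0 = time.time()
    eq.solve(var=phi, dt=1.)
    solve_time = time.time() - t0

    print("meshing: %g s, solving: %g s" % (mesh_time, solve_time))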

> Now that I've actually collected some numbers on varied mesh sizes, comparing
> non-MPI (pysparse) against mpirun -np 8 --trilinos (on an 8-core machine),
> I'm seeing speedups of less than 2x for the solver section on 3-D meshes
> with element counts from 100k to 750k. I'm re-running things now to check
> that I've done the comparison consistently, since at one point I tried
> messing around with the solver and completion conditions.  I presume the
> fairest comparison would be with a fixed solver type (LinearPCG, for example)
> and the same termination conditions (tolerance/iterations)?

I think so. I'll be interested in seeing your results.
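
For what it's worth, pinning both backends to the same solver and termination
criteria could look something like the following (continuing the sketch above;
the tolerance and iteration count are just example values, and LinearPCG
assumes the problem is symmetric):

    from fipy import LinearPCGSolver

    # Same solver class and termination criteria for both the
    # pysparse run and the mpirun --trilinos run.
    solver = LinearPCGSolver(tolerance=1e-10, iterations=1000)
    eq.solve(var=phi, dt=1., solver=solver)

FiPy should resolve LinearPCGSolver to whichever backend is active, so the
same script can serve for both runs.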

>
> Do you have any expected scaling characteristics?

We do have benchmarking results somewhere, but I can't seem to locate
them at the moment. I have never used FiPy with more than 64 nodes on
a shared-memory machine, and it scaled reasonably well up to that point.

> I'd be happy to
> instrument this a bit and explore the parameter space if it would be of use
> or interest. I see that there was some work done on profiling FiPy, but it
> doesn't look like that made it into tools in the main branch.

We have made attempts at formalizing the benchmarking process, but mostly
unsuccessfully. Any results you have at all would be useful (or even just
an anecdote).
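
In case it helps, one low-tech way to sweep the parameter space is to drive
the runs externally and record wall time per run; the script name, process
counts, and flags below are placeholders for however your benchmark is set up:

    import subprocess
    import time

    # Serial pysparse baseline, then Trilinos at increasing process counts.
    runs = [["python", "benchmark.py"]] + \
           [["mpirun", "-np", str(n), "python", "benchmark.py", "--trilinos"]
            for n in (2, 4, 8)]

    for cmd in runs:
        t0 = time.time()
        subprocess.check_call(cmd)
        print("%s: %g s" % (" ".join(cmd), time.time() - t0))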

> I'll look into that a little more, since it's unexpected.  It's been a while
> since I've tried to do any debugging of Python with MPI, but I could at
> least try to provide a test case that reproduces it for me, along with
> version information for the dependencies being used.

That would be great. Thanks.
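
If it helps when putting that together, a quick way to dump version
information for the likely dependencies (the module list is just a guess at
what is relevant on your setup):

    import sys
    print("Python %s" % sys.version)
    for name in ("fipy", "numpy", "scipy", "PyTrilinos", "pysparse", "mpi4py"):
        try:
            module = __import__(name)
            print("%s %s" % (name, getattr(module, "__version__", "unknown")))
        except ImportError:
            print("%s not installed" % name)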

-- 
Daniel Wheeler
