On 7 October 2014 11:35, Jan Blechta <[email protected]> wrote:
>
> On Tue, 7 Oct 2014 10:23:21 +0100
> "Garth N. Wells" <[email protected]> wrote:
>
> >
> > On 6 Oct 2014, at 16:38, Martin Sandve Alnæs <[email protected]>
> > wrote:
> >
> > > I think this is the best solution:
> > >
> > > 1) Require the user to close file objects deterministically.
> > > Relying on the del operator is not deterministic, we need to support
> > > .close() and/or __enter__/__exit__ for the with statement in dolfin.
> > >
> >
> > Sounds good. We can print a warning message from the File object
> > destructors if a file is not closed (this can later become an error).
>
> Good idea, but maybe a warning could be issued from the __del__
> operator if the object was not properly destroyed/closed. In the C++
> layer everything is OK.

True, a warning from Foo.__del__ is better than one from ~Foo.
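Something like the following sketch, say (names are illustrative, not
the actual dolfin API; the real version would do collective cleanup in
close()):

```python
import warnings

class ManagedFile:
    """Hypothetical dolfin-style file wrapper that must be closed
    deterministically and warns from __del__ if the user forgot."""

    def __init__(self, name):
        self.name = name
        self._closed = False

    def close(self):
        # In the real wrapper, the (logically) collective cleanup
        # would happen here, on all MPI ranks at the same point.
        self._closed = True

    # Support the with statement so closing is deterministic.
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.close()
        return False  # do not suppress exceptions

    def __del__(self):
        # Non-deterministic fallback: only warn here, never rely on it
        # for collective calls.
        if not self._closed:
            warnings.warn("File %r was never closed" % self.name,
                          ResourceWarning)
```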


> Maybe we should also check how petsc4py deals with the issue and get
> eventually inspired.

Yes, they must have to deal with the same problem.


> Jan
>
> >
> > > 2) Recommend users to throw in some gc.collect() calls in their
> > > code if objects go out of scope in their code. This doesn't seem to
> > > be a big problem, but it's a lingering non-deterministic mpi
> > > deadlock waiting to happen and very hard to debug.
> > >
> >
> > What about insisting that objects that require collective calls
> > during destruction must have a collective 'clear' or 'destroy'
> > function that cleans up the object?

Yes, I think that's the only truly safe way. We can't very well start
writing

with Vector() as b:
    with Matrix() as A:
        ...

either, so the bottom line is that Python does not provide a usable
way to handle deterministic collective destruction.
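So it comes down to the explicit destroy pattern, roughly like this
(Vector here is a hypothetical stand-in, not the dolfin class):

```python
# Sketch of the explicit collective-destroy pattern: cleanup happens at
# a point in the program that every MPI rank reaches, rather than
# whenever the garbage collector gets around to it.

class Vector:
    def __init__(self):
        self._alive = True

    def destroy(self):
        # Logically collective: all ranks must call this at the same
        # point, so the underlying MPI resources are freed together.
        self._alive = False

b = Vector()
try:
    pass  # ... assemble and solve ...
finally:
    b.destroy()  # every rank reaches this, even on exceptions
```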

> > Related to this discussion, we really need to start marking
> > (logically) collective functions in the docstrings.

Sounds good.
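A minimal convention could be as simple as a fixed line in the
docstring (the class and method here are just illustrative, not the
actual dolfin interface):

```python
# Hypothetical docstring convention for marking collective operations.

class GenericVector:
    def apply(self, mode):
        """Finalize the vector after assembly.

        Collective: must be called by all MPI ranks.

        *Arguments*
            mode (str): "add" or "insert".
        """
        pass
```

A fixed marker like "Collective:" would also be easy to grep for, or to
check mechanically in the docs build.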

Martin
_______________________________________________
fenics mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics
