Rémi,
On Tuesday, 24 January 2006 at 16:00, [EMAIL PROTECTED] wrote:
> Thank you very much for your help. I have been able to go ahead with my
> problem but now I'm once again stuck :-(
> Indeed, I get "segmentation faults" whenever I try to refresh a list of
> PyTables objects created from a reference (eg. the 'DIMENSION_LIST'
> attribute).
>
> To my mind the PyTables objects I create from reference are not properly
> deleted, since the objects created to refresh the list seem to be correctly
> created (and there's a problem somewhere !).
Mmm, I bet the problem is in the way that PyTables objects kill
themselves. You should know that all PyTables objects inherit from
Node, and that Node has a __del__() method. This method is called
whenever an object gets unbound (my guess is that this happens, for
example, when you try to refresh them). Among other things, this
method will try to detach the Node from the tree, but as this Node is
somewhat 'special' (it has been created from a reference, so it is not
in the tree), all sorts of bad things will happen (as you are
experiencing).
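To see the mechanism at work outside of PyTables, here is a minimal sketch (toy class, not PyTables code) showing that __del__() fires as soon as the last reference to an object disappears, e.g. when a name is rebound during a refresh. Note that this immediate behaviour relies on CPython's reference counting:

```python
deleted = []

class Node:
    """Toy stand-in for a PyTables Node."""
    def __del__(self):
        # In PyTables, this is where the node would try to detach
        # itself from the object tree -- the step that blows up for
        # nodes that were never placed in the tree.
        deleted.append(id(self))

node = Node()
first_id = id(node)
node = Node()  # rebinding unbinds the first instance; its __del__ fires
```

After the rebinding, `deleted` contains the id of the first instance, showing that the cleanup path runs implicitly, without any explicit close call from the user.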
A possible solution is to add a Python attribute to the objects
created from a reference (for example "_v_isRef", to follow
PyTables conventions). Then the following patch should be enough to
bypass the normal procedure for killing PyTables objects:
Index: tables/Node.py
===================================================================
--- tables/Node.py (revision 1406)
+++ tables/Node.py (working copy)
@@ -318,7 +318,20 @@
# the `Node` instance were left. If closed nodes could be
# revived, the user would also need to force the closed
# `Node` out of memory, which is not a trivial task.
- #
+
+ if hasattr(self, "_v_isRef"):
+ # Special case for objects created from references.
+ # Close the associated `AttributeSet`
+ # only if it has already been placed in the object's dictionary.
+ if '_v_attrs' in myDict:
+ self._v_attrs._f_close()
+
+ # Close the object on-disk.
+ self._g_close()
+
+ # Finally, clear all remaining attributes in the object.
+ self.__dict__.clear()
+
if not self._f_isOpen():
return
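To make the intent of the patch concrete, here is a self-contained toy version of the dispatch it adds (assumed names, not the real PyTables internals): nodes flagged with "_v_isRef" skip the tree-detachment path and simply close themselves on disk and clear their attributes:

```python
closed = []

class ToyNode:
    """Toy model of the short-circuit path the patch introduces."""

    def _g_close(self):
        # Stand-in for closing the object on disk.
        closed.append("closed on disk")

    def _f_close_if_ref(self):
        myDict = self.__dict__
        if hasattr(self, "_v_isRef"):
            # Close the associated AttributeSet only if it has
            # already been placed in the object's dictionary.
            if '_v_attrs' in myDict:
                myDict['_v_attrs']._f_close()
            # Close the object on disk.
            self._g_close()
            # Finally, clear all remaining attributes in the object.
            myDict.clear()
            return True
        return False  # normal detach-from-tree path would follow

node = ToyNode()
node._v_isRef = True   # mark the node as created from a reference
node._f_close_if_ref()
```

The key point is that the caller has to set "_v_isRef" on the node right after dereferencing it; the __del__() machinery then takes the short path and never touches the (nonexistent) parent in the tree.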
Please bear in mind that all of these operations are potentially
dangerous in the sense that if you (or I, or whoever) forget to
properly close the Node, then leaks will develop. So it would be nice
if you could check carefully that repeatedly creating and deleting
such "referenced objects" does not increase memory consumption (I
hope it does not).
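One rough way to run that check is to watch the process's resident set size while creating and dropping the objects in a loop. This is only a sketch: `make_ref_object` is a placeholder for whatever builds a PyTables object from a reference, and the `resource` module is POSIX-only (on Linux, `ru_maxrss` is reported in kilobytes):

```python
import gc
import resource

def rss_kb():
    # Peak resident set size of this process (kilobytes on Linux).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

def check_for_leak(make_ref_object, iterations=10000):
    """Create and drop `iterations` objects; return RSS growth in KB."""
    before = rss_kb()
    for _ in range(iterations):
        obj = make_ref_object()
        del obj  # should trigger Node.__del__() and close the node
    gc.collect()
    after = rss_kb()
    return after - before  # should stay near zero if nothing leaks
```

For example, `check_for_leak(lambda: fileh.getNodeFromRef(ref))` (with your own dereferencing call in place of the hypothetical `getNodeFromRef`) should report essentially no growth if the nodes die cleanly.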
Salut,
--
>0,0< Francesc Altet http://www.carabos.com/
V V Cárabos Coop. V. Enjoy Data
"-"
_______________________________________________
Pytables-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/pytables-users