Hi,

In the code below, as it works through the 100,000-iteration loop appending
rows, the memory used by the process (as measured by top on a Linux box)
steadily increases. How can I prevent this? Calling flush() does not seem to
help. This would seem to put a limit on how much data I can append to a file.
The behavior occurs with both v1.4 and the 2.0 beta.
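
For what it's worth, the same numbers can be watched from inside the script
by parsing VmRSS out of /proc/self/status instead of eyeballing top (a
Linux-only sketch):

def rss_kb():
    # Resident set size of this process in kB, read from the proc
    # filesystem; this is the figure top shows in its RES column.
    for line in open('/proc/self/status'):
        if line.startswith('VmRSS:'):
            return int(line.split()[1])

Printing rss_kb() alongside i inside the loop makes the growth easy to log.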

thanks!
Stefan Kuzminski


from tables import *
from time import sleep

num_cols = 1500

# One table with a single column: a 1500-element float64 array per row.
fp = openFile("foo", 'w')
table = fp.createTable(fp.root, 'title',
                       {'var1': Float64Col(shape=(num_cols,))}, '')

print 'appending...'
row = range(num_cols)
for i in range(100000):
    table.append([[row]])       # one row: a single array-valued field
    if i % 1000 == 0:
        print i
        table.flush()           # flushing does not release the memory
print 'done appending'
sleep(20)                       # pause so top can be checked after the loop
fp.close()
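
For comparison, here is the same loop written with the Row accessor instead
of nested lists (the calls are the standard 1.x/2.x Row API); whether it
changes the memory behavior is unverified, but it exercises a different
append path:

r = table.row                   # named r so it doesn't shadow the row list
for i in range(100000):
    r['var1'] = row             # assign the 1500-float field
    r.append()                  # queue the row in the table's buffer
    if i % 1000 == 0:
        table.flush()           # write the buffered rows out
table.flush()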