Hi Jacob,
Hmm, this shouldn't be happening. The data isn't large enough. While
there *may* be a memory leak (use valgrind to find it), it is more likely
that you are just failing to dereference something(s). Is there any place
in your calculation where you might accidentally keep data around?
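As a hypothetical sketch of what "accidentally keeping data around" can look like (names here are made up, not from your code): a processing loop that appends each chunk to a list that is never cleared will keep every chunk alive, so memory grows with runtime even though each iteration only needs the current chunk.

```python
# Hypothetical example of accidental reference retention in a chunked loop.
def process_leaky(chunks):
    history = []                 # grows without bound
    total = 0
    for chunk in chunks:
        total += sum(chunk)
        history.append(chunk)    # accidental reference: chunk is never freed
    return total

def process_ok(chunks):
    total = 0
    for chunk in chunks:
        total += sum(chunk)      # chunk goes out of scope each iteration
    return total

data = ([1] * 1000 for _ in range(100))   # generator yielding 100 chunks
print(process_ok(data))                   # 100000
```

In the leaky version the interpreter can never reclaim a chunk because `history` still references it; the fixed version drops each chunk as soon as it is processed.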
Hello PyTables Users,
My current implementation works pretty well now and has the write speeds
that I am looking for; however, after around 20 minutes of execution,
with a file size of around 127 MB under level-3 Blosc compression, I seem
to get memory allocation errors. Here is the trace that I get, if an