On Wednesday 14 January 2009, Kevin Dunn wrote:
> Francesc,
>
> Firstly, thanks for the great work you all are doing with PyTables.
> I really appreciate the manuals, website and the easy-to-use API.

Glad that you liked it :-)

> What about using these public datasets hosted at Amazon?
> http://aws.amazon.com/publicdatasets/
Hi Milos,

On Wednesday 14 January 2009, Milos Ilak wrote:
> Hi all,
>
> I realize it's been almost six months since I started this thread,
> but I just wanted to let people know that I figured out what my issue
> was last night, in case someone else was having similar problems. I
> seldom need to [...]
Francesc,

Firstly, thanks for the great work you all are doing with PyTables. I
really appreciate the manuals, website and the easy-to-use API.

What about using these public datasets hosted at Amazon?
http://aws.amazon.com/publicdatasets/

I imagine that setting up the code to access these datasets [...]
Hi all,
I realize it's been almost six months since I started this thread, but I
just wanted to let people know that I figured out what my issue was last
night, in case someone else was having similar problems. I seldom need to
run my codes on my personal computer, so it wasn't an urgent issue for [...]
Hi Toby,

On Wednesday 14 January 2009, Toby Mathieson wrote:
> Hi there,
>
> I am trying to port a single large table (~150 million records) into
> a single HDF5 file using PyTables. Whilst I am sure this is no
> problem to store in an HDF5 file, I am wondering if there is any way
> of getting this data [...]
Hi there,

I am trying to port a single large table (~150 million records) into a
single HDF5 file using PyTables. Whilst I am sure this is no problem
to store in an HDF5 file, I am wondering if there is any way of getting
this data directly from the MySQL file into the HDF5 file without
having to
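For the record, the bulk-loading part of the question above can be sketched roughly as below. This is a minimal sketch using the modern PyTables 3 API (snake_case names): sqlite3 stands in for MySQL so the example is self-contained, but with a real MySQL server you would swap in a DB-API driver such as MySQLdb/pymysql and keep the same fetchmany loop. The table name, column layout, chunk size and file name are all made up for illustration.

```python
# Sketch: streaming rows from a relational table into a PyTables HDF5 table.
# sqlite3 is a stand-in for MySQL; any DB-API cursor works the same way.
import sqlite3
import tables as tb

# Build a small stand-in source table (in the real case this lives in MySQL).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, value REAL)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(1000)])

# Describe the destination row layout for PyTables.
class Record(tb.IsDescription):
    id = tb.Int64Col()
    value = tb.Float64Col()

with tb.open_file("records.h5", mode="w") as h5:
    out = h5.create_table("/", "records", Record, "rows ported from SQL")
    cur = conn.execute("SELECT id, value FROM records")
    # Fetch in chunks so a ~150-million-row table never sits in memory at once.
    while True:
        chunk = cur.fetchmany(100)
        if not chunk:
            break
        out.append(chunk)  # Table.append accepts a sequence of row tuples
    out.flush()

with tb.open_file("records.h5") as h5:
    print(h5.root.records.nrows)  # -> 1000
```

The chunked fetchmany/append loop is the key design point: neither the SQL result set nor the HDF5 table is ever fully materialised in memory, which is what makes a 150-million-record port feasible on an ordinary machine.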