Re: [Numpy-discussion] Numpy complex types, packing and C99

2009-07-02 Thread David Cournapeau
On Fri, Jul 3, 2009 at 4:44 AM, Pauli Virtanen wrote: > I think we tried this already (my c99-umath-funcs branch had > TestC99 special case tests that were in Numpy trunk for a while). > > The outcome was that the current implementations of the complex > functions don't have essentially any specia

Re: [Numpy-discussion] Using loadtxt to read in mixed data types

2009-07-02 Thread Peter Kelley
Thanks Pierre, For some reason 'formats':[eval(b) for b in event_format] didn't work, but, as you said, should it fail, try dtype([(x,eval(b)) for (x,b) in zip(event_fields, event_format)]), which seems to be working. Interestingly, before, when I had typed this out by hand, both using tuples and a
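
The working form from this reply can be sketched without eval at all, since np.dtype accepts format strings directly (the field names and format strings below are assumed examples, not the poster's actual values):

```python
import numpy as np

event_fields = ["onset", "dur"]   # assumed example field names
event_format = ["f8", "i4"]       # assumed example format strings

# List-of-tuples form: each (name, format-string) pair becomes one field.
# np.dtype parses the strings itself, so eval() is unnecessary.
dt = np.dtype([(x, b) for (x, b) in zip(event_fields, event_format)])
print(dt.names)   # field names in declaration order
```

Avoiding eval also sidesteps running arbitrary user input as code.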

Re: [Numpy-discussion] Using loadtxt to read in mixed data types

2009-07-02 Thread Pierre GM
On Jul 2, 2009, at 6:42 PM, Peter Kelley wrote: > > Hey Everyone, > > I am reading in a file of columns with mixed data types, and the > number of columns can vary and their format is inputted by the user. > So I came up with this: > > dt=dtype({'names': [x for x in event_fields], 'formats':

Re: [Numpy-discussion] Using loadtxt to read in mixed data types

2009-07-02 Thread Stéfan van der Walt
Hi Peter 2009/7/3 Peter Kelley : > I get TypeError: data type not understood, and I think it is because the > event format is a list of strings not data types. Does anyone know how > to convert the list of strings into the data types for dtype. In your example the problem actually comes in w

[Numpy-discussion] Using loadtxt to read in mixed data types

2009-07-02 Thread Peter Kelley
Hey Everyone, I am reading in a file of columns with mixed data types, and the number of columns can vary and their format is inputted by the user. So I came up with this: dt=dtype({'names': [x for x in event_fields], 'formats': [b for b in event_format]}) eventArray = loadtxt(behEventFile,dt)
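
A minimal, self-contained sketch of the approach in this message (the field names, format strings, and in-memory file below are assumptions standing in for the user-supplied values):

```python
import io
import numpy as np

# User-supplied column names and format strings (assumed example values)
event_fields = ["time", "code", "value"]
event_format = ["f8", "i4", "f8"]

# Build a structured dtype from the two parallel lists, as in the post
dt = np.dtype({"names": [x for x in event_fields],
               "formats": [b for b in event_format]})

# Stand-in for behEventFile: two rows of whitespace-separated columns
behEventFile = io.StringIO("0.5 1 3.3\n1.2 2 4.4\n")
eventArray = np.loadtxt(behEventFile, dtype=dt)
print(eventArray["code"])
```

With a structured dtype, loadtxt returns a 1-D array of records, one per input row, and each column is reachable by name.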

Re: [Numpy-discussion] Numpy complex types, packing and C99

2009-07-02 Thread Pauli Virtanen
On 2009-07-02, David Cournapeau wrote: > I think I will merge the complex_umath_tests branch soon > (platform-specific failures on build bot will be interesting), unless > someone sees a problem with it. I think we tried this already (my c99-umath-funcs branch had TestC99 special case tests that
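
In the spirit of the TestC99 special-value tests being discussed, a minimal probe (not the branch's actual test code) feeds inf/nan complex inputs to ufuncs and inspects the pieces that C99 Annex G constrains:

```python
import numpy as np

# C99 Annex G prescribes results for complex functions at special values
# (inf/nan components); these probes show the kind of case such tests cover.
z_nan = np.complex128(complex(np.nan, 0.0))
z_inf = np.complex128(complex(np.inf, 0.0))

print(np.sqrt(z_nan))   # nan should propagate through csqrt
print(np.exp(z_inf))    # cexp at +inf should overflow in the real part
```

Whether the imaginary parts follow the Annex G rules is exactly the platform-dependent behavior the thread worries about, so the probes above only rely on the uncontroversial components.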

Re: [Numpy-discussion] ndarray from column data

2009-07-02 Thread Robert
Elaine Angelino wrote: > Hi there -- > > Is there a fast way to make a numpy ndarray from column data? > > For example, suppose I want to make an ndarray with 2 rows and 3 columns > of different data types based on the following column data: > > C0 = [1,2] > C1 = ['a','b'] > C2 = [3.3,4.4] >
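
One standard way to build an array from the per-column lists in the question is np.rec.fromarrays; the column data C0–C2 is from the original post, while the field names are assumptions (the reply itself is truncated, so it may suggest a different route):

```python
import numpy as np

C0 = [1, 2]
C1 = ['a', 'b']
C2 = [3.3, 4.4]

# fromarrays infers a per-column dtype and zips the columns into records:
# 2 rows, 3 named fields of different types.
rec = np.rec.fromarrays([C0, C1, C2], names=['c0', 'c1', 'c2'])
print(rec[0])   # first "row" across all three columns
```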

Re: [Numpy-discussion] ndarray from column data

2009-07-02 Thread Dan Yamins
> > What's wrong with recarrays? In any case, if you need a true ndarray > object > you can always do: > > ndarr = recarr.view(np.ndarray) > > and you are done. > I have a question about this though. The object "ndarr" will consist of "records", e.g.: In [96]: type(ndarr[0]) Out[96]: If
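
A small sketch of the view round-trip discussed here (the example data is assumed): the view changes the array's Python type but not its structured dtype, so elements are still record-like scalars, which is what the truncated In/Out snippet is poking at.

```python
import numpy as np

recarr = np.rec.fromarrays([[1, 2], [3.3, 4.4]], names=['a', 'b'])

# Viewing as plain ndarray drops the recarray machinery (attribute-style
# field access) but keeps the underlying structured dtype intact.
ndarr = recarr.view(np.ndarray)
print(type(ndarr))        # <class 'numpy.ndarray'>
print(ndarr.dtype.names)  # ('a', 'b')
```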

Re: [Numpy-discussion] Multi thread loading data

2009-07-02 Thread Chris Colbert
I'm relatively certain it's possible, but then you have to deal with locks, semaphores, synchronization, etc... On Thu, Jul 2, 2009 at 12:04 PM, Sebastian Haase wrote: > On Thu, Jul 2, 2009 at 5:38 PM, Chris Colbert wrote: >> Who are you quoting, Sebastian? >> >> Multiprocessing is a python package tha

Re: [Numpy-discussion] Multi thread loading data

2009-07-02 Thread Sebastian Haase
On Thu, Jul 2, 2009 at 5:38 PM, Chris Colbert wrote: > Who are you quoting, Sebastian? > > Multiprocessing is a python package that spawns multiple python > processes, effectively side-stepping the GIL, and provides easy > mechanisms for IPC. Hence the need for serialization > I was replying to the

Re: [Numpy-discussion] Multi thread loading data

2009-07-02 Thread Chris Colbert
Who are you quoting, Sebastian? Multiprocessing is a python package that spawns multiple python processes, effectively side-stepping the GIL, and provides easy mechanisms for IPC. Hence the need for serialization On Thu, Jul 2, 2009 at 11:30 AM, Sebastian Haase wrote: > On Thu, Jul 2, 2009 at 5:1

Re: [Numpy-discussion] Multi thread loading data

2009-07-02 Thread Sebastian Haase
On Thu, Jul 2, 2009 at 5:14 PM, Chris Colbert wrote: > can you hold the entire file in memory as a single array with room to spare? > If so, you could use multiprocessing and load a bunch of smaller > arrays, then join them all together. > > It won't be super fast, because serializing a numpy array is

Re: [Numpy-discussion] Multi thread loading data

2009-07-02 Thread Chris Colbert
can you hold the entire file in memory as a single array with room to spare? If so, you could use multiprocessing and load a bunch of smaller arrays, then join them all together. It won't be super fast, because serializing a numpy array is somewhat slow when using multiprocessing. That said, its stil

Re: [Numpy-discussion] Numpy complex types, packing and C99

2009-07-02 Thread David Cournapeau
On Thu, Jul 2, 2009 at 9:02 AM, David Cournapeau wrote: > > True, but we can deal with this once we have tests: we can force to > use our own, fixed implementations on broken platforms. The glibc > complex functions are indeed not great, I have noticed quite a few > problems for special value hand

[Numpy-discussion] pep-3118 extended struct format parser

2009-07-02 Thread Sebastien Binet
hi there, at the scipy'08 sprint, somebody (apologies for my brain fade) was working on being able to parse the extended struct format string, so one could do: Nested structure :: struct { int ival; struct { unsigned short sval;
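
The nested C structure in the message maps onto a nested numpy dtype like this (the sub-struct's field name is an assumption, since the preview is cut off before the struct ends):

```python
import numpy as np

# C layout:  struct { int ival; struct { unsigned short sval; ... } sub; }
dt = np.dtype([
    ('ival', np.intc),                 # int
    ('sub', [('sval', np.ushort)]),    # nested struct with unsigned short
])

x = np.zeros(1, dtype=dt)
x[0]['sub']['sval'] = 7    # fields nest the same way the C struct does
print(dt.names)            # ('ival', 'sub')
```

A PEP-3118 extended struct format parser would produce a dtype like this one from the format string instead of from a hand-written field list.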

Re: [Numpy-discussion] ndarray from column data

2009-07-02 Thread Francesc Alted
A Thursday 02 July 2009 03:02:53 Elaine Angelino escrigué: > Hi there -- > > Is there a fast way to make a numpy ndarray from column data? > > For example, suppose I want to make an ndarray with 2 rows and 3 columns of > different data types based on the following column data: > > C0 = [1,2] > C1 =