Hello all,

As mentioned in my previous email, I encountered an intermittent
segfault when using 'numpy.array(thing)' to convert 'thing' (which
provides an __array_interface__) to a numpy array.

The offending __array_interface__ has a 'data' item that is a python
string (not, as specified in the protocol, a pointer to a memory area)
which is too small for the declared array size; in some cases this
causes a segfault.
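For reference, here is a minimal sketch of an object exposing a
well-formed __array_interface__ (the class name and values are my own
illustration, not the actual 'thing' that triggered the crash); the
problem arises when the buffer supplied as 'data' is smaller than
shape x itemsize implies:

```python
import numpy as np

class Thing:
    """Hypothetical object exposing the __array_interface__ protocol.

    Here 'data' is a buffer object whose size (16 bytes) matches the
    declared shape and typestr (4 elements x 4 bytes). The segfault in
    question occurs when a too-small sized object is supplied instead.
    """
    def __init__(self):
        # 4 little-endian int32 values -> 16 bytes of backing storage
        self._buf = np.arange(4, dtype=np.int32).tobytes()

    @property
    def __array_interface__(self):
        return {
            'version': 3,
            'shape': (4,),
            'typestr': '<i4',   # little-endian 4-byte signed int
            'data': self._buf,  # buffer object, not a bare pointer
        }

arr = np.array(Thing())
```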

Is this a bug? That is, should numpy check the size of 'data' (when
'data' is in fact a sized python object and not just a bare pointer)
against the size implied by the array shape and typestr? Or is this
too much of a corner case to deal with?

Zach
_______________________________________________
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion
