I've come across a strange behavior for classes subclassed from ndarray. Here's a minimal example that illustrates the problem:
    import numpy as np

    class TestArray(np.ndarray):
        def __new__(cls, data, info=None, dtype=None, copy=False):
            subarr = np.array(data, dtype=dtype, copy=copy)
            subarr = subarr.view(cls)
            return subarr

        def sort(self, *args, **kwargs):
            print type(self)
            print type(self.base)

Now consider this:

    In [1]: tst = TestArray(np.random.rand(2,3))

    In [2]: tst.sort()
    <class '__main__.TestArray'>
    <type 'numpy.ndarray'>

    In [3]: np.sort(tst)
    <class '__main__.TestArray'>
    <type 'NoneType'>
    Out[3]:
    TestArray([[ 0.90489484,  0.950291  ,  0.80753772],
               [ 0.49020689,  0.84582283,  0.61532922]])

Why would tst.sort() show the correct base class, while np.sort() shows NoneType as the base class for tst? I'd appreciate any insights ...

_______________________________________________
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion
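[Editor's note: a small sketch of where the two .base values likely come from, assuming current NumPy semantics for views and copies. tst is a view of the plain ndarray created in __new__, so its .base is that ndarray; np.sort, by contrast, sorts a fresh copy, and a copy owns its own memory, so its .base is None. Written for Python 3.]

```python
import numpy as np

class TestArray(np.ndarray):
    def __new__(cls, data, dtype=None):
        # view() creates a TestArray sharing memory with the plain
        # ndarray built here, so the result's .base is that ndarray
        return np.array(data, dtype=dtype).view(cls)

tst = TestArray(np.random.rand(2, 3))

# tst is a view, so .base points at the underlying plain ndarray
print(type(tst.base))            # <class 'numpy.ndarray'>

# np.sort(a) behaves roughly like: copy a, then sort the copy
# in place.  A copy owns its data, so its .base is None.
copied = tst.copy()
print(copied.base is None)       # True
copied.sort()                    # in-place sort on the copy

# hence inside an overridden sort() called via np.sort,
# self.base is None:
print(np.sort(tst).base is None) # True
```

So the difference isn't about subclassing going wrong: tst.sort() runs on the original view (base = ndarray), while np.sort() runs sort() on a freshly made copy (base = None).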