Re: [Numpy-discussion] Numpy on Python3
Setup.py runs 2to3 automatically for all changed files. Of course, if it's possible to cater for 2.4 and 3 at the same time, that's good. How do you work around the relative imports and the changed exception-catching syntax?

-- Original message --
Subject: Re: [Numpy-discussion] Numpy on Python3
From: David Cournapeau courn...@gmail.com
Date: 23.11.2009 08:19

On Mon, Nov 23, 2009 at 2:35 PM, Pauli Virtanen p...@iki.fi wrote: It might be nice to have this merged in at some point after 1.4.0 (after the most obvious glaring bugs have been fixed), so that we could perhaps start aiming for Python 3 compatibility in Numpy 1.5.0.

One thing I have on my end is a numpy.distutils which runs under both Python 2 and 3, so that you don't have to run 2to3 every time you make a change. I did not put it in the trunk because I did not want to modify numpy.distutils for 1.4.0 at this point, but I will include the changes as soon as I branch the trunk into 1.4.0.

David

___ NumPy-Discussion mailing list NumPy-Discussion@scipy.org http://mail.scipy.org/mailman/listinfo/numpy-discussion
Re: [Numpy-discussion] Numpy on Python3
The issue with longs is that we wouldn't want array([1,2,3]) to create object arrays -- so we need to decide on casting rules for longs. Currently, I think they're treated like Python 2 ints.

-- Original message --
Subject: Re: [Numpy-discussion] Numpy on Python3
From: Charles R Harris charlesr.har...@gmail.com
Date: 23.11.2009 08:08

On Sun, Nov 22, 2009 at 10:35 PM, Pauli Virtanen p...@iki.fi wrote:

http://github.com/pv/numpy-work/tree/py3k

    $ mkdir -p $PWD/dist/lib/python3.1/site-packages
    $ python3 setup.py install --prefix=$PWD/dist
    $ cd $PWD/dist/lib/python3.1/site-packages
    $ python3
    Python 3.1.1+ (r311:74480, Oct 11 2009, 20:22:16) [GCC 4.4.1] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import numpy
    XXX: 3K: numpy.random is disabled for now, uses PyString_*
    XXX: 3K: numpy.ma is disabled for now -- some issues
    >>> numpy.array([1., 2, 3, 4])
    array([ 1.,  2.,  3.,  4.])
    >>> _ + 10
    array([ 11.,  12.,  13.,  14.])
    >>> numpy.ones((4,), dtype=complex)/4
    array([ 0.25+0.j,  0.25+0.j,  0.25+0.j,  0.25+0.j])
    >>> numpy.array([object(), object()])
    array([<object object at 0xb7778810>, <object object at 0xb7778d90>], dtype=b'object')

Things were fairly straightforward this far, just many tiny changes. What's left is then sorting out the bigger problems :) This is still far from being complete:

- Most use of PyString_* needs auditing (note e.g. the b'object' in the dtype print above...). I simply added convenience wrappers mapping PyString to PyBytes, but this is not the correct choice at all points.
- Also, should dtype='S' be Bytes or Unicode? I chose Bytes for now.
- Whether to inherit Numpy ints from PyLong_* needs some thinking, as they are quite different objects. For now, I dropped the inheritance, but I wonder if this will break something.

Maybe. But it was always a hassle because it behaved differently than the other integer types.
Now onto float ;)

- PyFile_AsFile has disappeared, and AsFileDescriptor+fdopen doesn't seem to cut it -- I don't know exactly what's wrong here.
- Integer-to-string formatting does not seem to work.
- Larger-than-long-long Python ints probably cause problems.

We used a Python call which would raise an error if the number was too large. If that call is still valid, things should work.

- The new buffer interface needs to be implemented -- currently there are just error-raising stubs. I remember Dag was working on this a bit: how far did it go?
- Relative imports + 2to3 are a bit of a pain. A pity we can't have them in the mainline code because of Python 2.4.
- I didn't check for semantic changes in the tp_* interface functions. This we still need to do.

I left some notes in the src folder. If you discover anything new, put it in there.

- And probably many other issues lurking.

We do need to look at the initialization of the type math stuff in the ufuncobject module. Yeah, it's a bit of a circular dependency; one reason it would be nice to have ufuncs operate on buffer objects instead of ndarrays would be to break the mutual dependence.

Also, I didn't yet try checking how far the test suite passes on Python 3. (It still passes completely on Python 2, so at least I didn't break that part.) It might be nice to have this merged in at some point after 1.4.0 (after the most obvious glaring bugs have been fixed), so that we could perhaps start aiming for Python 3 compatibility in Numpy 1.5.0.

If you want to see real suffering, look at the programmer notes in the hacked CRU files. Some poor sucker was stuck with fixing up the g*dawful code while also needing to determine what data was in undocumented binary files, some with the same names but containing different data. Folks, don't let that happen to you. The conversion to Py3k is going to be a breeze by comparison.
Chuck
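On the casting question raised at the top of the thread: released Python 3 NumPy went the way hinted at here -- plain Python ints are detected and cast to a fixed-width integer dtype rather than producing object arrays, while mixed input still promotes to float. A quick sketch of that behaviour against a modern NumPy:

```python
import numpy as np

# Python 3 ints (all "longs") are cast to a native integer dtype,
# not to object arrays:
a = np.array([1, 2, 3])
assert a.dtype.kind == 'i'

# Mixed int/float input still promotes to a float dtype,
# as in the interpreter session above:
b = np.array([1., 2, 3, 4])
assert b.dtype.kind == 'f'
```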
Re: [Numpy-discussion] Numpy on Python3
Pauli Virtanen wrote: Setup.py runs 2to3 automatically for all changed files.

Yes, but I think it is more practical to have the build process be both 2- and 3-compatible.

Of course, if it's possible to cater for 2.4 and 3 at the same time, that's good. How do you work around the relative imports and the changed exception-catching syntax?

For the exception catching, one can have a few utilities to walk through the stack - I don't remember how I did it for the relative imports, but this was not a problem IIRC. I am quite disappointed that numscons will not be usable in the foreseeable future for py3k; I hoped it would have been simpler than numpy.distutils, but porting scons to py3k is too big of a task.

cheers,

David
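One common trick for the exception-catching problem (the `except ExcType, e:` form is a syntax error on Python 3, and `except ExcType as e:` is a syntax error on 2.4/2.5) is to avoid binding the exception in the except clause entirely and fetch it from sys.exc_info() instead. A minimal sketch of the idea (the function and its behaviour are illustrative, not from the thread):

```python
import sys

def parse_int(text):
    """Parse an int, returning an error string on failure.

    The except clause binds no name, so this spelling compiles
    unchanged on Python 2.4 and on Python 3.
    """
    try:
        return int(text)
    except ValueError:
        # sys.exc_info()[1] is the exception currently being handled.
        err = sys.exc_info()[1]
        return 'invalid: %s' % err
```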
Re: [Numpy-discussion] Numpy on Python3
Mon, 23 Nov 2009 01:40:00 -0500, Pierre GM wrote: [clip] XXX: 3K: numpy.ma is disabled for now -- some issues What are the issues ?

Something that would have taken more than 5 minutes to resolve :) Possibly because something that ma depends on is currently broken in numpy.core. I just wanted to breeze through and arrive as fast as possible at something that can be imported and works for simple things, so I didn't stop at anything that would take longer. I'll take a look at the rest later on.

-- Pauli Virtanen
Re: [Numpy-discussion] Numpy on Python3
Mon, 23 Nov 2009 08:58:47 +0100, Sturla Molden wrote: Pauli Virtanen wrote: XXX: 3K: numpy.random is disabled for now, uses PyString_* XXX: 3K: numpy.ma is disabled for now -- some issues I thought numpy.random uses Cython? Is it just a matter of recompiling the pyx-file?

The Cython file uses the C API directly, so we'll need a .h file with the necessary compile-time conditionals.

I remember Dag was working on this a bit: how far did it go? Cython's include file numpy.pxd has an ndarray class that extends PyArrayObject with PEP 3118 buffer compatibility.

Great! I believe I will just steal whatever I can and rewrite it in C -- for now, it seems possible to keep Numpy's core in plain C.

-- Pauli Virtanen
Re: [Numpy-discussion] Numpy on Python3
On Nov 23, 2009, at 4:36 AM, Pauli Virtanen wrote: Mon, 23 Nov 2009 01:40:00 -0500, Pierre GM wrote: [clip] XXX: 3K: numpy.ma is disabled for now -- some issues What are the issues ? Something resolving which would have taken more than 5 minutes :) Possibly because something that ma depends on is currently broken in numpy.core.

Fair enough, fair enough... But y'all, please let me know potential issues (not that I'll be able to work on it anytime soon, but just in case...).

P.
Re: [Numpy-discussion] neighborhood iterator
Thank you, this is a start. It seems that there are more issues to resolve. I am trying to build a general framework that would enable one to write filters using this iterator.

Nadav

-Original Message-
From: numpy-discussion-boun...@scipy.org on behalf of David Warde-Farley
Sent: Mon 23-Nov-09 03:21
To: Discussion of Numerical Python
Subject: Re: [Numpy-discussion] neighborhood iterator

On 22-Nov-09, at 12:50 PM, Nadav Horesh wrote: I wonder if the neighbourhood iterator can be used as a more efficient replacement for ndimage.generic_filter. Is there a way to use it from cython?

Yes, using the NumPy C API, called like any other C function from Cython. Something like:

    import numpy as np
    cimport numpy as np
    cdef extern from "numpy/arrayobject.h":
        object PyArray_NeighborhoodIterNew(object iter, np.npy_intp *bounds,
                                           int mode, np.ndarray fill_value)
        int PyArrayNeighborhoodIter_Next(object iter)
        int PyArrayNeighborhoodIter_Reset(object iter)

should do the trick. Note that you'll need to call np.import_array() before using any of these functions to initialize the C API, I think.

David
[Numpy-discussion] Resize method
Access by the interpreter prevents array resizing. Yes, one can use the resize function in place of the method, but this appears to require copying the whole array. If one sets b = a, then that reference can be deleted with del b. Is there any similar technique for the interpreter's own references?

Colin W.

    Python 2.6 (r26:66721, Oct 2 2008, 11:35:03) [MSC v.1500 32 bit (Intel)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    >>> from numpy import *
    >>> a = array(7*[3])
    >>> a.resize((3,7))
    >>> a
    array([[3, 3, 3, 3, 3, 3, 3],
           [0, 0, 0, 0, 0, 0, 0],
           [0, 0, 0, 0, 0, 0, 0]])
    >>> a.resize((4,7))
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    ValueError: cannot resize an array that has been referenced or is
    referencing another array in this way. Use the resize function
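The failure above is specific to the interactive prompt, where the `_` history variable and display machinery hold extra references. The in-place method works in a script, and the np.resize function always copies, so it works regardless of references. A short sketch of both (run as a script, so no extra references exist):

```python
import numpy as np

a = np.array(7 * [3])
a.resize((3, 7))          # works here: 'a' is the only reference
assert a.shape == (3, 7)  # grown in place, new cells zero-filled

# The function form copies, so it succeeds even with other references
# around; note it repeats the data cyclically instead of zero-filling.
b = a                     # extra reference: a.resize() would now fail
c = np.resize(a, (4, 7))
assert c.shape == (4, 7)
```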
[Numpy-discussion] recompiling everything ?
I have several scikits with C extensions compiled against an older trunk version of 1.4, but I didn't recompile them with the numpy trunk version that I currently use. scikits.timeseries just crashed on me, taking several hours' worth of examples with it. (scipy crashed during testing, so I rebuilt it immediately with the matching numpy version.) Is this only related to the temporary (?) ABI breaking during the datetime merge, or do we now have to recompile all packages with C extensions each time we update numpy trunk? (If scipy trunk didn't require numpy trunk, I would gladly just stick with the release version of numpy.)

Josef
Re: [Numpy-discussion] Numpy on Python3
Pauli Virtanen wrote: [clip] Great! I believe I will just steal whatever I can and rewrite it in C -- for now, it seems possible to keep Numpy's core in plain C.

I did sit down with David to learn enough to do this, and had a brief start on doing it properly for NumPy at SciPy 2009 (with separate test cases and the buffer format string stored directly in the NumPy dtype structs on creation). I meant to come back to it in November, but due to becoming sick etc. that's no longer possible. If nothing happens by the mid/end of January I still hope to be able to do this then. Feel free to ask any questions about the buffer PEP if you do go forward with this, as I've used it a lot in Cython (and wrote the implementation on behalf of NumPy there).

The Cython numpy.pxd currently does:

- Temporarily allocate buffers for the format string; this is inefficient, as NumPy could store them directly in the dtype when the dtype is constructed
- Not support non-native endianness (and some other relevant packing formats, I believe)
- Not support the string types
- Not support dtypes with nested sub-arrays within records

What is done: David added some code (at least in some branch) that ensures that sizeof(npy_intp) == sizeof(Py_ssize_t) on Python 2.6+. (I.e. if that assumption is violated NumPy won't compile, so we're free to assume it until the issue of using npy_intp for indices is fixed on a more fundamental level in NumPy.)
This means that the shapes/strides in Py_buffer can point directly at the dimensions/strides in the NumPy array struct (whereas Cython's numpy.pxd has to make a copy on some platforms for Python 2.4).

Dag Sverre
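For orientation, once this PEP 3118 support landed, the new buffer interface became visible from pure Python through memoryview, and the shape/strides the exporter fills in correspond to the array's own dimensions and strides as described above. A small sketch against a modern NumPy:

```python
import numpy as np

a = np.arange(12, dtype=np.int32).reshape(3, 4)

# memoryview() goes through the PEP 3118 buffer interface; the
# exporter (NumPy) fills in Py_buffer's shape/strides/itemsize.
m = memoryview(a)

assert tuple(m.shape) == (3, 4)
assert m.itemsize == 4      # int32
assert m.ndim == 2
```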
Re: [Numpy-discussion] neighborhood iterator
Hi,

Having just written some Cython code to iterate a neighborhood across an array, I have some ideas about features that would be useful for a general framework. Specifically, being able to pass in a footprint boolean array to define the neighborhood is really useful in many contexts. Also useful is the ability to query the offset of the current pixel from the center of the neighborhood. (These two features, plus very efficient handling of boundary conditions by breaking the image into regions where the conditions are and are not required, make the image iterators in ITK really nice to use.)

Zach

On Nov 23, 2009, at 9:12 AM, Nadav Horesh wrote: Thank you, this is a start. It seems that there are more issues to resolve. I am trying to build a general framework that would enable one to write filters using this iterator. Nadav [clip]
David
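To make the footprint idea concrete without any C API or ITK machinery, here is a minimal pure-NumPy sketch of a footprint-driven filter: a boolean mask selects which neighbors each pixel sees, and the boundary is handled by zero padding. The function name and the padding choice are mine for illustration; this is the semantics, not an efficient implementation.

```python
import numpy as np

def footprint_filter(img, footprint, func):
    """Apply func to the values selected by a boolean footprint
    centred on each pixel; edges are handled by zero padding."""
    fy, fx = footprint.shape
    py, px = fy // 2, fx // 2
    padded = np.pad(img, ((py, py), (px, px)), mode='constant')
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + fy, j:j + fx]
            out[i, j] = func(window[footprint])   # only footprint members
    return out

# A 4-connected "cross" footprint over an image of ones:
footprint = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=bool)
img = np.ones((3, 3))
out = footprint_filter(img, footprint, np.sum)
# Centre pixel sees all 5 footprint members; a corner sees only 3
# (the other 2 fall in the zero padding).
```

scipy.ndimage.generic_filter exposes essentially this via its footprint argument, which is what the thread is comparing against.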
[Numpy-discussion] datetime update
I've made a few changes to datetime today and spent some time looking over what is there and what remains to be implemented. Basically, the biggest thing left to do is to implement the low-level casting functions to and from the datetime types and other numpy types. In addition, the ufuncs need some auditing to make sure the right thing is being done when mixing different units. After that, lots and lots of additional tests need to be written. Once that is done, most of the features should be available, but I suspect a few lingering issues might crop up and require fixing or fleshing out as well. I was hoping that someone would be able to contribute more tests for datetime. I will spend some time on the casting functions over the next few weeks and write a few tests.

I fixed a problem today with the fact that PyArray_DescrFromScalar was not returning a data-type object with the correct frequency information when given a datetime64 or timedelta64 scalar (it was ignoring the date-time metadata on the scalar). This fixed a problem with printing, so that now a = arange(10).view('M8[Y]') shows something reasonable. I also removed numpy.datetime and numpy.timedelta from the namespace (replaced them with numpy.datetime_ and numpy.timedelta_). These were just short-hand for numpy.datetime64 and numpy.timedelta64 respectively. Avoiding the collision seemed like a good idea.

Right now, what works is viewing arrays as datetime data-types and getting and setting date-time arrays using datetime objects. I would like to improve it so that setting with strings, integers, and other Python objects works as well. Also, adding simple integers works, but Dave C suggested removing the new C-API calls, which sounds like a good idea to me for 1.4.0. Which functions get exported into the C-API for 1.5.0 could then receive some discussion. I apologize for the slow communication about where things are at.
Best regards,

-Travis
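For orientation, the features being implemented here (string parsing, unit-aware arithmetic, the view trick) are all present in released NumPy; a sketch in the modern spelling:

```python
import numpy as np

# Setting from strings works in finished NumPy:
d = np.array(['2009-11-23', '2009-11-24'], dtype='datetime64[D]')
assert d.dtype == np.dtype('datetime64[D]')

# Subtraction yields timedelta64 with matching units:
delta = d[1] - d[0]
assert delta == np.timedelta64(1, 'D')

# The a = arange(10).view('M8[Y]') example from the post:
# integers viewed as datetimes count from the 1970 epoch.
years = np.arange(10, dtype=np.int64).view('M8[Y]')
assert str(years[0]) == '1970'
```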
[Numpy-discussion] make hstack and vstack promote 1-D argument to 2-D when necessary
I opened ticket #1302 to make the following enhancement request: I'd like to see hstack and vstack promote 1-D arguments to 2-D when this is necessary to make the dimensions match. In the following example, c_ works as expected while hstack does not:

    [~]|8> x
    array([[1, 2, 3],
           [4, 5, 6],
           [7, 8, 9]])
    [~]|9> y
    array([10, 11, 12])
    [~]|10> c_[x,y]
    array([[ 1,  2,  3, 10],
           [ 4,  5,  6, 11],
           [ 7,  8,  9, 12]])
    [~]|13> hstack((x,y))
    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    <ipython console> in <module>()
    C:\Program Files\Python25\lib\site-packages\numpy\lib\shape_base.pyc in hstack(tup)
        503
        504
    --> 505     return _nx.concatenate(map(atleast_1d,tup),1)
        506
        507 row_stack = vstack
    ValueError: arrays must have same number of dimensions
Re: [Numpy-discussion] make hstack and vstack promote 1-D argument to 2-D when necessary
On Mon, Nov 23, 2009 at 18:43, Dr. Phillip M. Feldman pfeld...@verizon.net wrote: I opened ticket #1302 to make the following enhancement request: I'd like to see hstack and vstack promote 1-D arguments to 2-D when this is necessary to make the dimensions match. In the following example, c_ works as expected while hstack does not:

This isn't going to change. It would be inconsistent with the way we promote 1D arrays to higher dimensions when broadcasting (i.e. we always prepend 1s to the shape tuple so 1D arrays are treated like 2D row vectors). For the hstack() case, you might want to use column_stack() which explicitly treats 1D arrays like columns. Otherwise, just use column vectors explicitly.

-- Robert Kern

I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth. -- Umberto Eco
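A quick sketch of the two alternatives Robert mentions, using the arrays from the ticket (nothing here beyond released NumPy):

```python
import numpy as np

x = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
y = np.array([10, 11, 12])

# column_stack treats 1-D inputs as columns, like c_:
out = np.column_stack((x, y))
assert out.shape == (3, 4)
assert out[0, 3] == 10

# Or make the column vector explicit, and plain hstack works too:
out2 = np.hstack((x, y[:, np.newaxis]))
assert (out == out2).all()
```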
Re: [Numpy-discussion] datetime update
On Nov 23, 2009, at 6:42 PM, Travis Oliphant wrote: I've made a few changes to datetime today and spent some time looking over what is there and what remains to be implemented.

As always, many thanks for your work!!

Basically, the biggest thing left to do is to implement the low-level casting functions to and from the datetime types and other numpy types. In addition, the ufuncs need some auditing to make sure the right thing is being done when mixing different units. After that, lots and lots of additional tests need to be written. Once that is done, most of the features should be available, but I suspect a few lingering issues might crop up and require fixing or fleshing out as well. I was hoping that someone would be able to contribute more tests for datetime. I will spend some time on the casting functions over the next few weeks and write a few tests.

Fortunately, the new modifications will make it easier to write such tests... But in any case, we can assume that what is proposed in the NEP should work, right?

I also removed numpy.datetime and numpy.timedelta from the namespace (replaced them with numpy.datetime_ and numpy.timedelta_). These were just short-hand for numpy.datetime64 and numpy.timedelta64 respectively. Avoiding the collision seemed like a good idea. Right now, what works is viewing arrays as datetime data-types and getting and setting date-time arrays using datetime objects. I would like to improve it so that setting with strings, integers, and other Python objects works as well.

Did you use any of Marty Fuhry's GSoC work? What are the potential issues that could prevent an easy integration?

Also, adding simple integers works, but Dave C suggested removing the new C-API calls, which sounds like a good idea to me for 1.4.0. Which functions get exported into the C-API for 1.5.0 could then receive some discussion.

Wouldn't it be easier to leave the C-API as it is now, even for 1.4.0, but not to advertise it before 1.5.0?
Thanks again for everything

P.
Re: [Numpy-discussion] Resize method
Christopher Barker wrote: Colin J. Williams wrote: Access by the interpreter prevents array resizing.

Yup -- resize is really fragile for that reason. It really should be used quite sparingly. Personally, I think it should probably only be used when wrapped in a higher-level layer.

I've been working on an extendable array class I call an accumulator (bad name...). The idea is that you can use it to accumulate values when you don't know how big it's going to end up, rather than using a list for this, which is the standard idiom.

    In [2]: import accumulator
    In [3]: a = accumulator.accumulator((1,2,3,4,))
    In [4]: a
    Out[4]: accumulator([1, 2, 3, 4])
    In [5]: a.append(5)
    In [6]: a
    Out[6]: accumulator([1, 2, 3, 4, 5])
    In [8]: a.extend((6,7,8,9))
    In [9]: a
    Out[9]: accumulator([1, 2, 3, 4, 5, 6, 7, 8, 9])

At the moment, it only supports 1-D arrays, though I'd like to extend it to n-D, probably only allowing growth along the first axis. This has been discussed on this list a fair bit, with mixed reviews as to whether there is any point. It's slower than lists in common usage, but has other advantages -- I'd like to see a C version, but don't know if I'll ever have the time for that. I've enclosed the code for your viewing pleasure.

-Chris

Thanks for this. My aim is to extract a row of data from a line in a file and append it to an array. The number of columns is fixed but, at the start, the number of rows is unknown. I think that I have sorted out the resize approach, but I need more tests before I share it. Your accumulator idea is interesting. Back in 2004, I worked on MyMatrix, based on numarray - abandoned when numpy came onto the scene. One of the capabilities there was an /append/ method, intended to add a conforming matrix to the right of or below the given matrix.
It was probably not efficient, but it provided a means of joining together block matrices. The append signature, from a January 2005 backup, is here:

    def append(self, other, toRight= False):
        ''' Return self, with other appended, to the Right or Below,
            default: Below.
            other - a matrix, a list of matrices, or objects which can be
                    converted into matrices.
        '''
        assert self.iscontiguous()
        assert self.rank == 2
        if isinstance(other, _n.NumArray):
            ...

Colin W.
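For concreteness, the growable-array idea behind the accumulator can be sketched in a few lines: keep an oversized buffer, double it when full, and expose only the filled portion. This is my own toy version, not Chris's attached class.

```python
import numpy as np

class Accumulator:
    """Toy 1-D growable array: amortized O(1) append via buffer doubling."""

    def __init__(self, data=(), dtype=float):
        self._buf = np.empty(8, dtype=dtype)
        self._n = 0
        self.extend(data)

    def append(self, value):
        if self._n == len(self._buf):
            # Buffer full: double it (np.resize copies; old data survives).
            self._buf = np.resize(self._buf, 2 * len(self._buf))
        self._buf[self._n] = value
        self._n += 1

    def extend(self, values):
        for v in values:
            self.append(v)

    def __array__(self, dtype=None, copy=None):
        # np.asarray(acc) returns just the filled part of the buffer.
        out = self._buf[:self._n].copy()
        return out if dtype is None else out.astype(dtype)

    def __repr__(self):
        return 'accumulator(%s)' % self._buf[:self._n].tolist()

# Mirrors the session from the email:
a = Accumulator((1, 2, 3, 4))
a.append(5)
a.extend((6, 7, 8, 9))
```

The doubling strategy is the same trade-off Python lists make internally, which is why a C implementation would be needed to beat them.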
Re: [Numpy-discussion] recompiling everything ?
On Tue, Nov 24, 2009 at 12:53 AM, josef.p...@gmail.com wrote: Is this only related to the temporary (?) ABI breaking during the datetime merge, or do we now have to recompile all packages with c-extension each time we update numpy trunk?

Hopefully that's due to the temporary breakage. There is unfortunately no easy way to fix this, because of the way C extensions work in Python: we can't rely on the linker to help us, in particular.

David
Re: [Numpy-discussion] Resize method
Colin J. Williams wrote: Thanks for this. My aim is to extract a row of data from a line in a file and append it to an array. The number of columns is fixed but, at the start, the number of rows is unknown.

That is exactly the kind of use-case I had in mind. In fact, you can use it now if you use a custom dtype to hold your row of data, rather than using a 2-D array.

I think that I have sorted out the resize approach but I need more tests before I share it.

Please do, and consider adding to and/or making suggestions for accumulator -- why should we all re-invent this wheel? To be fair, you can do what you need by accumulating in a Python list, then making an array out of it when you are done -- and it will probably be faster, but *may* take more memory.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer
Emergency Response Division
NOAA/NOS/ORR (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception
chris.bar...@noaa.gov
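The standard list-accumulation idiom Chris refers to looks like this for the rows-from-a-file case in question (the input lines are stand-ins for a real open file):

```python
import numpy as np

# Stand-in for: for line in open('data.txt'):
lines = ["1.0 2.0 3.0", "4.0 5.0 6.0", "7.0 8.0 9.0"]

rows = []
for line in lines:
    # Accumulate each parsed row in a plain Python list...
    rows.append([float(v) for v in line.split()])

# ...and convert once at the end, when the row count is known.
arr = np.array(rows)
assert arr.shape == (3, 3)
```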
[Numpy-discussion] Python 2.6, NumPy on CentOS 5.3
An application package that I have requires Python 2.6 and NumPy. I've installed Python 2.6 in a parallel manner as follows: no modification of the core Python 2.4 in /usr/bin has been done. Rather, I installed Python 2.6 under /opt/Python_2.6.4 and modified my user (not root) environment variables appropriately. The directory /opt/Python_2.6.4 was modified with chown to give me rwx access.

To install NumPy, I've downloaded the latest .tgz sources (v1.3.0) to build. When I attempt to configure/build I receive various errors related to blas and lapack. The NumPy configuration searches /usr/lib, /usr/lib64, /usr/local/lib, and /usr/local/lib64 for various blas, lapack, and atlas libraries. Within /usr/lib64 I do find a few blas and lapack libraries installed (libblas.so.3.1.1 and liblapack.so.3.1.1), but configure is not finding them. No atlas libraries are found, but my understanding is that these are deprecated anyway.

As an alternative, I tried to install NumPy for the standard Python 2.4.3 using yum install numpy, but I receive an error saying that numpy is obsoleted by PyNumeric. What?? PyNumeric is the precursor to NumPy. So even in the most basic instance, I cannot install NumPy because a deprecated library is seen as higher priority? Even given the generally out-of-date nature of CentOS, this is unrealistic. Finally, I could try to build blas and lapack myself, but this seems to border on insanity. Any help is appreciated.

-Kirk
Re: [Numpy-discussion] Python 2.6, NumPy on CentOS 5.3
rkdeli...@gmail.com wrote: [clip] Within /usr/lib64 I do find a few blas and lapack libraries installed (libblas.so.3.1.1 and liblapack.so.3.1.1), but configure is not finding them.

You need the *.so files, not the *.so.3.1.1 ones. The latter are enough to *run* applications linked against the library; the former are necessary to *link* against them. IOW, you need the devel packages for blas, lapack (and python). If you want to do it without admin rights, there is no other solution than building them yourself.

David
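Concretely, on a CentOS 5.x box with admin rights, the missing link-time symlinks come from the -devel packages. This is a setup fragment, not a runnable example; the package names are assumed from the standard repositories, and the commands need root:

```shell
# Install the development packages that provide the plain .so symlinks
# needed at link time (the runtime packages only ship libblas.so.3.1.1 etc.):
yum install blas-devel lapack-devel

# Verify the link-time symlinks now exist:
ls -l /usr/lib64/libblas.so /usr/lib64/liblapack.so
```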