Re: [Numpy-discussion] Building a static libnumpy

2008-01-21 Thread Robert Kern
Jussi Enkovaara wrote:
> Hi,
> I need to use numpy in an environment which does not support shared libraries.
> Previously, I have used the old Numeric where I created a Makefile to build a
> static Numeric library which was later on linked to the python interpreter.
> 
> As far as I understood, numpy uses sort of extended distutils. I was 
> wondering if
> this extended distutils enables building of static libraries or do I have to 
> go
> the cumbersome Makefile-way again?

Cumbersome Makefile, sorry.

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
  that is made terrible by our own mad attempt to interpret it as though it had
  an underlying truth."
   -- Umberto Eco
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread Charles R Harris
On Jan 21, 2008 11:27 PM, Robert Kern <[EMAIL PROTECTED]> wrote:

> Charles R Harris wrote:
> >
> >
> > On Jan 21, 2008 6:09 PM, Jarrod Millman <[EMAIL PROTECTED]> wrote:
> >
> > On Jan 21, 2008 2:03 PM, Matthew Brett <[EMAIL PROTECTED]> wrote:
> >  > Search for the docstring standard, I hit this:
> >  >
> >  > http://www.scipy.org/DocstringStandard
> >
> > Good catch, I didn't know this page existed.  If I recall correctly,
> > Keir Mierle showed up on the mailing list around the time we were
> > discussing the docstring standard.  He proposed to do some work, but
> > then must have gotten busy with something else.  In particular, I
> > believe he was interested in seeing a unified docstring standard for
> > numpy, scipy, and matplotlib.  I guess he put this page up during that
> > period.  I went ahead and deleted it, since it conflicts with the
> > official docstring standard.
> >
> >  > but I think the current thinking is this:
> >  >
> >  > http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines
> >  >
> >  > Is that correct? Does the first page apply to matplotlib in some way?
> >  > Should we change the first page to match the second now?
> >
> > Yes.  That page is autogenerated from the official coding standard
> > that is in the numpy trunk.  One of the nice features of trac is that
> > it can render restructured text from the svn repository.  This helps
> > us keep from having duplicate information that needs to be kept in
> > sync by hand.
> >
> >
> > If I hit the source code link in the generated  html, it looks like that
> > page was generated from the old document format. The new document format
> > doesn't produce output that looks anything like that and epydoc
> > generates a couple of warnings:
> >
> > | File /home/charris/workspace/numpy/numpy/doc/example.py, line 19, in
> > | example.foo
> > |   Warning: Line 24: Wrong underline character for heading.
> > |   Warning: Lines 27, 30, 32, 37, 39, 41, 43, 48, 50: Improper paragraph
> > |indentation.
> >
> > The new document format requires a preprocessor that has yet to be written.
>
> epydoc from SVN works just fine once the following line is added at the
> top.
>
> __docformat__ = "restructuredtext en"
>

So it does, sorry for the noise. Does it work with the docstrings for the C
code also?

Chuck


[Numpy-discussion] Sage/Scipy Days 8 reminder: Feb 29-March 4.

2008-01-21 Thread Fernando Perez
Hi all,

Just a quick reminder for all about the upcoming Sage/Scipy Days 8 at
Enthought collaborative meeting:

http://wiki.sagemath.org/days8

Email me directly ([EMAIL PROTECTED]) if you plan on coming,
so we can have a proper count and plan accordingly.

Cheers,

f


Re: [Numpy-discussion] Building a static libnumpy

2008-01-21 Thread Matthieu Brucher
Hi,

I think the main problem would be that some parts of Numpy are coded in C
and that they must be compiled as a shared library so that Python can access
them.

Matthieu

2008/1/22, Jussi Enkovaara <[EMAIL PROTECTED]>:
>
> Hi,
> I need to use numpy in an environment which does not support shared
> libraries.
> Previously, I have used the old Numeric where I created a Makefile to
> build a
> static Numeric library which was later on linked to the python
> interpreter.
>
> As far as I understood, numpy uses sort of extended distutils. I was
> wondering if
> this extended distutils enables building of static libraries or do I have
> to go
> the cumbersome Makefile-way again?
>
> Regards,
> Jussi Enkovaara
>



-- 
French PhD student
Website : http://matthieu-brucher.developpez.com/
Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92
LinkedIn : http://www.linkedin.com/in/matthieubrucher


[Numpy-discussion] Building a static libnumpy

2008-01-21 Thread Jussi Enkovaara
Hi,
I need to use numpy in an environment which does not support shared libraries.
Previously, I have used the old Numeric where I created a Makefile to build a
static Numeric library which was later on linked to the python interpreter.

As far as I understood, numpy uses a sort of extended distutils. I was
wondering if this extended distutils enables building of static libraries, or
do I have to go the cumbersome Makefile way again?

Regards,
Jussi Enkovaara



Re: [Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread Robert Kern
Charles R Harris wrote:
> 
> 
> On Jan 21, 2008 6:09 PM, Jarrod Millman <[EMAIL PROTECTED]> wrote:
> 
> On Jan 21, 2008 2:03 PM, Matthew Brett <[EMAIL PROTECTED]> wrote:
>  > Search for the docstring standard, I hit this:
>  >
>  > http://www.scipy.org/DocstringStandard
> 
> Good catch, I didn't know this page existed.  If I recall correctly,
> Keir Mierle showed up on the mailing list around the time we were
> discussing the docstring standard.  He proposed to do some work, but
> then must have gotten busy with something else.  In particular, I
> believe he was interested in seeing a unified docstring standard for
> numpy, scipy, and matplotlib.  I guess he put this page up during that
> period.  I went ahead and deleted it, since it conflicts with the
> official docstring standard.
> 
>  > but I think the current thinking is this:
>  >
>  > http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines
> 
>  >
>  > Is that correct? Does the first page apply to matplotlib in some way?
>  > Should we change the first page to match the second now?
> 
> Yes.  That page is autogenerated from the official coding standard
> that is in the numpy trunk.  One of the nice features of trac is that
> it can render restructured text from the svn repository.  This helps
> us keep from having duplicate information that needs to be kept in
> sync by hand.
> 
> 
> If I hit the source code link in the generated  html, it looks like that 
> page was generated from the old document format. The new document format 
> doesn't produce output that looks anything like that and epydoc 
> generates a couple of warnings:
> 
> | File /home/charris/workspace/numpy/numpy/doc/example.py, line 19, in
> | example.foo
> |   Warning: Line 24: Wrong underline character for heading.
> |   Warning: Lines 27, 30, 32, 37, 39, 41, 43, 48, 50: Improper paragraph
> |indentation.
> 
> The new document format requires a preprocessor that has yet to be written.

epydoc from SVN works just fine once the following line is added at the top.

__docformat__ = "restructuredtext en"

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
  that is made terrible by our own mad attempt to interpret it as though it had
  an underlying truth."
   -- Umberto Eco


Re: [Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread Jarrod Millman
On Jan 21, 2008 7:56 PM, Charles R Harris <[EMAIL PROTECTED]> wrote:
> On Jan 21, 2008 8:38 PM, Charles R Harris <[EMAIL PROTECTED]> wrote:
> > If I hit the source code link in the generated  html, it looks like that
> page was generated from the old document format. The new document format
> doesn't produce output that looks anything like that and epydoc generates a
> couple of warnings:
> >
> > | File /home/charris/workspace/numpy/numpy/doc/example.py, line 19, in
> > | example.foo
> > |   Warning: Line 24: Wrong underline character for heading.
> > |   Warning: Lines 27, 30, 32, 37, 39, 41, 43, 48, 50: Improper paragraph
> > |indentation.
> >
> > The new document format requires a preprocessor that has yet to be
> written.
>
> Since epydoc also works for compiled modules, I think the easiest way to do
> that is to fork epydoc. The syntax of the new markup is context sensitive -
> single types are no longer in {} - so parsing will be a bit more
> complicated. As ReST is not part of the current document markup, we might
> consider removing the parts of the document documentation that refer to it.

Hey Chuck,

I am sorry that there has been so much confusion over the docstring
standards.  As you know, the problems you are pointing out arose from
the changes Travis made in December.  I added a warning to the top of
the coding style guideline explaining the situation:
http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines
Once Travis writes a tool to render the docstrings, we can go ahead
and update the instructions.

Thanks,

-- 
Jarrod Millman
Computational Infrastructure for Research Labs
10 Giannini Hall, UC Berkeley
phone: 510.643.4014
http://cirl.berkeley.edu/


Re: [Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread Robert Kern
David Cournapeau wrote:
> Nobody likes to write doc, especially if you have to follow many rules.
> Not being able to see the result does not help. I don't know much
> about the current documentation tools situation with python, but I have
> seen good, automatically generated doc for python modules: for
> example, the current development doc of python
> (http://docs.python.org/dev/) looks pretty good to me, and using the
> same tools as python makes more sense than forking our own, no? Or am
> I missing something fundamental?

Yes. Sphinx does not do automatically generated documentation.

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
  that is made terrible by our own mad attempt to interpret it as though it had
  an underlying truth."
   -- Umberto Eco


Re: [Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread David Cournapeau
On Jan 22, 2008 12:56 PM, Charles R Harris <[EMAIL PROTECTED]> wrote:
>
>
>
>
> On Jan 21, 2008 8:38 PM, Charles R Harris <[EMAIL PROTECTED]> wrote:
> >
> >
> >
> >
> > On Jan 21, 2008 6:09 PM, Jarrod Millman <[EMAIL PROTECTED]> wrote:
> >
> > >
> > > On Jan 21, 2008 2:03 PM, Matthew Brett <[EMAIL PROTECTED]> wrote:
> > > > Search for the docstring standard, I hit this:
> > > >
> > > > http://www.scipy.org/DocstringStandard
> > >
> > > Good catch, I didn't know this page existed.  If I recall correctly,
> > > Keir Mierle showed up on the mailing list around the time we were
> > > discussing the docstring standard.  He proposed to do some work, but
> > > then must have gotten busy with something else.  In particular, I
> > > believe he was interested in seeing a unified docstring standard for
> > > numpy, scipy, and matplotlib.  I guess he put this page up during that
> > > period.  I went ahead and deleted it, since it conflicts with the
> > > official docstring standard.
> > >
> > >
> > > > but I think the current thinking is this:
> > > >
> > > > http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines
> > > >
> > > > Is that correct? Does the first page apply to matplotlib in some way?
> > > > Should we change the first page to match the second now?
> > >
> > > Yes.  That page is autogenerated from the official coding standard
> > > that is in the numpy trunk.  One of the nice features of trac is that
> > > it can render restructured text from the svn repository.  This helps
> > > us keep from having duplicate information that needs to be kept in
> > > sync by hand.
> > >
> >
> >
> > If I hit the source code link in the generated  html, it looks like that
> page was generated from the old document format. The new document format
> doesn't produce output that looks anything like that and epydoc generates a
> couple of warnings:
> >
> > | File /home/charris/workspace/numpy/numpy/doc/example.py, line 19, in
> > | example.foo
> > |   Warning: Line 24: Wrong underline character for heading.
> > |   Warning: Lines 27, 30, 32, 37, 39, 41, 43, 48, 50: Improper paragraph
> > |indentation.
> >
> >
> > The new document format requires a preprocessor that has yet to be
> written.
> >
>
> Since epydoc also works for compiled modules, I think the easiest way to do
> that is to fork epydoc. The syntax of the new markup is context sensitive -
> single types are no longer in {} - so parsing will be a bit more
> complicated. As ReST is not part of the current document markup, we might
> consider removing the parts of the document documentation that refer to it.
>
I must confess I don't get these discussions on docstrings - at all. I
think there has been some discussion on this for almost a year, and
I have never managed to "compile" a full doc for numpy or scipy.

Nobody likes to write doc, especially if you have to follow many rules.
Not being able to see the result does not help. I don't know much
about the current documentation tools situation with python, but I have
seen good, automatically generated doc for python modules: for
example, the current development doc of python
(http://docs.python.org/dev/) looks pretty good to me, and using the
same tools as python makes more sense than forking our own, no? Or am
I missing something fundamental?

cheers,

David


Re: [Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread Charles R Harris
On Jan 21, 2008 8:38 PM, Charles R Harris <[EMAIL PROTECTED]> wrote:

>
>
> On Jan 21, 2008 6:09 PM, Jarrod Millman <[EMAIL PROTECTED]> wrote:
>
> > On Jan 21, 2008 2:03 PM, Matthew Brett <[EMAIL PROTECTED]> wrote:
> > > Search for the docstring standard, I hit this:
> > >
> > > http://www.scipy.org/DocstringStandard
> >
> > Good catch, I didn't know this page existed.  If I recall correctly,
> > Keir Mierle showed up on the mailing list around the time we were
> > discussing the docstring standard.  He proposed to do some work, but
> > then must have gotten busy with something else.  In particular, I
> > believe he was interested in seeing a unified docstring standard for
> > numpy, scipy, and matplotlib.  I guess he put this page up during that
> > period.  I went ahead and deleted it, since it conflicts with the
> > official docstring standard.
> >
> > > but I think the current thinking is this:
> > >
> > > http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines
> > >
> > > Is that correct? Does the first page apply to matplotlib in some way?
> > > Should we change the first page to match the second now?
> >
> > Yes.  That page is autogenerated from the official coding standard
> > that is in the numpy trunk.  One of the nice features of trac is that
> > it can render restructured text from the svn repository.  This helps
> > us keep from having duplicate information that needs to be kept in
> > sync by hand.
> >
>
> If I hit the source code link in the generated  html, it looks like that
> page was generated from the old document format. The new document format
> doesn't produce output that looks anything like that and epydoc generates a
> couple of warnings:
>
> | File /home/charris/workspace/numpy/numpy/doc/example.py, line 19, in
> | example.foo
> |   Warning: Line 24: Wrong underline character for heading.
> |   Warning: Lines 27, 30, 32, 37, 39, 41, 43, 48, 50: Improper paragraph
> |indentation.
>
> The new document format requires a preprocessor that has yet to be
> written.
>

Since epydoc also works for compiled modules, I think the easiest way to do
that is to fork epydoc. The syntax of the new markup is context sensitive -
single types are no longer in {} - so parsing will be a bit more
complicated. As ReST is not part of the current document markup, we might
consider removing the parts of the document documentation that refer to it.

Chuck


Re: [Numpy-discussion] maskedarray bug?

2008-01-21 Thread Eric Firing
Pierre GM wrote:
> On Sunday 20 January 2008 16:32:40 you wrote:
>> Pierre,
>>
>> The attached script shows how one can make a masked array with different
>> dimensions for the mask than for the data.  I presume this is a bug.  It
>> is triggered by the present code in the matplotlib quiver function.
> 
> Yep, bug indeed. Thanks for pointing that out ! 
> The following patch should take care of the problem. (In short, a getmask 
> function was used instead of getmaskarray).
> Note that the patch takes also into account elements I had sent to Stefan 2 
> weeks ago, that were not ported yet to the SVN: I still can't commit to the 
> numpy/branches/maskedarray.
> Let me know how it works, and thanks again for your time.

Pierre,

Thank you, it works fine.

Eric


Re: [Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread Charles R Harris
On Jan 21, 2008 6:09 PM, Jarrod Millman <[EMAIL PROTECTED]> wrote:

> On Jan 21, 2008 2:03 PM, Matthew Brett <[EMAIL PROTECTED]> wrote:
> > Search for the docstring standard, I hit this:
> >
> > http://www.scipy.org/DocstringStandard
>
> Good catch, I didn't know this page existed.  If I recall correctly,
> Keir Mierle showed up on the mailing list around the time we were
> discussing the docstring standard.  He proposed to do some work, but
> then must have gotten busy with something else.  In particular, I
> believe he was interested in seeing a unified docstring standard for
> numpy, scipy, and matplotlib.  I guess he put this page up during that
> period.  I went ahead and deleted it, since it conflicts with the
> official docstring standard.
>
> > but I think the current thinking is this:
> >
> > http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines
> >
> > Is that correct? Does the first page apply to matplotlib in some way?
> > Should we change the first page to match the second now?
>
> Yes.  That page is autogenerated from the official coding standard
> that is in the numpy trunk.  One of the nice features of trac is that
> it can render restructured text from the svn repository.  This helps
> us keep from having duplicate information that needs to be kept in
> sync by hand.
>

If I hit the source code link in the generated html, it looks like that
page was generated from the old document format. The new document format
doesn't produce output that looks anything like that and epydoc generates a
couple of warnings:

| File /home/charris/workspace/numpy/numpy/doc/example.py, line 19, in
| example.foo
|   Warning: Line 24: Wrong underline character for heading.
|   Warning: Lines 27, 30, 32, 37, 39, 41, 43, 48, 50: Improper paragraph
|indentation.

The new document format requires a preprocessor that has yet to be written.

Chuck


Re: [Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread Jarrod Millman
On Jan 21, 2008 2:03 PM, Matthew Brett <[EMAIL PROTECTED]> wrote:
> Search for the docstring standard, I hit this:
>
> http://www.scipy.org/DocstringStandard

Good catch, I didn't know this page existed.  If I recall correctly,
Keir Mierle showed up on the mailing list around the time we were
discussing the docstring standard.  He proposed to do some work, but
then must have gotten busy with something else.  In particular, I
believe he was interested in seeing a unified docstring standard for
numpy, scipy, and matplotlib.  I guess he put this page up during that
period.  I went ahead and deleted it, since it conflicts with the
official docstring standard.

> but I think the current thinking is this:
>
> http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines
>
> Is that correct? Does the first page apply to matplotlib in some way?
> Should we change the first page to match the second now?

Yes.  That page is autogenerated from the official coding standard
that is in the numpy trunk.  One of the nice features of trac is that
it can render restructured text from the svn repository.  This helps
us keep from having duplicate information that needs to be kept in
sync by hand.

Thanks,

-- 
Jarrod Millman
Computational Infrastructure for Research Labs
10 Giannini Hall, UC Berkeley
phone: 510.643.4014
http://cirl.berkeley.edu/


Re: [Numpy-discussion] Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)

2008-01-21 Thread Arnar Flatberg
Hi Theodore

Probably not the fastest, but here is a full example of how you might vectorize
your loop. I ran just one test, and it sped up the original code.

Example:


from numpy import empty_like, uint8

def vectorized_rgb2hsv(im):
    """im is a (m, n, 3) array."""
    im = im/255.
    out = empty_like(im)
    im_max = im.max(-1)
    delta = im.ptp(-1)
    s = delta/im_max
    s[delta==0] = 0
    index = im[:,:,0] == im_max # red is max
    out[index, 0] = (im[index, 1] - im[index, 2]) / delta[index]
    index = im[:,:,1] == im_max # green is max
    out[index, 0] = 2 + (im[index, 2] - im[index, 0]) / delta[index]
    index = im[:,:,2] == im_max # blue is max
    out[index, 0] = 4 + (im[index, 0] - im[index, 1]) / delta[index]
    out[:,:,0] = (out[:,:,0]/6.0) % 1.0
    out[:,:,1] = s
    out[:,:,2] = im_max
    out = (255.*out).astype(uint8)
    return out
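The speedup rests on three NumPy idioms used above: reducing over the last (color) axis, taking the per-pixel peak-to-peak range, and assigning through boolean masks. A tiny self-contained illustration of those pieces (mine, not part of the original message):

```python
import numpy as np

# A 1x2 "image": one red-dominant pixel and one gray pixel.
im = np.array([[[200.0, 30.0, 30.0], [90.0, 90.0, 90.0]]])

im_max = im.max(-1)          # per-pixel maximum over the color axis
delta = np.ptp(im, axis=-1)  # per-pixel max - min (the "chroma")

# Boolean-mask assignment touches only the selected pixels; the
# out[index, 0] = ... lines in the function rely on the same idiom.
s = np.zeros_like(im_max)
nonzero = delta > 0
s[nonzero] = delta[nonzero] / im_max[nonzero]  # saturation, 0 for gray
```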


Timings (shape: 1202, 800, 3):
-
code in first post:
%prun a = rgb2hsv.rgb2hsv(s)
 1923207 function calls in 18.423 CPU seconds

removing loop:
%prun b = rgb2hsv.vectorized_rgb2hsv(s)
 8 function calls in 4.630 CPU seconds


Arnar


[Numpy-discussion] Docstring page, out of date?

2008-01-21 Thread Matthew Brett
Hi,

Searching for the docstring standard, I hit this:

http://www.scipy.org/DocstringStandard

but I think the current thinking is this:

http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines

Is that correct? Does the first page apply to matplotlib in some way?
Should we change the first page to match the second now?

Matthew


Re: [Numpy-discussion] maskedarray bug?

2008-01-21 Thread Pierre GM
On Sunday 20 January 2008 16:32:40 you wrote:
> Pierre,
>
> The attached script shows how one can make a masked array with different
> dimensions for the mask than for the data.  I presume this is a bug.  It
> is triggered by the present code in the matplotlib quiver function.

Yep, bug indeed. Thanks for pointing that out!
The following patch should take care of the problem. (In short, a getmask
function was used instead of getmaskarray.)
Note that the patch also takes into account elements I had sent to Stefan 2
weeks ago that were not yet ported to SVN: I still can't commit to the
numpy/branches/maskedarray.
Let me know how it works, and thanks again for your time.
P.
Index: /home/backtopop/workspace/numpy_maskedarray/numpy/ma/core.py
===
--- /home/backtopop/workspace/numpy_maskedarray/numpy/ma/core.py	(revision 4737)
+++ /home/backtopop/workspace/numpy_maskedarray/numpy/ma/core.py	(working copy)
@@ -14,42 +14,47 @@
 
 :author: Pierre Gerard-Marchant
 :contact: pierregm_at_uga_dot_edu
+:version: $Id: core.py 341 2007-12-25 16:23:05Z backtopop $
 """
-__author__ = "Pierre GF Gerard-Marchant"
+__author__ = "Pierre GF Gerard-Marchant ($Author: backtopop $)"
+__version__ = '1.0'
+__revision__ = "$Revision: 341 $"
+__date__ = '$Date: 2007-12-25 11:23:05 -0500 (Tue, 25 Dec 2007) $'
+
 __docformat__ = "restructuredtext en"
 
 __all__ = ['MAError', 'MaskType', 'MaskedArray',
'bool_', 'complex_', 'float_', 'int_', 'object_',
'abs', 'absolute', 'add', 'all', 'allclose', 'allequal', 'alltrue',
-   'amax', 'amin', 'anom', 'anomalies', 'any', 'arange',
-   'arccos', 'arccosh', 'arcsin', 'arcsinh', 'arctan', 'arctan2',
-   'arctanh', 'argmax', 'argmin', 'argsort', 'around',
-   'array', 'asarray','asanyarray',
+   'amax', 'amin', 'anom', 'anomalies', 'any', 'arange',
+   'arccos', 'arccosh', 'arcsin', 'arcsinh', 'arctan', 'arctan2',
+   'arctanh', 'argmax', 'argmin', 'argsort', 'around',
+   'array', 'asarray','asanyarray',
'bitwise_and', 'bitwise_or', 'bitwise_xor',
'ceil', 'choose', 'compressed', 'concatenate', 'conjugate',
-   'cos', 'cosh', 'count',
+   'cos', 'cosh', 'count',
'default_fill_value', 'diagonal', 'divide', 'dump', 'dumps',
'empty', 'empty_like', 'equal', 'exp',
'fabs', 'fmod', 'filled', 'floor', 'floor_divide','fix_invalid',
'getmask', 'getmaskarray', 'greater', 'greater_equal', 'hypot',
'ids', 'inner', 'innerproduct',
-   'isMA', 'isMaskedArray', 'is_mask', 'is_masked', 'isarray',
+   'isMA', 'isMaskedArray', 'is_mask', 'is_masked', 'isarray',
'left_shift', 'less', 'less_equal', 'load', 'loads', 'log', 'log10',
-   'logical_and', 'logical_not', 'logical_or', 'logical_xor',
+   'logical_and', 'logical_not', 'logical_or', 'logical_xor',
'make_mask', 'make_mask_none', 'mask_or', 'masked',
-   'masked_array', 'masked_equal', 'masked_greater',
-   'masked_greater_equal', 'masked_inside', 'masked_less',
-   'masked_less_equal', 'masked_not_equal', 'masked_object',
-   'masked_outside', 'masked_print_option', 'masked_singleton',
-   'masked_values', 'masked_where', 'max', 'maximum', 'mean', 'min',
-   'minimum', 'multiply',
+   'masked_array', 'masked_equal', 'masked_greater',
+   'masked_greater_equal', 'masked_inside', 'masked_less',
+   'masked_less_equal', 'masked_not_equal', 'masked_object',
+   'masked_outside', 'masked_print_option', 'masked_singleton',
+   'masked_values', 'masked_where', 'max', 'maximum', 'mean', 'min',
+   'minimum', 'multiply',
'negative', 'nomask', 'nonzero', 'not_equal',
'ones', 'outer', 'outerproduct',
'power', 'product', 'ptp', 'put', 'putmask',
'rank', 'ravel', 'remainder', 'repeat', 'reshape', 'resize',
-   'right_shift', 'round_',
+   'right_shift', 'round_',
'shape', 'sin', 'sinh', 'size', 'sometrue', 'sort', 'sqrt', 'std',
-   'subtract', 'sum', 'swapaxes',
+   'subtract', 'sum', 'swapaxes',
'take', 'tan', 'tanh', 'transpose', 'true_divide',
'var', 'where',
'zeros']
@@ -178,6 +183,26 @@
 else:
 raise TypeError, 'Unsuitable type for calculating minimum.'
 
+
+def _check_fill_value(fill_value, dtype):
+descr = numpy.dtype(dtype).descr
+if fill_value is None:
+if len(descr) > 1:
+fill_value = [default_fill_value(numeric.dtype(d[1])) 
+  for d in descr]
+else:
+fill_value = default_fill_value(dtype)
+else:
+fval = numpy.resize(narray(fill_value,copy=False,dtype=object_),
+
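For readers who do not want to dig through the patch, the getmask/getmaskarray distinction it fixes can be seen directly (my illustration, not part of the patch):

```python
import numpy.ma as ma

x = ma.array([1, 2, 3])  # no masked values

# getmask returns the special `nomask` sentinel when nothing is masked...
print(ma.getmask(x) is ma.nomask)  # True

# ...while getmaskarray always returns a boolean array matching the
# data's shape, which is what shape-sensitive code has to use.
print(ma.getmaskarray(x))  # [False False False]
```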

Re: [Numpy-discussion] Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)

2008-01-21 Thread Bruce Southey
Hi,
I don't know the area, but following your code I would suggest the
completely untested code below. I am not sure about the conditions to get h,
so I leave it to you to get the correct h and transform it appropriately.
Bruce

# First create a 2-dimensional array/matrix (number of pixels by three):
# 'rows' are pixels and one 'column' for each of the r, g, b values, say
# RGB from the image

# Compute v ie maximum across r, g and b for each pixel
v=RGB.max(axis=1)

#compute range between high and low values for each pixel
maxmin=v-RGB.min(axis=1)

#compute s
s=maxmin/v

#Compute h
hr=(RGB[:,1]-RGB[:,2])/maxmin
hg=2.0+(RGB[:,2]-RGB[:,0])/maxmin
hb=4.0+(RGB[:,0]-RGB[:,1])/maxmin

#Some code to get h

#Final output
HSV=255.0*numpy.column_stack((h,s,v))
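Bruce leaves the selection of h open. One way to fill that gap (my sketch, untested against his data and not from the message) is chained numpy.where calls mirroring the colorsys branch order:

```python
import numpy as np

# Tiny stand-in for the RGB matrix in the message: rows are pixels.
RGB = np.array([[200.0, 30.0, 30.0],   # red dominant
                [30.0, 200.0, 30.0]])  # green dominant

v = RGB.max(axis=1)
maxmin = v - RGB.min(axis=1)

# Per-channel hue candidates, as in the original post.
hr = (RGB[:, 1] - RGB[:, 2]) / maxmin
hg = 2.0 + (RGB[:, 2] - RGB[:, 0]) / maxmin
hb = 4.0 + (RGB[:, 0] - RGB[:, 1]) / maxmin

# Pick the candidate for whichever channel holds the maximum, then
# fold into [0, 1) exactly as colorsys.rgb_to_hsv does.
h = np.where(RGB[:, 0] == v, hr, np.where(RGB[:, 1] == v, hg, hb))
h = (h / 6.0) % 1.0
```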

On Jan 21, 2008 8:39 AM, theodore test <[EMAIL PROTECTED]> wrote:
> Hello all,
>
> I'm scratching my head over how to make this image color space conversion
> from "RGB" to "HSV" quicker.  It requires input from all three bands of a
> given pixel at each pixel, and thus can't be easily flattened.  I've already
> implemented psyco and removed the extra step of calling colorsys's
> rgb_to_hsv function by adapting the code into my primary for loop.
>
> Right now it takes around 9 seconds for a single 1600x1200 RGB image, a
> conversion that I've seen implemented more or less instantly in, say,
> ImageJ.  What can I do to make this conversion more efficient?
>
> Thank you in advance,
> Theodore Test
>
> 
>
>
> #Necessary imports and hand-waving at psyco usage.
> import numpy
> from numpy import *
> from scipy.misc import pilutil
>
> import psyco
> psyco.full()
>
>
> # Read image file, cram it into a normalized array with one row per pixel
> # with each column holding the given pixel's R,G, or B band-value.
> s = pilutil.imread('A_Biggish_Image.tif')
> r,c,d = s.shape
>
> l = r*c
> t = s.reshape(l,3)
> t = t/255.0
>
>
> # Cycle through all of the pixels, converting them to HSV space and putting
> them
> # back into the array.  This is the big time-waster.
> #
> # The conversion is adapted directly from colorsys.rgb_to_hsv to reduce call
> time
> for x in range(0,l):
>     tr = float(t[x,0])
>     tg = float(t[x,1])
>     tb = float(t[x,2])
>     maxc = max(tr, tg, tb)
>     minc = min(tr, tg, tb)
>     v = maxc
>     if minc == maxc:
>         t[x,0],t[x,1],t[x,2] = 0.0, 0.0, v
>     else:
>         s = (maxc-minc) / maxc
>         rc = (maxc-tr) / (maxc-minc)
>         gc = (maxc-tg) / (maxc-minc)
>         bc = (maxc-tb) / (maxc-minc)
>         if tr == maxc: h = bc-gc
>         elif tg == maxc: h = 2.0+rc-bc
>         else: h = 4.0+gc-rc
>         h = (h/6.0) % 1.0
>         t[x,0],t[x,1],t[x,2] = h, s, v
>
> # Renormalize shape and contents to image-file standards.
> t = t*255.0
> t = t.astype(uint8)
> t = t.reshape(r,c,d)
>
> finaloutputarray = t
>
> 
>


Re: [Numpy-discussion] def of var of complex

2008-01-21 Thread Robert Kern
Neal Becker wrote:
> Neal Becker wrote:
> 
>> I noticed that if I generate complex rv i.i.d. with var=1, that numpy
>> says:
>>
>> var () -> (close to 1.0)
>> var () -> (close to 1.0)
>>
>> but
>>
>> var (complex array) -> (close to complex 0)
>>
>> Is that not a strange definition?
> 
> I don't think there is any ambiguity about the definition of the variance of
> complex.
> 
> Var(x) = E{(x-E[x])^2} = E{x^2} - (E{x})^2

That's currently what's implemented, but there is simply no evidence that
anyone ever uses this definition for complex random variables. Note that for
real variables,

   E{(x-E[x])^2} = E{|x-E[x]|^2}

but for complex variables, there is a large difference. Since the || are 
superfluous with real variables, probability texts rarely include them unless 
if 
they are then going on to talk about complex variables. If you want to extend 
the definition for real variables to complex variables, that is an ambiguity 
you 
have to resolve.

There is, apparently, a large body of statistical signal processing literature 
that defines the complex variance as

   E{|z-E[z]|^2}

so, I (now) believe that this is what should be implemented.
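For concreteness, here is a quick numerical check of the two candidate
definitions on circularly symmetric complex noise (variable names are mine,
just for illustration):

```python
import numpy as np

# Circularly symmetric complex Gaussian noise with unit total variance:
# half the power in the real part, half in the imaginary part.
rng = np.random.RandomState(0)
n = 100000
z = (rng.randn(n) + 1j * rng.randn(n)) / np.sqrt(2)

d = z - z.mean()
var_sq = (d ** 2).mean()           # E{(z-E[z])^2}: complex, close to 0
var_abs = (np.abs(d) ** 2).mean()  # E{|z-E[z]|^2}: real, close to 1

print(var_sq)    # near 0+0j -- carries no information about the power of z
print(var_abs)   # near 1.0
```

For this kind of signal the E{(z-E[z])^2} definition averages to zero no
matter how strong the noise is, which is why the |.| form is the useful one.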

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
  that is made terrible by our own mad attempt to interpret it as though it had
  an underlying truth."
   -- Umberto Eco


Re: [Numpy-discussion] compiling c extension

2008-01-21 Thread Lou Pecora
Did you include arrayobject.h and  call import_array()
in the initialization function, after the call to
Py_InitModule() ?
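For reference, undefined references to both `PyArray_API` and `import_array`
at link time usually mean that numpy/arrayobject.h was never included in the
wrapper source: the compiler then treats them as external symbols. A minimal
Python 2-era module skeleton (module name made up for illustration) looks
roughly like:

```
#include <Python.h>
#include <numpy/arrayobject.h>   /* defines PyArray_API and import_array() */

static PyMethodDef mymod_methods[] = {
    {NULL, NULL, 0, NULL}        /* sentinel */
};

PyMODINIT_FUNC initmymod(void)
{
    (void) Py_InitModule("mymod", mymod_methods);
    import_array();  /* must run before any PyArray_* call is made */
}
```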

--- Danny Chan <[EMAIL PROTECTED]> wrote:

> Hi!
> I am trying to compile a c extension that uses numpy
> arrays on windows. I can compile the extension file,
> but once I get to the linking stage I get unresolved
> references to PyArray_API and import_array. Can
> anyone help me with the correct linker flags? 
> 
> Thanks, Danny

[cut]
> more undefined references to `PyArray_API' follow
>
build\temp.win32-2.5\Release\.\python\libaid_wrap.o:libaid_wrap.c:(.text+0xc216):
> undefined reference to `import_array'
> collect2: ld returned 1 exit status
> error: command 'gcc' failed with exit status 1



-- Lou Pecora,   my views are my own.


  



[Numpy-discussion] Do not use -fomit-frame-pointer with mingw !

2008-01-21 Thread David Cournapeau
Hi,

I don't know if this is known, but since I wasted half a day on it, 
I thought I could avoid the pain for someone else: do not build numpy 
with -fomit-frame-pointer with mingw (windows), it will crash. I don't 
know if this just cannot be used, if this is a mingw bug, or something 
on our side. But you do not want to look for a bug caused by using this 
option.

cheers,

David


[Numpy-discussion] compiling c extension

2008-01-21 Thread Danny Chan
Hi!
I am trying to compile a c extension that uses numpy arrays on windows. I can 
compile the extension file, but once I get to the linking stage I get 
unresolved references to PyArray_API and import_array. Can anyone help me with 
the correct linker flags? 

Thanks, Danny



running build
running build_py
file libaid.py (for module libaid) not found
file libaid.py (for module libaid) not found
running build_ext
building '_libaid' extension
writing build\temp.win32-2.5\Release\.\python\_libaid.def
x:\eda\mingw\5.1.3\bin\gcc.exe -mno-cygwin -shared -s 
build\temp.win32-2.5\Release\.\python\libaid_wrap.o 
build\temp.win32-2.5\Release\.\python\_libaid.def -L../../build/win32/bin 
-Lx:\eda\python\windows\libs -Lx:\eda\python\windows\PCBuild -laid -lpython25 
-lmsvcr71 -o build\lib.win32-2.5\_libaid.pyd
build\temp.win32-2.5\Release\.\python\libaid_wrap.o:libaid_wrap.c:(.text+0x2602):
 undefined reference to `PyArray_API'
build\temp.win32-2.5\Release\.\python\libaid_wrap.o:libaid_wrap.c:(.text+0x2635):
 undefined reference to `PyArray_API'
build\temp.win32-2.5\Release\.\python\libaid_wrap.o:libaid_wrap.c:(.text+0x264f):
 undefined reference to `PyArray_API'
build\temp.win32-2.5\Release\.\python\libaid_wrap.o:libaid_wrap.c:(.text+0x2709):
 undefined reference to `PyArray_API'
build\temp.win32-2.5\Release\.\python\libaid_wrap.o:libaid_wrap.c:(.text+0x2752):
 undefined reference to `PyArray_API'
build\temp.win32-2.5\Release\.\python\libaid_wrap.o:libaid_wrap.c:(.text+0x2784):
 more undefined references to `PyArray_API' follow
build\temp.win32-2.5\Release\.\python\libaid_wrap.o:libaid_wrap.c:(.text+0xc216):
 undefined reference to `import_array'
collect2: ld returned 1 exit status
error: command 'gcc' failed with exit status 1

   


[Numpy-discussion] Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)

2008-01-21 Thread James A. Bednar
|  Date: Mon, 21 Jan 2008 09:39:21 -0500
|  From: "theodore test" <[EMAIL PROTECTED]>
|  
|  Hello all,
|  
|  I'm scratching my head over how to make this image color space
|  conversion from "RGB" to "HSV" quicker.  It requires input from all
|  three bands of a given pixel at each pixel, and thus can't be
|  easily flattened.  I've already implemented psyco and removed the
|  extra step of calling colorsys's rgb_to_hsv function by adapting
|  the code into my primary for loop.
|  
|  Right now it takes around 9 seconds for a single 1600x1200 RGB image, a
|  conversion that I've seen implemented more or less instantly in, say,
|  ImageJ.  What can I do to make this conversion more efficient?

I would love to see a decent implementation of that!  We only have versions
quite similar to yours.  We've had a to-do list item to
reimplement rgb_to_hsv (and its converse hsv_to_rgb) in C using weave,
but haven't ever gotten around to it.

I believe the best solution would be for PIL to be modified to accept
an image of type HSV; it already accepts four others, including the
non-obvious CMYK:

  m.mode => string
  Image mode. This is a string specifying the pixel format used by the
  image. Typical values are "1", "L", "RGB", or "CMYK." 

Adding HSV to this list would seem quite reasonable, and would allow
simple conversion between the various color spaces.  But I don't know
anything about the internals of the PIL code, to know whether that
would indeed make sense.  

Jim
 


Re: [Numpy-discussion] numpy.distutils does not output compilation warning on win32 ?

2008-01-21 Thread David Cournapeau
Pearu Peterson wrote:
> Hi,
>
> If I remember correctly then the warnings were disabled because
> when compiling numpy/scipy on windows there were *lots* of
> warnings, especially for pyrex generated sources.
>   
Yes, that's certainly one of the thing I intend to improve with numscons 
:) (as an aside, I also hope that if numscons is adopted, we will be 
able to set more warnings for many extensions).

Thanks for the info.

cheers,

David


Re: [Numpy-discussion] Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)

2008-01-21 Thread Hans Meine
On Monday, 21 January 2008 15:39:21, theodore test wrote:
> Right now it takes around 9 seconds for a single 1600x1200 RGB image, a
> conversion that I've seen implemented more or less instantly in, say,
> ImageJ.  What can I do to make this conversion more efficient?

You should "just remove" the loop.  Try to convert all operations to 
elementwise array/matrix operations, which are very fast.  In other words, 
make the algorithm work on all pixels in parallel.  The only tricky part is the 
conditional operation, but you can emulate that with the binary operators and 
e.g. put().
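To make that concrete, here is a loop-free sketch of the conversion from the
original post (function name and layout are my own; it assumes an (N, 3)
float array with values in [0, 1]):

```python
import numpy as np

def rgb_to_hsv(rgb):
    """Vectorized RGB -> HSV for an (N, 3) float array with values in [0, 1]."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    maxc = rgb.max(axis=1)
    minc = rgb.min(axis=1)
    v = maxc
    delta = maxc - minc
    # Guard the divisions; gray pixels (delta == 0) get h = s = 0 below.
    safe_delta = np.where(delta == 0.0, 1.0, delta)
    safe_max = np.where(maxc == 0.0, 1.0, maxc)
    s = np.where(maxc == 0.0, 0.0, delta / safe_max)
    rc = (maxc - r) / safe_delta
    gc = (maxc - g) / safe_delta
    bc = (maxc - b) / safe_delta
    # Same branch structure as colorsys.rgb_to_hsv, applied elementwise.
    h = np.where(maxc == r, bc - gc,
                 np.where(maxc == g, 2.0 + rc - bc, 4.0 + gc - rc))
    h = np.where(delta == 0.0, 0.0, (h / 6.0) % 1.0)
    return np.column_stack((h, s, v))
```

On a 1600x1200 image this replaces roughly two million Python-level loop
iterations with a handful of whole-array operations, which is where the
speedup comes from.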

-- 
Ciao, /  /
 /--/
/  / ANS


[Numpy-discussion] Bus Error with object arrays on big endian system

2008-01-21 Thread Christopher Hanley
Hi,

The following will cause a bus error on a big endian machine (Solaris 10 
Sun in this case):

> Python 2.5.1 (r251:54863, Jun 29 2007, 15:29:55) [C] on sunos5
> Type "help", "copyright", "credits" or "license" for more information.
> >>> import numpy
> >>> o = numpy.ndarray(shape=3, dtype=[('SEGMENT', '|S4'), ('SPEC_FOUND', '|i1')])
> >>> o1 = o.getfield(numpy.dtype('|S4'), 0)
> >>> print o1[0]
> UXÐ
> >>> print o1[1]
> 4
> >>> print o1[2]
> NT
> >>> print o1
> Bus error (core dumped)

There are no issues on Linux or Mac OS X Intel based systems.

This example was done on the latest svn version of numpy (r1.0.5.dev47360).

Does anyone have an idea of what may have broken?

Thank you for your time and help,

Chris



-- 
Christopher Hanley
Systems Software Engineer
Space Telescope Science Institute
3700 San Martin Drive
Baltimore MD, 21218
(410) 338-4338


[Numpy-discussion] Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)

2008-01-21 Thread theodore test
Hello all,

I'm scratching my head over how to make this image color space conversion
from "RGB" to "HSV" quicker.  It requires input from all three bands of a
given pixel at each pixel, and thus can't be easily flattened.  I've already
implemented psyco and removed the extra step of calling colorsys's
rgb_to_hsv function by adapting the code into my primary for loop.

Right now it takes around 9 seconds for a single 1600x1200 RGB image, a
conversion that I've seen implemented more or less instantly in, say,
ImageJ.  What can I do to make this conversion more efficient?

Thank you in advance,
Theodore Test




#Necessary imports and hand-waving at psyco usage.
import numpy
from numpy import *
from scipy.misc import pilutil

import psyco
psyco.full()


# Read image file, cram it into a normalized array with one row per pixel
# with each column holding the given pixel's R,G, or B band-value.
s = pilutil.imread('A_Biggish_Image.tif')
r,c,d = s.shape

l = r*c
t = s.reshape(l,3)
t = t/255.0


# Cycle through all of the pixels, converting them to HSV space and putting
# them back into the array.  This is the big time-waster.
#
# The conversion is adapted directly from colorsys.rgb_to_hsv to reduce call
# time.
for x in range(0,l):
    tr = float(t[x,0])
    tg = float(t[x,1])
    tb = float(t[x,2])
    maxc = max(tr, tg, tb)
    minc = min(tr, tg, tb)
    v = maxc
    if minc == maxc:
        t[x,0],t[x,1],t[x,2] = 0.0, 0.0, v
    else:
        s = (maxc-minc) / maxc
        rc = (maxc-tr) / (maxc-minc)
        gc = (maxc-tg) / (maxc-minc)
        bc = (maxc-tb) / (maxc-minc)
        if tr == maxc: h = bc-gc
        elif tg == maxc: h = 2.0+rc-bc
        else: h = 4.0+gc-rc
        h = (h/6.0) % 1.0
        t[x,0],t[x,1],t[x,2] = h, s, v

# Renormalize shape and contents to image-file standards.
t = t*255.0
t = t.astype(uint8)
t = t.reshape(r,c,d)

finaloutputarray = t




Re: [Numpy-discussion] def of var of complex

2008-01-21 Thread Neal Becker
Neal Becker wrote:

> I noticed that if I generate complex rv i.i.d. with var=1, that numpy
> says:
> 
> var () -> (close to 1.0)
> var () -> (close to 1.0)
> 
> but
> 
> var (complex array) -> (close to complex 0)
> 
> Is that not a strange definition?

I don't think there is any ambiguity about the definition of the variance of
complex.

Var(x) = E{(x-E[x])^2} = E{x^2} - (E{x})^2

An estimator for this:

E[x^n] \approx (1/M) \sum_i x_i^n



Re: [Numpy-discussion] numpy.distutils does not output compilation warning on win32 ?

2008-01-21 Thread Pearu Peterson
Hi,

If I remember correctly then the warnings were disabled because
when compiling numpy/scipy on windows there were *lots* of
warnings, especially for pyrex generated sources.
When there is an error, all warnings will be shown. Hmm, and on
Linux the warnings should also be shown (this actually depends on which
Python distutils one is using, as the distutils logging interface
has changed between Python versions). In any case, we can enable
all the warnings again with no problems (maybe only the Windows guys will
not be so happy about it).

Regards,
Pearu

On Mon, January 21, 2008 8:44 am, David Cournapeau wrote:
> Hi,
>
> I noticed a strange behaviour with distutils: when compiling C code
> on windows (using mingw), the compilation warning are not output on the
> console. For example, umathmodule.c compilation emits the following
> warnings:
>
> numpy\core\src\umathmodule.c.src:128: warning: static declaration of
> 'acoshf' follows non-static declaration
> numpy\core\src\umathmodule.c.src:136: warning: static declaration of
> 'asinhf' follows non-static declaration
>
> I think this is coming from distutils because when I execute the exact
> same command on the shell, I get those warnings. If there is an error
> (by inserting #error), then all the warnings appear also when using
> distutils, which would suggest that distutils checks the return status
> of gcc to decide whether the output should be sent to the shell? Anyway, if
> we can do something about it, I think it would be better to always
> output warnings (it took me quite a while to understand why I got
> warnings with scons and not with distutils...).
>
> cheers,
>
> David




Re: [Numpy-discussion] Scipy page

2008-01-21 Thread Hans Meine
On Sunday, 20 January 2008 23:15:24, Travis Vaught wrote:
> On Jan 20, 2008, at 3:30 PM, Charles R Harris wrote:
> > Thanks for the attention to this...I think we need to seek continual
> > improvement to the web site (numpy's default pages, again, could use
> > some TLC).  I've tweaked the bug a bit using the original photoshop
> > scipy logo.  Anyone who'd like the source images in photoshop format
> >
> > The bigger size is good, but I think the head needs to be green or
> > yellow so that it stands out against the background.
> >
> > Chuck
>
> Better now?

Nice.  In order to have better contrast around the legs, one could use the 
same white "shadow"/aura as in the download icon.  (++consistency)

-- 
Ciao, /  /
 /--/
/  / ANS


Re: [Numpy-discussion] creating rec arrays

2008-01-21 Thread Francesc Altet
On Sunday 20 January 2008, Nicholas wrote:
> I am trying to package a set of vector arrays into a single rec array
> with column titles.
>  c = numpy.rec.fromarrays([a,b],names='a,b')
> the problem is there is no 'copy' argument in .fromarrays which I can
> set to False. i.e. is there a way to do this without the overhead of
> copying?

You can't because of implementation issues: a recarray is implemented as 
a row-wise object, so the copy is absolutely needed.

BTW, perhaps column-wise recarray would be a great addition to NumPy, 
although I'm not sure whether this object would be much better than a 
dictionary of homogeneous arrays.
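A tiny check makes the copy visible (arrays here are toy data):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.rec.fromarrays([a, b], names='a,b')

a[0] = 99.0      # mutate the source column after construction
print(c.a[0])    # still 1.0: fromarrays copied the data into row-wise storage
```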

Cheers,

-- 
>0,0<   Francesc Altet     http://www.carabos.com/
V   V   Cárabos Coop. V.   Enjoy Data
 "-"