ANN: AnacondaCON February 7-9, Austin TX --- Python powered Open Data Science Conference

2017-01-12 Thread Travis Oliphant
AnacondaCON, February 7-9, Austin, Texas
  http://anacondacon17.io

3-day Anaconda Open Data Science User Conference celebrating a strong
Python success story.


Hello everyone,

It has been 5 years since Peter Wang and I started Continuum Analytics with
the objective of expanding the commercial adoption of Python for
data-science, quantitative, computational, and numerical computing.  Thanks
to the amazing community and my colleagues at Continuum we've seen that
objective come to fruition, and company after company is choosing Python as
their forward-looking language for numerical computing, modeling, and data
science.

We created Anaconda to make it easy for individuals and organizations to
adopt the rich suite of tools and libraries that are commonly used by
scientists, engineers, and mathematicians.  As Anaconda has lowered the
barrier for people to adopt the open data science software stack we've seen
a significant increase in use by people who previously were unlikely to
move beyond Excel, and organizations who are recognizing the kind of value
that a strategic investment in Python can bring them.

Nearly 8 million people have downloaded Anaconda this year, many of whom are
using Python for the first time (and choosing Python 3.X).  It's been a
dream come true to see all of our early efforts around SciPy, NumPy, and
Python come to fruition in the enterprise.   It's been an incredible
journey and is something the Python community can and should celebrate.

With that background, I am excited to announce AnacondaCON, a 3-day Anaconda
user conference happening February 7-9 in Austin, Texas.  We currently have
2-for-1 pricing until January 16th (2 tickets for $999).  We have an amazing
lineup of speakers from industry, government, academia and, of course,
Continuum. https://anacondacon17.io/speakers/

Peter and I will both be speaking there.  I will be speaking about the
future of open data science, including which community-oriented open-source
technologies we will specifically be working on and contributing to in order
to continue the success of numpy, scipy, pandas, conda, numba, bokeh, dask,
spyder, holoviews, phosphorjs, jupyter, and more.

I will also be discussing some ideas we are pursuing on the future of array
computing for Python 3.X and how to build a substructure for vector
computing that integrates better with the broader Python ecosystem and is
inspired by, and can use, the type hints becoming popular in Python 3.X.

Python's future in technical computing and data science has never been
brighter and AnacondaCON is a great opportunity to connect with an
interesting segment of this larger community and catch up with others
interested in enterprise adoption of Python for data science and numerical
/ technical computing.

I really hope to see you there.

Best,

Travis Oliphant
-- 
https://mail.python.org/mailman/listinfo/python-announce-list

Support the Python Software Foundation:
http://www.python.org/psf/donations/


A new webpage promoting Compiler technology for CPython

2013-02-15 Thread Travis Oliphant
Hey all, 

With Numba and Blaze we have been doing a lot of work on what is essentially
compiler technology, and we are realizing more and more that we are treading
on ground that has been plowed before by many other projects.  So, we wanted to create
a web-site and perhaps even a mailing list or forum where people could 
coordinate and communicate about compiler projects, compiler tools, and ways to 
share efforts and ideas.

The website is:  http://compilers.pydata.org/

This page is specifically for compiler projects that either integrate with or
work directly with the CPython run-time, which is why PyPy is not presently
listed.  The PyPy project is a great project, but we just felt that we wanted to
explicitly create a collection of links to compilation projects that are
accessible from CPython and are likely less well known.

But that is just where we started from.   The website is intended to be a 
community website constructed from a github repository.   So, we welcome pull 
requests from anyone who would like to see the website updated to reflect their
related project.  Jon Riehl (Mython, PyFront, ROFL, and many other
interesting projects), Stephen Diehl (Blaze), and I will be moderating the
pull requests to begin with.   But, we welcome others with similar interests to 
participate in that effort of moderation.

The github repository is here:  https://github.com/pydata/compilers-webpage

This is intended to be a community website for information spreading, and so we 
welcome any and all contributions.  

Thank you,

Travis Oliphant


-- 
http://mail.python.org/mailman/listinfo/python-announce-list

Support the Python Software Foundation:
http://www.python.org/psf/donations/


[issue15540] Python 3.3 and numpy

2012-08-03 Thread Travis Oliphant

Travis Oliphant added the comment:

On Aug 3, 2012, at 1:35 AM, Martin v. Löwis wrote:

 
> Martin v. Löwis added the comment:
>
> > This is a mis-understanding of what NumPy does and why.  There is
> > a need to byte-swap only when the data is stored on disk in the
> > reverse order from the native machine.
>
> So is there ever a need to byte-swap Unicode strings? I can see how *numeric*
> data are stored using the internal representation on disk; this is a common
> technique. For strings, there is the notion of encodings which makes the
> relationship between internal and disk representations. So if NumPy applies
> the numeric concept to string data, then this is a flaw.

Apologies for not using correct terminology.   I had to spend a lot of time 
getting to know Unicode when I wrote NumPy, but am rusty on the key points and 
so I may communicate incorrectly.   The NumPy representation of Unicode strings 
is always UTF-32BE or UTF-32LE (depending on the data-type of the array).
The question is what to do when extracting this data into an array-scalar 
(which for Unicode objects has exactly the same representation as a 
PyUnicodeObject).  In fact, the NumPy Unicode array scalar is a C-sub-type of 
PyUnicodeObject and inherits from both the PyUnicodeObject and the NumPy 
Character interface --- a likely rare example of dual-inheritance at the 
C-level.  
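
A minimal sketch of the behavior described above, assuming a recent NumPy on
a little-endian machine (this example is illustrative, not code from the
original message):

import numpy as np

a_native = np.array(["abc"], dtype="<U3")   # little-endian UTF-32 storage
a_swapped = a_native.astype(">U3")          # big-endian (byte-swapped) storage

assert a_swapped.tobytes() != a_native.tobytes()   # the raw bytes really differ
assert a_swapped[0] == "abc"                       # extraction byte-swaps first
assert isinstance(a_swapped.item(), str)           # the scalar is an ordinary str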

 
> It may be that people really do store text data in the same memory blob
> as numeric data and dump it to a file, but they really should think of this
> data as UTF-16-BE or UTF-32-LE and the like, not in terms of byte swapping.
> You can use PyUnicode_Decode to create a Unicode object given a void*,
> a length, and a codec name. The concept "native Unicode representation"
> does not exist - people use all of two-byte, four-byte and UTF-8
> representations in memory, on a single processor architecture and operating system.

I understand all the representations of Unicode data.   There is, however, a 
native byte-order and that's what I was talking about. 

 
> > The byte-swapping must be done prior to conversion to a Python
> > Unicode-Object when selecting data out of the array.
>
> So if the byte swapping is done before the Unicode object is created:
> why did Dave and Ondřej run into problems then?

There were at least 2 issues: 1) a bad test that was written by someone who
didn't understand that you shouldn't byte-swap unicode strings "as strings",
and 2) a mis-understanding of what was happening going from the data stored
in a NumPy array to the Python scalar object that was being created.

Thank you for your explanations.  They are very helpful.  Also, thank you for
the PEP and improvements in Python 3.3.  The situation is *much* nicer now, as
NumPy is doing all kinds of hackery to support both narrow and wide builds.
This hackery could likely be improved even pre Python 3.3, but it's clearer
how to handle the situation now in Python 3.3.

 



[issue15540] Python 3.3 and numpy

2012-08-02 Thread Travis Oliphant

Travis Oliphant added the comment:

On Aug 2, 2012, at 5:28 PM, Antoine Pitrou wrote:

 
> Antoine Pitrou added the comment:
>
> Agreed with Martin. Byte-swapped unicode data in unicode objects doesn't make
> sense, since it will break the semantics of many operations. If numpy wants
> to support byte-swapped unicode data (what for?), they should store it in a
> different object type.

This is a mis-understanding of what NumPy does and why.  There is a need to
byte-swap only when the data is stored on disk in the reverse order from the 
native machine (i.e. NumPy is pointing to memory-mapped data).

The byte-swapping must be done prior to conversion to a Python Unicode-Object 
when selecting data out of the array.   

-Travis




[issue15540] Python 3.3 and numpy

2012-08-02 Thread Travis Oliphant

Travis Oliphant added the comment:

On Aug 2, 2012, at 6:09 PM, Antoine Pitrou wrote:

 
> Antoine Pitrou added the comment:
>
> > The byte-swapping must be done prior to conversion to a Python
> > Unicode-Object when selecting data out of the array.
>
> But then it shouldn't affect the invariants which are commented out in
> Dave's patch.
 

My impression is that Python should not have to change anything, but NumPy 
needs to adapt to the new Unicode concepts (which I think are great, by the way 
--- I'm a big supporter of getting rid of the wide/narrow build distinction). 




[issue3132] implement PEP 3118 struct changes

2010-05-24 Thread Travis Oliphant

Travis Oliphant oliph...@enthought.com added the comment:

On May 19, 2010, at 2:38 PM, Mark Dickinson wrote:

 
> Mark Dickinson dicki...@gmail.com added the comment:
>
> Travis, this issue is still assigned to you.  Do you plan to work on this at
> some stage, or may I unassign you?
 

You may unassign it from me.   Unfortunately, I don't have time anymore to work 
on it and I don't see that changing in the coming months. 

Thanks,

-Travis




[issue3132] implement PEP 3118 struct changes

2010-02-12 Thread Travis Oliphant

Travis Oliphant oliph...@enthought.com added the comment:

On Feb 12, 2010, at 7:29 PM, Meador Inge wrote:


> Meador Inge mead...@gmail.com added the comment:
>
> Is anyone working on implementing these new struct modifiers?  If
> not, then I would love to take a shot at it.

That would be great.

-Travis




Join us for the 2nd Scientific Computing with Python Webinar

2009-06-15 Thread Travis Oliphant


Hello all Python users:

I am pleased to announce the second installment of a free Webinar  
series that discusses using Python for scientific computing.
Enthought hosts this free series  which takes place once a month for  
about 60-90 minutes.   The schedule and length may change based on  
participation feedback, but for now it is scheduled for the third  
Friday of every month. This free webinar should not be confused  
with the EPD webinar on the first Friday of each month which is open  
only to subscribers to the Enthought Python Distribution at the Basic  
level or above.


This session's speakers will be me (Travis Oliphant) and Peter Wang.   
I will show off a bit of EPDLab which is an interactive Python  
environment built using IPython, Traits, and Envisage.   Peter Wang  
will present a demo of Chaco and provide some examples of interactive  
visualizations that can be easily constructed using its classes.  If
there is time after the Chaco demo, I will continue the discussion  
about Mayavi, but I suspect this will have to wait until the next  
session.   All of the tools we will show are open-source, freely- 
available tools from multiple sources.  They can all be conveniently  
installed using the Enthought Python Distribution.


This event will take place on Friday, June 19th at 1:00pm CDT and will  
last 60 to 90 minutes depending on the questions asked.  If you would  
like to participate, please register by clicking on the link below or  
going to https://www1.gotomeeting.com/register/303689873.


There will be a 15 minute technical help-session prior to the on-line  
meeting which you should plan to use if you have never participated in  
a GoToWebinar previously.  During this time you can test your  
connection and audio equipment as well as familiarize yourself with  
the GoTo Meeting software (which currently only works with Mac and  
Windows systems).


I am looking forward to interacting with many of you again this Friday.

Best regards,

Travis Oliphant
Enthought, Inc.


Enthought is the company that sponsored the creation of SciPy and the  
Enthought Tool Suite.  It continues to sponsor the SciPy community by  
hosting the SciPy mailing list and website and participating in the  
development of SciPy and NumPy.  Enthought creates custom  
scientific and technical software applications and provides training  
on using Python for technical computing.   Enthought also provides the  
Enthought Python Distribution.   Learn more at http://www.enthought.com


Bios for Travis Oliphant and Peter Wang can be read at 
http://www.enthought.com/company/executive-team.php


--
Travis Oliphant
Enthought Inc.
1-512-536-1057
http://www.enthought.com
oliph...@enthought.com




--
http://mail.python.org/mailman/listinfo/python-announce-list

   Support the Python Software Foundation:
   http://www.python.org/psf/donations.html


Join us for Scientific Computing with Python Webinar

2009-05-21 Thread Travis Oliphant
[re-sent plaintext by a...@python.org for c.l.py.announce]

Hello all Python users:

I am pleased to announce the beginning of a free Webinar series that  
discusses using Python for scientific computing.   Enthought will host  
this free series  which will take place once a month for 30-45 minutes.   
The schedule and length may change based on participation feedback, but 
for now it is scheduled for the fourth Friday of every month. This 
free webinar should not be confused with the EPD webinar on the first 
Friday of each month which is open only to subscribers to the Enthought 
Python Distribution.

I (Travis Oliphant) will be the first speaker at this continuing series.  
I plan to present a brief (10-15) minute talk on reading binary files 
with NumPy using memory mapped arrays and structured data-types.  This 
talk will be followed by a demonstration of Chaco for interactive 2-d 
visualization and Mayavi for interactive 3-d visualization.   Both Chaco 
and Mayavi are open-source tools and part of the Enthought Tool Suite.  
They can be conveniently installed using the Enthought Python 
Distribution.   Topics for future webinars will be chosen later based on 
participant feedback.

This event will take place on Friday at 3:00pm CDT and will last 30 to  
45 minutes depending on questions asked.   Space is limited at this  
event.   If you would like to participate, please register by going to 
https://www1.gotomeeting.com/register/422340144 or by clicking on the 
appropriate link in the attached announcement.

There will be a 10 minute technical help session prior to the on-line  
meeting which you should plan to use if you have never participated in a 
GoToWebinar previously.  During this time you can test your connection 
and audio equipment as well as familiarize yourself with the GoTo Meeting 
software.

I am looking forward to interacting with many of you this Friday.

Best regards,

Travis Oliphant
Enthought, Inc.


Enthought is the company that sponsored the creation of SciPy and the  
Enthought Tool Suite.  It continues to sponsor the SciPy community by  
hosting the SciPy mailing list and website and participating in the  
development of SciPy and NumPy.  Enthought creates custom scientific 
and technical software applications and provides training on using Python 
for technical computing.   Enthought also provides the Enthought Python 
Distribution.   Learn more at http://www.enthought.com

Travis Oliphant's bio can be read at 
http://www.enthought.com/company/executive-team.php


 Scientific Computing with Python Webinar  
   
 Each webinar in this continuing series will demonstrate the use of  
 some aspect of Python to assist with scientific, engineering, and  
 technical computing.   Enthought will host each meeting and select a  
 specific topic based on feedback from participants
 Register for a session now by clicking a date below:
 Fri, May 22, 2009 3:00 PM - 3:30 PM CDT
 Fri, Jun 19, 2009 1:00 PM - 1:30 PM CDT
 Fri, Jul 17, 2009 1:00 PM - 1:30 PM CDT
 Once registered you will receive an email confirming your registration
 with information you need to join the Webinar.
 System Requirements
 PC-based attendees
 Required: Windows® 2000, XP Home, XP Pro, 2003 Server, Vista
 Macintosh®-based attendees
 Required: Mac OS® X 10.4 (Tiger®) or newer
-- 
http://mail.python.org/mailman/listinfo/python-announce-list

Support the Python Software Foundation:
http://www.python.org/psf/donations.html


Join us for Scientific Computing with Python Webinar

2009-05-20 Thread Travis Oliphant


Hello all Python users:

I am pleased to announce the beginning of a free Webinar series that  
discusses using Python for scientific computing.   Enthought will host  
this free series  which will take place once a month for 30-45  
minutes.   The schedule and length may change based on participation  
feedback, but for now it is scheduled for the fourth Friday of every  
month. This free webinar should not be confused with the EPD  
webinar on the first Friday of each month which is open only to  
subscribers to the Enthought Python Distribution.


I (Travis Oliphant) will be the first speaker at this continuing  
series.  I plan to present a brief (10-15) minute talk on reading  
binary files with NumPy using memory mapped arrays and structured data- 
types.  This talk will be followed by a demonstration of Chaco for  
interactive 2-d visualization and Mayavi for interactive 3-d  
visualization.   Both Chaco and Mayavi are open-source tools and part  
of the Enthought Tool Suite.  They can be conveniently installed using  
the Enthought Python Distribution.   Topics for future webinars will  
be chosen later based on participant feedback.


This event will take place on Friday at 3:00pm CDT and will last 30 to  
45 minutes depending on questions asked.   Space is limited at this  
event.   If you would like to participate, please register by going to https://www1.gotomeeting.com/register/422340144 
 or by clicking on the appropriate link in the attached announcement.


There will be a 10 minute technical help session prior to the on-line  
meeting which you should plan to use if you have never participated in  
a GoToWebinar previously.  During this time you can test your  
connection and audio equipment as well as familiarize yourself with  
the GoTo Meeting software.


I am looking forward to interacting with many of you this Friday.

Best regards,

Travis Oliphant
Enthought, Inc.


Enthought is the company that sponsored the creation of SciPy and the  
Enthought Tool Suite.  It continues to sponsor the SciPy community by  
hosting the SciPy mailing list and website and participating in the  
development of SciPy and NumPy.  Enthought creates custom  
scientific and technical software applications and provides training  
on using Python for technical computing.   Enthought also provides the  
Enthought Python Distribution.   Learn more at http://www.enthought.com


Travis Oliphant's bio can be read at 
http://www.enthought.com/company/executive-team.php


Scientific Computing with Python Webinar

Each webinar in this continuing series will demonstrate the use of  
some aspect of Python to assist with scientific, engineering, and  
technical computing.   Enthought will host each meeting and select a  
specific topic based on feedback from participants

Register for a session now by clicking a date below:
Fri, May 22, 2009 3:00 PM - 3:30 PM CDT
Fri, Jun 19, 2009 1:00 PM - 1:30 PM CDT
Fri, Jul 17, 2009 1:00 PM - 1:30 PM CDT
Once registered you will receive an email confirming your registration
with information you need to join the Webinar.
System Requirements
PC-based attendees
Required: Windows® 2000, XP Home, XP Pro, 2003 Server, Vista
Macintosh®-based attendees
Required: Mac OS® X 10.4 (Tiger®) or newer

-- 
http://mail.python.org/mailman/listinfo/python-announce-list

Support the Python Software Foundation:
http://www.python.org/psf/donations.html


[issue4580] slicing of memoryviews when itemsize != 1 is wrong

2008-12-10 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

I will take some time in the next week to look at this issue.  Thank you
for bringing these bugs to my attention.  

I did not finish the memoryview implementation, due to other demands on
my time.  I appreciate the efforts of so many to pick up and understand
the PEP and contribute code.   I've noticed some incorrect statements
being passed around, and I apologize for not providing a perspective on
them.   

I hope there is a clear distinction between the buffer protocol and the
memoryview object.   The buffer protocol was thought through fairly
well, but the memoryview object has not received the same attention.  My
mental model of the memoryview object is basically an (enhanced) NumPy
array without the math support.  

It does not surprise me that there are a few remaining issues in the
memoryview object.  However, I'm confident they can be cleared up.   A
few changes to the code were made by others without changing the PEP.  I
accept responsibility for this because people were cleaning up what I
should have already finished.  

All the help is appreciated.  I will review the patches and fix the
problem.




[issue4580] slicing of memoryviews when itemsize != 1 is wrong

2008-12-10 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

My perspective on statements made (a small Python-level sketch follows this list):

 * The memoryview object should report its length as self->view.shape[0]
unless self->view.shape is NULL (indicating in this case a 0-d array or
scalar).  In that case, it should raise an error.

 * The buffer protocol is clear about who owns the memory for shape and
strides (and suboffsets).  The exporter does, and it is responsible for
not changing them until releasebuffer is called.

 * It should also be clear that shape, strides, and suboffsets will be
NULL in exactly two cases:
   1) The corresponding flag was not set, indicating the consumer is not
interested in shape, strides, or suboffsets.
   2) ndim == 0, indicating a 0-dimensional array (scalar-like).
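
A tiny Python-level illustration of the length rule (my sketch, assuming
Python 3.3+ for memoryview.cast and a platform where C unsigned int is 4 bytes):

buf = bytearray(12)
m = memoryview(buf).cast('I')     # view the 12 bytes as three 4-byte unsigned ints
assert m.ndim == 1
assert m.shape == (3,)
assert len(m) == m.shape[0]       # length reported from shape[0], as proposed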




[issue4580] slicing of memoryviews when itemsize != 1 is wrong

2008-12-10 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

Another comment on statements made

  * I don't see where the array.array getbuf implementation is broken.  It
looks correct to me.  It sets view.shape to NULL unless the consumer
asked for the shape information to be reported, in which case it sets it
equal to a pointer to the number of elements in the array.
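
A quick check of the same point from pure Python (my example; memoryview
always requests full buffer information, so the shape is reported):

import array

a = array.array('i', [1, 2, 3])
m = memoryview(a)
assert m.ndim == 1 and m.shape == (3,)
assert m.format == 'i'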




Re: NumPy arrays that use memory allocated from other libraries or tools

2008-09-11 Thread Travis Oliphant

sturlamolden wrote:

On Sep 10, 6:39 am, Travis Oliphant [EMAIL PROTECTED] wrote:


I wanted to point anybody interested to a blog post that describes a
useful pattern for having a NumPy array that points to the memory
created by a different memory manager than the standard one used by
NumPy.



Here is something similar I have found useful:

There will be a new module in the standard library called
'multiprocessing' (cf. the pyprocessing package in the cheese shop). It
allows you to create multiple processes (as opposed to threads) for
concurrency on SMPs (cf. the dreaded GIL).

The 'multiprocessing' module lets us put ctypes objects in shared
memory segments (processing.Array and processing.Value). It has its
own malloc, so there is no 4k (one page) lower limit on object size.
Here is how we can make a NumPy ndarray view the shared memory
referenced by these objects:

try:
    import processing
except ImportError:
    import multiprocessing as processing

import numpy, ctypes

# Map ctypes element types to NumPy dtypes (the original poster's choices).
_ctypes_to_numpy = {
    ctypes.c_char:   numpy.int8,
    ctypes.c_wchar:  numpy.int16,
    ctypes.c_byte:   numpy.int8,
    ctypes.c_ubyte:  numpy.uint8,
    ctypes.c_short:  numpy.int16,
    ctypes.c_ushort: numpy.uint16,
    ctypes.c_int:    numpy.int32,
    ctypes.c_uint:   numpy.int32,
    ctypes.c_long:   numpy.int32,
    ctypes.c_ulong:  numpy.int32,
    ctypes.c_float:  numpy.float32,
    ctypes.c_double: numpy.float64,
}

def shmem_as_ndarray(array_or_value):
    """View a processing.Array or processing.Value as an ndarray."""
    obj = array_or_value._obj
    buf = obj._wrapper.getView()
    try:
        # processing.Value wraps a single ctypes scalar
        t = _ctypes_to_numpy[type(obj)]
        return numpy.frombuffer(buf, dtype=t, count=1)
    except KeyError:
        # processing.Array wraps a ctypes array; use its element type
        t = _ctypes_to_numpy[obj._type_]
        return numpy.frombuffer(buf, dtype=t)

With this simple tool we can make processes created by multiprocessing
work with ndarrays that reference the same shared memory segment. I'm
doing some scalability testing on this. It looks promising :)




Hey, that is very neat.

Thanks for pointing me to it.  I was not aware of this development in 
multiprocessing.



-Travis

--
http://mail.python.org/mailman/listinfo/python-list


NumPy arrays that use memory allocated from other libraries or tools

2008-09-09 Thread Travis Oliphant


I wanted to point anybody interested to a blog post that describes a 
useful pattern for having a NumPy array that points to the memory 
created by a different memory manager than the standard one used by 
NumPy.   The pattern  shows how to create a NumPy array that points to 
previously allocated memory and then shows how to construct an object 
that allows the correct deallocator to be called when the NumPy array is 
freed.
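
A rough sketch of the idea in ctypes-flavored Python (my illustration, not
the blog post's code; it assumes a POSIX system where ctypes.CDLL(None)
exposes libc's malloc and free):

import ctypes
import numpy as np

libc = ctypes.CDLL(None)
libc.malloc.restype = ctypes.c_void_p
libc.malloc.argtypes = [ctypes.c_size_t]
libc.free.argtypes = [ctypes.c_void_p]

n = 5
ptr = libc.malloc(n * np.dtype(np.float64).itemsize)

class ForeignBlock:
    """Holds the foreign pointer and calls the matching deallocator."""
    def __init__(self, p):
        self._p = p
    def __del__(self):
        libc.free(self._p)

owner = ForeignBlock(ptr)                        # controls when free() is called
cbuf = (ctypes.c_double * n).from_address(ptr)   # ctypes view of the foreign memory
a = np.frombuffer(cbuf, dtype=np.float64)        # NumPy array sharing that memory
a[:] = 1.0
# Keep `owner` alive for as long as `a` (or any view of it) is in use; when it
# is collected, __del__ releases the block with the allocator that created it.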


This may be useful if you are wrapping code that has its own memory
management scheme.  Comments and feedback are welcome.


The post is

http://blog.enthought.com/?p=62


Best regards,

-Travis Oliphant

--
http://mail.python.org/mailman/listinfo/python-list


[issue3139] bytearrays are not thread safe

2008-08-24 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

I'm sorry that I was unavailable for comment during July and August as
it looks like a lot of decisions were made that have changed the
semantics a bit.  I'm still trying to figure out why the decisions were
made that were made.   

I get the impression that most of the problems are related to objects
incorrectly managing their exported buffers, but there may be some
semantic issues related to t# that were not conceived of during the
many discussions surrounding the design of PEP 3118.  

I'm not convinced that Py_buffer should have grown a link to an object.
 I think this is a shortcut solution due to misuse of the protocol that
may have unfortunate consequences. 

I'm not sure where PyBuffer_Release came from.  I can't find it in the
PEP and don't remember what its purpose is.  Did I add it or did
somebody else?




[issue3101] global function _add_one_to_C

2008-08-24 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

I've added comments in the code to document these functions.  I have no
opinion where they live except they should probably be available to
extensions modules. 

These routines increment an N-length counter representing a position in
an N-dimensional array with wrap-around when the counter reaches the
size of the dimension.  Thus, for a (2,3) array we have:

F-version

0,0
1,0
2,0
0,1
1,1
2,1

C-version
0,0
0,1
0,2
1,0
1,1
1,2
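
In Python terms, the C-order version does something like the following
(hypothetical name; the real helper is a C function):

def add_one_to_counter_c(counter, shape):
    """Increment an N-length index counter in C order, wrapping with carry."""
    for k in reversed(range(len(counter))):
        counter[k] += 1
        if counter[k] < shape[k]:
            break
        counter[k] = 0
    return counter

idx = [0, 0]
for _ in range(6):
    print(tuple(idx))            # (0, 0) (0, 1) (0, 2) (1, 0) (1, 1) (1, 2)
    add_one_to_counter_c(idx, (2, 3))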




[issue3046] Locking should be removed from the new buffer protocol

2008-08-24 Thread Travis Oliphant

Changes by Travis Oliphant [EMAIL PROTECTED]:


--
status: open -> closed




[issue2394] [Py3k] Finish the memoryview object implementation

2008-08-24 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

It would have been nice to finish the memoryview object for 3.0, but I
ran into time constraints.  The pieces that are left can be pushed to 3.1.




[issue2394] [Py3k] Finish the memoryview object implementation

2008-08-24 Thread Travis Oliphant

Changes by Travis Oliphant [EMAIL PROTECTED]:


--
versions: +Python 3.1 -Python 3.0




[issue3132] implement PEP 3118 struct changes

2008-08-24 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

This can be re-targeted to 3.1 as described.




[issue762920] API Functions for PyArray

2008-06-11 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

I will look at the patch, but generally I'm not inclined to give the
array module more legs because I agree that the desired functionality
should be put into the memoryview object and the buffer protocol.




[issue2393] Backport buffer interface in Python 3.0 to Python 2.6

2008-03-18 Thread Travis Oliphant

Travis Oliphant [EMAIL PROTECTED] added the comment:

Back-porting of the new buffer interface was done in r61491.  This issue
can be closed.




[issue2393] Backport buffer interface in Python 3.0 to Python 2.6

2008-03-18 Thread Travis Oliphant

New submission from Travis Oliphant [EMAIL PROTECTED]:

Some (or all) of PEP 3118 should be backported to Python 2.6 because it
does not require backward-incompatible changes and can assist in the
transition to 3.0.   

This issue is to be sure that the buffer-interface portion of PEP 3118
is backported.  This does not mean that any objects in Python will
necessarily use the new buffer interface.  Any such changes would be
entered as separate issues.

--
components: Interpreter Core
messages: 63923
nosy: teoliphant
severity: normal
status: open
title: Backport buffer interface in Python 3.0 to Python 2.6
type: feature request
versions: Python 2.6




[issue2394] Finish the memoryview object implementation

2008-03-18 Thread Travis Oliphant

New submission from Travis Oliphant [EMAIL PROTECTED]:

The memoryview object in Python 3.0 needs to be finished.  There are a
few methods that are not complete.  In particular, the __getitem__ and
__setitem__ functionality needs to be finished as well as the tolist()
method.

--
components: Interpreter Core
messages: 63928
nosy: teoliphant
severity: normal
status: open
title: Finish the memoryview object implementation
type: behavior
versions: Python 3.0




[issue2395] struct module changes of PEP 3118

2008-03-18 Thread Travis Oliphant

New submission from Travis Oliphant [EMAIL PROTECTED]:

The additions to the struct module spelled out in PEP 3118 need to be
implemented for Python 3.0

--
components: Library (Lib)
messages: 63929
nosy: teoliphant
severity: normal
status: open
title: struct module changes of PEP 3118
versions: Python 3.0




[issue2395] struct module changes of PEP 3118

2008-03-18 Thread Travis Oliphant

Changes by Travis Oliphant [EMAIL PROTECTED]:


--
type:  -> behavior




[issue2396] Backport memoryview object to Python 2.6

2008-03-18 Thread Travis Oliphant

New submission from Travis Oliphant [EMAIL PROTECTED]:

The memoryview object in Python 2.6 would help in the transition to
Python 3.0.  It is a lower-priority and could wait until 2.7 if it
doesn't get finished.

--
components: Interpreter Core
messages: 63930
nosy: teoliphant
severity: normal
status: open
title: Backport memoryview object to Python 2.6
type: feature request
versions: Python 2.6




[issue2397] Backport 3.0 struct module changes to 2.6

2008-03-18 Thread Travis Oliphant

New submission from Travis Oliphant [EMAIL PROTECTED]:

The changes to the struct module in PEP 3118 should be backported to 2.6,
as they are backward compatible and would smooth the transition to 3.0.  It
is lower priority and could wait until 2.7.

--
components: Library (Lib)
messages: 63931
nosy: teoliphant
severity: normal
status: open
title: Backport 3.0 struct module changes to 2.6
versions: Python 2.6




[issue2394] [Py3k] Finish the memoryview object implementation

2008-03-18 Thread Travis Oliphant

Changes by Travis Oliphant [EMAIL PROTECTED]:


--
title: Finish the memoryview object implementation -> [Py3k] Finish the
memoryview object implementation




[issue2395] [Py3k] struct module changes of PEP 3118

2008-03-18 Thread Travis Oliphant

Changes by Travis Oliphant [EMAIL PROTECTED]:


--
title: struct module changes of PEP 3118 -> [Py3k] struct module changes of PEP
3118




[issue2399] Patches for Tools/msi

2008-03-18 Thread Travis Oliphant

Changes by Travis Oliphant [EMAIL PROTECTED]:


Added file: http://bugs.python.org/file9734/msi.patch




[issue2399] Patches for Tools/msi

2008-03-18 Thread Travis Oliphant

Changes by Travis Oliphant [EMAIL PROTECTED]:


--
nosy: +loewis




[issue2404] Backport ctypes support for buffer protocol to Python 2.6 (ref issue1971)

2008-03-18 Thread Travis Oliphant

Changes by Travis Oliphant [EMAIL PROTECTED]:


--
assignee:  -> theller
components: +ctypes
nosy: +theller
type:  -> behavior
versions: +Python 2.6




[issue1742669] %d format handling for long values

2007-11-01 Thread Travis Oliphant

Travis Oliphant added the comment:

I have two issues with this patch:

1) I'm not sure it's that bad to need to use '%d' % long(obj) to ensure
conversion to a long integer.

2) If this kind of auto-conversion is deemed useful, then the patch
itself is rather complicated.  I would re-factor so that the same code
is not repeated once in the PyNumber_Check clause and again in the original
PyLong_Check and else clauses.  Simply check for PyNumber.  Then, if not
already an int or a long, convert to int first, and then to long if that
creates an error.  Then, execute the two segments of code for int and
long objects.

--
nosy: +teoliphant




Committing latest patch on Issue 708374 to SVN (adding offset to mmap)

2007-10-22 Thread Travis Oliphant

Hi all,

I think the latest patch for fixing Issue 708374 (adding offset to mmap) 
should be committed to SVN.

I will do it, if nobody opposes the plan.  I think it is a very 
important addition and greatly increases the capability of the mmap module.

Thanks,

-Travis Oliphant

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Committing latest patch on Issue 708374 to SVN (adding offset to mmap)

2007-10-22 Thread Travis Oliphant
Travis Oliphant wrote:
> Hi all,
>
> I think the latest patch for fixing Issue 708374 (adding offset to mmap)
> should be committed to SVN.
>
> I will do it, if nobody opposes the plan.  I think it is a very
> important addition and greatly increases the capability of the mmap module.
>
> Thanks,
>
> -Travis Oliphant

Sorry for the noise,  I sent this to the wrong group.  Any comment on 
the patch is welcome, however.

Best regards,

-Travis

-- 
http://mail.python.org/mailman/listinfo/python-list


[issue708374] add offset to mmap

2007-10-22 Thread Travis Oliphant

Travis Oliphant added the comment:

I applied phuang's patch in revision 58598. This can be closed.





[issue1268] array unittest problems with UCS4 build

2007-10-14 Thread Travis Oliphant

Travis Oliphant added the comment:

This issue may be closed.




NumPy 1.0.3 released

2007-05-24 Thread Travis Oliphant
We are pleased to announce the release of NumPy 1.0.3

Hopefully, this release will work better with multiple interpreters, as
well as having some significant bugs fixed.

Other changes include

* x/y follows Python standard on mixed-sign division for array scalars 
and numpy arrays
* iinfo added to provide information on integer data-types
* improvements to SWIG typemaps, numpy.distutils, and f2py
* improvements to separator handling in fromfile and fromstring
* many, many bug fixes

Thank you to everybody who contributed to the recent release.

Best regards,

NumPy Developers
http://numpy.scipy.org

-- 
http://mail.python.org/mailman/listinfo/python-announce-list

Support the Python Software Foundation:
http://www.python.org/psf/donations.html


Re: Database in memory

2007-04-09 Thread Travis Oliphant
Jim wrote:
> I have an application that will maintain an in-memory database in the
> form of a list of lists.  Does anyone know of a way to search for and
> retrieve records from such a structure?
 

Actually, the new NumPy can work as a very-good fast and efficient 
simple in-memory database (or memory-mapped data-base for that matter).

The elements of a NumPy array can be arbitrary records.  You would
search using logical combinations of comparisons.  I think the ability
for NumPy (which now handles arbitrary records) to be used as a
data-base is under-appreciated.

Mind you, it is SQL-less.  NumPy only provides the tables; it does not
provide the fancy logic on top of the tables.  So, perhaps it would be
better to say that NumPy could serve as the foundation for a simple
data-base application.
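
For example (hypothetical table and fields, modern NumPy syntax):

import numpy as np

people = np.array(
    [("alice", 31, 55000.0),
     ("bob",   42, 61000.0),
     ("carol", 28, 48000.0)],
    dtype=[("name", "U10"), ("age", "i4"), ("salary", "f8")])

# "Search" is a boolean mask built from logical combinations of comparisons.
hits = people[(people["age"] > 29) & (people["salary"] < 60000.0)]
print(hits["name"])              # -> ['alice']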

-Travis

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: An error of matrix inversion using NumPy

2007-04-04 Thread Travis Oliphant
lancered wrote:
 
 
> So, can you tell me what goes wrong?  Is this a bug in
> Numpy.linalg? How to deal with this situation?  If you need, I can
> post the matrix I used below, but it is so long, so not at the moment.


As you discovered, it is very likely your problem is a very high 
condition number.

The easiest thing to do is to use

numpy.linalg.pinv

to perform a pseudo-inverse which will only use the singular-values that 
are well-conditioned to compute the inverse.

This will still not give you an exact identity, but at least you will
know you aren't amplifying low-valued singular vectors.
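
For example (my numbers, not the poster's matrix):

import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0 + 1e-12]])
print(np.linalg.cond(A))                 # ~4e12: inv() would amplify noise badly

A_pinv = np.linalg.pinv(A, rcond=1e-8)   # drop singular values below 1e-8 * s_max
print(np.dot(A, A_pinv))                 # a projector onto the well-conditioned
                                         # part, not the exact identity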

-Travis



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python equivalents to MATLAB str2func, func2str, ischar, isfunc?

2007-03-14 Thread Travis Oliphant
dmitrey wrote:
> I can't find these via web search


You won't find exact equivalents.  But, the same functionality is 
available.  Perhaps you would like to show us what you are trying to do 
in Python.

Python's eval has some similarity with str2func

Python's repr() or str() has some similarity with func2str

ischar(A) is similar to isinstance(A, str)

isfunc is similar to callable
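
A rough one-to-one sketch of those equivalences (use eval with care):

f = eval("len")                  # str2func-like: name -> callable
s = f.__name__                   # one func2str-like spelling (repr(f) also works)
print(isinstance("abc", str))    # ischar-like  -> True
print(callable(f))               # isfunc-like  -> True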

-Travis

P.S.  (If you are using NumPy, then there are other possibilities as well.)

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Direct memory access

2007-03-07 Thread Travis Oliphant
Collin Stocks wrote:
 Does anyone know how to directly handle memory using python?
 I want to be able, for example, to copy the actual contents of a memory 
 address, or set the actual contents of a memory address.

This kind of thing is generally not what Python is used for, so it's not 
really easy to do.

Writing to arbitrary memory areas is an easy way to cause segmentation
violations (memory violations), which Python and its extensions try to make
as near impossible as they can.

If you really need to do this then you can use ctypes to do it.

Let N be the number of bytes you want to access, then

import ctypes
g = (ctypes.c_char*N).from_address(addr)

g is now a settable sequence of bytes that you can read and write to 
using strings.

g[0]  # read the first byte
g[1]  # read the second byte

g[0] = '\x24' # set the first byte to hexadecimal 24

etc...

If you don't have permission to write to addr then you will get memory 
violations and your program will crash if you try to read from or write 
to the resulting sequence.

-Travis


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: export an array of floats to python

2007-02-22 Thread Travis Oliphant
Peter Wuertz wrote:
 Hi,
 
 I'm writing a C module for python, that accesses a special usb camera. 
 This module is supposed to provide python with data (alot of data). Then 
 SciPy is used to fit the data.


Which version of scipy are you using?


 My question is, how to make python read from a C array? By reading the 
 documentation, one could get the impression that PyBufferObjects do that 
 job.
 
 http://docs.python.org/api/bufferObjects.html
 
 It says:
 Two examples of objects that support the buffer interface are strings 
 and arrays.
 
 Where this function looks promising:
 PyBuffer_FromMemory - Return a new read-only buffer object that reads 
 from a specified location in memory, with a specified size.
 
 All right, lets imagine I created a PyBufferObject from my float array 
 in the C module, and passed it to python.

How about creating an array directly from your float array using 
PyArray_SimpleNewFromData (the NumPy C-API) or PyArray_FromDimsAndData 
(the Numeric C-API).


-Travis

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Two dimensional lists

2007-01-25 Thread Travis Oliphant
Laszlo Nagy wrote:
# shouldn't I be able to fill the lists simply by pointing to a location?

matrix[a_idx, p_idx] = 0x219 # and so on?
  
 
 Lists are not matrices. For example:
 
 L = [  [1,2,3], ['a','b','c'], 10 ]
 print L[1][2] # Prints 'c', you will like this syntax but...
 print L[2][5] # TypeError: 10 is not subscriptable
 
 You can create a function that creates a list of lists filled with 
 zeros, and then use this data structure as a matrix.
 
 BUT! If you need to use matrices filled with numbers, try numarray:
 
 http://www.stsci.edu/resources/software_hardware/numarray
 
 Numarray is much more efficient for this task.

It is also outdated.  For new code, users are strongly encouraged to use
NumPy.  The numarray people are transitioning their own code to use
NumPy, and numarray will cease being supported at some point.

http://numpy.scipy.org

http://www.numpy.org --- sourceforge site.
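
For instance, the assignment the original poster wanted works directly on a
NumPy array (indices below are made up for illustration):

import numpy as np

matrix = np.zeros((8, 8), dtype=np.int64)
a_idx, p_idx = 2, 5
matrix[a_idx, p_idx] = 0x219
print(matrix[a_idx, p_idx])      # 537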

-Travis

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: why scipy cause my program slow?

2007-01-17 Thread Travis Oliphant
Robert Kern wrote:
 HYRY wrote:
 
Why the exec time of test(readdata()) and test(randomdata()) of
following program is different?
my test file 150Hz10dB.wav has 2586024 samples, so I set randomdata
function
to return a list with 2586024 samples.
the exec result is:
2586024
type 'list'
10.8603842736
2586024
type 'list'
2.16525233979
test(randomdata()) is 5x faster than test(readdata())
if I remove from scipy import * then I get the following result:
2586024
type 'list'
2.21851601473
2586024
type 'list'
2.13885042216

So, what the problem with scipy?
 
 
 You're importing (through scipy) numpy's sum() function. The result type of 
 that
 function is a numpy scalar type. The set of scalar types was introduced for a
 number of reasons, mostly having to do with being able to represent the full
 range of numerical datatypes that Python does not have builtin types for.
 Unfortunately, the code paths that get executed when arithmetic is performed
with such scalars are still suboptimal; I believe they are still going through
 the full ufunc machinery.

This should not be true in the 1.0 release of NumPy.  The numpy scalars 
do their own math which has less overhead than ufunc-based math.  But, 
there is still more overhead than with simple floats because mixed-type 
arithmetic is handled more generically (the same algorithm covers all 
the cases).

The speed could be improved but hasn't been because it is so easy to get 
a Python float if you are concerned about speed.
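
For instance (a sketch; exact timings depend on the NumPy version):

import numpy as np

s = np.float64(1.5)       # NumPy scalar: carries dtype and error-mode machinery
f = s.item()              # or float(s): an ordinary Python float for tight loops
print(type(s), type(f))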

-Travis

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: numpy numbers converted wrong

2006-10-26 Thread Travis Oliphant
robert wrote:
 in Gnuplot (Gnuplot.utils) the input array will be converted to a Numeric 
 float array as shown below. When I insert a numpy array into Gnuplot like 
 that below, numbers such as 7.44 are cast to 7.0.
 Why is this and what should I do?  Is this a bug in numpy or in Numeric?
 
 
 [Dbg] m #numpy array
 array([[  9.78109200e+08,   7.4400e+00],
[  9.78454800e+08,   7.4400e+00],
[  9.78541200e+08,   8.1900e+00],
..., 
[  1.16162280e+09,   8.1460e+01],
[  1.16170920e+09,   8.1050e+01],
[  1.16179560e+09,   8.1680e+01]])

  [Dbg] Numeric.asarray(m, Numeric.Float32)[:10]
 array([[ 9.78109184e+008,  7.e+000],
[ 9.78454784e+008,  7.e+000],
[ 9.78541184e+008,  8.e+000],
[ 9.78627584e+008,  8.e+000],
[ 9.78713984e+008,  8.e+000],
[ 9.78973184e+008,  8.e+000],
[ 9.79059584e+008,  8.e+000],
[ 9.79145984e+008,  8.e+000],
[ 9.79232384e+008,  9.e+000],
[ 9.79318784e+008,  8.e+000]],'f')
 [Dbg] Numeric.asarray(m, Numeric.Float)[:10]
 array([[ 9.78109200e+008,  7.e+000],
[ 9.78454800e+008,  7.e+000],
[ 9.78541200e+008,  8.e+000],
[ 9.78627600e+008,  8.e+000],
[ 9.78714000e+008,  8.e+000],
[ 9.78973200e+008,  8.e+000],
[ 9.79059600e+008,  8.e+000],
[ 9.79146000e+008,  8.e+000],
[ 9.79232400e+008,  9.e+000],
[ 9.79318800e+008,  8.e+000]])

This is odd but we need to know the version numbers of both packages to 
help further.   For one, I'm surprised that you can use Numeric.asarray 
to force cast to Numeric.Float32 without raising an error.

Also, you can ask on numpy-discussion@lists.sourceforge.net to reach an 
audience more directly able to help.

 [Dbg] 
 
 
 and why and what is:
 
 [Dbg] m[0,1]
 7.44
 [Dbg] type(_)
 type 'numpy.float64'
 [Dbg] 
 
 
 does this also slow down python math computations? 

No, not necessarily (depends on what you mean).

Python floats are still Python floats.  NumPy provides, in addition, an 
array scalar for every kind of data that a NumPy array can be composed 
of.  This avoids the problems with being unable to find an appropriate 
Python scalar for a given data-type.  Where possible, the NumPy scalar 
inherits from the Python one.
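
A quick check of that last point with a modern NumPy on Python 3 (my example):

import numpy as np

x = np.float64(3.0)
print(isinstance(x, float))          # True: float64 subclasses Python's float
print(isinstance(np.int32(3), int))  # False: no matching built-in to inherit from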

By default, the NumPy scalars have their own math defined which uses the 
error-mode setting capabilities of NumPy to handle errors.  Right now, 
these operations are a bit slower than Python's built-ins because of the 
way that mixed calculations are handled.

For the data-types that overlap with Python scalars, you can set things
up so that NumPy scalars use the Python math instead if you want.  But,
again, NumPy does nothing to change the way that Python numbers are
calculated.


 should one better stay away from numpy in current stage of numpy development?

No, definitely not. Don't stay away.  NumPy 1.0 is out.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: numpy slice: create view, not copy

2006-10-12 Thread Travis Oliphant
K. Jansma wrote:
 Hi,
 
 given an array:
 
 import numpy
 a = numpy.arange(100).reshape((10,10))
 print a
 
 [[ 0  1  2  3  4  5  6  7  8  9]
  [10 11 12 13 14 15 16 17 18 19]
  [20 21 22 23 24 25 26 27 28 29]
  [30 31 32 33 34 35 36 37 38 39]
  [40 41 42 43 44 45 46 47 48 49]
  [50 51 52 53 54 55 56 57 58 59]
  [60 61 62 63 64 65 66 67 68 69]
  [70 71 72 73 74 75 76 77 78 79]
  [80 81 82 83 84 85 86 87 88 89]
  [90 91 92 93 94 95 96 97 98 99]]
 
 
 I'd like to create a new array that is a view of a, i.e. it shares data. If
 a is updated, this new array should be automatically 'updated'.
 
 e.g. the array west should be a view of a that gives the element at the left
 of location i,j in a.
 a[i,j] = west[i,j+1]
 
 west can be created using:
 
 a[:,range(-1,a.shape[1]-1)]
 
 As you can see, this also defines periodic boundaries (i.e. west[0,0] = 9)
 but it seems that this returns a copy of a, not a view.
 How can I change the slice definition in such a way it returns a view?

You can't get periodic boundary conditions but

a[:,1::]

should give you (part of) the view you are looking for.
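
Putting the two together (a sketch with modern NumPy):

import numpy as np

a = np.arange(100).reshape(10, 10)

west_view = a[:, 1:]          # basic slicing -> a *view*; no periodic wrap-around
a[0, 1] = -1
print(west_view[0, 0])        # -1: the view sees updates to `a`

wrapped = a[:, list(range(-1, a.shape[1] - 1))]   # fancy indexing -> a *copy*
a[0, 0] = -2
print(wrapped[0, 1])          # still the old value: the copy does not follow `a`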


-Travis



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: switching to numpy and failing, a user story

2006-10-04 Thread Travis Oliphant
[EMAIL PROTECTED] wrote:
> Travis E. Oliphant wrote:
>
> Given the quality of python's (free) documentation and how good it's
> been for a very long time, it's a bit ironic to be using the phrase
> "normal open-source documentation" on this mailing list. Numeric
> python, which numpy aspires to be a replacement for, has perfectly
> reasonable documentation.

And it is still perfectly useful.  Only a couple of details have 
changed.  The overall description is still useful.

> It wasn't perfect, but it told you pretty
> much everything you needed to know to get started, use the system, and
> build extension modules. I guess this set my expectations for NumPy.
 

This documentation was written largely due to funding from a national 
laboratory.  I didn't have those resources.  If somebody wanted to step 
up to the plate and make me an offer, the NumPy docs could be free as 
well.  So far, people have been content to buy it a piece at a time.

 
> Asking on the mailing lists is viable for the occasional question or
> detail, but it's not really an efficient way to get started with a
> system. At least not for me. But that's fine, I have something that
> works (numeric), and I can do what I need to do there.


Absolutely, that's the advantage of open source.  If the world moves
ahead you don't *have* to.  It's entirely your choice.  There is no lock-in.


-Travis

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: iterator question

2006-09-29 Thread Travis Oliphant
Neal Becker wrote:
 Any suggestions for transforming the sequence:
 
 [1, 2, 3, 4...]
 Where 1, 2, 3, ... are the ith items in an arbitrary sequence
 
 into a succession of tuples:
 
 [(1, 2), (3, 4)...]
 
 In other words, given a seq and an integer that specifies the size of tuple
 to return, then for example:
 
 seq = [a,b,c,d,e,f]
 for e in transform (seq, 2):
   print e
 
 would return
 (a,b)
 (c,d)
 (e,f)
 

Well, if you have NumPy installed, then this is pretty easy to do by 
reshaping the 1-d array into a 2-d array:

import numpy as N

def transform(seq, num):
    a = N.array(seq)
    a.shape = (-1, num)
    return a

This would return a sequence object that would print using lists

If you really insisted on tuples, then you could either convert the 
elements by replacing the last line with

return [tuple(x) for x in a]

or use a record-array:

import numpy as N

def transform(seq, num):
    a = N.asarray(seq)
    dt = a.dtype
    newdt = [('', dt)] * num
    return a.view(newdt).tolist()

This would return a list of tuples as requested.

-Travis



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Computing correlations with SciPy

2006-03-17 Thread Travis Oliphant
[EMAIL PROTECTED] wrote:
 I want to compute the correlation between two sequences X and Y, and
 tried using SciPy to do so without success.  Here's what I have, how
 can I correct it?
 

This was a bug in NumPy (inherited from Numeric actually).  The fix is 
in SVN of NumPy.

Here are the new versions of those functions that should work as you 
wish (again, these are in SVN, but perhaps you have a binary install).

These functions belong in site-packages/numpy/lib/function_base.py



def cov(m, y=None, rowvar=1, bias=0):
    """Estimate the covariance matrix.

    If m is a vector, return the variance.  For matrices return the
    covariance matrix.

    If y is given it is treated as an additional (set of)
    variable(s).

    Normalization is by (N-1) where N is the number of observations
    (unbiased estimate).  If bias is 1 then normalization is by N.

    If rowvar is non-zero (default), then each row is a variable with
    observations in the columns, otherwise each column
    is a variable and the observations are in the rows.
    """
    X = asarray(m, ndmin=2)
    if X.shape[0] == 1:
        rowvar = 1
    if rowvar:
        axis = 0
        tup = (slice(None), newaxis)
    else:
        axis = 1
        tup = (newaxis, slice(None))

    if y is not None:
        y = asarray(y, ndmin=2)
        X = concatenate((X, y), axis)

    X -= X.mean(axis=1-axis)[tup]
    if rowvar:
        N = X.shape[1]
    else:
        N = X.shape[0]

    if bias:
        fact = N*1.0
    else:
        fact = N-1.0

    if not rowvar:
        return (dot(X.transpose(), X.conj()) / fact).squeeze()
    else:
        return (dot(X, X.transpose().conj()) / fact).squeeze()

def corrcoef(x, y=None, rowvar=1, bias=0):
    """The correlation coefficients"""
    c = cov(x, y, rowvar, bias)
    try:
        d = diag(c)
    except ValueError:  # scalar covariance
        return 1
    return c / sqrt(multiply.outer(d, d))
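
As a quick sanity check once a fixed build is on your path (the numbers 
below are made up for illustration, and current NumPy also exposes cov 
and corrcoef at the top level):

import numpy as N

X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.0, 4.1, 5.9, 8.2, 9.9]

# Each argument is treated as a variable; the result is a 2x2 matrix whose
# off-diagonal entries hold the correlation between X and Y (close to 1 here).
print(N.corrcoef(X, Y))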

-- 
http://mail.python.org/mailman/listinfo/python-list


[ANN] NumPy 0.9.6 released

2006-03-14 Thread Travis Oliphant
This post is to announce the release of NumPy 0.9.6, which fixes some 
important bugs and has several speed improvements.


NumPy is a multi-dimensional array package for Python that allows rapid 
high-level array computing with Python.  It is the successor to both Numeric 
and Numarray.  More information at http://numeric.scipy.org


The release notes are attached:

Best regards,

NumPy Developers




NumPy 0.9.6 is a bug-fix and optimization release with a
few new features: 


  New features (and changes):

  - bigndarray removed and support for Python2.5 ssize_t added giving
 full support in Python2.5 to very-large arrays on 64-bit systems.

  - Strides can be set more arbitrarily from Python (and checking is done 
 to make sure memory won't be violated). 

  - __array_finalize__ is now called for every array sub-class creation.

  - kron and repmat functions added

  - .round() method added for arrays
  
  - rint, square, reciprocal, and ones_like ufuncs added.

  - keyword arguments now possible for methods taking a single 'axis'
 argument

  - Swig and Pyrex examples added in doc/swig and doc/pyrex

  - NumPy builds out of the box for cygwin

  - Different unit testing naming schemes are now supported.

  - memusage in numpy.distutils works for NT platforms

  - numpy.lib.math functions now take vectors

  - Most functions in oldnumeric now return the input class where possible
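
  For illustration only (using present-day NumPy spellings and made-up
  data), a few of the items above look like this in practice:

  import numpy as N

  a = N.array([[1.2, 3.7], [4.1, 5.9]])

  print(a.round())         # new .round() method for arrays
  print(N.ones_like(a))    # one of the newly added ufuncs
  print(a.sum(axis=0))     # keyword 'axis' argument now accepted by methods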


  Speed ups:
 
  - x**n for integer n significantly improved

  - array(python scalar) much faster

  - .fill() method is much faster


  Other fixes:

  - Output arrays to ufuncs works better. 

  - Several ma (Masked Array) fixes.

  - umath code generation improved

  - many fixes to optimized dot function (fixes bugs in 
matrix-sub-class multiply)

  - scalartype fixes

  - improvements to poly1d

  - f2py fixed to handle character arrays in common blocks

  - Scalar arithmetic improved to handle mixed-mode operation.

  - Make sure Python intYY types correspond exactly with C PyArray_INTYY


-- 
http://mail.python.org/mailman/listinfo/python-list

Comments sought for PEP 357 --- allowing any object in slice syntax

2006-03-03 Thread Travis Oliphant


This post is to gather feedback from the wider community on PEP 357.  It 
is nearing the acceptance stage and has previously been discussed on 
python-dev.  This is a chance for the wider Python community to comment 
on the proposal.


You will find the PEP attached


PEP: 357
Title: Allowing Any Object to be Used for Slicing
Version: $Revision: 42549 $
Last Modified: $Date: 2006-02-21 21:00:18 -0700 (Tue, 21 Feb 2006) $
Author: Travis Oliphant [EMAIL PROTECTED]
Status: Draft
Type: Standards Track
Created: 09-Feb-2006
Python-Version: 2.5

Abstract

This PEP proposes adding an nb_index slot in PyNumberMethods and an
__index__ special method so that arbitrary objects can be used
whenever integers are explicitly needed in Python, such as in slice
syntax (from which the slot gets its name).

Rationale

Currently integers and long integers play a special role in
slicing in that they are the only objects allowed in slice
syntax. In other words, if X is an object implementing the
sequence protocol, then X[obj1:obj2] is only valid if obj1 and
obj2 are both integers or long integers.  There is no way for obj1
and obj2 to tell Python that they could be reasonably used as
indexes into a sequence.  This is an unnecessary limitation.

In NumPy, for example, there are 8 different integer scalars
corresponding to unsigned and signed integers of 8, 16, 32, and 64
bits.  These type-objects could reasonably be used as integers in
many places where Python expects true integers but cannot inherit from 
the Python integer type because of incompatible memory layouts.  
There should be some way to be able to tell Python that an object can 
behave like an integer.

It is not possible to use the nb_int (and __int__ special method)
for this purpose because that method is used to *coerce* objects
to integers.  It would be inappropriate to allow every object that
can be coerced to an integer to be used as an integer everywhere
Python expects a true integer.  For example, if __int__ were used
to convert an object to an integer in slicing, then float objects
would be allowed in slicing and x[3.2:5.8] would not raise an error
as it should.

Proposal
 
Add an nb_index slot to PyNumberMethods, and a corresponding
__index__ special method.  Objects could define a function to place
in the nb_index slot that returns an appropriate C-integer (Py_ssize_t
after PEP 353).  This C-integer will be used whenever Python needs
one such as in PySequence_GetSlice, PySequence_SetSlice, and
PySequence_DelSlice.  

Specification:

1) The nb_index slot will have the signature

   Py_ssize_t index_func (PyObject *self)

2) The __index__ special method will have the signature

   def __index__(self):
   return obj
   
   where obj must be either an int or a long. 

3) A new C-API function PyNumber_Index will be added with signature

   Py_ssize_t PyNumber_Index (PyObject *obj)

   which will return obj->ob_type->tp_as_number->nb_index(obj) if it is
   available.  A -1 will be returned and an exception set on an error. 

4) A new operator.index(obj) function will be added that calls the
   equivalent of obj.__index__() and raises an error if obj does not
   implement the special method.
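
To make the intent concrete, here is a small illustrative sketch (the
class is hypothetical, not part of the PEP) of how an object would
participate once the proposal is in place:

import operator

class MyIndex(object):
    # A toy integer-like type; only __index__ makes it usable for slicing.
    def __init__(self, value):
        self.value = value

    def __index__(self):
        # Must return a true integer; Python calls this whenever it
        # needs an actual index, e.g. inside slice syntax.
        return int(self.value)

seq = list(range(10))
print(seq[MyIndex(2):MyIndex(6)])    # [2, 3, 4, 5]
print(operator.index(MyIndex(7)))    # 7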
   
Implementation Plan

1) Add the nb_index slot in object.h and modify typeobject.c to 
   create the __index__ method

2) Change the ISINT macro in ceval.c to ISINDEX and alter it to 
   accommodate objects with the index slot defined.

3) Change the _PyEval_SliceIndex function to accommodate objects
   with the index slot defined.

4) Change all builtin objects (e.g. lists) that use the as_mapping 
   slots for subscript access and use a special-check for integers to 
   check for the slot as well.

5) Add the nb_index slot to integers and long_integers.

6) Add PyNumber_Index C-API to return an integer from any 
   Python Object that has the nb_index slot.  

7) Add the operator.index(x) function.


Discussion Questions

Speed: 

Implementation should not slow down Python because integers and long
integers used as indexes will complete in the same number of
instructions.  The only change will be that what used to generate
an error will now be acceptable.

Why not use nb_int which is already there?

The nb_int method is used for coercion and so means something
fundamentally different than what is requested here.  This PEP
proposes a method for something that *can* already be thought of as
an integer to communicate that information to Python when it needs an
integer.  The biggest example of why using nb_int would be a bad
thing is that float objects already define the nb_int method, but
float objects *should not* be used as indexes in a sequence.

Why the name __index__?

Some

Re: cannot install scipy

2006-02-06 Thread Travis Oliphant
Robert Kern wrote:
 nitro wrote:
 
Hi,

I am using a Debian system. I installed NumPy and everything works
well. When I try to install SciPy, I get the following error. Any help
would be appreciated.

===
[EMAIL PROTECTED]:~/scipy/scipy-0.4.4$ python setup.py install
import core -> failed:
/usr/lib/python2.3/site-packages/numpy/core/multiarray.so: undefined
symbol: PyOS_ascii_strtod
import random -> failed: 'module' object has no attribute 'dtype'
import lib -> failed:
/usr/lib/python2.3/site-packages/numpy/core/multiarray.so: undefined
symbol: PyOS_ascii_strtod
Fatal Python error: can't initialize module lapack_lite
Aborted
[EMAIL PROTECTED]:~/scipy/scipy-0.4.4$

Actually, the numpy install did not go well.  Try

  import numpy

from a standard Python shell.

It looks like you are using Python 2.3 and NumPy 0.9.4 built without the 
required Python 2.3 patch.

-Travis

-- 
http://mail.python.org/mailman/listinfo/python-list


SciPy Core (Numeric replacement) version 0.6.1 released

2005-11-13 Thread Travis Oliphant
Background:

Numeric  is an add-on Python module that has seen widespread adoption.  
It enables Python to be used as a Scientific Computing Environment 
similar to MATLAB or IDL.  Numeric was originally written nearly 10 
years ago, and while still performing admirably, needed much updating to 
take advantage of the new features in Python and to remove old warts.

SciPy Core 0.6.1

SciPy Core is a new system which builds on the code-base of Numeric, but 
implements new features (such as advanced index-selection and user-settable 
error modes).  There are over 25 major new feature enhancements.  The 
LICENSE is still a BSD style License---the same as old Numeric.  More 
information can be found at the web-site: http://numeric.scipy.org
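
Purely as illustration of two of those features (written with today's 
NumPy names, which may differ slightly from this release):

import numpy as N    # scipy_core was later renamed to numpy

a = N.arange(10.0)

# advanced index-selection: pick elements with an index list or boolean mask
print(a[[1, 3, 5]])
print(a[a > 6.5])

# user-settable error modes: e.g. warn (or raise, or ignore) on divide-by-zero
old = N.seterr(divide='warn')
print(N.array([1.0]) / 0.0)
N.seterr(**old)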

<P><A HREF="http://numeric.scipy.org">SciPy Core 0.6.1</A> - Replacement 
for Numeric Python. (12-Nov-05)
-- 
http://mail.python.org/mailman/listinfo/python-announce-list

Support the Python Software Foundation:
http://www.python.org/psf/donations.html


SciPy 0.4.3 released (built against 0.6.1 of scipy_core)

2005-11-13 Thread Travis Oliphant
Background:

Full scipy builds on top of scipy_core to provide many more tools for 
computational science and engineering.  Included are tools for 
optimization, integration (including ode solvers), signal processing, 
sparse matrices, complete FFTs, complete linear algebra, statistical 
functions, input and output routines, interpolation, integration, and 
many special functions.

SciPy 0.4.3

This version is the first release to build on top of the new scipy_core 
(v 0.6.1).  The code is relatively stable, but there may be some 
lingering bugs from the transition from Numeric.  Please report any 
errors you find.  The LICENSE is a BSD style License---the same as 
scipy_core.  More information can be found (some of which is dated) at  
http://www.scipy.org.  The sourceforge site where it can be downloaded 
is http://sourceforge.net/projects/scipy.

<P><A HREF="http://www.scipy.org">SciPy (full) 0.4.3</A> - Extension 
modules for scipy_core (12-Nov-05)
-- 
http://mail.python.org/mailman/listinfo/python-announce-list

Support the Python Software Foundation:
http://www.python.org/psf/donations.html


ANN: SciPy Core (Numeric Python Replacement) Version 0.4.X (beta) released

2005-09-30 Thread Travis Oliphant

Background:

Numeric  is an add-on Python module that has seen widespread adoption.  
It enables Python to be used as a Scientific Computing Environment 
similar to MATLAB or IDL.  Numeric was originally written nearly 10 
years ago, and while still performing admirably, needed much updating to 
take advantage of the new features in Python and to remove old warts.

SciPy Core 0.4.1 (beta)

SciPy Core is a new system which builds on top of Numeric, but 
implements new features (such as advanced index-selection and user-settable 
error modes).  There are over 20 major new features over Numeric.  The 
LICENSE is still a BSD style License---the same as old Numeric.  More 
information can be found at the web-site: http://numeric.scipy.org


The primary developer of scipy core (besides the original creators of 
Numeric upon which it is based) is Travis Oliphant 
([EMAIL PROTECTED]), but his work received ideas and support from a 
wide cast of community members including:  Pearu Peterson, Robert Kern, 
Perry Greenfield, Eric Jones, John Hunter, Fernando Perez, Konrad 
Hinsen, and Paul Dubois.  These individuals should not be held 
responsible for any bugs remaining in the code.



-- 
http://mail.python.org/mailman/listinfo/python-list


Distutils spawn on unix acting strange

2005-03-10 Thread Travis Oliphant
I have a normal looking setup.py file with a single extension module.  
When distutils runs (python setup.py build),  the module compiles fine, 
but an error is issued that seems to indicate that gcc is being called 
with a blank input file (and gives an error). 

It appears that the spawn process inside of distutils is generating two 
calls: one that succeeds in compiling the module (it takes a while to 
compile) and another that is giving an error. 

Here is a typical output:
running install
running build
running build_py
creating build
creating build/lib.linux-i686-2.3
creating build/lib.linux-i686-2.3/ndarray
copying Lib/numeric.py -> build/lib.linux-i686-2.3/ndarray
copying Lib/numeric_version.py -> build/lib.linux-i686-2.3/ndarray
copying Lib/numerictypes.py -> build/lib.linux-i686-2.3/ndarray
copying Lib/array_printer.py -> build/lib.linux-i686-2.3/ndarray
copying Lib/__init__.py -> build/lib.linux-i686-2.3/ndarray
running build_ext
building 'ndarray/multiarray' extension
creating build/temp.linux-i686-2.3
creating build/temp.linux-i686-2.3/Src
cc -fno-strict-aliasing -DNDEBUG -O2 -fomit-frame-pointer -pipe 
-march=i586 -mtune=pentiumpro -g -fPIC 
-DSIZEOF_LONG_DOUBLE=12 -IInclude -I/usr/include/python2.3 -c 
Src/multiarraymodule.c -o 
build/temp.linux-i686-2.3/Src/multiarraymodule.o
cc: : No such file or directory
[snip]
error: command 'cc' failed with exit status 1

The error is apparently coming from cc (gcc), which states "No such file 
or directory", but there is no file given, so apparently cc is being 
called with a blank file (not just no file, but a blank file). 

The trouble is, the module is actually compiling fine (I can run python 
setup.py install again and
it finds the recent build and goes forward).  I also don't get the 
mysterious error when I just cut-and-paste the compile line.
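
One way to see exactly what distutils is handing to cc would be to drop 
a snippet like this at the top of setup.py; it should print every 
argument list right before it is spawned (a debugging sketch only):

import distutils.ccompiler

_orig_spawn = distutils.ccompiler.CCompiler.spawn
def _logging_spawn(self, cmd):
    print(cmd)      # shows each argument, including any empty '' entry
    return _orig_spawn(self, cmd)
distutils.ccompiler.CCompiler.spawn = _logging_spawn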

I am very confused.  Has anyone seen this or anything like this before?  
Any help appreciated.

-Travis Oliphant
--
http://mail.python.org/mailman/listinfo/python-list


Nevow examples

2005-02-24 Thread Travis Oliphant
There was a request for nevow examples.  Nevow is a fantastic 
web-development framework for Python.

I used nevow to create http://www.scipy.org/livedocs/
This site uses nevow and self-introspection to produce (live) 
documentation for scipy based on the internal docstrings.  It would be 
nice to add the capability for users to update the documentation through 
the web-site, but that functionality is not complete.

The code itself is available in the util directory of scipy, which can be 
checked out of CVS (or browsed).  Go to http://www.scipy.org for more 
details.

-Travis Oliphant
--
http://mail.python.org/mailman/listinfo/python-list