On Wed, Jul 8, 2009 at 10:39 PM, a...@ajackson.org wrote:
On Fri, Jul 3, 2009 at 10:21 PM, Alan Jackson a...@ajackson.org wrote:
I don't see any problem here. If you can replicate your results, we
would need more information about the versions.
Josef
>>> np.version.version
'1.3.0'
2009/7/9 Pauli Virtanen pav...@iki.fi:
On 2009-07-08, Stéfan van der Walt ste...@sun.ac.za wrote:
I know very little about cache optimality, so excuse the triviality of
this question: Is it possible to design this loop optimally (taking
into account certain build-time measurable parameters),
2009/7/9 Citi, Luca lc...@essex.ac.uk:
Hello
The problem is not PyArray_Conjugate itself.
The problem is that whenever you call a function from the C side
and one of the inputs has ref_count 1, it can be overwritten.
This is not a problem from the python side because if the
ufunc sees a
Hello all,
(resending for the Nth time, as the previous attempts
didn't make it to the list)
I'm new to this list (and numpy is mostly new to me :-).
Using python 2.6 and numpy 1.3.
My plan is to write some C extensions that will perform
rather specialised processing on multichannel
On 9-Jul-09, at 1:12 AM, Mag Gam wrote:
Here is what I have, which does it 1x1:
import csv

z = {}  # dictionary of per-key counters
r = csv.reader(file)
for i, row in enumerate(r):
    p = "/MIT/" + row[1]
    if p not in z:
        z[p] = 0
    else:
        z[p] += 1
    arr[p]['chem'][z[p]] = tuple(row)  # this loads the array 1 x 1
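For the counting step alone, a `collections.defaultdict` avoids the explicit membership test; a minimal sketch, with made-up sample rows standing in for the CSV contents:

```python
from collections import defaultdict

# sample rows standing in for the CSV contents (values are made up)
rows = [('id1', 'chem'), ('id2', 'chem'), ('id3', 'bio')]

counts = defaultdict(lambda: -1)  # first occurrence of a key ends up at 0
for row in rows:
    p = "/MIT/" + row[1]
    counts[p] += 1
```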
I would like to
Thu, 09 Jul 2009 10:00:25 +0200, Matthieu Brucher wrote:
2009/7/9 Citi, Luca lc...@essex.ac.uk:
Hello
The problem is not PyArray_Conjugate itself. The problem is that
whenever you call a function from the C side and one of the inputs has
ref_count 1, it can be overwritten. This is not a
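The buffer reuse under discussion can be imitated by hand from the Python side; a sketch of what the refcount-1 optimization would do automatically for an expression like `(a + b) + c` (variable names are illustrative):

```python
import numpy as np

a = np.ones(5)
b = np.full(5, 2.0)
c = np.full(5, 3.0)

# (a + b) returns a temporary; the proposed optimization would let the
# next ufunc write into that refcount-1 buffer instead of allocating
# a new output array. Done explicitly here via the out= argument:
tmp = np.add(a, b)
res = np.add(tmp, c, out=tmp)
assert res is tmp  # the temporary's buffer was reused
```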
Matthieu Brucher wrote:
Unfortunately, this is not possible. We've been playing with blocking
loops for a long time in finite difference schemes, and it is always
compiler dependent
You mean CPU dependent, right? I can't see how a reasonable optimizing
compiler could make a big difference on
On 8-Jul-09, at 6:16 PM, Pauli Virtanen wrote:
Just to tickle some interest, a pathological case before
optimization:
In [1]: import numpy as np
In [2]: x = np.zeros((8, 256))
In [3]: %timeit x.sum(axis=0)
10 loops, best of 3: 850 ms per loop
After optimization:
In
2009/7/9 David Cournapeau da...@ar.media.kyoto-u.ac.jp:
Matthieu Brucher wrote:
Unfortunately, this is not possible. We've been playing with blocking
loops for a long time in finite difference schemes, and it is always
compiler dependent
You mean CPU dependent, right? I can't see how a
Hello Pauli,
excuse me if I insist, PyArray_Conjugate is not the problem.
If, when using the numpy API, something like the following is accepted:
obj1 = PyArray_CreateSomehowAnArray();
obj2 = PyArray_DoSomethingWithArray(obj1,...);
obj3 = PyArray_DoSomethingElseWithArray(obj1,...);
Py_DECREF(obj1);
Thu, 09 Jul 2009 09:54:26 +0200, Matthieu Brucher wrote:
2009/7/9 Pauli Virtanen pav...@iki.fi:
[clip]
I'm still kind of hoping that it's possible to make some minimal
assumptions about CPU caches in general, and have a rule that decides a
code path that is good enough, if not optimal.
Thu, 09 Jul 2009 10:03:47 +0100, Citi, Luca wrote:
[clip]
Excuse me if I insist, PyArray_Conjugate is not the problem. If, when
using the numpy API, something like the following is accepted:
obj1 = PyArray_CreateSomehowAnArray();
obj2 = PyArray_DoSomethingWithArray(obj1,...);
obj3 =
The problem is the array is very large. We are talking about 200+ million rows.
On Thu, Jul 9, 2009 at 4:41 AM, David Warde-Farley d...@cs.toronto.edu wrote:
On 9-Jul-09, at 1:12 AM, Mag Gam wrote:
Here is what I have, which does it 1x1:
z={} #dictionary
r=csv.reader(file)
for i,row in
Hi all,
I am not a newbie to python and numpy, but somehow I cannot
find a proper solution for my interpolation problem without coding it
explicitly myself.
All I want to do is to increase the resolution of a three-dimensional array.
I have a volume 'a'
a = numpy.random.rand(3,3,3)
Hi,
you can have a look at the method interp2d of scipy.interpolate.
I think it is what you are looking for.
Best,
Luca
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
You might also want to look into scipy.ndimage.zoom.
Zach
On Jul 9, 2009, at 9:42 AM, Thomas Hrabe wrote:
Hi all,
I am not a newbie to python and numpy, but somehow I cannot
find a proper solution for my interpolation problem without coding it
explicitly myself.
All I want to do
On Thu, Jul 09, 2009 at 09:35:06AM +, Pauli Virtanen wrote:
Thu, 09 Jul 2009 10:03:47 +0100, Citi, Luca wrote:
[clip]
Excuse me if I insist, PyArray_Conjugate is not the problem. If, when
using the numpy API, something like the following is accepted:
obj1 =
scipy.ndimage.zoom is exactly what you're looking for, as Zach Pincus
already said.
As far as I know, numpy doesn't have any 3D interpolation routines, so
you'll have to install scipy. Interp2d will only interpolate slices of your
data, not the whole volume.
-Joe
On Thu, Jul 9, 2009 at 8:42 AM,
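The zoom-based upsampling being recommended can be sketched as follows for the 3x3x3 volume from the original question (the factor of 2 is illustrative, and scipy is assumed installed):

```python
import numpy as np
from scipy import ndimage

a = np.random.rand(3, 3, 3)  # the volume from the question
b = ndimage.zoom(a, 2)       # spline-interpolated upsampling (order=3 by default)
print(b.shape)               # (6, 6, 6): resolution doubled along each axis
```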
Yep,
that's the one.
Unfortunately, installing scipy makes my project dependent on another
package. However, I installed it and it works...
Thank you
2009/7/9 Joe Kington jking...@wisc.edu:
scipy.ndimage.zoom is exactly what you're looking for, as Zach Pincus
already said.
As far as I know,
Hello Gaël,
I think it might be an option.
Also one could have an internal flag which says whether or not it is safe
to overwrite inputs with ref_count=1.
Then import_array() sets this flag to unsafe (i.e. current behaviour).
If the user of the numpy C-api is aware of how the new feature works,
On Thu, Jul 9, 2009 at 10:30, Citi, Luca lc...@essex.ac.uk wrote:
Hello Gaël,
I think it might be an option.
Also one could have an internal flag which says whether or not it is safe
to overwrite inputs with ref_count=1.
Then import_array() sets this flag to unsafe (i.e. current behaviour).
If
On Thu, Jul 9, 2009 at 04:35, Pauli Virtanen p...@iki.fi wrote:
Thu, 09 Jul 2009 10:03:47 +0100, Citi, Luca wrote:
[clip]
Excuse me if I insist, PyArray_Conjugate is not the problem. If, when
using the numpy API, something like the following is accepted:
obj1 = PyArray_CreateSomehowAnArray();
On Thu, Jul 09, 2009 at 10:41:38AM -0500, Robert Kern wrote:
We could change ufunc_generic_call() (i.e. the C implementation of
ufunc.__call__) to use a new function like PyUFunc_GenericFunction
except with the refcount-1 semantics. This allows old C-level to
remain unchanged but let Python
Still speaking of particles, has anyone seen Nvidia's CUDA-powered particle
system demo?
http://www.youtube.com/watch?v=RqduA7myZok
I have a CUDA-supported graphics card on my laptop, and tested it on the actual
hardware. Seems very cool. However, it has way too many lines of code for that
piece of
The ndimage package can be accessed as numpy.numarray.nd_image. scipy is not
needed
Nadav
-----Original Message-----
From: numpy-discussion-boun...@scipy.org on behalf of Thomas Hrabe
Sent: Thu 09-Jul-09 17:57
To: Discussion of Numerical Python
Subject: Re: [Numpy-discussion] interpolation in numpy
Yep,
2009/7/9 Nadav Horesh nad...@visionsense.com:
The ndimage package can be accessed as numpy.numarray.nd_image. scipy is not
needed
numpy.numarray.nd_image just imports from scipy.ndimage or a
standalone ndimage if you have built it so. numpy does not contain the
ndimage code.
--
Robert Kern
On Thu, Jul 09, 2009 at 04:30:11PM +0100, Citi, Luca wrote:
Also one could have an internal flag which says whether or not it is safe
to overwrite inputs with ref_count=1.
Then import_array() sets this flag to unsafe (i.e. current behaviour).
If the user of the numpy C-api is aware of how the
There's a wrapper called PyCUDA, but you actually write the CUDA code as a
docstring and it's compiled and executed at run time.
I think it can be done in a more pythonic way.
On Thu, Jul 9, 2009 at 12:31 PM, Gökhan SEVER gokhanse...@gmail.com wrote:
Still speaking of particles, has anyone seen Nvidia's
On Thu, Jul 9, 2009 at 11:44, Fons Adriaensen f...@kokkinizita.net wrote:
On Thu, Jul 09, 2009 at 04:30:11PM +0100, Citi, Luca wrote:
Also one could have an internal flag which says whether or not it is safe
to overwrite inputs with ref_count=1.
Then import_array() sets this flag to unsafe (i.e.
This showed up on numpy-discussion. I posted this morning and it hasn't
appeared yet.
Ian
Nadav Horesh wrote:
--
Ian
[take 6 on sending this -- I'm subscribed to numpy-discuss, but this
post refuses to show up]
I have an large array consisting of 5-tuples. I'd like to select the
first and second columns in order to produce a scatter plot. Each tuple
consists of mixed types (floats and strings). The Matlab
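With mixed float and string columns, a NumPy structured array lets you pull out the two numeric fields by name; a minimal sketch (the field names, dtype, and sample values are assumptions, not from Ian's data):

```python
import numpy as np

# hypothetical structured dtype for the 5-tuples
dt = [('x', float), ('y', float), ('s1', 'U8'), ('s2', 'U8'), ('s3', 'U8')]
arr = np.array([(1.0, 2.0, 'a', 'b', 'c'),
                (3.0, 4.0, 'd', 'e', 'f')], dtype=dt)

xs, ys = arr['x'], arr['y']  # the two float columns for the scatter plot
```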
Let me see if I understand correctly...
what you suggest is something like:
1) adding an argument flag to construct_arrays
that enables/disables the feature
2) adding the same argument flag to construct_loop which
is passed untouched to construct_arrays
3) set the flag to disable in the
Make sure you send the message as plain text, rather than with rich
formatting, and it should show up right away.
Mark
On Thu, Jul 9, 2009 at 11:18 AM, Ian Stokes-Rees
ijsto...@crystal.harvard.edu wrote:
[take 6 on sending this -- I'm subscribed to numpy-discuss, but this
post refuses to show
On 2009-07-09, Citi, Luca lc...@essex.ac.uk wrote:
Let me see if I understand correctly...
what you suggest is something like:
1) adding an argument flag to construct_arrays
that enables/disables the feature
2) adding the same argument flag to construct_loop which
is passed untouched to
On Thu, Jul 09, 2009 at 12:00:23PM -0500, Robert Kern wrote:
On Thu, Jul 9, 2009 at 11:44, Fons Adriaensen f...@kokkinizita.net wrote:
There is a simple rule which says that if you use an object
pointer as a function argument you must INCREF it. This is
just the logical consequence of
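The INCREF rule Fons describes is CPython's general ownership model, and its effect is visible even from the Python side; a minimal illustration:

```python
import sys

a = []
b = a  # take a second reference to the same list
# sys.getrefcount counts its own argument as one extra, temporary
# reference, so an object referenced by `a` and `b` reports at least 3
print(sys.getrefcount(a))
```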
On Thu, Jul 9, 2009 at 15:20, Fons Adriaensen f...@kokkinizita.net wrote:
On Thu, Jul 09, 2009 at 12:00:23PM -0500, Robert Kern wrote:
On Thu, Jul 9, 2009 at 11:44, Fons Adriaensen f...@kokkinizita.net wrote:
There is a simple rule which says that if you use an object
pointer as a function
I tried to implement what you suggest.
The patch is in the ticket page.
say i have an Nx4 array of points and I want to dot every [n, :] 1x4
slice with a 4x4 matrix.
Currently I am using apply_along_axis in the following manner:

def func(row, mat):
    return np.dot(mat, row)

np.apply_along_axis(func, 1, arr, mat)

Is there a more efficient way of doing this?
On Thu, Jul 9, 2009 at 7:08 PM, Chris Colbert sccolb...@gmail.com wrote:
say i have an Nx4 array of points and I want to dot every [n, :] 1x4
slice with a 4x4 matrix.
Currently I am using apply_along_axis in the following manner:
def func(row, mat):
    return np.dot(mat, row)
no, because dot(x,y) != dot(y,x)
x = np.random.rand(3,4)
y = np.random.rand(4,4)
np.dot(x, y)
array([[ 1.67624043, 1.66719374, 1.72465017, 1.20372021],
[ 0.70046162, 0.60187869, 0.73094349, 0.4604766 ],
[ 0.78707401, 1.01959666, 0.61617829, 0.43147398]])
np.dot(y, x[0,:
On Thu, Jul 9, 2009 at 9:36 PM, Chris Colbert sccolb...@gmail.com wrote:
no, because dot(x,y) != dot(y,x)
Try this, then:
In [24]: x = np.random.rand(10,4)
In [25]: y = np.random.rand(4,4)
In [26]: result = np.dot(y,x.T).T
In [39]: for i, res in enumerate(result):
: assert
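Kurt's one-shot replacement for the per-row loop can be written out in full; a sketch verifying that the single matrix product matches the row-by-row dot calls:

```python
import numpy as np

x = np.random.rand(10, 4)  # N points, one per row
y = np.random.rand(4, 4)   # the 4x4 matrix

# one matrix product replaces N separate np.dot(y, x[i]) calls
result = np.dot(y, x.T).T  # shape (10, 4)

# same thing, row by row
for i, res in enumerate(result):
    assert np.allclose(res, np.dot(y, x[i]))
```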
hey,
great man! thanks!
I had thought that it may have been possible with a single dot, but
how to do it escaped me.
Thanks again!
Chris
On Thu, Jul 9, 2009 at 11:45 PM, Kurt Smith kwmsm...@gmail.com wrote:
On Thu, Jul 9, 2009 at 9:36 PM, Chris Colbert sccolb...@gmail.com wrote:
no, because
If i have two arrays representing start points and end points, is
there a function that will return a 2d array where each row is the
range(start, end, n) where n is a fixed number of steps and is the
same for all rows?
actually what would be better is if i can pass two 1d arrays X and Y
both size Nx1
and get back a 2d array of size NxM where the [n,:] row is the linear
interpolation of X[n] to Y[n]
On Fri, Jul 10, 2009 at 1:16 AM, Chris Colbert sccolb...@gmail.com wrote:
If i have two arrays representing
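The row-wise interpolation Chris is asking for falls out of broadcasting; a minimal sketch (the start/end values and M, the number of steps, are illustrative):

```python
import numpy as np

X = np.array([0.0, 10.0, -1.0])  # start points, shape (N,)
Y = np.array([1.0, 20.0, 1.0])   # end points, shape (N,)
M = 5                            # fixed number of steps per row

t = np.linspace(0.0, 1.0, M)              # shape (M,)
rows = X[:, None] + (Y - X)[:, None] * t  # shape (N, M) via broadcasting
# row n is the linear interpolation from X[n] to Y[n]
```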