On 06/22/2010 02:58 PM, josef.p...@gmail.com wrote:
On Tue, Jun 22, 2010 at 10:09 AM, Tom Durrant thdurr...@gmail.com wrote:
the basic idea is in "polyfit on multiple data points" on the
numpy-discussion mailing list, April 2009
In this case, calculations have to be done by groups
subtract mean (this needs to be replaced by group demeaning)
modeldm = model - model.mean()
obsdm = obs - obs.mean()
xx =
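A minimal sketch of the group-demeaning idea suggested above (the data, group labels, and variable names here are illustrative, not from the thread): subtract each group's mean rather than the global mean, then fit a single pooled slope on the demeaned values.

```python
import numpy as np

# Hypothetical example data: obs/model values with an integer group
# label per observation (e.g. one label per lat/lon grid box).
obs = np.array([1.0, 2.0, 3.0, 2.0, 4.0, 6.0])
model = np.array([1.1, 2.0, 3.2, 2.2, 3.9, 6.1])
groups = np.array([0, 0, 0, 1, 1, 1])

# Group demeaning: subtract each group's mean instead of the global mean.
n = np.bincount(groups)
obs_dm = obs - (np.bincount(groups, weights=obs) / n)[groups]
model_dm = model - (np.bincount(groups, weights=model) / n)[groups]

# Pooled slope of model on obs after removing group means
# (equivalent to a fixed-effects regression with one common slope).
beta1 = np.dot(obs_dm, model_dm) / np.dot(obs_dm, obs_dm)
```

The bincount-with-weights trick computes all group means at once, so no Python loop over groups is needed.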
What exactly are you trying to fit? It is rather bad practice to fit
a model to summarized data, as you lose the uncertainty in the
original data.
If you define your boxes, you can loop through directly on each box and
even fit the equation:
model = mu + beta1*obs
The extension is
On 06/20/2010 03:24 AM, Tom Durrant wrote:
Hi All,
I have a problem involving lat/lon data. Basically, I am evaluating
numerical weather model data against satellite data, and trying to
produce gridded plots of various statistics. There are various steps
involved with this, but basically, I get to the point where I have
four arrays of
On Sun, Jun 20, 2010 at 4:24 AM, Tom Durrant thdurr...@gmail.com wrote:
Hi All,
I have a problem involving lat/lon data. Basically, I am evaluating
numerical weather model data against satellite data, and trying to produce
gridded plots of various statistics. There are various steps involved
are you doing something like np.polyfit(model, obs, 1) ?
If you are using polyfit with deg=1, i.e. fitting a straight line,
then this could be also calculated using the weights in histogram2d.
histogram2d (histogramdd) uses np.digitize and np.bincount, so I'm
surprised if the
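The weighted-histogram idea described above can be sketched as follows (toy data; the cell edges and variable names are illustrative): a histogram2d weighted by the data values gives per-cell sums, an unweighted one gives per-cell counts, and their ratio is the per-cell mean.

```python
import numpy as np

# Toy scattered data: two points fall in cell (0, 0), two in cell (2, 1).
lon = np.array([0.5, 0.6, 2.5, 2.5])
lat = np.array([0.5, 0.5, 1.5, 1.5])
vals = np.array([1.0, 3.0, 5.0, 7.0])

lon_edges = np.arange(0.0, 11.0)   # 1-degree cells, illustrative extent
lat_edges = np.arange(0.0, 6.0)

# Weighted histogram gives per-cell sums, unweighted gives per-cell
# counts; their ratio is the per-cell mean.
sums, _, _ = np.histogram2d(lon, lat, bins=[lon_edges, lat_edges],
                            weights=vals)
counts, _, _ = np.histogram2d(lon, lat, bins=[lon_edges, lat_edges])
with np.errstate(invalid='ignore'):
    cell_means = sums / counts     # NaN in cells with no data
```

Higher moments (for a per-cell std) can be had the same way by passing vals**2 as the weights.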
Hello Andreas,
please see this as a side remark.
A colleague of mine made me aware of a very beautiful way of
covering a sphere with evenly spaced points:
http://healpix.jpl.nasa.gov/
Since you want to calculate mean and stddev, to my understanding a
grid in longitude/latitude is without
On Wed, Jun 2, 2010 at 11:40 AM, Andreas Hilboll li...@hilboll.de wrote:
Hi there,
I'm interested in the solution to a special case of the parallel thread
'2D binning', which is going on at the moment. My data is on a fine global
grid, say .125x.125 degrees. I'm looking for a way to do
On 06/02/2010 04:52 AM, josef.p...@gmail.com wrote:
On Tue, Jun 1, 2010 at 9:57 PM, Zachary Pincus zachary.pin...@yale.edu
wrote:
I guess it's as fast as I'm going to get. I don't really see any
other way. (BTW, the lat/lons are integers)
You could (in c or cython) try a brain-dead hashtable
Why not simply use a set?
uniquePoints = set(zip(lats, lons))
Ben Root
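Building on the set/zip suggestion above, a plain-Python sketch that also accumulates the average per unique (lat, lon) pair (the inputs and names here are toy examples, not from the thread):

```python
from collections import defaultdict

# Toy inputs: integer lat/lon lists with one repeated pair.
lats = [10, 10, 20]
lons = [30, 30, 40]
data = [1.0, 3.0, 5.0]

# The set gives the unique points...
unique_points = set(zip(lats, lons))

# ...and a dict of running (sum, count) pairs gives the per-pair average.
acc = defaultdict(lambda: [0.0, 0])
for lat, lon, val in zip(lats, lons, data):
    acc[(lat, lon)][0] += val
    acc[(lat, lon)][1] += 1
averages = {pt: s / n for pt, (s, n) in acc.items()}
```

This still loops in Python, but the hashing makes it a single O(n) pass regardless of how many distinct pairs there are.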
Hi there,
I'm interested in the solution to a special case of the parallel thread
'2D binning', which is going on at the moment. My data is on a fine global
grid, say .125x.125 degrees. I'm looking for a way to do calculations on
coarser grids, e.g.
* calculate means()
* calculate std()
* ...
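For the regular fine-grid case described above, when the coarse cell is an integer multiple of the fine one, the block statistics can be computed with a single reshape (a sketch with a toy grid size and a factor of 4, not tied to the .125-degree grid specifically):

```python
import numpy as np

# Toy fine grid; assume the coarse grid is an integer multiple of it,
# here 4x4 fine cells per coarse cell.
fine = np.arange(8 * 12, dtype=float).reshape(8, 12)
f = 4  # coarsening factor

# Reshape so each coarse cell's fine values sit on axes 1 and 3,
# then reduce over those axes.
blocks = fine.reshape(8 // f, f, 12 // f, f)
coarse_mean = blocks.mean(axis=(1, 3))
coarse_std = blocks.std(axis=(1, 3))
```

Any reduction (sum, min, max, ...) works the same way on the reshaped array, so no looping over coarse cells is needed.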
Thanks. I am also getting an error in ndi.mean.
Were you getting the error
RuntimeError: data type not supported?
-Mathew
I'm on Windows, using a precompiled binary. I never built numpy/scipy on
Windows.
On Wed, Jun 2, 2010 at 2:09 PM, Mathew Yeates mat.yea...@gmail.com wrote:
I'm on Windows, using a precompiled binary. I never built numpy/scipy on
Windows.
The ndimage measurements code has been recently rewritten. ndimage is
very fast, but the old version has insufficient type checking and may crash
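For reference, the ndi.mean call under discussion groups values by an integer label array (a sketch with toy data, assuming SciPy is available; casting to float64 sidesteps the dtype error mentioned above):

```python
import numpy as np
from scipy import ndimage as ndi

# Toy data: values with an integer label per unique lat/lon cell.
data = np.array([1.0, 3.0, 5.0, 7.0])
labels = np.array([0, 0, 1, 1])

# ndi.mean computes the mean of `data` over each label listed in `index`.
# Passing float64 data avoids dtype-support issues in older ndimage.
means = ndi.mean(data.astype(np.float64), labels=labels, index=[0, 1])
```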
Nope. This version didn't work either.
If you're on Python 2.6 the binary on here might work for you:
http://www.lfd.uci.edu/~gohlke/pythonlibs/
It looks recent enough to have the rewritten ndimage
On 1/06/2010 10:51 PM, Wes McKinney wrote:
snip
This is a pretty good example of the group-by problem that will
hopefully work its way into a future edition of NumPy.
Wes (or anyone else), please can you elaborate on any plans for groupby?
I've made my own modification to numpy.bincount
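The bincount-based group-by mentioned here can be sketched as follows (toy labels and values; how the labels are constructed is up to the caller):

```python
import numpy as np

# Toy data with integer group labels 0..2.
labels = np.array([0, 1, 0, 2, 1])
values = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# bincount with weights gives per-group sums; without weights, counts.
group_sums = np.bincount(labels, weights=values)
group_counts = np.bincount(labels)
group_means = group_sums / group_counts
```

This is the same primitive histogram2d uses internally, exposed directly: one pass for the sums, one for the counts.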
Hi
Can anyone think of a clever (non-looping) solution to the following?
I have a list of latitudes, a list of longitudes, and a list of data values.
All lists are the same length.
I want to compute an average of data values for each lat/lon pair, e.g. if
lat[1001] lon[1001] = lat[2001] [lon
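One non-looping way to do this with a modern NumPy (a sketch with toy inputs; np.unique over stacked pairs builds the group labels, then bincount averages):

```python
import numpy as np

# Toy inputs: integer lats/lons with one repeated pair.
lats = np.array([10, 10, 20, 10])
lons = np.array([30, 30, 40, 30])
data = np.array([1.0, 3.0, 5.0, 5.0])

# Label each (lat, lon) row with an integer group id.
pairs = np.column_stack([lats, lons])
unique_pairs, inverse = np.unique(pairs, axis=0, return_inverse=True)
inverse = inverse.ravel()  # guard against shape changes across versions

# Average per unique pair without an explicit Python loop.
means = np.bincount(inverse, weights=data) / np.bincount(inverse)
```

unique_pairs[k] is the k-th distinct (lat, lon) pair and means[k] its average.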
I guess it's as fast as I'm going to get. I don't really see any other way.
(BTW, the lat/lons are integers)
-Mathew
I guess it's as fast as I'm going to get. I don't really see any
other way. (BTW, the lat/lons are integers)
You could (in c or cython) try a brain-dead hashtable with no
collision detection:
for lat, long, data in dataset:
bin = (lat ^ long) % num_bins
hashtable[bin] =
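Fleshed out in Python (the original sketch targets C/Cython; this toy version accumulates a (sum, count) per slot and, as the author warns, does nothing about collisions, so distinct pairs hashing to the same slot get merged; `long` is renamed `lon` to avoid shadowing the builtin):

```python
# A deliberately simple fixed-size hash accumulator, following the
# "brain-dead hashtable" sketch above: no collision handling.
num_bins = 1024
table_sum = [0.0] * num_bins
table_cnt = [0] * num_bins

dataset = [(10, 30, 1.0), (10, 30, 3.0), (20, 40, 5.0)]  # toy (lat, lon, value)

for lat, lon, value in dataset:
    slot = (lat ^ lon) % num_bins
    table_sum[slot] += value
    table_cnt[slot] += 1

# Mean per occupied slot.
slot_means = {i: table_sum[i] / table_cnt[i]
              for i in range(num_bins) if table_cnt[i]}
```

In C or Cython the two tables would be flat arrays, making the loop essentially as fast as the histogram-based approaches.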