> On Fri, Jun 27, 2025 at 5:29 PM Benjamin Root via NumPy-Discussion
> wrote:
I'm looking at a situation where I'd like to wrap a C++ function that takes
two doubles as inputs, and returns an error code, a position vector, and a
velocity vector so that I essentially would have a function signature of
(N), (N) -> (N), (N, 3), (N, 3). When I try to use np.vectorize() or
np.fromp
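For what it's worth, np.vectorize's `signature` argument can express this kind of multi-output layout. A minimal sketch, assuming the goal is (N), (N) -> (N), (N, 3), (N, 3); the scalar `core` function here is a made-up stand-in for the wrapped C++ routine:

```python
import numpy as np

def core(a, b):
    # hypothetical stand-in for the C++ function:
    # returns an error code plus 3-vectors for position and velocity
    err = 0
    pos = np.array([a, b, a + b])
    vel = np.array([a - b, a * b, b])
    return err, pos, vel

# scalar core mapped over N inputs; the output dim n is inferred (here, 3)
wrapped = np.vectorize(core, signature='(),()->(),(n),(n)')
err, pos, vel = wrapped(np.zeros(4), np.ones(4))
```

Note that np.vectorize is a convenience loop, not a compiled gufunc, so this sketch trades speed for the desired signature.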
Shasang,
My main concern is that if there is a legitimate bug in someone's code that
is causing mismatched array sizes, this can mask that bug in certain
situations where the mismatch just so happens to produce arrays of certain
shapes. I am intrigued by the idea, though.
Ben Root
On Tue, Mar 25
When I usually need to do something like that, I just construct a tuple of
slice() objects. No need to use swapaxes(). Or am I missing something?
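That is, something along these lines (illustrative names):

```python
import numpy as np

a = np.arange(24).reshape(2, 3, 4)
axis, index = 1, 2
# build an index tuple that selects `index` along `axis`,
# with full slices for every axis before it
key = (slice(None),) * axis + (index,)
picked = a[key]   # equivalent to a[:, 2] here
```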
On Sat, Feb 1, 2025 at 10:24 AM Michael Mullen
wrote:
> Hello,
>
> I am writing a class handling NumPy arrays, and basing it off some
> computations I
Seems to make sense to me? Or is the following a bug?
>>> import numpy as np
>>> u = np.zeros(5)
>>> v = np.ones(5)
>>> u
array([0., 0., 0., 0., 0.])
>>> u[...] = v
>>> u
array([1., 1., 1., 1., 1.])
>>> v[4] = 5
>>> v
array([1., 1., 1., 1., 5.])
>>> u
array([1., 1., 1., 1., 1.])
If you don't do a
LArray might also be useful to look at. I think there was a time when it
didn't use pandas, but it does have it as a dependency now.
https://github.com/larray-project/larray
I think this would be a really useful endeavor. The CDF data model is
extremely useful, and adopting even a piece of it wou
With regards to scheduling wheel builds (and this may already be completely
obvious), numpy should consider a scheduled build time that comes
reasonably before scipy's scheduled build time so that there is a quicker
turn-around time with possible breaking changes. If you schedule the numpy
whee
1, 2023 at 1:47 PM Benjamin Root
> wrote:
I'm really confused. Summing from zero should be what cumsum() does now.
```
>>> np.__version__
'1.22.4'
>>> np.cumsum([[1, 2, 3], [4, 5, 6]])
array([ 1, 3, 6, 10, 15, 21])
```
which matches your example in the cumsum0() documentation. Did something
change in a recent release?
Ben Root
I think it is the special values aspect that is most concerning. Math is
just littered with all sorts of identities, especially with trig functions.
While I know that floating point calculations are imprecise, there are
certain properties of these functions that still held, such as going from
-1 to
Just as a quick note, I find it *very* common and handy to do something
like:
someCount = (x > 5).sum()
which requires implicit upcasting of np.bool_ to integer. Just making sure
that usecase isn't forgotten, as it had to be mentioned the last time this
subject came up.
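For reference, the pattern in question, as a sketch:

```python
import numpy as np

x = np.array([3, 7, 9, 1])
# the boolean comparison result is upcast to integers before summing
some_count = (x > 5).sum()
```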
On Mon, Mar 21, 2022 at
I have found that a bunch of lapack functions seem to have arguments for
stating whether or not the given arrays are C or F ordered. Then you
wouldn't need to worry about handling the layout yourself. For example, I
have some C++ code like so:
extern "C" {
/**
* Forward declaration for LAPACK's
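On the numpy side, the layout being handed to such a routine can be inspected or converted without any manual shuffling; an illustrative sketch:

```python
import numpy as np

a = np.arange(6.0).reshape(2, 3)   # C (row-major) order, numpy's default
f = np.asfortranarray(a)           # F (column-major) copy, the layout LAPACK expects
# same logical values, different memory layout
c_ok = a.flags['C_CONTIGUOUS']
f_ok = f.flags['F_CONTIGUOUS']
```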
(something) in python is only needed if you need to change the order of
precedence or if you need to split something across 2 or more lines.
Otherwise, it has no meaning and it is extraneous.
On Tue, Oct 19, 2021 at 9:42 AM wrote:
> See the following testing in IPython shell:
>
> In [6]: import
Why not both? The definition of the enum might live in a proper namespace
location, but I see no reason why `np.copy.IF_NEEDED =
np.flags.CopyFlgs.IF_NEEDED` can't be done (I mean, adding the enum members
as attributes to the `np.copy()` function). Seems perfectly reasonable to
me, and reads pretty
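The attribute-attachment part is plain Python; a sketch with hypothetical stand-ins for `np.flags.CopyFlgs` and `np.copy()`:

```python
import enum

class CopyFlags(enum.Enum):
    # hypothetical stand-in for np.flags.CopyFlgs
    IF_NEEDED = 0
    ALWAYS = 1
    NEVER = 2

def copy(a, flag=CopyFlags.IF_NEEDED):
    # hypothetical stand-in for np.copy()
    return list(a)

# attach the enum members as attributes of the function itself
for member in CopyFlags:
    setattr(copy, member.name, member)
```

So `copy.IF_NEEDED` and `CopyFlags.IF_NEEDED` are the same object, and both spellings read naturally at the call site.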
> One reason was that Sebastian didn't like people doing `x.shape = ...`.
> Users do that, presumably, to trigger an error if a copy needs to be made.
Users do that because it is 1) easier than every other option, and 2) I am
pretty sure we were encouraged to do it this way for the past 10 years.
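For context, the in-place behavior being referred to, assuming current numpy semantics:

```python
import numpy as np

a = np.arange(6).reshape(2, 3).T   # transposed view: cannot be flattened in place
b = a.reshape(6)                   # reshape() silently makes a copy here
try:
    a.shape = (6,)                 # in-place shape change: raises instead of copying
    raised = False
except AttributeError:
    raised = True
```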
to be honest, I read "topk" as "topeka", but I am weird. While numpy
doesn't use underscores all that much, I think this is one case where it
makes sense.
I'd also watch out for the use of the term "sorted", as it may mean
different things to different people, particularly with regards to what its
not sure. However, in looking at the dates, it looks like
> that was fixed in scipy as of 2019.
>
> Would you recommend using the scipy save interface as opposed to the numpy
> one?
>
> On Fri, May 14, 2021 at 10:16 AM Benjamin Root
> wrote:
Perhaps it is a similar bug as this one?
https://github.com/scipy/scipy/issues/6999
Basically, it turned out that the CRC was getting computed on an unflushed
buffer, or something like that.
On Fri, May 14, 2021 at 10:05 AM Isaac Gerg wrote:
> I am using 1.19.5 on Windows 10 using Python 3.8.6
Isn't that just slicing? Or perhaps you are looking for a way to simplify
the calculation of the slice arguments from the original pad arguments?
On Mon, Apr 12, 2021 at 4:15 PM Jeff Gostick wrote:
> I often find myself padding an array to do some processing on it (i.e. to
> avoid edge artifacts
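Deriving the inverse slices from the pad widths is mostly bookkeeping; a sketch with a hypothetical `unpad` tuple:

```python
import numpy as np

a = np.arange(12.0).reshape(3, 4)
pad = ((2, 3), (1, 4))             # (before, after) widths per axis
p = np.pad(a, pad)
# one slice per axis: start past the left pad, stop short of the right pad
# (a zero right-pad must become None, since slice(2, 0) would be empty)
unpad = tuple(slice(lo, -hi if hi else None) for lo, hi in pad)
restored = p[unpad]
```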
In both of those situations, the `pass` aspect makes sense, although they
probably should specify a better exception class to catch. The first one,
with the copyto(), has a comment that explains what is going on. The second
one, dealing with adding to the docstring, is needed because one can run
pyth
ow or no rows.
np.atleast_1d() is used in matplotlib in a bunch of places where inputs are
allowed to be scalar or lists.
On Thu, Feb 11, 2021 at 1:15 PM Stephan Hoyer wrote:
for me, I find that the atleast_{1,2,3}d functions are useful for
sanitizing inputs. Having an at_leastnd() function can be viewed as a step
towards cleaning up the API, not cluttering it (although, deprecations of
the existing functions probably should be long given how long they have
existed).
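A minimal sketch of what such a hypothetical atleast_nd() might look like, prepending length-1 axes (note that the existing np.atleast_3d places its new axes slightly differently):

```python
import numpy as np

def atleast_nd(a, ndim):
    # hypothetical generalization of np.atleast_{1,2,3}d:
    # prepend length-1 axes until the array has at least `ndim` dimensions
    a = np.asanyarray(a)
    if a.ndim < ndim:
        a = a.reshape((1,) * (ndim - a.ndim) + a.shape)
    return a
```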
On Tue, Nov 24, 2020 at 2:07 PM Charles R Harris
wrote:
Given that AWS and Azure have both made commitments to have their data
centers be carbon neutral, and given that electricity and heat production
make up ~25% of GHG pollution, I find these sorts of
power-usage-analysis-for-the-sake-of-the-environment to be a bit
disingenuous. Especially since GHG p
Another thing to point out about having an array of that percentage of the
available memory is that it severely restricts what you can do with it.
Since you are above 50% of the available memory, you won't be able to
create another array that would be the result of computing something with
that arr
As a concrete example (maybe Ben has others): in astropy we have a
> sigma-clipping average routine, which uses a `MaskedArray` to iteratively
> mask items that are too far off from the mean; here, the mask varies each
> iteration (an initially masked element can come back into play), but the
Just to note, data that is masked isn't always garbage. There are plenty of
use-cases where one may want to temporarily apply a mask for a set of
computation, or possibly want to apply a series of different masks to the
data. I haven't read through this discussion deeply enough, but is this new
cla
> Yes, please do. For people with statistical background, but not CS. It
> seems strange the *real* range() function is used to generate natural
> numbers.
>
> Thanks, Ben!
>
>
>
> On Fri, May 24, 2019 at 10:34 PM Benjamin Root
> wrote:
This is the numpy discussion list, not the pandas discussion list. Now, for
numpy's part, I have had hankerings for a `np.minmax()` ufunc, but never
enough to get over just calling min and max on my data separately.
On Fri, May 24, 2019 at 10:27 PM C W wrote:
> Hello all,
>
> I am want to calcul
Ewww, kinda wish that would be an error... It would be too easy for a typo
to get accepted this way.
On Wed, Dec 26, 2018 at 1:59 AM Eric Wieser
wrote:
> In the latest version of numpy, this runs without an error, although may
> or may not be what you want:
>
> In [1]: np.array([[1,2],[[1,2],[3,
Congratulations to Antony for his hard work on this important backend!
As far as I am concerned, the cairo backend is the future of matplotlib.
Test this backend out for yourselves and help us take matplotlib to the
next level in high-quality charting!
Cheers!
Ben Root
On Sun, Jul 22, 2018 at 4:
the openpyxl package will be your friend. Here is a whole chapter on using
it: https://automatetheboringstuff.com/chapter12/
Welcome to python!
Ben Root
On Mon, May 28, 2018 at 12:21 AM, gaurav sinha
wrote:
> Dear Experts
> Greetings!!
>
> *About me- I am Telecom professional having ~10 years o
I worry that a masked array package would turn into Basemap.
Ben Root
On Wed, May 23, 2018 at 10:52 PM, Benjamin Root
wrote:
users of a package does not equate to maintainers of a package. Scikits are
successful because scientists that have specialty in a field can contribute
code and support the packages using their domain knowledge. How many people
here are specialists in masked/missing value computation?
Would I like
Question: I submitted a BoF (and code sprint), but I didn't get any email
acknowledgement. Were we supposed to? How can we know that the submission
was successful?
On Tue, May 22, 2018 at 7:01 PM, Nelle Varoquaux
wrote:
> Dear all,
>
> (apologies for the cross-posting)
>
> The SciPy conference w
Ah, yes, I should have thought about that. Kind of seems like something
that we could make `np.take()` do, somehow, for something that is easier to
read.
Thank you!
Ben Root
On Mon, Mar 26, 2018 at 2:28 PM, Robert Kern wrote:
> On Mon, Mar 26, 2018 at 11:24 AM, Benjamin Root
>
I seem to be losing my mind... I can't seem to get this to work right.
I have a (N, k) array `distances` (along with a bunch of other arrays of
the same shape). I need to resort the rows, so I do:
indexs = np.argsort(distances, axis=1)
How do I use this index array correctly to get back distance
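(For the record, the usual answer here is np.take_along_axis, which numpy added in 1.15; a sketch with made-up data:)

```python
import numpy as np

rng = np.random.default_rng(0)
distances = rng.random((4, 5))
speeds = rng.random((4, 5))        # a same-shaped companion array

order = np.argsort(distances, axis=1)
# apply the per-row ordering to distances and to every companion array
distances_sorted = np.take_along_axis(distances, order, axis=1)
speeds_sorted = np.take_along_axis(speeds, order, axis=1)
```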
Hmm, this is neat. I imagine it would finally give some people a choice on
what np.nansum([np.nan]) should return? It caused a huge hullabaloo a few
years ago when we changed it from returning NaN to returning zero.
Ben Root
On Mon, Mar 26, 2018 at 11:16 AM, Sebastian Berg wrote:
> OK, the new
Sorry, I have been distracted with xarray improvements the past couple of
weeks.
Some thoughts on what has been discussed:
First, you are right...Decimal is not the right module for this. I think
instead I should use the 'fractions' module for loading grid spec
information from strings (command-l
17 Matthew Harrigan
> wrote:
>
>> I apologize if I'm missing something basic, but why are floats being
>> accumulated in the first place? Can't arange and linspace operations with
>> floats be done internally similar to `start + np.arange(num_steps) *
>> step_
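The quoted suggestion, sketched out with made-up parameters:

```python
import numpy as np

start, step, num_steps = 10.0, 0.1, 8
# multiply-then-add: each grid point carries a single rounding error,
# instead of accumulating error with every successive addition
grid = start + np.arange(num_steps) * step
```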
On Fri, Feb 9, 2018 at 12:19 PM, Chris Barker wrote:
> On Wed, Feb 7, 2018 at 12:09 AM, Ralf Gommers
> wrote:
>>
>> It is partly a plea for some development of numerically accurate
>>> functions for computing lat/lon grids from a combination of inputs: bounds,
>>> counts, and resolutions.
>>>
>
Note, the following is partly to document my explorations in computing
lat/on grids in numpy lately, in case others come across these issues in
the future. It is partly a plea for some development of numerically
accurate functions for computing lat/lon grids from a combination of
inputs: bounds, co
I <3 structured arrays. I love the fact that I can access data by row and
then by fieldname, or vice versa. There are times when I need to pass just
a column into a function, and there are times when I need to process things
row by row. Yes, pandas is nice if you want the specialized indexing
featu
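The dual access pattern being described, as a quick sketch:

```python
import numpy as np

a = np.array([(1, 2.5), (3, 4.5)],
             dtype=[('id', 'i4'), ('val', 'f8')])
col = a['val']       # a whole column, by field name
row = a[0]           # a whole record, by row
cell = a[0]['val']   # row first, then field (or a['val'][0], vice versa)
```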
I assume you mean 1.14.0, rather than 1.4.0?
Did recarrays change? I didn't see anything in the release notes.
On Sat, Jan 13, 2018 at 4:25 PM, wrote:
> statsmodels does not work with numpy 1.4.0
>
> Besides the missing WarningsManager there seems to be 22 errors or
> failures from changes in n
Well, to get the ball rolling a bit, the key thing that matplotlib needs to
know is if `shape`, `reshape`, `size`, broadcasting, and logical indexing
is respected. So, I see three possible abc's here: one for attribute access
(things like `shape` and `size`) and another for shape manipulations
(bro
Duck typing is great and all for classes that implement some or all of the
ndarray interface but remember what the main reason for asarray() and
asanyarray(): to automatically promote lists and tuples and other
"array-likes" to ndarrays. Ignoring the use-case of lists of lists is
problematic at
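That promotion, in a nutshell:

```python
import numpy as np

a = np.asarray([[1, 2], [3, 4]])               # list of lists promoted to ndarray
m = np.asanyarray(np.ma.masked_array([1, 2]))  # asanyarray lets subclasses pass through
```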
In what way does it not work? Does it error out at the `arr = arr[mask]`
step? Or is it that something unexpected happens?
I am guessing that you are trying to mutate the px, py, pz, w, x, y, z
arrays? If so, that for-loop won't do it. In python, a plain simple
assignment merely makes the variable
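The rebinding issue being described, in miniature (made-up names):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
mask = x > 1.5
arr = x
arr = arr[mask]   # rebinds the local name `arr`; `x` is untouched
# to actually overwrite data in place, assign into the array, e.g. x[:2] = arr
```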
One thing that concerns me is trying to keep up with demand. Our tools have
become extremely popular, but it is very difficult for maintainers to keep
up with this demand. So, we seem to have a tendency to "tribalize", in a
sense, focusing on the demand for our respective pet projects. Various
proj
There was discussion awhile back of adopting a `__citation__` attribute.
Anyone remember what happened with that idea?
On Tue, Sep 5, 2017 at 5:21 PM, Feng Yu wrote:
> str(numpy.version.citation) and numpy.version.citation.to_bibtex()?
>
> On Tue, Sep 5, 2017 at 2:15 PM, Paul Hobson wrote:
> >
I've long ago stopped doing any "emptiness is false"-type tests on any
python containers when iterators and generators became common, because they
always return True.
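That is (illustrative):

```python
def gen():
    # an "empty" generator: yields nothing at all
    yield from ()

empty = gen()
truthy = bool(empty)   # generator objects are always truthy, even when empty
```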
Ben
On Sat, Aug 19, 2017 at 6:05 PM, Marten van Kerkwijk <
m.h.vankerkw...@gmail.com> wrote:
> Agreed with Eric Wieser here have
o be mindful of such
potentially disruptive changes and give users enough of a heads-up about it.
Ben Root
On Thu, Aug 10, 2017 at 1:00 PM, Allan Haldane
wrote:
> On 07/18/2017 09:52 AM, Benjamin Root wrote:
So, this is a kernel mechanism?
On Fri, Aug 4, 2017 at 6:31 PM, Joseph Fox-Rabinovitz <
jfoxrabinov...@gmail.com> wrote:
> I would like to propose the addition of a new function,
> `np.neighborwise` in PR#9514. It is based on the discussion relating
> to my proposal for `np.ratio` (PR#9481) and E
This sort of change seems very similar to the np.diag() change a few years
ago. Are there lessons we could learn from then that we could apply to here?
Why would the returned view not be a masked array?
Ben Root
On Tue, Jul 18, 2017 at 9:37 AM, Eric Wieser
wrote:
> When using ndarray.squeeze,
Just a heads-up. There is now a sphinx-gallery plugin. Matplotlib and a few
other projects have migrated their docs over to use it.
https://sphinx-gallery.readthedocs.io/en/latest/
Cheers!
Ben Root
On Sat, Jul 1, 2017 at 7:12 AM, Ralf Gommers wrote:
>
>
> On Fri, Jun 30, 2017 at 6:50 AM, Paul
Forgive my ignorance, but what is "Z/2"?
On Tue, Jun 27, 2017 at 5:35 PM, Nathaniel Smith wrote:
> On Jun 26, 2017 6:56 PM, "Charles R Harris"
> wrote:
>
>
>> On 27 Jun 2017, 9:25 AM +1000, Nathaniel Smith , wrote:
>>
> I guess my preference would be:
>> 1) deprecate +
>> 2) move binary - back
Great news, Nathaniel! It was a huge boost to matplotlib a couple of years
ago when we got an FTE, even if it was just for a few months. While that
effort didn't directly produce any new features, we were able to overhaul
some very old parts of the codebase. Probably why the effort was so
successfu
Check in numpy.recfunctions. I know there is merge_arrays() and
stack_arrays(). I forget which one does what.
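From memory (worth double-checking the docs): stack_arrays() concatenates record-wise, merge_arrays() joins field-wise. A quick sketch:

```python
import numpy as np
from numpy.lib import recfunctions as rfn

a = np.array([(1, 2.0)], dtype=[('x', 'i4'), ('y', 'f8')])
b = np.array([(3, 4.0)], dtype=[('x', 'i4'), ('y', 'f8')])

# rows: two one-record arrays become one two-record array
stacked = rfn.stack_arrays((a, b), usemask=False)
# columns: two plain arrays become fields (f0, f1) of one structured array
merged = rfn.merge_arrays((np.array([1, 2]), np.array([3.0, 4.0])))
```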
Cheers!
Ben Root
On Thu, May 11, 2017 at 1:51 PM, Isaac Gerg wrote:
> I'd prefer to stay in numpy land if possible.
>
> On Thu, May 11, 2017 at 1:17 PM, Isaac Xin Pei wrote:
>
>> Chec