Torgil Svensson wrote:
>> They are supposed to have different defaults because the functional
>> forms are largely for backward compatibility where axis=0 was the default.
>>
>> -Travis
>
> Isn't backwards compatibility what "oldnumeric" is for?

As this discussion indicates there ha

> They are supposed to have different defaults because the functional
> forms are largely for backward compatibility where axis=0 was the default.
>
> -Travis

Isn't backwards compatibility what "oldnumeric" is for?

+1 for consistent defaults.
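A small illustration of the discrepancy being discussed (a sketch, run against a
current numpy where the function and the method now share the same axis=None
default); passing axis explicitly sidesteps the ambiguity entirely:

>>> import numpy as np
>>> a = np.array([[1., 2., 3., 4., 5.]])
>>> a.var()           # no axis given: flattens the array first
2.0
>>> a.var(axis=0)     # one value per column
array([ 0.,  0.,  0.,  0.,  0.])
>>> a.var(axis=1)     # one value per row
array([ 2.])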
I saw that one as well. Looks neat! Too bad they rarely mention the word
"graph", so they never come up in my Google searches. I found them through
del.icio.us by searching for python and graph.

Dave

On 8/1/06, Pau Gargallo <[EMAIL PROTECTED]> wrote:
> you may be interested in this python graph library
>
> https://networkx.lanl.gov/
>
> pau
>
> On 8/1/06, David Grant <[EMAIL PROTECTED]> wrote:
> > I actually just looked into the boost graph library and hit a wall. I
> > basically had trouble running bjam on it. It complained about a missing
> > build file or something like that.
Here are a few problems I had with numpy and scipy:

1) Compiling scipy on Solaris requires running ld -G instead of gcc
-shared. Apparently, gcc was not passing the correct args to my non-GNU
ld. I could not figure out how to alter setup.py to link using ld
instead of gcc, so I had to link by hand.
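One possible workaround, sketched below and not verified on Solaris: the
standard distutils machinery honours an LDSHARED environment variable when it
builds the command used to link extension modules, so pointing it at the
native linker before the build may avoid linking by hand (numpy.distutils can
still override parts of the command, hence the hedging):

import os

# Sketch only: ask distutils to use the native Solaris linker instead of
# "gcc -shared" when linking extension modules.  Set this before setup()
# runs, or export LDSHARED in the shell before "python setup.py build".
os.environ.setdefault('LDSHARED', 'ld -G')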
I actually just looked into the boost graph library and hit a wall. I
basically had trouble running bjam on it. It complained about a missing
build file or something like that. Anyway, for now I can live with a
non-sparse implementation. This is mostly prototyping code for
integration into a largel
Thanks Bill,

I think you are right: what I have is what I want (i.e. not extending
ndarray). I guess, to go along with the "whatever makes your life the
easiest" mantra, all I am really missing right now is the ability to
access my Graph object like g[blah], with square brackets, and to do
Sasha wrote:
> I cannot reproduce your results, but I wonder if the following is right:
>
> >>> a = array([1,2,3,4,5])
> >>> var(a[newaxis,:])
> array([ 0., 0., 0., 0., 0.])
>
> >>> a[newaxis,:].var()
> 2.0
>
> >>> a[newaxis,:].var(axis=0)
> array([ 0., 0., 0., 0., 0.])
Hi David,

I often have several thousand nodes in a graph, sometimes clustered into
connected components. I suspect that using an adjacency matrix is an
inefficient representation for graphs of that size, while for smaller
graphs the overhead of more complicated structures wouldn't be
noticeable. Have
Hi David,
For a graph, the fact that it's stored as a matrix, or stored as
linked nodes, or dicts, etc., is an implementation detail. So from a
classical OO point of view, inheritance is not what you want.
Inheritance says "this is a kind of that". But a graph is not a kind
of matrix. A matrix
I have written my own graph class. It doesn't really do much, just has a
few methods; it might do more later. Up until now it has just had one
piece of data, an adjacency matrix, so it looks something like this:

class Graph:
    def __init__(self, Adj):
        self.Adj = Adj

I had the idea of changin
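A minimal sketch of the composition approach discussed in this thread
(hypothetical method names, not taken from the actual class): the Graph keeps
the adjacency matrix as an attribute rather than inheriting from ndarray, and
delegates indexing so that g[i, j] works with square brackets as wished for
elsewhere in the thread.

import numpy as np

class Graph:
    """Graph stored as a dense adjacency matrix (composition, not inheritance)."""

    def __init__(self, Adj):
        self.Adj = np.asarray(Adj)

    def __getitem__(self, index):
        # delegate indexing to the underlying adjacency matrix
        return self.Adj[index]

    def neighbors(self, node):
        # indices of nodes connected to `node`, read off its adjacency row
        return np.flatnonzero(self.Adj[node])

g = Graph([[0, 1, 0],
           [1, 0, 1],
           [0, 1, 0]])
print(g[0, 1])         # 1
print(g.neighbors(1))  # [0 2]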
I also couldn't reproduce it on my 0.9.8 on Linux.

DG

On 8/1/06, David L Goldsmith <[EMAIL PROTECTED]> wrote:
> Hi, Hanno. I ran your sample session in numpy 0.9.8 (on a Mac, just so
> you know; I don't yet have numpy installed on my Windows platform, and I
> don't have immediate access to a *nix box) and could not reproduce the
> problem, i.e., it does appear to have been fixed in 0.9.8.
I cannot reproduce your results, but I wonder if the following is right:
>>> a = array([1,2,3,4,5])
>>> var(a[newaxis,:])
array([ 0., 0., 0., 0., 0.])
>>> a[newaxis,:].var()
2.0
>>> a[newaxis,:].var(axis=0)
array([ 0., 0., 0., 0., 0.])
Are method and function supposed to have different defaults?
Hi all,

I'm attaching some patches that enable the current version of numexpr
(r2142) to:

1. Handle int64 integers in addition to int32 (constants, variables and
arrays). Python int objects are considered int32 if they fit in 32
bits. Python long objects and int objects that don't fit in 32 bits are
considered int64.
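Purely for illustration (this is not numexpr's actual code), the
classification rule described above might look like the following sketch,
where plain ints that fit in 32 bits map to int32 and anything wider to int64:

import numpy as np

def int_kind(value):
    # Hypothetical helper mirroring the rule in the patch notes above.
    if -2**31 <= value < 2**31:
        return np.int32
    return np.int64

print(int_kind(7))       # -> numpy.int32
print(int_kind(2**40))   # -> numpy.int64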
Hi, Hanno. I ran your sample session in numpy 0.9.8 (on a Mac, just so
you know; I don't yet have numpy installed on my Windows platform, and I
don't have immediate access to a *nix box) and could not reproduce the
problem, i.e., it does appear to have been fixed in 0.9.8.
DG
Hanno Klemm wrote:
Hello,

numpy.var exhibits a rather dangerous behaviour, as I have just
noticed. In some cases, numpy.var calculates the variance, and in some
cases the standard deviation (= square root of the variance). Is this
intended? I have to admit that I use numpy 0.9.6 at the moment. Has
this been changed in more recent versions?
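A quick way to check which quantity a routine is actually returning (a sketch
using current numpy names): compare it against the definition of the variance,
the mean of the squared deviations, and against its square root.

import numpy as np

a = np.array([1., 2., 3., 4., 5.])
expected_var = ((a - a.mean()) ** 2).mean()   # 2.0, mean of squared deviations
expected_std = np.sqrt(expected_var)          # ~1.4142

print(np.allclose(np.var(a), expected_var))   # True if var() returns the variance
print(np.allclose(np.std(a), expected_std))   # True if std() returns its square root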
> I listened to this and it looks like Sergio Ray is giving an intro class
> on scientific computing with Python and has some concepts confused. We
> should take this as a sign that we need to keep doing a good job of
> educating people.
I'm on UTC+02:00, so I only just saw that there have been a few p