On 7/7/06, Robert Kern <[EMAIL PROTECTED]> wrote:
> Bill Baxter wrote:
> > Robert Kern wrote:
> >
> > The slippery slope argument only applies to the .M, not the .T or .H.
>
> No, it was the "Let's have a .T attribute. And if we're going to do that, then
> we should also do this. And this. And this."

There's no slippery slope there.  It's just "Let's have a .T attribute, and if we have that then we should have .H also."  Period.  The slope stops there.  The .M and .A are a separate issue.
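To make it concrete, here's roughly what the two attributes would be shorthand for (illustrative only; the explicit spellings below are what you have to write with plain ndarrays today):

    import numpy as np

    a = np.array([[1+2j, 3-1j],
                  [4+0j, 5+5j]])

    # .T -- plain transpose
    at = a.transpose()          # what the proposed a.T would be shorthand for

    # .H -- conjugate (Hermitian) transpose
    ah = a.conj().transpose()   # what the proposed a.H would be shorthand for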

> > > I don't think that just because arrays are often used for linear algebra
> > > that linear algebra assumptions should be built in to the core array type.
> >
> > It's not just that "arrays can be used for linear algebra".  It's that
> > linear algebra is the single most popular kind of numerical computing in
> > the world!  It's the foundation for countless fields.  What you're saying
> > is like "grocery stores shouldn't devote so much shelf space to food,
> > because food is just one of the products people buy", or [etc.]
>
> I'm sorry, but the argument-by-inappropriate-analogy is not convincing. Just
> because linear algebra is "the base" for a lot of numerical computing does not
> mean that everyone is using numpy arrays for linear algebra all the time. Much
> less does it mean that all of those conventions you've devised should be shoved
> into the core array type. I hold a higher standard for the design of the core
> array type than I do for the stuff around it. "It's convenient for what I do,"
> just doesn't rise to that level. There has to be more of an argument for it.

My argument is not that "it's convenient for what I do"; it's that "it's convenient for what 90% of users want to do".  But unfortunately I can't think of a good way to back up that claim with any sort of numbers.

But here's one data point I just found, the download statistics for various numerical libraries on netlib.org: http://www.netlib.org/master_counts2.html
The top 4 are all linear algebra related:
/lapack 37,373,505
/lapack/lug 19,908,865
/scalapack 14,418,172
/linalg 11,091,511

The next three are more like general computing concerns (a parallelization library, performance monitoring, benchmarks):
/pvm3 10,360,012
/performance 7,999,140
/benchmark 7,775,600

Then the next one is more linear algebra.  And that seems to hold pretty far down the list.  It looks like mostly stuff that's either linear algebra related or parallelization/benchmarking related.

And as another example, there's the success of higher-level numerical environments like Matlab (and maybe R, S, Mathematica, and Maple?) that have strong support for linear algebra right in the core, not requiring users to go into some syntax/library ghetto to use that functionality.

I am also curious, given the number of times I've heard this nebulous argument that "there are lots of kinds of numerical computing that don't involve linear algebra", that no one ever seems to name any of these "lots of kinds".  Statistics, maybe?  But you can find lots of linear algebra in statistics.
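For instance (a toy illustration, not anyone's real code): an ordinary least-squares fit is nothing but linear algebra, and with plain arrays you end up writing it as explicit matrix operations:

    import numpy as np

    # Toy ordinary least-squares fit of y ~ X b via the normal equations:
    # (X^T X) b = X^T y.  Transposes and matrix products everywhere, which
    # is exactly where a terse .T would help.
    X = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    y = np.array([1.1, 1.9, 3.2, 3.9])

    b = np.linalg.solve(np.dot(X.transpose(), X), np.dot(X.transpose(), y))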

--bb