On 7/6/06, Robert Kern <[EMAIL PROTECTED]> wrote:
> ...
> I don't think that just because arrays are often used for linear algebra that
> linear algebra assumptions should be built in to the core array type.
>

In addition, transpose is a (rank-2) array or matrix operation and not
a linear algebra operation.  Transpose corresponds to the "adjoint"
linear algebra operation if you represent vectors as single-column
matrices and co-vectors as single-row matrices.  This is a convenient
representation used by much of the relevant literature, but it
does not allow generalization beyond rank-2.  Another useful feature is
that the inner product can be calculated as a matrix product, as long as
you accept a 1x1 matrix in place of a scalar.  This feature does not work
beyond rank-2 either, because to compute a tensor inner product you
have to be explicit about which axes are being collapsed (for example
by using Einstein notation).
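
A minimal sketch of both points (the shapes and values here are just
illustrative assumptions, not anything from the thread):

    import numpy as np

    # Rank-2 case: an inner product written as a matrix product yields
    # a 1x1 matrix rather than a true scalar.
    v = np.array([[1.0], [2.0], [3.0]])   # vector as a 3x1 column matrix
    w = np.array([[4.0, 5.0, 6.0]])       # co-vector as a 1x3 row matrix
    print(np.dot(w, v))                   # [[32.]] -- a 1x1 matrix, not a scalar

    # Beyond rank-2 the collapsed axes must be named explicitly, e.g. with
    # tensordot (or an Einstein-notation tool such as einsum).
    a = np.arange(24.0).reshape(2, 3, 4)
    b = np.arange(12.0).reshape(4, 3)
    c = np.tensordot(a, b, axes=([2, 1], [0, 1]))  # contract a's axes 2,1 with b's axes 0,1
    print(c.shape)                        # (2,)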

Since ndarray does not distinguish between upper and lower indices, it
is not possible to distinguish between vectors and co-vectors in any way
other than by using the matrix convention.  This makes ndarrays a poor
model for linear algebra tensors.
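
For example (again just an illustrative sketch): transposing a rank-1
ndarray is a no-op, so a vector and a co-vector are indistinguishable
unless you reshape them into column/row matrices.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0])
    print(x.T.shape)        # (3,) -- transpose of a rank-1 array changes nothing

    # The distinction only appears under the matrix convention:
    col = x.reshape(3, 1)   # "vector" as a column matrix
    row = x.reshape(1, 3)   # "co-vector" as a row matrix
    print(col.T.shape)      # (1, 3) -- transpose now maps one onto the other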
