That sounds interesting.

And, yes, guaranteed zeros would let us prune minors.  And,
ultimately, the eigenvalues would correspond to the diagonal of some
(triangular) matrix.
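For instance (a Python/numpy sketch of that last point, not the J
under discussion): the Schur form is such a triangular matrix, and for
a symmetric input it is outright diagonal, so the eigenvalues can be
read off the diagonal.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4))
a = a + a.T                # symmetric, so the eigenvalues are real

# a = z @ t @ z.T with t (quasi-)triangular; for symmetric a,
# t is diagonal and its diagonal holds the eigenvalues.
t, z = schur(a)
assert np.allclose(np.sort(np.diag(t)), np.sort(np.linalg.eigvalsh(a)))
```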

Anyways, I am still characterizing the processes, and I am in no rush
here, but if your code is simple to understand I would be interested
in reading it.

Thanks,

-- 
Raul

On Tue, Jun 12, 2012 at 12:11 PM, Henry Rich <henryhr...@nc.rr.com> wrote:
> Just thinking about your application:
>
> There are good eigenvalue algorithms for almost-diagonal matrices.
>
> If you notionally arrange your n 4x4 matrices along the principal
> diagonal of a 4nx4n matrix, you will have a matrix with only 7 nonzero
> diagonals (a full 4x4 block spans offsets -3 through 3), and the
> eigenvalues of this big matrix will be the same as the eigenvalues of
> the individual 4x4s.  Of course, you wouldn't create the entire big
> matrix.
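A quick numeric check of that claim, sketched in Python with
numpy/scipy rather than J (symmetric toy blocks so the eigenvalues are
real and easy to compare):

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(1)
# three symmetric 4x4 blocks (n = 3 toy case)
blocks = [m + m.T for m in rng.standard_normal((3, 4, 4))]
big = block_diag(*blocks)          # 12x12 block-diagonal matrix

small = np.sort(np.concatenate([np.linalg.eigvalsh(b) for b in blocks]))
big_eigs = np.sort(np.linalg.eigvalsh(big))

# the big matrix's spectrum is exactly the union of the blocks' spectra
assert np.allclose(small, big_eigs)
```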
>
> You could convert this banded matrix to tridiagonal form using
> Householder transformations, and then use a tridiagonal solver to get
> the eigenvalues.
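scipy also has a symmetric banded eigensolver that works directly on
the stored diagonals, so the big matrix never needs to be formed
densely.  A sketch (Python stand-in; note a full symmetric 4x4 block
has bandwidth 3):

```python
import numpy as np
from scipy.linalg import block_diag, eigvals_banded

rng = np.random.default_rng(2)
blocks = [m + m.T for m in rng.standard_normal((3, 4, 4))]
big = block_diag(*blocks)          # dense only for this demonstration
n = big.shape[0]

bw = 3                             # bandwidth of a full 4x4 block
# lower banded storage: row i holds the i-th subdiagonal, zero-padded
band = np.zeros((bw + 1, n))
for i in range(bw + 1):
    band[i, :n - i] = np.diag(big, -i)

eigs = np.sort(eigvals_banded(band, lower=True))
assert np.allclose(eigs, np.sort(np.linalg.eigvalsh(big)))
```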
>
> [I don't know if the eigenvalues would come out in an order that allows
> you to associate them with the original 4x4s.  I assume it would be
> possible to make that happen].
>
> Numerical Recipes discusses this.  I have code that I translated from
> Numerical Recipes, restricted to the case of symmetric matrices, that
> converts a symmetric matrix to tridiagonal form and then calculates the
> eigenvalues.  I don't have anything to perform the
> band-to-tridiagonal conversion, but it uses the same Householder
> technique, and you might be able to work it out
> from the general tridiagonalization code.
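For the symmetric case, the same pipeline is available off the shelf in
scipy (illustrative Python, not the Numerical Recipes translation being
offered): Hessenberg reduction of a symmetric matrix is tridiagonal,
and a tridiagonal solver finishes the job.

```python
import numpy as np
from scipy.linalg import hessenberg, eigvalsh_tridiagonal

rng = np.random.default_rng(3)
a = rng.standard_normal((6, 6))
a = a + a.T                      # symmetric input

# Householder reduction: for a symmetric matrix the Hessenberg
# form is tridiagonal.
t = hessenberg(a)
d = np.diag(t)                   # main diagonal
e = np.diag(t, -1)               # subdiagonal (= superdiagonal by symmetry)

eigs = np.sort(eigvalsh_tridiagonal(d, e))
assert np.allclose(eigs, np.sort(np.linalg.eigvalsh(a)))
```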
>
> I will send you what I have if you like.  The main point is, doing many
> eigenvalue calculations in parallel may be a winner.
>
> Henry Rich
>
>
> On 6/12/2012 11:08 AM, Raul Miller wrote:
>> On Tue, Jun 12, 2012 at 10:41 AM, Henry Rich<henryhr...@nc.rr.com>  wrote:
>>> That may be.  But no way do you do a determinant with just one multiply
>>> - it takes 0 for a 1x1, 2 for a 2x2.
>>
>> You are thinking about scalars.
>>
>>     -/ .(*&([ smoutput)) i. 2 2
>> 3 1
>> 0 2
>> _2
>>
>> Here, we are computing 0 2 * 3 1
>>
>> This is what I mean by "one multiply".
>>
>> You are correct that if I investigated at rank 0 there would be
>> two multiplies.
>>
>>> The definition of DET is so quirky that it makes sense only when you
>>> have subtraction/ . multiplication
>>
>> All I had to do to make it behave like the implementation was terminate
>> recursion when derived lists contain 1 element instead of 0 elements.
>>
>>> for suitably defined operations on
>>> the space of interest.  Even +/ . *  runs in factorial time.  For the
>>> case of finding the characteristic polynomial, why not just use the verb
>>> definitions I gave earlier, rather than requiring a new DET?
>>
>> Which verb?
>>
>> Anyways, one issue here is whether the dictionary and the
>> implementation agree.  Currently, they do not.  But I currently see no
>> value in using the dictionary definition over the actual
>> implementation -- they are equivalent for -/ .* and I have not found a
>> useful case where the dictionary DET does something meaningful that
>> the implementation does not.
>>
>> Another issue here is generality, and motivation.  There is something
>> nice about being able to use the same "determinant primitive" to
>> compute both regular determinants and eigenvalues.  And even if that
>> never gets into the interpreter, I find some joy in modeling things
>> that way.
>>
>> The "resource cost" issue you are raising can be significant in some
>> contexts, but it's not going to be significant in all contexts.  In
>> the case I am working up to dealing with (several hundred collections
>> of 4x4 matrices, where each collection typically contains several
>> thousand of these 4x4 matrices), it's hard for me to see why I should
>> care about it.
>>
> ----------------------------------------------------------------------
> For information about J forums see http://www.jsoftware.com/forums.htm