I have always tried to avoid transposing the data matrix.

During my design phase for this, I was working on patch-wise patterns.
About then, Chris Dyer tried the simpler approach for a machine translation
problem and got very good results.

The major problem with the transpose is that it requires an MR job of its own
that is nearly as expensive as the multiply. The combiner really makes a
huge difference to the inner product approach.
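
For concreteness, here is a toy in-memory sketch of that per-row accumulation
(plain Java, not the actual Mahout/Hadoop code; the class name and the little
3 x 2 matrix are made up for illustration). Each row of A contributes
A[i][j] * A[i][k] to (A'A)[j][k], so the whole product can be built from
row-wise access alone, and because those contributions just sum, a combiner
can collapse most of the map output before it ever hits the network.

// Toy illustration: accumulate A'A one row at a time, the way each map call
// would emit per-row contributions and a combiner/reducer would sum them.
// No transpose of A is ever materialized.
public class AtAFromRows {
  public static void main(String[] args) {
    // Made-up 3 x 2 data matrix A (rows = documents, columns = features).
    double[][] a = {
        {1.0, 2.0},
        {0.0, 3.0},
        {4.0, 1.0}
    };
    int n = a[0].length;
    double[][] ata = new double[n][n];       // accumulates A'A

    for (double[] row : a) {                 // one "map" call per row of A
      for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
          ata[j][k] += row[j] * row[k];      // contribution to (A'A)[j][k]
        }
      }
    }

    // (A'A)[j][k] ends up as the inner product of columns j and k of A,
    // even though we only ever touched A row by row.
    for (double[] r : ata) {
      System.out.println(java.util.Arrays.toString(r));
    }
  }
}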

On Thu, Dec 3, 2009 at 10:31 AM, Jake Mannix <[email protected]> wrote:

> Wait, this is just doing A'A?  Am I misunderstanding, or is this not most
> easily done by first transposing A into A', and then doing the outer
> products instead of the inner products?
>



-- 
Ted Dunning, CTO
DeepDyve
