This is a relatively small matrix. You can decompose a matrix like that, at least approximately, in a pretty short time. With stock R (no BLAS acceleration), it takes 9 ms to decompose a 100 x 100 dense matrix and 6.2 seconds to decompose a 1K x 1K dense matrix. Since this is so fast, there is little point in using Hadoop for it.
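As a quick illustration of the point below (this is just a NumPy sketch, not Mahout code): you can solve a system through a factorization without ever forming the explicit inverse, and the factorization-based solve is both faster and numerically safer.

```python
# Sketch: solve A x = b via a factorization (what np.linalg.solve does,
# using LAPACK's LU under the hood) versus forming an explicit inverse.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Preferred: factorize-and-solve, no inverse is ever materialized.
x_solve = np.linalg.solve(A, b)

# Discouraged: build inv(A) and multiply. More work, more rounding error.
x_inv = np.linalg.inv(A) @ b

# Compare residuals ||A x - b||; the solve-based one is typically smaller.
print(np.linalg.norm(A @ x_solve - b))
print(np.linalg.norm(A @ x_inv - b))
```

This is exactly the "cheat" mentioned below: a library can hand you an object that behaves like the inverse (it can be applied to vectors) while internally it is only a decomposition.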
HPC guys pretty much never invert a matrix larger than 10 x 10 unless it is diagonal or of some other special form. Even if you think you need an inverse, usually all you need to do is multiply by the inverse, which is better done using a decomposed form instead of a real inverse. Matrix libraries will often cheat a bit and substitute a matrix object that is really a decomposition when you ask for an inverse. Unless you look at the elements of the matrix, they can leave it at that, and you get speed and a warm feeling that you have an inverse matrix. One of the major reasons for avoiding the inverse at the sizes you are talking about is that using the inverse directly will probably cause a large loss of precision.

On Mon, Jan 21, 2013 at 1:12 AM, Colin Wang <colin.bin.wang.mah...@gmail.com> wrote:

> Hi Koobas,
>
> I am trying on a dense matrix in Hadoop, thousand by thousand square size.
> How do HPC guys solve this problem? Any references?
>
> Thank you,
> Colin
>
> On Mon, Jan 21, 2013 at 11:49 AM, Koobas <koo...@gmail.com> wrote:
>
> > Colin,
> > I am more of an HPC guy.
> > I am a Mahout noob myself.
> > Are we talking about a dense matrix?
> > What size?
> >
> >
> > On Sun, Jan 20, 2013 at 9:34 PM, Colin Wang <colin.bin.wang.mah...@gmail.com> wrote:
> >
> > > Hi Koobas,
> > >
> > > I want the first one. Do you have any suggestions?
> > >
> > > Thank you,
> > > Colin
> > >
> > > On Fri, Jan 18, 2013 at 12:49 PM, Koobas <koo...@gmail.com> wrote:
> > >
> > > > Matrix inversion