Hi!

>> I am not sure I understand the fine difference. As far as we are
>> concerned, all the operations which we are doing (point wise addition,
>> addition, multiplication etc.) are on the linear operator. 

Barry> Certainly pointwise addition is the same as adding two operators
Barry> together (PETSc has this with MatAXPY), but what about, for
Barry> example, pointwise multiply? You could well be correct, but I'd
Barry> like to see the list and what they correspond to in terms of
Barry> linear operators, for example row and column sums what do they
Barry> represent (true, they are the multiplication of the matrix (or its
Barry> transpose) by the vector of all ones, but is that a useful
Barry> meaning?).

Well, I guess the problem is that we are interested in computing matrices
K whose (i, j)-th entry K_{ij} is

exp( -1/(2\sigma^{2}) ||x_{i} - x_{j}||_{2}^{2} )

Here X is a huge matrix (possibly sparse) and the x_{i} are its rows. Of
course, in a purist sense this is not linear algebra applied to a linear
operator, but it makes sense to use pointwise operations to compute it:
expanding ||x_{i} - x_{j}||^{2} = ||x_{i}||^{2} + ||x_{j}||^{2} - 2<x_{i}, x_{j}>
reduces the job to an outer sum of row norms plus an inner-product matrix,
followed by a pointwise exp. No?

Sample code:

n_sq1 = numarray.add.reduce( x1*x1, 1 )  # vector of norm squareds (slightly faster than x1**2)
n_sq2 = numarray.add.reduce( x2*x2, 1 )
n_sq = (-0.5/self.sigma2) * numarray.add.outer( n_sq1, n_sq2 )  # matrix of norm squared sums
res = ip  # result (alias)
numarray.add( ip, n_sq, res )
numarray.exp( res, res )
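
For completeness, here is a self-contained sketch of the same computation,
written against numpy (numarray's successor); the function name and the test
data are just illustrative. It assumes, as the snippet above seems to, that
ip is the inner-product matrix scaled by 1/sigma^2, i.e. ip = (1/sigma2) * x1 x2^T:

import numpy

def gaussian_kernel(x1, x2, sigma2):
    # K[i, j] = exp( -||x1[i] - x2[j]||^2 / (2*sigma2) )
    n_sq1 = numpy.add.reduce(x1 * x1, 1)                  # ||x1[i]||^2 for each row
    n_sq2 = numpy.add.reduce(x2 * x2, 1)                  # ||x2[j]||^2 for each row
    n_sq = (-0.5 / sigma2) * numpy.add.outer(n_sq1, n_sq2)
    ip = (1.0 / sigma2) * numpy.dot(x1, x2.transpose())   # scaled inner products
    # ||x1[i] - x2[j]||^2 = ||x1[i]||^2 + ||x2[j]||^2 - 2<x1[i], x2[j]>,
    # so the full exponent is n_sq + ip.
    return numpy.exp(n_sq + ip)

K = gaussian_kernel(numpy.random.rand(5, 3), numpy.random.rand(4, 3), 1.0)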
    
vishy

