I don't know the goal here. What you did was start with two shadows of a 3-dimensional data set (user x item x dayOfWeek) in the form of two projections into 2-dimensional form (user x item formed by summing over dayOfWeek, and item x day formed by summing over user).
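To be concrete about what I mean by shadows, here is a rough numpy sketch with made-up counts (not your data):

import numpy as np

# made-up 3-way count tensor: 2 users x 3 items x 7 days of the week
counts = np.zeros((2, 3, 7))
counts[0, 0, 0] = 2   # user 0 rated item 0 twice on Sunday
counts[0, 1, 2] = 1   # user 0 rated item 1 once on Tuesday
counts[1, 2, 2] = 5   # user 1 rated item 2 five times on Tuesday

user_item = counts.sum(axis=2)   # shadow 1: user x item, summed over dayOfWeek
item_day  = counts.sum(axis=0)   # shadow 2: item x day, summed over user

# Many different 3-way tensors cast these same two shadows, so the full
# tensor (and any third shadow) is not recoverable from them in general.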
You seem to be trying to form another shadow (user x day) by composing the first two. Since you have already lost information in those projections, you can't necessarily do this except in special cases.

What you have here is the beginnings of a rank-1 decomposition of a tensor. With matrices, the minimum squared error decomposition of this type is the SVD, which is unique up to the order of the singular values and sign. Unfortunately, there is no comparable unique decomposition for tensors. Lots of people have worked on the problem, but there is no clear consensus on the best way to approach it.

There is a related case where you have two independent actions, user x item_type_1 and user x item_type_2. Here the product that gives you item_type_1 x item_type_2 does provide useful information, because you don't really have a tensor in the first place, just two matrices. This is the case I am talking about when I refer to "cross-recommendation".

I generally prefer to deal with problems like this from the viewpoint of generalized logistic regression with latent factors chosen to provide a model in a useful form. This avoids the ambiguity associated with tensor decompositions and leads directly to a form that can be optimized.

On Mon, Nov 22, 2010 at 11:24 PM, Lance Norskog <[email protected]> wrote:
>
> Now, multiply these two matrices. The product is 2 Users vs. 7 Days
> of the Week:
>        S,M,T,W,T,F,S
> {
>  U1 {2,4,9,0,0,0,0}
>  U2 {4,3,12,0,0,0,0}
> }
>
> This matrix carries the total amount of enthusiasm for each user on
> each day. To get the average enthusiasm of each user, divide each row
> by the total number of ratings per day:
>        S,M,T,W,T,F,S
> {
>  U1 {2,4,3,0,0,0,0}
>  U2 {4,3,4,0,0,0,0}
> }
>
> Did I get this right, Ted?
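To make the cross-recommendation case above concrete, here is a rough numpy sketch with made-up occurrence data (these are not the matrices from this thread):

import numpy as np

# made-up binary occurrence data: rows are users
A = np.array([[1, 0, 1],     # user x item_type_1
              [0, 1, 1],
              [1, 1, 0]])
B = np.array([[0, 1],        # user x item_type_2
              [1, 1],
              [1, 0]])

# A'B counts, for each (type_1 item, type_2 item) pair, how many users
# touched both -- the raw material for cross-recommendation
cross = A.T @ B
print(cross)   # 3 x 2 matrix: item_type_1 x item_type_2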
