Re: Lambda and Kappa CCO

2017-04-17 Thread Pat Ferrel
> Agreed. Downsampling was ignored in several places and with it a great deal of input is a no-op. Without downsampling too many things need to change. Also everything is depen…

Re: Lambda and Kappa CCO

2017-04-09 Thread Trevor Grant
> Agreed. Downsampling was ignored in several places and with it a great deal of input is a no-op. Without downsampling too many things need to…

Re: Lambda and Kappa CCO

2017-04-09 Thread Andrew Palumbo
Agreed. Downsampling was ignored in several places and with it a great deal of input is a no-op. Without downsampling too many things need to change. Also everything is dependent on this rather vague…

Re: Lambda and Kappa CCO

2017-03-27 Thread Pat Ferrel
…to change. This becomes feasible if you include the effect of down-sampling, but that has to be in the algorithm.
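The down-sampling point above is what makes an incremental (Kappa-style) update tractable: once a user hits the frequency cut, further interactions no longer change the model. A minimal sketch of that effect, assuming a simple per-user cap; the object and method names and the cap of 500 are illustrative, not the actual CCO/Mahout implementation:

```scala
import scala.collection.mutable

// Sketch: per-user interaction down-sampling. Once a user reaches the cap,
// further events are no-ops for the model, so most of a live stream can be
// absorbed without recomputing anything.
object DownsamplingSketch {
  val maxInteractionsPerUser = 500 // assumed frequency cut, purely illustrative

  // interactions kept so far, per user
  private val kept = mutable.Map.empty[String, Int].withDefaultValue(0)

  // Returns true if this (user, item) event should update the model;
  // false means down-sampling drops it (a no-op).
  def accept(user: String, item: String): Boolean =
    if (kept(user) >= maxInteractionsPerUser) {
      false                 // user already at the cap: drop the event
    } else {
      kept(user) += 1       // keep it and count it toward the cap
      true
    }

  def main(args: Array[String]): Unit = {
    val events = Seq.fill(600)(("heavy-user", "some-item"))
    val updates = events.count { case (u, i) => accept(u, i) }
    println(s"model updates: $updates of ${events.size} events") // 500 of 600
  }
}
```

In the thread's framing the cut would presumably also apply per item; the sketch only shows the per-user case, but the consequence for the update path is the same: beyond the cut, new input can be acknowledged and discarded.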

Lambda and Kappa CCO

2017-03-25 Thread Pat Ferrel
This is an overview and proposal for turning the multi-modal Correlated Cross-Occurrence (CCO) recommender from Lambda-style into an online streaming, incrementally updated Kappa-style learner.

# The CCO Recommender: Lambda-style

We have largely solved the problems of calculating the multi-modal…
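For readers following the algorithm named in this proposal: CCO scores item-to-item (and cross-action) cooccurrence with Ted Dunning's log-likelihood ratio (LLR) test over a 2x2 count table, keeping only anomalously correlated pairs. Below is a small, self-contained sketch of that standard LLR computation; it is a re-statement of the well-known formula, not code taken from Mahout:

```scala
// Sketch: the 2x2 log-likelihood ratio (LLR) test used to decide whether two
// events co-occur more often than chance would predict.
object LlrSketch {
  // x * ln(x), with 0 * ln(0) defined as 0
  private def xLogX(x: Long): Double =
    if (x == 0L) 0.0 else x * math.log(x.toDouble)

  // entropy-style term over raw counts
  private def entropy(counts: Long*): Double =
    xLogX(counts.sum) - counts.map(xLogX).sum

  // k11: A and B together, k12: A without B, k21: B without A, k22: neither
  def logLikelihoodRatio(k11: Long, k12: Long, k21: Long, k22: Long): Double = {
    val rowEntropy    = entropy(k11 + k12, k21 + k22)
    val columnEntropy = entropy(k11 + k21, k12 + k22)
    val matrixEntropy = entropy(k11, k12, k21, k22)
    if (rowEntropy + columnEntropy < matrixEntropy) 0.0 // guard against rounding
    else 2.0 * (rowEntropy + columnEntropy - matrixEntropy)
  }

  def main(args: Array[String]): Unit = {
    println(logLikelihoodRatio(100, 10, 10, 10000)) // large: strongly correlated
    println(logLikelihoodRatio(1, 100, 100, 10000)) // near zero: no evidence
  }
}
```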