To: Ted Dunning; user@mahout.apache.org
Cc: Trevor Grant; Ted Dunning; s...@apache.org
Subject: Re: Lambda and Kappa CCO

Agreed. Downsampling was ignored in several places, and with it a great deal of
input becomes a no-op. Without downsampling too many things need to change.

Also everything is dependent on this rather vague
This becomes feasible if you include the effect of down-sampling, but that has
to be in the algorithm.
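
To make the no-op point concrete, here is a minimal sketch, assuming a per-user
interaction cap of the kind CCO down-sampling applies (the names and the cap
value are hypothetical, not Mahout code): once a user reaches the cap, further
events from that user cannot change the model.

```scala
import scala.collection.mutable

// Hypothetical sketch, not Mahout's implementation: per-user interaction
// down-sampling. Events from a user already at the cap are no-ops.
object DownsamplingSketch {
  val maxInteractionsPerUser = 500 // assumed cut-off; the real value is a tuning choice

  private val seen = mutable.Map.empty[String, Int].withDefaultValue(0)

  /** True if the event can affect the model; false means it is a no-op. */
  def accepts(userId: String): Boolean =
    if (seen(userId) >= maxInteractionsPerUser) {
      false // at the cap: drop the event, the model is unchanged
    } else {
      seen(userId) += 1
      true // below the cap: this event can update the model
    }
}
```

In a Kappa-style learner this check has to run inline on the stream rather than
in a batch pre-pass, which is the sense in which the down-sampling has to be in
the algorithm.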
From: Pat Ferrel
Sent: Saturday, March 25, 2017 12:01:00 PM
To: Trevor Grant; user@mahout.apache.org
Cc: Ted Dunning; s...@apache.org
Subject: Lambda and Kappa CCO
This is an overview and proposal for turning the multi-modal Correlated
Cross-Occurrence (CCO) recommender from a Lambda-style system into an online,
streaming, incrementally updated Kappa-style learner.
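
As a rough illustration of the contrast (placeholder types and functions, not a
Mahout API): Lambda-style retrains the whole model from accumulated history in
a periodic batch job, while Kappa-style folds each event into the live model as
it arrives.

```scala
// Illustrative only; Event, Model, and both update functions are placeholders.
object KappaVsLambdaSketch {
  case class Event(userId: String, itemId: String, action: String)
  case class Model() // stands in for the CCO indicator matrices

  // Lambda-style: a periodic batch job retrains from the full history.
  def trainFromScratch(history: Seq[Event]): Model = Model()

  // Kappa-style: every streamed event is folded into the current model.
  def updateIncrementally(model: Model, e: Event): Model = model

  def learnFromStream(events: Iterator[Event], init: Model): Model =
    events.foldLeft(init)(updateIncrementally)
}
```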
# The CCO Recommender: Lambda-style
We have largely solved the problems of calculating the multi-modal