My algorithm works perfectly fine for small inputs,
but when I move to an Amazon machine and try to compute larger inputs, my code
hangs forever.
There is one more thing that I'd check (which I already mentioned [1]). You
reported an HPPC bug once, but that involved a load factor of 1 on maps.
This was
maven-javadoc-plugin generates the javadoc fine for the Mahout Math module:
https://builds.apache.org/job/Mahout-Quality/ws/trunk/math/target/site/apidocs/index.html
The Jenkins javadoc plugin seems to have some issues with showing javadoc for
generated sources:
Dear all,
Can you say a bit more about what you want to do?
I have an algorithm that finds infrequent patterns in a matrix. I use the HPPC
libraries to store the various data needed to traverse the search space and to
store the patterns found. I want to convert it to the same algorithm, but using
the Mahout collections.
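HPPC and the Mahout (ex-Colt) primitive maps have very similar shapes, so the port is largely a rename. A minimal sketch, assuming HPPC's `IntIntOpenHashMap` corresponds to `org.apache.mahout.math.map.OpenIntIntHashMap`, with Mahout's `adjustOrPutValue` playing the role of HPPC's `putOrAdd` (both method names are assumptions to verify against your versions); `java.util.HashMap` stands in below so the sketch runs without either jar:

```java
import java.util.HashMap;
import java.util.Map;

public class PatternCounts {
    // Count pattern occurrences keyed by an int pattern id.
    // HPPC:   IntIntOpenHashMap counts; counts.putOrAdd(id, 1, 1);
    // Mahout: OpenIntIntHashMap counts; counts.adjustOrPutValue(id, 1, 1);
    // (stand-in below uses boxed Integers; the primitive maps avoid that)
    static int countTwice(int id) {
        Map<Integer, Integer> counts = new HashMap<>();
        counts.merge(id, 1, Integer::sum);  // first occurrence -> 1
        counts.merge(id, 1, Integer::sum);  // second occurrence -> 2
        return counts.get(id);
    }

    public static void main(String[] args) {
        System.out.println(countTwice(42));  // 2
    }
}
```

The main thing to watch in such a port is the default value returned for missing keys, since the primitive maps return 0 rather than null.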
Hello,
I have an ARFF/CSV file containing input data that I want to pass to SVD:
Lanczos Singular Value Decomposition.
Which tool should I use to convert it to the required format?
Thanks in advance!
Thanks,
Rajesh
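For the ARFF case, a conversion driver ships with Mahout's integration utilities; as a sketch (the driver and option names are given from memory, so verify them with your Mahout version's `bin/mahout` listing, and the paths and dimensions below are placeholders):

```shell
# Convert the ARFF file into Mahout vectors (a SequenceFile of VectorWritable).
mahout arff.vector --input data.arff --output /vectors --dictOut dict.txt

# Feed the vectors to the Lanczos SVD job; --rank is the number of singular
# values to compute, --numRows/--numCols the matrix dimensions.
mahout svd --input /vectors --output /svd-out --numRows 1000 --numCols 500 --rank 50
```

For CSV there is no single driver I can point to with certainty; the usual fallback is a small custom job that parses each row into a `DenseVector` and writes `VectorWritable`s to a SequenceFile.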
From: Sophie Sperner sophie.sper...@gmail.com
To: user@mahout.apache.org
Sent: Monday, May 20, 2013 6:01 AM
Subject: Re: mahout colt collections
I certainly have questions about this architecture mentioned below but first
let me make sure I understand.
You use the user history vector as a query? This will be a list of item IDs and
strength-of-preference values (maybe 1s for purchases). The cooccurrence matrix
has columns treated like
Hi Pat,
On May 20, 2013, at 9:46am, Pat Ferrel wrote:
Inline answers.
On Mon, May 20, 2013 at 9:46 AM, Pat Ferrel pat.fer...@gmail.com wrote:
...
You use the user history vector as a query?
The most recent suffix of the history vector. How much is used varies by
the purpose.
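The query step being discussed can be sketched in plain Java: score each candidate item by summing the cooccurrence weights between it and the items in the user's recent history, then rank. The matrix values and item ids below are purely hypothetical toy data, not anything from the thread:

```java
import java.util.HashMap;
import java.util.Map;

public class CoocQuery {
    // Score candidates against the user's history (itemId -> preference
    // strength); items already in the history are not recommended again.
    static Map<Integer, Double> score(double[][] cooc, Map<Integer, Double> history) {
        Map<Integer, Double> scores = new HashMap<>();
        for (int item = 0; item < cooc.length; item++) {
            if (history.containsKey(item)) continue;
            double s = 0;
            for (Map.Entry<Integer, Double> e : history.entrySet()) {
                s += cooc[item][e.getKey()] * e.getValue();
            }
            scores.put(item, s);
        }
        return scores;
    }

    public static void main(String[] args) {
        double[][] cooc = {
            {0, 3, 1},   // item 0 co-occurred 3x with item 1, 1x with item 2
            {3, 0, 2},
            {1, 2, 0},
        };
        Map<Integer, Double> history = Map.of(0, 1.0);  // user bought item 0
        Map<Integer, Double> s = score(cooc, history);
        System.out.println(s.get(1) + " " + s.get(2));  // 3.0 1.0
    }
}
```

In production the matrix columns would live in a search index or key-value store rather than an in-memory array, but the scoring is the same sparse dot product.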
This will be a list of item IDs and strength-of-preference values
I think Pat is just saying that

  time(history_lookup) (1) + time(recommendation_calculation) (2) ≈ time(precalc_lookup) (3)

since (1) and (3) are assumed to be served by the same class of system (key-value
store, db) with a single key, and (2) ≈ 0.
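With purely hypothetical numbers, the point becomes concrete: a single-key store round trip dominates, so adding a negligible in-memory calculation barely changes the total.

```java
public class LatencySketch {
    static double online(double lookupMs, double calcMs) {
        return lookupMs + calcMs;
    }

    public static void main(String[] args) {
        double historyLookup = 2.0;  // (1) fetch user history by key (made-up ms)
        double calc = 0.05;          // (2) in-memory recommendation calculation
        double precalcLookup = 2.0;  // (3) fetch precomputed recs by key
        // (1) + (2) is within a few percent of (3), since (2) is negligible
        System.out.println(online(historyLookup, calc));  // 2.05
        System.out.println(precalcLookup);                // 2.0
    }
}
```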
ed is using a lot of information that is available at