Chiming in a bit. Looking at the serialization error, it looks like we are
just one little step away from getting this to work.
Noelia, what does your synthesized data look like? All data that is
processed by Spark needs to be serializable. At some point, a
non-serializable vector object showing
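The rule above — anything Spark ships between driver and executors must be serializable — can be illustrated outside Spark with plain Java serialization, which is what Spark uses by default. This is a hedged sketch, not PredictionIO or UR code; the class names are invented for illustration:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationDemo {

    // A plain class: the default Java serializer cannot ship instances of
    // this to Spark executors, producing a "Task not serializable" /
    // NotSerializableException failure.
    static class PlainVector {
        double[] values = {1.0, 2.0};
    }

    // The same data marked Serializable can be shipped without error.
    static class SerializableVector implements Serializable {
        double[] values = {1.0, 2.0};
    }

    // Returns true if the object survives Java serialization.
    static boolean canSerialize(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(canSerialize(new PlainVector()));        // false
        System.out.println(canSerialize(new SerializableVector())); // true
    }
}
```

If a synthesized data object (or anything it references transitively) lacks Serializable, the job fails only when a task is actually shipped, which is why the error surfaces at runtime in a particular stage rather than at pio build time.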
Pat, you mentioned the problem could be that the data I was using was too
small. So now I'm using the attached data file (4 users and 100 items), but
I'm still getting the same error. Sorry, I forgot to mention that I had
increased the dataset.
The reason why I want to make it work with
Pat, you are absolutely right! I increased the sleep time and now the
integration test for handmade works perfectly.
However, the integration test adapted to run with my tiny app runs into the
same problem I've been having with this app:
[ERROR] [TaskSetManager] Task 1.0 in stage 10.0 (TID 23)
Hi Pat,
I have adapted the integration test for my app. I attach the integration
test as I'm running it, the data file, and the screen output of the
integration test. In the latter you can see the same error as before, even
though I now have much more data.
I wanted to implement a small example (a
What versions of Scala, Spark, PIO, and UR are you using?
On Oct 4, 2017, at 6:10 AM, Noelia Osés Fernández wrote:
Hi all,
I'm still trying to create a very simple app to learn how to use PredictionIO,
and I'm still having trouble. pio build completes with no problem. But when I do