Memory depends on your data and the engine you are using. Spark puts all data
into memory across the Spark cluster, so if that cluster is a single machine,
4g will not allow more than toy or example data. Remember that PIO and machine
learning in general work best with big data.
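As a sketch of how that memory ceiling is usually raised (the 4g values below are assumptions, size them to your data and machine), PIO forwards any arguments after `--` straight to `spark-submit`:

```shell
# Raise Spark driver and executor memory for the training run.
# Everything after the bare "--" is passed through to spark-submit.
# 4g is a placeholder value; adjust to your hardware.
pio train -- --driver-memory 4g --executor-memory 4g
```

The same pass-through works for `pio eval` and other Spark-backed commands.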
BTW my laptop has 16g and
PIO-0.12.0, by default, compiles and runs expecting ES5. If you are upgrading
(rather than installing from clean) you will have an issue, because ES1 indexes
are not upgradable in any simple way. The simplest way to upgrade to pio-0.12.0
and ES5 is to run `pio export` to back up your data BEFORE upgrading, so export
first.
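A minimal sketch of that backup-then-restore flow (the app id and paths are assumptions; substitute your own):

```shell
# Before upgrading: dump the app's events to JSON on disk.
pio export --appid 1 --output /tmp/pio-backup

# ... upgrade PIO and Elasticsearch, re-create the app ...

# After upgrading: load the exported events back in.
pio import --appid 1 --input /tmp/pio-backup
```

The export is plain JSON, so it survives the ES1-to-ES5 index format change that a raw index copy would not.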
2017-10-18 19:21 GMT+02:00 Donald Szeto:
> Looks like an out-of-memory issue here. How much memory does the build
> environment have?
>
A virtual server with 1GB on DigitalOcean. Would a system with 2GB of RAM
be enough?
>
> On Wed, Oct 18, 2017 at 10:08 AM, Luciano
Chiming in a bit. Looking at the serialization error, it looks like we are
just one little step away from getting this to work.
Noelia, what does your synthesized data look like? All data that is
processed by Spark needs to be serializable. At some point, a
non-serializable vector object showing
Hey Noelia,
Since you are using PIO 0.11, you may want to use the previous template
version, which targeted PIO 0.11. The latest version targets 0.12's default
stack, which is Scala 2.11.
The only difference between these two template versions is that they target
different PIO versions.
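Picking the older template version usually means checking out an earlier tag of the template repo. A hedged sketch (the tag name below is a placeholder, not a real release name; list the actual tags first):

```shell
# Clone the template and inspect its release tags.
git clone https://github.com/apache/incubator-predictionio-template-recommender
cd incubator-predictionio-template-recommender
git tag                 # shows available release tags

# "v0.11.0" is a hypothetical placeholder; check out whichever
# tag the template's README says matches PIO 0.11.
git checkout v0.11.0
```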
Hi all,
Seeing as I couldn't get the UR template to work, I decided to try my luck
with other templates. I have now tried the Recommendation template as
downloaded from:
https://github.com/apache/incubator-predictionio-template-recommender
I'm using basically the same setup as before: