Sorry for taking so long to reply.
Here's the original stack trace; it happens directly after spark-submit.
By the way, the code is just a bunch of imports and a mahoutSparkContext
call in the main function, that's it.
bahaa@sparkserver:~/nystrom/samsara$ spark-submit --master "local[*]" target/scala-2.10/double-nystr
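For reference, a minimal sketch of what that driver would look like (the
object name and app name here are made up; the imports and the
mahoutSparkContext call are the standard Samsara Spark bindings usage):

  import org.apache.mahout.math._
  import org.apache.mahout.math.scalabindings._
  import org.apache.mahout.math.drm._
  import org.apache.mahout.sparkbindings._

  // hypothetical object name, matching the truncated jar name above
  object DoubleNystrom {
    def main(args: Array[String]): Unit = {
      // creates a SparkDistributedContext wrapping a SparkContext
      implicit val ctx = mahoutSparkContext(masterUrl = "local[*]",
        appName = "double-nystrom")
    }
  }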
Oh, one more question: are you getting this exception on the front end or
on a worker? Can you provide the stack trace?
One strange thing is that the backend actually should never need the
context; it is only the front end's thing. If you are getting this at the
backend, it probably means you are capturing the context in a closure
somewhere.
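To illustrate the capture problem (a hypothetical sketch, not the actual
code in question): if the context is referenced inside a block closure,
Spark has to serialize it and ship it to the workers, where it is
meaningless:

  import org.apache.mahout.math.scalabindings._
  import org.apache.mahout.math.drm._
  import org.apache.mahout.sparkbindings._

  implicit val ctx = mahoutSparkContext(masterUrl = "local[*]",
    appName = "capture-demo")

  val drmA = drmParallelize(dense((1, 2), (3, 4)))  // front-end call: fine

  // BAD: referencing ctx inside mapBlock drags it into the closure, so
  // Spark attempts to serialize the context when an action runs
  val bad = drmA.mapBlock() { case (keys, block) =>
    println(ctx)
    keys -> block
  }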
This is strange. If you took over the context, added the jars manually,
and it still does not work, there's something wrong with Spark, I guess,
or permissions, or one of those other 1000 things that can go wrong on a
Linux/Spark deployment.
You can try to add any custom jar to your application and call it at the
backend.
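A minimal sketch of such a test (the Probe class and jar name are made
up): put a trivial class in its own jar, add that jar to the job (e.g. via
spark-submit --jars /path/to/probe.jar), and call the class inside a block
operation so it has to resolve on the executors:

  // lives in its own jar, e.g. probe.jar:
  object Probe { def ping(): String = "pong" }

  // driver side:
  import org.apache.mahout.math.scalabindings._
  import org.apache.mahout.math.drm._
  import org.apache.mahout.sparkbindings._

  implicit val ctx = mahoutSparkContext(masterUrl = "local[*]",
    appName = "jar-probe")
  val drmA = drmParallelize(dense((1, 2), (3, 4)))

  // Probe.ping() runs inside the block closure, i.e. at the backend; it
  // fails with ClassNotFoundException there unless the jar was shipped
  val probed = drmA.mapBlock() { case (keys, block) =>
    Probe.ping()
    keys -> block
  }
  probed.collect  // the action forces backend execution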
Yes, 0.11.1.
Are you working off of Mahout 0.11.1? 0.11.1 has been certified for Spark
1.5 but is compatible with 1.6.
Thank you very much for your reply.
As I mentioned earlier, I am using mahoutSparkContext, and MAHOUT_HOME is
set to the correct Mahout path.
I also tried setting up the context myself: I looked into the
implementation of mahoutSparkContext and supplied the jars path manually.
Still the same error.
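For completeness, that manual setup was roughly the following (the jar
paths are placeholders; sc2sdc is the implicit conversion from
SparkContext to SparkDistributedContext in the Spark bindings package):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.mahout.sparkbindings._

  // hand-rolled context: list the Mahout jars explicitly instead of
  // letting mahoutSparkContext discover them through MAHOUT_HOME
  val conf = new SparkConf()
    .setMaster("local[*]")
    .setAppName("double-nystrom")
    .setJars(Seq(  // placeholder paths
      "/path/to/mahout/mahout-math-0.11.1.jar",
      "/path/to/mahout/mahout-math-scala_2.10-0.11.1.jar",
      "/path/to/mahout/mahout-spark_2.10-0.11.1.jar"))

  implicit val ctx: SparkDistributedContext = sc2sdc(new SparkContext(conf))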
Bahaa, first off, I don't think we have certified any of our releases to
run with Spark 1.6 (yet). I think Spark 1.5 is the last known release to
run with the 0.11 series.
Second, if you use the mahoutSparkContext() method to create the context,
it will look for a MAHOUT_HOME setup to add the Mahout binaries to the job.
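So a quick sanity check (a sketch) is to verify on the driver side that
the variable is actually visible to the JVM, since an unexported shell
variable never reaches the spark-submit child process:

  import org.apache.mahout.sparkbindings._

  // mahoutSparkContext resolves the Mahout jars via the MAHOUT_HOME
  // environment variable, so fail fast if the driver cannot see it
  require(sys.env.contains("MAHOUT_HOME"),
    "export MAHOUT_HOME=/path/to/mahout before running spark-submit")

  implicit val ctx = mahoutSparkContext(masterUrl = "local[*]",
    appName = "double-nystrom")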
Greetings Mahout users,
I have been trying to use Mahout Samsara as a library with Scala/Spark,
but I haven't been successful in doing so.
I am running the Spark 1.6.0 binaries; I didn't build it myself.
However, I tried both the readily available binaries on the Apache mirrors
and cloning and compiling Mahout from source.