Hi All,
Just realized the Cloudera version of Spark on my cluster is 1.2, but the jar
I built using Maven is version 1.6, which is causing an issue.
Is there a way to run Spark version 1.6 on a 1.2 installation of Spark?
Thanks
Sri
Hi Sri,
Each node in the cluster where Spark can run will have the 1.2 version of
Spark. If you can, you should upgrade the cluster to Spark 1.6. Otherwise,
you can't run 1.6 on those nodes.
-honain
kali.tumm...@gmail.com wrote:
If you have YARN, you can just launch your Spark 1.6 job from a single
machine with Spark 1.6 available on it and ignore the version of Spark
(1.2) that is installed on the cluster.
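A minimal sketch of what that looks like, assuming a local Spark 1.6 build unpacked under /opt/spark-1.6.0 and the cluster's Hadoop client configs under /etc/hadoop/conf (paths, class name, and jar name are placeholders for illustration):

```shell
# Tell spark-submit where the cluster's YARN/HDFS client configs live
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Submit with the local 1.6 binaries; the cluster's installed 1.2 is not used
/opt/spark-1.6.0/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  my-app-assembly-1.6.jar
```

In cluster deploy mode, YARN ships the 1.6 assembly jar to the containers, so the executors run the 1.6 runtime regardless of what is installed on the nodes.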
On Jan 27, 2016 11:29, "kali.tumm...@gmail.com" wrote:
Hi Koert,
I am submitting my code (a Spark jar) using spark-submit on the proxy node. I
checked the version on the cluster and the node and it says 1.2; I didn't
really understand what you mean.
Can I ask YARN to use a different version of Spark? Or should I override
the SPARK_HOME variable to point at 1.6?
You need to build Spark 1.6 for your Hadoop distro, put that on the proxy
node, and configure it correctly to find your cluster (HDFS and YARN).
Then use the spark-submit script from that Spark 1.6 installation to launch
your application on YARN.
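The steps above could be sketched like this; the Hadoop profile, paths, and the application class/jar are assumptions to adapt to your distro (CDH 5 ships Hadoop 2.6, for example):

```shell
# 1. Build a Spark 1.6 distribution against your Hadoop version
cd spark-1.6.0
./make-distribution.sh --name custom --tgz -Phadoop-2.6 -Pyarn -DskipTests

# 2. Unpack the resulting tarball on the proxy node
tar xzf spark-1.6.0-bin-custom.tgz -C /opt

# 3. Point it at the cluster's client configs, then submit with the 1.6 scripts
export HADOOP_CONF_DIR=/etc/hadoop/conf
/opt/spark-1.6.0-bin-custom/bin/spark-submit --master yarn-cluster \
  --class com.example.MyApp my-app.jar   # hypothetical class and jar names
```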
On Wed, Jan 27, 2016 at 3:11 PM, sri hari kali charan
Sri
Look at the instructions here. They are for 1.5.1, but should also work for
1.6
https://www.linkedin.com/pulse/running-spark-151-cdh-deenar-toraskar-cfa
Deenar
On 27 January 2016 at 20:16, Koert Kuipers wrote:
Thank you very much, well documented.
Thanks
Sri
On Wed, Jan 27, 2016 at 8:46 PM, Deenar Toraskar wrote: