Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-04 Thread Rex X
I wish to use the pivot feature of DataFrame, which has been available since Spark 1.6, but the current cluster runs Spark 1.5. Can we install Spark 2.0 on the master node to work around this? Thanks!
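
For reference, a minimal sketch of the pivot API in question, as it looks in spark-shell against Spark 2.0; the data and column names below are invented purely for illustration:

    scala> import spark.implicits._
    scala> val sales = Seq(("2015", "A", 10), ("2015", "B", 20), ("2016", "A", 30)).toDF("year", "product", "amount")
    scala> sales.groupBy("year").pivot("product").sum("amount").show()

This yields one row per year with one column per product value ("A", "B") holding the summed amounts.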

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-04 Thread Holden Karau
You really shouldn't mix different versions of Spark between the master and worker nodes; if you're going to upgrade, upgrade all of them. Otherwise you may get very confusing failures.

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-10 Thread Felix Cheung
You should be able to get it to work with 2.0 as an uber jar. What type of cluster are you running on? YARN? And what distribution?

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-10 Thread Holden Karau
I don't think a 2.0 uber jar will play nicely on a 1.5 standalone cluster.

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-18 Thread Chris Fregly
You'll see errors like this... "java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -2221986757032131007, local class serialVersionUID = -5447855329526097695" ...when mixing versions of Spark.

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-18 Thread Felix Cheung
Well, an uber jar works on YARN, but not with standalone ;)

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-26 Thread Rex X
Yes, I have a Cloudera cluster with YARN. Could you share more details on how to make this work with an uber jar? Thank you.

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-26 Thread Piotr SmoliƄski
On YARN you submit the whole application. Unless the distribution provider does strange classpath "optimisations", you can simply submit a Spark 2 application alongside Spark 1.5 or 1.6 ones. It is YARN's responsibility to deliver the application files and the Spark assembly to the workers. What's more, …
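
A sketch of such a submission; the install path, config directory, class and jar names are hypothetical, and this assumes a Spark 2.0 distribution unpacked on the gateway/master node only:

    $ export SPARK_HOME=/opt/spark-2.0.0-bin-hadoop2.6    # hypothetical location of the Spark 2.0 build
    $ export HADOOP_CONF_DIR=/etc/hadoop/conf             # point at the cluster's Hadoop/YARN config
    $ $SPARK_HOME/bin/spark-submit \
        --master yarn \
        --deploy-mode cluster \
        --class com.example.PivotJob \
        my-pivot-job.jar

With neither spark.yarn.jars nor spark.yarn.archive configured, Spark 2.0 uploads its own jars from the client, so the 1.5 installation on the workers is not touched.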

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-26 Thread Koert Kuipers
It is also easy to launch many different Spark versions on YARN by simply having them installed side by side. 1) Build Spark for your CDH version. For example, for CDH 5 I do:

    $ git checkout v2.0.0
    $ dev/make-distribution.sh --name cdh5.4-hive --tgz -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.4 -Pyarn
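
As a rough illustration of what comes out of this (the target path is hypothetical, and this continuation is an assumption rather than the exact steps from the original message): make-distribution.sh with --tgz and --name cdh5.4-hive should produce a tarball along the lines of spark-2.0.0-bin-cdh5.4-hive.tgz, which can be unpacked next to the existing install:

    $ tar -xzf spark-2.0.0-bin-cdh5.4-hive.tgz -C /opt    # hypothetical target directory
    $ /opt/spark-2.0.0-bin-cdh5.4-hive/bin/spark-shell --master yarn
    # the 1.5 install shipped with the distro stays untouched; HADOOP_CONF_DIR must
    # point at the cluster config, as in the earlier spark-submit example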

Re: Is Spark 2.0 master node compatible with Spark 1.5 worker node?

2016-09-26 Thread Koert Kuipers
Oh, I forgot: in step 1 you will have to modify Spark's pom.xml to include the Cloudera repository so that it can find the Cloudera artifacts. Anyhow, we found this process to be pretty easy, and we stopped using the Spark versions bundled with the distros.
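
A sketch of the kind of entry this refers to, added inside the <repositories> section of Spark's pom.xml; the URL below is the commonly used Cloudera Maven repository, given here as an assumption rather than something quoted from the thread:

    <repository>
      <!-- assumed Cloudera repo; adjust to whatever repository hosts your CDH artifacts -->
      <id>cloudera</id>
      <name>Cloudera Repository</name>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>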