From: Wendell pwend...@gmail.com
Date: Thursday, August 14, 2014 at 6:32 PM
To: Gary Malouf malouf.g...@gmail.com
Cc: Mingyu Kim m...@palantir.com, dev@spark.apache.org
Subject: Re: [SPARK-3050] Spark program running with 1.0.2 jar cannot run against a 1.0.1 cluster
I commented on the bug. For driver mode, you'll need to get the
corresponding version of spark-submit for Spark 1.0.2.

On Thu, Aug 14, 2014 at 3:43 PM, Gary Malouf malouf.g...@gmail.com wrote:

> To be clear, is it 'compiled' against 1.0.2 or is it packaged with it?
>
> On Thu, Aug 14, 2014 at 6:39 PM, Mingyu Kim m...@palantir.com wrote:
>
>> I ran a really simple code that runs with the Spark 1.0.2 jar and
>> connects to a Spark 1.0.1 cluster, but it fails with
>> java.io.InvalidClassException. I filed the bug at
>> https://issues.apache.org/jira/browse/SPARK-3050. I assumed minor and
>> patch releases shouldn't break compatibility. Is that
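[Editor's note: the `java.io.InvalidClassException` discussed above is what Java serialization throws when the sender's and receiver's copies of a `Serializable` class have different `serialVersionUID` values. When no explicit UID is declared, the JVM derives one from the class's structure, so a class that gains or loses a field between releases gets a new UID. A minimal sketch of that mechanism, using hypothetical `MessageV1`/`MessageV2` classes (not actual Spark classes):]

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;

public class SerialUidDemo {
    // Stand-in for a class as shipped in version "1.0.1": no explicit
    // serialVersionUID, so the JVM derives one from the class's structure.
    static class MessageV1 implements Serializable {
        String payload;
    }

    // The same class after gaining a field in "1.0.2". The derived
    // serialVersionUID changes, so deserializing bytes written by one
    // version with the other throws java.io.InvalidClassException.
    static class MessageV2 implements Serializable {
        String payload;
        long timestamp;
    }

    public static void main(String[] args) {
        long v1 = ObjectStreamClass.lookup(MessageV1.class).getSerialVersionUID();
        long v2 = ObjectStreamClass.lookup(MessageV2.class).getSerialVersionUID();
        System.out.println("V1 uid: " + v1);
        System.out.println("V2 uid: " + v2);
        System.out.println("match: " + (v1 == v2)); // prints "match: false"
    }
}
```

Declaring an explicit `private static final long serialVersionUID` in a class keeps the UID stable across such changes, which is one common way libraries avoid this class of wire-incompatibility.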