I am not sure how the process works and whether patches are applied to all
upcoming versions of Spark. Is it likely that the fix is available in this
build (Spark 1.6.0, 17-Dec-2015 09:02)?
http://people.apache.org/~pwendell/spark-nightly/spark-master-bin/latest/
Thanks!
On Wed, Dec 16, 2015 at 9:22
As far as I can tell, it is not in the 1.6.0 RC.
You can comment on the JIRA, requesting a backport to 1.6.1.
Cheers
On Thu, Dec 17, 2015 at 5:28 AM, Saiph Kappa wrote:
> I am not sure how the process works and if patches are applied to all
> upcoming versions of spark. Is it
Since both Scala and Java files are involved in the PR, I don't see an easy
workaround without building Spark yourself.
Cheers
On Wed, Dec 16, 2015 at 10:18 AM, Saiph Kappa wrote:
> Exactly, but it's only fixed for the next spark version. Is there any work
> around for version
Exactly, but it's only fixed for the next Spark version. Is there any
workaround for version 1.5.2?
On Wed, Dec 16, 2015 at 4:36 PM, Ted Yu wrote:
> This seems related:
> [SPARK-10123][DEPLOY] Support specifying deploy mode from configuration
>
> FYI
>
> On Wed, Dec 16,
This seems related:
[SPARK-10123][DEPLOY] Support specifying deploy mode from configuration
FYI
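
If your build already contains that change, the idea (as I understand it; the
exact property key is an assumption on my part) is that the deploy mode becomes
an ordinary configuration property instead of something only --deploy-mode can
set, roughly like this:

import org.apache.spark.SparkConf

// Sketch only: assumes the key introduced by that change is
// "spark.submit.deployMode"; it would normally live in spark-defaults.conf
// or be passed via --conf to spark-submit rather than hard-coded.
val conf = new SparkConf()
  .setAppName("App")
  .setMaster("spark://host1:7077")            // standalone master, port assumed
  .set("spark.submit.deployMode", "cluster")  // instead of --deploy-mode cluster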
On Wed, Dec 16, 2015 at 7:31 AM, Saiph Kappa wrote:
> Hi,
>
> I have a client application running on host0 that is launching multiple
> drivers on multiple remote standalone
Hi,
I have a client application running on host0 that is launching multiple
drivers on multiple remote standalone Spark clusters (each cluster is
running on a single machine):
...
List("host1", "host2", "host3").foreach(host => {
  val sparkConf = new SparkConf()
  sparkConf.setAppName("App")
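
For context, a minimal sketch of the per-cluster configuration being described;
the master URL format, port 7077, and the body of the loop are assumptions on
my part, since the original snippet is cut off above:

import org.apache.spark.SparkConf

// Sketch only: one standalone master per machine is assumed.
List("host1", "host2", "host3").foreach { host =>
  val sparkConf = new SparkConf()
  sparkConf.setAppName("App")
  sparkConf.setMaster(s"spark://$host:7077")
  // On 1.5.2 there is no configuration property for the deploy mode, so a
  // driver launched this way cannot be told to run in cluster mode on the
  // remote machine; that is what SPARK-10123 addresses in later builds.
  // ...
}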