Hi,
I just configured my cluster to run with 1.4.0-rc2; alas, the dependency
jungle does not let one just download, configure, and start. Instead one
will have to fiddle with sbt settings for the next couple of nights:
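For illustration, the kind of sbt fiddling meant here usually amounts to pinning the transitive akka and mesos versions to what the cluster actually runs. This is only a sketch under assumptions — the version pins below are taken from versions mentioned elsewhere in this thread, not from the poster's actual build:

```scala
// build.sbt fragment -- a sketch, not the poster's actual settings.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"

// Force the transitive akka/mesos versions to match the cluster's runtime,
// instead of whatever Spark pulls in transitively.
dependencyOverrides ++= Set(
  "com.typesafe.akka" %% "akka-actor"  % "2.3.10",
  "com.typesafe.akka" %% "akka-remote" % "2.3.10",
  "org.apache.mesos"  %  "mesos"       % "0.22.1"
)
```

With a mismatch between the driver's akka-remote and the executors', warnings like the ReliableDeliverySupervisor one below are a typical symptom.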
2015-05-26 14:50:52,686 WARN a.r.ReliableDeliverySupervisor -
Hello,
I am using Spark 1.3.1-hadoop2.4 with Mesos 0.22.1 with zookeeper and
running on a cluster with 3 nodes on 64bit ubuntu.
My application is compiled with spark 1.3.1 (apparently with a mesos
0.21.0 dependency), hadoop 2.5.1-mapr-1503 and akka 2.3.10. Only with
this combination do I have
On Mon, May 25, 2015 at 2:43 PM, Reinis Vicups sp...@orbit-x.de wrote:
Hello,
I assume I am running Spark in fine-grained mode, since I haven't
changed the default here.
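That assumption matches the Spark 1.x behaviour on Mesos: fine-grained is the default, and coarse-grained mode is opted into via `spark.mesos.coarse`. A minimal sketch (the master URL is a placeholder, not the poster's actual ZooKeeper ensemble):

```scala
// Sketch only: spark.mesos.coarse defaults to false (fine-grained) in
// Spark 1.x, so leaving it unset means fine-grained mode.
val conf = new org.apache.spark.SparkConf()
  .setMaster("mesos://zk://zk1:2181/mesos") // placeholder ZK address
  .set("spark.mesos.coarse", "false")       // explicit here; false is the default
```

The same property can equally be set in spark-defaults.conf rather than in code.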
One question regarding 1.4.0-RC1 - is there a Maven snapshot repository I
could use for my project config? (I know that I have to download the source
and run make-distribution for the executor as well.)
Great hints, you guys!
Yes, spark-shell worked fine with Mesos as master. I haven't tried to
execute multiple RDD actions in a row though (I did a couple of
successful counts on the HBase tables I am working with in several
experiments, but nothing that would compare to the stuff my spark jobs
are
Here is a link for builds of 1.4 RC2:
http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc2-bin/
For a Maven repo, I believe the RC2 artifacts are here:
https://repository.apache.org/content/repositories/orgapachespark-1104/
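To pull the RC artifacts from that staging repository in an sbt project, something like the following should work. This is a sketch under an assumption: the exact version string the RC is staged under (plain "1.4.0" here) is guessed, so check the repository listing:

```scala
// build.sbt fragment -- a sketch; verify the staged version string
// against the repository before relying on it.
resolvers += "Apache Spark 1.4.0 RC2 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1104/"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
```

Staging repositories are temporary and are dropped or promoted once the vote closes, so this is only useful while the RC is under vote.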
A few experiments you might try:
1. Does spark-shell work?