Are you using 0.8.1? It will build with protobuf 2.5 instead of 2.4 as long as
you make it depend on Hadoop 2.2. But make sure you build it with
SPARK_HADOOP_VERSION=2.2.0 (or whichever 2.2.x version you're targeting).
Spark 0.8.0 doesn’t support Hadoop 2.2 due to this issue.
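For example, a 0.8.1 build against Hadoop 2.2 with YARN support would look
something like this (a sketch, assuming the sbt assembly build documented for
the 0.8.x releases; adjust the version to whatever 2.2.x you run):

    # build Spark 0.8.1 against Hadoop 2.2.x, with YARN support enabled
    SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly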
Matei
On Dec 15, 2013, at 10:25 PM, Azuryy Yu wrote:
Thanks Evan, I tried it and the new SBT direct import seems to work well,
though I did run into issues with some YARN imports in Spark.
On Thu, Dec 12, 2013 at 7:03 PM, Evan Chan e...@ooyala.com wrote:
Nick, have you tried using the latest Scala plug-in, which features native
SBT project import?
Great job everyone! A big step forward.
On Sat, Dec 14, 2013 at 2:37 AM, andy.petre...@gmail.com wrote:
That's very good news!
Congrats
Sent from my HTC
- Reply message -
From: Sam Bessalah samkil...@gmail.com
To: dev@spark.incubator.apache.org
Any news regarding this setting? Is this expected behaviour? Is there some
other way I can have Spark fail fast?
Thanks!
On Mon, Dec 9, 2013 at 4:35 PM, Grega Kešpret gr...@celtra.com wrote:
Hi!
I tried this (by setting spark.task.maxFailures to 1) and it still does
not fail fast. I started
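A sketch of one way to set it on 0.8, assuming the system-property route via
SPARK_JAVA_OPTS applies here:

    # give each task a single attempt, i.e. no retries
    SPARK_JAVA_OPTS="-Dspark.task.maxFailures=1" ./spark-shell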
I just merged your pull request
https://github.com/apache/incubator-spark/pull/245
On Mon, Dec 16, 2013 at 2:12 PM, Grega Kešpret gr...@celtra.com wrote:
Any news regarding this setting? Is this expected behaviour? Is there some
other way I can have Spark fail fast?
Thanks!
On Mon, Dec 9,
I guess it should really be the maximum number of total task run attempts.
At least that's what it looks like logically. In that sense, the rest of the
documentation is correct (it should be at least 1; 1 means the task is allowed
no retries, since 1 - 1 = 0).
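Put differently, if maxFailures counts total attempts (my assumption here),
the arithmetic works out as:

    # assumed semantics: allowed retries = spark.task.maxFailures - 1
    #   spark.task.maxFailures=1  -> 1 attempt,  0 retries (fail fast)
    #   spark.task.maxFailures=4  -> 4 attempts, 3 retries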
On Fri, Nov 29, 2013 at 2:02 AM, Grega Kešpret