Encrypting jobs submitted by the client

2016-02-01 Thread eugene miretsky
Spark supports client authentication via a shared secret or Kerberos (on YARN). However, the job itself is sent unencrypted over the network. Is there a way to encrypt the jobs the client submits to the cluster? The rationale for this is very similar to encrypting the HTTP file server traffic - Jars
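
For reference, a minimal sketch of the security settings that do exist in this generation of Spark (assuming Spark 1.6-era configuration keys; paths and secrets are placeholders). These cover RPC authentication and SSL for the HTTP file server, but, per the question above, not the submitted job itself:

    import org.apache.spark.SparkConf

    // Sketch only: shared-secret authentication plus SSL (spark.ssl.* can also
    // be namespaced, e.g. spark.ssl.fs.* for the file server). Values are
    // placeholders; on YARN the secret is generated automatically.
    val conf = new SparkConf()
      .set("spark.authenticate", "true")
      .set("spark.authenticate.secret", "<shared-secret>")
      .set("spark.ssl.enabled", "true")
      .set("spark.ssl.keyStore", "/path/to/keystore.jks")
      .set("spark.ssl.keyStorePassword", "<keystore-password>")
      .set("spark.ssl.trustStore", "/path/to/truststore.jks")
      .set("spark.ssl.trustStorePassword", "<truststore-password>")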

Re: Spark 1.6.1

2016-02-01 Thread Michael Armbrust
We typically do not allow changes to the classpath in maintenance releases. On Mon, Feb 1, 2016 at 8:16 AM, Hamel Kothari wrote: > I noticed that the Jackson dependency was bumped to 2.5 in master for > something spark-streaming related. Is there any reason that this

Re: Scala 2.11 default build

2016-02-01 Thread Jakob Odersky
Awesome! +1 on Steve Loughran's question: how does this affect support for 2.10? Do future contributions need to work with Scala 2.10? cheers On Mon, Feb 1, 2016 at 7:02 AM, Ted Yu wrote: > The following jobs have been established for build against Scala 2.10: > >

Re: Scala 2.11 default build

2016-02-01 Thread Reynold Xin
Yes, they do. We haven't dropped 2.10 support yet; there are too many active 2.10 deployments out there. On Mon, Feb 1, 2016 at 11:33 AM, Jakob Odersky wrote: > Awesome! > +1 on Steve Loughran's question, how does this affect support for > 2.10? Do future contributions need

Re: Secure multi-tenancy in standalone mode

2016-02-01 Thread Ted Yu
w.r.t. running Spark on YARN, there are a few outstanding issues, e.g. SPARK-11182 (HDFS Delegation Token). See also the comments under SPARK-12279. FYI On Mon, Feb 1, 2016 at 1:02 PM, eugene miretsky wrote: > When having multiple users sharing the same Spark cluster,

Secure multi-tenancy in standalone mode

2016-02-01 Thread eugene miretsky
When having multiple users sharing the same Spark cluster, it's a good idea to isolate the users - make sure that each user runs under a different Linux account and prevent them from accessing data in jobs submitted by other users. Is it currently possible to do this with Spark? The only thing I

Re: sbt publish-local fails with 2.0.0-SNAPSHOT

2016-02-01 Thread Mike Hynes
Thank you Saisai for the JIRA/PR; I'm glad to see it is a one-line fix, and will try this locally in the interim. Mike On 2/1/16, Saisai Shao wrote: > I think it is due to our recent changes to override the external resolvers > in sbt building profile, I just created a

Re: sbt publish-local fails with 2.0.0-SNAPSHOT

2016-02-01 Thread Saisai Shao
I think it is due to our recent changes that override the external resolvers in the sbt build profile. I just created a JIRA ( https://issues.apache.org/jira/browse/SPARK-13109) to track this. On Mon, Feb 1, 2016 at 3:01 PM, Mike Hynes <91m...@gmail.com> wrote: > Hi devs, > > I used to be able to
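
To illustrate the failure mode described above, a generic build.sbt sketch (not the actual Spark build change tracked in SPARK-13109): assigning externalResolvers directly replaces sbt's default resolver chain, silently dropping the local Ivy repository that publish-local writes to, so locally published 2.0.0-SNAPSHOT artifacts stop resolving.

    // Generic sketch, not the Spark build code: this override drops the local
    // Ivy repository that `sbt publish-local` publishes to.
    externalResolvers := Seq(
      "central" at "https://repo1.maven.org/maven2/"
    )

    // One style of one-line fix: put the local resolver back at the front.
    externalResolvers ~= (Resolver.defaultLocal +: _)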

Re: Spark 1.6.1

2016-02-01 Thread Hamel Kothari
I noticed that the Jackson dependency was bumped to 2.5 in master for something spark-streaming related. Is there any reason that this upgrade can't be included with 1.6.1? According to later comments on this thread: https://issues.apache.org/jira/browse/SPARK-8332 and my personal experience
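
Independent of what ships in 1.6.1, a downstream application can usually pin Jackson itself. A hedged build.sbt sketch (sbt 0.13 style; the artifacts and version shown are illustrative, not taken from this thread):

    // dependencyOverrides forces the chosen Jackson version regardless of what
    // Spark or other libraries pull in transitively. Version is illustrative.
    dependencyOverrides ++= Set(
      "com.fasterxml.jackson.core"   %  "jackson-core"         % "2.4.4",
      "com.fasterxml.jackson.core"   %  "jackson-databind"     % "2.4.4",
      "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
    )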

Spark Executor retries infinitely

2016-02-01 Thread Prabhu Joseph
Hi All, When a Spark job (Spark 1.5.2) is submitted with a single executor and a user passes some wrong JVM arguments with spark.executor.extraJavaOptions, the first executor fails. But the job keeps on retrying, creating a new executor and failing every time, until CTRL-C is pressed. Do
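
A sketch of the setup being described (Spark 1.5.2-era API; the broken JVM flag below is hypothetical): every executor JVM launched with it exits at startup, and the cluster manager keeps launching replacements until the application is killed.

    import org.apache.spark.{SparkConf, SparkContext}

    // Reproduction sketch; the mistyped JVM flag is hypothetical. Each executor
    // dies immediately at JVM startup, and a replacement executor is requested
    // again and again until the driver is stopped (e.g. with CTRL-C).
    val conf = new SparkConf()
      .setAppName("executor-retry-demo")
      .set("spark.executor.extraJavaOptions", "-XX:+UseBogusCollector")
    val sc = new SparkContext(conf)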

Re: Scala 2.11 default build

2016-02-01 Thread Steve Loughran
On 30 Jan 2016, at 08:22, Reynold Xin wrote: FYI - I just merged Josh's pull request to switch to Scala 2.11 as the default build. https://github.com/apache/spark/pull/10608 Does this mean that Scala 2.10 compatibility & testing are no

[ANNOUNCE] New SAMBA Package = Spark + AWS Lambda

2016-02-01 Thread David Russell
Hi all, Just sharing news of the release of a newly available Spark package, SAMBA (https://github.com/onetapbeyond/lambda-spark-executor). SAMBA is an Apache Spark

Re: Scala 2.11 default build

2016-02-01 Thread Ted Yu
The following jobs have been established to build against Scala 2.10: https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Compile/job/SPARK-master-COMPILE-MAVEN-SCALA-2.10/ https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Compile/job/SPARK-master-COMPILE-sbt-SCALA-2.10/ FYI On