Hi,
I would like to use Spark as a service through REST API calls
for uploading and submitting a job, getting results, etc.
There is a project by the folks at Ooyala:
https://github.com/spark-jobserver/spark-jobserver
I also came across some hidden (undocumented) job-submission REST APIs in Spark.
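For anyone curious, the undocumented standalone-mode submission endpoint can be exercised with plain curl. This is only a sketch of a command fragment, assuming a standalone master whose REST server is on its default port 6066; the host name, jar path, and main class below are placeholders:

```shell
# Sketch only: submit a job through Spark's hidden standalone REST API.
# Assumes a standalone master reachable at spark-master:6066; all paths,
# hosts, and class names are placeholders.
curl -X POST http://spark-master:6066/v1/submissions/create \
  --header "Content-Type: application/json" \
  --data '{
    "action": "CreateSubmissionRequest",
    "appResource": "hdfs:///apps/my-app.jar",
    "mainClass": "com.example.MyApp",
    "appArgs": [],
    "clientSparkVersion": "1.5.1",
    "environmentVariables": {"SPARK_ENV_LOADED": "1"},
    "sparkProperties": {
      "spark.master": "spark://spark-master:7077",
      "spark.app.name": "my-app",
      "spark.jars": "hdfs:///apps/my-app.jar"
    }
  }'

# The response contains a submission id, which can then be polled:
# curl http://spark-master:6066/v1/submissions/status/<submission-id>
```

Note this endpoint only covers submit/status/kill, so for uploading jars and fetching results the jobserver project above is still the more complete fit.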
Unfortunately, no. I switched back to OpenJDK 1.7.
Didn't get a chance to dig deeper.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-on-YARN-using-Java-1-8-fails-tp24925p25360.html
Sent from the Apache Spark User List mailing list archive at
Hi,
I've deployed a Secure YARN 2.7.1 cluster with HDFS encryption and am trying
to run the pyspark shell using Spark 1.5.1.
The pyspark shell works, and I can run a sample job to calculate Pi just fine.
However, when I try to stop the current context (e.g., sc.stop()) and then
create a new context
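Roughly, the sequence I'm describing is the following (a sketch only, assuming yarn-client mode; the app name is arbitrary):

```shell
# Sketch of the reproduction on a secure YARN cluster (Spark 1.5.1).
# Feeds a short session to the pyspark shell in yarn-client mode.
pyspark --master yarn-client <<'EOF'
# The sample Pi-style job works fine on the initial context:
sc.parallelize(range(1000)).count()

# Stopping the context and creating a new one is where trouble starts:
sc.stop()
from pyspark import SparkContext, SparkConf
sc = SparkContext(conf=SparkConf().setAppName("restarted-context"))
EOF
```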
Hi,
I have successfully run pyspark on Spark 1.5.1 on YARN 2.7.1 with Java
OpenJDK 1.7.
However, when I run the same test on Java OpenJDK 1.8 (or Oracle Java 1.8),
I cannot start up pyspark.
Has anyone been able to run Spark on YARN with Java 1.8?
I get an "ApplicationMaster disassociated" error.
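Two things worth ruling out, sketched below as command/config fragments (all paths and numbers are placeholders, not a diagnosis): first, that the YARN containers are actually picking up the JDK you intend; second, that the ApplicationMaster isn't being killed for exceeding its container memory, since Java 8 has a larger off-heap footprint than Java 7 and bumping the overhead is a cheap experiment:

```shell
# Sketch only: pin the JDK seen on the driver side and inside YARN
# containers, and give the AM extra off-heap headroom. Paths, sizes,
# and the app jar/class are placeholders.

# 1) In conf/spark-env.sh on the submitting machine:
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk

# 2) Passed through to the YARN containers at submit time:
spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk \
  --conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk \
  --conf spark.yarn.am.memoryOverhead=512 \
  --class com.example.MyApp my-app.jar
```

If the run survives with the larger overhead, the YARN NodeManager logs from the failing runs should show the container being killed for exceeding its memory limit.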
Hi,
I've noticed that Spark apps run significantly slower on Mesos than in
standalone mode or on YARN.
I don't think that should be the case, so I'm posting the problem here in
case someone has an explanation
or can point me to a configuration option I've missed.
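One option that often explains a Mesos slowdown (a guess, not a diagnosis) is fine-grained mode, which was the default in this era of Spark and pays a scheduling cost on every task launch. Switching to coarse-grained mode is worth trying; a sketch of a command fragment, with the master URL, core cap, and app jar/class as placeholders:

```shell
# Sketch only: run in coarse-grained Mesos mode, which holds onto
# resources for the app's lifetime instead of launching per-task.
spark-submit \
  --master mesos://mesos-master:5050 \
  --conf spark.mesos.coarse=true \
  --conf spark.cores.max=8 \
  --class com.example.MyApp my-app.jar
```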
I'm running the
I'm trying to compare the performance of Spark running on Mesos vs YARN.
However, I'm having trouble configuring the Spark workload to
run in a comparable way on Mesos and YARN.
When running Spark on YARN, you can specify the number of executors per
node. So if I have a node with 4
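For concreteness, the knobs differ between the two schedulers: on YARN you size executors explicitly, while in coarse-grained Mesos mode of this era you mostly cap total cores, with no per-node executor-count knob. A sketch of the two submit commands (all numbers, hosts, and the app jar/class are placeholders):

```shell
# YARN: explicit executor count and size.
spark-submit --master yarn-cluster \
  --num-executors 4 --executor-cores 2 --executor-memory 4g \
  --class com.example.MyApp my-app.jar

# Mesos (coarse-grained): cap total cores and per-executor memory.
# Spark 1.5 runs at most one executor per Mesos slave, sized up to
# spark.cores.max across the cluster.
spark-submit --master mesos://mesos-master:5050 \
  --conf spark.mesos.coarse=true \
  --conf spark.cores.max=8 --executor-memory 4g \
  --class com.example.MyApp my-app.jar
```

Getting the two layouts to match may require picking spark.cores.max so that cores-per-node times node count lines up with what --num-executors and --executor-cores give you on YARN.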