Hi all,
Can Spark set a worker node's environment the way a Linux shell does, e.g.:
--conf
spark.yarn.appMasterEnv.PYTHONPATH=./feature-server:$PYTHONPATH ?
It does not behave like a Linux shell: the $PYTHONPATH reference is not expanded.
I just want to append a path to PYTHONPATH on the worker nodes rather than overwrite it.
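For example, what I am after is something like this PySpark sketch, where
/existing/path is only a placeholder for whatever should already be on the
workers' PYTHONPATH (the value is taken literally, not expanded by a shell):

from pyspark import SparkConf, SparkContext

conf = SparkConf()
# $PYTHONPATH is not expanded here, so the paths have to be written out
# explicitly; "/existing/path" is a placeholder, not a real setting.
conf.set("spark.yarn.appMasterEnv.PYTHONPATH", "./feature-server:/existing/path")
conf.setExecutorEnv("PYTHONPATH", "./feature-server:/existing/path")
sc = SparkContext(conf=conf)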
Thanks for any answer!
Here is an example pyspark program which illustrates this problem. If run
using spark-submit, the default configurations for Spark do not seem to be
loaded when a new SparkConf object is instantiated (contrary to what the
loadDefaults=True keyword argument implies). When using the interactive shell for
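A minimal sketch of such a program might look like this (the printout would
be expected to include the spark-defaults.conf entries, but under
spark-submit it can come back empty):

from pyspark import SparkConf

# loadDefaults=True suggests spark-defaults.conf and --conf settings
# should be visible here, but when run via spark-submit (before any
# SparkContext exists) the list may be empty.
conf = SparkConf(loadDefaults=True)
for key, value in conf.getAll():
    print("%s=%s" % (key, value))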
In the master branch, build/sbt-launch-lib.bash has the following:
URL1=https://dl.bintray.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/${SBT_VERSION}/sbt-launch.jar
I verified that this URL exists.
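One way to check that the resolved URL exists, e.g. for SBT_VERSION=0.13.7
(a sketch, not the script's own logic; the URL is the one quoted above):

import urllib.request

# HEAD request: a 200 response means the launcher jar is published there
url = ("https://dl.bintray.com/typesafe/ivy-releases/"
       "org.scala-sbt/sbt-launch/0.13.7/sbt-launch.jar")
req = urllib.request.Request(url, method="HEAD")
print(urllib.request.urlopen(req).getcode())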
That's the correct URL. Recent change? The last time I looked, earlier this
week, it still had the obsolete artifactory URL for URL1 ;)
Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
http://shop.oreilly.com/product/0636920033073.do (O'Reilly)
Typesafe http://typesafe.com
@deanwampler
Looks like Sean fixed it:
[SPARK-9633] [BUILD] SBT download locations outdated; need an update
Cheers
I recently downloaded the Spark 1.4.0 package.
A build of Spark with sbt/sbt clean assembly failed with the message Error:
Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar
Upon investigation I figured out that sbt-launch-0.13.7.jar is downloaded
at build time and that it contained the
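Since a jar is just a zip archive, a quick way to see what actually got
downloaded (a sketch; assumes it is run from the Spark source root):

import zipfile

path = "build/sbt-launch-0.13.7.jar"
if zipfile.is_zipfile(path):
    print("valid jar/zip")
else:
    # a corrupt download (e.g. an error page saved as the jar) fails the
    # zip check; dump the first bytes to see what the file really is
    with open(path, "rb") as f:
        print("not a zip; first bytes: %r" % f.read(64))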