Hi Andrew,
I'm actually using spark-submit, and I tried using
spark.executor.extraJavaOpts to configure tachyon client to connect to
Tachyon HA master, however the configuration settings were not picked up.
On the other hand when I set the same tachyon configuration parameters
through
I'm still struggling with SPARK_JAVA_OPTS being deprecated. I am using Spark
standalone.
For example, if I have an Akka timeout setting that I would like to be
applied to every piece of the Spark framework (the Spark master, Spark
workers, Spark executor sub-processes, spark-shell, etc.). I used to do
Hi Koert and Lukasz,
The recommended way of not hard-coding configurations in your application
is through conf/spark-defaults.conf as documented here:
http://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties.
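As a sketch, a conf/spark-defaults.conf carrying the kinds of settings discussed in this thread might look like the following (the master URL, ZooKeeper hosts, and frame size are illustrative values from this thread, not recommendations):

```
spark.master                     spark://master:7077
spark.akka.frameSize             1
spark.executor.extraJavaOptions  -Dtachyon.zookeeper.address=zookeeperHost1:2181,zookeeperHost2:2181
```

Each line is a property name and value separated by whitespace; spark-submit reads this file automatically.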
However, this is only applicable to
spark-submit, so
For a JVM application it's not very appealing to me to use spark-submit.
My application uses Hadoop, so I should use hadoop jar, and my
application uses Spark, so it should use spark-submit. If I add a piece
of code that uses some other system, there will be yet another suggested way
to launch it.
Hi,
I'm facing a similar problem.
According to: http://tachyon-project.org/Running-Spark-on-Tachyon.html
in order to allow the Tachyon client to connect to the Tachyon master in HA
mode, you need to pass 2 system properties:
-Dtachyon.zookeeper.address=zookeeperHost1:2181,zookeeperHost2:2181
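One way to forward such a system property to the executors is through spark-submit's --conf flags; a hypothetical invocation (the application class, jar name, and ZooKeeper hosts are placeholders) might look like:

```
spark-submit \
  --conf "spark.executor.extraJavaOptions=-Dtachyon.zookeeper.address=zookeeperHost1:2181,zookeeperHost2:2181" \
  --conf "spark.driver.extraJavaOptions=-Dtachyon.zookeeper.address=zookeeperHost1:2181,zookeeperHost2:2181" \
  --class MyApp myapp.jar
```

The value is quoted so the shell passes the whole -D flag through as one property value.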
Hey Patrick,
I have a SparkConf I can add them to. I was looking for a way to do this
where they are not hardwired within Scala, which is what SPARK_JAVA_OPTS
used to do.
I guess if I just set -Dspark.akka.frameSize=1 on my Java app launch,
then it will get picked up by the SparkConf too.
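That should indeed work: a SparkConf created with its default constructor loads every JVM system property whose name starts with "spark.". A minimal sketch, assuming Spark is on the classpath (System.setProperty here stands in for the -D launch flag):

```scala
import org.apache.spark.SparkConf

// Stands in for launching the JVM with -Dspark.akka.frameSize=1
System.setProperty("spark.akka.frameSize", "1")

// loadDefaults is true by default, so spark.* system properties are copied in
val conf = new SparkConf()
assert(conf.get("spark.akka.frameSize") == "1")
```

Properties set explicitly on the SparkConf afterwards take precedence over the system-property defaults.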
Just wondering - how are you launching your application? If you want
to set values like this, the right way is to add them to the SparkConf
when you create a SparkContext.
val conf = new SparkConf()
  .set("spark.akka.frameSize", "1")  // set takes String keys and values
  .setAppName(...)
  .setMaster(...)
val sc = new SparkContext(conf)
-
I have some settings that I think are relevant for my application. They are
spark.akka settings, so I assume they are relevant for both executors and my
driver program.
I used to do:
SPARK_JAVA_OPTS=-Dspark.akka.frameSize=1
Now this is deprecated. The alternatives mentioned are:
* some