Ok, I found it in JIRA as SPARK-2390:
https://issues.apache.org/jira/browse/SPARK-2390
So it looks like this is a known issue.

From: alee...@hotmail.com
To: user@spark.apache.org
Subject: spark-1.0.0-rc11 2f1dc868 spark-shell not honoring --properties-file option?
Date: Tue, 8 Jul 2014 15:17:00 -0700

Build: Spark 1.0.0 rc11 (git commit tag: 2f1dc868e5714882cf40d2633fb66772baf34789)

Hi All,
When I enabled event logging in spark-defaults.conf, spark-shell broke while spark-submit works.
I'm trying to create a separate directory per user, using the env variable $USER in spark-defaults.conf, so that each user's Spark job event logs are kept separately.
Here's the spark-defaults.conf I specified so that the HistoryServer can pick up these event logs from HDFS. As you can see, I was trying to create a directory for each user so they can store their event logs on a per-user basis. However, when I launch spark-shell, it doesn't pick up $USER as the current login user, whereas this works for spark-submit.
Here are more details.
/opt/spark/ is SPARK_HOME

[test@ ~]$ cat /opt/spark/conf/spark-defaults.conf
# Default system properties included when running spark-submit.
# This is useful for setting default environmental settings.

# Example:
# spark.master            spark://master:7077
spark.eventLog.enabled    true
spark.eventLog.dir        hdfs:///user/$USER/spark/logs/
# spark.serializer        org.apache.spark.serializer.KryoSerializer

and I tried to create a separate config file to override the default one:

[test@ ~]$ SPARK_SUBMIT_OPTS="-XX:MaxPermSize=256m" /opt/spark/bin/spark-shell \
    --master yarn \
    --driver-class-path /opt/hadoop/share/hadoop/mapreduce/lib/hadoop-lzo.jar \
    --properties-file /home/test/spark-defaults.conf

[test@ ~]$ cat /home/test/spark-defaults.conf
# Default system properties included when running spark-submit.
# This is useful for setting default environmental settings.

# Example:
# spark.master            spark://master:7077
spark.eventLog.enabled    true
spark.eventLog.dir        hdfs:///user/test/spark/logs/
# spark.serializer        org.apache.spark.serializer.KryoSerializer
But that didn't work either; it is still reading /opt/spark/conf/spark-defaults.conf. According to the documentation (http://spark.apache.org/docs/latest/configuration.html), the precedence should be: hardcoded properties in SparkConf > flags passed to spark-submit / spark-shell > conf/spark-defaults.conf.
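As a quick sanity check (a minimal sketch; sc is the SparkContext that spark-shell creates at startup), you can inspect from inside the REPL which values actually took effect:

// Inside the spark-shell REPL: inspect the effective configuration to see
// which spark-defaults.conf actually won.
sc.getConf.get("spark.eventLog.dir")                    // expect hdfs:///user/test/spark/logs/
sc.getConf.getBoolean("spark.eventLog.enabled", false)  // expect true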
Two problems here:
1. In repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala, the SparkConf instance doesn't look for the user-specified spark-defaults.conf anywhere. I don't see anything that pulls in the file given by the --properties-file option; it only uses the default location conf/spark-defaults.conf:

val conf = new SparkConf()
  .setMaster(getMaster())
  .setAppName("Spark shell")
  .setJars(jars)
  .set("spark.repl.class.uri", intp.classServer.uri)
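For reference, a minimal sketch of the kind of fix I have in mind (loadPropertiesFile is a hypothetical helper here, not an existing Spark API): read the file passed via --properties-file with java.util.Properties and seed the shell's SparkConf with the spark.* entries before the SparkContext is created:

import java.io.FileInputStream
import java.util.Properties
import scala.collection.JavaConverters._
import org.apache.spark.SparkConf

// Hypothetical helper: read a Java properties file and keep the spark.* keys.
def loadPropertiesFile(path: String): Map[String, String] = {
  val props = new Properties()
  val in = new FileInputStream(path)
  try props.load(in) finally in.close()
  props.asScala.toMap.filter { case (k, _) => k.startsWith("spark.") }
}

// Sketch: seed the shell's SparkConf from the user-specified file, letting
// values set explicitly in code still take precedence via setIfMissing.
val conf = new SparkConf()
loadPropertiesFile("/home/test/spark-defaults.conf").foreach {
  case (k, v) => conf.setIfMissing(k, v)
}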
2. $USER isn't expanded in spark-shell. This may be a separate problem that would get fixed at the same time if spark-shell reused the way SparkSubmit.scala builds up its SparkConf.
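Since the properties file is read directly rather than by a shell, the expansion would have to be done by Spark itself. A minimal sketch of what that could look like (expandEnvVars is hypothetical, not an existing Spark function):

import scala.util.matching.Regex

// Hypothetical sketch: substitute $VAR references in a property value with
// the corresponding environment variable, leaving unknown ones untouched.
val EnvVar: Regex = """\$(\w+)""".r
def expandEnvVars(value: String): String =
  EnvVar.replaceAllIn(value, m =>
    Regex.quoteReplacement(sys.env.getOrElse(m.group(1), m.matched)))

expandEnvVars("hdfs:///user/$USER/spark/logs/")
// e.g. "hdfs:///user/test/spark/logs/" when run as user "test"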