Hi, I think it should be accessible via the SparkConf held by the SparkContext — something like sc.getConf.get("spark.app.name")?
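A minimal sketch of that lookup pattern (note: SparkContext also exposes the name directly as sc.appName). Since this snippet can't assume a running Spark cluster, a plain Map stands in for the SparkConf key/value store here; in a real driver you would call sc.getConf.get("spark.app.name") instead:

```scala
// Hedged sketch: a Map stands in for SparkConf's key/value settings.
// In an actual Spark driver: sc.getConf.get("spark.app.name") or sc.appName.
object AppNameLookup {
  def appName(conf: Map[String, String]): String =
    conf.getOrElse("spark.app.name", "<unset>")

  def main(args: Array[String]): Unit = {
    // Settings as spark-submit would populate them for the SparkKMeans example.
    val conf = Map(
      "spark.app.name" -> "SparkKMeans",
      "spark.master"   -> "spark://karthik:7077"
    )
    println(appName(conf))
  }
}
```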
Thanks,
Deng

On Tue, Nov 25, 2014 at 12:40 PM, rapelly kartheek <kartheek.m...@gmail.com> wrote:
> Hi,
>
> When I submit a Spark application like this:
>
> ./bin/spark-submit --class org.apache.spark.examples.SparkKMeans \
>   --deploy-mode client --master spark://karthik:7077 \
>   $SPARK_HOME/examples/*/scala-*/spark-examples-*.jar /k-means 4 0.001
>
> which part of the Spark framework code deals with the name of the
> application? Basically, I want to access the name of the application in
> the Spark scheduler code.
>
> Can someone please tell me where I should look for the code that deals
> with the name of the currently executing application (say, SparkKMeans)?
>
> Thank you.

--
Maria Odea "Deng" Ching-Mallete | och...@apache.org | http://www.linkedin.com/in/oching