[ https://issues.apache.org/jira/browse/SPARK-9270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Cheolsoo Park updated SPARK-9270:
---------------------------------
    Description: 
Currently, the app name is hardcoded in pyspark as "PySparkShell", and the app name cannot be changed.

SPARK-8650 fixed this issue for spark-sql.

SPARK-9180 introduced a new option {{--name}} for spark-shell.

sparkR is different because {{SparkContext}} is not automatically constructed in sparkR, and the app name can be set when initializing {{SparkContext}}.

In summary-
||shell||able to set app name||
|pyspark|no|
|spark-shell|yes via --name|
|spark-sql|yes via --conf spark.app.name|
|sparkR|n/a|

  was:
Currently, the app name is hardcoded in pyspark as "PySparkShell", and the app name cannot be set.

SPARK-8650 fixed this issue for spark-sql, but pyspark is not fixed.

SPARK-9180 introduced a new option {{--name}} for spark-shell, but the {{spark.app.name}} property isn't honored in spark-shell.

sparkR is different because {{SparkContext}} is not automatically constructed in sparkR, and the app name can be set when initializing {{SparkContext}}.

In summary-
||shell||able to set app name||
|pyspark|no|
|spark-shell|yes via --name|
|spark-sql|yes via --conf spark.app.name|
|sparkR|n/a|


> Allow --name option in pyspark
> ------------------------------
>
>                 Key: SPARK-9270
>                 URL: https://issues.apache.org/jira/browse/SPARK-9270
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.4.1, 1.5.0
>            Reporter: Cheolsoo Park
>            Priority: Minor
>
> Currently, the app name is hardcoded in pyspark as "PySparkShell", and the app name cannot be changed.
> SPARK-8650 fixed this issue for spark-sql.
> SPARK-9180 introduced a new option {{--name}} for spark-shell.
> sparkR is different because {{SparkContext}} is not automatically constructed in sparkR, and the app name can be set when initializing {{SparkContext}}.
> In summary-
> ||shell||able to set app name||
> |pyspark|no|
> |spark-shell|yes via --name|
> |spark-sql|yes via --conf spark.app.name|
> |sparkR|n/a|

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
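For reference, a hedged sketch of the launch commands implied by the summary table in the description (the app name "MyApp" is an illustrative placeholder; behavior is as described in this issue for Spark 1.4.1/1.5.0, before the fix):

```shell
# spark-shell: the app name can be set via --name (added by SPARK-9180)
spark-shell --name "MyApp"

# spark-sql: the app name can be set via the spark.app.name property (SPARK-8650)
spark-sql --conf spark.app.name=MyApp

# pyspark: the app name is hardcoded to "PySparkShell", so --name is
# not honored until this issue (SPARK-9270) is fixed
pyspark --name "MyApp"

# sparkR: no SparkContext is constructed automatically; the app name is
# supplied when the user initializes the context inside the shell, e.g.
#   sc <- sparkR.init(appName = "MyApp")
sparkR
```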