[jira] [Commented] (SPARK-20178) Improve Scheduler fetch failures
[ https://issues.apache.org/jira/browse/SPARK-20178?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17476710#comment-17476710 ]

Venkat Sambath commented on SPARK-20178:

Is the design doc referenced in this jira (https://docs.google.com/document/d/1D3b_ishMfm5sXmRS494JrOJmL9V_TRVZUB4TgK1l1fY/edit?usp=sharing) available anywhere else?

> Improve Scheduler fetch failures
>
>                 Key: SPARK-20178
>                 URL: https://issues.apache.org/jira/browse/SPARK-20178
>             Project: Spark
>          Issue Type: Epic
>          Components: Scheduler, Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Thomas Graves
>            Priority: Major
>              Labels: bulk-closed
>
> We have been having a lot of discussions around improving the handling of
> fetch failures. There are four JIRAs currently related to this:
> SPARK-20163, SPARK-20091, SPARK-14649, and SPARK-19753.
> We should try to get a list of things we want to improve and come up with
> one cohesive design. I will put my initial thoughts in a follow-on comment.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-25537) spark.pyspark.driver.python when set in code doesnt work
[ https://issues.apache.org/jira/browse/SPARK-25537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Venkat Sambath updated SPARK-25537:
---
    Description:
spark.pyspark.driver.python and spark.pyspark.python, when set in code, don't get picked up by the driver or executors. They are picked up only when set through --conf or in spark-defaults.conf. Can we add a line to the doc https://spark.apache.org/docs/latest/configuration.html#application-properties stating that it is illegal to set these in the application, as we do for spark.driver.extraJavaOptions?

  (was: spark.pyspark.driver.python and spark.pyspark.python, when set in code, don't get picked up by the driver. Can we add a line stating that it is illegal to set these in the application, as we do for spark.driver.extraJavaOptions)

> spark.pyspark.driver.python when set in code doesnt work
>
>                 Key: SPARK-25537
>                 URL: https://issues.apache.org/jira/browse/SPARK-25537
>             Project: Spark
>          Issue Type: Documentation
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Venkat Sambath
>            Priority: Minor
>
> spark.pyspark.driver.python and spark.pyspark.python, when set in code, don't
> get picked up by the driver or executors. They are picked up only when set
> through --conf or in spark-defaults.conf. Can we add a line to the doc
> https://spark.apache.org/docs/latest/configuration.html#application-properties
> stating that it is illegal to set these in the application, as we do for
> spark.driver.extraJavaOptions?

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
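The behavior described in this issue can be sketched as follows (a minimal illustration, not from the issue itself; the interpreter paths and the application file name app.py are placeholders):

```shell
# Works: the Python binaries are chosen while the driver process is being
# launched, so settings passed via --conf (or placed in
# conf/spark-defaults.conf) are honored.
spark-submit \
  --conf spark.pyspark.driver.python=/usr/bin/python3 \
  --conf spark.pyspark.python=/usr/bin/python3 \
  app.py

# Does NOT work, per this issue: by the time the application code below
# runs, the driver's Python interpreter has already started, so these
# settings are silently ignored.
#
#   SparkSession.builder \
#       .config("spark.pyspark.driver.python", "/usr/bin/python3") \
#       .config("spark.pyspark.python", "/usr/bin/python3") \
#       .getOrCreate()
```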
[jira] [Created] (SPARK-25537) spark.pyspark.driver.python when set in code doesnt work
Venkat Sambath created SPARK-25537:
---

             Summary: spark.pyspark.driver.python when set in code doesnt work
                 Key: SPARK-25537
                 URL: https://issues.apache.org/jira/browse/SPARK-25537
             Project: Spark
          Issue Type: Documentation
          Components: Spark Core
    Affects Versions: 2.3.0
            Reporter: Venkat Sambath

spark.pyspark.driver.python and spark.pyspark.python, when set in code, don't get picked up by the driver. Can we add a line stating that it is illegal to set these in the application, as we do for spark.driver.extraJavaOptions?

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)