[jira] [Commented] (SPARK-20472) Support for Dynamic Configuration
[ https://issues.apache.org/jira/browse/SPARK-20472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15997946#comment-15997946 ]

Sean Owen commented on SPARK-20472:
-----------------------------------

JVM config matters. How do you change the driver heap size in client mode after startup? What are the semantics of changing a batch size at runtime? Of changing a cache size? This raises a lot of questions, so no, this is not generally possible.

> Support for Dynamic Configuration
> ---------------------------------
>
>                 Key: SPARK-20472
>                 URL: https://issues.apache.org/jira/browse/SPARK-20472
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.1.0
>            Reporter: Shahbaz Hussain
>
> Currently, Spark configuration cannot be changed dynamically.
> A Spark job must be killed and started again for a new configuration to take effect.
> This issue proposes enhancing Spark so that configuration changes can be applied dynamically, without requiring an application restart.
> Example: if the batch interval in a streaming job is 20 seconds and the user wants to reduce it to 5 seconds, this currently requires re-submitting the job.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-20472) Support for Dynamic Configuration
[ https://issues.apache.org/jira/browse/SPARK-20472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15997874#comment-15997874 ]

Shahbaz Hussain commented on SPARK-20472:
-----------------------------------------

Yes, the idea is to have a way to change configuration that is held in memory, e.g. the batch interval, SQL shuffle partitions, etc.; these are primarily Spark-specific configurations. JVM configuration is global and cannot be changed; this request is not for dynamic JVM configuration, but for Spark application-specific settings.
[jira] [Commented] (SPARK-20472) Support for Dynamic Configuration
[ https://issues.apache.org/jira/browse/SPARK-20472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15984461#comment-15984461 ]

Sean Owen commented on SPARK-20472:
-----------------------------------

I don't think this is generally possible, because some config is global: it is needed and takes effect at startup, and can't be changed even if you wanted to (think: JVM heap size). I doubt this is achievable as stated. You will probably have to narrow this down much further.
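The distinction the thread keeps circling (settings Spark SQL re-reads per query vs. settings fixed at JVM/application startup) can be sketched as follows. This is a minimal PySpark illustration, not part of the ticket: the two setting lists below are illustrative examples, not an exhaustive Spark classification, and the live demo is guarded so it only runs where PySpark is installed.

```python
# Sketch: some Spark settings can change at runtime, others are fixed at startup.
# The categorization below is illustrative only, not an exhaustive Spark list.

# Spark SQL reads these per query, so spark.conf.set can change them mid-application.
RUNTIME_MUTABLE = {
    "spark.sql.shuffle.partitions",
    "spark.sql.autoBroadcastJoinThreshold",
}

# These are consumed once when the JVM/application starts; setting them later
# has no effect, which is why the ticket was closed as not generally possible.
STARTUP_ONLY = {
    "spark.driver.memory",
    "spark.executor.memory",
    "spark.driver.extraJavaOptions",
}

def can_change_at_runtime(key: str) -> bool:
    """Return True if the (illustrative) setting takes effect without a restart."""
    return key in RUNTIME_MUTABLE

if __name__ == "__main__":
    try:
        from pyspark.sql import SparkSession  # requires a PySpark installation
        spark = SparkSession.builder.master("local[1]").getOrCreate()
        # Takes effect for subsequent queries without restarting the application:
        spark.conf.set("spark.sql.shuffle.partitions", "50")
        print(spark.conf.get("spark.sql.shuffle.partitions"))
        spark.stop()
    except ImportError:
        print("pyspark not installed; skipping live demo")
```

Changing something like the streaming batch interval, as the reporter requested, falls outside even the runtime-mutable category: it is bound when the StreamingContext is created, which is why the comments conclude a resubmit is required.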