Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5060#discussion_r29080523

--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -94,6 +94,11 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
   // contains a map from hostname to a list of input format splits on the host.
   private[spark] var preferredNodeLocationData: Map[String, Set[SplitInfo]] = Map()

+  // This is used for Spark Streaming to check whether driver host and port are set by user,
+  // if these two configurations are set by user, so the recovery mechanism should not remove this.
+  private[spark] val isDriverHostSetByUser = config.contains("spark.driver.host")
--- End diff --

I am not sure it is a good idea to clutter SparkContext further with such functions, especially when Spark core itself does not use it. Would be good to find a different solution.
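One possible alternative along the lines the reviewer suggests (a hypothetical sketch, not the approach the PR settled on): since the `SparkConf` is available wherever the checkpoint recovery logic runs, the "did the user set the driver host/port?" check could live next to that recovery code instead of as a field on `SparkContext`. The helper names below are illustrative, and a minimal `ConfStub` stands in for `org.apache.spark.SparkConf` so the sketch is self-contained:

```scala
// Hypothetical sketch: keep the check local to the code that needs it,
// rather than adding a field to SparkContext that Spark core never uses.
// ConfStub is a stand-in for org.apache.spark.SparkConf (which also
// exposes a contains(key) method); only contains() is modeled here.
class ConfStub(settings: Map[String, String]) {
  def contains(key: String): Boolean = settings.contains(key)
}

// True if the user explicitly set the driver host, in which case the
// recovery mechanism should preserve that setting rather than remove it.
def isDriverHostSetByUser(conf: ConfStub): Boolean =
  conf.contains("spark.driver.host")

// Same check for the driver port.
def isDriverPortSetByUser(conf: ConfStub): Boolean =
  conf.contains("spark.driver.port")
```

With this shape, `SparkContext` stays untouched and the streaming recovery path queries the configuration directly at the point of use.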