Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5060#discussion_r29116759

    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -94,6 +94,11 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
       // contains a map from hostname to a list of input format splits on the host.
       private[spark] var preferredNodeLocationData: Map[String, Set[SplitInfo]] = Map()

    +  // This is used by Spark Streaming to check whether the driver host and port were
    +  // set by the user; if so, the recovery mechanism should not remove them.
    +  private[spark] val isDriverHostSetByUser = config.contains("spark.driver.host")
    --- End diff --

But I think there has to be a place in Spark Core, either in SparkConf or somewhere else, to judge whether this configuration was set by the user or by Spark itself before SparkContext is initialized. It cannot be determined from Spark Streaming, because by that point SparkContext has already been fully initialized.
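To illustrate the timing issue being discussed, here is a minimal, self-contained sketch (not actual Spark code; `Conf` below is a stand-in for SparkConf and `isDriverHostSetByUser` is the flag proposed in the diff). The point is that the user-set check must be evaluated against the configuration as the user supplied it, before SparkContext writes its own defaults (such as a driver-assigned host) into the same conf:

```scala
object DriverHostCheck {
  // Minimal stand-in for SparkConf's key-value settings (illustrative only).
  final case class Conf(settings: Map[String, String]) {
    def contains(key: String): Boolean = settings.contains(key)
    def set(key: String, value: String): Conf = Conf(settings + (key -> value))
  }

  // Capture the flag from the conf as the user provided it. If this check ran
  // after initialization, a Spark-assigned host would make it always true.
  def isDriverHostSetByUser(userConf: Conf): Boolean =
    userConf.contains("spark.driver.host")

  def main(args: Array[String]): Unit = {
    val userConf = Conf(Map.empty)
    // Check before initialization: the user did not set the driver host.
    val setByUser = isDriverHostSetByUser(userConf)
    // Simulate SparkContext later filling in the driver host itself.
    val initialized = userConf.set("spark.driver.host", "10.0.0.1")
    println(setByUser)                              // flag captured early: false
    println(initialized.contains("spark.driver.host")) // after init: true
  }
}
```

Checking `contains` after initialization would conflate the two cases, which is why the comment argues the judgment has to live in Spark Core (or SparkConf) rather than in Spark Streaming.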