Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

2024-05-04 Thread via GitHub
github-actions[bot] closed pull request #44855: [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side URL: https://github.com/apache/spark/pull/44855

Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

2024-05-03 Thread via GitHub
github-actions[bot] commented on PR #44855: URL: https://github.com/apache/spark/pull/44855#issuecomment-2093908111 We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.

Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

2024-01-24 Thread via GitHub
huangxiaopingRD commented on PR #44855: URL: https://github.com/apache/spark/pull/44855#issuecomment-1907791631 > Although I am surprised that we have such an ability, the code change here looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this property. You are …

Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

2024-01-23 Thread via GitHub
yaooqinn commented on PR #44855: URL: https://github.com/apache/spark/pull/44855#issuecomment-1907409989 Although I am surprised that we have such an ability, the code change here looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this property.
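For readers skimming the thread, a minimal sketch of the coupling yaooqinn describes (paraphrased, not the actual Spark source or the PR's diff): `SparkContext` stamps its conf with the public constant `SparkContext.DRIVER_IDENTIFIER` (the string `"driver"`) under the key `spark.executor.id`, and driver-side plumbing such as the `RpcEnv` built by `SparkEnv` reads that value back. Only the config key and the constant come from Spark itself; the object name is made up for illustration.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ExecutorIdCoupling {
  def main(args: Array[String]): Unit = {
    // SparkContext.DRIVER_IDENTIFIER is the public constant "driver".
    val conf = new SparkConf()
      .set("spark.executor.id", SparkContext.DRIVER_IDENTIFIER)

    // Driver-side components (SparkEnv, the RpcEnv it builds, metrics,
    // BlockManager ids) read this value back and expect "driver".
    assert(conf.get("spark.executor.id") == SparkContext.DRIVER_IDENTIFIER)
  }
}
```

Under the PR, a sub-SparkContext created on an executor would no longer see "driver" here, which is why the change is flagged as incorrect above.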

Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

2024-01-23 Thread via GitHub
huangxiaopingRD commented on PR #44855: URL: https://github.com/apache/spark/pull/44855#issuecomment-1907243544 > May I ask your use case, @huangxiaopingRD ? It would be great if you can put that into the PR description because `spark.executor.allowSparkContext` is not recommended in the …
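The use case being asked about is a "sub" SparkContext created inside a task, which is only possible through the discouraged `spark.executor.allowSparkContext` escape hatch (disabled by default in recent Spark releases). A hedged sketch of that pattern, assuming a real cluster so the inner context lives in an executor JVM; the master URL and object name below are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SubContextSketch {
  def main(args: Array[String]): Unit = {
    // "spark://master:7077" is a placeholder; this sketch assumes a real
    // cluster, since in local mode the task would share the driver JVM,
    // where a second SparkContext is disallowed.
    val outer = new SparkContext(
      new SparkConf().setAppName("outer").setMaster("spark://master:7077"))

    outer.parallelize(Seq(1), numSlices = 1).foreach { _ =>
      // Runs inside a task on an executor. In current Spark versions the
      // driver-only assertion is skipped when this flag is set on the conf
      // handed to the new context; without it, construction fails fast.
      val inner = new SparkContext(
        new SparkConf()
          .setAppName("inner")
          .setMaster("local[1]")
          .set("spark.executor.allowSparkContext", "true"))
      inner.stop()
    }
    outer.stop()
  }
}
```

This is exactly the "sub-SparkContext" path discussed later in the thread, which is why reviewers ask for the concrete use case before changing how `spark.executor.id` is assigned.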