github-actions[bot] closed pull request #44855: [SPARK-46813][CORE] Don't set
the executor id to "driver" when SparkContext is created by the executor side
URL: https://github.com/apache/spark/pull/44855
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
github-actions[bot] commented on PR #44855:
URL: https://github.com/apache/spark/pull/44855#issuecomment-2093908111
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way; it's just a way of keeping the
PR queue manageable.
huangxiaopingRD commented on PR #44855:
URL: https://github.com/apache/spark/pull/44855#issuecomment-1907791631
> Although I am surprised that we have such an ability, the code change here
looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this
property.
You are
yaooqinn commented on PR #44855:
URL: https://github.com/apache/spark/pull/44855#issuecomment-1907409989
Although I am surprised that we have such an ability, the code change here
looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this
property.
--
huangxiaopingRD commented on PR #44855:
URL: https://github.com/apache/spark/pull/44855#issuecomment-1907243544
> May I ask your use case, @huangxiaopingRD ? It would be great if you can
put that into the PR description because `spark.executor.allowSparkContext` is
not recommended in the
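For context on the setting the reviewer mentions: `spark.executor.allowSparkContext` is the flag that permits creating a SparkContext on the executor side, and it is off by default. A minimal sketch of how it would be enabled at submit time follows; the application class and JAR names are placeholders, not from this PR:

```shell
# Hypothetical spark-submit invocation; com.example.MyApp and app.jar
# are placeholder names. spark.executor.allowSparkContext defaults to
# false, and enabling it is discouraged -- which is why the reviewer
# asks about the use case before accepting a change in this code path.
spark-submit \
  --class com.example.MyApp \
  --conf spark.executor.allowSparkContext=true \
  app.jar
```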