dongjoon-hyun opened a new pull request, #42381:
URL: https://github.com/apache/spark/pull/42381

   ### What changes were proposed in this pull request?
   
   This PR aims to log at the `INFO` level instead of the `WARN` level in `ExecutorPodsWatcher.onClose` when `SparkContext` has been stopped. Since Spark can distinguish this expected shutdown path from genuine error cases, it should avoid emitting a warning for it.
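
   The idea can be sketched as follows. This is a hypothetical illustration, not the actual Spark source: the `levelForOnClose` helper and the `contextStopped` flag are made-up names standing in for the real check of whether `SparkContext` is stopped inside `onClose`.

   ```scala
   // Hypothetical sketch of the proposed behavior: the watcher's onClose
   // handler downgrades its "client closed" message from WARN to INFO when
   // the SparkContext has already been stopped, since that close is expected.
   object WatcherCloseLogging {
     // Pick the log level for the onClose message (illustrative helper).
     def levelForOnClose(contextStopped: Boolean): String =
       if (contextStopped) "INFO" else "WARN"

     // Render the message the way it would appear in the driver log.
     def onCloseMessage(contextStopped: Boolean): String =
       s"${levelForOnClose(contextStopped)} ExecutorPodsWatchSnapshotSource: " +
         "Kubernetes client has been closed."
   }
   ```

   With this shape, a close that happens during a normal `SparkContext.stop()` is reported as routine, while an unexpected close still surfaces as a warning.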
   
   ### Why are the changes needed?
   
   Previously, a `WARN ExecutorPodsWatchSnapshotSource: Kubernetes client has been closed` message was emitted even during a normal shutdown:
   ```
   23/08/07 18:10:14 INFO SparkContext: SparkContext is stopping with exitCode 0.
   23/08/07 18:10:14 WARN TaskSetManager: Lost task 2594.0 in stage 0.0 (TID 2594) ([2620:149:100d:1813::3f86] executor 1615): TaskKilled (another attempt succeeded)
   23/08/07 18:10:14 INFO TaskSetManager: task 2594.0 in stage 0.0 (TID 2594) failed, but the task will not be re-executed (either because the task failed with a shuffle data fetch failure, so the previous stage needs to be re-run, or because a different copy of the task has already succeeded).
   23/08/07 18:10:14 INFO SparkUI: Stopped Spark web UI at http://xxx:4040
   23/08/07 18:10:14 INFO KubernetesClusterSchedulerBackend: Shutting down all executors
   23/08/07 18:10:14 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Asking each executor to shut down
   23/08/07 18:10:14 WARN ExecutorPodsWatchSnapshotSource: Kubernetes client has been closed.
   ```
   
   ### Does this PR introduce _any_ user-facing change?
   
   No.
   
   ### How was this patch tested?
   
   Pass the CIs.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
