[GitHub] [spark] wankunde commented on pull request #28850: [SPARK-32015][Core] Remove inheritable thread local variables after spark context is stopped
wankunde commented on pull request #28850:
URL: https://github.com/apache/spark/pull/28850#issuecomment-659383289

@srowen @Ngone51 Thanks for all your help. I will remove the reference after the SparkContext is stopped in my application.
[GitHub] [spark] wankunde commented on pull request #28850: [SPARK-32015][Core] Remove inheritable thread local variables after spark context is stopped
wankunde commented on pull request #28850:
URL: https://github.com/apache/spark/pull/28850#issuecomment-656494493

@Ngone51 @srowen Updated the PR. Could we only remove the thread reference created by Hive?
[GitHub] [spark] wankunde commented on pull request #28850: [SPARK-32015][Core] Remove inheritable thread local variables after spark context is stopped
wankunde commented on pull request #28850:
URL: https://github.com/apache/spark/pull/28850#issuecomment-656098500

@Ngone51 Agree with you, and updated the code.
[GitHub] [spark] wankunde commented on pull request #28850: [SPARK-32015][Core] Remove inheritable thread local variables after spark context is stopped
wankunde commented on pull request #28850:
URL: https://github.com/apache/spark/pull/28850#issuecomment-654253511

Updated the PR. @srowen Could you help review the code? Thanks.
[GitHub] [spark] wankunde commented on pull request #28850: [SPARK-32015][Core] Remove inheritable thread local variables after spark context is stopped
wankunde commented on pull request #28850:
URL: https://github.com/apache/spark/pull/28850#issuecomment-65215

Hi @srowen, if the SparkContext is not stopped, deleting this reference is meaningless; if it is stopped, we should remove the reference at the same time. Therefore, I think the reference should be deleted in the shutdown method. I don't think it makes sense to convert `sparkContext` into a soft reference.
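Roughly, the argument is for something like the following sketch (hypothetical names, not Spark's actual internals): the component that owns the inheritable thread-local clears it as part of its own stop path, instead of weakening the reference itself:

```java
class ContextOwner {
  // Hypothetical thread-local that pins the context for this thread and its children.
  private final InheritableThreadLocal<Object> localContext = new InheritableThreadLocal<>();

  void start(Object ctx) {
    localContext.set(ctx);
  }

  void stop() {
    // While the context is running, deleting the reference would be pointless;
    // once stop() is called, this is exactly the moment it should become unreachable.
    localContext.remove();
  }
}
```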
[GitHub] [spark] wankunde commented on pull request #28850: [SPARK-32015][Core] Remove inheritable thread local variables after spark context is stopped
wankunde commented on pull request #28850:
URL: https://github.com/apache/spark/pull/28850#issuecomment-651607143

@srowen @holdenk Yes, a test is one use case. Another is looping a user application that creates a SparkContext; a rough sketch is below.
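A rough sketch of that looping use case, assuming a single long-lived JVM that repeatedly starts and stops Hive-enabled sessions (the table name and loop bound are placeholders):

```java
import org.apache.spark.sql.SparkSession;

public class LoopingDriver {
  public static void main(String[] args) {
    for (int i = 1; i <= 100; i++) {
      SparkSession spark = SparkSession.builder()
          .master("local[*]")
          .appName("job-" + i)
          .enableHiveSupport()
          .getOrCreate();
      try {
        spark.sql("SELECT count(*) FROM some_hive_table").show();
      } finally {
        // If stale thread-local references keep each stopped context reachable,
        // the driver heap grows with every iteration and eventually OOMs.
        spark.stop();
      }
    }
  }
}
```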
[GitHub] [spark] wankunde commented on pull request #28850: [SPARK-32015][Core] Remove inheritable thread local variables after spark context is stopped
wankunde commented on pull request #28850:
URL: https://github.com/apache/spark/pull/28850#issuecomment-648718788

A detailed troubleshooting description in Chinese: https://blog.csdn.net/wankunde/article/details/106683680
[GitHub] [spark] wankunde commented on pull request #28850: [SPARK-32015][Core] Remove inheritable thread local variables after spark context is stopped
wankunde commented on pull request #28850:
URL: https://github.com/apache/spark/pull/28850#issuecomment-648673248

> @wankunde Do you have a way to reproduce the OOM?

Just set up and stop a few new SparkSessions, and use spark-sql to read tables managed by Hive. HiveCatalog will start a new child thread in `org.apache.hive.common.util.ShutdownHookManager`, which will hold a reference to the `SparkContext` (because the SparkContext is an inheritable thread local variable):

```java
Runtime.getRuntime().addShutdownHook(
    new Thread() {
      @Override
      public void run() {
        MGR.shutdownInProgress.set(true);
        for (Runnable hook : getShutdownHooksInOrder()) {
          try {
            hook.run();
          } catch (Throwable ex) {
            LOG.warn("ShutdownHook '" + hook.getClass().getSimpleName() +
                "' failed, " + ex.toString(), ex);
          }
        }
      }
    }
);
```

Those SparkContexts cannot be garbage collected after being stopped because `java.lang.ApplicationShutdownHooks` still holds references to them.
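For reference, a self-contained sketch (plain Java, not Spark or Hive code) of the mechanism described above: a value placed in an `InheritableThreadLocal` is copied into every thread constructed afterwards, including a thread that is only ever registered as a JVM shutdown hook, so the hook keeps the value strongly reachable until the JVM exits:

```java
public class InheritableLeakDemo {
  // Stands in for the thread-local that ends up referencing the SparkContext.
  static final InheritableThreadLocal<byte[]> HOLDER = new InheritableThreadLocal<>();

  public static void main(String[] args) {
    HOLDER.set(new byte[64 * 1024 * 1024]); // stand-in for a stopped SparkContext

    // The hook thread snapshots the parent's inheritable locals at construction time.
    Thread hook = new Thread(() ->
        System.out.println("hook still sees " + HOLDER.get().length + " bytes"));
    Runtime.getRuntime().addShutdownHook(hook);

    // Clearing the parent's copy does not touch the hook's inherited copy, so the
    // array stays reachable through java.lang.ApplicationShutdownHooks until exit.
    HOLDER.remove();
  }
}
```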