vicennial commented on PR #43735: URL: https://github.com/apache/spark/pull/43735#issuecomment-1829823363
@fhalde Yes, with this PR it would be possible to have isolated classloaders per Spark session on the executors without going through Spark Connect. The `withResources` method should be used to wrap all executions (as Spark Connect does [here](https://github.com/apache/spark/blob/master/connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/SessionHolder.scala#L245)). Note that all artifacts (i.e., jars and class files) must be added through the `ArtifactManager`; adding them directly to the `SparkContext` would not work. Refer to Spark Connect's `AddArtifactsHandler` to see how we use the [API](https://github.com/apache/spark/blob/master/connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/SparkConnectAddArtifactsHandler.scala#L90).
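To make the two requirements above concrete, here is a rough sketch of what session-isolated execution might look like. It is an assumption-laden illustration, not code from this PR: the member names (`artifactManager`, `addArtifact`, `withResources`) follow the linked Spark Connect sources, but these are internal APIs whose visibility and signatures may differ in your Spark version, and the jar path, UDF, and table name are hypothetical.

```scala
// Hedged sketch: `artifactManager`, `addArtifact`, and `withResources`
// mirror the linked SessionHolder / AddArtifactsHandler code, but they are
// internal APIs -- verify the actual signatures in your Spark version.
import java.nio.file.Paths
import org.apache.spark.sql.SparkSession

val session = SparkSession.builder().getOrCreate()

// Add jars/class files through the ArtifactManager, NOT via
// SparkContext.addJar, so they end up in this session's isolated
// classloader rather than the shared one.
session.artifactManager.addArtifact(Paths.get("/path/to/udfs.jar").toString)

// Wrap every execution so the session-specific classloader is installed
// on the executing thread, mirroring what SessionHolder does for
// Spark Connect executions.
session.artifactManager.withResources {
  // "my_udf" and "events" are placeholders for code/classes shipped
  // in the artifact above.
  session.sql("SELECT my_udf(value) FROM events").collect()
}
```

The key point the sketch encodes: artifact registration and execution both have to flow through the session's `ArtifactManager`, otherwise the executors fall back to the shared `SparkContext` classloader and the isolation is lost.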