rdblue commented on code in PR #6058:
URL: https://github.com/apache/iceberg/pull/6058#discussion_r1018470425
##########
spark/v3.3/spark/src/main/java/org/apache/iceberg/spark/SparkCatalog.java:
##########
@@ -532,6 +537,24 @@ public final void initialize(String name, CaseInsensitiveStringMap options) {
          Splitter.on('.').splitToList(options.get("default-namespace")).toArray(new String[0]);
    }
  }
+
+    EnvironmentContext.put(EnvironmentContext.ENGINE_NAME, "spark");
+    EnvironmentContext.put(
+        EnvironmentContext.ENGINE_VERSION, sparkSession.sparkContext().version());
+    EnvironmentContext.put(CatalogProperties.APP_ID, sparkSession.sparkContext().applicationId());
+    sparkSession
+        .sparkContext()
+        .addSparkListener(
+            new SparkListener() {
+              @Override
+              public void onOtherEvent(SparkListenerEvent event) {
+                if (event instanceof SparkListenerSQLExecutionStart) {
+                  SparkListenerSQLExecutionStart start = (SparkListenerSQLExecutionStart) event;
+                  EnvironmentContext.putLocal(
+                      SQLExecution.EXECUTION_ID_KEY(), () -> String.valueOf(start.executionId()));
Review Comment:
Doesn't this implementation defeat the purpose of using a supplier to get
the value? It captures the source of the value (the listener event) inside the
supplier, which is why the entry then needs to be thread-local.
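
To make the concern concrete, here is a minimal, self-contained sketch (not the PR's
code; the currentExecutionId holder below is a hypothetical stand-in for a dynamic
source such as Spark's execution-id local property). A supplier that captures a value
at registration time behaves like a constant, while one that reads the source when it
is invoked stays current; only the capturing form forces the value to be scoped per
thread/query.

import java.util.function.Supplier;

public class SupplierCaptureExample {
  // Hypothetical stand-in for a dynamic, per-thread source such as Spark's
  // execution-id local property; not part of the PR under review.
  private static final ThreadLocal<String> currentExecutionId = new ThreadLocal<>();

  public static void main(String[] args) {
    currentExecutionId.set("1");

    // Capturing form: the value is resolved once, when the supplier is created.
    // It is effectively a constant, so it must be re-registered per query and
    // kept thread-local to stay correct.
    String captured = currentExecutionId.get();
    Supplier<String> capturing = () -> captured;

    // Lazy form: the source is read each time the supplier is invoked, which is
    // the usual reason to pass a Supplier instead of a plain value.
    Supplier<String> lazy = currentExecutionId::get;

    currentExecutionId.set("2");

    System.out.println(capturing.get()); // prints 1
    System.out.println(lazy.get());      // prints 2
  }
}

Under that reading, a supplier that looked up the current execution id from the Spark
context's local properties at call time would not need the thread-local put at all.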
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.