[ https://issues.apache.org/jira/browse/SPARK-23228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-23228.
----------------------------------
    Resolution: Fixed
 Fix Version/s: 2.4.0

Issue resolved by pull request 20404
[https://github.com/apache/spark/pull/20404]

> Able to track Python create SparkSession in JVM
> -----------------------------------------------
>
>                 Key: SPARK-23228
>                 URL: https://issues.apache.org/jira/browse/SPARK-23228
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.4.0
>            Reporter: Saisai Shao
>            Assignee: Saisai Shao
>            Priority: Minor
>             Fix For: 2.4.0
>
> Currently, when we write a {{SparkListener}} that invokes {{SparkSession}} and load it in a PySpark application, the listener fails to get the {{SparkSession}} created by PySpark, so the {{assert}} below throws an exception. To avoid this, the PySpark-created {{SparkSession}} should be registered as the JVM {{defaultSession}}.
> {code}
> spark.sql("CREATE TABLE test (a INT)")
> {code}
> {code}
> class TestSparkSession extends SparkListener with Logging {
>   override def onOtherEvent(event: SparkListenerEvent): Unit = {
>     event match {
>       case CreateTableEvent(db, table) =>
>         val session = SparkSession.getActiveSession.orElse(SparkSession.getDefaultSession).get
>         assert(session != null)
>         val tableInfo = session.sharedState.externalCatalog.getTable(db, table)
>         logInfo(s"Table info ${tableInfo}")
>       case e =>
>         logInfo(s"event $e")
>     }
>   }
> }
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)