GitHub user jerryshao opened a pull request:

    https://github.com/apache/spark/pull/20404

    [SPARK-23228][PYSPARK] Add Python Created jsparkSession to JVM's defaultSession

    ## What changes were proposed in this pull request?
    
    In the current PySpark code, the Python-created `jsparkSession` is not added to the JVM's defaultSession, so this `SparkSession` object cannot be fetched from the Java side, and the Scala code below will fail when loaded in a PySpark application.
    
    ```scala
    class TestSparkSession extends SparkListener with Logging {
      override def onOtherEvent(event: SparkListenerEvent): Unit = {
        event match {
          case CreateTableEvent(db, table) =>
            val session = SparkSession.getActiveSession.orElse(SparkSession.getDefaultSession)
            assert(session.isDefined)
            val tableInfo = session.get.sharedState.externalCatalog.getTable(db, table)
            logInfo(s"Table info ${tableInfo}")

          case e =>
            logInfo(s"event $e")
        }
      }
    }
    ```
    
    So this PR proposes adding the freshly created `jsparkSession` to `defaultSession`.
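    The failure mode can be illustrated with a toy model of the active/default session lookup, i.e. the `getActiveSession.orElse(getDefaultSession)` pattern used by the listener above. This is a minimal sketch with invented names (`ToySession`, `set_default`, `get_session`), not PySpark's actual internals:

    ```python
    from typing import Optional


    class ToySession:
        """Stand-in for a JVM-side SparkSession registry (illustrative only)."""
        _default: Optional["ToySession"] = None
        _active: Optional["ToySession"] = None

        def __init__(self, name: str):
            self.name = name

        @classmethod
        def set_default(cls, session: "ToySession") -> None:
            # Mirrors registering the session as the JVM defaultSession.
            cls._default = session

        @classmethod
        def get_session(cls) -> Optional["ToySession"]:
            # Mirrors getActiveSession.orElse(getDefaultSession).
            return cls._active or cls._default


    # Before the fix: a session created on the Python side was never
    # registered as the JVM default, so the lookup comes back empty and
    # the listener's assert(session.isDefined) fails.
    assert ToySession.get_session() is None

    # After the fix: the freshly created session is registered as default,
    # so JVM-side code can find it.
    s = ToySession("python-created")
    ToySession.set_default(s)
    assert ToySession.get_session() is s
    ```

    The design point is that registration has to happen eagerly at session creation, because the Java-side listener fires on events it did not initiate and has no other handle on the Python-created session.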
    
    ## How was this patch tested?
    
    Manual verification.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jerryshao/apache-spark SPARK-23228

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20404.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20404
    
----
commit d9189ad1763d4ef867027f4ade2a332d589fe698
Author: jerryshao <sshao@...>
Date:   2018-01-26T09:41:28Z

    Add Python Create jsparkSession to JVM's defaultSession

----


---
