Github user HyukjinKwon commented on the issue:

    https://github.com/apache/incubator-livy/pull/121
  
    @vanzin, weird.
    
    ```
    $ ./bin/spark-shell
    scala> sql("CREATE TABLE tblA(a int)")
    scala> spark.stop()
    ```
    
    ```
    $ rm -fr metastore_db
    $ rm -fr spark-warehouse
    $ rm -fr derby.log
    ```
    
    ```
    scala> import org.apache.spark.sql.SparkSession
    scala> val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
    scala> sql("CREATE TABLE tblA(a int)")
    ```
    
    These steps reproduce the same error in Spark 2.3.0, 2.3.1 and 2.4.0 - 
I remember we never properly supported stopping and then restarting a Hive-enabled 
Spark session this way. The SparkR tests failed for the same reason. Hmm 
.. I am taking a look.
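
    For reference, one mitigation sometimes suggested for stale-session issues (an assumption on my part, not verified against this particular failure) is to clear the stopped session from the active/default slots before rebuilding, since `getOrCreate()` may otherwise hand back state tied to the stopped session:
    
    ```
    scala> import org.apache.spark.sql.SparkSession
    scala> spark.stop()
    scala> SparkSession.clearActiveSession()   // drop the stopped session from the thread-local active slot
    scala> SparkSession.clearDefaultSession()  // drop it as the default, so getOrCreate() builds a fresh one
    scala> val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
    ```
    
    Whether this helps with the Hive metastore state specifically is untested here.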

