[ https://issues.apache.org/jira/browse/SPARK-33578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

steven zhang updated SPARK-33578:
---------------------------------
    Description: 
Reproduce with the following code:

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaSparkContext;
        import org.apache.spark.sql.SparkSession;

        SparkConf sparkConf = new SparkConf().setAppName("hello");
        sparkConf.set("spark.master", "local");

        // The context is created first, without Hive support.
        JavaSparkContext jssc = new JavaSparkContext(sparkConf);

        SparkSession spark = SparkSession.builder()
                .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                .config("hive.exec.dynamic.partition", true)
                .config("hive.exec.dynamic.partition.mode", "nonstrict")
                .config("hive.metastore.uris", "thrift://hivemetastore:9083")
                .enableHiveSupport()
                .master("local")
                .getOrCreate();

        spark.sql("select * from hudi_db.hudi_test_order").show();

 

Running it produces the following exceptions:

AssertionError: assertion failed: No plan for HiveTableRelation [`hudi_db`.`hudi_test_order` … (on the current master branch)

org.apache.spark.sql.AnalysisException: Table or view not found: `hudi_db`.`hudi_test_order`;  (on Spark v2.4.4)
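A quick way to observe the mismatch (a hypothetical check, not part of the original report) is to read the catalog setting back from the pre-existing context's conf, which is what SharedState consults:

        // Continuing the snippet above: the option set by enableHiveSupport()
        // never reaches the conf of the already-created SparkContext.
        String impl = jssc.getConf().get("spark.sql.catalogImplementation", "<unset>");
        System.out.println(impl);  // expected here: "<unset>", not "hive"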

  

The reason is that SparkContext#getOrCreate(SparkConf) returns the already-active context, which still carries only the previous Spark config, while the input SparkConf is the newer one that includes both the previous config and the builder options.

enableHiveSupport() only sets the builder option ("spark.sql.catalogImplementation", "hive"). When the SparkSession is then created on top of the pre-existing context, this option is missed: SharedState loads its conf from the SparkContext, and therefore misses the Hive catalog.
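A workaround sketch (assuming the analysis above; not part of this report) is to put the catalog setting on the SparkConf before any SparkContext exists, so the conf that SharedState later reads already contains it:

        // Workaround sketch: configure the catalog implementation up front,
        // before the SparkContext is created (same imports as the snippet above).
        SparkConf sparkConf = new SparkConf()
                .setAppName("hello")
                .set("spark.master", "local")
                .set("spark.sql.catalogImplementation", "hive");
        JavaSparkContext jssc = new JavaSparkContext(sparkConf);

        SparkSession spark = SparkSession.builder()
                .enableHiveSupport()
                .getOrCreate();

Alternatively, letting the builder create the context first (no explicit new JavaSparkContext before getOrCreate()) and then wrapping it with new JavaSparkContext(spark.sparkContext()) should also avoid the issue.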


> enableHiveSupport has no effect when a SparkContext without Hive support was 
> created first
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-33578
>                 URL: https://issues.apache.org/jira/browse/SPARK-33578
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.0
>            Reporter: steven zhang
>            Priority: Minor
>             Fix For: 3.1.0
>


