[ https://issues.apache.org/jira/browse/SPARK-29015?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang updated SPARK-29015:
--------------------------------
    Summary: "add jar" can not support on JDK 11  (was: Thriftserver can not support add jar on JDK 11)

> "add jar" can not support on JDK 11
> -----------------------------------
>
>                 Key: SPARK-29015
>                 URL: https://issues.apache.org/jira/browse/SPARK-29015
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> How to reproduce:
> {code:bash}
> export JAVA_HOME=/usr/lib/jdk-11.0.3
> export PATH=$JAVA_HOME/bin:$PATH
> build/sbt clean package -Phive -Phadoop-3.2 -Phive-thriftserver
> export SPARK_PREPEND_CLASSES=true
> sbin/start-thriftserver.sh
> bin/beeline -u jdbc:hive2://localhost:10000
> {code}
> {noformat}
> 0: jdbc:hive2://localhost:10000> add jar /root/.m2/repository/org/apache/hive/hcatalog/hive-hcatalog-core/2.3.6/hive-hcatalog-core-2.3.6.jar;
> INFO  : Added [/root/.m2/repository/org/apache/hive/hcatalog/hive-hcatalog-core/2.3.6/hive-hcatalog-core-2.3.6.jar] to class path
> INFO  : Added resources: [/root/.m2/repository/org/apache/hive/hcatalog/hive-hcatalog-core/2.3.6/hive-hcatalog-core-2.3.6.jar]
> +---------+
> | result  |
> +---------+
> +---------+
> No rows selected (0.381 seconds)
> 0: jdbc:hive2://localhost:10000> CREATE TABLE addJar(key string) ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe';
> +---------+
> | Result  |
> +---------+
> +---------+
> No rows selected (0.613 seconds)
> 0: jdbc:hive2://localhost:10000> select * from addJar;
> Error: Error running query: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hive.hcatalog.data.JsonSerDe (state=,code=0)
> {noformat}

--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
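For context on why "add jar" behaves differently across JDK versions (a hedged sketch, not the confirmed root cause of this ticket): on JDK 8 the application class loader is a java.net.URLClassLoader, so code that adds a jar at runtime can cast the system class loader and call addURL on it; on JDK 9 and later the application class loader is an internal type and that cast throws ClassCastException, so a class from the newly added jar (e.g. org.apache.hive.hcatalog.data.JsonSerDe) can end up unresolvable. The class AddJarClassLoaderCheck below is illustrative only and does not appear in Spark or Hive:

```java
// Illustrative check (assumption: run on the same JVM the server uses).
// On JDK 8 the second line prints "true"; on JDK 9+ the system class
// loader is jdk.internal.loader.ClassLoaders$AppClassLoader, so the
// URLClassLoader cast that some "add jar" implementations rely on fails.
public class AddJarClassLoaderCheck {
    public static void main(String[] args) {
        ClassLoader sys = ClassLoader.getSystemClassLoader();
        System.out.println("System class loader: " + sys.getClass().getName());
        System.out.println("Is URLClassLoader: "
                + (sys instanceof java.net.URLClassLoader));
    }
}
```

A common workaround pattern is to stop casting the system class loader and instead wrap added jars in a dedicated child URLClassLoader that the session installs as its context class loader.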