[ https://issues.apache.org/jira/browse/SPARK-23831?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang updated SPARK-23831:
--------------------------------
    Description: 
Add org.apache.derby to IsolatedClientLoader; otherwise it may throw an exception such as:
{noformat}
[info] Cause: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@2439ab23, see the next exception for details.
[info] at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
[info] at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
[info] at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
[info] at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
[info] at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
[info] at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
{noformat}
How to reproduce:
{noformat}
build/sbt clean package -Phive -Phive-thriftserver
export SPARK_PREPEND_CLASSES=true
bin/spark-sql --conf spark.sql.hive.metastore.version=2.3.4 --conf spark.sql.hive.metastore.jars=maven -e "create table t1 as select 1 as c"
{noformat}
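
For reference, below is a minimal, standalone sketch of the idea behind the proposed change, assuming IsolatedClientLoader decides which classes are shared with Spark's root classloader by matching name prefixes. The object name and prefix list are illustrative, not the exact Spark internals; the point is that treating org.apache.derby classes as shared lets the isolated Hive client reuse the same embedded Derby driver that boots the local metastore_db.
{noformat}
// Sketch only (not the actual Spark code): a prefix-based "shared class"
// check in the style of IsolatedClientLoader, with the org.apache.derby
// prefix added so Derby classes come from the root classloader rather than
// the isolated one.
object SharedClassCheckSketch {
  // Illustrative prefix list; the real list in IsolatedClientLoader differs.
  private val sharedPrefixes = Seq(
    "java.",
    "javax.sql.",
    "org.slf4j",
    "org.apache.log4j",
    "org.apache.derby."  // proposed addition for this issue
  )

  def isSharedClass(name: String): Boolean =
    sharedPrefixes.exists(name.startsWith)

  def main(args: Array[String]): Unit = {
    // The embedded driver that boots metastore_db would now be shared:
    println(isSharedClass("org.apache.derby.jdbc.EmbeddedDriver"))    // true
    // Hive classes stay isolated:
    println(isSharedClass("org.apache.hadoop.hive.ql.metadata.Hive")) // false
  }
}
{noformat}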

  was:
Add org.apache.derby to IsolatedClientLoader; otherwise it may throw an exception such as:
{noformat}
[info] Cause: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@2439ab23, see the next exception for details.
[info] at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
[info] at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
[info] at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
[info] at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
[info] at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
[info] at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
{noformat}
How to reproduce:
{noformat}
sed 's/HiveExternalCatalogSuite/HiveExternalCatalog2Suite/g' sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogSuite.scala > sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalog2Suite.scala

build/sbt -Phive "hive/test-only *.HiveExternalCatalogSuite *.HiveExternalCatalog2Suite"
{noformat}


> Add org.apache.derby to IsolatedClientLoader
> --------------------------------------------
>
>                 Key: SPARK-23831
>                 URL: https://issues.apache.org/jira/browse/SPARK-23831
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> Add org.apache.derby to IsolatedClientLoader; otherwise it may throw an exception such as:
> {noformat}
> [info] Cause: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@2439ab23, see the next exception for details.
> [info] at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> [info] at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> [info] at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
> [info] at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
> [info] at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
> [info] at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
> {noformat}
> How to reproduce:
> {noformat}
> build/sbt clean package -Phive -Phive-thriftserver
> export SPARK_PREPEND_CLASSES=true
> bin/spark-sql --conf spark.sql.hive.metastore.version=2.3.4 --conf spark.sql.hive.metastore.jars=maven -e "create table t1 as select 1 as c"
> {noformat}


