[jira] [Updated] (SPARK-29254) Failed to include jars passed in through --jars when isolatedLoader is enabled

2019-09-25 Thread Yuming Wang (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-29254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang updated SPARK-29254:

Description: 
Jars passed in through {{--jars}} fail to be included when {{isolatedLoader}} is enabled ({{spark.sql.hive.metastore.jars != builtin}}). How to reproduce:
{code:scala}
  test("SPARK-29254: include jars passed in through --jars when isolatedLoader is enabled") {
    val unusedJar = TestUtils.createJarWithClasses(Seq.empty)
    // jar1/jar2 carry classes the submitted driver program loads directly.
    val jar1 = TestUtils.createJarWithClasses(Seq("SparkSubmitClassA"))
    val jar2 = TestUtils.createJarWithClasses(Seq("SparkSubmitClassB"))
    // jar3/jar4 carry Hive-side classes (e.g. JsonSerDe in hive-hcatalog-core)
    // that the metastore client must be able to resolve.
    val jar3 = HiveTestJars.getHiveContribJar.getCanonicalPath
    val jar4 = HiveTestJars.getHiveHcatalogCoreJar.getCanonicalPath
    val jarsString = Seq(jar1, jar2, jar3, jar4).map(j => j.toString).mkString(",")
    val args = Seq(
      "--class", SparkSubmitClassLoaderTest.getClass.getName.stripSuffix("$"),
      "--name", "SparkSubmitClassLoaderTest",
      "--master", "local-cluster[2,1,1024]",
      "--conf", "spark.ui.enabled=false",
      "--conf", "spark.master.rest.enabled=false",
      "--conf", "spark.sql.hive.metastore.version=3.1.2",
      // Any value other than "builtin" turns on the isolated client loader.
      "--conf", "spark.sql.hive.metastore.jars=maven",
      "--driver-java-options", "-Dderby.system.durability=test",
      "--jars", jarsString,
      unusedJar.toString, "SparkSubmitClassA", "SparkSubmitClassB")
    runSparkSubmit(args)
  }
{code}
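
For context, a minimal sketch of the failing path the driver exercises (this is not the actual {{SparkSubmitClassLoaderTest}} body; the object name and DDL are assumed): creating a Hive table whose SerDe class ships in one of the {{--jars}}, here {{org.apache.hive.hcatalog.data.JsonSerDe}} from {{hive-hcatalog-core}}:
{code:scala}
import org.apache.spark.sql.SparkSession

// Hypothetical repro: the SerDe class is packaged in hive-hcatalog-core,
// which is passed via --jars. With spark.sql.hive.metastore.jars=maven the
// metastore client runs behind an isolated classloader that never sees that
// jar, so SerDe initialization fails with the ClassNotFoundException below.
object JsonSerDeRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
    spark.sql(
      """CREATE TABLE t_json (a INT)
        |ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
        |STORED AS TEXTFILE""".stripMargin)
  }
}
{code}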

Logs:
{noformat}
2019-09-25 22:11:42.854 - stderr> 19/09/25 22:11:42 ERROR log: error in initSerDe: java.lang.ClassNotFoundException Class org.apache.hive.hcatalog.data.JsonSerDe not found
2019-09-25 22:11:42.854 - stderr> java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.data.JsonSerDe not found
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:84)
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77)
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:289)
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:271)
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.hive.ql.metadata.Table.getColsInternal(Table.java:663)
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:646)
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:898)
2019-09-25 22:11:42.854 - stderr>   at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:937)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$createTable$1(HiveClientImpl.scala:539)
2019-09-25 22:11:42.854 - stderr>   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:311)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:245)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:244)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:294)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:537)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$createTable$1(HiveExternalCatalog.scala:284)
2019-09-25 22:11:42.854 - stderr>   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:242)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.createTable(ExternalCatalogWithListener.scala:94)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:325)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:132)
2019-09-25 22:11:42.854 - stderr>   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
2019-09-25 22:11:42.854 - stderr>   at 
{noformat}
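
The top frame is {{Configuration.getClassByName}} inside the metastore client, so the lookup fails in the client's isolated classloader, not in the application's. A diagnostic sketch (assumed to run on the driver, e.g. from {{spark-shell}}; the loader wiring in the comments is my reading of the isolated-loader design, not confirmed from this log alone):
{code:scala}
// The same class resolves from the application/session classloader, where
// --jars entries are added, but not from the classloader that
// spark.sql.hive.metastore.jars=maven builds for the Hive client.
val serde = "org.apache.hive.hcatalog.data.JsonSerDe"

// Expected to succeed in client mode: --jars jars are on the driver's
// context classloader.
Thread.currentThread().getContextClassLoader.loadClass(serde)

// The equivalent lookup inside HiveClientImpl goes through the
// IsolatedClientLoader's classloader, which contains only the downloaded
// Hive/Hadoop jars plus shared classes, hence the exception above.
{code}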

[jira] [Updated] (SPARK-29254) Failed to include jars passed in through --jars when isolatedLoader is enabled

2019-09-25 Thread Yuming Wang (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-29254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang updated SPARK-29254:

Summary: Failed to include jars passed in through --jars when isolatedLoader is enabled  (was: Failed to include jars passed in through --jars when isolatedLoader is enabled())

> Failed to include jars passed in through --jars when isolatedLoader is enabled
> --
>
> Key: SPARK-29254
> URL: https://issues.apache.org/jira/browse/SPARK-29254
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 3.0.0
>Reporter: Yuming Wang
>Priority: Major
>

[jira] [Updated] (SPARK-29254) Failed to include jars passed in through --jars when isolatedLoader is enabled()

2019-09-25 Thread Yuming Wang (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-29254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang updated SPARK-29254:

Description: 
Jars passed in through {{--jars}} fail to be included when {{isolatedLoader}} is enabled ({{spark.sql.hive.metastore.jars != builtin}}). How to reproduce:
{code:scala}
  test("SPARK-29254: include jars passed in through --jars when isolatedLoader is enabled") {
    val unusedJar = TestUtils.createJarWithClasses(Seq.empty)
    val jar1 = TestUtils.createJarWithClasses(Seq("SparkSubmitClassA"))
    val jar2 = TestUtils.createJarWithClasses(Seq("SparkSubmitClassB"))
    val jar3 = HiveTestJars.getHiveContribJar.getCanonicalPath
    val jar4 = HiveTestJars.getHiveHcatalogCoreJar.getCanonicalPath
    val jarsString = Seq(jar1, jar2, jar3, jar4).map(j => j.toString).mkString(",")
    val args = Seq(
      "--class", SparkSubmitClassLoaderTest.getClass.getName.stripSuffix("$"),
      "--name", "SparkSubmitClassLoaderTest",
      "--master", "local-cluster[2,1,1024]",
      "--conf", "spark.ui.enabled=false",
      "--conf", "spark.master.rest.enabled=false",
      "--conf", "spark.sql.hive.metastore.version=3.1.2",
      "--conf", "spark.sql.hive.metastore.jars=maven",
      "--driver-java-options", "-Dderby.system.durability=test",
      "--jars", jarsString,
      unusedJar.toString, "SparkSubmitClassA", "SparkSubmitClassB")
    runSparkSubmit(args)
  }
{code}



  was:
Failed to include jars passed in through --jars when isolatedLoader is enabled ({{spark.sql.hive.metastore.jars != builtin}}). How to reproduce:
{code:scala}
  test("SPARK-8368: include jars passed in through --jars when isolatedLoader is enabled") {
    val unusedJar = TestUtils.createJarWithClasses(Seq.empty)
    val jar1 = TestUtils.createJarWithClasses(Seq("SparkSubmitClassA"))
    val jar2 = TestUtils.createJarWithClasses(Seq("SparkSubmitClassB"))
    val jar3 = HiveTestJars.getHiveContribJar.getCanonicalPath
    val jar4 = HiveTestJars.getHiveHcatalogCoreJar.getCanonicalPath
    val jarsString = Seq(jar1, jar2, jar3, jar4).map(j => j.toString).mkString(",")
    val args = Seq(
      "--class", SparkSubmitClassLoaderTest.getClass.getName.stripSuffix("$"),
      "--name", "SparkSubmitClassLoaderTest",
      "--master", "local-cluster[2,1,1024]",
      "--conf", "spark.ui.enabled=false",
      "--conf", "spark.master.rest.enabled=false",
      "--conf", "spark.sql.hive.metastore.version=3.1.2",
      "--conf", "spark.sql.hive.metastore.jars=maven",
      "--driver-java-options", "-Dderby.system.durability=test",
      "--jars", jarsString,
      unusedJar.toString, "SparkSubmitClassA", "SparkSubmitClassB")
    runSparkSubmit(args)
  }
{code}
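
Until {{--jars}} entries are propagated to the isolated loader, one possible workaround (an untested sketch; the classpath below is a placeholder) is to make the jar visible to the metastore client directly: besides {{builtin}} and {{maven}}, {{spark.sql.hive.metastore.jars}} also accepts a classpath in the standard format.
{code:scala}
import org.apache.spark.sql.SparkSession

// Workaround sketch: point spark.sql.hive.metastore.jars at a classpath
// that already contains hive-hcatalog-core (placeholder path), so the
// isolated client loader can resolve JsonSerDe on its own.
val spark = SparkSession.builder()
  .config("spark.sql.hive.metastore.version", "3.1.2")
  .config("spark.sql.hive.metastore.jars", "/path/to/hive-3.1.2/lib/*")
  .enableHiveSupport()
  .getOrCreate()
{code}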







--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org