[ https://issues.apache.org/jira/browse/SPARK-25337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-25337:
----------------------------------
    Description: 
Observed consistently in the Scala 2.12 pull request builder now. I don't see this failing in the main 2.11 builds, so I assume it's 2.12-related, though it's hard to see how.

CC [~sadhen]
{code:java}
org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite *** ABORTED ***
Exception encountered when invoking run on a nested suite - spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/warehouse-37386cdb-c0fb-405d-9442-8f0044b81643' '--conf' 'spark.sql.test.version.index=0' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/warehouse-37386cdb-c0fb-405d-9442-8f0044b81643' '/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/test7888487003559759098.py'
...
2018-09-04 20:00:04.949 - stdout>   File "/private/tmp/test-spark/spark-2.1.3/python/lib/pyspark.zip/pyspark/sql/session.py", line 545, in sql
2018-09-04 20:00:04.949 - stdout>   File "/private/tmp/test-spark/spark-2.1.3/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
2018-09-04 20:00:04.949 - stdout>   File "/private/tmp/test-spark/spark-2.1.3/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
2018-09-04 20:00:04.949 - stdout>   File "/private/tmp/test-spark/spark-2.1.3/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2018-09-04 20:00:04.95 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o27.sql.
2018-09-04 20:00:04.95 - stdout> : java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.hive.execution.HiveFileFormat could not be instantiated
{code}
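
The ServiceConfigurationError above is what java.util.ServiceLoader raises when one of the DataSourceRegister providers listed under META-INF/services fails to instantiate; the real failure is the exception chained as its cause (truncated in the log above). If it helps to narrow down which classpath the provider classes are actually coming from, here is a minimal sketch that forces the same provider instantiation outside the suite. It only assumes the spark-sql jar is on the classpath; the object name ListProviders is just for illustration.
{code:scala}
import java.util.ServiceLoader

import scala.collection.JavaConverters._

import org.apache.spark.sql.sources.DataSourceRegister

object ListProviders {
  def main(args: Array[String]): Unit = {
    // Iterating the ServiceLoader instantiates each registered provider, so a
    // broken provider reproduces the same ServiceConfigurationError seen above.
    val providers = ServiceLoader.load(classOf[DataSourceRegister]).asScala
    providers.foreach { p =>
      println(s"${p.shortName()} -> ${p.getClass.getName} (${p.getClass.getProtectionDomain.getCodeSource})")
    }
  }
}
{code}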

  was:
Observed consistently in the Scala 2.12 pull request builder now. I don't see this failing in the main 2.11 builds, so I assume it's 2.12-related, though it's hard to see how.

CC [~sadhen]
{code:java}
org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite *** ABORTED ***
Exception encountered when invoking run on a nested suite - spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/warehouse-37386cdb-c0fb-405d-9442-8f0044b81643' '--conf' 'spark.sql.test.version.index=0' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/warehouse-37386cdb-c0fb-405d-9442-8f0044b81643' '/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/test7888487003559759098.py'
...
2018-09-04 07:48:30.833 - stdout> : java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.execution.datasources.orc.OrcFileFormat could not be instantiated
...
2018-09-04 07:48:30.834 - stdout> Caused by: java.lang.NoSuchMethodError: org.apache.spark.sql.execution.datasources.FileFormat.$init$(Lorg/apache/spark/sql/execution/datasources/FileFormat;)V
2018-09-04 07:48:30.834 - stdout> at org.apache.spark.sql.execution.datasources.orc.OrcFileFormat.<init>(OrcFileFormat.scala:81)
...{code}
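
The NoSuchMethodError on FileFormat.$init$ looks like a trait-encoding mismatch: Scala 2.12 compiles a trait to a Java interface whose initialization code is a static $init$ method on the interface itself, whereas 2.11 emits it on a separate FileFormat$class helper, so a class compiled against one encoding fails at construction time when the trait's bytecode comes from the other. Below is a minimal, self-contained sketch of that construction path; FileFormatLike and OrcLikeFormat are hypothetical stand-ins, not Spark source, and the sketch only illustrates the mechanism, not the suite's exact classpath mixing.
{code:scala}
// Hypothetical stand-ins for FileFormat / OrcFileFormat; not Spark source.
trait FileFormatLike {
  // A concrete member forces the compiler to emit trait initialization code.
  val formatName: String = "file"
  def supportBatch: Boolean = false
}

// Compiled with Scala 2.12, this constructor calls the static interface
// method FileFormatLike.$init$(this); compiled with 2.11 it calls
// FileFormatLike$class.$init$(this) instead. If the subclass and the trait
// bytecode come from different Scala versions, construction fails with
//   java.lang.NoSuchMethodError: FileFormatLike.$init$(...)V
class OrcLikeFormat extends FileFormatLike

object TraitEncodingDemo {
  def main(args: Array[String]): Unit = {
    // Reflective instantiation, as ServiceLoader does for DataSourceRegister.
    val fmt = Class.forName("OrcLikeFormat")
      .getDeclaredConstructor()
      .newInstance()
      .asInstanceOf[FileFormatLike]
    println(fmt.formatName)
  }
}
{code}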


> HiveExternalCatalogVersionsSuite + Scala 2.12 = NoSuchMethodError: 
> org.apache.spark.sql.execution.datasources.FileFormat.$init$(Lorg/apache/spark/sql/execution/datasources/FileFormat;)
> ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25337
>                 URL: https://issues.apache.org/jira/browse/SPARK-25337
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Sean Owen
>            Priority: Major
>


