[ 
https://issues.apache.org/jira/browse/HUDI-3957?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sivabalan narayanan reassigned HUDI-3957:
-----------------------------------------

    Assignee: sivabalan narayanan

> Support spark2 and scala12 testing w/ integ test bundle
> -------------------------------------------------------
>
>                 Key: HUDI-3957
>                 URL: https://issues.apache.org/jira/browse/HUDI-3957
>             Project: Apache Hudi
>          Issue Type: Task
>          Components: tests-ci
>            Reporter: sivabalan narayanan
>            Assignee: sivabalan narayanan
>            Priority: Major
>             Fix For: 0.12.0
>
>
> Currently, the integ test bundle does not work for spark2 and scala12. Spark session initialization (enableHiveSupport()) expects some Hive classes which are missing from the bundle.
>  
> {code:java}
> 22/04/22 20:26:08 WARN testsuite.HoodieTestSuiteJob: Running spark job w/ app id local-1650673568081
> 22/04/22 20:26:08 INFO fs.FSUtils: Resolving file /tmp/test.propertiesto be a remote file.
> Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
>       at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:871)
>       at org.apache.hudi.integ.testsuite.HoodieTestSuiteJob.<init>(HoodieTestSuiteJob.java:110)
>       at org.apache.hudi.integ.testsuite.HoodieTestSuiteJob.main(HoodieTestSuiteJob.java:180)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>       at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:855)
>       at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
>       at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
>       at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>       at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 22/04/22 20:26:08 INFO spark.SparkContext: Invoking stop() from shutdown hook
> 22/04/22 20:26:08 INFO server.AbstractConnector: Stopped Spark@889d9e8{HTTP/1.1, (http/1.1)}{0.0.0.0:8090} {code}
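> As a sketch only (not the actual HoodieTestSuiteJob code, and the app name/master below are placeholders), one way to avoid this failure is to enable Hive support only when the Hive classes are actually on the classpath; the class checked below is one of the classes Spark itself looks for before enabling Hive support:
> {code:java}
> import org.apache.spark.sql.SparkSession;
>
> public class HiveAwareSessionExample {
>   public static void main(String[] args) {
>     // Placeholder app name and master; in practice these come from spark-submit.
>     SparkSession.Builder builder = SparkSession.builder()
>         .appName("HoodieTestSuiteJob")
>         .master("local[2]");
>     try {
>       // One of the classes Spark probes before allowing enableHiveSupport().
>       Class.forName("org.apache.hadoop.hive.conf.HiveConf");
>       builder = builder.enableHiveSupport();
>     } catch (ClassNotFoundException e) {
>       // Hive classes are not in the bundle; fall back to a plain SparkSession.
>     }
>     SparkSession spark = builder.getOrCreate();
>     spark.stop();
>   }
> }
> {code}
> With spark2/scala12 bundles that omit the Hive jars, the job would then start with a plain SparkSession instead of failing in the constructor.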



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
