[jira] [Updated] (SPARK-30755) Support Hive 1.2.1's Serde after making built-in Hive to 2.3

2020-02-08 Thread Xiao Li (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-30755?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-30755:

Target Version/s: 3.0.0
 Description: 
{noformat}
2020-01-27 05:11:20.446 - stderr> 20/01/27 05:11:20 INFO DAGScheduler: ResultStage 2 (main at NativeMethodAccessorImpl.java:0) failed in 1.000 s due to Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 13, 10.110.21.210, executor 1): java.lang.NoClassDefFoundError: org/apache/hadoop/hive/serde2/SerDe
  2020-01-27 05:11:20.446 - stderr>  at java.lang.ClassLoader.defineClass1(Native Method)
  2020-01-27 05:11:20.446 - stderr>  at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
  2020-01-27 05:11:20.446 - stderr>  at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
  2020-01-27 05:11:20.446 - stderr>  at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
  2020-01-27 05:11:20.446 - stderr>  at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
  2020-01-27 05:11:20.446 - stderr>  at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
  2020-01-27 05:11:20.446 - stderr>  at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
  2020-01-27 05:11:20.446 - stderr>  at java.security.AccessController.doPrivileged(Native Method)
  2020-01-27 05:11:20.446 - stderr>  at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
  2020-01-27 05:11:20.446 - stderr>  at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
  2020-01-27 05:11:20.446 - stderr>  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
  2020-01-27 05:11:20.446 - stderr>  at java.lang.ClassLoader.loadClass(ClassLoader.java:405)
  2020-01-27 05:11:20.446 - stderr>  at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
  2020-01-27 05:11:20.446 - stderr>  at java.lang.Class.forName0(Native Method)
  2020-01-27 05:11:20.446 - stderr>  at java.lang.Class.forName(Class.java:348)
  2020-01-27 05:11:20.446 - stderr>  at org.apache.hadoop.hive.ql.plan.TableDesc.getDeserializerClass(TableDesc.java:76)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.sql.hive.execution.HiveOutputWriter.<init>(HiveFileFormat.scala:119)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.sql.hive.execution.HiveFileFormat$$anon$1.newInstance(HiveFileFormat.scala:104)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.newOutputWriter(FileFormatDataWriter.scala:126)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.<init>(FileFormatDataWriter.scala:111)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:267)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$write$15(FileFormatWriter.scala:208)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.scheduler.Task.doRunTask(Task.scala:144)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.scheduler.Task.run(Task.scala:117)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$6(Executor.scala:567)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1559)
  2020-01-27 05:11:20.447 - stderr>  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:570)
  2020-01-27 05:11:20.447 - stderr>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  2020-01-27 05:11:20.447 - stderr>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  2020-01-27 05:11:20.447 - stderr>  at java.lang.Thread.run(Thread.java:748)
  2020-01-27 05:11:20.447 - stderr> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDe
  2020-01-27 05:11:20.447 - stderr>  at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
  2020-01-27 05:11:20.447 - stderr>  at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
  2020-01-27 05:11:20.447 - stderr>  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
  2020-01-27 05:11:20.447 - stderr>  at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
  2020-01-27 05:11:20.447 - stderr>  ... 31 more
{noformat}
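
The failure happens because the built-in Hive 2.3 serde jars no longer ship the old {{org.apache.hadoop.hive.serde2.SerDe}} interface, so a SerDe compiled against Hive 1.2.1 that still implements it cannot be linked when the write path resolves it via {{TableDesc.getDeserializerClass}}. A rough reproduction sketch follows; the jar path and {{com.example.LegacySerDe}} are hypothetical placeholders for any user SerDe built against Hive 1.2.1, not classes from this report:

{code:scala}
// Sketch only: assumes a user SerDe jar compiled against Hive 1.2.1 whose
// class com.example.LegacySerDe implements the old
// org.apache.hadoop.hive.serde2.SerDe interface (jar path and class name
// are placeholders).
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-1.2-serde-repro")
  .enableHiveSupport()
  .getOrCreate()

// Make the legacy SerDe visible to the session.
spark.sql("ADD JAR /tmp/legacy-serde-1.2.1.jar")

// Table declares the legacy SerDe.
spark.sql("""
  CREATE TABLE t (id INT, name STRING)
  ROW FORMAT SERDE 'com.example.LegacySerDe'
  STORED AS TEXTFILE
""")

// Writing resolves the SerDe through TableDesc.getDeserializerClass. With the
// built-in Hive 2.3 jars the superinterface org.apache.hadoop.hive.serde2.SerDe
// is absent, so defining the class fails with the NoClassDefFoundError above.
spark.sql("INSERT INTO t VALUES (1, 'a')")
{code}

The table schema above is illustrative; any 1.2.1-era SerDe that implements the removed interface (rather than extending {{AbstractSerDe}}) should hit the same error on write.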



[jira] [Updated] (SPARK-30755) Support Hive 1.2.1's Serde after making built-in Hive to 2.3

2020-02-08 Thread Xiao Li (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-30755?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-30755:

Priority: Blocker  (was: Major)

> Support Hive 1.2.1's Serde after making built-in Hive to 2.3
> 
>
> Key: SPARK-30755
> URL: https://issues.apache.org/jira/browse/SPARK-30755
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Yuming Wang
> Priority: Blocker
>