[ https://issues.apache.org/jira/browse/FLINK-13998?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xuefu Zhang updated FLINK-13998:
--------------------------------
    Description: 
Our test uses the local file system, and ORC in Hive 2.0.x seems to have an issue with that.
{code}
06:54:43.156 [ORC_GET_SPLITS #0] ERROR org.apache.hadoop.hive.ql.io.AcidUtils - Failed to get files with ID; using regular API
java.lang.UnsupportedOperationException: Only supported for DFS; got class org.apache.hadoop.fs.LocalFileSystem
        at org.apache.hadoop.hive.shims.Hadoop23Shims.ensureDfs(Hadoop23Shims.java:813) ~[hive-exec-2.0.0.jar:2.0.0]
        at org.apache.hadoop.hive.shims.Hadoop23Shims.listLocatedHdfsStatus(Hadoop23Shims.java:784) ~[hive-exec-2.0.0.jar:2.0.0]
        at org.apache.hadoop.hive.ql.io.AcidUtils.getAcidState(AcidUtils.java:477) [hive-exec-2.0.0.jar:2.0.0]
        at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:890) [hive-exec-2.0.0.jar:2.0.0]
        at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:875) [hive-exec-2.0.0.jar:2.0.0]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
{code}
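Judging by the top frame and the exception message, the failure comes from a filesystem-type guard in Hive's shim layer (`Hadoop23Shims.ensureDfs`), which rejects any `FileSystem` that is not HDFS, so a test backed by `LocalFileSystem` trips it. A minimal, self-contained Java sketch of that guard pattern (an illustration only, not Hive's actual source; the class-name comparison is an assumption reconstructed from the message in the trace):

```java
// Hedged sketch of the kind of check implied by the stack trace above.
// Hive's real ensureDfs takes a Hadoop FileSystem; a Class stands in here
// so the example runs without Hadoop on the classpath.
public class EnsureDfsSketch {

    static void ensureDfs(Class<?> fsClass) {
        // Illustrative stand-in: anything that is not the HDFS
        // DistributedFileSystem is rejected, which is why a test using
        // LocalFileSystem hits this exception.
        if (!fsClass.getSimpleName().equals("DistributedFileSystem")) {
            throw new UnsupportedOperationException(
                "Only supported for DFS; got class " + fsClass.getName());
        }
    }

    public static void main(String[] args) {
        try {
            // Simulate the local-filesystem case from the trace: any
            // non-DFS class triggers the throw.
            ensureDfs(String.class);
            System.out.println("no exception");
        } catch (UnsupportedOperationException e) {
            System.out.println("threw: " + e.getMessage());
        }
    }
}
```

Under this reading, the fix on the test side is to avoid the file-ID listing path (or tolerate its fallback to the "regular API") when the warehouse directory is on a local filesystem.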

  was:Including 2.0.0 and 2.0.1.


> Fix ORC test failure with Hive 2.0.x
> ------------------------------------
>
>                 Key: FLINK-13998
>                 URL: https://issues.apache.org/jira/browse/FLINK-13998
>             Project: Flink
>          Issue Type: Improvement
>          Components: Connectors / Hive
>            Reporter: Xuefu Zhang
>            Assignee: Xuefu Zhang
>            Priority: Major
>             Fix For: 1.10.0
>
>
> Our test uses the local file system, and ORC in Hive 2.0.x seems to have an issue with that.
> {code}
> 06:54:43.156 [ORC_GET_SPLITS #0] ERROR org.apache.hadoop.hive.ql.io.AcidUtils - Failed to get files with ID; using regular API
> java.lang.UnsupportedOperationException: Only supported for DFS; got class org.apache.hadoop.fs.LocalFileSystem
>       at org.apache.hadoop.hive.shims.Hadoop23Shims.ensureDfs(Hadoop23Shims.java:813) ~[hive-exec-2.0.0.jar:2.0.0]
>       at org.apache.hadoop.hive.shims.Hadoop23Shims.listLocatedHdfsStatus(Hadoop23Shims.java:784) ~[hive-exec-2.0.0.jar:2.0.0]
>       at org.apache.hadoop.hive.ql.io.AcidUtils.getAcidState(AcidUtils.java:477) [hive-exec-2.0.0.jar:2.0.0]
>       at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:890) [hive-exec-2.0.0.jar:2.0.0]
>       at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:875) [hive-exec-2.0.0.jar:2.0.0]
>       at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
>       at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
>       at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
>       at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)