[ https://issues.apache.org/jira/browse/HIVE-21683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16844558#comment-16844558 ]

Hive QA commented on HIVE-21683:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12969187/hive-21683.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/17264/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/17264/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-17264/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Tests exited with: Exception: Patch URL https://issues.apache.org/jira/secure/attachment/12969187/hive-21683.patch was found in seen patch url's cache and a test was probably run already on it. Aborting...
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12969187 - PreCommit-HIVE-Build

> ProxyFileSystem breaks with Hadoop trunk
> ----------------------------------------
>
>                 Key: HIVE-21683
>                 URL: https://issues.apache.org/jira/browse/HIVE-21683
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Todd Lipcon
>            Assignee: Todd Lipcon
>            Priority: Major
>         Attachments: hive-21683-javassist.patch, hive-21683-simple.patch, hive-21683.patch
>
>
> When trying to run with a recent build of Hadoop which includes HADOOP-15229, I ran into the following stack:
> {code}
> Caused by: java.lang.IllegalArgumentException: Wrong FS: pfile:/src/hive/itests/qtest/target/warehouse/src/kv1.txt, expected: file:///
>         at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:793) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:86) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:636) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:930) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:456) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:153) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:354) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.ChecksumFileSystem.lambda$openFileWithOptions$0(ChecksumFileSystem.java:846) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.util.LambdaUtils.eval(LambdaUtils.java:52) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.ChecksumFileSystem.openFileWithOptions(ChecksumFileSystem.java:845) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.fs.FileSystem$FSDataInputStreamBuilder.build(FileSystem.java:4522) ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
>         at org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:115) ~[hadoop-mapreduce-client-core-3.1.1.6.0.99.0-135.jar:?]
> {code}
> We need to add appropriate path-swizzling wrappers for the new APIs in ProxyFileSystem23.
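
For illustration, a minimal sketch of what one such wrapper could look like is below. This is not the attached patch; it assumes Hive's ProxyFileSystem lives in org.apache.hadoop.fs, exposes a swizzleParamPath(Path) helper and a (FileSystem, URI) constructor, and that the Hadoop build in question declares the HADOOP-15229 hook as a protected openFileWithOptions(Path, Set<String>, Configuration, int) returning CompletableFuture<FSDataInputStream> (the exact signature varies across Hadoop releases).

{code}
// Sketch only, not the attached patch. Assumed: ProxyFileSystem provides
// swizzleParamPath(Path) and a (FileSystem, URI) constructor, and this Hadoop
// line declares openFileWithOptions(Path, Set<String>, Configuration, int).
package org.apache.hadoop.fs;

import java.io.IOException;
import java.net.URI;
import java.util.Set;
import java.util.concurrent.CompletableFuture;

import org.apache.hadoop.conf.Configuration;

public class ProxyFileSystem23 extends ProxyFileSystem {

  public ProxyFileSystem23(FileSystem fs, URI uri) {
    super(fs, uri);
  }

  @Override
  protected CompletableFuture<FSDataInputStream> openFileWithOptions(
      Path path, Set<String> mandatoryKeys, Configuration options, int bufferSize)
      throws IOException {
    // Rewrite the pfile: path to the wrapped scheme before delegating, the same
    // swizzling the existing FileSystem overrides perform, so checkPath() on the
    // underlying local filesystem never sees the pfile: scheme.
    return super.openFileWithOptions(
        swizzleParamPath(path), mandatoryKeys, options, bufferSize);
  }
}
{code}

Any other builder-based entry points introduced by HADOOP-15229 would need the same treatment.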



