[jira] [Commented] (HIVE-21683) ProxyFileSystem breaks with Hadoop trunk

2020-04-27 Thread Hive QA (Jira)


[ 
https://issues.apache.org/jira/browse/HIVE-21683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17093493#comment-17093493
 ] 

Hive QA commented on HIVE-21683:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12969187/hive-21683.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/21968/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/21968/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-21968/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2020-04-27 13:08:32.666
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-21968/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2020-04-27 13:08:32.669
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at d6ad73c HIVE-23286: The clean-up in case of an aborted 
FileSinkOperator is not correct for ACID direct insert (Marta Kuczora, reviewed 
by Peter Vary)
+ git clean -f -d
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at d6ad73c HIVE-23286: The clean-up in case of an aborted 
FileSinkOperator is not correct for ACID direct insert (Marta Kuczora, reviewed 
by Peter Vary)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2020-04-27 13:08:34.001
+ rm -rf ../yetus_PreCommit-HIVE-Build-21968
+ mkdir ../yetus_PreCommit-HIVE-Build-21968
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-21968
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-21968/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Trying to apply the patch with -p0
error: a/pom.xml: does not exist in index
error: a/shims/0.23/pom.xml: does not exist in index
error: 
a/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: 
does not exist in index
error: a/shims/common/src/main/java/org/apache/hadoop/fs/ProxyFileSystem.java: 
does not exist in index
Trying to apply the patch with -p1
error: patch failed: pom.xml:177
Falling back to three-way merge...
Applied patch to 'pom.xml' with conflicts.
error: patch failed: shims/0.23/pom.xml:46
Falling back to three-way merge...
Applied patch to 'shims/0.23/pom.xml' with conflicts.
error: patch failed: 
shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java:37
Falling back to three-way merge...
Applied patch to 
'shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java' with 
conflicts.
Going to apply patch with: git apply -p1
error: patch failed: pom.xml:177
Falling back to three-way merge...
Applied patch to 'pom.xml' with conflicts.
error: patch failed: shims/0.23/pom.xml:46
Falling back to three-way merge...
Applied patch to 'shims/0.23/pom.xml' with conflicts.
error: patch failed: 
shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java:37
Falling back to three-way merge...
Applied patch to 
'shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java' with 
conflicts.
U pom.xml
U shims/0.23/pom.xml
U shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-21968
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12969187 - PreCommit-HIVE-Build

> ProxyFileSystem breaks with Hadoop trunk
> 
>
> Key: HIVE-21683
> URL: https://issues.apache.org/jira/browse/HIVE-21683

[jira] [Commented] (HIVE-21683) ProxyFileSystem breaks with Hadoop trunk

2019-05-21 Thread Hive QA (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-21683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16844558#comment-16844558
 ] 

Hive QA commented on HIVE-21683:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12969187/hive-21683.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/17264/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/17264/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-17264/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Tests exited with: Exception: Patch URL 
https://issues.apache.org/jira/secure/attachment/12969187/hive-21683.patch was 
found in seen patch url's cache and a test was probably run already on it. 
Aborting...
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12969187 - PreCommit-HIVE-Build

> ProxyFileSystem breaks with Hadoop trunk
> 
>
> Key: HIVE-21683
> URL: https://issues.apache.org/jira/browse/HIVE-21683
> Project: Hive
>  Issue Type: Bug
>Reporter: Todd Lipcon
>Assignee: Todd Lipcon
>Priority: Major
> Attachments: hive-21683-javassist.patch, hive-21683-simple.patch, 
> hive-21683.patch
>
>
> When trying to run with a recent build of Hadoop which includes HADOOP-15229, 
> I ran into the following stack trace:
> {code}
> Caused by: java.lang.IllegalArgumentException: Wrong FS: 
> pfile:/src/hive/itests/qtest/target/warehouse/src/kv1.txt, expected: file:///
> at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:793) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:86)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:636)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:930)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:456)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:153)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:354) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.lambda$openFileWithOptions$0(ChecksumFileSystem.java:846)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at org.apache.hadoop.util.LambdaUtils.eval(LambdaUtils.java:52) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.openFileWithOptions(ChecksumFileSystem.java:845)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.FileSystem$FSDataInputStreamBuilder.build(FileSystem.java:4522)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:115) 
> ~[hadoop-mapreduce-client-core-3.1.1.6.0.99.0-135.jar:?]{code}
> We need to add appropriate path-swizzling wrappers for the new APIs in 
> ProxyFileSystem23.
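
For context, the path-swizzling pattern the description refers to looks roughly like the sketch below. This is a minimal illustration under assumptions, not the attached patch: the SwizzlingFileSystem class name and the local swizzleParamPath helper are hypothetical stand-ins for what ProxyFileSystem already does, and the note about the builder-based openFile() entry points reflects HADOOP-15229 as described in this thread.

{code}
// Minimal sketch of a path-swizzling wrapper in the ProxyFileSystem style
// (hypothetical class name; not the actual HIVE-21683 patch).
import java.io.IOException;

import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FilterFileSystem;
import org.apache.hadoop.fs.Path;

public class SwizzlingFileSystem extends FilterFileSystem {

  // Rewrite a proxy path (e.g. pfile:/...) to the scheme and authority of the
  // wrapped FileSystem before delegating, so checkPath() in the underlying
  // RawLocalFileSystem no longer rejects it with "Wrong FS".
  private Path swizzleParamPath(Path p) {
    return new Path(fs.getUri().getScheme(), fs.getUri().getAuthority(),
        p.toUri().getPath());
  }

  @Override
  public FSDataInputStream open(Path f, int bufferSize) throws IOException {
    // Existing entry points are wrapped like this today.
    return super.open(swizzleParamPath(f), bufferSize);
  }

  // The builder-based openFile()/openFileWithOptions() entry points added by
  // HADOOP-15229 bypass overrides like the one above, which is why the new
  // APIs need analogous wrappers in ProxyFileSystem23.
}
{code}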



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-21683) ProxyFileSystem breaks with Hadoop trunk

2019-05-20 Thread Hive QA (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-21683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16844219#comment-16844219
 ] 

Hive QA commented on HIVE-21683:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12969187/hive-21683.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 7 failed/errored test(s), 16057 tests 
executed
*Failed tests:*
{noformat}
org.apache.hive.jdbc.TestTriggersTezSessionPoolManager.testTriggerCustomCreatedDynamicPartitions
 (batchId=274)
org.apache.hive.jdbc.TestTriggersTezSessionPoolManager.testTriggerCustomCreatedDynamicPartitionsUnionAll
 (batchId=274)
org.apache.hive.jdbc.TestTriggersTezSessionPoolManager.testTriggerCustomNonExistent
 (batchId=274)
org.apache.hive.jdbc.TestTriggersTezSessionPoolManager.testTriggerHighBytesRead 
(batchId=274)
org.apache.hive.jdbc.TestTriggersTezSessionPoolManager.testTriggerHighShuffleBytes
 (batchId=274)
org.apache.hive.jdbc.TestTriggersTezSessionPoolManager.testTriggerSlowQueryElapsedTime
 (batchId=274)
org.apache.hive.jdbc.TestTriggersTezSessionPoolManager.testTriggerSlowQueryExecutionTime
 (batchId=274)
{noformat}

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/17259/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/17259/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-17259/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 7 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12969187 - PreCommit-HIVE-Build

> ProxyFileSystem breaks with Hadoop trunk
> 
>
> Key: HIVE-21683
> URL: https://issues.apache.org/jira/browse/HIVE-21683
> Project: Hive
>  Issue Type: Bug
>Reporter: Todd Lipcon
>Assignee: Todd Lipcon
>Priority: Major
> Attachments: hive-21683-javassist.patch, hive-21683-simple.patch, 
> hive-21683.patch
>
>
> When trying to run with a recent build of Hadoop which includes HADOOP-15229, 
> I ran into the following stack trace:
> {code}
> Caused by: java.lang.IllegalArgumentException: Wrong FS: 
> pfile:/src/hive/itests/qtest/target/warehouse/src/kv1.txt, expected: file:///
> at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:793) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:86)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:636)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:930)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:456)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:153)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:354) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.lambda$openFileWithOptions$0(ChecksumFileSystem.java:846)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at org.apache.hadoop.util.LambdaUtils.eval(LambdaUtils.java:52) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.openFileWithOptions(ChecksumFileSystem.java:845)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.FileSystem$FSDataInputStreamBuilder.build(FileSystem.java:4522)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:115) 
> ~[hadoop-mapreduce-client-core-3.1.1.6.0.99.0-135.jar:?]{code}
> We need to add appropriate path-swizzling wrappers for the new APIs in 
> ProxyFileSystem23.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-21683) ProxyFileSystem breaks with Hadoop trunk

2019-05-20 Thread Hive QA (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-21683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16844205#comment-16844205
 ] 

Hive QA commented on HIVE-21683:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
1s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  1m 
58s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  7m 
30s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  6m 
53s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  2m 
18s{color} | {color:green} master passed {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m 
20s{color} | {color:blue} shims/common in master has 6 extant Findbugs 
warnings. {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m 
22s{color} | {color:blue} shims/0.23 in master has 7 extant Findbugs warnings. 
{color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  7m 
25s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
26s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  7m 
48s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  7m  
6s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  7m  
6s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
10s{color} | {color:red} shims/0.23: The patch generated 4 new + 65 unchanged - 
0 fixed = 69 total (was 65) {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  2m  
7s{color} | {color:red} root: The patch generated 4 new + 79 unchanged - 0 
fixed = 83 total (was 79) {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} xml {color} | {color:green}  0m  
2s{color} | {color:green} The patch has no ill-formed XML file. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  0m 
27s{color} | {color:green} common in the patch passed. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  0m 
30s{color} | {color:green} shims/0.23 generated 0 new + 6 unchanged - 1 fixed = 
6 total (was 7) {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  7m 
10s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
13s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 53m 41s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  xml  compile  findbugs  
checkstyle  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.43-2+deb8u5 (2017-09-19) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-17259/dev-support/hive-personality.sh
 |
| git revision | master / 8b8e702 |
| Default Java | 1.8.0_111 |
| findbugs | v3.0.0 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-17259/yetus/diff-checkstyle-shims_0.23.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-17259/yetus/diff-checkstyle-root.txt
 |
| modules | C: shims/common shims/0.23 . U: . |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-17259/yetus.txt |
| Powered by | Apache Yetus   http://yetus.apache.org |


This message was automatically generated.



> ProxyFileSystem breaks with Hadoop trunk
> 
>
> Key: HIVE-21683
> URL: https://issues.apache.org/jira/browse/HIVE-21683
> Project: Hive
>  Issue Type: Bug
>Reporter: Todd Lipcon
>Assignee: Todd Lipcon
>Priority: Major
> Attachments: hive-21683-javassist.patch, hive-21683-simple.patch, 

[jira] [Commented] (HIVE-21683) ProxyFileSystem breaks with Hadoop trunk

2019-05-06 Thread Peter Vary (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-21683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16833613#comment-16833613
 ] 

Peter Vary commented on HIVE-21683:
---

[~tlipcon]: I would prefer the javassist way.

Could you please attach the patch with the name HIVE-21683.patch and set the 
Jira to "Patch Available" state, so the PreCommit tests are run?

Thanks,

Peter

> ProxyFileSystem breaks with Hadoop trunk
> 
>
> Key: HIVE-21683
> URL: https://issues.apache.org/jira/browse/HIVE-21683
> Project: Hive
>  Issue Type: Bug
>Reporter: Todd Lipcon
>Assignee: Todd Lipcon
>Priority: Major
> Attachments: hive-21683-javassist.patch, hive-21683-simple.patch
>
>
> When trying to run with a recent build of Hadoop which includes HADOOP-15229, 
> I ran into the following stack trace:
> {code}
> Caused by: java.lang.IllegalArgumentException: Wrong FS: 
> pfile:/src/hive/itests/qtest/target/warehouse/src/kv1.txt, expected: file:///
> at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:793) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:86)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:636)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:930)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:456)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:153)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:354) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.lambda$openFileWithOptions$0(ChecksumFileSystem.java:846)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at org.apache.hadoop.util.LambdaUtils.eval(LambdaUtils.java:52) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.openFileWithOptions(ChecksumFileSystem.java:845)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.FileSystem$FSDataInputStreamBuilder.build(FileSystem.java:4522)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:115) 
> ~[hadoop-mapreduce-client-core-3.1.1.6.0.99.0-135.jar:?]{code}
> We need to add appropriate path-swizzling wrappers for the new APIs in 
> ProxyFileSystem23.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-21683) ProxyFileSystem breaks with Hadoop trunk

2019-05-02 Thread Todd Lipcon (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-21683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16832242#comment-16832242
 ] 

Todd Lipcon commented on HIVE-21683:


Attached is one approach that uses javassist to dynamically wrap any newly 
added methods for ProxyFileSystem23. I verified this fixes the issue.

I'll also shortly attach a more "straightforward" approach which just adds 
the new method. The problem with this latter approach is that it won't compile 
against Hadoop 3.1, since the new methods only exist in Hadoop 3.3 (not yet 
released). We could just wait until that is released before committing if we 
want to go with the simpler approach, though.
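
To make the trade-off concrete, a rough sketch of the javassist idea is below. It is only an illustration under assumptions: the generator class and the single hard-coded override are hypothetical, a real generator (as in hive-21683-javassist.patch) would discover the FileSystem methods to wrap via reflection rather than listing them by hand, and the swizzleParamPath call assumes the protected helper ProxyFileSystem already provides.

{code}
// Rough sketch: use javassist to generate a FileSystem subclass at runtime that
// adds swizzling overrides for methods the hand-written proxy does not cover.
// Class names and the hard-coded method body are illustrative assumptions.
import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtMethod;
import javassist.CtNewMethod;

public class SwizzleOverrideGenerator {

  public static Class<?> generate() throws Exception {
    ClassPool pool = ClassPool.getDefault();
    CtClass base = pool.get("org.apache.hadoop.fs.ProxyFileSystem");
    CtClass sub = pool.makeClass(
        "org.apache.hadoop.fs.GeneratedProxyFileSystem", base);

    // A real implementation would reflect over FileSystem's public methods and
    // emit an override only for those the proxy does not already wrap; open()
    // is used here purely to show the shape of the generated source.
    CtMethod open = CtNewMethod.make(
        "public org.apache.hadoop.fs.FSDataInputStream open("
            + "org.apache.hadoop.fs.Path f, int bufferSize)"
            + " throws java.io.IOException {"
            + " return super.open(swizzleParamPath(f), bufferSize); }",
        sub);
    sub.addMethod(open);

    // Load the generated class; callers would instantiate it in place of the
    // hand-written proxy so newly added Hadoop APIs get wrapped automatically.
    return sub.toClass();
  }
}
{code}

The attraction of generating the overrides is exactly what the comment describes: the shims keep working even when a newer Hadoop adds entry points the hand-written proxy has never heard of, at the cost of some runtime bytecode generation.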

> ProxyFileSystem breaks with Hadoop trunk
> 
>
> Key: HIVE-21683
> URL: https://issues.apache.org/jira/browse/HIVE-21683
> Project: Hive
>  Issue Type: Bug
>Reporter: Todd Lipcon
>Assignee: Todd Lipcon
>Priority: Major
> Attachments: hive-21683-javassist.patch
>
>
> When trying to run with a recent build of Hadoop which includes HADOOP-15229, 
> I ran into the following stack trace:
> {code}
> Caused by: java.lang.IllegalArgumentException: Wrong FS: 
> pfile:/src/hive/itests/qtest/target/warehouse/src/kv1.txt, expected: file:///
> at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:793) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:86)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:636)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:930)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:456)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:153)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:354) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.lambda$openFileWithOptions$0(ChecksumFileSystem.java:846)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at org.apache.hadoop.util.LambdaUtils.eval(LambdaUtils.java:52) 
> ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.ChecksumFileSystem.openFileWithOptions(ChecksumFileSystem.java:845)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.fs.FileSystem$FSDataInputStreamBuilder.build(FileSystem.java:4522)
>  ~[hadoop-common-3.1.1.6.0.99.0-135.jar:?]
> at 
> org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:115) 
> ~[hadoop-mapreduce-client-core-3.1.1.6.0.99.0-135.jar:?]{code}
> We need to add appropriate path-swizzling wrappers for the new APIs in 
> ProxyFileSystem23.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)