[GitHub] [hadoop] hadoop-yetus commented on pull request #2784: HDFS-15850. Superuser actions should be reported to external enforcers
hadoop-yetus commented on pull request #2784: URL: https://github.com/apache/hadoop/pull/2784#issuecomment-801659165

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 52s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 13m 57s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 23m 3s | | trunk passed |
| +1 :green_heart: | compile | 5m 22s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 4m 37s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | checkstyle | 1m 16s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 58s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 30s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 2m 16s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 4m 25s | | trunk passed |
| +1 :green_heart: | shadedclient | 16m 48s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 21s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 44s | | the patch passed |
| +1 :green_heart: | compile | 5m 0s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 5m 0s | | the patch passed |
| +1 :green_heart: | compile | 4m 34s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | javac | 4m 34s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 1m 9s | [/results-checkstyle-hadoop-hdfs-project.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2784/1/artifact/out/results-checkstyle-hadoop-hdfs-project.txt) | hadoop-hdfs-project: The patch generated 2 new + 317 unchanged - 1 fixed = 319 total (was 318) |
| +1 :green_heart: | mvnsite | 1m 47s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 16s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 2m 4s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 4m 38s | | the patch passed |
| +1 :green_heart: | shadedclient | 16m 50s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 371m 21s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2784/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| -1 :x: | unit | 25m 48s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2784/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt) | hadoop-hdfs-rbf in the patch passed. |
| +1 :green_heart: | asflicense | 0m 47s | | The patch does not generate ASF License warnings. |
| | | | 515m 17s | |

| Reason | Tests |
|---:|:--|
| Failed junit tests | hadoop.hdfs.server.datanode.TestBlockScanner |
| | hadoop.fs.viewfs.TestViewFileSystemOverloadSchemeWithHdfsScheme |
| | hadoop.hdfs.TestViewDistributedFileSystemWithMountLinks |
| | hadoop.hdfs.server.namenode.ha.TestBootstrapStandby |
| | hadoop.hdfs.TestPersistBlocks |
| | hadoop.hdfs.server.namenode.ha.TestEditLogTailer |
| | hadoop.hdfs.TestDFSShell |
| | hadoop.hdfs.server.namenode.snapshot.TestNestedSnapshots |
| | hadoop.hdfs.server.datanode.TestIncrementalBrVariations |
| | hadoop.hdfs.server.datanode.fsdataset.impl.TestFsVolumeList |
| | hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes |
| | hadoop.hdfs.server.namenode.TestDecommissioningStatusWithBackoffMonitor |
| | hadoop.hdfs.server.namenode.TestDecommissioningStatus |
| | hadoop.hdfs.server
[GitHub] [hadoop] GauthamBanasandra commented on pull request #2783: HDFS-15903. Refactor X-Platform lib
GauthamBanasandra commented on pull request #2783: URL: https://github.com/apache/hadoop/pull/2783#issuecomment-801604494

The warnings seen in the above run aren't related to my PR.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] GauthamBanasandra commented on pull request #2783: HDFS-15903. Refactor X-Platform lib
GauthamBanasandra commented on pull request #2783: URL: https://github.com/apache/hadoop/pull/2783#issuecomment-801604350

X-Platform started out as a utility to help in writing cross-platform code in Hadoop. As its scope expands to cover various scenarios, it is necessary to refactor it at this early stage to support the proper organization and growth of the X-Platform library.
[jira] [Work logged] (HADOOP-17578) Improve UGI debug log to help troubleshooting TokenCache related issues
[ https://issues.apache.org/jira/browse/HADOOP-17578?focusedWorklogId=568076&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-568076 ]

ASF GitHub Bot logged work on HADOOP-17578:
---
Author: ASF GitHub Bot
Created on: 18/Mar/21 03:49
Start Date: 18/Mar/21 03:49
Worklog Time Spent: 10m

Work Description: aajisaka commented on pull request #2762: URL: https://github.com/apache/hadoop/pull/2762#issuecomment-801599351

> Java doc issue is unrelated.

```
[ERROR] /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-2762/src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java:1923: error: exception not thrown: java.io.IOException
[ERROR]  * @throws IOException
[ERROR]    ^
```

`throws IOException` has been removed by the patch, so the javadoc issue is related. @xiaoyuyao @cxorm Would you please fix this?

Issue Time Tracking
---
Worklog Id: (was: 568076)
Time Spent: 1.5h (was: 1h 20m)

> Improve UGI debug log to help troubleshooting TokenCache related issues
> -----------------------------------------------------------------------
> Key: HADOOP-17578
> URL: https://issues.apache.org/jira/browse/HADOOP-17578
> Project: Hadoop Common
> Issue Type: Bug
> Affects Versions: 3.2.0
> Reporter: Xiaoyu Yao
> Assignee: Xiaoyu Yao
> Priority: Major
> Labels: pull-request-available
> Fix For: 3.4.0
> Time Spent: 1.5h
> Remaining Estimate: 0h
>
> We have seen some issues around TokenCache getDelegationToken failures even though the UGI already has a valid token. The tricky part is that the token map is keyed by the canonical service name, which can differ from the actual service field in the token, e.g. for a KMS token in the HA case. The current UGI log dumps all the tokens but not the keys of the token map. This ticket is opened to include the complete token map information in the debug log.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
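To make the described key/service mismatch concrete, here is a self-contained sketch. It models the UGI token map as a plain `Map` (not Hadoop's actual `Credentials`/`Token` classes, and all service names below are hypothetical), showing why a debug dump should print the map keys alongside the tokens: the canonical-service-name key used for lookups can differ from the token's own service field.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch only: the token map is modeled as
// Map<canonical service name, token service field>. Dumping entries as
// "key -> service" makes it visible when the lookup key differs from the
// token's service (e.g. a KMS token behind an HA endpoint), which a
// values-only dump hides.
public class TokenMapDump {

    /** Render each map entry as "key -> service", one per line. */
    static String dumpWithKeys(Map<String, String> tokenMap) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : tokenMap.entrySet()) {
            sb.append(e.getKey()).append(" -> ").append(e.getValue()).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> tokenMap = new LinkedHashMap<>();
        // Hypothetical values: the lookup key (canonical service name)
        // does not match the token's service field.
        tokenMap.put("kms://https@kms-ha.example.com:9600/kms",
                     "kms-1.example.com:9600");
        System.out.print(dumpWithKeys(tokenMap));
    }
}
```

With both strings in the output, a TokenCache lookup failure can be traced to the key, not just the token contents.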
[GitHub] [hadoop] aajisaka commented on pull request #2762: HADOOP-17578. Improve UGI debug log to help troubleshooting TokenCach…
aajisaka commented on pull request #2762: URL: https://github.com/apache/hadoop/pull/2762#issuecomment-801599351

> Java doc issue is unrelated.

```
[ERROR] /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-2762/src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java:1923: error: exception not thrown: java.io.IOException
[ERROR]  * @throws IOException
[ERROR]    ^
```

`throws IOException` has been removed by the patch, so the javadoc issue is related. @xiaoyuyao @cxorm Would you please fix this?
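The javadoc failure above is the standard "exception not thrown" check: a `@throws` tag left behind after the `throws` clause was removed from the signature. A minimal, hypothetical illustration (not the actual `UserGroupInformation` code):

```java
// Hypothetical minimal example of the javadoc error Yetus reported.
// If a patch removes `throws IOException` from a method's signature but
// leaves an `@throws IOException` tag in its javadoc, the javadoc tool
// fails with "error: exception not thrown: java.io.IOException".
public class JavadocThrowsExample {

    /**
     * Returns the current user's login name.
     *
     * A stale {@code @throws java.io.IOException} tag here would now be
     * rejected, because the method no longer declares that exception;
     * the fix is to delete the tag along with the throws clause.
     */
    public static String getUserName() {
        return System.getProperty("user.name");
    }

    public static void main(String[] args) {
        System.out.println(getUserName());
    }
}
```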
[GitHub] [hadoop] hadoop-yetus commented on pull request #2736: HDFS-15868. Possible Resource Leak in EditLogFileOutputStream
hadoop-yetus commented on pull request #2736: URL: https://github.com/apache/hadoop/pull/2736#issuecomment-801579068

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 1m 24s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 40m 43s | | trunk passed |
| +1 :green_heart: | compile | 1m 32s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 1m 22s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | checkstyle | 1m 9s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 32s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 2s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 32s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 3m 51s | | trunk passed |
| +1 :green_heart: | shadedclient | 21m 55s | | branch has no errors when building and testing our client artifacts. |
| -0 :warning: | patch | 22m 15s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 29s | | the patch passed |
| +1 :green_heart: | compile | 1m 39s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 1m 39s | | the patch passed |
| +1 :green_heart: | compile | 1m 26s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | javac | 1m 26s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 3s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 25s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 58s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 34s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 4m 26s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 29s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 366m 41s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2736/6/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. |
| | | | 476m 24s | |

| Reason | Tests |
|---:|:--|
| Failed junit tests | hadoop.hdfs.server.datanode.TestDirectoryScanner |
| | hadoop.hdfs.TestPersistBlocks |
| | hadoop.hdfs.server.namenode.snapshot.TestNestedSnapshots |
| | hadoop.hdfs.server.datanode.TestBlockScanner |
| | hadoop.hdfs.server.namenode.TestFileTruncate |
| | hadoop.hdfs.server.namenode.ha.TestEditLogTailer |
| | hadoop.hdfs.server.namenode.snapshot.TestINodeFileUnderConstructionWithSnapshot |
| | hadoop.hdfs.TestStateAlignmentContextWithHA |
| | hadoop.hdfs.TestSnapshotCommands |
| | hadoop.hdfs.server.namenode.TestDecommissioningStatusWithBackoffMonitor |
| | hadoop.hdfs.server.namenode.TestDecommissioningStatus |
| | hadoop.hdfs.TestDFSShell |
| | hadoop.hdfs.server.namenode.TestAddOverReplicatedStripedBlocks |
| | hadoop.hdfs.server.datanode.fsdataset.impl.TestFsVolumeList |
| | hadoop.hdfs.server.namenode.ha.TestBootstrapStandby |
| | hadoop.hdfs.server.datanode.TestIncrementalBrVariations |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2736/6/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2736 |
| Optional Tests | dupname asflicense compile javac j
[jira] [Work logged] (HADOOP-16202) Stabilize openFile() and adopt internally
[ https://issues.apache.org/jira/browse/HADOOP-16202?focusedWorklogId=568043&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-568043 ]

ASF GitHub Bot logged work on HADOOP-16202:
---
Author: ASF GitHub Bot
Created on: 18/Mar/21 01:59
Start Date: 18/Mar/21 01:59
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on pull request #2584: URL: https://github.com/apache/hadoop/pull/2584#issuecomment-801558206

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 39s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 2s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 16 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 19s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 20m 8s | | trunk passed |
| +1 :green_heart: | compile | 20m 58s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 18m 2s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | checkstyle | 3m 55s | | trunk passed |
| +1 :green_heart: | mvnsite | 7m 24s | | trunk passed |
| -1 :x: | javadoc | 1m 4s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in trunk failed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04. |
| +1 :green_heart: | javadoc | 6m 43s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 11m 6s | | trunk passed |
| +1 :green_heart: | shadedclient | 14m 27s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 25s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 4m 25s | | the patch passed |
| +1 :green_heart: | compile | 20m 1s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 20m 1s | | the patch passed |
| +1 :green_heart: | compile | 17m 59s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | javac | 17m 59s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/blanks-eol.txt) | The patch has 11 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| -0 :warning: | checkstyle | 4m 26s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/results-checkstyle-root.txt) | root: The patch generated 3 new + 821 unchanged - 2 fixed = 824 total (was 823) |
| +1 :green_heart: | mvnsite | 7m 29s | | the patch passed |
| -1 :x: | javadoc | 1m 3s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04. |
| +1 :green_heart: | javadoc | 6m 40s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 12m 38s | | the patch passed |
| +1 :green_heart: | shadedclient | 14m 39s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 18m 53s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 4m 55s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | unit | 7m 10s | | hadoop-mapreduce-client-core in the patch passed. |
[GitHub] [hadoop] hadoop-yetus commented on pull request #2584: HADOOP-16202. Enhance openFile()
hadoop-yetus commented on pull request #2584: URL: https://github.com/apache/hadoop/pull/2584#issuecomment-801558206

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 39s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 2s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 16 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 19s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 20m 8s | | trunk passed |
| +1 :green_heart: | compile | 20m 58s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 18m 2s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | checkstyle | 3m 55s | | trunk passed |
| +1 :green_heart: | mvnsite | 7m 24s | | trunk passed |
| -1 :x: | javadoc | 1m 4s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in trunk failed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04. |
| +1 :green_heart: | javadoc | 6m 43s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 11m 6s | | trunk passed |
| +1 :green_heart: | shadedclient | 14m 27s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 25s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 4m 25s | | the patch passed |
| +1 :green_heart: | compile | 20m 1s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 20m 1s | | the patch passed |
| +1 :green_heart: | compile | 17m 59s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | javac | 17m 59s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/blanks-eol.txt) | The patch has 11 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| -0 :warning: | checkstyle | 4m 26s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/results-checkstyle-root.txt) | root: The patch generated 3 new + 821 unchanged - 2 fixed = 824 total (was 823) |
| +1 :green_heart: | mvnsite | 7m 29s | | the patch passed |
| -1 :x: | javadoc | 1m 3s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04. |
| +1 :green_heart: | javadoc | 6m 40s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 12m 38s | | the patch passed |
| +1 :green_heart: | shadedclient | 14m 39s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 18m 53s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 4m 55s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | unit | 7m 10s | | hadoop-mapreduce-client-core in the patch passed. |
| +1 :green_heart: | unit | 8m 36s | | hadoop-mapreduce-client-app in the patch passed. |
| -1 :x: | unit | 15m 46s | [/patch-unit-hadoop-tools_hadoop-distcp.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2584/3/artifact/out/patch-unit-hadoop-tools_hadoop-distcp.txt) | hadoop-distcp in the patch passed. |
| +1 :green_heart: | unit | 0m 57s | | hadoop-mapreduce-examples in the patch p
[jira] [Work logged] (HADOOP-17511) Add an Audit plugin point for S3A auditing/context
[ https://issues.apache.org/jira/browse/HADOOP-17511?focusedWorklogId=567994&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567994 ]

ASF GitHub Bot logged work on HADOOP-17511:
---
Author: ASF GitHub Bot
Created on: 17/Mar/21 21:49
Start Date: 17/Mar/21 21:49
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on pull request #2675: URL: https://github.com/apache/hadoop/pull/2675#issuecomment-801465358

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 1m 4s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 37 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 0s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 23m 14s | | trunk passed |
| +1 :green_heart: | compile | 22m 29s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 18m 47s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | checkstyle | 4m 4s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 17s | | trunk passed |
| -1 :x: | javadoc | 0m 58s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in trunk failed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04. |
| +1 :green_heart: | javadoc | 2m 7s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 3m 32s | | trunk passed |
| +1 :green_heart: | shadedclient | 16m 50s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 20s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 26s | | the patch passed |
| +1 :green_heart: | compile | 21m 40s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| -1 :x: | javac | 21m 40s | [/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 generated 1 new + 1955 unchanged - 1 fixed = 1956 total (was 1956) |
| +1 :green_heart: | compile | 18m 52s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| -1 :x: | javac | 18m 52s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 generated 1 new + 1850 unchanged - 1 fixed = 1851 total (was 1851) |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/blanks-eol.txt) | The patch has 3 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| -0 :warning: | checkstyle | 3m 52s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/results-checkstyle-root.txt) | root: The patch generated 48 new + 192 unchanged - 7 fixed = 240 total (was 199) |
| +1 :green_heart: | mvnsite | 2m 14s | | the patch passed |
| +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. |
| -1 :x: | javadoc | 0m 57s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in the p
[GitHub] [hadoop] hadoop-yetus commented on pull request #2675: HADOOP-17511. Add audit/telemetry logging to S3A connector
hadoop-yetus commented on pull request #2675: URL: https://github.com/apache/hadoop/pull/2675#issuecomment-801465358

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 1m 4s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 37 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 0s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 23m 14s | | trunk passed |
| +1 :green_heart: | compile | 22m 29s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 18m 47s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | checkstyle | 4m 4s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 17s | | trunk passed |
| -1 :x: | javadoc | 0m 58s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in trunk failed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04. |
| +1 :green_heart: | javadoc | 2m 7s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 3m 32s | | trunk passed |
| +1 :green_heart: | shadedclient | 16m 50s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 20s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 26s | | the patch passed |
| +1 :green_heart: | compile | 21m 40s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| -1 :x: | javac | 21m 40s | [/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 generated 1 new + 1955 unchanged - 1 fixed = 1956 total (was 1956) |
| +1 :green_heart: | compile | 18m 52s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| -1 :x: | javac | 18m 52s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 generated 1 new + 1850 unchanged - 1 fixed = 1851 total (was 1851) |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/blanks-eol.txt) | The patch has 3 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| -0 :warning: | checkstyle | 3m 52s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/results-checkstyle-root.txt) | root: The patch generated 48 new + 192 unchanged - 7 fixed = 240 total (was 199) |
| +1 :green_heart: | mvnsite | 2m 14s | | the patch passed |
| +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. |
| -1 :x: | javadoc | 0m 57s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04. |
| -1 :x: | javadoc | 0m 37s | [/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/28/artifact/out/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | hadoop-aws in the patch failed with JDK Private
[GitHub] [hadoop] vivekratnavel opened a new pull request #2784: HDFS-15850. Superuser actions should be reported to external enforcers
vivekratnavel opened a new pull request #2784: URL: https://github.com/apache/hadoop/pull/2784 https://issues.apache.org/jira/browse/HDFS-15850 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #2783: HDFS-15903. Refactor X-Platform lib
hadoop-yetus commented on pull request #2783: URL: https://github.com/apache/hadoop/pull/2783#issuecomment-801453336 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 9s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 38m 23s | | trunk passed | | +1 :green_heart: | compile | 3m 16s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 3m 27s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | mvnsite | 0m 27s | | trunk passed | | +1 :green_heart: | shadedclient | 63m 57s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 16s | | the patch passed | | +1 :green_heart: | compile | 3m 3s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | -1 :x: | cc | 3m 3s | [/results-compile-cc-hadoop-hdfs-project_hadoop-hdfs-native-client-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2783/1/artifact/out/results-compile-cc-hadoop-hdfs-project_hadoop-hdfs-native-client-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | hadoop-hdfs-project_hadoop-hdfs-native-client-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 generated 14 new + 79 unchanged - 14 fixed = 93 total (was 93) | | +1 :green_heart: | golang | 3m 3s | | the patch passed | | +1 :green_heart: | javac | 3m 3s | | the patch passed | | +1 :green_heart: | compile | 3m 4s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | -1 :x: | cc | 3m 4s | [/results-compile-cc-hadoop-hdfs-project_hadoop-hdfs-native-client-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2783/1/artifact/out/results-compile-cc-hadoop-hdfs-project_hadoop-hdfs-native-client-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | hadoop-hdfs-project_hadoop-hdfs-native-client-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 generated 8 new + 85 unchanged - 8 fixed = 93 total (was 93) | | +1 :green_heart: | golang | 3m 4s | | the patch passed | | +1 :green_heart: | javac | 3m 4s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 17s | | the patch passed | | +1 :green_heart: | shadedclient | 17m 49s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | unit | 124m 46s | | hadoop-hdfs-native-client in the patch passed. | | +1 :green_heart: | asflicense | 0m 29s | | The patch does not generate ASF License warnings. | | | | 217m 40s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2783/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2783 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell golang | | uname | Linux d705dc079fad 4.15.0-126-generic #129-Ubuntu SMP Mon Nov 23 18:53:38 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 5bc1eafec3c1b6cd2add4e88e50b557a1ddfecef | | Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2783/1/testReport/ | | Max. process+thread count | 514 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client | | Console output | https://ci-hadoop.apach
[GitHub] [hadoop] hadoop-yetus commented on pull request #2782: HDFS-15901. Solve the problem of DN repeated block reports occupying too many RPCs during Safemode.
hadoop-yetus commented on pull request #2782: URL: https://github.com/apache/hadoop/pull/2782#issuecomment-801439442 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 53s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 34m 43s | | trunk passed | | +1 :green_heart: | compile | 1m 20s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 1m 12s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | checkstyle | 1m 1s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 21s | | trunk passed | | +1 :green_heart: | javadoc | 0m 52s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 27s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | spotbugs | 3m 16s | | trunk passed | | +1 :green_heart: | shadedclient | 18m 48s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 11s | | the patch passed | | +1 :green_heart: | compile | 1m 15s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 1m 15s | | the patch passed | | +1 :green_heart: | compile | 1m 7s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | javac | 1m 7s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | -0 :warning: | checkstyle | 0m 54s | [/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2782/1/artifact/out/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs-project/hadoop-hdfs: The patch generated 6 new + 164 unchanged - 0 fixed = 170 total (was 164) | | +1 :green_heart: | mvnsite | 1m 13s | | the patch passed | | +1 :green_heart: | javadoc | 0m 46s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 22s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | spotbugs | 3m 20s | | the patch passed | | +1 :green_heart: | shadedclient | 18m 19s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 346m 13s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2782/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 0m 44s | | The patch does not generate ASF License warnings. 
| | | | 438m 36s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.server.namenode.ha.TestBootstrapStandby | | | hadoop.hdfs.server.namenode.snapshot.TestNestedSnapshots | | | hadoop.hdfs.server.namenode.TestFileTruncate | | | hadoop.hdfs.TestHDFSFileSystemContract | | | hadoop.hdfs.server.namenode.ha.TestEditLogTailer | | | hadoop.hdfs.TestLeaseRecovery2 | | | hadoop.hdfs.TestPersistBlocks | | | hadoop.hdfs.server.namenode.ha.TestPipelinesFailover | | | hadoop.hdfs.TestDFSShell | | | hadoop.hdfs.server.datanode.TestDirectoryScanner | | | hadoop.hdfs.server.namenode.TestDecommissioningStatusWithBackoffMonitor | | | hadoop.hdfs.server.datanode.fsdataset.impl.TestFsVolumeList | | | hadoop.fs.viewfs.TestViewFileSystemOverloadSchemeWithHdfsScheme | | | hadoop.hdfs.server.namenode.TestDecommissioningStatus | | | hadoop.hdfs.server.datanode.TestBlockScanner | | | hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2782/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2782 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codesp
[GitHub] [hadoop] Nargeshdb commented on pull request #2736: HDFS-15868. Possible Resource Leak in EditLogFileOutputStream
Nargeshdb commented on pull request #2736: URL: https://github.com/apache/hadoop/pull/2736#issuecomment-801362682 I've checked the failed unit tests and I don't think they're related to this PR. I have merged in the latest trunk again. Thanks. @Hexiaoqiao
[jira] [Work logged] (HADOOP-17531) DistCp: Reduce memory usage on copying huge directories
[ https://issues.apache.org/jira/browse/HADOOP-17531?focusedWorklogId=567923&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567923 ] ASF GitHub Bot logged work on HADOOP-17531: --- Author: ASF GitHub Bot Created on: 17/Mar/21 19:21 Start Date: 17/Mar/21 19:21 Worklog Time Spent: 10m Work Description: ayushtkn commented on pull request #2732: URL: https://github.com/apache/hadoop/pull/2732#issuecomment-801347166 @steveloughran any further comments? Issue Time Tracking --- Worklog Id: (was: 567923) Time Spent: 5h 50m (was: 5h 40m)
> DistCp: Reduce memory usage on copying huge directories
> Key: HADOOP-17531
> URL: https://issues.apache.org/jira/browse/HADOOP-17531
> Project: Hadoop Common
> Issue Type: Improvement
> Reporter: Ayush Saxena
> Assignee: Ayush Saxena
> Priority: Critical
> Labels: pull-request-available
> Attachments: MoveToStackIterator.patch, gc-NewD-512M-3.8ML.log
> Time Spent: 5h 50m
> Remaining Estimate: 0h
>
> Presently DistCp uses a producer-consumer kind of setup while building the listing; the input queue and output queue are both unbounded, so the listStatus results grow quite large.
> Relevant code: https://github.com/apache/hadoop/blob/trunk/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/SimpleCopyListing.java#L635
> This does a breadth-first traversal (it uses a queue instead of the earlier stack), so if you have files at lower depth it will likely open up the entire tree before it starts processing.
-- This message was sent by Atlassian Jira (v8.3.4#803005)
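The unbounded breadth-first listing described above is what the attached MoveToStackIterator patch targets: with an explicit stack (depth-first), pending work stays proportional to tree depth rather than tree width. A minimal sketch of that idea, with illustrative names and a toy in-memory tree rather than the DistCp API:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;
import java.util.Map;

// Depth-first listing with an explicit stack: unlike the breadth-first
// queue, the stack holds at most one "frontier" per level of the path,
// so it does not open up the entire tree before processing starts.
// The directory tree here is a hypothetical in-memory map, not HDFS.
class StackListing {
    static void list(Map<String, List<String>> tree, String root,
                     List<String> out) {
        Deque<String> stack = new ArrayDeque<>();
        stack.push(root);
        while (!stack.isEmpty()) {
            String path = stack.pop();
            out.add(path);                       // "process" the entry
            for (String child : tree.getOrDefault(path, List.of())) {
                stack.push(child);               // defer children
            }
        }
    }
}
```

In the real listing code the `out.add` step would write the copy listing entry and directory children would come from `listStatus`; this sketch only shows why the stack bounds memory where the queue does not.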
[GitHub] [hadoop] ayushtkn commented on pull request #2732: HADOOP-17531. DistCp: Reduce memory usage on copying huge directories.
ayushtkn commented on pull request #2732: URL: https://github.com/apache/hadoop/pull/2732#issuecomment-801347166 @steveloughran any further comments?
[jira] [Work logged] (HADOOP-17511) Add an Audit plugin point for S3A auditing/context
[ https://issues.apache.org/jira/browse/HADOOP-17511?focusedWorklogId=567885&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567885 ] ASF GitHub Bot logged work on HADOOP-17511: --- Author: ASF GitHub Bot Created on: 17/Mar/21 18:25 Start Date: 17/Mar/21 18:25 Worklog Time Spent: 10m Work Description: steveloughran commented on pull request #2675: URL: https://github.com/apache/hadoop/pull/2675#issuecomment-801310714 This PR is (and will repeatedly be) rebased on top of #2778; as that gets revised, this will be force-pushed. The latest PR pulls out getContentStatus, as it turns out some apps use it too much. The stats and logs will now make clear how much IO it uses. Issue Time Tracking --- Worklog Id: (was: 567885) Time Spent: 9h 10m (was: 9h)
> Add an Audit plugin point for S3A auditing/context
> Key: HADOOP-17511
> URL: https://issues.apache.org/jira/browse/HADOOP-17511
> Project: Hadoop Common
> Issue Type: Sub-task
> Affects Versions: 3.3.1
> Reporter: Steve Loughran
> Assignee: Steve Loughran
> Priority: Major
> Labels: pull-request-available
> Time Spent: 9h 10m
> Remaining Estimate: 0h
>
> Add a way for auditing tools to correlate S3 object calls with Hadoop FS API calls.
> Initially just to log/forward to an auditing service.
> Later: let us attach them as parameters in S3 requests, such as opentrace headers or (my initial idea) the HTTP referrer header, where it will get into the log.
> Challenges:
> * ensuring the audit span is created for every public entry point. That will have to include those used in s3guard tools, some de facto public APIs
> * not re-entering active spans: S3A code must not call back into the FS API points
> * propagation across worker threads
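The entry-point and re-entrancy challenges listed above can be sketched with a thread-local "active span" that an entry point activates once and nested internal calls reuse instead of re-entering. All names here are hypothetical illustrations, not the HADOOP-17511 API:

```java
// Minimal sketch of an audit-span plugin point: each public entry
// point calls enter(); a call that arrives while a span is already
// active (i.e. S3A code calling back into an FS API point) reuses the
// outer span rather than opening a nested one.
class AuditContext {
    private static final ThreadLocal<String> ACTIVE = new ThreadLocal<>();

    /** Activate a span for an entry point unless one is already active. */
    static String enter(String operation) {
        String current = ACTIVE.get();
        if (current != null) {
            return current;                     // re-entrant: keep outer span
        }
        String span = operation + "-" + System.nanoTime();
        ACTIVE.set(span);
        return span;
    }

    /** Deactivate the current span at the end of the entry point. */
    static void exit() {
        ACTIVE.remove();
    }

    static String current() {
        return ACTIVE.get();
    }
}
```

The third challenge (worker threads) is exactly what a plain `ThreadLocal` does *not* solve: a span started on the caller's thread would have to be captured and re-activated inside submitted tasks.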
[GitHub] [hadoop] steveloughran commented on pull request #2675: HADOOP-17511. Add audit/telemetry logging to S3A connector
steveloughran commented on pull request #2675: URL: https://github.com/apache/hadoop/pull/2675#issuecomment-801310714 This PR is (and will repeatedly be) rebased on top of #2778; as that gets revised, this will be force-pushed. The latest PR pulls out getContentStatus, as it turns out some apps use it too much. The stats and logs will now make clear how much IO it uses.
[jira] [Work logged] (HADOOP-13551) Collect AwsSdkMetrics in S3A FileSystem IOStatistics
[ https://issues.apache.org/jira/browse/HADOOP-13551?focusedWorklogId=567883&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567883 ] ASF GitHub Bot logged work on HADOOP-13551: --- Author: ASF GitHub Bot Created on: 17/Mar/21 18:23 Start Date: 17/Mar/21 18:23 Worklog Time Spent: 10m Work Description: steveloughran commented on pull request #2778: URL: https://github.com/apache/hadoop/pull/2778#issuecomment-801308539
```
[WARNING] The requested profile "docs" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork (default-cli) on project hadoop-aws: An error has occurred in Javadoc report generation:
[ERROR] Exit code: 1 - /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-2778/src/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:71: error: reference not found
[ERROR]  * and then invoking {@link #buildAmazonS3Client(AWSCredentialsProvider, ClientConfiguration, S3ClientCreationParameters, String, boolean)}
```
Issue Time Tracking --- Worklog Id: (was: 567883) Time Spent: 3h (was: 2h 50m)
> Collect AwsSdkMetrics in S3A FileSystem IOStatistics
> Key: HADOOP-13551
> URL: https://issues.apache.org/jira/browse/HADOOP-13551
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.0.0-beta1
> Reporter: Steve Loughran
> Assignee: Steve Loughran
> Priority: Major
> Labels: pull-request-available
> Time Spent: 3h
> Remaining Estimate: 0h
>
> The S3A Connector has the ability to pass statistics collected by the AWS SDK into the IOStatistics store of the (stream, FS).
> But:
> * wiring up doesn't (yet) work
> * it's best if there was thread-context-level collection, though FS-level statistics would be a good start.
[GitHub] [hadoop] steveloughran commented on pull request #2778: HADOOP-13551. AWS metrics wire-up
steveloughran commented on pull request #2778: URL: https://github.com/apache/hadoop/pull/2778#issuecomment-801308539
```
[WARNING] The requested profile "docs" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork (default-cli) on project hadoop-aws: An error has occurred in Javadoc report generation:
[ERROR] Exit code: 1 - /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-2778/src/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:71: error: reference not found
[ERROR]  * and then invoking {@link #buildAmazonS3Client(AWSCredentialsProvider, ClientConfiguration, S3ClientCreationParameters, String, boolean)}
```
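The "reference not found" failure above means the `{@link}` tag names a method signature that javadoc cannot resolve in that class; the fix is to make the tag's signature match a method that actually exists (or fully qualify it). A hypothetical class showing a link target that does resolve, because the tag's parameter list matches the real method below it:

```java
// Hypothetical demo (not the DefaultS3ClientFactory code): the {@link}
// tag resolves because #build(String) names an existing method with
// exactly that parameter list. If the method were renamed or its
// parameters changed without updating the tag, javadoc would fail with
// the same "error: reference not found" seen in the build log.
class LinkDemo {
    /**
     * Builds a widget; see also {@link #build(String)} for the entry point.
     */
    static String build(String name) {
        return "widget:" + name;
    }
}
```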
[jira] [Work logged] (HADOOP-13551) Collect AwsSdkMetrics in S3A FileSystem IOStatistics
[ https://issues.apache.org/jira/browse/HADOOP-13551?focusedWorklogId=567882&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567882 ] ASF GitHub Bot logged work on HADOOP-13551: --- Author: ASF GitHub Bot Created on: 17/Mar/21 18:22 Start Date: 17/Mar/21 18:22 Worklog Time Spent: 10m Work Description: hadoop-yetus removed a comment on pull request #2778: URL: https://github.com/apache/hadoop/pull/2778#issuecomment-800516477 Issue Time Tracking --- Worklog Id: (was: 567882) Time Spent: 2h 50m (was: 2h 40m)
> Collect AwsSdkMetrics in S3A FileSystem IOStatistics
> Key: HADOOP-13551
> URL: https://issues.apache.org/jira/browse/HADOOP-13551
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.0.0-beta1
> Reporter: Steve Loughran
> Assignee: Steve Loughran
> Priority: Major
> Labels: pull-request-available
> Time Spent: 2h 50m
> Remaining Estimate: 0h
>
> The S3A Connector has the ability to pass statistics collected by the AWS SDK into the IOStatistics store of the (stream, FS).
> But:
> * wiring up doesn't (yet) work
> * it's best if there was thread-context-level collection, though FS-level statistics would be a good start.
[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #2778: HADOOP-13551. AWS metrics wire-up
hadoop-yetus removed a comment on pull request #2778: URL: https://github.com/apache/hadoop/pull/2778#issuecomment-800516477
[jira] [Work logged] (HADOOP-13551) Collect AwsSdkMetrics in S3A FileSystem IOStatistics
[ https://issues.apache.org/jira/browse/HADOOP-13551?focusedWorklogId=567881&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567881 ] ASF GitHub Bot logged work on HADOOP-13551: --- Author: ASF GitHub Bot Created on: 17/Mar/21 18:21 Start Date: 17/Mar/21 18:21 Worklog Time Spent: 10m Work Description: hadoop-yetus removed a comment on pull request #2778: URL: https://github.com/apache/hadoop/pull/2778#issuecomment-799727114 Issue Time Tracking --- Worklog Id: (was: 567881) Time Spent: 2h 40m (was: 2.5h)
> Collect AwsSdkMetrics in S3A FileSystem IOStatistics
> Key: HADOOP-13551
> URL: https://issues.apache.org/jira/browse/HADOOP-13551
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.0.0-beta1
> Reporter: Steve Loughran
> Assignee: Steve Loughran
> Priority: Major
> Labels: pull-request-available
> Time Spent: 2h 40m
> Remaining Estimate: 0h
>
> The S3A Connector has the ability to pass statistics collected by the AWS SDK into the IOStatistics store of the (stream, FS).
> But:
> * wiring up doesn't (yet) work
> * it's best if there was thread-context-level collection, though FS-level statistics would be a good start.
[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #2778: HADOOP-13551. AWS metrics wire-up
hadoop-yetus removed a comment on pull request #2778: URL: https://github.com/apache/hadoop/pull/2778#issuecomment-799727114
[jira] [Work logged] (HADOOP-16819) Possible inconsistent state of AbstractDelegationTokenSecretManager
[ https://issues.apache.org/jira/browse/HADOOP-16819?focusedWorklogId=567880&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567880 ] ASF GitHub Bot logged work on HADOOP-16819: --- Author: ASF GitHub Bot Created on: 17/Mar/21 18:19 Start Date: 17/Mar/21 18:19 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #1894: URL: https://github.com/apache/hadoop/pull/1894#issuecomment-801306561 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 11s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 38m 6s | | trunk passed | | +1 :green_heart: | compile | 27m 10s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 23m 18s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | checkstyle | 1m 11s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 46s | | trunk passed | | +1 :green_heart: | javadoc | 1m 16s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 43s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | spotbugs | 2m 44s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 34s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 5s | | the patch passed | | +1 :green_heart: | compile | 26m 18s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 26m 18s | | the patch passed | | +1 :green_heart: | compile | 22m 50s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | javac | 22m 50s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 12s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 58s | | the patch passed | | +1 :green_heart: | javadoc | 1m 11s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 47s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | -1 :x: | spotbugs | 2m 48s | [/new-spotbugs-hadoop-common-project_hadoop-common.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1894/6/artifact/out/new-spotbugs-hadoop-common-project_hadoop-common.html) | hadoop-common-project/hadoop-common generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) | | +1 :green_heart: | shadedclient | 20m 19s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 17m 58s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1894/6/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 58s | | The patch does not generate ASF License warnings. 
| | | | 217m 18s | | | | Reason | Tests | |---:|:--| | SpotBugs | module:hadoop-common-project/hadoop-common | | | Inconsistent synchronization of org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager.currentKey; locked 88% of time Unsynchronized access at AbstractDelegationTokenSecretManager.java:88% of time Unsynchronized access at AbstractDelegationTokenSecretManager.java:[line 379] | | Failed junit tests | hadoop.security.TestRaceWhenRelogin | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1894/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/1894 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotb
[GitHub] [hadoop] hadoop-yetus commented on pull request #1894: HADOOP-16819 Possible inconsistent state of AbstractDelegationTokenSecretManager
hadoop-yetus commented on pull request #1894: URL: https://github.com/apache/hadoop/pull/1894#issuecomment-801306561 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 11s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 38m 6s | | trunk passed | | +1 :green_heart: | compile | 27m 10s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 23m 18s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | checkstyle | 1m 11s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 46s | | trunk passed | | +1 :green_heart: | javadoc | 1m 16s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 43s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | spotbugs | 2m 44s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 34s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 5s | | the patch passed | | +1 :green_heart: | compile | 26m 18s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 26m 18s | | the patch passed | | +1 :green_heart: | compile | 22m 50s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | javac | 22m 50s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 12s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 58s | | the patch passed | | +1 :green_heart: | javadoc | 1m 11s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 47s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | -1 :x: | spotbugs | 2m 48s | [/new-spotbugs-hadoop-common-project_hadoop-common.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1894/6/artifact/out/new-spotbugs-hadoop-common-project_hadoop-common.html) | hadoop-common-project/hadoop-common generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) | | +1 :green_heart: | shadedclient | 20m 19s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 17m 58s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1894/6/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 58s | | The patch does not generate ASF License warnings. 
| | | | 217m 18s | | | | Reason | Tests | |---:|:--| | SpotBugs | module:hadoop-common-project/hadoop-common | | | Inconsistent synchronization of org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager.currentKey; locked 88% of time. Unsynchronized access at AbstractDelegationTokenSecretManager.java:[line 379] | | Failed junit tests | hadoop.security.TestRaceWhenRelogin | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1894/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/1894 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 0e96f88fb06a 4.15.0-136-generic #140-Ubuntu SMP Thu Jan 28 05:20:47 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 759aa2b89fcaeee15340f05a02af9ff0db855eb5 | | Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-a
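For readers unfamiliar with this SpotBugs category, the following is a minimal self-contained sketch of the "inconsistent synchronization" pattern flagged above. The class and field names are illustrative only (they echo the report, but this is not the actual Hadoop code): a field written under the instance lock is also read on a path that does not take the lock, which is what produces the "locked 88% of time" finding.

```java
// Sketch of SpotBugs IS2-style inconsistent synchronization (hypothetical class).
public class InconsistentSyncSketch {
    private Object currentKey;

    public synchronized void rollKey(Object k) {   // write happens under the lock
        currentKey = k;
    }

    public Object unsafePeek() {                   // read WITHOUT the lock: this is what gets flagged
        return currentKey;
    }

    public synchronized Object safePeek() {        // fix: take the same lock on every access
        return currentKey;
    }

    public static void main(String[] args) {
        InconsistentSyncSketch s = new InconsistentSyncSketch();
        s.rollKey("key-1");
        System.out.println(s.safePeek());
    }
}
```

The usual fixes are to synchronize every access (as in `safePeek`) or to make the field `volatile` when plain visibility is all that is required.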
[jira] [Work logged] (HADOOP-17578) Improve UGI debug log to help troubleshooting TokenCache related issues
[ https://issues.apache.org/jira/browse/HADOOP-17578?focusedWorklogId=567861&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567861 ] ASF GitHub Bot logged work on HADOOP-17578: --- Author: ASF GitHub Bot Created on: 17/Mar/21 17:57 Start Date: 17/Mar/21 17:57 Worklog Time Spent: 10m Work Description: xiaoyuyao commented on pull request #2762: URL: https://github.com/apache/hadoop/pull/2762#issuecomment-801291174 Java doc issue is unrelated. Thanks @cxorm for the review. I'll merge the PR shortly. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 567861) Time Spent: 1h 10m (was: 1h) > Improve UGI debug log to help troubleshooting TokenCache related issues > --- > > Key: HADOOP-17578 > URL: https://issues.apache.org/jira/browse/HADOOP-17578 > Project: Hadoop Common > Issue Type: Bug >Affects Versions: 3.2.0 >Reporter: Xiaoyu Yao >Assignee: Xiaoyu Yao >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > We have seen some issues around TokenCache getDelegationToken failures even > though the UGI already has a valid token. The tricky part is the token map is > keyed by the canonical service name, which can be different from the actual > service field in the token, e.g. KMS token in HA case. The current UGI log > dumps all the tokens but not the keys of the token map. This ticket is opened > to include the complete token map information in the debug log. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
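The failure mode HADOOP-17578 describes can be sketched in a few lines. This is an illustration only, not the actual UGI internals: the token map is keyed by the canonical service name, which can differ from the service field stored inside the token itself (e.g. a KMS token in the HA case), so a dump of the tokens alone never shows why a lookup missed.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: map key (canonical service name) vs. token's own service field.
public class TokenMapSketch {
    public static void main(String[] args) {
        Map<String, String> tokenMap = new HashMap<>();       // key -> service field inside the token
        tokenMap.put("kms://ha-service", "kms://host1:9600"); // key differs from token service

        // Looking up by the token's own service field misses, even though a valid token exists:
        System.out.println(tokenMap.get("kms://host1:9600")); // null

        // Logging both sides of each entry (what the improved debug log does) makes the miss visible:
        tokenMap.forEach((k, v) ->
            System.out.println("mapKey=" + k + " tokenService=" + v));
    }
}
```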
[jira] [Work logged] (HADOOP-17578) Improve UGI debug log to help troubleshooting TokenCache related issues
[ https://issues.apache.org/jira/browse/HADOOP-17578?focusedWorklogId=567862&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567862 ] ASF GitHub Bot logged work on HADOOP-17578: --- Author: ASF GitHub Bot Created on: 17/Mar/21 17:57 Start Date: 17/Mar/21 17:57 Worklog Time Spent: 10m Work Description: xiaoyuyao merged pull request #2762: URL: https://github.com/apache/hadoop/pull/2762 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 567862) Time Spent: 1h 20m (was: 1h 10m) > Improve UGI debug log to help troubleshooting TokenCache related issues > --- > > Key: HADOOP-17578 > URL: https://issues.apache.org/jira/browse/HADOOP-17578 > Project: Hadoop Common > Issue Type: Bug >Affects Versions: 3.2.0 >Reporter: Xiaoyu Yao >Assignee: Xiaoyu Yao >Priority: Major > Labels: pull-request-available > Time Spent: 1h 20m > Remaining Estimate: 0h > > We have seen some issues around TokenCache getDelegationToken failures even > though the UGI already has a valid token. The tricky part is the token map is > keyed by the canonical service name, which can be different from the actual > service field in the token, e.g. KMS token in HA case. The current UGI log > dumps all the tokens but not the keys of the token map. This ticket is opened > to include the complete token map information in the debug log. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-17578) Improve UGI debug log to help troubleshooting TokenCache related issues
[ https://issues.apache.org/jira/browse/HADOOP-17578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Xiaoyu Yao updated HADOOP-17578: Fix Version/s: 3.4.0 Hadoop Flags: Reviewed Resolution: Fixed Status: Resolved (was: Patch Available) > Improve UGI debug log to help troubleshooting TokenCache related issues > --- > > Key: HADOOP-17578 > URL: https://issues.apache.org/jira/browse/HADOOP-17578 > Project: Hadoop Common > Issue Type: Bug >Affects Versions: 3.2.0 >Reporter: Xiaoyu Yao >Assignee: Xiaoyu Yao >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 1h 20m > Remaining Estimate: 0h > > We have seen some issues around TokenCache getDelegationToken failures even > though the UGI already has a valid token. The tricky part is the token map is > keyed by the canonical service name, which can be different from the actual > service field in the token, e.g. KMS token in HA case. The current UGI log > dumps all the tokens but not the keys of the token map. This ticket is opened > to include the complete token map information in the debug log. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] xiaoyuyao merged pull request #2762: HADOOP-17578. Improve UGI debug log to help troubleshooting TokenCach…
xiaoyuyao merged pull request #2762: URL: https://github.com/apache/hadoop/pull/2762
[GitHub] [hadoop] xiaoyuyao commented on pull request #2762: HADOOP-17578. Improve UGI debug log to help troubleshooting TokenCach…
xiaoyuyao commented on pull request #2762: URL: https://github.com/apache/hadoop/pull/2762#issuecomment-801291174 Java doc issue is unrelated. Thanks @cxorm for the review. I'll merge the PR shortly.
[GitHub] [hadoop] GauthamBanasandra opened a new pull request #2783: HDFS-15903. Refactor X-Platform lib
GauthamBanasandra opened a new pull request #2783: URL: https://github.com/apache/hadoop/pull/2783 * Moved the C API to a separate folder, which currently contains the syscall API. * Dropped _utils_ from the X-Platform object library targets since the library no longer contains only utils.
[jira] [Work logged] (HADOOP-17511) Add an Audit plugin point for S3A auditing/context
[ https://issues.apache.org/jira/browse/HADOOP-17511?focusedWorklogId=567851&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567851 ] ASF GitHub Bot logged work on HADOOP-17511: --- Author: ASF GitHub Bot Created on: 17/Mar/21 17:45 Start Date: 17/Mar/21 17:45 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #2675: URL: https://github.com/apache/hadoop/pull/2675#issuecomment-801282132 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 37s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 2s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 1s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 36 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 24s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 20m 16s | | trunk passed | | +1 :green_heart: | compile | 20m 55s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 17m 56s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | checkstyle | 3m 47s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 27s | | trunk passed | | +1 :green_heart: | javadoc | 1m 41s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 2m 20s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | spotbugs | 3m 37s | | trunk passed | | +1 :green_heart: | shadedclient | 15m 14s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 23s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 30s | | the patch passed | | +1 :green_heart: | compile | 22m 21s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javac | 22m 21s | [/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 generated 1 new + 1955 unchanged - 1 fixed = 1956 total (was 1956) | | +1 :green_heart: | compile | 19m 45s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | -1 :x: | javac | 19m 45s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 generated 1 new + 1850 unchanged - 1 fixed = 1851 total (was 1851) | | -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/blanks-eol.txt) | The patch has 3 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | -0 :warning: | checkstyle | 4m 13s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/results-checkstyle-root.txt) | root: The patch generated 47 new + 192 unchanged - 7 fixed = 239 total (was 199) | | +1 :green_heart: | mvnsite | 2m 22s | | the patch passed | | +1 :green_heart: | xml | 0m 2s | | The patch has no ill-formed XML file. 
| | +1 :green_heart: | javadoc | 1m 30s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javadoc | 0m 41s | [/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08. | | -1 :x: | spotbugs | 1m 25s | [/new-spotbugs-hadoop-tools_hadoop-aws.html](https
[GitHub] [hadoop] hadoop-yetus commented on pull request #2675: HADOOP-17511. Add audit/telemetry logging to S3A connector
hadoop-yetus commented on pull request #2675: URL: https://github.com/apache/hadoop/pull/2675#issuecomment-801282132 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 37s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 2s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 1s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 36 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 24s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 20m 16s | | trunk passed | | +1 :green_heart: | compile | 20m 55s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 17m 56s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | checkstyle | 3m 47s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 27s | | trunk passed | | +1 :green_heart: | javadoc | 1m 41s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 2m 20s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | spotbugs | 3m 37s | | trunk passed | | +1 :green_heart: | shadedclient | 15m 14s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 23s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 30s | | the patch passed | | +1 :green_heart: | compile | 22m 21s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javac | 22m 21s | [/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 generated 1 new + 1955 unchanged - 1 fixed = 1956 total (was 1956) | | +1 :green_heart: | compile | 19m 45s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | -1 :x: | javac | 19m 45s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 generated 1 new + 1850 unchanged - 1 fixed = 1851 total (was 1851) | | -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/blanks-eol.txt) | The patch has 3 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | -0 :warning: | checkstyle | 4m 13s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/results-checkstyle-root.txt) | root: The patch generated 47 new + 192 unchanged - 7 fixed = 239 total (was 199) | | +1 :green_heart: | mvnsite | 2m 22s | | the patch passed | | +1 :green_heart: | xml | 0m 2s | | The patch has no ill-formed XML file. 
| | +1 :green_heart: | javadoc | 1m 30s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javadoc | 0m 41s | [/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08. | | -1 :x: | spotbugs | 1m 25s | [/new-spotbugs-hadoop-tools_hadoop-aws.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2675/27/artifact/out/new-spotbugs-hadoop-tools_hadoop-aws.html) | hadoop-tools/hadoop-aws generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) | | +1 :green_heart: | shadedclient | 14m 13s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 38s | | hadoop-common in the pa
[jira] [Work logged] (HADOOP-16948) ABFS: Support single writer dirs
[ https://issues.apache.org/jira/browse/HADOOP-16948?focusedWorklogId=567817&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567817 ] ASF GitHub Bot logged work on HADOOP-16948: --- Author: ASF GitHub Bot Created on: 17/Mar/21 16:48 Start Date: 17/Mar/21 16:48 Worklog Time Spent: 10m Work Description: billierinaldi commented on a change in pull request #1925: URL: https://github.com/apache/hadoop/pull/1925#discussion_r596203785 ## File path: hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AbfsConfiguration.java ## @@ -208,6 +209,15 @@ DefaultValue = DEFAULT_FS_AZURE_APPEND_BLOB_DIRECTORIES) private String azureAppendBlobDirs; + @StringConfigurationValidatorAnnotation(ConfigurationKey = FS_AZURE_INFINITE_LEASE_KEY, Review comment: I have run the unit test with HNS and flat namespace storage accounts, so I think it will work. I have not done extensive testing with HNS disabled, however. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 567817) Time Spent: 8h 50m (was: 8h 40m) > ABFS: Support single writer dirs > > > Key: HADOOP-16948 > URL: https://issues.apache.org/jira/browse/HADOOP-16948 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Billie Rinaldi >Assignee: Billie Rinaldi >Priority: Minor > Labels: abfsactive, pull-request-available > Time Spent: 8h 50m > Remaining Estimate: 0h > > This would allow some directories to be configured as single writer > directories. The ABFS driver would obtain a lease when creating or opening a > file for writing and would automatically renew the lease and release the > lease when closing the file. 
[GitHub] [hadoop] billierinaldi commented on a change in pull request #1925: HADOOP-16948. Support single writer dirs.
billierinaldi commented on a change in pull request #1925: URL: https://github.com/apache/hadoop/pull/1925#discussion_r596203785 ## File path: hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AbfsConfiguration.java ## @@ -208,6 +209,15 @@ DefaultValue = DEFAULT_FS_AZURE_APPEND_BLOB_DIRECTORIES) private String azureAppendBlobDirs; + @StringConfigurationValidatorAnnotation(ConfigurationKey = FS_AZURE_INFINITE_LEASE_KEY, Review comment: I have run the unit test with HNS and flat namespace storage accounts, so I think it will work. I have not done extensive testing with HNS disabled, however.
[jira] [Work logged] (HADOOP-16948) ABFS: Support single writer dirs
[ https://issues.apache.org/jira/browse/HADOOP-16948?focusedWorklogId=567815&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567815 ] ASF GitHub Bot logged work on HADOOP-16948: --- Author: ASF GitHub Bot Created on: 17/Mar/21 16:47 Start Date: 17/Mar/21 16:47 Worklog Time Spent: 10m Work Description: billierinaldi commented on a change in pull request #1925: URL: https://github.com/apache/hadoop/pull/1925#discussion_r596202590 ## File path: hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AbfsConfiguration.java ## @@ -208,6 +209,15 @@ DefaultValue = DEFAULT_FS_AZURE_APPEND_BLOB_DIRECTORIES) private String azureAppendBlobDirs; + @StringConfigurationValidatorAnnotation(ConfigurationKey = FS_AZURE_INFINITE_LEASE_KEY, + DefaultValue = DEFAULT_FS_AZURE_INFINITE_LEASE_DIRECTORIES) + private String azureInfiniteLeaseDirs; + + @IntegerConfigurationValidatorAnnotation(ConfigurationKey = FS_AZURE_LEASE_THREADS, Review comment: I think it will still be useful to issue the acquire and release operations in a thread pool for now. Possibly this could be removed if all acquire and release operations are moved into create and flush-with-close in the future. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 567815) Time Spent: 8h 40m (was: 8.5h) > ABFS: Support single writer dirs > > > Key: HADOOP-16948 > URL: https://issues.apache.org/jira/browse/HADOOP-16948 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Billie Rinaldi >Assignee: Billie Rinaldi >Priority: Minor > Labels: abfsactive, pull-request-available > Time Spent: 8h 40m > Remaining Estimate: 0h > > This would allow some directories to be configured as single writer > directories. 
The ABFS driver would obtain a lease when creating or opening a > file for writing and would automatically renew the lease and release the > lease when closing the file.
[GitHub] [hadoop] billierinaldi commented on a change in pull request #1925: HADOOP-16948. Support single writer dirs.
billierinaldi commented on a change in pull request #1925: URL: https://github.com/apache/hadoop/pull/1925#discussion_r596202590 ## File path: hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AbfsConfiguration.java ## @@ -208,6 +209,15 @@ DefaultValue = DEFAULT_FS_AZURE_APPEND_BLOB_DIRECTORIES) private String azureAppendBlobDirs; + @StringConfigurationValidatorAnnotation(ConfigurationKey = FS_AZURE_INFINITE_LEASE_KEY, + DefaultValue = DEFAULT_FS_AZURE_INFINITE_LEASE_DIRECTORIES) + private String azureInfiniteLeaseDirs; + + @IntegerConfigurationValidatorAnnotation(ConfigurationKey = FS_AZURE_LEASE_THREADS, Review comment: I think it will still be useful to issue the acquire and release operations in a thread pool for now. Possibly this could be removed if all acquire and release operations are moved into create and flush-with-close in the future.
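The design choice in the review comment above, issuing lease acquire and release operations through a thread pool, can be sketched as follows. The class and method names are hypothetical (this is not the actual ABFS API); the pool size corresponds conceptually to the `FS_AZURE_LEASE_THREADS` key in the diff.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: lease operations submitted to a pool so the caller
// is not blocked on the remote round trip.
public class LeasePoolSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService leasePool = Executors.newFixedThreadPool(2);

        // Stand-in for the remote "acquire lease" call:
        Future<String> acquire = leasePool.submit(() -> "lease-id-1");
        String leaseId = acquire.get();   // block only at the point the lease id is needed
        System.out.println("acquired " + leaseId);

        // Stand-in for the asynchronous "release lease" call on close:
        leasePool.submit(() -> System.out.println("released " + leaseId));

        leasePool.shutdown();
        leasePool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

If acquire were folded into create and release into flush-with-close, as the comment suggests might happen later, the pool would become unnecessary, which is exactly the trade-off being discussed.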
[GitHub] [hadoop] mukund-thakur edited a comment on pull request #1867: HADOOP-16983. Update ADLS client credential creation docs.
mukund-thakur edited a comment on pull request #1867: URL: https://github.com/apache/hadoop/pull/1867#issuecomment-801225131 Hi @snvijaya @bilaharith I tried setting up running of the ADLS tests under hadoop-azure-datalake using https://github.com/apache/hadoop/blob/trunk/hadoop-tools/hadoop-azure-datalake/src/site/markdown/index.md and Steve's updated PR, but I am getting this error. > 2021-03-17 18:52:38,507 [Thread-0] DEBUG AccessTokenProvider - AADToken: no token. Returning expiring=true 2021-03-17 18:52:38,507 [Thread-0] DEBUG AccessTokenProvider - AAD Token is missing or expired: Calling refresh-token from abstract base class 2021-03-17 18:52:38,507 [Thread-0] DEBUG ClientCredsTokenProvider - AADToken: refreshing client-credential based token 2021-03-17 18:52:38,507 [Thread-0] DEBUG AzureADAuthenticator - AADToken: starting to fetch token using client creds for client ID 2cb01b86-cd61-4304-b5de-9123b6e2bfb0 2021-03-17 18:52:46,724 [Thread-0] DEBUG HttpTransport - HTTPRequest,Failed,cReqId:846438c5-2395-4d95-bbca-7c98a04f2f7d.2,lat:8216,err:java.io.IOException,Reqlen:0,Resplen:0,token_ns:8216907524,sReqId:null,path:/test,qp:op=MKDIRS&permission=755&api-version=2018-09-01 Not sure what I missed or how to debug further. Can you please help? Thanks
[GitHub] [hadoop] mukund-thakur commented on pull request #1867: HADOOP-16983. Update ADLS client credential creation docs.
mukund-thakur commented on pull request #1867: URL: https://github.com/apache/hadoop/pull/1867#issuecomment-801225131 Hi @snvijaya @bilaharith I tried setting up running of the ADLS tests under hadoop-azure-datalake using https://github.com/apache/hadoop/blob/trunk/hadoop-tools/hadoop-azure-datalake/src/site/markdown/index.md and Steve's updated PR, but I am getting this error. `2021-03-17 18:52:38,507 [Thread-0] DEBUG AccessTokenProvider - AADToken: no token. Returning expiring=true 2021-03-17 18:52:38,507 [Thread-0] DEBUG AccessTokenProvider - AAD Token is missing or expired: Calling refresh-token from abstract base class 2021-03-17 18:52:38,507 [Thread-0] DEBUG ClientCredsTokenProvider - AADToken: refreshing client-credential based token 2021-03-17 18:52:38,507 [Thread-0] DEBUG AzureADAuthenticator - AADToken: starting to fetch token using client creds for client ID 2cb01b86-cd61-4304-b5de-9123b6e2bfb0 2021-03-17 18:52:46,724 [Thread-0] DEBUG HttpTransport - HTTPRequest,Failed,cReqId:846438c5-2395-4d95-bbca-7c98a04f2f7d.2,lat:8216,err:java.io.IOException,Reqlen:0,Resplen:0,token_ns:8216907524,sReqId:null,path:/test,qp:op=MKDIRS&permission=755&api-version=2018-09-01` Not sure what I missed or how to debug further. Can you please help? Thanks
[jira] [Work logged] (HADOOP-13551) Collect AwsSdkMetrics in S3A FileSystem IOStatistics
[ https://issues.apache.org/jira/browse/HADOOP-13551?focusedWorklogId=567746&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567746 ] ASF GitHub Bot logged work on HADOOP-13551: --- Author: ASF GitHub Bot Created on: 17/Mar/21 15:19 Start Date: 17/Mar/21 15:19 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #2778: URL: https://github.com/apache/hadoop/pull/2778#issuecomment-801169153 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 37s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 5 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 42s | | trunk passed | | +1 :green_heart: | compile | 0m 43s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 0m 37s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | checkstyle | 0m 29s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 44s | | trunk passed | | +1 :green_heart: | javadoc | 0m 24s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 32s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | spotbugs | 1m 8s | | trunk passed | | +1 :green_heart: | shadedclient | 13m 57s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 34s | | the patch passed | | +1 :green_heart: | compile | 0m 36s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 0m 36s | | the patch passed | | +1 :green_heart: | compile | 0m 29s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | javac | 0m 29s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 0m 20s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2778/5/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) | hadoop-tools/hadoop-aws: The patch generated 1 new + 17 unchanged - 4 fixed = 18 total (was 21) | | +1 :green_heart: | mvnsite | 0m 32s | | the patch passed | | +1 :green_heart: | javadoc | 0m 16s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javadoc | 0m 22s | [/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2778/5/artifact/out/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08. | | +1 :green_heart: | spotbugs | 1m 8s | | the patch passed | | +1 :green_heart: | shadedclient | 13m 54s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 3s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 0m 32s | | The patch does not generate ASF License warnings. 
| | | | 73m 43s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2778/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2778 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 4913e90d4081 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 0923b23028e484361b2165b36435ac36e4fe1010 | | Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build
[GitHub] [hadoop] hadoop-yetus commented on pull request #2778: HADOOP-13551. AWS metrics wire-up
hadoop-yetus commented on pull request #2778: URL: https://github.com/apache/hadoop/pull/2778#issuecomment-801169153 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 37s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 5 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 42s | | trunk passed | | +1 :green_heart: | compile | 0m 43s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 0m 37s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | checkstyle | 0m 29s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 44s | | trunk passed | | +1 :green_heart: | javadoc | 0m 24s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 32s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | spotbugs | 1m 8s | | trunk passed | | +1 :green_heart: | shadedclient | 13m 57s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 34s | | the patch passed | | +1 :green_heart: | compile | 0m 36s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 0m 36s | | the patch passed | | +1 :green_heart: | compile | 0m 29s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | +1 :green_heart: | javac | 0m 29s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | -0 :warning: | checkstyle | 0m 20s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2778/5/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) | hadoop-tools/hadoop-aws: The patch generated 1 new + 17 unchanged - 4 fixed = 18 total (was 21) | | +1 :green_heart: | mvnsite | 0m 32s | | the patch passed | | +1 :green_heart: | javadoc | 0m 16s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javadoc | 0m 22s | [/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2778/5/artifact/out/patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08. | | +1 :green_heart: | spotbugs | 1m 8s | | the patch passed | | +1 :green_heart: | shadedclient | 13m 54s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 3s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 0m 32s | | The patch does not generate ASF License warnings. 
| | | | 73m 43s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2778/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2778 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 4913e90d4081 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 0923b23028e484361b2165b36435ac36e4fe1010 | | Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2778/5/testReport/ | | Max. process+thread count | 670 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2778/5/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
[jira] [Assigned] (HADOOP-17569) Building native code fails on Fedora 33
[ https://issues.apache.org/jira/browse/HADOOP-17569?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Masatake Iwasaki reassigned HADOOP-17569: - Assignee: Masatake Iwasaki > Building native code fails on Fedora 33 > --- > > Key: HADOOP-17569 > URL: https://issues.apache.org/jira/browse/HADOOP-17569 > Project: Hadoop Common > Issue Type: Improvement > Components: build, common >Reporter: Kengo Seki >Assignee: Masatake Iwasaki >Priority: Major > > I tried to build native code on Fedora 33, in which glibc 2.32 is installed > by default, but it failed with the following error. > {code:java} > $ cat /etc/redhat-release > Fedora release 33 (Thirty Three) > $ sudo dnf info --installed glibc > Installed Packages > Name : glibc > Version : 2.32 > Release : 1.fc33 > Architecture : x86_64 > Size : 17 M > Source : glibc-2.32-1.fc33.src.rpm > Repository : @System > From repo: anaconda > Summary : The GNU libc libraries > URL : http://www.gnu.org/software/glibc/ > License : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+ and GPLv2+ with > exceptions and BSD and Inner-Net and ISC and Public Domain and GFDL > Description : The glibc package contains standard libraries which are used by > : multiple programs on the system. In order to save disk space > and > : memory, as well as to make upgrading easier, common system > code is > : kept in one place and shared between programs. This particular > package > : contains the most important sets of shared libraries: the > standard C > : library and the standard math library. Without these two > libraries, a > : Linux system will not function. > $ mvn clean compile -Pnative > ... 
> [INFO] Running make -j 1 VERBOSE=1 > [WARNING] /usr/bin/cmake > -S/home/vagrant/hadoop/hadoop-common-project/hadoop-common/src > -B/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native > --check-build-system CMakeFiles/Makefile.cmake 0 > [WARNING] /usr/bin/cmake -E cmake_progress_start > /home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native/CMakeFiles > > /home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native//CMakeFiles/progress.marks > [WARNING] make -f CMakeFiles/Makefile2 all > [WARNING] make[1]: Entering directory > '/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native' > [WARNING] make -f CMakeFiles/hadoop_static.dir/build.make > CMakeFiles/hadoop_static.dir/depend > [WARNING] make[2]: Entering directory > '/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native' > [WARNING] cd > /home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native && > /usr/bin/cmake -E cmake_depends "Unix Makefiles" > /home/vagrant/hadoop/hadoop-common-project/hadoop-common/src > /home/vagrant/hadoop/hadoop-common-project/hadoop-common/src > /home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native > /home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native > /home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native/CMakeFiles/hadoop_static.dir/DependInfo.cmake > --color= > [WARNING] Dependee > "/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native/CMakeFiles/hadoop_static.dir/DependInfo.cmake" > is newer than depender > "/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native/CMakeFiles/hadoop_static.dir/depend.internal". > [WARNING] Dependee > "/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native/CMakeFiles/CMakeDirectoryInformation.cmake" > is newer than depender > "/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native/CMakeFiles/hadoop_static.dir/depend.internal". 
> [WARNING] Scanning dependencies of target hadoop_static > [WARNING] make[2]: Leaving directory > '/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native' > [WARNING] make -f CMakeFiles/hadoop_static.dir/build.make > CMakeFiles/hadoop_static.dir/build > [WARNING] make[2]: Entering directory > '/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native' > [WARNING] [ 2%] Building C object > CMakeFiles/hadoop_static.dir/main/native/src/exception.c.o > [WARNING] /usr/bin/cc > -I/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native/javah > > -I/home/vagrant/hadoop/hadoop-common-project/hadoop-common/src/main/native/src > -I/home/vagrant/hadoop/hadoop-common-project/hadoop-common/src > -I/home/vagrant/hadoop/hadoop-common-project/hadoop-common/src/src > -I/home/vagrant/hadoop/hadoop-common-project/hadoop-common/target/native > -I/usr/lib/jvm/java-1.8.0/include -I/usr/lib/jvm/java-1.8.0/include/linux > -I/home/vagrant/hadoop/hadoop-comm
[GitHub] [hadoop] jianghuazhu opened a new pull request #2782: HDFS-15901.Solve the problem of DN repeated block reports occupying too many RPCs during Safemode.
jianghuazhu opened a new pull request #2782: URL: https://github.com/apache/hadoop/pull/2782 …oo many RPCs during Safemode. ## NOTICE Please create an issue in ASF JIRA before opening a pull request, and you need to set the title of the pull request which starts with the corresponding JIRA issue number. (e.g. HADOOP-X. Fix a typo in YYY.) For more details, please see https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17548) ABFS: Toggle Store Mkdirs request overwrite parameter
[ https://issues.apache.org/jira/browse/HADOOP-17548?focusedWorklogId=567649&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567649 ] ASF GitHub Bot logged work on HADOOP-17548: --- Author: ASF GitHub Bot Created on: 17/Mar/21 13:27 Start Date: 17/Mar/21 13:27 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #2781: URL: https://github.com/apache/hadoop/pull/2781#issuecomment-801078328 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 21m 45s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-3.3 Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 19s | | branch-3.3 passed | | +1 :green_heart: | compile | 0m 32s | | branch-3.3 passed | | +1 :green_heart: | checkstyle | 0m 28s | | branch-3.3 passed | | +1 :green_heart: | mvnsite | 0m 40s | | branch-3.3 passed | | +1 :green_heart: | javadoc | 0m 29s | | branch-3.3 passed | | +1 :green_heart: | spotbugs | 0m 59s | | branch-3.3 passed | | +1 :green_heart: | shadedclient | 15m 48s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 29s | | the patch passed | | +1 :green_heart: | compile | 0m 26s | | the patch passed | | +1 :green_heart: | javac | 0m 26s | | the patch passed | | +1 :green_heart: | blanks | 0m 1s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 0m 17s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 29s | | the patch passed | | +1 :green_heart: | javadoc | 0m 20s | | the patch passed | | +1 :green_heart: | spotbugs | 1m 0s | | the patch passed | | +1 :green_heart: | shadedclient | 15m 52s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 1m 56s | | hadoop-azure in the patch passed. | | +1 :green_heart: | asflicense | 0m 34s | | The patch does not generate ASF License warnings. | | | | 95m 15s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2781/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2781 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 2457a5eba562 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / 4432e094de445741f01d12311d34d62a0677c530 | | Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~18.04-b08 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2781/1/testReport/ | | Max. process+thread count | 668 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2781/1/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 567649) Time Spent: 1h 10m (was: 1h) > ABFS: Toggle Store Mkdirs request overwrite parameter > - > > Key: HADOOP-17548 > URL: https://issues.apache.org/jira/browse/HADOOP-17548 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Affects Versions: 3.3.1 >Reporter: Sumangala Patki >Assignee: Sumangala Patki >Priority: Major > L
[GitHub] [hadoop] hadoop-yetus commented on pull request #2781: HADOOP-17548. ABFS: Toggle Store Mkdirs request overwrite parameter (#2729)
hadoop-yetus commented on pull request #2781: URL: https://github.com/apache/hadoop/pull/2781#issuecomment-801078328 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 21m 45s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-3.3 Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 19s | | branch-3.3 passed | | +1 :green_heart: | compile | 0m 32s | | branch-3.3 passed | | +1 :green_heart: | checkstyle | 0m 28s | | branch-3.3 passed | | +1 :green_heart: | mvnsite | 0m 40s | | branch-3.3 passed | | +1 :green_heart: | javadoc | 0m 29s | | branch-3.3 passed | | +1 :green_heart: | spotbugs | 0m 59s | | branch-3.3 passed | | +1 :green_heart: | shadedclient | 15m 48s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 29s | | the patch passed | | +1 :green_heart: | compile | 0m 26s | | the patch passed | | +1 :green_heart: | javac | 0m 26s | | the patch passed | | +1 :green_heart: | blanks | 0m 1s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 17s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 29s | | the patch passed | | +1 :green_heart: | javadoc | 0m 20s | | the patch passed | | +1 :green_heart: | spotbugs | 1m 0s | | the patch passed | | +1 :green_heart: | shadedclient | 15m 52s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 1m 56s | | hadoop-azure in the patch passed. | | +1 :green_heart: | asflicense | 0m 34s | | The patch does not generate ASF License warnings. 
| | | | 95m 15s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2781/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2781 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 2457a5eba562 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / 4432e094de445741f01d12311d34d62a0677c530 | | Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~18.04-b08 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2781/1/testReport/ | | Max. process+thread count | 668 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2781/1/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-16492) Support HuaweiCloud Object Storage as a Hadoop Backend File System
[ https://issues.apache.org/jira/browse/HADOOP-16492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17303392#comment-17303392 ] Steve Loughran commented on HADOOP-16492: - [~zhongjun] [~brahmareddy]added some homework in HADOOP-17593 my goal of hadoop-cloud-storage POM is to add the minimum extra to the classpath above hadoop-common for apps downstream to work with the stores. Do the transitive dependencies of hadoop-huaweicloud need log4j2? If not: please remove > Support HuaweiCloud Object Storage as a Hadoop Backend File System > -- > > Key: HADOOP-16492 > URL: https://issues.apache.org/jira/browse/HADOOP-16492 > Project: Hadoop Common > Issue Type: New Feature > Components: fs >Affects Versions: 3.4.0 >Reporter: zhongjun >Assignee: zhongjun >Priority: Major > Fix For: 3.4.0 > > Attachments: Difference Between OBSA and S3A.pdf, > HADOOP-16492.001.patch, HADOOP-16492.002.patch, HADOOP-16492.003.patch, > HADOOP-16492.004.patch, HADOOP-16492.005.patch, HADOOP-16492.006.patch, > HADOOP-16492.007.patch, HADOOP-16492.008.patch, HADOOP-16492.009.patch, > HADOOP-16492.010.patch, HADOOP-16492.011.patch, HADOOP-16492.012.patch, > HADOOP-16492.013.patch, HADOOP-16492.014.patch, HADOOP-16492.015.patch, > HADOOP-16492.016.patch, HADOOP-16492.017.patch, OBSA HuaweiCloud OBS Adapter > for Hadoop Support.pdf, image-2020-11-21-18-51-51-981.png > > > Added support for HuaweiCloud OBS > ([https://www.huaweicloud.com/en-us/product/obs.html]) to Hadoop file system, > just like what we do before for S3, ADLS, OSS, etc. With simple > configuration, Hadoop applications can read/write data from OBS without any > code change. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Created] (HADOOP-17593) hadoop-huaweicloud and hadoop-cloud-storage to remove log4j as transitive dependency
Steve Loughran created HADOOP-17593: --- Summary: hadoop-huaweicloud and hadoop-cloud-storage to remove log4j as transitive dependency Key: HADOOP-17593 URL: https://issues.apache.org/jira/browse/HADOOP-17593 Project: Hadoop Common Issue Type: Bug Components: build Affects Versions: 3.3.1, 3.4.0 Reporter: Steve Loughran Dependencies of hadoop-cloud-storage show that hadoop-huaweicloud is pulling in log4j. It should not/must not, at least not if the huaweicloud SDK can live without it. * A version of log4j 2.x on the CP is only going to complicate lives * once we can move onto it ourselves we need to be in control of versions [INFO] \- org.apache.hadoop:hadoop-huaweicloud:jar:3.4.0-SNAPSHOT:compile [INFO]\- com.huaweicloud:esdk-obs-java:jar:3.20.4.2:compile [INFO] +- com.jamesmurty.utils:java-xmlbuilder:jar:1.2:compile [INFO] +- com.squareup.okhttp3:okhttp:jar:3.14.2:compile [INFO] +- org.apache.logging.log4j:log4j-core:jar:2.12.0:compile [INFO] \- org.apache.logging.log4j:log4j-api:jar:2.12.0:compile
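If the OBS SDK can in fact live without log4j2 on the classpath, the standard Maven mechanism for this is an `<exclusions>` block on the `esdk-obs-java` dependency in the hadoop-huaweicloud POM. A sketch of what that could look like, using the coordinates from the dependency tree above (whether the SDK tolerates the exclusion is an assumption to verify, not something this tree proves):

```xml
<dependency>
  <groupId>com.huaweicloud</groupId>
  <artifactId>esdk-obs-java</artifactId>
  <version>3.20.4.2</version>
  <exclusions>
    <!-- Keep log4j 2.x off the downstream classpath; assumes the SDK can
         log through another backend or tolerate a missing one. -->
    <exclusion>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-core</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-api</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Re-running `mvn dependency:tree -Dincludes=org.apache.logging.log4j` afterwards should then show no log4j artifacts under hadoop-huaweicloud.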
[jira] [Commented] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17303364#comment-17303364 ] Hadoop QA commented on HADOOP-17592: | (/) *{color:green}+1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Logfile || Comment || | {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 2m 19s{color} | {color:blue}{color} | {color:blue} Docker mode activated. {color} | || || || || {color:brown} Prechecks {color} || || | {color:green}+1{color} | {color:green} dupname {color} | {color:green} 0m 0s{color} | {color:green}{color} | {color:green} No case conflicting files found. {color} | | {color:blue}0{color} | {color:blue} markdownlint {color} | {color:blue} 0m 0s{color} | {color:blue}{color} | {color:blue} markdownlint was not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green}{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} trunk Compile Tests {color} || || | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 22m 37s{color} | {color:green}{color} | {color:green} trunk passed {color} | | {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 1m 17s{color} | {color:green}{color} | {color:green} trunk passed {color} | | {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 39m 20s{color} | {color:green}{color} | {color:green} branch has no errors when building and testing our client artifacts. 
{color} | || || || || {color:brown} Patch Compile Tests {color} || || | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 0m 53s{color} | {color:green}{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 1m 11s{color} | {color:green}{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green}{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 14m 40s{color} | {color:green}{color} | {color:green} patch has no errors when building and testing our client artifacts. {color} | || || || || {color:brown} Other Tests {color} || || | {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 30s{color} | {color:green}{color} | {color:green} The patch does not generate ASF License warnings. {color} | | {color:black}{color} | {color:black} {color} | {color:black} 59m 29s{color} | {color:black}{color} | {color:black}{color} | \\ \\ || Subsystem || Report/Notes || | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/PreCommit-HADOOP-Build/167/artifact/out/Dockerfile | | JIRA Issue | HADOOP-17592 | | JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/13022490/HADOOP-17592.patch | | Optional Tests | dupname asflicense mvnsite markdownlint | | uname | Linux 00ad6706606a 4.15.0-126-generic #129-Ubuntu SMP Mon Nov 23 18:53:38 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | personality/hadoop.sh | | git revision | trunk / 9c43b60348b | | Max. process+thread count | 569 (vs. 
ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/PreCommit-HADOOP-Build/167/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.13.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. > Fix the wrong CIDR range example in Proxy User documentation > > > Key: HADOOP-17592 > URL: https://issues.apache.org/jira/browse/HADOOP-17592 > Project: Hadoop Common > Issue Type: Bug > Components: documentation >Affects Versions: 3.2.2 >Reporter: Kwangsun Noh >Priority: Trivial > Labels: pull-request-available > Attachments: HADOOP-17592.patch > > Time Spent: 0.5h > Remaining Estimate: 0h > > The CIDR range example on the Proxy user description page is wrong. > > In the Configurations section of Proxy user page, CIDR format 10.222.0.0/16 > means 10.222.0.0-15. > > But it's not true: the CIDR format 10.222.0.0/16 means > 10.222.0.0-10.222.255.255. > > as-is : hosts in the range `10.222.0.0-15` and > to-be : hosts in the range `10.222.0.0-10.222.255.255` and
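The arithmetic behind the documentation fix is easy to check: a /16 prefix fixes the first 16 bits of the address and leaves 16 host bits, so the block spans 2^16 = 65536 addresses, not 16. Python's standard `ipaddress` module shows the correct expansion:

```python
import ipaddress

# A /16 prefix leaves 16 host bits, so the block contains
# 2**16 = 65536 addresses -- not the 16 the old example implied.
net = ipaddress.ip_network("10.222.0.0/16")

print(net[0])            # 10.222.0.0 (first address in the block)
print(net[-1])           # 10.222.255.255 (last address in the block)
print(net.num_addresses) # 65536
```

This confirms the "to-be" wording: hosts in the range 10.222.0.0 to 10.222.255.255.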
[jira] [Work logged] (HADOOP-17548) ABFS: Toggle Store Mkdirs request overwrite parameter
[ https://issues.apache.org/jira/browse/HADOOP-17548?focusedWorklogId=567600&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567600 ] ASF GitHub Bot logged work on HADOOP-17548: --- Author: ASF GitHub Bot Created on: 17/Mar/21 11:50 Start Date: 17/Mar/21 11:50 Worklog Time Spent: 10m Work Description: sumangala-patki opened a new pull request #2781: URL: https://github.com/apache/hadoop/pull/2781 Contributed by Sumangala Patki. (cherry picked from commit fe633d473935fe285a12821fb70b19cfc9aa9b8c) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 567600) Time Spent: 1h (was: 50m) > ABFS: Toggle Store Mkdirs request overwrite parameter > - > > Key: HADOOP-17548 > URL: https://issues.apache.org/jira/browse/HADOOP-17548 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Affects Versions: 3.3.1 >Reporter: Sumangala Patki >Assignee: Sumangala Patki >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 1h > Remaining Estimate: 0h > > The call to mkdirs with overwrite set to true results in an additional call > to set properties (LMT update, etc) at the backend, which is not required for > the HDFS scenario. Moreover, mkdirs on an existing file path returns success. > This PR provides an option to set the overwrite parameter to false, and > ensures that mkdirs on a file throws an exception. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
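The contract the issue describes (mkdirs must throw on an existing file path, and with overwrite disabled an existing directory should not trigger the extra set-properties request) can be sketched independently of the ABFS code. This is a hypothetical model only: `FileAlreadyExistsError` and the dict-based `store` are illustrative stand-ins, not the actual ABFS classes or requests.

```python
class FileAlreadyExistsError(IOError):
    """Illustrative stand-in for the exception mkdirs should raise on a file."""

def mkdirs(store, path, overwrite=False):
    # store is a toy namespace: a dict mapping path -> "file" or "dir"
    kind = store.get(path)
    if kind == "file":
        # HDFS-compatible behaviour: mkdirs over an existing file must fail,
        # never silently report success
        raise FileAlreadyExistsError(path)
    if kind == "dir" and not overwrite:
        # directory already exists; with overwrite=False no extra
        # set-properties (LMT update) request would be sent to the store
        return True
    store[path] = "dir"
    return True
```

In this sketch, creating a new directory succeeds, repeating the call is an idempotent no-op, and calling mkdirs on a file path raises instead of returning success.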
[GitHub] [hadoop] sumangala-patki opened a new pull request #2781: HADOOP-17548. ABFS: Toggle Store Mkdirs request overwrite parameter (#2729)
sumangala-patki opened a new pull request #2781: URL: https://github.com/apache/hadoop/pull/2781 Contributed by Sumangala Patki. (cherry picked from commit fe633d473935fe285a12821fb70b19cfc9aa9b8c)
[jira] [Updated] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kwangsun Noh updated HADOOP-17592: -- Attachment: (was: HADOOP-17592.patch) > Fix the wrong CIDR range example in Proxy User documentation > > > Key: HADOOP-17592 > URL: https://issues.apache.org/jira/browse/HADOOP-17592 > Project: Hadoop Common > Issue Type: Bug > Components: documentation >Affects Versions: 3.2.2 >Reporter: Kwangsun Noh >Priority: Trivial > Labels: pull-request-available > Attachments: HADOOP-17592.patch > > Time Spent: 0.5h > Remaining Estimate: 0h > > The CIDR range example on the Proxy user description page is wrong. > > In the Configurations section of Proxy user page, CIDR format 10.222.0.0/16 > means 10.222.0.0-15. > > But It's not true. the CIDR format 10.222.0.0/16 means > 10.222.0.0-10.222.255.255. > > as-is : hosts in the range `10.222.0.0-15` and > to-be : hosts in the range `10.222.0.0-10.222.255.255` and
[jira] [Updated] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kwangsun Noh updated HADOOP-17592: -- Attachment: HADOOP-17592.patch Status: Patch Available (was: Open) > Fix the wrong CIDR range example in Proxy User documentation > > > Key: HADOOP-17592 > URL: https://issues.apache.org/jira/browse/HADOOP-17592 > Project: Hadoop Common > Issue Type: Bug > Components: documentation >Affects Versions: 3.2.2 >Reporter: Kwangsun Noh >Priority: Trivial > Labels: pull-request-available > Attachments: HADOOP-17592.patch > > Time Spent: 0.5h > Remaining Estimate: 0h > > The CIDR range example on the Proxy user description page is wrong. > > In the Configurations section of Proxy user page, CIDR format 10.222.0.0/16 > means 10.222.0.0-15. > > But It's not true. the CIDR format 10.222.0.0/16 means > 10.222.0.0-10.222.255.255. > > as-is : hosts in the range `10.222.0.0-15` and > to-be : hosts in the range `10.222.0.0-10.222.255.255` and
[GitHub] [hadoop] steveloughran commented on pull request #2775: MAPREDUCE-7329: HadoopPipes task may fail when linux kernel upgrade from 3.x to 4.x
steveloughran commented on pull request #2775: URL: https://github.com/apache/hadoop/pull/2775#issuecomment-801006630 Never seen this code before so I'm not really in a position to review. Just trying to revise my sockets API knowledge, which dates from when I was writing Windows 3.1 code and hasn't been refreshed much, not since HTTP came along This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] tomscut commented on pull request #2770: HDFS-15892. Add metric for editPendingQ in FSEditLogAsync
tomscut commented on pull request #2770: URL: https://github.com/apache/hadoop/pull/2770#issuecomment-801002600 Hi @daryn-sharp @umamaheswararao , could you please help review the code?
[jira] [Work logged] (HADOOP-17476) ITestAssumeRole.testAssumeRoleBadInnerAuth failure
[ https://issues.apache.org/jira/browse/HADOOP-17476?focusedWorklogId=567577&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567577 ] ASF GitHub Bot logged work on HADOOP-17476: --- Author: ASF GitHub Bot Created on: 17/Mar/21 11:11 Start Date: 17/Mar/21 11:11 Worklog Time Spent: 10m Work Description: steveloughran commented on a change in pull request #2777: URL: https://github.com/apache/hadoop/pull/2777#discussion_r595920411 ## File path: hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/auth/ITestAssumeRole.java ## @@ -255,8 +255,7 @@ public void testAssumeRoleBadInnerAuth() throws Exception { conf.set(SECRET_KEY, "not secret"); expectFileSystemCreateFailure(conf, AWSBadRequestException.class, -"not a valid " + -"key=value pair (missing equal-sign) in Authorization header"); +"IncompleteSignature"); Review comment: yes. They've changed the string, and rather than try and keep up with whatever error message comes back, I'll just remove the match. We had to do the same with some of the assumed role tests a few releases back when they changed the max life of a role token from 60 minutes to 12h. Issue Time Tracking --- Worklog Id: (was: 567577) Time Spent: 1h (was: 50m) > ITestAssumeRole.testAssumeRoleBadInnerAuth failure > -- > > Key: HADOOP-17476 > URL: https://issues.apache.org/jira/browse/HADOOP-17476 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3, test >Affects Versions: 3.3.0, 3.3.1 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Time Spent: 1h > Remaining Estimate: 0h > > failure of test {{ITestAssumeRole.testAssumeRoleBadInnerAuth}} where a > failure was expected, but the error text was wrong.
> Either STS has changed its error text or something is changing where the > failure happens. > Given the nature of the test, it may be simplest to keep the expectation of > an FS init failure, but remove the text match
[GitHub] [hadoop] steveloughran commented on a change in pull request #2777: HADOOP-17476. ITestAssumeRole.testAssumeRoleBadInnerAuth failure.
steveloughran commented on a change in pull request #2777: URL: https://github.com/apache/hadoop/pull/2777#discussion_r595920411 ## File path: hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/auth/ITestAssumeRole.java ## @@ -255,8 +255,7 @@ public void testAssumeRoleBadInnerAuth() throws Exception { conf.set(SECRET_KEY, "not secret"); expectFileSystemCreateFailure(conf, AWSBadRequestException.class, -"not a valid " + -"key=value pair (missing equal-sign) in Authorization header"); +"IncompleteSignature"); Review comment: yes. They've changed the string, and rather than try and keep up with whatever error message comes back, I'll just remove the match. We had to do the same with some of the assumed role tests a few releases back when they changed the max life of a role token from 60 minutes to 12h.
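The pattern in this review — assert only on a stable error token such as the AWS error code "IncompleteSignature", rather than on the full human-readable message that services rewrite between releases — can be sketched as follows. The helper and its name are hypothetical illustrations, not Hadoop's actual expectFileSystemCreateFailure implementation:

```java
public class StableErrorMatch {

    // Assert that the action fails, and that its message contains the stable
    // token; everything else about the message text is deliberately ignored.
    public static void expectFailureContaining(Runnable action, String stableToken) {
        try {
            action.run();
        } catch (RuntimeException e) {
            if (e.getMessage() != null && e.getMessage().contains(stableToken)) {
                return;  // failed in the expected way
            }
            throw new AssertionError("wrong error text: " + e.getMessage(), e);
        }
        throw new AssertionError("expected a failure containing: " + stableToken);
    }

    public static void main(String[] args) {
        // The text after the error code may change between service releases;
        // only the leading code is relied on.
        expectFailureContaining(() -> {
            throw new RuntimeException("IncompleteSignature: bad Authorization header");
        }, "IncompleteSignature");
        System.out.println("ok");
    }
}
```

The trade-off is the one named in the review: a looser match survives wording changes upstream, at the cost of asserting less about the exact failure.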
[jira] [Work logged] (HADOOP-16819) Possible inconsistent state of AbstractDelegationTokenSecretManager
[ https://issues.apache.org/jira/browse/HADOOP-16819?focusedWorklogId=567575&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567575 ] ASF GitHub Bot logged work on HADOOP-16819: --- Author: ASF GitHub Bot Created on: 17/Mar/21 11:10 Start Date: 17/Mar/21 11:10 Worklog Time Spent: 10m Work Description: steveloughran commented on pull request #1894: URL: https://github.com/apache/hadoop/pull/1894#issuecomment-800997098 OK, now I understand why we need this and why the PR does the right thing. Can you do a rebase and forced push of the PR to kick off yetus again? Issue Time Tracking --- Worklog Id: (was: 567575) Time Spent: 1h (was: 50m) > Possible inconsistent state of AbstractDelegationTokenSecretManager > --- > > Key: HADOOP-16819 > URL: https://issues.apache.org/jira/browse/HADOOP-16819 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3, security >Affects Versions: 3.3.0 >Reporter: Hankó Gergely >Assignee: Hankó Gergely >Priority: Major > Labels: pull-request-available > Attachments: HADOOP-16819.001.patch > > Time Spent: 1h > Remaining Estimate: 0h > > [AbstractDelegationTokenSecretManager.updateCurrentKey|https://github.com/apache/hadoop/blob/581072a8f04f7568d3560f105fd1988d3acc9e54/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/AbstractDelegationTokenSecretManager.java#L360] > increments the current key id and creates the new delegation key in two > distinct synchronized blocks. > This means that other threads can see the class in an *inconsistent state, > where the key for the current key id doesn't exist (yet)*.
> For example the following method sometimes returns null when the token > remover thread is between the two synchronized blocks: > {noformat} > @Override > public DelegationKey getCurrentKey() { > return getDelegationKey(getCurrentKeyId()); > }{noformat} > > Also it is possible that updateCurrentKey is called from multiple threads at > the same time, so *distinct keys can be generated with the same key id*. > > This issue is suspected to be the cause of the intermittent failure of > [TestLlapSignerImpl.testSigning|https://github.com/apache/hive/blob/3c0705eaf5121c7b61f2dbe9db9545c3926f26f1/llap-server/src/test/org/apache/hadoop/hive/llap/security/TestLlapSignerImpl.java#L195] > - HIVE-22621.
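The race described in the issue — the key id incremented in one synchronized block and the matching key installed in a second, leaving a window where getCurrentKey() sees an id with no key — can be illustrated with a toy manager. The class below is my own sketch, not the Hadoop class; it shows the fixed pattern, where both steps happen under a single lock:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class KeyManager {
    private final Map<Integer, String> keys = new ConcurrentHashMap<>();
    private int currentId = 0;

    // Fixed pattern: increment the id and install the matching key in the
    // SAME synchronized block, so no reader can observe an id without its key.
    public synchronized void rollKey() {
        currentId++;
        keys.put(currentId, "secret-" + currentId);
    }

    // Analogue of getCurrentKey(): with the single-block fix this can never
    // return null once the first rollKey() has completed.
    public synchronized String getCurrentKey() {
        return keys.get(currentId);
    }

    public static void main(String[] args) throws InterruptedException {
        KeyManager m = new KeyManager();
        m.rollKey();
        Thread roller = new Thread(() -> {
            for (int i = 0; i < 10_000; i++) m.rollKey();
        });
        roller.start();
        for (int i = 0; i < 10_000; i++) {
            if (m.getCurrentKey() == null) throw new AssertionError("saw id without key");
        }
        roller.join();
        System.out.println("ok");
    }
}
```

With the two steps in separate synchronized blocks, a reader scheduled between them would hit exactly the null result the issue reports; serializing them also prevents two concurrent rollers from generating distinct keys for the same id.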
[GitHub] [hadoop] steveloughran commented on pull request #1894: HADOOP-16819 Possible inconsistent state of AbstractDelegationTokenSecretManager
steveloughran commented on pull request #1894: URL: https://github.com/apache/hadoop/pull/1894#issuecomment-800997098 OK, now I understand why we need this and why the PR does the right thing. Can you do a rebase and forced push of the PR to kick off yetus again?
[jira] [Work logged] (HADOOP-16819) Possible inconsistent state of AbstractDelegationTokenSecretManager
[ https://issues.apache.org/jira/browse/HADOOP-16819?focusedWorklogId=567573&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567573 ] ASF GitHub Bot logged work on HADOOP-16819: --- Author: ASF GitHub Bot Created on: 17/Mar/21 11:08 Start Date: 17/Mar/21 11:08 Worklog Time Spent: 10m Work Description: steveloughran commented on a change in pull request #1894: URL: https://github.com/apache/hadoop/pull/1894#discussion_r595918678 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/AbstractDelegationTokenSecretManager.java ## @@ -356,16 +356,14 @@ private void updateCurrentKey() throws IOException { int newCurrentId; synchronized (this) { newCurrentId = incrementCurrentKeyId(); -} -DelegationKey newKey = new DelegationKey(newCurrentId, System -.currentTimeMillis() -+ keyUpdateInterval + tokenMaxLifetime, generateSecret()); -//Log must be invoked outside the lock on 'this' -logUpdateMasterKey(newKey); -synchronized (this) { - currentKey = newKey; + currentKey = new DelegationKey(newCurrentId, System + .currentTimeMillis() + + keyUpdateInterval + tokenMaxLifetime, generateSecret()); + storeDelegationKey(currentKey); } +//Log must be invoked outside the lock on 'this' +logUpdateMasterKey(currentKey); Review comment: I see. Probably it's just the original order and then sync blocks went up around some of the ops Issue Time Tracking --- Worklog Id: (was: 567573) Time Spent: 50m (was: 40m)
[GitHub] [hadoop] steveloughran commented on a change in pull request #1894: HADOOP-16819 Possible inconsistent state of AbstractDelegationTokenSecretManager
steveloughran commented on a change in pull request #1894: URL: https://github.com/apache/hadoop/pull/1894#discussion_r595918678 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/AbstractDelegationTokenSecretManager.java ## @@ -356,16 +356,14 @@ private void updateCurrentKey() throws IOException { int newCurrentId; synchronized (this) { newCurrentId = incrementCurrentKeyId(); -} -DelegationKey newKey = new DelegationKey(newCurrentId, System -.currentTimeMillis() -+ keyUpdateInterval + tokenMaxLifetime, generateSecret()); -//Log must be invoked outside the lock on 'this' -logUpdateMasterKey(newKey); -synchronized (this) { - currentKey = newKey; + currentKey = new DelegationKey(newCurrentId, System + .currentTimeMillis() + + keyUpdateInterval + tokenMaxLifetime, generateSecret()); + storeDelegationKey(currentKey); } +//Log must be invoked outside the lock on 'this' +logUpdateMasterKey(currentKey); Review comment: I see. Probably it's just the original order and then sync blocks went up around some of the ops
[jira] [Work logged] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?focusedWorklogId=567559&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567559 ] ASF GitHub Bot logged work on HADOOP-17592: --- Author: ASF GitHub Bot Created on: 17/Mar/21 10:29 Start Date: 17/Mar/21 10:29 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #2780: URL: https://github.com/apache/hadoop/pull/2780#issuecomment-800971254 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 38s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 33s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 20s | | trunk passed | | +1 :green_heart: | shadedclient | 46m 52s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 51s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 1m 8s | | the patch passed | | +1 :green_heart: | shadedclient | 13m 59s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. 
| | | | 64m 43s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2780/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2780 | | JIRA Issue | HADOOP-17592 | | Optional Tests | dupname asflicense mvnsite codespell markdownlint | | uname | Linux 6b032e985acb 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 706083cb42ce8745365cdc6e023b443b551398da | | Max. process+thread count | 699 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2780/1/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. Issue Time Tracking --- Worklog Id: (was: 567559) Time Spent: 0.5h (was: 20m)
[jira] [Commented] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17303283 ] Hadoop QA commented on HADOOP-17592: +1 overall (Jira mirror of the Yetus report for PR-2780 above).
[GitHub] [hadoop] hadoop-yetus commented on pull request #2780: [HADOOP-17592] Fix the wrong CIDR range example.
hadoop-yetus commented on pull request #2780: URL: https://github.com/apache/hadoop/pull/2780#issuecomment-800971254 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 38s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 33s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 20s | | trunk passed | | +1 :green_heart: | shadedclient | 46m 52s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 51s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 1m 8s | | the patch passed | | +1 :green_heart: | shadedclient | 13m 59s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. | | | | 64m 43s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2780/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2780 | | JIRA Issue | HADOOP-17592 | | Optional Tests | dupname asflicense mvnsite codespell markdownlint | | uname | Linux 6b032e985acb 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 706083cb42ce8745365cdc6e023b443b551398da | | Max. process+thread count | 699 (vs. 
ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2780/1/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated.
[jira] [Work logged] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?focusedWorklogId=567538&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567538 ] ASF GitHub Bot logged work on HADOOP-17592: --- Author: ASF GitHub Bot Created on: 17/Mar/21 09:47 Start Date: 17/Mar/21 09:47 Worklog Time Spent: 10m Work Description: aajisaka commented on pull request #2780: URL: https://github.com/apache/hadoop/pull/2780#issuecomment-800945272 Nice catch! Issue Time Tracking --- Worklog Id: (was: 567538) Time Spent: 20m (was: 10m)
[GitHub] [hadoop] aajisaka commented on pull request #2780: [HADOOP-17592] Fix the wrong CIDR range example.
aajisaka commented on pull request #2780: URL: https://github.com/apache/hadoop/pull/2780#issuecomment-800945272 Nice catch!
[jira] [Updated] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-17592: Labels: pull-request-available (was: )
[jira] [Work logged] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?focusedWorklogId=567531&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567531 ] ASF GitHub Bot logged work on HADOOP-17592: --- Author: ASF GitHub Bot Created on: 17/Mar/21 09:23 Start Date: 17/Mar/21 09:23 Worklog Time Spent: 10m Work Description: nohkwangsun opened a new pull request #2780: URL: https://github.com/apache/hadoop/pull/2780 The CIDR range example on the Proxy user description page is wrong. In the Configurations section of the Proxy user page, CIDR format 10.222.0.0/16 means 10.222.0.0-15. But it's not true: the CIDR format 10.222.0.0/16 means 10.222.0.0-10.222.255.255. Issue Time Tracking --- Worklog Id: (was: 567531) Remaining Estimate: 0h Time Spent: 10m
[GitHub] [hadoop] nohkwangsun opened a new pull request #2780: [HADOOP-17592] Fix the wrong CIDR range example.
nohkwangsun opened a new pull request #2780: URL: https://github.com/apache/hadoop/pull/2780 The CIDR range example on the Proxy user description page is wrong. In the Configurations section of the Proxy user page, CIDR format 10.222.0.0/16 means 10.222.0.0-15. But it's not true: the CIDR format 10.222.0.0/16 means 10.222.0.0-10.222.255.255.
[jira] [Updated] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
[ https://issues.apache.org/jira/browse/HADOOP-17592?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kwangsun Noh updated HADOOP-17592:
--
Attachment: HADOOP-17592.patch

> Fix the wrong CIDR range example in Proxy User documentation
>
> Key: HADOOP-17592
> URL: https://issues.apache.org/jira/browse/HADOOP-17592
> Project: Hadoop Common
> Issue Type: Bug
> Components: documentation
> Affects Versions: 3.2.2
> Reporter: Kwangsun Noh
> Priority: Trivial
> Attachments: HADOOP-17592.patch
>
> The CIDR range example on the Proxy user documentation page is wrong.
> In the Configurations section of the Proxy user page, the CIDR 10.222.0.0/16
> is described as meaning the range 10.222.0.0-15.
> That is not correct: 10.222.0.0/16 means the range 10.222.0.0-10.222.255.255.
>
> as-is : hosts in the range `10.222.0.0-15` and
> to-be : hosts in the range `10.222.0.0-10.222.255.255` and
[jira] [Created] (HADOOP-17591) Fix the wrong CIDR range example in Proxy User documentation
Kwangsun Noh created HADOOP-17591:
-
Summary: Fix the wrong CIDR range example in Proxy User documentation
Key: HADOOP-17591
URL: https://issues.apache.org/jira/browse/HADOOP-17591
Project: Hadoop Common
Issue Type: Bug
Components: documentation
Affects Versions: 3.2.2
Reporter: Kwangsun Noh

The CIDR range example on the Proxy user documentation page is wrong. In the Configurations section of the Proxy user page, the CIDR 10.222.0.0/16 is described as meaning the range 10.222.0.0-15. That is not correct: 10.222.0.0/16 means the range 10.222.0.0-10.222.255.255.

as-is : 10.222.0.0-15
to-be : 10.222.0.0-10.222.255.255
[jira] [Created] (HADOOP-17592) Fix the wrong CIDR range example in Proxy User documentation
Kwangsun Noh created HADOOP-17592:
-
Summary: Fix the wrong CIDR range example in Proxy User documentation
Key: HADOOP-17592
URL: https://issues.apache.org/jira/browse/HADOOP-17592
Project: Hadoop Common
Issue Type: Bug
Components: documentation
Affects Versions: 3.2.2
Reporter: Kwangsun Noh

The CIDR range example on the Proxy user documentation page is wrong. In the Configurations section of the Proxy user page, the CIDR 10.222.0.0/16 is described as meaning the range 10.222.0.0-15. That is not correct: 10.222.0.0/16 means the range 10.222.0.0-10.222.255.255.

as-is : hosts in the range `10.222.0.0-15` and
to-be : hosts in the range `10.222.0.0-10.222.255.255` and
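The corrected range in the report above can be verified mechanically: a /16 prefix fixes the first 16 bits and leaves the remaining 16 free, so 10.222.0.0/16 spans 10.222.0.0 through 10.222.255.255. A minimal sketch of that expansion (the class name and helpers here are illustrative, not part of the patch):

```java
public class CidrRange {

    /** Parse a dotted-quad IPv4 address into a 32-bit int. */
    static int toInt(String ip) {
        String[] o = ip.split("\\.");
        return (Integer.parseInt(o[0]) << 24) | (Integer.parseInt(o[1]) << 16)
                | (Integer.parseInt(o[2]) << 8) | Integer.parseInt(o[3]);
    }

    /** Render a 32-bit int back as a dotted-quad string. */
    static String toIp(int v) {
        return (v >>> 24) + "." + ((v >>> 16) & 0xff) + "."
                + ((v >>> 8) & 0xff) + "." + (v & 0xff);
    }

    /** Return the first and last address covered by a CIDR such as "10.222.0.0/16". */
    public static String[] range(String cidr) {
        String[] parts = cidr.split("/");
        int prefix = Integer.parseInt(parts[1]);
        // A /N prefix keeps the top N bits; shifting by 32 is a no-op in Java,
        // so /0 (match everything) needs a special case.
        int mask = prefix == 0 ? 0 : -1 << (32 - prefix);
        int lo = toInt(parts[0]) & mask;   // network address: host bits cleared
        int hi = lo | ~mask;               // broadcast address: host bits set
        return new String[] { toIp(lo), toIp(hi) };
    }

    public static void main(String[] args) {
        String[] r = range("10.222.0.0/16");
        System.out.println(r[0] + " - " + r[1]); // prints 10.222.0.0 - 10.222.255.255
    }
}
```

Running this against the example in the docs shows why "10.222.0.0-15" is wrong: a /16 frees two whole octets, while a range ending in ".0-15" would correspond to a /28.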
[jira] [Work logged] (HADOOP-17476) ITestAssumeRole.testAssumeRoleBadInnerAuth failure
[ https://issues.apache.org/jira/browse/HADOOP-17476?focusedWorklogId=567463&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-567463 ]

ASF GitHub Bot logged work on HADOOP-17476:
---
Author: ASF GitHub Bot
Created on: 17/Mar/21 07:12
Start Date: 17/Mar/21 07:12
Worklog Time Spent: 10m

Work Description: mukund-thakur commented on a change in pull request #2777:
URL: https://github.com/apache/hadoop/pull/2777#discussion_r595760826

## File path: hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/auth/ITestAssumeRole.java

## @@ -255,8 +255,7 @@ public void testAssumeRoleBadInnerAuth() throws Exception {
     conf.set(SECRET_KEY, "not secret");
     expectFileSystemCreateFailure(conf,
         AWSBadRequestException.class,
-        "not a valid " +
-        "key=value pair (missing equal-sign) in Authorization header");
+        "IncompleteSignature");

Review comment: As per the commit message "Removes string match so change in AWS S3 error message doesn't cause the test to fail", the complete string matching is to be removed, right?

Issue Time Tracking
---
Worklog Id: (was: 567463)
Time Spent: 50m (was: 40m)

> ITestAssumeRole.testAssumeRoleBadInnerAuth failure
> --
>
> Key: HADOOP-17476
> URL: https://issues.apache.org/jira/browse/HADOOP-17476
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3, test
> Affects Versions: 3.3.0, 3.3.1
> Reporter: Steve Loughran
> Assignee: Steve Loughran
> Priority: Major
> Labels: pull-request-available
> Time Spent: 50m
> Remaining Estimate: 0h
>
> Failure of test {{ITestAssumeRole.testAssumeRoleBadInnerAuth}} where a
> failure was expected, but the error text was wrong.
> Either STS has changed its error text or something is changing where the
> failure happens.
> Given the nature of the test, it may be simplest to keep the expectation of
> an FS init failure, but remove the text match.
[GitHub] [hadoop] mukund-thakur commented on a change in pull request #2777: HADOOP-17476. ITestAssumeRole.testAssumeRoleBadInnerAuth failure.
mukund-thakur commented on a change in pull request #2777:
URL: https://github.com/apache/hadoop/pull/2777#discussion_r595760826

## File path: hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/auth/ITestAssumeRole.java

## @@ -255,8 +255,7 @@ public void testAssumeRoleBadInnerAuth() throws Exception {
     conf.set(SECRET_KEY, "not secret");
     expectFileSystemCreateFailure(conf,
         AWSBadRequestException.class,
-        "not a valid " +
-        "key=value pair (missing equal-sign) in Authorization header");
+        "IncompleteSignature");

Review comment: As per the commit message "Removes string match so change in AWS S3 error message doesn't cause the test to fail", the complete string matching is to be removed, right?
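The patch under review swaps a full sentence of AWS error text for the short, stable error code "IncompleteSignature". A hypothetical sketch of why that matters (the helper, exception, and messages below are stand-ins for the real expectFileSystemCreateFailure machinery, not ITestAssumeRole code):

```java
public class ErrorTextMatch {

    /** Run an action that must fail, and check the failure text for a substring. */
    static void expectFailure(Runnable action, String expectedSubstring) {
        try {
            action.run();
            throw new AssertionError("expected the action to fail");
        } catch (RuntimeException e) {
            if (!e.getMessage().contains(expectedSubstring)) {
                throw new AssertionError("wrong failure text: " + e.getMessage());
            }
        }
    }

    public static void main(String[] args) {
        // Simulated service error: a stable code followed by wording the
        // provider may reword at any time.
        Runnable badAuth = () -> {
            throw new RuntimeException(
                "IncompleteSignature: not a valid key=value pair in Authorization header");
        };
        // Matching only the stable code keeps the test green across rewordings;
        // matching the full human-readable sentence would break when STS changes it.
        expectFailure(badAuth, "IncompleteSignature");
        System.out.println("ok");
    }
}
```

This is the trade-off the reviewer is probing: the Jira suggests dropping the text match entirely, while the patch still matches the error code, which is a contract the service is far less likely to change than its prose.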