[jira] [Commented] (HADOOP-18468) upgrade jettison json jar due to fix CVE-2022-40149
[ https://issues.apache.org/jira/browse/HADOOP-18468?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613873#comment-17613873 ]

ASF GitHub Bot commented on HADOOP-18468:
-----------------------------------------

hadoop-yetus commented on PR #4937:
URL: https://github.com/apache/hadoop/pull/4937#issuecomment-1271141303

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 1m 24s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s | | xmllint was not available. |
| +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 30s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 26m 45s | | trunk passed |
| +1 :green_heart: | compile | 25m 27s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 22m 21s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 4m 49s | | trunk passed |
| +1 :green_heart: | mvnsite | 19m 59s | | trunk passed |
| +1 :green_heart: | javadoc | 8m 16s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 7m 33s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +0 :ok: | spotbugs | 0m 27s | | branch/hadoop-project no spotbugs output file (spotbugsXml.xml) |
| +1 :green_heart: | shadedclient | 53m 33s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 23m 21s | | the patch passed |
| +1 :green_heart: | compile | 22m 53s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 22m 53s | | the patch passed |
| +1 :green_heart: | compile | 20m 56s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 20m 56s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 4m 2s | | the patch passed |
| +1 :green_heart: | mvnsite | 19m 11s | | the patch passed |
| +1 :green_heart: | shellcheck | 0m 0s | | No new issues. |
| +1 :green_heart: | javadoc | 7m 58s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 7m 29s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +0 :ok: | spotbugs | 0m 26s | | hadoop-project has no data from spotbugs |
| +1 :green_heart: | shadedclient | 53m 47s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 820m 9s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4937/5/artifact/out/patch-unit-root.txt) | root in the patch passed. |
| +1 :green_heart: | asflicense | 2m 18s | | The patch does not generate ASF License warnings. |
| | | | 1179m 10s | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.TestRollingUpgrade |
| | hadoop.yarn.server.applicationhistoryservice.webapp.TestAHSWebServices |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4937/5/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4937 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle shellcheck shelldocs |
| uname | Linux c43a58c077e4 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / e72e25a79b900340e25c26ddf96b82657a847d5c |
| Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
[jira] [Commented] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613870#comment-17613870 ]

ASF GitHub Bot commented on HADOOP-18304:
-----------------------------------------

mehakmeet commented on code in PR #4478:
URL: https://github.com/apache/hadoop/pull/4478#discussion_r989684378

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md:

@@ -356,19 +351,20 @@ task commit.

 However, it has extra requirements of the filesystem

-1. [Obsolete] It requires a consistent object store.
+1. The object store must be consistent.
 1. The S3A client must be configured to recognize interactions
-with the magic directories and treat them specially.
+with the magic directories and treat them as a special case.

-Now that Amazon S3 is consistent, the magic committer is enabled by default.
+Now that [Amazon S3 is consistent](https://aws.amazon.com/s3/consistency/),
+the magic committer is enabled by default.

Review Comment: File committer is still the default committer, right? So what does magic committer being "enabled by default" here mean?

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md:

@@ -530,18 +527,22 @@ performance.

Review Comment: nit: blank lines

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md:

@@ -180,20 +180,20 @@ and restarting the job.

 whose output is in the job attempt directory,
 *and only rerunning all uncommitted tasks*.

-This algorithm does not works safely or swiftly with AWS S3 storage because
-tenames go from being fast, atomic operations to slow operations which can fail partway through.
+This algorithm does not work safely or swiftly with AWS S3 storage because
+renames go from being fast, atomic operations to slow operations which can fail partway through.

 This then is the problem which the S3A committers address:

-*How to safely and reliably commit work to Amazon S3 or compatible object store*
+>*How to safely and reliably commit work to Amazon S3 or compatible object store.*

Review Comment: What are we quoting here?

> Improve S3A committers documentation clarity
>
> Key: HADOOP-18304
> URL: https://issues.apache.org/jira/browse/HADOOP-18304
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: documentation
> Reporter: Daniel Carl Jones
> Assignee: Daniel Carl Jones
> Priority: Trivial
> Labels: pull-request-available
> Time Spent: 2.5h
> Remaining Estimate: 0h
>
> I recently was learning more about the S3A committers. I'm hoping to provide
> some improvements as someone who has recently read [this documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495]
> without fully understanding prior.
> For instance, referencing different components more explicitly and adding pre-requisite info.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
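Editor's note on the "enabled by default" question in the review thread: in recent Hadoop releases the S3A filesystem's *support* for magic-committer paths defaults to on, while the committer a job actually uses is still selected by configuration. A minimal, illustrative `core-site.xml` fragment (standard `fs.s3a.*` keys; the values shown are one possible choice, not the patch itself):

```xml
<!-- Illustrative S3A committer settings; not taken from the PR under review. -->
<property>
  <!-- Filesystem support for "magic" paths; defaults to true in recent releases. -->
  <name>fs.s3a.committer.magic.enabled</name>
  <value>true</value>
</property>
<property>
  <!-- Which committer a job uses; the default is "file", so "magic" must be
       selected explicitly even when magic-path support is enabled. -->
  <name>fs.s3a.committer.name</name>
  <value>magic</value>
</property>
```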
[GitHub] [hadoop] hadoop-yetus commented on pull request #4979: HDFS-16795: use secure XML parsers
hadoop-yetus commented on PR #4979:
URL: https://github.com/apache/hadoop/pull/4979#issuecomment-1271120975

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 18m 28s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 35s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 28m 40s | | trunk passed |
| +1 :green_heart: | compile | 6m 53s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 6m 39s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 31s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 44s | | trunk passed |
| +1 :green_heart: | javadoc | 2m 4s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 2m 28s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 6m 25s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 59s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 24s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 15s | | the patch passed |
| +1 :green_heart: | compile | 6m 43s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 6m 43s | | the patch passed |
| +1 :green_heart: | compile | 6m 14s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 6m 14s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 15s | | the patch passed |
| +1 :green_heart: | mvnsite | 2m 26s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 38s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 2m 8s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 6m 23s | | the patch passed |
| +1 :green_heart: | shadedclient | 26m 39s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 28s | | hadoop-hdfs-client in the patch passed. |
| -1 :x: | unit | 355m 53s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4979/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 56s | | The patch does not generate ASF License warnings. |
| | | | 530m 58s | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4979/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4979 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 8b18e37bd995 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 8767e6d9cd50da7f2729d9c5a72a901959b74a1f |
| Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4979/1/testReport/ |
| Max. process+thread count | 2119 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs U:
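Editor's note: HDFS-16795 is one of a series of "use secure XML parsers" changes. As background, the usual JAXP hardening against XXE and entity-expansion attacks looks roughly like the sketch below. This is a generic illustration of the pattern (class name and structure are my own), not the code in the actual patch:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public class SecureXml {
    // Build a DocumentBuilderFactory hardened against XXE:
    // no DOCTYPEs, no external entities, no XInclude, no entity expansion.
    public static DocumentBuilderFactory secureFactory() throws ParserConfigurationException {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setFeature(XMLConstants.FEATURE_SECURE_PROCESSING, true);
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        dbf.setXIncludeAware(false);
        dbf.setExpandEntityReferences(false);
        return dbf;
    }

    public static void main(String[] args) throws Exception {
        // Ordinary, entity-free documents still parse normally.
        byte[] doc = "<conf><prop>value</prop></conf>".getBytes(StandardCharsets.UTF_8);
        String root = secureFactory().newDocumentBuilder()
                .parse(new ByteArrayInputStream(doc))
                .getDocumentElement().getTagName();
        System.out.println(root); // prints "conf"
    }
}
```

With this configuration, a document carrying a `<!DOCTYPE …>` declaration is rejected at parse time instead of being allowed to reference external entities.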
[jira] [Commented] (HADOOP-18437) Fix typo in class StartEndTImesBase
[ https://issues.apache.org/jira/browse/HADOOP-18437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613861#comment-17613861 ]

ASF GitHub Bot commented on HADOOP-18437:
-----------------------------------------

cbaenziger commented on PR #4894:
URL: https://github.com/apache/hadoop/pull/4894#issuecomment-1271101107

This looks like a good typo fix to me. I don't see any instance of `slowTaskRelativeTresholds` elsewhere in the Hadoop code base.

> Fix typo in class StartEndTImesBase
>
> Key: HADOOP-18437
> URL: https://issues.apache.org/jira/browse/HADOOP-18437
> Project: Hadoop Common
> Issue Type: Improvement
> Affects Versions: 3.3.4
> Reporter: Samrat Deb
> Priority: Minor
> Labels: newbie, pull-request-available
>
> While going through the code, found a typo in a variable name:
> - +slowTaskRelativeTresholds+ is spelled wrong and can be fixed to +slowTaskRelativeThresholds+
[GitHub] [hadoop] hadoop-yetus commented on pull request #4982: YARN-11332. [Federation] Improve FederationClientInterceptor#ThreadPool thread pool configuration.
hadoop-yetus commented on PR #4982:
URL: https://github.com/apache/hadoop/pull/4982#issuecomment-1271100867

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 44s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 16m 11s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 27m 4s | | trunk passed |
| +1 :green_heart: | compile | 10m 17s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 9m 49s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 2m 10s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 27s | | trunk passed |
| +1 :green_heart: | javadoc | 2m 23s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 2m 13s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 54s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 5s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 27s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 17s | | the patch passed |
| +1 :green_heart: | compile | 10m 11s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| -1 :x: | javac | 10m 11s | [/results-compile-javac-hadoop-yarn-project_hadoop-yarn-jdkUbuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4982/1/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn-jdkUbuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04.txt) | hadoop-yarn-project_hadoop-yarn-jdkUbuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 generated 3 new + 731 unchanged - 1 fixed = 734 total (was 732) |
| +1 :green_heart: | compile | 9m 19s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| -1 :x: | javac | 9m 19s | [/results-compile-javac-hadoop-yarn-project_hadoop-yarn-jdkPrivateBuild-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4982/1/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn-jdkPrivateBuild-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-project_hadoop-yarn-jdkPrivateBuild-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 generated 3 new + 641 unchanged - 1 fixed = 644 total (was 642) |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 58s | | the patch passed |
| +1 :green_heart: | mvnsite | 2m 5s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 47s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 1m 47s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| -1 :x: | spotbugs | 1m 25s | [/new-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4982/1/artifact/out/new-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.html) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) |
| +1 :green_heart: | shadedclient | 22m 7s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 1m 30s | [/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-api.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4982/1/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-api.txt) | hadoop-yarn-api in the patch passed. |
| +1 :green_heart: | unit | 4m 16s | | hadoop-yarn-server-router in the patch
[GitHub] [hadoop] mccormickt12 commented on a diff in pull request #4967: HDFS-16791 WIP - client protocol and Filesystem apis implemented and …
mccormickt12 commented on code in PR #4967:
URL: https://github.com/apache/hadoop/pull/4967#discussion_r989668538

## hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/fs/viewfs/TestViewFileSystemHdfs.java:

@@ -506,4 +508,29 @@ public void testInternalDirectoryPermissions() throws IOException {
     assertEquals(fs.getFileStatus(subDirOfInternalDir).getPermission(),
         fs.getFileStatus(subDirOfRealDir).getPermission());
   }
+
+  @Test
+  public void testEnclosingRootsBase() throws Exception {
+    final Path zone = new Path("/data/EZ");
+    fsTarget.mkdirs(zone);
+    final Path zone1 = new Path("/data/EZ/zone1");
+    fsTarget.mkdirs(zone1);
+
+    DFSTestUtil.createKey("test_key", cluster, 0, CONF);
+    HdfsAdmin hdfsAdmin = new HdfsAdmin(cluster.getURI(0), CONF);
+    final EnumSet<CreateEncryptionZoneFlag> provisionTrash =
+        EnumSet.of(CreateEncryptionZoneFlag.PROVISION_TRASH);
+    hdfsAdmin.createEncryptionZone(zone1, "test_key", provisionTrash);
+    RemoteIterator<EncryptionZone> zones = hdfsAdmin.listEncryptionZones();
+    assertEquals(fsView.getEnclosingRoot(zone), new Path("/data"));
+    assertEquals(fsView.getEnclosingRoot(zone1), zone1);
+
+    Path nn02Ez = new Path("/mountOnNn2/EZ");
+    fsTarget2.mkdirs(nn02Ez);
+    assertEquals(fsView.getEnclosingRoot((nn02Ez)), new Path("/mountOnNn2"));
+    HdfsAdmin hdfsAdmin2 = new HdfsAdmin(cluster.getURI(1), CONF);
+    DFSTestUtil.createKey("test_key", cluster, 1, CONF);
+    hdfsAdmin2.createEncryptionZone(nn02Ez, "test_key", provisionTrash);
+    assertEquals(fsView.getEnclosingRoot((nn02Ez)), nn02Ez);
+  }

Review Comment: I don't think an encryption zone can span multiple mount points

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
[GitHub] [hadoop] mccormickt12 commented on a diff in pull request #4967: HDFS-16791 WIP - client protocol and Filesystem apis implemented and …
mccormickt12 commented on code in PR #4967:
URL: https://github.com/apache/hadoop/pull/4967#discussion_r989666424

## hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/RouterClientProtocol.java:

@@ -1935,6 +1935,21 @@ public DatanodeInfo[] getSlowDatanodeReport() throws IOException {
     return rpcServer.getSlowDatanodeReport(true, 0);
   }

+  @Override
+  public String getEnclosingRoot(String src) throws IOException {
+    Path mountPath = new Path("/");
+    if (subclusterResolver instanceof MountTableResolver) {
+      MountTableResolver mountTable = (MountTableResolver) subclusterResolver;
+      if (mountTable.getMountPoint(src) != null) {
+        // unclear if this is the correct thing to do, probably depends on default mount point / link fallback
+        mountPath = new Path(mountTable.getMountPoint(src).getSourcePath());

Review Comment: yeah we check that in the if condition above

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
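Editor's note: the exchange above is about resolving a path's enclosing root against the mount table. The lookup pattern can be sketched with a toy resolver; the class below is a stand-in of my own (not the real `MountTableResolver` API), and it resolves the mount point once and reuses the result rather than calling the resolver twice as in the snippet under review:

```java
import java.util.Map;
import java.util.TreeMap;

public class EnclosingRootSketch {
    // Minimal stand-in for a mount table: maps mount points to source paths.
    // Illustrative only; not the RBF MountTableResolver interface.
    private final TreeMap<String, String> mounts = new TreeMap<>();

    public void addMount(String mountPoint, String sourcePath) {
        mounts.put(mountPoint, sourcePath);
    }

    // Look up the deepest mount point that prefixes src, falling back to "/"
    // when nothing matches. The lookup result is fetched once and reused.
    public String getEnclosingRoot(String src) {
        Map.Entry<String, String> entry = mounts.floorEntry(src);
        if (entry != null && src.startsWith(entry.getKey())) {
            return entry.getKey();
        }
        return "/"; // no mount point covers src
    }
}
```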
[GitHub] [hadoop] hadoop-yetus commented on pull request #4981: YARN-11330: use secure XML parsers
hadoop-yetus commented on PR #4981:
URL: https://github.com/apache/hadoop/pull/4981#issuecomment-1271018309

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 48s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 16 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 24s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 28m 42s | | trunk passed |
| +1 :green_heart: | compile | 10m 34s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 9m 11s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 2m 4s | | trunk passed |
| +1 :green_heart: | mvnsite | 3m 27s | | trunk passed |
| +1 :green_heart: | javadoc | 3m 3s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 2m 47s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 5m 30s | | trunk passed |
| +1 :green_heart: | shadedclient | 24m 22s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 25s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 4s | | the patch passed |
| +1 :green_heart: | compile | 9m 52s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 9m 52s | | the patch passed |
| +1 :green_heart: | compile | 8m 59s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 8m 59s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 53s | | the patch passed |
| +1 :green_heart: | mvnsite | 3m 8s | | the patch passed |
| +1 :green_heart: | javadoc | 2m 37s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 2m 30s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 5m 36s | | the patch passed |
| +1 :green_heart: | shadedclient | 24m 27s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 103m 17s | | hadoop-yarn-server-resourcemanager in the patch passed. |
| -1 :x: | unit | 24m 5s | [/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4981/1/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager.txt) | hadoop-yarn-server-nodemanager in the patch passed. |
| +1 :green_heart: | unit | 28m 25s | | hadoop-yarn-client in the patch passed. |
| +1 :green_heart: | asflicense | 1m 7s | | The patch does not generate ASF License warnings. |
| | | | 327m 56s | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.yarn.server.nodemanager.webapp.dao.gpu.TestGpuDeviceInformationParser |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4981/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4981 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 1983159e8c5f 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 976eb867c808e5b11c304c57f74d02d88e91e316 |
| Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Test
[GitHub] [hadoop] slfan1989 opened a new pull request, #4982: YARN-11332. [Federation] Improve FederationClientInterceptor#ThreadPool thread pool configuration.
slfan1989 opened a new pull request, #4982: URL: https://github.com/apache/hadoop/pull/4982 JIRA: YARN-11332. [Federation] Improve FederationClientInterceptor#ThreadPool thread pool configuration. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4980: MAPREDUCE-7411: use secure XML parsers
hadoop-yetus commented on PR #4980: URL: https://github.com/apache/hadoop/pull/4980#issuecomment-1270992591 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 54s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 12 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 16m 3s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 28m 53s | | trunk passed | | +1 :green_heart: | compile | 2m 48s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 2m 29s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 19s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 25s | | trunk passed | | +1 :green_heart: | javadoc | 2m 46s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 2m 43s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 5m 24s | | trunk passed | | +1 :green_heart: | shadedclient | 21m 40s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 31s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 17s | | the patch passed | | +1 :green_heart: | compile | 2m 31s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 2m 31s | | the patch passed | | +1 :green_heart: | compile | 2m 14s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 2m 14s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 56s | | hadoop-mapreduce-project/hadoop-mapreduce-client: The patch generated 0 new + 32 unchanged - 1 fixed = 32 total (was 33) | | +1 :green_heart: | mvnsite | 2m 27s | | the patch passed | | +1 :green_heart: | javadoc | 1m 45s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 1m 41s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 4m 36s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 20s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 7m 15s | | hadoop-mapreduce-client-core in the patch passed. | | +1 :green_heart: | unit | 8m 51s | | hadoop-mapreduce-client-app in the patch passed. | | +1 :green_heart: | unit | 5m 0s | | hadoop-mapreduce-client-hs in the patch passed. | | +1 :green_heart: | unit | 140m 33s | | hadoop-mapreduce-client-jobclient in the patch passed. | | +1 :green_heart: | asflicense | 1m 1s | | The patch does not generate ASF License warnings. 
| | | | 293m 16s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4980/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4980 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 639dd1f5bc01 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c2a34a2cd432555fa79e050f1b3eaf774a233bcd | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4980/1/testReport/ | | Max. process+thread count | 1613 (vs. ulimit of 5500) | | modules | C:
[GitHub] [hadoop] hadoop-yetus commented on pull request #4775: YARN-11260. Upgrade JUnit from 4 to 5 in hadoop-yarn-server-timelineservice
hadoop-yetus commented on PR #4775: URL: https://github.com/apache/hadoop/pull/4775#issuecomment-1270936981 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 54s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 17 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 41m 26s | | trunk passed | | +1 :green_heart: | compile | 0m 35s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 0m 33s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 0m 35s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 39s | | trunk passed | | +1 :green_heart: | javadoc | 0m 42s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 0m 31s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 1m 6s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 40s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 31s | | the patch passed | | +1 :green_heart: | compile | 0m 25s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 0m 25s | | the patch passed | | +1 :green_heart: | compile | 0m 23s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 23s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 19s | | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice: The patch generated 0 new + 5 unchanged - 1 fixed = 5 total (was 6) | | +1 :green_heart: | mvnsite | 0m 25s | | the patch passed | | +1 :green_heart: | javadoc | 0m 23s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 0m 22s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 0m 53s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 17s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 1m 43s | | hadoop-yarn-server-timelineservice in the patch passed. | | +1 :green_heart: | asflicense | 0m 39s | | The patch does not generate ASF License warnings. 
| | | | 101m 56s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4775/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4775 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle | | uname | Linux ca506af90d40 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / bcbf519a0b5f2af8fe60d6212ea7449cd403263c | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4775/3/testReport/ | | Max. process+thread count | 531 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4775/3/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0
[GitHub] [hadoop] cnauroth commented on pull request #4248: MAPREDUCE-7370. Parallelize MultipleOutputs#close call
cnauroth commented on PR #4248: URL: https://github.com/apache/hadoop/pull/4248#issuecomment-1270833272 I have merged this to trunk and branch-3.3 (after resolving a trivial merge conflict). @ashutoshcipher, thank you for the contribution. @steveloughran and @aajisaka, thank you for the code reviews.
[GitHub] [hadoop] cnauroth merged pull request #4248: MAPREDUCE-7370. Parallelize MultipleOutputs#close call
cnauroth merged PR #4248: URL: https://github.com/apache/hadoop/pull/4248
[jira] [Commented] (HADOOP-18469) Add XMLUtils methods to centralise code that creates secure XML parsers
[ https://issues.apache.org/jira/browse/HADOOP-18469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613802#comment-17613802 ] ASF GitHub Bot commented on HADOOP-18469: - hadoop-yetus commented on PR #4978: URL: https://github.com/apache/hadoop/pull/4978#issuecomment-1270739777 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 10m 8s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 5 new or modified test files. | _ branch-3.3 Compile Tests _ | | +0 :ok: | mvndep | 15m 13s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 26m 56s | | branch-3.3 passed | | +1 :green_heart: | compile | 19m 6s | | branch-3.3 passed | | +1 :green_heart: | checkstyle | 3m 15s | | branch-3.3 passed | | +1 :green_heart: | mvnsite | 2m 41s | | branch-3.3 passed | | +1 :green_heart: | javadoc | 1m 52s | | branch-3.3 passed | | +1 :green_heart: | spotbugs | 4m 2s | | branch-3.3 passed | | +1 :green_heart: | shadedclient | 26m 31s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 27s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 36s | | the patch passed | | +1 :green_heart: | compile | 18m 37s | | the patch passed | | +1 :green_heart: | javac | 18m 37s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 3m 7s | | root: The patch generated 0 new + 183 unchanged - 1 fixed = 183 total (was 184) | | +1 :green_heart: | mvnsite | 2m 40s | | the patch passed | | +1 :green_heart: | javadoc | 1m 42s | | the patch passed | | +1 :green_heart: | spotbugs | 4m 12s | | the patch passed | | +1 :green_heart: | shadedclient | 26m 52s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 17m 59s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4978/1/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | unit | 0m 55s | | hadoop-rumen in the patch passed. | | +1 :green_heart: | asflicense | 1m 8s | | The patch does not generate ASF License warnings. | | | | 192m 13s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.fs.shell.TestCopyToLocal | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4978/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4978 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux 2044ee1d8acc 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / df47f3e9cf4d3fae66bcfdfb36806eeef66f35ba | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4978/1/testReport/ | | Max. process+thread count | 3137 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-rumen U: . 
| | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4978/1/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Add XMLUtils methods to centralise code that creates secure XML parsers > --- > > Key: HADOOP-18469 > URL: https://issues.apache.org/jira/browse/HADOOP-18469 > Project: Hadoop Common > Issue Type:
[GitHub] [hadoop] hadoop-yetus commented on pull request #4978: HADOOP-18469. Add secure XML parser factories to XMLUtils (#4940)
[GitHub] [hadoop] hadoop-yetus commented on pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
hadoop-yetus commented on PR #4750: URL: https://github.com/apache/hadoop/pull/4750#issuecomment-1270724564 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 49s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 16m 11s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 26m 11s | | trunk passed | | +1 :green_heart: | compile | 6m 12s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 5m 54s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 34s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 50s | | trunk passed | | +1 :green_heart: | javadoc | 2m 43s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 3m 29s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 7m 26s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 32s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 43s | | the patch passed | | +1 :green_heart: | compile | 6m 5s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 6m 5s | | the patch passed | | +1 :green_heart: | compile | 5m 43s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 5m 43s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 13s | | hadoop-hdfs-project: The patch generated 0 new + 417 unchanged - 1 fixed = 417 total (was 418) | | +1 :green_heart: | mvnsite | 2m 59s | | the patch passed | | +1 :green_heart: | javadoc | 2m 8s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 2m 39s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | -1 :x: | spotbugs | 3m 27s | [/new-spotbugs-hadoop-hdfs-project_hadoop-hdfs.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4750/4/artifact/out/new-spotbugs-hadoop-hdfs-project_hadoop-hdfs.html) | hadoop-hdfs-project/hadoop-hdfs generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) | | +1 :green_heart: | shadedclient | 20m 24s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 36s | | hadoop-hdfs-client in the patch passed. | | -1 :x: | unit | 249m 24s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4750/4/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch failed. 
| | -1 :x: | unit | 6m 44s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4750/4/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) | hadoop-hdfs-httpfs in the patch failed. | | +1 :green_heart: | asflicense | 1m 12s | | The patch does not generate ASF License warnings. | | | | 410m 32s | | | | Reason | Tests | |---:|:--| | SpotBugs | module:hadoop-hdfs-project/hadoop-hdfs | | | There is an apparent infinite recursive loop in org.apache.hadoop.hdfs.web.JsonUtil.toJsonString(BlockLocation[]) At JsonUtil.java:recursive loop in org.apache.hadoop.hdfs.web.JsonUtil.toJsonString(BlockLocation[]) At JsonUtil.java:[line 712] | | Failed junit tests | hadoop.fs.TestSWebHdfsFileContextMainOperations | | | hadoop.hdfs.web.TestWebHdfsFileSystemContract | | | hadoop.hdfs.web.TestWebHDFS | | | hadoop.hdfs.web.TestWebHdfsUrl | | | hadoop.hdfs.server.namenode.ha.TestObserverNode | | |
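The SpotBugs "apparent infinite recursive loop" flagged above is the classic self-referential-overload mistake: an overload meant to delegate to a more general method ends up binding to itself. A minimal hypothetical sketch of the pattern and its fix (the names here are illustrative, not the actual `JsonUtil` code):

```java
// Illustrative only: shows the overload-recursion bug class SpotBugs reports
// (IL_INFINITE_RECURSIVE_LOOP), not the real org.apache.hadoop.hdfs.web.JsonUtil.
public class OverloadRecursion {

    // General overload: quote any object as a JSON string value.
    static String toJson(Object o) {
        return "\"" + o + "\"";
    }

    // BUGGY shape (shown disabled): the int[] argument binds to this same
    // overload, not toJson(Object), so the call never terminates:
    //   static String toJson(int[] a) { return toJson(a); }

    // Fixed shape: actually serialize the array's elements.
    static String toJson(int[] a) {
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < a.length; i++) {
            if (i > 0) sb.append(',');
            sb.append(a[i]);
        }
        return sb.append(']').toString();
    }

    public static void main(String[] args) {
        System.out.println(toJson(new int[] {1, 2, 3})); // array overload: [1,2,3]
        System.out.println(toJson("name"));              // Object overload: "name"
    }
}
```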
[GitHub] [hadoop] hadoop-yetus commented on pull request #4960: YARN-6766 HelperMethod added in AppsBlock class
hadoop-yetus commented on PR #4960: URL: https://github.com/apache/hadoop/pull/4960#issuecomment-1270701093 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 57s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 42m 44s | | trunk passed | | +1 :green_heart: | compile | 1m 18s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 1m 4s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 1s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 10s | | trunk passed | | +1 :green_heart: | javadoc | 1m 6s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 0m 50s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 19s | | trunk passed | | +1 :green_heart: | shadedclient | 24m 58s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 55s | | the patch passed | | +1 :green_heart: | compile | 1m 6s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 1m 6s | | the patch passed | | +1 :green_heart: | compile | 0m 55s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 55s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 44s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 57s | | the patch passed | | +1 :green_heart: | javadoc | 0m 45s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 0m 40s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 5s | | the patch passed | | +1 :green_heart: | shadedclient | 24m 46s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 117m 38s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 0m 40s | | The patch does not generate ASF License warnings. 
| | | | 227m 56s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4960/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4960 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux b2575655f508 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 1467b3b7dcafc953ebe8e531f4cd4b6468c5a4f6 | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4960/5/testReport/ | | Max. process+thread count | 893 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4960/5/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] pjfanning opened a new pull request, #4981: YARN-11330: use secure XML parsers
pjfanning opened a new pull request, #4981: URL: https://github.com/apache/hadoop/pull/4981 ### Description of PR ### How was this patch tested? ### For code changes: - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
[GitHub] [hadoop] pjfanning opened a new pull request, #4980: MAPREDUCE-7411: use secure XML parsers
pjfanning opened a new pull request, #4980: URL: https://github.com/apache/hadoop/pull/4980 ### Description of PR ### How was this patch tested? ### For code changes: - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
[GitHub] [hadoop] pjfanning opened a new pull request, #4979: HDFS-16795: use secure XML parsers
pjfanning opened a new pull request, #4979: URL: https://github.com/apache/hadoop/pull/4979

### Description of PR

Use XMLUtils to create XML parser factories

### How was this patch tested?

### For code changes:

- [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
[jira] [Commented] (HADOOP-18469) Add XMLUtils methods to centralise code that creates secure XML parsers
[ https://issues.apache.org/jira/browse/HADOOP-18469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613756#comment-17613756 ]

ASF GitHub Bot commented on HADOOP-18469:

steveloughran opened a new pull request, #4978: URL: https://github.com/apache/hadoop/pull/4978

Add to XMLUtils a set of methods to create secure XML Parsers/transformers, locking down DTD, schema, XXE exposure. Use these wherever XML parsers are created. Contributed by PJ Fanning

### Description of PR

#4940 merged to branch-3.3; minor import complaints around the move off guava

### How was this patch tested?

### For code changes:

- [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?

> Add XMLUtils methods to centralise code that creates secure XML parsers
> ---
>
> Key: HADOOP-18469
> URL: https://issues.apache.org/jira/browse/HADOOP-18469
> Project: Hadoop Common
> Issue Type: Improvement
> Reporter: PJ Fanning
> Priority: Major
> Labels: pull-request-available
>
> Relates to HDFS-16766
> There are other places in the code where DocumentBuilderFactory instances are created that could benefit from the same changes as HDFS-16766

--
This message was sent by Atlassian Jira (v8.20.10#820010)
[GitHub] [hadoop] steveloughran opened a new pull request, #4978: HADOOP-18469. Add secure XML parser factories to XMLUtils (#4940)
steveloughran opened a new pull request, #4978: URL: https://github.com/apache/hadoop/pull/4978

Add to XMLUtils a set of methods to create secure XML Parsers/transformers, locking down DTD, schema, XXE exposure. Use these wherever XML parsers are created. Contributed by PJ Fanning

### Description of PR

#4940 merged to branch-3.3; minor import complaints around the move off guava

### How was this patch tested?

### For code changes:

- [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
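The hardening the PR describes — locking down DTD processing, schemas, and XXE exposure — is typically applied through JAXP factory features. Below is a minimal sketch of such a helper, using only standard JAXP feature URIs; the class and method names are illustrative and not the actual XMLUtils API added in #4940:

```java
import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public final class SecureXmlSketch {

    private SecureXmlSketch() {
    }

    /**
     * Hypothetical helper: a DocumentBuilderFactory with DOCTYPE
     * declarations and external entity resolution disabled, which is the
     * standard mitigation for XXE attacks.
     */
    public static DocumentBuilderFactory newSecureFactory()
            throws ParserConfigurationException {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        // Reject any DOCTYPE declaration outright; this alone blocks
        // classic XXE and billion-laughs payloads.
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        // Belt and braces: also disable external general/parameter entities.
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        dbf.setXIncludeAware(false);
        dbf.setExpandEntityReferences(false);
        dbf.setFeature(XMLConstants.FEATURE_SECURE_PROCESSING, true);
        return dbf;
    }

    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory dbf = newSecureFactory();
        System.out.println(dbf.getFeature(XMLConstants.FEATURE_SECURE_PROCESSING));
    }
}
```

Centralising this in one utility is exactly what avoids the situation the Jira describes, where each call site builds its own (possibly insecure) factory.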
[jira] [Commented] (HADOOP-18469) Add XMLUtils methods to centralise code that creates secure XML parsers
[ https://issues.apache.org/jira/browse/HADOOP-18469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613751#comment-17613751 ]

ASF GitHub Bot commented on HADOOP-18469:

steveloughran commented on PR #4940: URL: https://github.com/apache/hadoop/pull/4940#issuecomment-1270515281

Merged to trunk; will see the build in 3.3. Changed the title of the commit as (a) the centralisation hasn't taken place, and (b) didn't want to have to choose between US and non-US spellings of "centralise".
[GitHub] [hadoop] steveloughran commented on pull request #4940: HADOOP-18469: centralise XML parser creation in XMLUtils
steveloughran commented on PR #4940: URL: https://github.com/apache/hadoop/pull/4940#issuecomment-1270515281

Merged to trunk; will see the build in 3.3. Changed the title of the commit as (a) the centralisation hasn't taken place, and (b) didn't want to have to choose between US and non-US spellings of "centralise".
[jira] [Commented] (HADOOP-18469) Add XMLUtils methods to centralise code that creates secure XML parsers
[ https://issues.apache.org/jira/browse/HADOOP-18469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613749#comment-17613749 ]

ASF GitHub Bot commented on HADOOP-18469:

steveloughran merged PR #4940: URL: https://github.com/apache/hadoop/pull/4940
[GitHub] [hadoop] steveloughran merged pull request #4940: HADOOP-18469: centralise XML parser creation in XMLUtils
steveloughran merged PR #4940: URL: https://github.com/apache/hadoop/pull/4940
[jira] [Commented] (HADOOP-18401) No ARM binaries in branch-3.3.x releases
[ https://issues.apache.org/jira/browse/HADOOP-18401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613747#comment-17613747 ]

ASF GitHub Bot commented on HADOOP-18401:

steveloughran commented on PR #4953: URL: https://github.com/apache/hadoop/pull/4953#issuecomment-1270505378

yeah, i hit the same thing

```
$ mv /build/source/target/hadoop-site-3.3.9-SNAPSHOT.tar.gz /build/source/target/artifacts/hadoop-3.3.9-SNAPSHOT-site.tar.gz
$ cp -p /build/source/hadoop-common-project/hadoop-common/src/site/markdown/release/3.3.9-SNAPSHOT/CHANGELOG*.md /build/source/target/artifacts/CHANGELOG.md
cp: cannot stat '/build/source/hadoop-common-project/hadoop-common/src/site/markdown/release/3.3.9-SNAPSHOT/CHANGELOG*.md': No such file or directory
```

I think this is good to get into -3.3 and then -3.3.5; then we can tune the releases there

> No ARM binaries in branch-3.3.x releases
> ---
>
> Key: HADOOP-18401
> URL: https://issues.apache.org/jira/browse/HADOOP-18401
> Project: Hadoop Common
> Issue Type: Bug
> Components: build
> Affects Versions: 3.3.2, 3.3.3, 3.3.4
> Reporter: Ling Xu
> Priority: Minor
> Labels: pull-request-available
> Attachments: image-2022-08-11-14-54-15-490.png
>
> release files miss hadoop-3.3.4-aarch64.tar.gz
> !image-2022-08-11-14-54-15-490.png!
[GitHub] [hadoop] steveloughran commented on pull request #4953: HADOOP-18401. No ARM binaries in branch-3.3.x releases.
steveloughran commented on PR #4953: URL: https://github.com/apache/hadoop/pull/4953#issuecomment-1270505378

yeah, i hit the same thing

```
$ mv /build/source/target/hadoop-site-3.3.9-SNAPSHOT.tar.gz /build/source/target/artifacts/hadoop-3.3.9-SNAPSHOT-site.tar.gz
$ cp -p /build/source/hadoop-common-project/hadoop-common/src/site/markdown/release/3.3.9-SNAPSHOT/CHANGELOG*.md /build/source/target/artifacts/CHANGELOG.md
cp: cannot stat '/build/source/hadoop-common-project/hadoop-common/src/site/markdown/release/3.3.9-SNAPSHOT/CHANGELOG*.md': No such file or directory
```

I think this is good to get into -3.3 and then -3.3.5; then we can tune the releases there
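The `cp` failure quoted above is the usual shell-glob trap: when `CHANGELOG*.md` matches nothing, the pattern is passed to `cp` literally and the command fails hard. A hedged bash sketch of a guard (the paths are illustrative, not the actual create-release script):

```shell
#!/usr/bin/env bash
# Sketch: copy the changelog only if the glob actually matches, instead of
# letting cp fail on the literal pattern. Paths here are placeholders.
src_dir="src/site/markdown/release/3.3.9-SNAPSHOT"
dest="target/artifacts/CHANGELOG.md"
mkdir -p "$(dirname "$dest")"

shopt -s nullglob                     # unmatched globs expand to nothing
files=("$src_dir"/CHANGELOG*.md)      # collect matches into an array
if ((${#files[@]})); then
  cp -p "${files[0]}" "$dest"
else
  echo "no CHANGELOG found under $src_dir, skipping" >&2
fi
```

With `nullglob` set, the missing-directory case degrades to a logged skip rather than a build-breaking `cannot stat` error, which matches the "tune the releases" follow-up the comment suggests.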
[jira] [Commented] (HADOOP-18480) upgrade AWS SDK for release 3.3.5
[ https://issues.apache.org/jira/browse/HADOOP-18480?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613709#comment-17613709 ]

ASF GitHub Bot commented on HADOOP-18480:

steveloughran commented on PR #4972: URL: https://github.com/apache/hadoop/pull/4972#issuecomment-1270375525

test failure is covered in "HDFS-16142. TestObservernode#testMkdirsRaceWithObserverRead is flaky"

> upgrade AWS SDK for release 3.3.5
> ---
>
> Key: HADOOP-18480
> URL: https://issues.apache.org/jira/browse/HADOOP-18480
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: build, fs/s3
> Affects Versions: 3.3.5
> Reporter: Steve Loughran
> Assignee: Steve Loughran
> Priority: Major
> Labels: pull-request-available
>
> go up to the latest sdk through the usual qualification process.
> no doubt it'll be bigger...
[GitHub] [hadoop] steveloughran commented on pull request #4972: HADOOP-18480. Upgrade aws sdk to 1.12.316
steveloughran commented on PR #4972: URL: https://github.com/apache/hadoop/pull/4972#issuecomment-1270375525

test failure is covered in "HDFS-16142. TestObservernode#testMkdirsRaceWithObserverRead is flaky"
[GitHub] [hadoop] hadoop-yetus commented on pull request #4963: YARN-11326. [Federation] Add RM FederationStateStoreService Metrics.
hadoop-yetus commented on PR #4963: URL: https://github.com/apache/hadoop/pull/4963#issuecomment-1270374891 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 17m 31s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 52s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 42s | | trunk passed | | +1 :green_heart: | compile | 4m 0s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 3m 28s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 27s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 9s | | trunk passed | | +1 :green_heart: | javadoc | 2m 5s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 1m 37s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 40s | | trunk passed | | +1 :green_heart: | shadedclient | 21m 46s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 30s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 30s | | the patch passed | | +1 :green_heart: | compile | 3m 49s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 3m 48s | | the patch passed | | +1 :green_heart: | compile | 3m 16s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 3m 16s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 12s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/8/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server: The patch generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0) | | +1 :green_heart: | mvnsite | 1m 40s | | the patch passed | | +1 :green_heart: | javadoc | 1m 21s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 1m 18s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 34s | | the patch passed | | +1 :green_heart: | shadedclient | 21m 31s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 21s | | hadoop-yarn-server-common in the patch passed. | | +1 :green_heart: | unit | 103m 17s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 0m 43s | | The patch does not generate ASF License warnings. 
| | | | 246m 51s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/8/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4963 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux da7805bcf7df 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / a127c357793a49d52baaea7c1c96cf31b7289a3c | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/8/testReport/ | | Max. process+thread count | 984 (vs. ulimit of 5500) | | modules | C:
[jira] [Commented] (HADOOP-17912) ABFS: Support for Encryption Context
[ https://issues.apache.org/jira/browse/HADOOP-17912?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613699#comment-17613699 ] ASF GitHub Bot commented on HADOOP-17912: - steveloughran commented on code in PR #3440: URL: https://github.com/apache/hadoop/pull/3440#discussion_r989072831 ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java: ## @@ -779,6 +787,18 @@ public AbfsInputStream openFileForRead(Path path, contentLength = Long.parseLong( op.getResponseHeader(HttpHeaderConfigurations.CONTENT_LENGTH)); eTag = op.getResponseHeader(HttpHeaderConfigurations.ETAG); +if (client.getEncryptionType() == EncryptionType.ENCRYPTION_CONTEXT) { + try { +encryptionAdapter = new EncryptionAdapter( +client.getEncryptionContextProvider(), getRelativePath(path), + op.getResponseHeader(HttpHeaderConfigurations.X_MS_ENCRYPTION_CONTEXT) +.getBytes(StandardCharsets.UTF_8)); + } catch (NullPointerException ex) { +LOG.debug("EncryptionContext missing in GetPathStatus response"); +throw new IOException( Review Comment: raise a PathIOExtension and include the path of the file, for better diagnostics ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java: ## @@ -599,26 +605,25 @@ public OutputStream createFile(final Path path, * only if there is match for eTag of existing file. * @param relativePath * @param statistics - * @param permission - * @param umask + * @param permissions contains permission and umask * @param isAppendBlob * @return * @throws AzureBlobFileSystemException */ private AbfsRestOperation conditionalCreateOverwriteFile(final String relativePath, final FileSystem.Statistics statistics, - final String permission, - final String umask, + Permissions permissions, Review Comment: nit: make final for consistency with the others. 
## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java: ## @@ -1616,16 +1647,39 @@ private void initializeClient(URI uri, String fileSystemName, abfsConfiguration.getRawConfiguration()); } +// Encryption setup +EncryptionContextProvider encryptionContextProvider = null; +if (isSecure) { + encryptionContextProvider = + abfsConfiguration.createEncryptionContextProvider(); + if (encryptionContextProvider != null) { +if (abfsConfiguration.getEncodedClientProvidedEncryptionKey() != null) { + throw new IOException( + "Both global key and encryption context are set, only one allowed"); +} +encryptionContextProvider.initialize( +abfsConfiguration.getRawConfiguration(), accountName, +fileSystemName); + } else if (abfsConfiguration.getEncodedClientProvidedEncryptionKey() != null) { +if (abfsConfiguration.getEncodedClientProvidedEncryptionKeySHA() != null) { +} else { + throw new IOException( + "Encoded SHA256 hash must be provided for global encryption"); Review Comment: make PathIOException and include uri of the store ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/constants/ConfigurationKeys.java: ## @@ -188,8 +188,14 @@ public final class ConfigurationKeys { public static final String AZURE_KEY_ACCOUNT_SHELLKEYPROVIDER_SCRIPT = "fs.azure.shellkeyprovider.script"; /** Setting this true will make the driver use it's own RemoteIterator implementation */ public static final String FS_AZURE_ENABLE_ABFS_LIST_ITERATOR = "fs.azure.enable.abfslistiterator"; - /** Server side encryption key */ - public static final String FS_AZURE_CLIENT_PROVIDED_ENCRYPTION_KEY = "fs.azure.client-provided-encryption-key"; + /** Server side encryption key encoded in Base6format */ Review Comment: 1. add a `{@Value}` reference for the javadocs to insert it (and IDEs to show it) 2. add a ".' 
at the end of the javadocs (here and any new ones) to stop some versions of javadoc rejecting the comment ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/extensions/EncryptionContextProvider.java: ## @@ -0,0 +1,58 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + *
[jira] [Commented] (HADOOP-17912) ABFS: Support for Encryption Context
[ https://issues.apache.org/jira/browse/HADOOP-17912?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613701#comment-17613701 ] ASF GitHub Bot commented on HADOOP-17912: - steveloughran commented on code in PR #3440: URL: https://github.com/apache/hadoop/pull/3440#discussion_r989213630 ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsClient.java: ## @@ -228,16 +231,65 @@ List createDefaultHeaders() { return requestHeaders; } - private void addCustomerProvidedKeyHeaders( - final List requestHeaders) { -if (clientProvidedEncryptionKey != null) { - requestHeaders.add( - new AbfsHttpHeader(X_MS_ENCRYPTION_KEY, clientProvidedEncryptionKey)); - requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_KEY_SHA256, - clientProvidedEncryptionKeySHA)); - requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_ALGORITHM, - SERVER_SIDE_ENCRYPTION_ALGORITHM)); + private void addEncryptionKeyRequestHeaders(String path, Review Comment: 1. needs javadocs 2. this may now do a HEAD request on any operation, and seems to be called everywhere, including getPathStatus. Which methods actually need the header and how can we minimise that IO? ## hadoop-tools/hadoop-azure/src/site/markdown/abfs.md: ## @@ -888,6 +888,38 @@ specified SSL channel mode. Value should be of the enum DelegatingSSLSocketFactory.SSLChannelMode. The default value will be DelegatingSSLSocketFactory.SSLChannelMode.Default. +### Encryption Options +Only one of the following two options can be configured. If config values of +both types are set, ABFS driver will throw an exception. If using the global +key type, ensure both pre-computed values are provided. + + Customer-Provided Global Key +A global encryption key can be configured by providing the following +pre-computed values. 
The key will be applied to any new files created post +setting the configuration, and will be required in the requests to read ro Review Comment: nit; ro should be "or" ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsClient.java: ## @@ -346,24 +398,29 @@ public AbfsRestOperation deleteFilesystem(TracingContext tracingContext) throws return op; } - public AbfsRestOperation createPath(final String path, final boolean isFile, final boolean overwrite, - final String permission, final String umask, - final boolean isAppendBlob, final String eTag, - TracingContext tracingContext) throws AzureBlobFileSystemException { + public AbfsRestOperation createPath(final String path, final boolean isFile, + final boolean overwrite, final Permissions permissions, + final boolean isAppendBlob, final String eTag, + EncryptionAdapter encryptionAdapter, TracingContext tracingContext) Review Comment: there's a lot of args here 1. add javadocs 2. put one parameter to a line 3. 
make these two final ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsClient.java: ## @@ -228,16 +231,65 @@ List createDefaultHeaders() { return requestHeaders; } - private void addCustomerProvidedKeyHeaders( - final List requestHeaders) { -if (clientProvidedEncryptionKey != null) { - requestHeaders.add( - new AbfsHttpHeader(X_MS_ENCRYPTION_KEY, clientProvidedEncryptionKey)); - requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_KEY_SHA256, - clientProvidedEncryptionKeySHA)); - requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_ALGORITHM, - SERVER_SIDE_ENCRYPTION_ALGORITHM)); + private void addEncryptionKeyRequestHeaders(String path, + List requestHeaders, boolean isCreateFileRequest, + EncryptionAdapter encryptionAdapter, TracingContext tracingContext) + throws IOException { +String encodedKey, encodedKeySHA256; +boolean encryptionAdapterCreated = false; +switch (encryptionType) { +case GLOBAL_KEY: + encodedKey = clientProvidedEncryptionKey; + encodedKeySHA256 = clientProvidedEncryptionKeySHA; + break; + +case ENCRYPTION_CONTEXT: + if (isCreateFileRequest) { +// get new context for create file request +requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_CONTEXT, +encryptionAdapter.getEncodedContext())); + } else if (encryptionAdapter == null) { +// get encryption context from GetPathStatus response header +byte[] encryptionContext; +try { + encryptionContext = getPathStatus(path, false, tracingContext) + .getResult().getResponseHeader(X_MS_ENCRYPTION_CONTEXT) + .getBytes(StandardCharsets.UTF_8); +} catch
[GitHub] [hadoop] steveloughran commented on a diff in pull request #3440: HADOOP-17912. ABFS: Support for Encryption Context
steveloughran commented on code in PR #3440: URL: https://github.com/apache/hadoop/pull/3440#discussion_r989213630 ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsClient.java: ## @@ -228,16 +231,65 @@ List createDefaultHeaders() { return requestHeaders; } - private void addCustomerProvidedKeyHeaders( - final List requestHeaders) { -if (clientProvidedEncryptionKey != null) { - requestHeaders.add( - new AbfsHttpHeader(X_MS_ENCRYPTION_KEY, clientProvidedEncryptionKey)); - requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_KEY_SHA256, - clientProvidedEncryptionKeySHA)); - requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_ALGORITHM, - SERVER_SIDE_ENCRYPTION_ALGORITHM)); + private void addEncryptionKeyRequestHeaders(String path, Review Comment: 1. needs javadocs 2. this may now do a HEAD request on any operation, and seems to be called everywhere, including getPathStatus. Which methods actually need the header and how can we minimise that IO? ## hadoop-tools/hadoop-azure/src/site/markdown/abfs.md: ## @@ -888,6 +888,38 @@ specified SSL channel mode. Value should be of the enum DelegatingSSLSocketFactory.SSLChannelMode. The default value will be DelegatingSSLSocketFactory.SSLChannelMode.Default. +### Encryption Options +Only one of the following two options can be configured. If config values of +both types are set, ABFS driver will throw an exception. If using the global +key type, ensure both pre-computed values are provided. + + Customer-Provided Global Key +A global encryption key can be configured by providing the following +pre-computed values. 
The key will be applied to any new files created post +setting the configuration, and will be required in the requests to read ro Review Comment: nit; ro should be "or" ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsClient.java: ## @@ -346,24 +398,29 @@ public AbfsRestOperation deleteFilesystem(TracingContext tracingContext) throws return op; } - public AbfsRestOperation createPath(final String path, final boolean isFile, final boolean overwrite, - final String permission, final String umask, - final boolean isAppendBlob, final String eTag, - TracingContext tracingContext) throws AzureBlobFileSystemException { + public AbfsRestOperation createPath(final String path, final boolean isFile, + final boolean overwrite, final Permissions permissions, + final boolean isAppendBlob, final String eTag, + EncryptionAdapter encryptionAdapter, TracingContext tracingContext) Review Comment: there's a lot of args here 1. add javadocs 2. put one parameter to a line 3. 
make these two final ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsClient.java: ## @@ -228,16 +231,65 @@ List createDefaultHeaders() { return requestHeaders; } - private void addCustomerProvidedKeyHeaders( - final List requestHeaders) { -if (clientProvidedEncryptionKey != null) { - requestHeaders.add( - new AbfsHttpHeader(X_MS_ENCRYPTION_KEY, clientProvidedEncryptionKey)); - requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_KEY_SHA256, - clientProvidedEncryptionKeySHA)); - requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_ALGORITHM, - SERVER_SIDE_ENCRYPTION_ALGORITHM)); + private void addEncryptionKeyRequestHeaders(String path, + List requestHeaders, boolean isCreateFileRequest, + EncryptionAdapter encryptionAdapter, TracingContext tracingContext) + throws IOException { +String encodedKey, encodedKeySHA256; +boolean encryptionAdapterCreated = false; +switch (encryptionType) { +case GLOBAL_KEY: + encodedKey = clientProvidedEncryptionKey; + encodedKeySHA256 = clientProvidedEncryptionKeySHA; + break; + +case ENCRYPTION_CONTEXT: + if (isCreateFileRequest) { +// get new context for create file request +requestHeaders.add(new AbfsHttpHeader(X_MS_ENCRYPTION_CONTEXT, +encryptionAdapter.getEncodedContext())); + } else if (encryptionAdapter == null) { +// get encryption context from GetPathStatus response header +byte[] encryptionContext; +try { + encryptionContext = getPathStatus(path, false, tracingContext) + .getResult().getResponseHeader(X_MS_ENCRYPTION_CONTEXT) + .getBytes(StandardCharsets.UTF_8); +} catch (NullPointerException e) { Review Comment: i would prefer checking the result or response header for being null, rather than relying on exceptions and downgrading. it's a lot less efficient ##
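The reviewer's preference in the thread above — testing the response header for null instead of catching a `NullPointerException` — is a general pattern worth spelling out. The sketch below is self-contained and generic: `getResponseHeader` is a stand-in modelled on the quoted code, not the real ABFS `AbfsHttpOperation` API.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public final class HeaderCheckSketch {

    private HeaderCheckSketch() {
    }

    // Stand-in for reading a header from an HTTP response; the real code
    // reads it from the GetPathStatus operation result.
    static String getResponseHeader(Map<String, String> headers, String name) {
        return headers.get(name);
    }

    /**
     * Explicit null check: fail fast with a diagnostic naming the missing
     * header and the path, instead of catching an NPE after the fact.
     */
    static byte[] requireEncryptionContext(Map<String, String> headers,
            String path) throws IOException {
        String value = getResponseHeader(headers, "x-ms-encryption-context");
        if (value == null) {
            throw new IOException(
                "EncryptionContext missing in GetPathStatus response for " + path);
        }
        return value.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        Map<String, String> headers = new HashMap<>();
        headers.put("x-ms-encryption-context", "ctx");
        System.out.println(requireEncryptionContext(headers, "/tmp/file").length);
    }
}
```

Besides being cheaper than exception handling on a hot path, the null check cannot accidentally swallow an unrelated NPE raised deeper in the call chain, which is the correctness concern behind the review comment.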
[GitHub] [hadoop] goiri commented on a diff in pull request #4960: YARN-6766 HelperMethod added in AppsBlock class
goiri commented on code in PR #4960: URL: https://github.com/apache/hadoop/pull/4960#discussion_r989248515

## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/webapp/FairSchedulerAppsBlock.java:

@@ -129,6 +129,12 @@ protected Boolean hasAccess(RMApp app, HttpServletRequest hsr) { return true; }
+ public String printAppInfo(long appInfoFunc) {
+ if (appInfoFunc== -1) {

Review Comment: Space before ==

## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/webapp/FairSchedulerAppsBlock.java:

@@ -129,6 +129,12 @@ protected Boolean hasAccess(RMApp app, HttpServletRequest hsr) { return true; }
+ public String printAppInfo(long appInfoFunc) {

Review Comment: make it private and static.
[GitHub] [hadoop] steveloughran commented on a diff in pull request #3440: HADOOP-17912. ABFS: Support for Encryption Context
steveloughran commented on code in PR #3440:
URL: https://github.com/apache/hadoop/pull/3440#discussion_r989072831

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java:
##
@@ -779,6 +787,18 @@ public AbfsInputStream openFileForRead(Path path,
     contentLength = Long.parseLong(
         op.getResponseHeader(HttpHeaderConfigurations.CONTENT_LENGTH));
     eTag = op.getResponseHeader(HttpHeaderConfigurations.ETAG);
+    if (client.getEncryptionType() == EncryptionType.ENCRYPTION_CONTEXT) {
+      try {
+        encryptionAdapter = new EncryptionAdapter(
+            client.getEncryptionContextProvider(), getRelativePath(path),
+            op.getResponseHeader(HttpHeaderConfigurations.X_MS_ENCRYPTION_CONTEXT)
+                .getBytes(StandardCharsets.UTF_8));
+      } catch (NullPointerException ex) {
+        LOG.debug("EncryptionContext missing in GetPathStatus response");
+        throw new IOException(

Review Comment:
   Raise a PathIOException and include the path of the file, for better diagnostics.

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java:
##
@@ -599,26 +605,25 @@ public OutputStream createFile(final Path path,
    * only if there is match for eTag of existing file.
    * @param relativePath
    * @param statistics
-   * @param permission
-   * @param umask
+   * @param permissions contains permission and umask
    * @param isAppendBlob
    * @return
    * @throws AzureBlobFileSystemException
    */
   private AbfsRestOperation conditionalCreateOverwriteFile(final String relativePath,
       final FileSystem.Statistics statistics,
-      final String permission,
-      final String umask,
+      Permissions permissions,

Review Comment:
   nit: make final for consistency with the others.
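The suggestion to raise a `PathIOException` carrying the file path could look roughly like this. Hadoop's `org.apache.hadoop.fs.PathIOException` is not available in a self-contained snippet, so a minimal stand-in subclass of `IOException` is defined here; the message format is only an approximation of Hadoop's.

```java
import java.io.IOException;

public class PathDiagnosticsSketch {

  // Minimal stand-in for org.apache.hadoop.fs.PathIOException, which
  // prefixes its message with the offending path.
  public static class PathIOException extends IOException {
    public PathIOException(String path, String error) {
      super("`" + path + "': " + error);
    }
  }

  public static void main(String[] args) {
    try {
      // Instead of a bare IOException, include the path so logs identify
      // which file lacked the encryption context header.
      throw new PathIOException("/container/file.txt",
          "EncryptionContext missing in GetPathStatus response");
    } catch (IOException e) {
      System.out.println(e.getMessage());
    }
  }
}
```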
## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java:
##
@@ -1616,16 +1647,39 @@ private void initializeClient(URI uri, String fileSystemName,
         abfsConfiguration.getRawConfiguration());
   }

+    // Encryption setup
+    EncryptionContextProvider encryptionContextProvider = null;
+    if (isSecure) {
+      encryptionContextProvider =
+          abfsConfiguration.createEncryptionContextProvider();
+      if (encryptionContextProvider != null) {
+        if (abfsConfiguration.getEncodedClientProvidedEncryptionKey() != null) {
+          throw new IOException(
+              "Both global key and encryption context are set, only one allowed");
+        }
+        encryptionContextProvider.initialize(
+            abfsConfiguration.getRawConfiguration(), accountName,
+            fileSystemName);
+      } else if (abfsConfiguration.getEncodedClientProvidedEncryptionKey() != null) {
+        if (abfsConfiguration.getEncodedClientProvidedEncryptionKeySHA() != null) {
+        } else {
+          throw new IOException(
+              "Encoded SHA256 hash must be provided for global encryption");

Review Comment:
   Make it a PathIOException and include the URI of the store.

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/constants/ConfigurationKeys.java:
##
@@ -188,8 +188,14 @@ public final class ConfigurationKeys {
   public static final String AZURE_KEY_ACCOUNT_SHELLKEYPROVIDER_SCRIPT = "fs.azure.shellkeyprovider.script";
   /** Setting this true will make the driver use its own RemoteIterator implementation */
   public static final String FS_AZURE_ENABLE_ABFS_LIST_ITERATOR = "fs.azure.enable.abfslistiterator";
-  /** Server side encryption key */
-  public static final String FS_AZURE_CLIENT_PROVIDED_ENCRYPTION_KEY = "fs.azure.client-provided-encryption-key";
+  /** Server side encryption key encoded in Base64 format */

Review Comment:
   1. Add a `{@value}` reference for the javadocs to insert it (and IDEs to show it).
   2. Add a "." at the end of the javadocs (here and any new ones) to stop some versions of javadoc rejecting the comment.

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/extensions/EncryptionContextProvider.java:
##
@@ -0,0 +1,58 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+
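The `{@value}` tag the review asks for inlines a constant's value into the generated javadoc, so docs and IDE tooltips show the actual key string. A minimal example; the key string is the one visible in the diff, while the surrounding class is illustrative:

```java
public class ConfigurationKeysSketch {

  /**
   * Server side encryption key encoded in Base64 format.
   * Value: {@value}.
   */
  public static final String FS_AZURE_CLIENT_PROVIDED_ENCRYPTION_KEY =
      "fs.azure.client-provided-encryption-key";

  public static void main(String[] args) {
    // Javadoc with {@value} has no runtime effect; the constant itself
    // is what callers use to look up the configuration entry.
    System.out.println(FS_AZURE_CLIENT_PROVIDED_ENCRYPTION_KEY);
  }
}
```

Note the terminating "." after the first javadoc sentence, which keeps strict javadoc versions from rejecting the comment.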
[jira] [Commented] (HADOOP-18469) Add XMLUtils methods to centralise code that creates secure XML parsers
[ https://issues.apache.org/jira/browse/HADOOP-18469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613608#comment-17613608 ]

ASF GitHub Bot commented on HADOOP-18469:
-----------------------------------------

pjfanning commented on PR #4940:
URL: https://github.com/apache/hadoop/pull/4940#issuecomment-1270290164

> thanks, happy with the explanation.
>
> I'm +1 for the change. there is one suggestion, use AbstractHadoopTestBase, but it's not a blocker for this patch. if you don't want to do that, say so and i will merge as is.

@steveloughran I fixed the formatting issue and the build passed.

> Add XMLUtils methods to centralise code that creates secure XML parsers
> ------------------------------------------------------------------------
>
> Key: HADOOP-18469
> URL: https://issues.apache.org/jira/browse/HADOOP-18469
> Project: Hadoop Common
> Issue Type: Improvement
> Reporter: PJ Fanning
> Priority: Major
> Labels: pull-request-available
>
> Relates to HDFS-16766
> There are other places in the code where DocumentBuilderFactory instances
> are created that could benefit from the same changes as HDFS-16766

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
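For context, the kind of hardening that HADOOP-18469 centralises in `XMLUtils` when creating `DocumentBuilderFactory` instances typically looks like the JDK-only sketch below. The feature names are standard JAXP/Xerces identifiers; the exact set Hadoop's `XMLUtils` applies may differ.

```java
import java.io.StringReader;
import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class SecureXmlSketch {

  public static DocumentBuilder newSecureDocumentBuilder() throws Exception {
    DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
    // Enable the JAXP secure-processing limits.
    dbf.setFeature(XMLConstants.FEATURE_SECURE_PROCESSING, true);
    // Refuse DOCTYPE declarations outright, blocking XXE and
    // billion-laughs style entity expansion.
    dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
    dbf.setXIncludeAware(false);
    dbf.setExpandEntityReferences(false);
    return dbf.newDocumentBuilder();
  }

  public static void main(String[] args) throws Exception {
    Document doc = newSecureDocumentBuilder()
        .parse(new InputSource(new StringReader("<conf><k>v</k></conf>")));
    System.out.println(doc.getDocumentElement().getTagName()); // conf
  }
}
```

Centralising this in one helper means every caller gets the same hardened configuration instead of each re-creating a factory with whatever defaults the JVM ships.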
[jira] [Commented] (HADOOP-18469) Add XMLUtils methods to centralise code that creates secure XML parsers
[ https://issues.apache.org/jira/browse/HADOOP-18469?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613605#comment-17613605 ] ASF GitHub Bot commented on HADOOP-18469: - hadoop-yetus commented on PR #4940: URL: https://github.com/apache/hadoop/pull/4940#issuecomment-1270285461 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 49s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 5 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 25s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 50s | | trunk passed | | +1 :green_heart: | compile | 23m 16s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 20m 42s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 4m 7s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 5s | | trunk passed | | +1 :green_heart: | javadoc | 2m 26s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 2m 9s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 4m 16s | | trunk passed | | +1 :green_heart: | shadedclient | 21m 17s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 35s | | the patch passed | | +1 :green_heart: | compile | 22m 46s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 22m 46s | | the patch passed | | +1 :green_heart: | compile | 20m 44s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 44s | | the patch passed | | +1 :green_heart: | blanks | 0m 1s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 4m 57s | | root: The patch generated 0 new + 182 unchanged - 1 fixed = 182 total (was 183) | | +1 :green_heart: | mvnsite | 3m 13s | | the patch passed | | +1 :green_heart: | javadoc | 2m 22s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 2m 6s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 4m 36s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 28s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 36s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 1m 12s | | hadoop-rumen in the patch passed. | | +1 :green_heart: | asflicense | 1m 22s | | The patch does not generate ASF License warnings. 
| | | | 237m 47s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4940/13/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4940 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux 40d53db007e2 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / e44488d863e24c85ac482f192402e1175097b438 | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4940/13/testReport/ | | Max. process+thread count | 1301 (vs. ulimit of 5500) | | modules | C:
[GitHub] [hadoop] hadoop-yetus commented on pull request #4963: YARN-11326. [Federation] Add RM FederationStateStoreService Metrics.
hadoop-yetus commented on PR #4963: URL: https://github.com/apache/hadoop/pull/4963#issuecomment-1270278699 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 2s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 35s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 30m 42s | | trunk passed | | +1 :green_heart: | compile | 4m 30s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 3m 48s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 35s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 11s | | trunk passed | | +1 :green_heart: | javadoc | 1m 57s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 1m 47s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 4m 9s | | trunk passed | | +1 :green_heart: | shadedclient | 25m 8s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 30s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 49s | | the patch passed | | +1 :green_heart: | compile | 4m 46s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 4m 46s | | the patch passed | | +1 :green_heart: | compile | 3m 37s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 3m 37s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 12s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/6/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server: The patch generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) | | +1 :green_heart: | mvnsite | 1m 42s | | the patch passed | | +1 :green_heart: | javadoc | 1m 22s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 1m 14s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 49s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 27s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 12s | | hadoop-yarn-server-common in the patch passed. | | +1 :green_heart: | unit | 103m 55s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. 
| | | | 245m 54s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4963 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux fcc74930473b 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / b14d43b48204ff5eaa8490a543dec659484469f6 | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/6/testReport/ | | Max. process+thread count | 912 (vs. ulimit of 5500) | | modules | C:
[GitHub] [hadoop] hadoop-yetus commented on pull request #4963: YARN-11326. [Federation] Add RM FederationStateStoreService Metrics.
hadoop-yetus commented on PR #4963: URL: https://github.com/apache/hadoop/pull/4963#issuecomment-1270273630 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 3s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 5s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 28m 55s | | trunk passed | | +1 :green_heart: | compile | 4m 14s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 3m 29s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 25s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 59s | | trunk passed | | +1 :green_heart: | javadoc | 1m 43s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 1m 28s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 44s | | trunk passed | | +1 :green_heart: | shadedclient | 24m 24s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 24s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 33s | | the patch passed | | +1 :green_heart: | compile | 4m 2s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 4m 2s | | the patch passed | | +1 :green_heart: | compile | 3m 18s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 3m 18s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 11s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/7/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server: The patch generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) | | +1 :green_heart: | mvnsite | 1m 38s | | the patch passed | | +1 :green_heart: | javadoc | 1m 22s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 1m 13s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 38s | | the patch passed | | +1 :green_heart: | shadedclient | 24m 18s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 2s | | hadoop-yarn-server-common in the patch passed. | | +1 :green_heart: | unit | 107m 27s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 0m 39s | | The patch does not generate ASF License warnings. 
| | | | 241m 17s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/7/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4963 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 8cfbc9981a92 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / b14d43b48204ff5eaa8490a543dec659484469f6 | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4963/7/testReport/ | | Max. process+thread count | 905 (vs. ulimit of 5500) | | modules | C:
[jira] [Commented] (HADOOP-18465) S3A server-side encryption tests fail before checking encryption tests should skip
[ https://issues.apache.org/jira/browse/HADOOP-18465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613602#comment-17613602 ] ASF GitHub Bot commented on HADOOP-18465: - hadoop-yetus commented on PR #4977: URL: https://github.com/apache/hadoop/pull/4977#issuecomment-1270248216 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 10m 51s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-3.3 Compile Tests _ | | +1 :green_heart: | mvninstall | 39m 36s | | branch-3.3 passed | | +1 :green_heart: | compile | 0m 34s | | branch-3.3 passed | | +1 :green_heart: | checkstyle | 0m 31s | | branch-3.3 passed | | +1 :green_heart: | mvnsite | 0m 44s | | branch-3.3 passed | | +1 :green_heart: | javadoc | 0m 35s | | branch-3.3 passed | | +1 :green_heart: | spotbugs | 1m 15s | | branch-3.3 passed | | +1 :green_heart: | shadedclient | 25m 48s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 37s | | the patch passed | | +1 :green_heart: | compile | 0m 29s | | the patch passed | | +1 :green_heart: | javac | 0m 29s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 0m 17s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 33s | | the patch passed | | +1 :green_heart: | javadoc | 0m 22s | | the patch passed | | +1 :green_heart: | spotbugs | 1m 8s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 30s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 7s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 0m 33s | | The patch does not generate ASF License warnings. | | | | 111m 44s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4977/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4977 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 7ff1fca53413 4.15.0-192-generic #203-Ubuntu SMP Wed Aug 10 17:40:03 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / 0c1fb26165a86ef1862e4560170e21fca770601c | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4977/1/testReport/ | | Max. process+thread count | 532 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4977/1/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> S3A server-side encryption tests fail before checking encryption tests should > skip > -- > > Key: HADOOP-18465 > URL: https://issues.apache.org/jira/browse/HADOOP-18465 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Minor > Labels: pull-request-available > > When setting {{test.fs.s3a.encryption.enabled}} to {{{}false{}}}, this is not > respected by ITestS3AEncryptionSSEKMSDefaultKey. See failure below. > > {code:java} > -- > Test set: org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey > --- > Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 6.053 s
[GitHub] [hadoop] hadoop-yetus commented on pull request #4960: YARN-6766 HelperMethod added in AppsBlock class
hadoop-yetus commented on PR #4960: URL: https://github.com/apache/hadoop/pull/4960#issuecomment-1270246966

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 55s |  | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s |  | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s |  | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s |  | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s |  | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s |  | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 43m 19s |  | trunk passed |
| +1 :green_heart: | compile | 1m 31s |  | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 1m 13s |  | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 3s |  | trunk passed |
| +1 :green_heart: | mvnsite | 1m 20s |  | trunk passed |
| +1 :green_heart: | javadoc | 1m 11s |  | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 0m 55s |  | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 43s |  | trunk passed |
| +1 :green_heart: | shadedclient | 25m 55s |  | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 59s |  | the patch passed |
| +1 :green_heart: | compile | 1m 8s |  | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 1m 8s |  | the patch passed |
| +1 :green_heart: | compile | 0m 57s |  | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 57s |  | the patch passed |
| +1 :green_heart: | blanks | 0m 0s |  | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 49s |  | the patch passed |
| +1 :green_heart: | mvnsite | 1m 9s |  | the patch passed |
| +1 :green_heart: | javadoc | 0m 49s |  | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 0m 42s |  | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 33s |  | the patch passed |
| +1 :green_heart: | shadedclient | 25m 28s |  | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 104m 49s |  | hadoop-yarn-server-resourcemanager in the patch passed. |
| +1 :green_heart: | asflicense | 0m 40s |  | The patch does not generate ASF License warnings. |
|  |  | 218m 48s |  |  |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4960/4/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4960 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux ec8e853c369e 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / d35b3031b08ec8b3890e4462fbc9213791547668 |
| Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4960/4/testReport/ |
| Max. process+thread count | 899 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4960/4/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #4976: YARN-11328. Refactoring part of the code of SQLFederationStateStore.
hadoop-yetus commented on PR #4976: URL: https://github.com/apache/hadoop/pull/4976#issuecomment-1270235890

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 48s |  | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s |  | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s |  | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s |  | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s |  | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s |  | The patch appears to include 2 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 42m 5s |  | trunk passed |
| +1 :green_heart: | compile | 0m 50s |  | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 0m 44s |  | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 37s |  | trunk passed |
| +1 :green_heart: | mvnsite | 0m 50s |  | trunk passed |
| +1 :green_heart: | javadoc | 0m 51s |  | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 0m 38s |  | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 41s |  | trunk passed |
| +1 :green_heart: | shadedclient | 25m 22s |  | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 38s |  | the patch passed |
| +1 :green_heart: | compile | 0m 49s |  | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 0m 49s |  | the patch passed |
| +1 :green_heart: | compile | 0m 42s |  | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 42s |  | the patch passed |
| +1 :green_heart: | blanks | 0m 0s |  | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 26s |  | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common: The patch generated 0 new + 0 unchanged - 1 fixed = 0 total (was 1) |
| +1 :green_heart: | mvnsite | 0m 47s |  | the patch passed |
| +1 :green_heart: | javadoc | 0m 41s |  | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 0m 35s |  | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 56s |  | the patch passed |
| +1 :green_heart: | shadedclient | 25m 44s |  | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 3m 0s |  | hadoop-yarn-server-common in the patch passed. |
| +1 :green_heart: | asflicense | 0m 38s |  | The patch does not generate ASF License warnings. |
|  |  | 110m 43s |  |  |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4976/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4976 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux e2afebde7e0e 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 8eab610c4180e263a479a5e13180b91a43103aeb |
| Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4976/1/testReport/ |
| Max. process+thread count | 530 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4976/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] ashutoshcipher commented on pull request #4775: YARN-11260. Upgrade JUnit from 4 to 5 in hadoop-yarn-server-timelineservice
ashutoshcipher commented on PR #4775: URL: https://github.com/apache/hadoop/pull/4775#issuecomment-1270206047 Thanks @aajisaka for the review. I will address your comments in my next commit.
[GitHub] [hadoop] aajisaka commented on pull request #4962: HDFS-16024: RBF: Rename data to the Trash should be based on src locations
aajisaka commented on PR #4962: URL: https://github.com/apache/hadoop/pull/4962#issuecomment-1270205799

> @aajisaka @jojochuang - Do we need a new jira for cherry pick to specific branches ?

I think no. In the past we had to reopen the jira or create a separate jira to run precommit jenkins tests for specific branches, but now we can open a separate PR to test against different branches.
[GitHub] [hadoop] aajisaka commented on a diff in pull request #4775: YARN-11260. Upgrade JUnit from 4 to 5 in hadoop-yarn-server-timelineservice
aajisaka commented on code in PR #4775: URL: https://github.com/apache/hadoop/pull/4775#discussion_r989128859 ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice/src/test/java/org/apache/hadoop/yarn/server/timelineservice/reader/TestTimelineReaderWebServicesBasicAcl.java: ## @@ -127,25 +134,23 @@ public class TestTimelineReaderWebServicesBasicAcl { TimelineReaderWebServices .checkAccess(manager, adminUgi, entities, userKey, true); // admin is allowed to view other entities -Assert.assertTrue(entities.size() == 10); +assertTrue(entities.size() == 10); Review Comment: Could you replace with `assertEquals(10, entities.size())`? ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice/src/test/java/org/apache/hadoop/yarn/server/timelineservice/storage/TestFileSystemTimelineWriterImpl.java: ## @@ -191,13 +192,13 @@ public void testWriteMultipleEntities() throws Exception { FileSystemTimelineWriterImpl.TIMELINE_SERVICE_STORAGE_EXTENSION; Path path = new Path(fileName); FileSystem fs = FileSystem.get(conf); - assertTrue("Specified path(" + fileName + ") should exist: ", - fs.exists(path)); + assertTrue(fs.exists(path), + "Specified path(" + fileName + ") should exist: "); FileStatus fileStatus = fs.getFileStatus(path); - assertTrue("Specified path should be a file", - !fileStatus.isDirectory()); + assertTrue(!fileStatus.isDirectory(), Review Comment: Could you replace with `assertFalse(fileStatus.isDirectory(), ...)`? ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice/src/test/java/org/apache/hadoop/yarn/server/timelineservice/storage/TestFileSystemTimelineWriterImpl.java: ## @@ -278,4 +279,13 @@ private List readFromFile(FileSystem fs, Path path) } return data; } + + private static File newFolder(File root, String... 
subDirs) throws IOException { +String subFolder = String.join("/", subDirs); +File result = new File(root, subFolder); +if (!result.mkdirs()) { + throw new IOException("Couldn't create folders " + root); +} +return result; + } Review Comment: This method is unused and can be removed. ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice/src/test/java/org/apache/hadoop/yarn/server/timelineservice/storage/TestFileSystemTimelineWriterImpl.java: ## @@ -191,13 +192,13 @@ public void testWriteMultipleEntities() throws Exception { FileSystemTimelineWriterImpl.TIMELINE_SERVICE_STORAGE_EXTENSION; Path path = new Path(fileName); FileSystem fs = FileSystem.get(conf); - assertTrue("Specified path(" + fileName + ") should exist: ", - fs.exists(path)); + assertTrue(fs.exists(path), + "Specified path(" + fileName + ") should exist: "); FileStatus fileStatus = fs.getFileStatus(path); - assertTrue("Specified path should be a file", - !fileStatus.isDirectory()); + assertTrue(!fileStatus.isDirectory(), + "Specified path should be a file"); List data = readFromFile(fs, path); - assertTrue("data size is:" + data.size(), data.size() == 3); + assertTrue(data.size() == 3, "data size is:" + data.size()); Review Comment: Could you replace with `assertEquals(3, data.size(), ...)`? 
## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice/src/test/java/org/apache/hadoop/yarn/server/timelineservice/storage/TestFileSystemTimelineWriterImpl.java: ## @@ -248,13 +249,13 @@ public void testWriteEntitiesWithEmptyFlowName() throws Exception { FileSystemTimelineWriterImpl.TIMELINE_SERVICE_STORAGE_EXTENSION; Path path = new Path(fileName); FileSystem fs = FileSystem.get(conf); - assertTrue("Specified path(" + fileName + ") should exist: ", - fs.exists(path)); + assertTrue(fs.exists(path), + "Specified path(" + fileName + ") should exist: "); FileStatus fileStatus = fs.getFileStatus(path); - assertTrue("Specified path should be a file", - !fileStatus.isDirectory()); + assertTrue(!fileStatus.isDirectory(), Review Comment: Could you replace with `assertFalse(fileStatus.isDirectory(), ...)`? ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice/src/test/java/org/apache/hadoop/yarn/server/timelineservice/storage/TestFileSystemTimelineWriterImpl.java: ## @@ -248,13 +249,13 @@ public void testWriteEntitiesWithEmptyFlowName() throws Exception { FileSystemTimelineWriterImpl.TIMELINE_SERVICE_STORAGE_EXTENSION; Path path = new Path(fileName); FileSystem fs = FileSystem.get(conf); - assertTrue("Specified path(" + fileName + ") should exist: ", -
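The refactors requested in this review all follow the same JUnit 4 to JUnit 5 migration pattern: the optional failure message moves from the first parameter (`org.junit.Assert`) to the last parameter (`org.junit.jupiter.api.Assertions`), and boolean comparisons become the more specific `assertEquals`/`assertFalse` forms. The sketch below illustrates the argument order with local stand-in methods (hypothetical, stdlib-only) instead of the real JUnit 5 `Assertions` class, so it is self-contained:

```java
import java.util.Arrays;
import java.util.List;

public class Junit5AssertStyle {
    // Local stand-ins mirroring the JUnit 5 Assertions signatures used in the
    // review: the optional failure message is the LAST parameter (in JUnit 4
    // it was the first). Hypothetical helpers for illustration only.
    static void assertEquals(Object expected, Object actual, String message) {
        if (!expected.equals(actual)) {
            throw new AssertionError(message + " expected:" + expected + " was:" + actual);
        }
    }

    static void assertFalse(boolean condition, String message) {
        if (condition) {
            throw new AssertionError(message);
        }
    }

    public static void main(String[] args) {
        List<String> data = Arrays.asList("a", "b", "c");
        boolean isDirectory = false; // stand-in for fileStatus.isDirectory()

        // JUnit 4: assertTrue("data size is:" + data.size(), data.size() == 3);
        // JUnit 5, as requested in the review:
        assertEquals(3, data.size(), "data size is:" + data.size());

        // JUnit 4: assertTrue("Specified path should be a file", !fileStatus.isDirectory());
        // JUnit 5:
        assertFalse(isDirectory, "Specified path should be a file");

        System.out.println("assertions passed");
    }
}
```

Using `assertEquals(expected, actual, message)` also produces a clearer failure report than `assertTrue(condition, message)`, since the expected and actual values appear in the error.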
[jira] [Commented] (HADOOP-18442) Remove the hadoop-openstack module
[ https://issues.apache.org/jira/browse/HADOOP-18442?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613873#comment-17613581 ] ASF GitHub Bot commented on HADOOP-18442: - hadoop-yetus commented on PR #4975: URL: https://github.com/apache/hadoop/pull/4975#issuecomment-1270193963
[GitHub] [hadoop] hadoop-yetus commented on pull request #4975: HADOOP-18442. Remove openstack support (#4855)
hadoop-yetus commented on PR #4975: URL: https://github.com/apache/hadoop/pull/4975#issuecomment-1270193963

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 39s |  | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 3s |  | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s |  | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s |  | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s |  | xmllint was not available. |
| +0 :ok: | markdownlint | 0m 0s |  | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s |  | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s |  | The patch appears to include 39 new or modified test files. |
|||| _ branch-3.3 Compile Tests _ |
| +0 :ok: | mvndep | 14m 57s |  | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 25m 4s |  | branch-3.3 passed |
| +1 :green_heart: | compile | 18m 53s |  | branch-3.3 passed |
| +1 :green_heart: | checkstyle | 3m 22s |  | branch-3.3 passed |
| +1 :green_heart: | mvnsite | 6m 31s |  | branch-3.3 passed |
| +1 :green_heart: | javadoc | 6m 6s |  | branch-3.3 passed |
| +0 :ok: | spotbugs | 1m 7s |  | branch/hadoop-project no spotbugs output file (spotbugsXml.xml) |
| +0 :ok: | spotbugs | 1m 3s |  | branch/hadoop-tools/hadoop-tools-dist no spotbugs output file (spotbugsXml.xml) |
| +0 :ok: | spotbugs | 0m 55s |  | branch/hadoop-cloud-storage-project/hadoop-cloud-storage no spotbugs output file (spotbugsXml.xml) |
| +1 :green_heart: | shadedclient | 25m 6s |  | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 30s |  | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 3m 12s |  | the patch passed |
| +1 :green_heart: | compile | 17m 22s |  | the patch passed |
| +1 :green_heart: | javac | 17m 22s |  | root generated 0 new + 1858 unchanged - 7 fixed = 1858 total (was 1865) |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4975/1/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | checkstyle | 2m 57s |  | root: The patch generated 0 new + 5 unchanged - 473 fixed = 5 total (was 478) |
| +1 :green_heart: | mvnsite | 6m 48s |  | the patch passed |
| +1 :green_heart: | javadoc | 0m 54s |  | hadoop-project in the patch passed. |
| +1 :green_heart: | javadoc | 1m 5s |  | hadoop-common in the patch passed. |
| +1 :green_heart: | javadoc | 0m 58s |  | hadoop-distcp in the patch passed. |
| +1 :green_heart: | javadoc | 0m 53s |  | hadoop-tools-dist in the patch passed. |
| +1 :green_heart: | javadoc | 0m 52s |  | hadoop-tools_hadoop-openstack generated 0 new + 0 unchanged - 22 fixed = 0 total (was 22) |
| +1 :green_heart: | javadoc | 0m 52s |  | hadoop-cloud-storage in the patch passed. |
| +0 :ok: | spotbugs | 0m 53s |  | hadoop-project has no data from spotbugs |
| +0 :ok: | spotbugs | 0m 57s |  | hadoop-tools/hadoop-tools-dist has no data from spotbugs |
| -1 :x: | spotbugs | 1m 8s | [/patch-spotbugs-hadoop-tools_hadoop-openstack.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4975/1/artifact/out/patch-spotbugs-hadoop-tools_hadoop-openstack.txt) | hadoop-tools/hadoop-openstack cannot run convertXmlToText from spotbugs |
| +0 :ok: | spotbugs | 0m 54s |  | hadoop-cloud-storage-project/hadoop-cloud-storage has no data from spotbugs |
| +1 :green_heart: | shadedclient | 25m 10s |  | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 0m 52s |  | hadoop-project in the patch passed. |
| +1 :green_heart: | unit | 18m 25s |  | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 17m 7s |  | hadoop-distcp in the patch passed. |
| +1 :green_heart: | unit | 0m 54s |  | hadoop-tools-dist in the patch passed. |
| +1 :green_heart: | unit | 1m 4s |  | hadoop-openstack in the patch passed. |
| +1 :green_heart: | unit | 0m 51s |  | hadoop-cloud-storage in the patch passed. |
| +1 :green_heart: | asflicense | 1m 12s |  | The patch does not generate ASF License warnings. |
|  |  | 224m 6s |  |  |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker |
[GitHub] [hadoop] slfan1989 commented on pull request #4934: YARN-11315. [Federation] YARN Federation Router Supports Cross-Origin.
slfan1989 commented on PR #4934: URL: https://github.com/apache/hadoop/pull/4934#issuecomment-1270192564 @goiri Can you help merge this PR into the trunk branch? Thank you so much! I will follow up on YARN-11329. Refactor Router#startWepApp#setupSecurityAndFilters.
[GitHub] [hadoop] slfan1989 commented on pull request #4963: YARN-11326. [Federation] Add RM FederationStateStoreService Metrics.
slfan1989 commented on PR #4963: URL: https://github.com/apache/hadoop/pull/4963#issuecomment-1270186602 @goiri Please help review this PR again, thank you very much!
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
ashutoshcipher commented on code in PR #4750: URL: https://github.com/apache/hadoop/pull/4750#discussion_r989124730 ## hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServer.java: ## @@ -2003,4 +2006,40 @@ public void testContentType() throws Exception { () -> HttpFSUtils.jsonParse(conn)); conn.disconnect(); } + + @Test + @TestDir + @TestJetty + @TestHdfs + public void testGetFileBlockLocations() throws Exception { +createHttpFSServer(false, false); +// Create a test directory +String pathStr = "/tmp/tmp-get-block-location-test"; +createDirWithHttp(pathStr, "700", null); + +Path path = new Path(pathStr); +DistributedFileSystem dfs = (DistributedFileSystem) FileSystem +.get(path.toUri(), TestHdfsHelper.getHdfsConf()); + +String file1 = pathStr + "/file1"; +createWithHttp(file1, null); +HttpURLConnection conn = sendRequestToHttpFSServer(file1, +"GETFILEBLOCKLOCATIONS", "length=10"); +Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode()); +BlockLocation[] locations1 = +dfs.getFileBlockLocations(new Path(file1), 0, 1); +Assert.assertNotNull(locations1); + +Map jsonMap = JsonSerialization.mapReader().readValue(conn.getInputStream()); + +BlockLocation[] httpfsBlockLocations = +JsonUtilClient.toBlockLocationArray(jsonMap); + +assertEquals(locations1.length, httpfsBlockLocations.length); +for (int i = 0; i < locations1.length; i++) { + assertEquals(locations1.toString(), httpfsBlockLocations.toString()); Review Comment: I checked, it needs to be `assertEquals(locations1.toString(), httpfsBlockLocations.toString());`
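One caveat worth noting about the loop quoted in the review above: in Java, `Object#toString` on an array returns an identity string such as `[Ljava.lang.String;@1b6d3586`, so calling `toString()` on the whole arrays (rather than on `locations1[i]` and `httpfsBlockLocations[i]`) does not compare their contents. A stdlib-only sketch of the difference, using a hypothetical `elementWiseEqual` helper and placeholder host strings:

```java
import java.util.Arrays;

public class ArrayToStringPitfall {
    // Hypothetical helper showing what a per-element comparison looks like:
    // compare corresponding elements, not the arrays' default toString().
    static boolean elementWiseEqual(Object[] expected, Object[] actual) {
        if (expected.length != actual.length) {
            return false;
        }
        for (int i = 0; i < expected.length; i++) {
            if (!expected[i].toString().equals(actual[i].toString())) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Placeholder values standing in for BlockLocation data.
        String[] locations = {"host1:9866", "host2:9866"};
        String[] httpfsLocations = {"host1:9866", "host2:9866"};

        // Identity-based strings of two distinct arrays: says nothing about contents.
        System.out.println(locations.toString().equals(httpfsLocations.toString()));

        // Content-based alternatives:
        System.out.println(Arrays.toString(locations).equals(Arrays.toString(httpfsLocations))); // true
        System.out.println(elementWiseEqual(locations, httpfsLocations)); // true
    }
}
```

`Arrays.toString` (or `Arrays.deepEquals` for nested arrays) is the usual way to get a content-based view of an array for assertions.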
[GitHub] [hadoop] slfan1989 commented on pull request #4938: YARN-8041. [Router] Federation: Improve Router REST API Metrics.
slfan1989 commented on PR #4938: URL: https://github.com/apache/hadoop/pull/4938#issuecomment-1270184972 @goiri Please help review this PR again, thank you very much!
[GitHub] [hadoop] slfan1989 commented on pull request #4946: YARN-11317. [Federation] Refactoring Yarn Router's About Web Page.
slfan1989 commented on PR #4946: URL: https://github.com/apache/hadoop/pull/4946#issuecomment-1270184278 @goiri Please help review this PR again, thank you very much!
[GitHub] [hadoop] hadoop-yetus commented on pull request #4938: YARN-8041. [Router] Federation: Improve Router REST API Metrics.
hadoop-yetus commented on PR #4938: URL: https://github.com/apache/hadoop/pull/4938#issuecomment-1270179940

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 44s |  | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s |  | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s |  | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s |  | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s |  | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s |  | The patch appears to include 3 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 39m 17s |  | trunk passed |
| +1 :green_heart: | compile | 0m 43s |  | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 0m 37s |  | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 39s |  | trunk passed |
| +1 :green_heart: | mvnsite | 0m 44s |  | trunk passed |
| +1 :green_heart: | javadoc | 0m 51s |  | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 0m 41s |  | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 10s |  | trunk passed |
| +1 :green_heart: | shadedclient | 20m 35s |  | branch has no errors when building and testing our client artifacts. |
| -0 :warning: | patch | 20m 57s |  | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 34s |  | the patch passed |
| +1 :green_heart: | compile | 0m 29s |  | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 0m 29s |  | the patch passed |
| +1 :green_heart: | compile | 0m 29s |  | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 29s |  | the patch passed |
| +1 :green_heart: | blanks | 0m 0s |  | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 20s |  | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router: The patch generated 0 new + 0 unchanged - 1 fixed = 0 total (was 1) |
| +1 :green_heart: | mvnsite | 0m 33s |  | the patch passed |
| +1 :green_heart: | javadoc | 0m 25s |  | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 0m 23s |  | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 0m 57s |  | the patch passed |
| +1 :green_heart: | shadedclient | 22m 28s |  | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 4m 0s |  | hadoop-yarn-server-router in the patch passed. |
| +1 :green_heart: | asflicense | 0m 45s |  | The patch does not generate ASF License warnings. |
|  |  | 99m 18s |  |  |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4938/21/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4938 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 9832ebded730 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 5780d4bcf8791010804ca745effcf7e5d7a6f5aa |
| Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4938/21/testReport/ |
| Max. process+thread count | 734 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4938/21/console |
| versions | git=2.25.1 maven=3.6.3
[GitHub] [hadoop] slfan1989 commented on pull request #4463: YARN-11187. Remove WhiteBox in yarn module.
slfan1989 commented on PR #4463: URL: https://github.com/apache/hadoop/pull/4463#issuecomment-1270168988 @aajisaka Thank you very much for your help reviewing the code! -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
ashutoshcipher commented on code in PR #4750: URL: https://github.com/apache/hadoop/pull/4750#discussion_r989124730 ## hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServer.java: ## @@ -2003,4 +2006,40 @@ public void testContentType() throws Exception { () -> HttpFSUtils.jsonParse(conn)); conn.disconnect(); } + + @Test + @TestDir + @TestJetty + @TestHdfs + public void testGetFileBlockLocations() throws Exception { +createHttpFSServer(false, false); +// Create a test directory +String pathStr = "/tmp/tmp-get-block-location-test"; +createDirWithHttp(pathStr, "700", null); + +Path path = new Path(pathStr); +DistributedFileSystem dfs = (DistributedFileSystem) FileSystem +.get(path.toUri(), TestHdfsHelper.getHdfsConf()); + +String file1 = pathStr + "/file1"; +createWithHttp(file1, null); +HttpURLConnection conn = sendRequestToHttpFSServer(file1, +"GETFILEBLOCKLOCATIONS", "length=10"); +Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode()); +BlockLocation[] locations1 = +dfs.getFileBlockLocations(new Path(file1), 0, 1); +Assert.assertNotNull(locations1); + +Map jsonMap = JsonSerialization.mapReader().readValue(conn.getInputStream()); + +BlockLocation[] httpfsBlockLocations = +JsonUtilClient.toBlockLocationArray(jsonMap); + +assertEquals(locations1.length, httpfsBlockLocations.length); +for (int i = 0; i < locations1.length; i++) { + assertEquals(locations1.toString(), httpfsBlockLocations.toString()); Review Comment: I checked; it needs to be `assertEquals(locations1.toString(), httpfsBlockLocations.toString());`
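Note that the loop in the hunk above calls `toString()` on the whole arrays each iteration; an array's default `toString()` is its identity string (e.g. `[Ljava.lang.String;@1b6d3586`), so that comparison does not inspect the elements. A minimal illustration of the difference, using plain `String` arrays in place of `BlockLocation` (a hypothetical helper, not HttpFS code):

```java
import java.util.Arrays;

public class ArrayCompareDemo {
    // Element-wise comparison: what the quoted loop presumably intends.
    public static boolean sameLocations(String[] expected, String[] actual) {
        if (expected.length != actual.length) {
            return false;
        }
        for (int i = 0; i < expected.length; i++) {
            // Compare the i-th elements; calling toString() on the whole
            // array reflects object identity, not contents.
            if (!expected[i].equals(actual[i])) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        String[] a = {"blk1", "blk2"};
        String[] b = {"blk1", "blk2"};
        System.out.println(sameLocations(a, b));                           // true
        System.out.println(Arrays.toString(a).equals(Arrays.toString(b))); // true
        // a.toString().equals(b.toString()) compares identity strings and
        // will differ for two distinct array objects.
    }
}
```

In JUnit, `Assert.assertArrayEquals(expected, actual)`, or comparing `locations1[i].toString()` with `httpfsBlockLocations[i].toString()` inside the loop, would exercise the actual contents.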
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
ashutoshcipher commented on code in PR #4750: URL: https://github.com/apache/hadoop/pull/4750#discussion_r989120654 ## hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/client/BaseTestHttpFSWith.java: ## @@ -73,39 +102,15 @@ import org.apache.hadoop.test.TestJetty; import org.apache.hadoop.test.TestJettyHelper; import org.apache.hadoop.util.Lists; -import org.junit.Assert; -import org.junit.Assume; -import org.junit.Test; -import org.junit.runner.RunWith; -import org.junit.runners.Parameterized; -import org.eclipse.jetty.server.Server; -import org.eclipse.jetty.webapp.WebAppContext; - -import java.io.File; -import java.io.FileOutputStream; Review Comment: It's just an import optimization; maybe we can keep this. What do you think?
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
ashutoshcipher commented on code in PR #4750: URL: https://github.com/apache/hadoop/pull/4750#discussion_r989119024 ## hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSParametersProvider.java: ## @@ -127,6 +126,10 @@ public class HttpFSParametersProvider extends ParametersProvider { PARAMS_DEF.put(Operation.GETECPOLICY, new Class[] {}); PARAMS_DEF.put(Operation.UNSETECPOLICY, new Class[] {}); PARAMS_DEF.put(Operation.SATISFYSTORAGEPOLICY, new Class[] {}); +PARAMS_DEF.put(Operation.GETFILEBLOCKLOCATIONS, +new Class[] {OffsetParam.class, LenParam.class}); +PARAMS_DEF.put(Operation.GET_BLOCK_LOCATIONS, +new Class[] {OffsetParam.class, LenParam.class}); Review Comment: Changed the last one; the 1st will cross 100.
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
ashutoshcipher commented on code in PR #4750: URL: https://github.com/apache/hadoop/pull/4750#discussion_r989115132 ## hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/FSOperations.java: ## @@ -2192,4 +2194,76 @@ public Void execute(FileSystem fs) throws IOException { return null; } } + + /** + * Executor that performs a getFileBlockLocations operation. + */ + + @InterfaceAudience.Private + @SuppressWarnings("rawtypes") + public static class FSFileBlockLocations + implements FileSystemAccess.FileSystemExecutor { +final private Path path; +final private long offsetValue; +final private long lengthValue; + +/** + * Creates a file-block-locations executor. + * + * @param path the path to retrieve the location + * @param offsetValue offset into the given file + * @param lengthValue length for which to get locations for + */ +public FSFileBlockLocations(String path, long offsetValue, long lengthValue) { + this.path = new Path(path); + this.offsetValue = offsetValue; + this.lengthValue = lengthValue; +} + +@Override +public Map execute(FileSystem fs) throws IOException { + BlockLocation[] locations = fs.getFileBlockLocations(this.path, + this.offsetValue, this.lengthValue); + return JsonUtil.toJsonMap(locations); +} + } + + /** + * Executor that performs a getFileBlockLocations operation for legacy + * clients that supports only GET_BLOCK_LOCATIONS. + */ + + @InterfaceAudience.Private + @SuppressWarnings("rawtypes") + public static class FSFileBlockLocationsLegacy + implements FileSystemAccess.FileSystemExecutor { Review Comment: making this single line will cross 100 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
ashutoshcipher commented on code in PR #4750: URL: https://github.com/apache/hadoop/pull/4750#discussion_r989110352 ## hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/web/WebHdfsFileSystem.java: ## @@ -1882,18 +1884,52 @@ public BlockLocation[] getFileBlockLocations(final FileStatus status, } @Override - public BlockLocation[] getFileBlockLocations(final Path p, - final long offset, final long length) throws IOException { + public BlockLocation[] getFileBlockLocations(final Path p, final long offset, + final long length) throws IOException { statistics.incrementReadOps(1); storageStatistics.incrementOpCounter(OpType.GET_FILE_BLOCK_LOCATIONS); +BlockLocation[] locations; +try { + if (isServerHCFSCompatible) { +locations = getFileBlockLocations(GetOpParam.Op.GETFILEBLOCKLOCATIONS, p, offset, length); + } else { +locations = getFileBlockLocations(GetOpParam.Op.GET_BLOCK_LOCATIONS, p, offset, length); + } +} catch (RemoteException e) { + // parsing the exception is needed only if the client thinks the service is compatible + if (isServerHCFSCompatible && isGetFileBlockLocationsException(e)) { Review Comment: Sure -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
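The change under review makes the client try the HCFS-compatible `GETFILEBLOCKLOCATIONS` first and fall back to the legacy `GET_BLOCK_LOCATIONS` when the server rejects it, caching the result for later calls. A minimal sketch of that try-then-fall-back pattern (all classes and names below are hypothetical stand-ins, and a plain unchecked exception stands in for `RemoteException`):

```java
public class FallbackClientDemo {
    // Stand-ins for the two GET operations; names mirror the quoted diff.
    enum Op { GETFILEBLOCKLOCATIONS, GET_BLOCK_LOCATIONS }

    private boolean serverHCFSCompatible = true; // optimistic until a call fails
    private final boolean serverSupportsNewOp;   // simulates the remote server

    public FallbackClientDemo(boolean serverSupportsNewOp) {
        this.serverSupportsNewOp = serverSupportsNewOp;
    }

    // Simulated remote call: an old server rejects the newer operation.
    private String call(Op op) {
        if (op == Op.GETFILEBLOCKLOCATIONS && !serverSupportsNewOp) {
            throw new UnsupportedOperationException(op.name());
        }
        return "locations via " + op.name();
    }

    public String getLocations() {
        try {
            return call(serverHCFSCompatible
                ? Op.GETFILEBLOCKLOCATIONS : Op.GET_BLOCK_LOCATIONS);
        } catch (UnsupportedOperationException e) {
            if (!serverHCFSCompatible) {
                throw e; // already on the legacy path; nothing to fall back to
            }
            serverHCFSCompatible = false;        // cache the downgrade
            return call(Op.GET_BLOCK_LOCATIONS); // retry once with the legacy op
        }
    }

    public static void main(String[] args) {
        FallbackClientDemo oldServer = new FallbackClientDemo(false);
        System.out.println(oldServer.getLocations()); // falls back to the legacy op
        System.out.println(oldServer.getLocations()); // legacy op used directly now
    }
}
```

The point of the cached flag is that only the first call against an old server pays the failed-attempt round trip; subsequent calls go straight to the legacy operation.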
[jira] [Commented] (HADOOP-18401) No ARM binaries in branch-3.3.x releases
[ https://issues.apache.org/jira/browse/HADOOP-18401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613565#comment-17613565 ] ASF GitHub Bot commented on HADOOP-18401: - ayushtkn commented on PR #4953: URL: https://github.com/apache/hadoop/pull/4953#issuecomment-1270138553 For me everything worked apart from the last step, which moves the changelog, because I didn’t change the version and there was no jira marked to 3.3.9-SNAPSHOT (obviously); I should have changed the version to 3.3.5 to make it pass the last step. > No ARM binaries in branch-3.3.x releases > > > Key: HADOOP-18401 > URL: https://issues.apache.org/jira/browse/HADOOP-18401 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.3.2, 3.3.3, 3.3.4 >Reporter: Ling Xu >Priority: Minor > Labels: pull-request-available > Attachments: image-2022-08-11-14-54-15-490.png > > > release files miss hadoop-3.3.4-aarch64.tar.gz > !image-2022-08-11-14-54-15-490.png! -- This message was sent by Atlassian Jira (v8.20.10#820010)
[GitHub] [hadoop] ayushtkn commented on pull request #4953: HADOOP-18401. No ARM binaries in branch-3.3.x releases.
ayushtkn commented on PR #4953: URL: https://github.com/apache/hadoop/pull/4953#issuecomment-1270138553 For me everything worked apart from the last step, which moves the changelog, because I didn’t change the version and there was no jira marked to 3.3.9-SNAPSHOT (obviously); I should have changed the version to 3.3.5 to make it pass the last step.
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
ashutoshcipher commented on code in PR #4750: URL: https://github.com/apache/hadoop/pull/4750#discussion_r989106311 ## hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/web/JsonUtilClient.java: ## @@ -965,4 +968,53 @@ private static SnapshotStatus toSnapshotStatus( SnapshotStatus.getParentPath(fullPath))); return snapshotStatus; } + + @VisibleForTesting + public static BlockLocation[] toBlockLocationArray(Map json) + throws IOException { +final Map rootmap = +(Map) json.get(BlockLocation.class.getSimpleName() + "s"); +final List array = +JsonUtilClient.getList(rootmap, BlockLocation.class.getSimpleName()); +Preconditions.checkNotNull(array); +final BlockLocation[] locations = new BlockLocation[array.size()]; +int i = 0; +for (Object object : array) { + final Map m = (Map) object; + locations[i++] = JsonUtilClient.toBlockLocation(m); +} +return locations; + } + + /** Convert a Json map to BlockLocation. **/ + private static BlockLocation toBlockLocation(Map m) throws IOException { +if (m == null) { + return null; +} +long length = ((Number) m.get("length")).longValue(); +long offset = ((Number) m.get("offset")).longValue(); +boolean corrupt = Boolean.getBoolean(m.get("corrupt").toString()); +String[] storageIds = toStringArray(getList(m, "storageIds")); Review Comment: Thanks for suggestion. I think current way looks fine too. Any specific reason to modify? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
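A side note on the quoted `toBlockLocation` helper: `Boolean.getBoolean(m.get("corrupt").toString())` consults a JVM *system property* named by its argument, whereas `Boolean.parseBoolean` parses the supplied text itself — an easy pair to confuse. A small demonstration (plain Java, independent of the Hadoop code):

```java
public class BooleanParseDemo {
    public static void main(String[] args) {
        // Boolean.getBoolean("true") looks up a system property named "true";
        // unless such a property happens to be set, it returns false
        // regardless of what the string says.
        System.out.println(Boolean.getBoolean("true"));    // false
        // Boolean.parseBoolean parses the text directly.
        System.out.println(Boolean.parseBoolean("true"));  // true
        System.out.println(Boolean.parseBoolean("FALSE")); // false (case-insensitive)
    }
}
```

Whether the quoted diff intends the system-property lookup is something the PR author would have to confirm; for parsing a JSON field, `Boolean.parseBoolean` (or casting the JSON value to `Boolean`) is the usual choice.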
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4750: HDFS-6874. Add GETFILEBLOCKLOCATIONS operation to HttpFS
ashutoshcipher commented on code in PR #4750: URL: https://github.com/apache/hadoop/pull/4750#discussion_r989105774 ## hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSServer.java: ## @@ -370,7 +370,26 @@ public InputStream run() throws Exception { break; } case GETFILEBLOCKLOCATIONS: { - response = Response.status(Response.Status.BAD_REQUEST).build(); + long offset = 0; + long len = Long.MAX_VALUE; + Long offsetParam = params.get(OffsetParam.NAME, OffsetParam.class); + Long lenParam = params.get(LenParam.NAME, LenParam.class); + AUDIT_LOG.info("[{}] offset [{}] len [{}]", path, offsetParam, lenParam); + if (offsetParam != null && offsetParam.longValue() > 0) { Review Comment: Sure
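The handler in this hunk defaults `offset` to 0 and `len` to `Long.MAX_VALUE` when the query parameters are absent, and (for the offset, per the visible condition) overrides the default only for positive values. A minimal sketch of that defaulting logic (method names are illustrative, not the HttpFS API, and the `len` check is assumed to be analogous to the visible `offset` check):

```java
public class ParamDefaultsDemo {
    // Null or non-positive offset falls back to 0 (start of file).
    public static long effectiveOffset(Long offsetParam) {
        return (offsetParam != null && offsetParam > 0) ? offsetParam : 0L;
    }

    // Null or non-positive len falls back to "the whole file".
    public static long effectiveLen(Long lenParam) {
        return (lenParam != null && lenParam > 0) ? lenParam : Long.MAX_VALUE;
    }

    public static void main(String[] args) {
        System.out.println(effectiveOffset(null)); // 0
        System.out.println(effectiveOffset(128L)); // 128
        System.out.println(effectiveLen(null));    // 9223372036854775807
        System.out.println(effectiveLen(10L));     // 10
    }
}
```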
[GitHub] [hadoop] aajisaka merged pull request #4463: YARN-11187. Remove WhiteBox in yarn module.
aajisaka merged PR #4463: URL: https://github.com/apache/hadoop/pull/4463
[jira] [Commented] (HADOOP-18465) S3A server-side encryption tests fail before checking encryption tests should skip
[ https://issues.apache.org/jira/browse/HADOOP-18465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613547#comment-17613547 ] ASF GitHub Bot commented on HADOOP-18465: - dannycjones commented on code in PR #4925: URL: https://github.com/apache/hadoop/pull/4925#discussion_r989065662 ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/AbstractTestS3AEncryption.java: ## @@ -78,6 +78,14 @@ protected void patchConfigurationEncryptionSettings( 0, 1, 2, 3, 4, 5, 254, 255, 256, 257, 2 ^ 12 - 1 }; + /** + * Skips the tests if encryption is not enabled in configuration. + * + * @implNote We can use {@link #createConfiguration()} here since Review Comment: I never realised they weren't JavaDoc standard tags. I think they're just used with JDK docs. I'm not really fussed either way, I can move to an inline comment really. Just as a note, if this was in `public` src code then I think builds would fail with something like this: ``` [ERROR] hadoop/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/audit/impl/LoggingAuditor.java:61: error: unknown tag: myTag [ERROR] * @myTag What happens if I build this? ``` Happy to move it to inline comment in `trunk` - let me know. > S3A server-side encryption tests fail before checking encryption tests should > skip > -- > > Key: HADOOP-18465 > URL: https://issues.apache.org/jira/browse/HADOOP-18465 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Minor > Labels: pull-request-available > > When setting {{test.fs.s3a.encryption.enabled}} to {{{}false{}}}, this is not > respected by ITestS3AEncryptionSSEKMSDefaultKey. See failure below. > > {code:java} > -- > Test set: org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey > --- > Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 6.053 s <<< > FAILURE! 
- in org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey > testEncryptionOverRename(org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey) > Time elapsed: 3.063 s <<< ERROR! > org.apache.hadoop.fs.s3a.AWSBadRequestException: PUT 0-byte object on > fork-0002/test: com.amazonaws.services.s3.model.AmazonS3Exception: SSE > unavailable (Service: Amazon S3; Status Code: 400; Proxy: null) > at > org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:242) > at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:124) > at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:376) > at > org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:468) > at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:372) > at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:347) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.createEmptyObject(S3AFileSystem.java:4394) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.createFakeDirectory(S3AFileSystem.java:4379) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.access$1800(S3AFileSystem.java:268) > at > org.apache.hadoop.fs.s3a.S3AFileSystem$MkdirOperationCallbacksImpl.createFakeDirectory(S3AFileSystem.java:3469) > at > org.apache.hadoop.fs.s3a.impl.MkdirOperation.execute(MkdirOperation.java:159) > at > org.apache.hadoop.fs.s3a.impl.MkdirOperation.execute(MkdirOperation.java:57) > at > org.apache.hadoop.fs.s3a.impl.ExecutingStoreOperation.apply(ExecutingStoreOperation.java:76) > at > org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.invokeTrackingDuration(IOStatisticsBinding.java:547) > at > org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:528) > at > org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:449) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2441) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2460) > 
at > org.apache.hadoop.fs.s3a.S3AFileSystem.mkdirs(S3AFileSystem.java:3435) > at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2456) > at > org.apache.hadoop.fs.contract.AbstractFSContractTestBase.mkdirs(AbstractFSContractTestBase.java:363) > at > org.apache.hadoop.fs.contract.AbstractFSContractTestBase.setup(AbstractFSContractTestBase.java:205) > at >
[GitHub] [hadoop] dannycjones commented on a diff in pull request #4925: HADOOP-18465. Fix S3A SSE test skip when encryption is disabled
dannycjones commented on code in PR #4925: URL: https://github.com/apache/hadoop/pull/4925#discussion_r989065662 ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/AbstractTestS3AEncryption.java: ## @@ -78,6 +78,14 @@ protected void patchConfigurationEncryptionSettings( 0, 1, 2, 3, 4, 5, 254, 255, 256, 257, 2 ^ 12 - 1 }; + /** + * Skips the tests if encryption is not enabled in configuration. + * + * @implNote We can use {@link #createConfiguration()} here since Review Comment: I never realised they weren't JavaDoc standard tags. I think they're just used with JDK docs. I'm not really fussed either way, I can move to an inline comment really. Just as a note, if this was in `public` src code then I think builds would fail with something like this: ``` [ERROR] hadoop/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/audit/impl/LoggingAuditor.java:61: error: unknown tag: myTag [ERROR] * @myTag What happens if I build this? ``` Happy to move it to inline comment in `trunk` - let me know. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17767) ABFS: Improve test scripts
[ https://issues.apache.org/jira/browse/HADOOP-17767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613535#comment-17613535 ] ASF GitHub Bot commented on HADOOP-17767: - steveloughran commented on PR #3124: URL: https://github.com/apache/hadoop/pull/3124#issuecomment-1270064249 rebase to/merge in trunk to see if that fixes the build > ABFS: Improve test scripts > -- > > Key: HADOOP-17767 > URL: https://issues.apache.org/jira/browse/HADOOP-17767 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Affects Versions: 3.3.0 >Reporter: Sneha Vijayarajan >Assignee: Sneha Vijayarajan >Priority: Minor > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 1h 50m > Remaining Estimate: 0h > > Current test run scripts need manual update across all combinations in > runTests.sh for account name and is working off a single azure-auth-keys.xml > file. While having to test across accounts that span various geo, the config > file grows big and also needs a manual change for configs such as > fs.contract.test.[abfs/abfss] which has to be uniquely set. To use the script > across various combinations, dev to be aware of the names of all the > combinations defined in runTests.sh as well. > > These concerns are addressed in the new version of the scripts. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17767) ABFS: Improve test scripts
[ https://issues.apache.org/jira/browse/HADOOP-17767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613534#comment-17613534 ] ASF GitHub Bot commented on HADOOP-17767: - steveloughran commented on code in PR #3124: URL: https://github.com/apache/hadoop/pull/3124#discussion_r989059216

## hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/utils/CleanupTestContainers.java:
## @@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.fs.azurebfs.utils;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.microsoft.azure.storage.CloudStorageAccount;
+import com.microsoft.azure.storage.blob.CloudBlobClient;
+import com.microsoft.azure.storage.blob.CloudBlobContainer;
+import com.microsoft.azure.storage.StorageCredentials;
+import com.microsoft.azure.storage.StorageCredentialsAccountAndKey;
+
+import org.apache.hadoop.fs.azurebfs.AbstractAbfsIntegrationTest;
+import org.apache.hadoop.fs.azurebfs.AbfsConfiguration;
+
+/**
+ * This looks like a test, but it is really a command to invoke to
+ * clean up containers created in other test runs.
+ *
+ */
+public class CleanupTestContainers extends AbstractAbfsIntegrationTest {
+  private static final Logger LOG = LoggerFactory.getLogger(CleanupTestContainers.class);
+  private static final String CONTAINER_PREFIX = "abfs-testcontainer-";
+
+  public CleanupTestContainers() throws Exception {
+  }
+
+  @org.junit.Test
+  public void testDeleteContainers() throws Throwable {

Review Comment: we will get a stack trace; that should be enough unless there's extra information to print about credentials
[jira] [Commented] (HADOOP-18465) S3A server-side encryption tests fail before checking encryption tests should skip
[ https://issues.apache.org/jira/browse/HADOOP-18465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613531#comment-17613531 ] ASF GitHub Bot commented on HADOOP-18465: - dannycjones commented on PR #4925: URL: https://github.com/apache/hadoop/pull/4925#issuecomment-1270045433

> ok, merged to trunk. this can go into -3.3 as well, can't it?

Yeah, I don't see why not. I've opened #4977. Integ tests are running ATM.

> S3A server-side encryption tests fail before checking encryption tests should skip
> ----------------------------------------------------------------------------------
>
>                 Key: HADOOP-18465
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18465
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>            Reporter: Daniel Carl Jones
>            Assignee: Daniel Carl Jones
>            Priority: Minor
>              Labels: pull-request-available
>
> When setting {{test.fs.s3a.encryption.enabled}} to {{false}}, this is not respected by ITestS3AEncryptionSSEKMSDefaultKey. See failure below.
>
> {code:java}
> -------------------------------------------------------------------------------
> Test set: org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey
> -------------------------------------------------------------------------------
> Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 6.053 s <<< FAILURE! - in org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey
> testEncryptionOverRename(org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey)  Time elapsed: 3.063 s <<< ERROR!
> org.apache.hadoop.fs.s3a.AWSBadRequestException: PUT 0-byte object on fork-0002/test: com.amazonaws.services.s3.model.AmazonS3Exception: SSE unavailable (Service: Amazon S3; Status Code: 400; Proxy: null)
>         at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:242)
>         at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:124)
>         at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:376)
>         at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:468)
>         at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:372)
>         at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:347)
>         at org.apache.hadoop.fs.s3a.S3AFileSystem.createEmptyObject(S3AFileSystem.java:4394)
>         at org.apache.hadoop.fs.s3a.S3AFileSystem.createFakeDirectory(S3AFileSystem.java:4379)
>         at org.apache.hadoop.fs.s3a.S3AFileSystem.access$1800(S3AFileSystem.java:268)
>         at org.apache.hadoop.fs.s3a.S3AFileSystem$MkdirOperationCallbacksImpl.createFakeDirectory(S3AFileSystem.java:3469)
>         at org.apache.hadoop.fs.s3a.impl.MkdirOperation.execute(MkdirOperation.java:159)
>         at org.apache.hadoop.fs.s3a.impl.MkdirOperation.execute(MkdirOperation.java:57)
>         at org.apache.hadoop.fs.s3a.impl.ExecutingStoreOperation.apply(ExecutingStoreOperation.java:76)
>         at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.invokeTrackingDuration(IOStatisticsBinding.java:547)
>         at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:528)
>         at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:449)
>         at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2441)
>         at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2460)
>         at org.apache.hadoop.fs.s3a.S3AFileSystem.mkdirs(S3AFileSystem.java:3435)
>         at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2456)
>         at org.apache.hadoop.fs.contract.AbstractFSContractTestBase.mkdirs(AbstractFSContractTestBase.java:363)
>         at org.apache.hadoop.fs.contract.AbstractFSContractTestBase.setup(AbstractFSContractTestBase.java:205)
>         at org.apache.hadoop.fs.s3a.AbstractS3ATestBase.setup(AbstractS3ATestBase.java:111)
>         at org.apache.hadoop.fs.s3a.AbstractTestS3AEncryption.setup(AbstractTestS3AEncryption.java:94)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>         at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>         at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>         at >
[jira] [Commented] (HADOOP-18465) S3A server-side encryption tests fail before checking encryption tests should skip
[ https://issues.apache.org/jira/browse/HADOOP-18465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613528#comment-17613528 ] ASF GitHub Bot commented on HADOOP-18465: - dannycjones opened a new pull request, #4977: URL: https://github.com/apache/hadoop/pull/4977

### Description of PR

**Note:** This is the same as #4925 but against `branch-3.3`.

When running integration tests against an S3-compatible endpoint without SSE support, tests would fail despite marking `test.fs.s3a.encryption.enabled` as false.

### How was this patch tested?

Against eu-west-1, no special configuration. Encryption enabled. TBD

### For code changes:

- [x] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [x] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [x] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
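The HADOOP-18465 fix discussed above makes the encryption test setup honor `test.fs.s3a.encryption.enabled` before any object is written, so an endpoint without SSE support never receives the failing PUT. The actual patch is in PR #4925/#4977 and uses the JUnit assumption machinery of the test base classes; the sketch below only illustrates the guard logic with hypothetical names (`EncryptionTestGuard`, `SkipException` are stand-ins, not Hadoop classes):

```java
import java.util.Map;

// Illustrative guard: check the enabling property before test setup runs.
// SkipException stands in for JUnit's AssumptionViolatedException, which
// the real test suite throws to mark a test as skipped rather than failed.
public class EncryptionTestGuard {
    static class SkipException extends RuntimeException {
        SkipException(String reason) { super(reason); }
    }

    // Throw before any filesystem setup when encryption tests are disabled,
    // so the store is never touched and no AWSBadRequestException can occur.
    static void maybeSkipEncryptionTests(Map<String, String> conf) {
        boolean enabled = Boolean.parseBoolean(
            conf.getOrDefault("test.fs.s3a.encryption.enabled", "true"));
        if (!enabled) {
            throw new SkipException("Skipping: test.fs.s3a.encryption.enabled is false");
        }
    }

    public static void main(String[] args) {
        try {
            maybeSkipEncryptionTests(Map.of("test.fs.s3a.encryption.enabled", "false"));
        } catch (SkipException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The key design point, per the JIRA title, is ordering: the skip check has to run before the superclass `setup()` creates test directories, otherwise the failure happens before the check is ever reached.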
[GitHub] [hadoop] slfan1989 opened a new pull request, #4976: YARN-11328. Refactoring part of the code of SQLFederationStateStore.
slfan1989 opened a new pull request, #4976: URL: https://github.com/apache/hadoop/pull/4976

JIRA: YARN-11328. Refactoring part of the code of SQLFederationStateStore.

1. When passing parameters to the database, use the field name instead of the index to increase the readability of the code.
2. Modify some logs and use {} instead of string splicing.
3. Other checkstyle fixes.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
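Point 2 above — `{}` placeholders instead of string splicing — refers to SLF4J-style parameterized logging, where the message is only assembled if the log level is enabled, whereas concatenation always builds the string. A toy formatter illustrating the substitution (this is an illustration only; the real implementation is in SLF4J's `MessageFormatter`, and `PlaceholderDemo` is a made-up name):

```java
// Toy illustration of SLF4J-style "{}" substitution using only the JDK.
public class PlaceholderDemo {
    // Replace each "{}" in the template with the next argument, left to right.
    static String format(String template, Object... args) {
        StringBuilder out = new StringBuilder();
        int cursor = 0;
        for (Object arg : args) {
            int idx = template.indexOf("{}", cursor);
            if (idx < 0) {
                break; // more args than placeholders: ignore the extras
            }
            out.append(template, cursor, idx).append(arg);
            cursor = idx + 2;
        }
        out.append(template.substring(cursor));
        return out.toString();
    }

    public static void main(String[] args) {
        // With LOG.info("Added app {} to queue {}", appId, queue), this work
        // is deferred until the logger knows the INFO level is enabled.
        String msg = format("Added app {} to queue {}", "app_01", "root.default");
        System.out.println(msg); // Added app app_01 to queue root.default
    }
}
```

Besides the deferred-formatting benefit, the `{}` form avoids the `StringBuilder` churn of `"Added app " + appId + ...` on hot logging paths, which is why checkstyle/spotbugs commonly flag string splicing in log statements.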
[GitHub] [hadoop] hadoop-yetus commented on pull request #4972: HADOOP-18480. Upgrade aws sdk to 1.12.316
hadoop-yetus commented on PR #4972: URL: https://github.com/apache/hadoop/pull/4972#issuecomment-1269998855

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 20m 21s | | Docker mode activated. |
| | | | | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s | | xmllint was not available. |
| +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| | | | | _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 27s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 30m 26s | | trunk passed |
| +1 :green_heart: | compile | 27m 18s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 23m 40s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | mvnsite | 22m 2s | | trunk passed |
| +1 :green_heart: | javadoc | 9m 18s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 8m 1s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | shadedclient | 40m 35s | | branch has no errors when building and testing our client artifacts. |
| | | | | _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 38s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 26m 16s | | the patch passed |
| +1 :green_heart: | compile | 27m 42s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 27m 42s | | the patch passed |
| +1 :green_heart: | compile | 24m 21s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 24m 21s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | mvnsite | 22m 56s | | the patch passed |
| +1 :green_heart: | shellcheck | 0m 1s | | No new issues. |
| +1 :green_heart: | javadoc | 9m 47s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 9m 0s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | shadedclient | 45m 30s | | patch has no errors when building and testing our client artifacts. |
| | | | | _ Other Tests _ |
| -1 :x: | unit | 1071m 6s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4972/1/artifact/out/patch-unit-root.txt) | root in the patch passed. |
| +1 :green_heart: | asflicense | 2m 21s | | The patch does not generate ASF License warnings. |
| | | | 1403m 41s | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.server.namenode.ha.TestObserverNode |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4972/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4972 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs |
| uname | Linux 762b195f39e5 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / e351ff3910db637d7658ae530545be1fae238d24 |
| Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4972/1/testReport/ |
| Max. process+thread count | 3438 (vs. ulimit of 5500) |
| modules | C: hadoop-project . U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4972/1/console |
| versions | git=2.25.1 maven=3.6.3 |
[jira] [Updated] (HADOOP-18480) upgrade AWS SDK for release 3.3.5
[ https://issues.apache.org/jira/browse/HADOOP-18480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-18480: Labels: pull-request-available (was: )

> upgrade AWS SDK for release 3.3.5
> ---------------------------------
>
>                 Key: HADOOP-18480
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18480
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: build, fs/s3
>    Affects Versions: 3.3.5
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>              Labels: pull-request-available
>
> go up to the latest sdk through the usual qualification process.
> no doubt it'll be bigger...
[GitHub] [hadoop] hadoop-yetus commented on pull request #4938: YARN-8041. [Router] Federation: Improve Router REST API Metrics.
hadoop-yetus commented on PR #4938: URL: https://github.com/apache/hadoop/pull/4938#issuecomment-1269992808

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 47s | | Docker mode activated. |
| | | | | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. |
| | | | | _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 39m 28s | | trunk passed |
| +1 :green_heart: | compile | 0m 41s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | compile | 0m 39s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 39s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 52s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 54s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javadoc | 0m 37s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 18s | | trunk passed |
| +1 :green_heart: | shadedclient | 20m 38s | | branch has no errors when building and testing our client artifacts. |
| -0 :warning: | patch | 21m 1s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
| | | | | _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 27s | | the patch passed |
| +1 :green_heart: | compile | 0m 31s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 |
| +1 :green_heart: | javac | 0m 31s | | the patch passed |
| +1 :green_heart: | compile | 0m 26s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 26s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 0m 20s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4938/20/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.txt) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router: The patch generated 1 new + 0 unchanged - 1 fixed = 1 total (was 1) |
| +1 :green_heart: | mvnsite | 0m 28s | | the patch passed |
| -1 :x: | javadoc | 0m 24s | [/results-javadoc-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router-jdkUbuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4938/20/artifact/out/results-javadoc-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router-jdkUbuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04.txt) | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router-jdkUbuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 generated 1 new + 5 unchanged - 0 fixed = 6 total (was 5) |
| -1 :x: | javadoc | 0m 23s | [/results-javadoc-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router-jdkPrivateBuild-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4938/20/artifact/out/results-javadoc-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router-jdkPrivateBuild-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router-jdkPrivateBuild-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 generated 1 new + 5 unchanged - 0 fixed = 6 total (was 5) |
| +1 :green_heart: | spotbugs | 0m 53s | | the patch passed |
| +1 :green_heart: | shadedclient | 20m 13s | | patch has no errors when building and testing our client artifacts. |
| | | | | _ Other Tests _ |
| +1 :green_heart: | unit | 3m 57s | | hadoop-yarn-server-router in the patch passed. |
| +1 :green_heart: | asflicense | 0m 49s | | The patch does not generate ASF License warnings. |
| | | | 97m 33s | |

| Subsystem | Report/Notes |
[GitHub] [hadoop] hadoop-yetus commented on pull request #4934: YARN-11315. [Federation] YARN Federation Router Supports Cross-Origin.
hadoop-yetus commented on PR #4934: URL: https://github.com/apache/hadoop/pull/4934#issuecomment-1269956592 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 13m 51s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 16m 7s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 27m 26s | | trunk passed | | +1 :green_heart: | compile | 23m 35s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 21m 1s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 4m 11s | | trunk passed | | +1 :green_heart: | mvnsite | 6m 54s | | trunk passed | | +1 :green_heart: | javadoc | 5m 56s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 5m 7s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 0m 55s | | branch/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site no spotbugs output file (spotbugsXml.xml) | | +1 :green_heart: | shadedclient | 21m 26s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 31s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 3m 23s | | the patch passed | | +1 :green_heart: | compile | 22m 49s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 22m 49s | | the patch passed | | +1 :green_heart: | compile | 21m 3s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 21m 3s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 4m 3s | | the patch passed | | +1 :green_heart: | mvnsite | 6m 49s | | the patch passed | | +1 :green_heart: | javadoc | 5m 38s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 5m 49s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 0m 51s | | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site has no data from spotbugs | | +1 :green_heart: | shadedclient | 21m 21s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 56s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 1m 40s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 19s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 4m 25s | | hadoop-yarn-server-router in the patch passed. | | +1 :green_heart: | unit | 0m 50s | | hadoop-yarn-site in the patch passed. | | +1 :green_heart: | asflicense | 1m 36s | | The patch does not generate ASF License warnings. 
| | | | 296m 56s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4934/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4934 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint markdownlint | | uname | Linux 304ab9fdcc61 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 7af2e21a8e56a9365616e88ac388435a419b294f | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private
[GitHub] [hadoop] hadoop-yetus commented on pull request #4915: YARN-11294. [Federation] Router Support DelegationToken store/update/remove Token With MemoryStateStore.
hadoop-yetus commented on PR #4915: URL: https://github.com/apache/hadoop/pull/4915#issuecomment-1269947430 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 8s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 6 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 31s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 26m 17s | | trunk passed | | +1 :green_heart: | compile | 9m 56s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 8m 47s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 2m 1s | | trunk passed | | +1 :green_heart: | mvnsite | 4m 2s | | trunk passed | | +1 :green_heart: | javadoc | 3m 50s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 3m 57s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 33s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 10s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 23m 40s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 31s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 18s | | the patch passed | | +1 :green_heart: | compile | 9m 7s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 9m 7s | | the patch passed | | +1 :green_heart: | compile | 8m 49s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 8m 49s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 55s | | hadoop-yarn-project/hadoop-yarn: The patch generated 0 new + 26 unchanged - 2 fixed = 26 total (was 28) | | +1 :green_heart: | mvnsite | 3m 53s | | the patch passed | | +1 :green_heart: | javadoc | 3m 6s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 3m 4s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 31s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 53s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 5m 24s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 3m 51s | | hadoop-yarn-server-common in the patch passed. | | +1 :green_heart: | unit | 104m 54s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 1m 10s | | The patch does not generate ASF License warnings. 
| | | | 286m 57s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4915/13/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4915 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux ff055d369515 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / bb7b13cf0f86c3cc83d3b3d69af25d59cc4f9555 | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4915/13/testReport/ | | Max. process+thread count | 964 (vs. ulimit of 5500) | |
[jira] [Commented] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613485#comment-17613485 ] ASF GitHub Bot commented on HADOOP-18304: - mehakmeet commented on PR #4478: URL: https://github.com/apache/hadoop/pull/4478#issuecomment-1269937773 Yes @dannycjones, I'll check this by tomorrow morning IST. > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 2.5h > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18401) No ARM binaries in branch-3.3.x releases
[ https://issues.apache.org/jira/browse/HADOOP-18401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613480#comment-17613480 ] ASF GitHub Bot commented on HADOOP-18401: - hadoop-yetus commented on PR #4953: URL: https://github.com/apache/hadoop/pull/4953#issuecomment-1269924246 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 37s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | _ branch-3.3 Compile Tests _ | | +1 :green_heart: | shadedclient | 32m 14s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | hadolint | 0m 2s | | No new issues. | | +1 :green_heart: | shellcheck | 0m 1s | | No new issues. | | +1 :green_heart: | shadedclient | 23m 56s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 0m 46s | | The patch does not generate ASF License warnings. 
| | | | 59m 36s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4953/7/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4953 | | Optional Tests | dupname asflicense codespell detsecrets shellcheck shelldocs hadolint | | uname | Linux 5c1e5471201d 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / 2141e9d77b42d824fd3f8989662a7ee9b964d01d | | Max. process+thread count | 626 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4953/7/console | | versions | git=2.17.1 maven=3.6.0 hadolint=1.11.1-0-g0e692dd shellcheck=0.4.6 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > No ARM binaries in branch-3.3.x releases > > > Key: HADOOP-18401 > URL: https://issues.apache.org/jira/browse/HADOOP-18401 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.3.2, 3.3.3, 3.3.4 >Reporter: Ling Xu >Priority: Minor > Labels: pull-request-available > Attachments: image-2022-08-11-14-54-15-490.png > > > release files miss hadoop-3.3.4-aarch64.tar.gz > !image-2022-08-11-14-54-15-490.png! -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4954: YARN-11323. [Federation] Improve ResourceManager Handler FinishApps.
hadoop-yetus commented on PR #4954: URL: https://github.com/apache/hadoop/pull/4954#issuecomment-1269919110 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 54s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 2s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 29m 49s | | trunk passed | | +1 :green_heart: | compile | 11m 16s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 9m 16s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 2m 7s | | trunk passed | | +1 :green_heart: | mvnsite | 4m 47s | | trunk passed | | +1 :green_heart: | javadoc | 4m 22s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 4m 4s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 8m 37s | | trunk passed | | +1 :green_heart: | shadedclient | 25m 15s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 25m 39s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 25s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 56s | | the patch passed | | +1 :green_heart: | compile | 9m 58s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 9m 58s | | the patch passed | | +1 :green_heart: | compile | 9m 5s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 9m 5s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 53s | | the patch passed | | +1 :green_heart: | mvnsite | 4m 22s | | the patch passed | | +1 :green_heart: | javadoc | 3m 50s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 3m 40s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 8m 52s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 25s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 1m 18s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 3s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 3m 18s | | hadoop-yarn-server-common in the patch passed. | | +1 :green_heart: | unit | 102m 52s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 1m 7s | | The patch does not generate ASF License warnings. 
| | | | 302m 32s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4954/13/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4954 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux e0a82cd33b00 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / fec1fa4fa26f5e12bb4f98d13d1e5634f3fa3709 | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results |
[GitHub] [hadoop] hadoop-yetus commented on pull request #4946: YARN-11317. [Federation] Refactoring Yarn Router's About Web Page.
hadoop-yetus commented on PR #4946: URL: https://github.com/apache/hadoop/pull/4946#issuecomment-1269894934 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 43s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 16s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 29m 45s | | trunk passed | | +1 :green_heart: | compile | 10m 32s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | compile | 9m 9s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 2m 2s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 47s | | trunk passed | | +1 :green_heart: | javadoc | 3m 50s | | trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 3m 23s | | trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 18s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 22s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 13s | | the patch passed | | +1 :green_heart: | compile | 10m 14s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javac | 10m 14s | | the patch passed | | +1 :green_heart: | compile | 9m 6s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 9m 6s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 52s | | the patch passed | | +1 :green_heart: | mvnsite | 3m 42s | | the patch passed | | +1 :green_heart: | javadoc | 3m 31s | | the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 | | +1 :green_heart: | javadoc | 3m 17s | | the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 5m 51s | | the patch passed | | +1 :green_heart: | shadedclient | 21m 33s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 5m 14s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 101m 12s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | unit | 5m 14s | | hadoop-yarn-server-router in the patch passed. | | +1 :green_heart: | asflicense | 1m 9s | | The patch does not generate ASF License warnings. 
| | | | 285m 43s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4946/9/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4946 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 68fe81e806ac 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / dfeee2cbd30d8e784333ce9cc266bd057b82446d | | Default Java | Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4946/9/testReport/ | | Max. process+thread count | 965 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router U: hadoop-yarn-project/hadoop-yarn | |
[jira] [Commented] (HADOOP-18465) S3A server-side encryption tests fail before checking encryption tests should skip
[ https://issues.apache.org/jira/browse/HADOOP-18465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613458#comment-17613458 ] ASF GitHub Bot commented on HADOOP-18465: - steveloughran merged PR #4925: URL: https://github.com/apache/hadoop/pull/4925 > S3A server-side encryption tests fail before checking encryption tests should > skip > -- > > Key: HADOOP-18465 > URL: https://issues.apache.org/jira/browse/HADOOP-18465 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Minor > Labels: pull-request-available > > When setting {{test.fs.s3a.encryption.enabled}} to {{{}false{}}}, this is not > respected by ITestS3AEncryptionSSEKMSDefaultKey. See failure below. > > {code:java} > -- > Test set: org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey > --- > Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 6.053 s <<< > FAILURE! - in org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey > testEncryptionOverRename(org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey) > Time elapsed: 3.063 s <<< ERROR! 
> org.apache.hadoop.fs.s3a.AWSBadRequestException: PUT 0-byte object on > fork-0002/test: com.amazonaws.services.s3.model.AmazonS3Exception: SSE > unavailable (Service: Amazon S3; Status Code: 400; Proxy: null) > at > org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:242) > at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:124) > at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:376) > at > org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:468) > at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:372) > at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:347) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.createEmptyObject(S3AFileSystem.java:4394) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.createFakeDirectory(S3AFileSystem.java:4379) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.access$1800(S3AFileSystem.java:268) > at > org.apache.hadoop.fs.s3a.S3AFileSystem$MkdirOperationCallbacksImpl.createFakeDirectory(S3AFileSystem.java:3469) > at > org.apache.hadoop.fs.s3a.impl.MkdirOperation.execute(MkdirOperation.java:159) > at > org.apache.hadoop.fs.s3a.impl.MkdirOperation.execute(MkdirOperation.java:57) > at > org.apache.hadoop.fs.s3a.impl.ExecutingStoreOperation.apply(ExecutingStoreOperation.java:76) > at > org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.invokeTrackingDuration(IOStatisticsBinding.java:547) > at > org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:528) > at > org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:449) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2441) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2460) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.mkdirs(S3AFileSystem.java:3435) > at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2456) > at > 
org.apache.hadoop.fs.contract.AbstractFSContractTestBase.mkdirs(AbstractFSContractTestBase.java:363) > at > org.apache.hadoop.fs.contract.AbstractFSContractTestBase.setup(AbstractFSContractTestBase.java:205) > at > org.apache.hadoop.fs.s3a.AbstractS3ATestBase.setup(AbstractS3ATestBase.java:111) > at > org.apache.hadoop.fs.s3a.AbstractTestS3AEncryption.setup(AbstractTestS3AEncryption.java:94) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) > at > org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) > at > org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) > at > org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33) > at > org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24) > at >
[jira] [Commented] (HADOOP-18465) S3A server-side encryption tests fail before checking encryption tests should skip
[ https://issues.apache.org/jira/browse/HADOOP-18465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17613459#comment-17613459 ]

ASF GitHub Bot commented on HADOOP-18465:
-----------------------------------------

steveloughran commented on PR #4925:
URL: https://github.com/apache/hadoop/pull/4925#issuecomment-1269877427

ok, merged to trunk. this can go into -3.3 as well, can't it?

> S3A server-side encryption tests fail before checking encryption tests should skip
> ----------------------------------------------------------------------------------
>
>                 Key: HADOOP-18465
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18465
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>            Reporter: Daniel Carl Jones
>            Assignee: Daniel Carl Jones
>            Priority: Minor
>              Labels: pull-request-available
>
> When setting {{test.fs.s3a.encryption.enabled}} to {{false}}, this is not respected by ITestS3AEncryptionSSEKMSDefaultKey. See failure below.
>
> {code:java}
> -------------------------------------------------------------------------------
> Test set: org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey
> -------------------------------------------------------------------------------
> Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 6.053 s <<< FAILURE! - in org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey
> testEncryptionOverRename(org.apache.hadoop.fs.s3a.ITestS3AEncryptionSSEKMSDefaultKey)  Time elapsed: 3.063 s  <<< ERROR!
> org.apache.hadoop.fs.s3a.AWSBadRequestException: PUT 0-byte object on fork-0002/test: com.amazonaws.services.s3.model.AmazonS3Exception: SSE unavailable (Service: Amazon S3; Status Code: 400; Proxy: null)
> 	at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:242)
> 	at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:124)
> 	at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:376)
> 	at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:468)
> 	at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:372)
> 	at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:347)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.createEmptyObject(S3AFileSystem.java:4394)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.createFakeDirectory(S3AFileSystem.java:4379)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.access$1800(S3AFileSystem.java:268)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem$MkdirOperationCallbacksImpl.createFakeDirectory(S3AFileSystem.java:3469)
> 	at org.apache.hadoop.fs.s3a.impl.MkdirOperation.execute(MkdirOperation.java:159)
> 	at org.apache.hadoop.fs.s3a.impl.MkdirOperation.execute(MkdirOperation.java:57)
> 	at org.apache.hadoop.fs.s3a.impl.ExecutingStoreOperation.apply(ExecutingStoreOperation.java:76)
> 	at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.invokeTrackingDuration(IOStatisticsBinding.java:547)
> 	at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:528)
> 	at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:449)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2441)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2460)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.mkdirs(S3AFileSystem.java:3435)
> 	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2456)
> 	at org.apache.hadoop.fs.contract.AbstractFSContractTestBase.mkdirs(AbstractFSContractTestBase.java:363)
> 	at org.apache.hadoop.fs.contract.AbstractFSContractTestBase.setup(AbstractFSContractTestBase.java:205)
> 	at org.apache.hadoop.fs.s3a.AbstractS3ATestBase.setup(AbstractS3ATestBase.java:111)
> 	at org.apache.hadoop.fs.s3a.AbstractTestS3AEncryption.setup(AbstractTestS3AEncryption.java:94)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
> 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
> 	at org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
> 	at ...
> {code}
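The trace shows why the test errors rather than skips: the S3A filesystem is created (and `mkdirs` issues a real PUT) inside `setup()` before the encryption-enabled flag is ever consulted. The fix pattern discussed in this issue is to check the flag before doing any expensive setup. A minimal sketch of that ordering follows; the class and method names here are illustrative stand-ins, not the actual Hadoop test code:

```java
import java.util.Map;

// Sketch: consult test.fs.s3a.encryption.enabled *before* creating the
// filesystem, so a disabled configuration skips instead of failing in mkdirs().
public class EncryptionTestSkipSketch {

    // Hypothetical stand-in for reading the Hadoop test configuration;
    // the flag defaults to enabled, matching the property's usual default.
    static boolean encryptionEnabled(Map<String, String> conf) {
        return Boolean.parseBoolean(
            conf.getOrDefault("test.fs.s3a.encryption.enabled", "true"));
    }

    // Returns "skipped" or "ran" instead of touching a real S3 store.
    static String runTest(Map<String, String> conf) {
        if (!encryptionEnabled(conf)) {
            return "skipped";   // bail out before any S3 request is made
        }
        // ...a real test would only now create the S3A filesystem and
        // call mkdirs(), the point where the quoted trace failed...
        return "ran";
    }

    public static void main(String[] args) {
        System.out.println(runTest(
            Map.of("test.fs.s3a.encryption.enabled", "false"))); // skipped
        System.out.println(runTest(Map.of()));                   // ran
    }
}
```

In the real test suite the skip would be signalled through the test framework (e.g. a JUnit assumption) rather than a return value, but the essential change is the same: the guard runs ahead of filesystem creation.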
[GitHub] [hadoop] steveloughran commented on pull request #4925: HADOOP-18465. Fix S3A SSE test skip when encryption is disabled
steveloughran commented on PR #4925:
URL: https://github.com/apache/hadoop/pull/4925#issuecomment-1269877427

ok, merged to trunk. this can go into -3.3 as well, can't it?

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] steveloughran merged pull request #4925: HADOOP-18465. Fix S3A SSE test skip when encryption is disabled
steveloughran merged PR #4925:
URL: https://github.com/apache/hadoop/pull/4925