[GitHub] [hadoop] prasad-acit commented on pull request #4241: HDFS-16563. Namenode WebUI prints sensitive information on Token expiry
prasad-acit commented on PR #4241:
URL: https://github.com/apache/hadoop/pull/4241#issuecomment-1112921828

Thanks @jojochuang. I have added logs with the Token info. I will consider other error/improvement scenarios and analyze further.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18079) Upgrade Netty to 4.1.74
[ https://issues.apache.org/jira/browse/HADOOP-18079?focusedWorklogId=763989&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763989 ]

ASF GitHub Bot logged work on HADOOP-18079:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 29/Apr/22 04:42
            Start Date: 29/Apr/22 04:42
    Worklog Time Spent: 10m

Work Description: brahmareddybattula commented on code in PR #3977:
URL: https://github.com/apache/hadoop/pull/3977#discussion_r861245824

hadoop-project/pom.xml, @@ -957,6 +957,72 @@ (dependency entries reconstructed from the flattened diff; each added artifact follows the same pattern):

    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-codec-socks</artifactId>
      <version>${netty4.version}</version>
    </dependency>

with identical entries added for netty-handler-proxy, netty-resolver, netty-handler, netty-buffer, netty-transport, netty-common, netty-transport-native-unix-common, and a second netty-transport entry.

Review Comment: Looks like netty-transport is given two times: here and #992.

hadoop-project/pom.xml, @@ -141,7 +141,7 @@:

     2.8.9
     3.2.4
     3.10.6.Final
    -4.1.68.Final
    +4.1.75.Final

Review Comment: Sorry to ask again: it looks like 4.1.76 is also available now. Maybe we can raise another jira for that if it is not covered by this one.

Issue Time Tracking
-------------------
    Worklog Id: (was: 763989)
    Time Spent: 2h 20m (was: 2h 10m)

> Upgrade Netty to 4.1.74
> -----------------------
>
>                 Key: HADOOP-18079
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18079
>             Project: Hadoop Common
>          Issue Type: Bug
>            Reporter: Renukaprasad C
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 2h 20m
>  Remaining Estimate: 0h
>
> h4. Netty version 4.1.71 has fixed some CVEs. We can upgrade Netty to
> 4.1.71.Final or the latest stable version, 4.1.72.Final.

--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] brahmareddybattula commented on a diff in pull request #3977: HADOOP-18079. Upgrade Netty to 4.1.75.
brahmareddybattula commented on code in PR #3977:
URL: https://github.com/apache/hadoop/pull/3977#discussion_r861245824

hadoop-project/pom.xml, @@ -957,6 +957,72 @@ (dependency entries reconstructed from the flattened diff; each added artifact follows the same pattern):

    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-codec-socks</artifactId>
      <version>${netty4.version}</version>
    </dependency>

with identical entries added for netty-handler-proxy, netty-resolver, netty-handler, netty-buffer, netty-transport, netty-common, netty-transport-native-unix-common, and a second netty-transport entry.

Review Comment: Looks like netty-transport is given two times: here and #992.

hadoop-project/pom.xml, @@ -141,7 +141,7 @@:

     2.8.9
     3.2.4
     3.10.6.Final
    -4.1.68.Final
    +4.1.75.Final

Review Comment: Sorry to ask again: it looks like 4.1.76 is also available now. Maybe we can raise another jira for that if it is not covered by this one.
[GitHub] [hadoop] hadoop-yetus commented on pull request #4247: MAPREDUCE-7369. Fixed MapReduce tasks timing out when spends more time on MultipleOutputs#close
hadoop-yetus commented on PR #4247:
URL: https://github.com/apache/hadoop/pull/4247#issuecomment-1112842289

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 1m 1s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 15m 6s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 24m 59s | | trunk passed |
| +1 :green_heart: | compile | 2m 50s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 2m 32s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 26s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 1s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 33s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 27s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 8s | | trunk passed |
| +1 :green_heart: | shadedclient | 20m 35s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 34s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 20s | | the patch passed |
| +1 :green_heart: | compile | 2m 34s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 2m 34s | | the patch passed |
| +1 :green_heart: | compile | 2m 16s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 2m 16s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 1m 5s | [/results-checkstyle-hadoop-mapreduce-project_hadoop-mapreduce-client.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4247/1/artifact/out/results-checkstyle-hadoop-mapreduce-project_hadoop-mapreduce-client.txt) | hadoop-mapreduce-project/hadoop-mapreduce-client: The patch generated 2 new + 478 unchanged - 0 fixed = 480 total (was 478) |
| +1 :green_heart: | mvnsite | 1m 26s | | the patch passed |
| +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. |
| +1 :green_heart: | javadoc | 1m 1s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 0m 59s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 47s | | the patch passed |
| +1 :green_heart: | shadedclient | 20m 16s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 6m 39s | | hadoop-mapreduce-client-core in the patch passed. |
| +1 :green_heart: | unit | 8m 51s | | hadoop-mapreduce-client-app in the patch passed. |
| +1 :green_heart: | asflicense | 0m 52s | | The patch does not generate ASF License warnings. |
| | | 129m 16s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4247/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4247 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
| uname | Linux b5d123e9a357 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / aa1e7fa48eb705be7746f86f02031a9548ec7130 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4247/1/testReport/ |
| Max. process+thread count | 1247 (vs. ulimit of 5500) |
| modules | C: hadoop-mapreduce-project/hadoop-mapredu
[GitHub] [hadoop] ashutoshcipher opened a new pull request, #4247: MAPREDUCE-7369. Fixed MapReduce tasks timing out when spends more time on MultipleOutputs#close
ashutoshcipher opened a new pull request, #4247:
URL: https://github.com/apache/hadoop/pull/4247

### Description of PR

Fixes MapReduce tasks timing out when extra time is spent in MultipleOutputs#close.

* JIRA: MAPREDUCE-7369
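The underlying problem is that a task attempt is killed by the framework timeout when a long-running close() stops reporting progress. As a rough, self-contained sketch of the idea (a hypothetical helper, not the actual Hadoop patch; in a real task the heartbeat would be `TaskAttemptContext#progress()`), the close can run on a worker thread while the caller keeps emitting heartbeats until it finishes:

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicInteger;

public class ProgressDuringClose {
    /**
     * Runs closeWork on a worker thread; the calling thread invokes the
     * heartbeat every heartbeatMillis until closeWork completes, so a
     * framework-level inactivity timeout is never tripped.
     */
    static void closeWithHeartbeat(Runnable closeWork, Runnable heartbeat,
                                   long heartbeatMillis) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<?> done = pool.submit(closeWork);
        try {
            while (true) {
                try {
                    done.get(heartbeatMillis, TimeUnit.MILLISECONDS);
                    return; // close finished
                } catch (TimeoutException stillRunning) {
                    heartbeat.run(); // e.g. TaskAttemptContext#progress()
                } catch (InterruptedException | ExecutionException e) {
                    throw new RuntimeException(e);
                }
            }
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        AtomicInteger beats = new AtomicInteger();
        // Simulate a slow close (300 ms) with a 50 ms heartbeat interval.
        closeWithHeartbeat(
            () -> { try { Thread.sleep(300); } catch (InterruptedException e) { } },
            beats::incrementAndGet,
            50);
        System.out.println("heartbeats observed: " + (beats.get() > 0));
    }
}
```

The same shape applies whether the slow step is MultipleOutputs#close flushing many writers or any other blocking teardown; only the heartbeat callback changes.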
[jira] [Assigned] (HADOOP-18069) CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client
[ https://issues.apache.org/jira/browse/HADOOP-18069?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ashutosh Gupta reassigned HADOOP-18069:
---------------------------------------
    Assignee: Ashutosh Gupta

> CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client
> -----------------------------------------------------
>
>                 Key: HADOOP-18069
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18069
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: hdfs-client
>    Affects Versions: 3.3.1
>            Reporter: Eugene Shinn (Truveta)
>            Assignee: Ashutosh Gupta
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 3h 50m
>  Remaining Estimate: 0h
>
> Our static vulnerability scanner (Fortify On Demand) detected [NVD - CVE-2021-0341 (nist.gov)|https://nvd.nist.gov/vuln/detail/CVE-2021-0341#VulnChangeHistorySection] in our application. We traced the vulnerability to a transitive dependency coming from hadoop-hdfs-client, which depends on okhttp@2.7.5 ([hadoop/pom.xml at trunk · apache/hadoop (github.com)|https://github.com/apache/hadoop/blob/trunk/hadoop-project/pom.xml#L137]). To resolve this issue, okhttp should be upgraded to 4.9.2+ (ref: [CVE-2021-0341 · Issue #6724 · square/okhttp (github.com)|https://github.com/square/okhttp/issues/6724]).
[GitHub] [hadoop] hadoop-yetus commented on pull request #4245: HDFS-16564. Use uint32_t for hdfs_find
hadoop-yetus commented on PR #4245:
URL: https://github.com/apache/hadoop/pull/4245#issuecomment-1112767405

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 12m 1s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +1 :green_heart: | mvninstall | 26m 3s | | trunk passed |
| +1 :green_heart: | compile | 3m 41s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 44s | | trunk passed |
| -1 :x: | shadedclient | 56m 29s | | branch has errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +1 :green_heart: | mvninstall | 0m 24s | | the patch passed |
| +1 :green_heart: | compile | 3m 19s | | the patch passed |
| +1 :green_heart: | cc | 3m 19s | | the patch passed |
| +1 :green_heart: | golang | 3m 19s | | the patch passed |
| +1 :green_heart: | javac | 3m 19s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | mvnsite | 0m 27s | | the patch passed |
| -1 :x: | shadedclient | 26m 1s | | patch has errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 32m 37s | | hadoop-hdfs-native-client in the patch passed. |
| +1 :green_heart: | asflicense | 0m 46s | | The patch does not generate ASF License warnings. |
| | | 134m 29s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4245 |
| Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell golang |
| uname | Linux b65ac514df5a 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 2eef94c4ae6df5e3b45e5cb286d7914f338f94ae |
| Default Java | Debian-11.0.14+9-post-Debian-1deb10u1 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/testReport/ |
| Max. process+thread count | 440 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/console |
| versions | git=2.20.1 maven=3.6.0 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[jira] [Work logged] (HADOOP-18069) CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client
[ https://issues.apache.org/jira/browse/HADOOP-18069?focusedWorklogId=763941&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763941 ]

ASF GitHub Bot logged work on HADOOP-18069:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 29/Apr/22 00:11
            Start Date: 29/Apr/22 00:11
    Worklog Time Spent: 10m

Work Description: ashutoshcipher commented on code in PR #4229:
URL: https://github.com/apache/hadoop/pull/4229#discussion_r861393238

LICENSE-binary, @@ -242,6 +242,7 @@:

     com.google.guava:listenablefuture:9999.0-empty-to-avoid-conflict-with-guava
     com.microsoft.azure:azure-storage:7.0.0
     com.nimbusds:nimbus-jose-jwt:9.8.1
     com.squareup.okhttp:okhttp:2.7.5

Review Comment: Done

Issue Time Tracking
-------------------
    Worklog Id: (was: 763941)
    Time Spent: 3h 50m (was: 3h 40m)

> CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client
> -----------------------------------------------------
>
>                 Key: HADOOP-18069
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18069
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: hdfs-client
>    Affects Versions: 3.3.1
>            Reporter: Eugene Shinn (Truveta)
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 3h 50m
>  Remaining Estimate: 0h
>
> Our static vulnerability scanner (Fortify On Demand) detected [NVD - CVE-2021-0341 (nist.gov)|https://nvd.nist.gov/vuln/detail/CVE-2021-0341#VulnChangeHistorySection] in our application. We traced the vulnerability to a transitive dependency coming from hadoop-hdfs-client, which depends on okhttp@2.7.5 ([hadoop/pom.xml at trunk · apache/hadoop (github.com)|https://github.com/apache/hadoop/blob/trunk/hadoop-project/pom.xml#L137]). To resolve this issue, okhttp should be upgraded to 4.9.2+ (ref: [CVE-2021-0341 · Issue #6724 · square/okhttp (github.com)|https://github.com/square/okhttp/issues/6724]).
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4229: HADOOP-18069. okhttp@2.7.5 to 4.9.3
ashutoshcipher commented on code in PR #4229:
URL: https://github.com/apache/hadoop/pull/4229#discussion_r861393238

LICENSE-binary, @@ -242,6 +242,7 @@:

     com.google.guava:listenablefuture:9999.0-empty-to-avoid-conflict-with-guava
     com.microsoft.azure:azure-storage:7.0.0
     com.nimbusds:nimbus-jose-jwt:9.8.1
     com.squareup.okhttp:okhttp:2.7.5

Review Comment: Done
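Until the upgraded release ships, downstream projects hit by the same CVE sometimes work around it in their own build. A hedged pom.xml sketch (a hypothetical consumer fragment, not part of this Hadoop patch; note the 4.x line of okhttp lives under the new com.squareup.okhttp3 coordinates, so it does not transparently replace the 2.x artifact, and the okhttp-based WebHDFS OAuth2 path must not be in use):

```xml
<!-- Hypothetical downstream pom.xml fragment: drop the vulnerable
     transitive okhttp 2.7.5 and pin a patched 4.x okhttp explicitly. -->
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs-client</artifactId>
    <version>3.3.1</version>
    <exclusions>
      <!-- Exclude the okhttp version flagged by CVE-2021-0341. -->
      <exclusion>
        <groupId>com.squareup.okhttp</groupId>
        <artifactId>okhttp</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <!-- Patched line under its new group coordinates. -->
  <dependency>
    <groupId>com.squareup.okhttp3</groupId>
    <artifactId>okhttp</artifactId>
    <version>4.9.3</version>
  </dependency>
</dependencies>
```

This only silences the scanner finding for code paths that never exercise okhttp 2.x; the real fix is the coordinate migration done in PR #4229.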
[GitHub] [hadoop] hadoop-yetus commented on pull request #4241: HDFS-16563. Namenode WebUI prints sensitive information on Token expiry
hadoop-yetus commented on PR #4241:
URL: https://github.com/apache/hadoop/pull/4241#issuecomment-1112741832

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 59s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| | _ trunk Compile Tests _ | | | |
| +1 :green_heart: | mvninstall | 40m 56s | | trunk passed |
| +1 :green_heart: | compile | 25m 15s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 21m 47s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 32s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 0s | | trunk passed |
| -1 :x: | javadoc | 1m 38s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4241/2/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in trunk failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. |
| +1 :green_heart: | javadoc | 2m 5s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 6s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 45s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +1 :green_heart: | mvninstall | 1m 4s | | the patch passed |
| +1 :green_heart: | compile | 24m 22s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 24m 22s | | the patch passed |
| +1 :green_heart: | compile | 21m 53s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 21m 53s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 26s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 57s | | the patch passed |
| -1 :x: | javadoc | 1m 27s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4241/2/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. |
| +1 :green_heart: | javadoc | 1m 59s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 1s | | the patch passed |
| +1 :green_heart: | shadedclient | 25m 42s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 18m 7s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 18s | | The patch does not generate ASF License warnings. |
| | | 227m 41s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4241/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4241 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 9fd1af415c69 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 9986c15091c1b084d611ccc5d6c5035e13a94f0f |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4241/2/testReport/ |
| Max. process+thread count | 2396 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common
[jira] [Work logged] (HADOOP-18079) Upgrade Netty to 4.1.74
[ https://issues.apache.org/jira/browse/HADOOP-18079?focusedWorklogId=763917&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763917 ]

ASF GitHub Bot logged work on HADOOP-18079:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 28/Apr/22 22:09
            Start Date: 28/Apr/22 22:09
    Worklog Time Spent: 10m

Work Description: jojochuang commented on PR #3977:
URL: https://github.com/apache/hadoop/pull/3977#issuecomment-1112699848

The test failures do not reproduce in my local tree. I'm triggering a rebuild to double check.

Issue Time Tracking
-------------------
    Worklog Id: (was: 763917)
    Time Spent: 2h 10m (was: 2h)

> Upgrade Netty to 4.1.74
> -----------------------
>
>                 Key: HADOOP-18079
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18079
>             Project: Hadoop Common
>          Issue Type: Bug
>            Reporter: Renukaprasad C
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> h4. Netty version 4.1.71 has fixed some CVEs. We can upgrade Netty to
> 4.1.71.Final or the latest stable version, 4.1.72.Final.
[GitHub] [hadoop] jojochuang commented on pull request #3977: HADOOP-18079. Upgrade Netty to 4.1.75.
jojochuang commented on PR #3977:
URL: https://github.com/apache/hadoop/pull/3977#issuecomment-1112699848

The test failures do not reproduce in my local tree. I'm triggering a rebuild to double check.
[GitHub] [hadoop] hadoop-yetus commented on pull request #4245: HDFS-16564. Use uint32_t for hdfs_find
hadoop-yetus commented on PR #4245:
URL: https://github.com/apache/hadoop/pull/4245#issuecomment-1112692180

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 21m 29s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +1 :green_heart: | mvninstall | 22m 5s | | trunk passed |
| +1 :green_heart: | compile | 4m 11s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 2s | | trunk passed |
| +1 :green_heart: | shadedclient | 46m 29s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +1 :green_heart: | mvninstall | 0m 32s | | the patch passed |
| +1 :green_heart: | compile | 3m 42s | | the patch passed |
| +1 :green_heart: | cc | 3m 42s | | the patch passed |
| +1 :green_heart: | golang | 3m 42s | | the patch passed |
| +1 :green_heart: | javac | 3m 43s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | mvnsite | 0m 35s | | the patch passed |
| +1 :green_heart: | shadedclient | 19m 1s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 33m 38s | | hadoop-hdfs-native-client in the patch passed. |
| +1 :green_heart: | asflicense | 0m 59s | | The patch does not generate ASF License warnings. |
| | | 128m 53s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4245 |
| Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell golang |
| uname | Linux 475f325777dc 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 2eef94c4ae6df5e3b45e5cb286d7914f338f94ae |
| Default Java | Red Hat, Inc.-1.8.0_312-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/testReport/ |
| Max. process+thread count | 549 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/console |
| versions | git=2.27.0 maven=3.6.3 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
[jira] [Work logged] (HADOOP-18214) Update BUILDING.txt
[ https://issues.apache.org/jira/browse/HADOOP-18214?focusedWorklogId=763897&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763897 ]

ASF GitHub Bot logged work on HADOOP-18214:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 28/Apr/22 21:22
            Start Date: 28/Apr/22 21:22
    Worklog Time Spent: 10m

Work Description: ayushtkn commented on PR #3811:
URL: https://github.com/apache/hadoop/pull/3811#issuecomment-1112666224

@gvieri well, yes. This one got committed initially without a Jira since it wasn't changing the core code, but in general we tend to have a Jira for almost everything. That's how we track issues. BTW, if you have a Jira account and have the Jira assigned to your name, you will also get credit in the Release Notes when Hadoop does the release. Just one of the things, if that interests or motivates you :-)

Issue Time Tracking
-------------------
    Worklog Id: (was: 763897)
    Time Spent: 40m (was: 0.5h)

> Update BUILDING.txt
> -------------------
>
>                 Key: HADOOP-18214
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18214
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: build, documentation
>    Affects Versions: 3.3.2
>            Reporter: Steve Loughran
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 3.4.0, 3.3.3
>
>          Time Spent: 40m
>  Remaining Estimate: 0h
>
> Update BUILDING.txt to match the docker build settings.
> This patch has already gone in, just without a Jira in its name:
> https://github.com/apache/hadoop/pull/3811
[GitHub] [hadoop] ayushtkn commented on pull request #3811: HADOOP-18214. Update BUILDING.txt
ayushtkn commented on PR #3811:
URL: https://github.com/apache/hadoop/pull/3811#issuecomment-1112666224

@gvieri well, yes. This one got committed initially without a Jira since it wasn't changing the core code, but in general we tend to have a Jira for almost everything. That's how we track issues. BTW, if you have a Jira account and have the Jira assigned to your name, you will also get credit in the Release Notes when Hadoop does the release. Just one of the things, if that interests or motivates you :-)
[jira] [Work logged] (HADOOP-18214) Update BUILDING.txt
[ https://issues.apache.org/jira/browse/HADOOP-18214?focusedWorklogId=763890&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763890 ]

ASF GitHub Bot logged work on HADOOP-18214:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 28/Apr/22 21:05
            Start Date: 28/Apr/22 21:05
    Worklog Time Spent: 10m

Work Description: gvieri commented on PR #3811:
URL: https://github.com/apache/hadoop/pull/3811#issuecomment-1112653767

Is it necessary to have an ASF JIRA account?

Issue Time Tracking
-------------------
    Worklog Id: (was: 763890)
    Time Spent: 0.5h (was: 20m)

> Update BUILDING.txt
> -------------------
>
>                 Key: HADOOP-18214
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18214
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: build, documentation
>    Affects Versions: 3.3.2
>            Reporter: Steve Loughran
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 3.4.0, 3.3.3
>
>          Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> Update BUILDING.txt to match the docker build settings.
> This patch has already gone in, just without a Jira in its name:
> https://github.com/apache/hadoop/pull/3811
[GitHub] [hadoop] gvieri commented on pull request #3811: HADOOP-18214. Update BUILDING.txt
gvieri commented on PR #3811: URL: https://github.com/apache/hadoop/pull/3811#issuecomment-1112653767 Is it necessary to have an ASF JIRA account?
[GitHub] [hadoop] dmmkr commented on pull request #4240: HDFS-16562. Upgrade moment.min.js to 2.29.2
dmmkr commented on PR #4240: URL: https://github.com/apache/hadoop/pull/4240#issuecomment-1112651982 Attaching the namenode and datanode UI screenshots: ![Screenshot from 2022-04-29 02-26-11](https://user-images.githubusercontent.com/13732639/165845208-c6627a29-0771-403a-8903-100e9b191ef1.png) ![Screenshot from 2022-04-29 02-26-31](https://user-images.githubusercontent.com/13732639/165845216-95a773f2-9b16-462b-b880-44f528d11e49.png)
[GitHub] [hadoop] prasad-acit commented on pull request #4241: HDFS-16563. Namenode WebUI prints sensitive information on Token expiry
prasad-acit commented on PR #4241: URL: https://github.com/apache/hadoop/pull/4241#issuecomment-1112597494 Thanks @hemanthboyina @Hexiaoqiao @steveloughran for the quick review & feedback. > the key and sensitive information is DelegationKey/Password for DelegationToken, the output message here does not include this information right? Yes, no password is printed in it. But per our internal security guidelines, displaying the complete token info is also prohibited, so the token is suppressed from being displayed in the browser. > if the issue is that toString leaks a secret, it should be fixed at that level, as it is likely to end up in logs. we don't want any output to expose secrets. Logging the exception or the full stack trace is not an issue in this case. We are trying to avoid showing the token in the browser and to keep the message abstract for the end user; the additional information is not necessary there and can be omitted from the browser. The failed tests have been corrected, please review the changes.
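The approach discussed above — keep the full exception detail in the server-side logs, but return only a generic, token-free message to the browser — can be sketched as follows. This is an illustrative sketch, not the actual Hadoop patch; the class and method names are invented for the example.

```java
// Illustrative sketch: full token details stay in server logs,
// but are never echoed back to the web UI on token expiry.
public class TokenErrorSanitizer {

    /** What the browser is allowed to see: a generic, token-free message. */
    public static String messageForBrowser(Exception tokenFailure) {
        // The exception text may embed the serialized token (owner, renewer,
        // expiry, sequence number, ...), so it is deliberately not forwarded.
        return "Security exception: token has expired or is invalid.";
    }

    /** What goes to the server log: full detail for troubleshooting. */
    public static String messageForLog(Exception tokenFailure) {
        return "Token verification failed: " + tokenFailure.getMessage();
    }

    public static void main(String[] args) {
        Exception e = new SecurityException(
            "token (HDFS_DELEGATION_TOKEN owner=hdfs, renewer=yarn, "
            + "maxDate=1651190400000, sequenceNumber=42) is expired");
        System.out.println(messageForLog(e));      // operator-facing, full detail
        System.out.println(messageForBrowser(e));  // user-facing, no token info
    }
}
```

The point of the split is that operators still get everything they need from the logs, while the HTTP response carries nothing an attacker could replay or fingerprint.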
[GitHub] [hadoop] hadoop-yetus commented on pull request #4245: HDFS-16564. Use uint32_t for hdfs_find
hadoop-yetus commented on PR #4245: URL: https://github.com/apache/hadoop/pull/4245#issuecomment-1112595640 :confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 38m 36s | | Docker mode activated. |
| | | | | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
| | | | | _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 39m 13s | | trunk passed |
| +1 :green_heart: | compile | 3m 52s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 39s | | trunk passed |
| +1 :green_heart: | shadedclient | 63m 1s | | branch has no errors when building and testing our client artifacts. |
| | | | | _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 24s | | the patch passed |
| +1 :green_heart: | compile | 3m 38s | | the patch passed |
| +1 :green_heart: | cc | 3m 38s | | the patch passed |
| +1 :green_heart: | golang | 3m 38s | | the patch passed |
| +1 :green_heart: | javac | 3m 38s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | mvnsite | 0m 28s | | the patch passed |
| +1 :green_heart: | shadedclient | 18m 48s | | patch has no errors when building and testing our client artifacts. |
| | | | | _ Other Tests _ |
| +1 :green_heart: | unit | 33m 19s | | hadoop-hdfs-native-client in the patch passed. |
| +1 :green_heart: | asflicense | 0m 54s | | The patch does not generate ASF License warnings. |
| | | 161m 37s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4245 |
| Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell golang |
| uname | Linux 626333c8af82 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 2eef94c4ae6df5e3b45e5cb286d7914f338f94ae |
| Default Java | Red Hat, Inc.-1.8.0_322-b06 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/testReport/ |
| Max. process+thread count | 544 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4245/1/console |
| versions | git=2.9.5 maven=3.6.3 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] brahmareddybattula commented on pull request #4240: HDFS-16562. Upgrade moment.min.js to 2.29.2
brahmareddybattula commented on PR #4240: URL: https://github.com/apache/hadoop/pull/4240#issuecomment-1112579621 LGTM. It would be good if you could attach the UI with these changes.
[jira] [Work logged] (HADOOP-18168) ITestMarkerTool.testRunLimitedLandsatAudit failing due to most of bucket content purged
[ https://issues.apache.org/jira/browse/HADOOP-18168?focusedWorklogId=763813&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763813 ] ASF GitHub Bot logged work on HADOOP-18168: --- Author: ASF GitHub Bot Created on: 28/Apr/22 18:42 Start Date: 28/Apr/22 18:42 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4140: URL: https://github.com/apache/hadoop/pull/4140#issuecomment-1112541391 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 35s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 5 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 44m 31s | | trunk passed | | +1 :green_heart: | compile | 0m 57s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 0m 52s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 0m 50s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 2s | | trunk passed | | +1 :green_heart: | javadoc | 0m 48s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 51s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 1m 31s | | trunk passed | | +1 :green_heart: | shadedclient | 21m 43s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 22m 8s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 39s | | the patch passed | | +1 :green_heart: | compile | 0m 45s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 0m 45s | | the patch passed | | +1 :green_heart: | compile | 0m 36s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 36s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 25s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 43s | | the patch passed | | +1 :green_heart: | xml | 0m 2s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 0m 23s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 30s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 1m 14s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 45s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 29s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 0m 43s | | The patch does not generate ASF License warnings. 
| | | | 103m 52s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4140/9/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4140 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml | | uname | Linux 6a40a55d820d 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 73eeb50311a3e06deaafe4b7ceb1fbb72107f538 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4140/9/testReport/ | | Max. process+thread count | 624 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4140/9/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #4140: HADOOP-18168. Fix S3A ITestMarkerTool dependency on purged public bucket
[jira] [Work logged] (HADOOP-16965) Introduce StreamContext for Abfs Input and Output streams.
[ https://issues.apache.org/jira/browse/HADOOP-16965?focusedWorklogId=763805&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763805 ] ASF GitHub Bot logged work on HADOOP-16965: --- Author: ASF GitHub Bot Created on: 28/Apr/22 18:36 Start Date: 28/Apr/22 18:36 Worklog Time Spent: 10m Work Description: mukund-thakur commented on PR #4171: URL: https://github.com/apache/hadoop/pull/4171#issuecomment-1112536885 > > Yetus still failing with unit tests. Please fix those https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4171/3/artifact/out/patch-unit-hadoop-tools_hadoop-azure.txt > > Hi @mukund-thakur , those unit tests fail in branch-2.10; this PR does not introduce any new failing tests. E.g. take PR #4151: it is merged into branch-2.10 and it also has some failing unit tests. Okay, if that's the case, I am okay with the change and no longer have any concerns with the current PR. Issue Time Tracking --- Worklog Id: (was: 763805) Time Spent: 4h 10m (was: 4h) > Introduce StreamContext for Abfs Input and Output streams. > -- > > Key: HADOOP-16965 > URL: https://issues.apache.org/jira/browse/HADOOP-16965 > Project: Hadoop Common > Issue Type: Improvement > Components: fs/azure >Reporter: Mukund Thakur >Assignee: Mukund Thakur >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 4h 10m > Remaining Estimate: 0h > > The number of configurations keeps growing in AbfsOutputStream and > AbfsInputStream as we keep adding new features. It is time to refactor these > configurations into a separate class like StreamContext and pass that around. > This will improve the readability of the code and reduce cherry-pick/backport > pain.
[GitHub] [hadoop] mukund-thakur commented on pull request #4171: HADOOP-16965. Refactor abfs stream configuration. (#1956)
mukund-thakur commented on PR #4171: URL: https://github.com/apache/hadoop/pull/4171#issuecomment-1112536885 > > Yetus still failing with unit tests. Please fix those https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4171/3/artifact/out/patch-unit-hadoop-tools_hadoop-azure.txt > > Hi @mukund-thakur , those unit tests fail in branch-2.10; this PR does not introduce any new failing tests. E.g. take PR #4151: it is merged into branch-2.10 and it also has some failing unit tests. Okay, if that's the case, I am okay with the change and no longer have any concerns with the current PR.
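The refactoring idea behind HADOOP-16965 — collecting a stream's growing set of configuration options into a single context object that is passed around — is the classic parameter-object pattern. A minimal sketch follows; the class and field names here are illustrative and are not the actual AbfsInputStream/AbfsOutputStream code:

```java
// Illustrative parameter-object sketch: instead of threading many loose
// configuration values through every constructor, bundle them once.
public class StreamContextSketch {

    /** All tuning knobs for a read stream, collected in one immutable holder. */
    static final class ReadStreamContext {
        final int bufferSize;
        final boolean readAheadEnabled;
        final int readAheadQueueDepth;

        ReadStreamContext(int bufferSize, boolean readAheadEnabled,
                          int readAheadQueueDepth) {
            this.bufferSize = bufferSize;
            this.readAheadEnabled = readAheadEnabled;
            this.readAheadQueueDepth = readAheadQueueDepth;
        }
    }

    /** The stream takes one context object instead of N loose parameters. */
    static final class InputStreamLike {
        private final ReadStreamContext context;

        InputStreamLike(ReadStreamContext context) {
            this.context = context;
        }

        String describe() {
            return "bufferSize=" + context.bufferSize
                 + ", readAhead=" + context.readAheadEnabled
                 + ", queueDepth=" + context.readAheadQueueDepth;
        }
    }

    public static void main(String[] args) {
        ReadStreamContext ctx = new ReadStreamContext(4 * 1024 * 1024, true, 8);
        InputStreamLike stream = new InputStreamLike(ctx);
        System.out.println(stream.describe());
    }
}
```

Adding a new option then touches only the context class rather than every constructor signature along the call chain, which is what makes backports less painful to cherry-pick.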
[GitHub] [hadoop] goiri commented on a diff in pull request #4245: HDFS-16564. Use uint32_t for hdfs_find
goiri commented on code in PR #4245: URL: https://github.com/apache/hadoop/pull/4245#discussion_r861205984 ## hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests/hdfspp_mini_dfs_smoke.cc: ## @@ -34,7 +34,6 @@ TEST_F(HdfsMiniDfsSmokeTest, SmokeTest) { EXPECT_NE(nullptr, connection.handle()); } - Review Comment: Avoid
[jira] [Work logged] (HADOOP-18079) Upgrade Netty to 4.1.74
[ https://issues.apache.org/jira/browse/HADOOP-18079?focusedWorklogId=763787&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763787 ] ASF GitHub Bot logged work on HADOOP-18079: --- Author: ASF GitHub Bot Created on: 28/Apr/22 18:23 Start Date: 28/Apr/22 18:23 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #3977: URL: https://github.com/apache/hadoop/pull/3977#issuecomment-1112525743 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 34s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 14s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 18s | | trunk passed | | +1 :green_heart: | compile | 23m 13s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 20m 41s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 19m 8s | | trunk passed | | -1 :x: | javadoc | 1m 44s | [/branch-javadoc-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3977/3/artifact/out/branch-javadoc-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. 
| | +1 :green_heart: | javadoc | 8m 23s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 29m 53s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 1m 8s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 22m 4s | | the patch passed | | +1 :green_heart: | compile | 22m 57s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 22m 57s | | the patch passed | | +1 :green_heart: | compile | 20m 46s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 19m 2s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | -1 :x: | javadoc | 1m 34s | [/patch-javadoc-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3977/3/artifact/out/patch-javadoc-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | +1 :green_heart: | javadoc | 8m 22s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 32m 15s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 794m 49s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3977/3/artifact/out/patch-unit-root.txt) | root in the patch passed. | | +1 :green_heart: | asflicense | 2m 45s | | The patch does not generate ASF License warnings. 
| | | | 1052m 22s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.yarn.client.TestResourceManagerAdministrationProtocolPBClientImpl | | | hadoop.yarn.client.TestGetGroups | | | hadoop.yarn.csi.client.TestCsiClient | | | hadoop.yarn.server.timeline.webapp.TestTimelineWebServicesWithSSL | | | hadoop.yarn.server.timeline.security.TestTimelineAuthenticationFilterForV1 | | | hadoop.yarn.server.applicationhistoryservice.TestApplicationHistoryServer | | | hadoop.yarn.server.resourcemanager.metrics.TestSystemMetricsPublisher | | | hadoop.yarn.web
[GitHub] [hadoop] hadoop-yetus commented on pull request #3977: HADOOP-18079. Upgrade Netty to 4.1.74.
hadoop-yetus commented on PR #3977: URL: https://github.com/apache/hadoop/pull/3977#issuecomment-1112525743 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 34s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 14s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 18s | | trunk passed | | +1 :green_heart: | compile | 23m 13s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 20m 41s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 19m 8s | | trunk passed | | -1 :x: | javadoc | 1m 44s | [/branch-javadoc-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3977/3/artifact/out/branch-javadoc-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | +1 :green_heart: | javadoc | 8m 23s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 29m 53s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 1m 8s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 22m 4s | | the patch passed | | +1 :green_heart: | compile | 22m 57s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 22m 57s | | the patch passed | | +1 :green_heart: | compile | 20m 46s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 19m 2s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | -1 :x: | javadoc | 1m 34s | [/patch-javadoc-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3977/3/artifact/out/patch-javadoc-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | +1 :green_heart: | javadoc | 8m 22s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 32m 15s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 794m 49s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3977/3/artifact/out/patch-unit-root.txt) | root in the patch passed. | | +1 :green_heart: | asflicense | 2m 45s | | The patch does not generate ASF License warnings. 
| | | | 1052m 22s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.yarn.client.TestResourceManagerAdministrationProtocolPBClientImpl | | | hadoop.yarn.client.TestGetGroups | | | hadoop.yarn.csi.client.TestCsiClient | | | hadoop.yarn.server.timeline.webapp.TestTimelineWebServicesWithSSL | | | hadoop.yarn.server.timeline.security.TestTimelineAuthenticationFilterForV1 | | | hadoop.yarn.server.applicationhistoryservice.TestApplicationHistoryServer | | | hadoop.yarn.server.resourcemanager.metrics.TestSystemMetricsPublisher | | | hadoop.yarn.webapp.TestRMWithXFSFilter | | | hadoop.yarn.server.resourcemanager.TestClientRMService | | | hadoop.yarn.server.resourcemanager.webapp.TestRMWebServicesDelegationTokenAuthentication | | | hadoop.yarn.server.resourcemanager.webapp.TestRMWebappAuthentication | | | hadoop.yarn.server.resourcemanager.TestRMHA | | | hadoop.yarn.server.resourcemanager.metrics.TestCombinedSystemMetricsPublisher | | | hadoo
[jira] [Work logged] (HADOOP-18168) ITestMarkerTool.testRunLimitedLandsatAudit failing due to most of bucket content purged
[ https://issues.apache.org/jira/browse/HADOOP-18168?focusedWorklogId=763782&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763782 ] ASF GitHub Bot logged work on HADOOP-18168: --- Author: ASF GitHub Bot Created on: 28/Apr/22 18:16 Start Date: 28/Apr/22 18:16 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4140: URL: https://github.com/apache/hadoop/pull/4140#issuecomment-1112520064 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 49s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 5 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 38m 46s | | trunk passed | | +1 :green_heart: | compile | 0m 48s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 0m 39s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 0m 36s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 49s | | trunk passed | | +1 :green_heart: | javadoc | 0m 35s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 35s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 1m 22s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 42s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 23m 1s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 35s | | the patch passed | | +1 :green_heart: | compile | 0m 42s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 0m 42s | | the patch passed | | +1 :green_heart: | compile | 0m 31s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 31s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 0m 21s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4140/8/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) | hadoop-tools/hadoop-aws: The patch generated 1 new + 1 unchanged - 0 fixed = 2 total (was 1) | | +1 :green_heart: | mvnsite | 0m 39s | | the patch passed | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 0m 18s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 25s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 1m 8s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 51s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 21s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 0m 38s | | The patch does not generate ASF License warnings. 
| | | | 97m 2s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4140/8/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4140 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml | | uname | Linux 341d2ef8742d 4.15.0-169-generic #177-Ubuntu SMP Thu Feb 3 10:50:38 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 364ce9eba20743aad75689f92cd14e0517ce0a5e | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4140/8/testReport/ | | Max. process+thread count | 718 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4140/8/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] saintstack opened a new pull request, #4246: HDFS-16540. Data locality is lost when DataNode pod restarts in kubernetes (#4170)
saintstack opened a new pull request, #4246: URL: https://github.com/apache/hadoop/pull/4246 ### Description of PR Cherry-pick of 9ed8d60511dccf96108239c5c96e108a7d4bc975 ### How was this patch tested? ### For code changes: - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4244: YARN-11119. Backport YARN-10538 to branch-2.10
hadoop-yetus commented on PR #4244: URL: https://github.com/apache/hadoop/pull/4244#issuecomment-1112471513 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 12m 31s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-2.10 Compile Tests _ | | +1 :green_heart: | mvninstall | 16m 52s | | branch-2.10 passed | | +1 :green_heart: | compile | 1m 3s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 0m 55s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | checkstyle | 0m 45s | | branch-2.10 passed | | +1 :green_heart: | mvnsite | 1m 3s | | branch-2.10 passed | | +1 :green_heart: | javadoc | 0m 49s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 0m 40s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 2m 7s | | branch-2.10 passed | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 46s | | the patch passed | | +1 :green_heart: | compile | 0m 50s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 0m 50s | | the patch passed | | +1 :green_heart: | compile | 0m 42s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | javac | 0m 42s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 0m 29s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 47s | | the patch passed | | +1 :green_heart: | javadoc | 0m 36s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 0m 30s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 1m 40s | | the patch passed | _ Other Tests _ | | +1 :green_heart: | unit | 68m 26s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. | | | | 116m 21s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4244/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4244 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux a359ebc4d0c6 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / ebc95e037b7450b6d229ec2c6d4d8a585fee8228 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4244/1/testReport/ | | Max. process+thread count | 902 (vs. 
ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4244/1/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] saintstack merged pull request #4170: HDFS-16540 Data locality is lost when DataNode pod restarts in kubern…
saintstack merged PR #4170: URL: https://github.com/apache/hadoop/pull/4170
[jira] [Work logged] (HADOOP-18168) ITestMarkerTool.testRunLimitedLandsatAudit failing due to most of bucket content purged
[ https://issues.apache.org/jira/browse/HADOOP-18168?focusedWorklogId=763727&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763727 ] ASF GitHub Bot logged work on HADOOP-18168: --- Author: ASF GitHub Bot Created on: 28/Apr/22 17:18 Start Date: 28/Apr/22 17:18 Worklog Time Spent: 10m Work Description: dannycjones commented on PR #4140: URL: https://github.com/apache/hadoop/pull/4140#issuecomment-1112461995 Tested latest patch against `eu-west-1`, all OK Issue Time Tracking --- Worklog Id: (was: 763727) Time Spent: 2h 40m (was: 2.5h) > ITestMarkerTool.testRunLimitedLandsatAudit failing due to most of bucket > content purged > --- > > Key: HADOOP-18168 > URL: https://issues.apache.org/jira/browse/HADOOP-18168 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3, test >Affects Versions: 3.3.4 >Reporter: Steve Loughran >Assignee: Daniel Carl Jones >Priority: Minor > Labels: pull-request-available > Time Spent: 2h 40m > Remaining Estimate: 0h > > {{ITestMarkerTool.testRunLimitedLandsatAudit}} is failing -a scan which was > meant to stop after the first page of results is finishing because there > aren't so many objects there. > > first visible sign of the landsat-pds cleanup > now we have requester pays, we could do this against another store with > stability promises, e.g common crawl. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18168) ITestMarkerTool.testRunLimitedLandsatAudit failing due to most of bucket content purged
[ https://issues.apache.org/jira/browse/HADOOP-18168?focusedWorklogId=763726&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763726 ] ASF GitHub Bot logged work on HADOOP-18168: --- Author: ASF GitHub Bot Created on: 28/Apr/22 17:17 Start Date: 28/Apr/22 17:17 Worklog Time Spent: 10m Work Description: steveloughran commented on code in PR #4140: URL: https://github.com/apache/hadoop/pull/4140#discussion_r861131956 ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/PublicDatasetTestUtils.java: ## @@ -0,0 +1,100 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hadoop.fs.s3a; Review Comment: can you put in the sub package .test ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/S3ATestConstants.java: ## @@ -18,6 +18,8 @@ package org.apache.hadoop.fs.s3a; +import org.apache.hadoop.conf.Configuration; Review Comment: nit: import should go below the java. 
one ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/ITestS3ARequesterPays.java: ## @@ -102,14 +88,30 @@ public void testRequesterPaysDisabledFails() throws Throwable { } } - private Path getRequesterPaysPath(Configuration conf) { -String requesterPaysFile = -conf.getTrimmed(KEY_REQUESTER_PAYS_FILE, DEFAULT_REQUESTER_PAYS_FILE); -S3ATestUtils.assume( -"Empty test property: " + KEY_REQUESTER_PAYS_FILE, -!requesterPaysFile.isEmpty() -); -return new Path(requesterPaysFile); + /** + * Use this after creating the file system, as this is when bucket-specific + * overrides are applied. + * @param conf Hadoop configuration from FS to mutate + * @param requesterPaysEnabled Indicate if requester pays be on or off + */ + private static void updateConf( + Configuration conf, + boolean requesterPaysEnabled Review Comment: nit: we tend to pull that ) onto the same line as the last param Issue Time Tracking --- Worklog Id: (was: 763726) Time Spent: 2.5h (was: 2h 20m) > ITestMarkerTool.testRunLimitedLandsatAudit failing due to most of bucket > content purged > --- > > Key: HADOOP-18168 > URL: https://issues.apache.org/jira/browse/HADOOP-18168 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3, test >Affects Versions: 3.3.4 >Reporter: Steve Loughran >Assignee: Daniel Carl Jones >Priority: Minor > Labels: pull-request-available > Time Spent: 2.5h > Remaining Estimate: 0h > > {{ITestMarkerTool.testRunLimitedLandsatAudit}} is failing -a scan which was > meant to stop after the first page of results is finishing because there > aren't so many objects there. > > first visible sign of the landsat-pds cleanup > now we have requester pays, we could do this against another store with > stability promises, e.g common crawl.
[GitHub] [hadoop] GauthamBanasandra opened a new pull request, #4245: HDFS-16564. Use uint32_t for hdfs_find
GauthamBanasandra opened a new pull request, #4245: URL: https://github.com/apache/hadoop/pull/4245 ### Description of PR `hdfs_find` uses `u_int32_t` type for storing the value for the `max-depth` command line argument - https://github.com/apache/hadoop/blob/a631f45a99c7abf8c9a2dcfb10afb668c8ff6b09/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tools/hdfs-find/hdfs-find.cc#L43. The type `u_int32_t` isn't standard, isn't available on Windows and thus breaks cross-platform compatibility. We need to replace this with `uint32_t` which is available on all platforms since it's part of the C++ standard. ### How was this patch tested? The existing unit tests exercise this PR sufficiently. ### For code changes: - [x] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
[GitHub] [hadoop] ashutoshcipher opened a new pull request, #4244: YARN-11119. Backport YARN-10538 to branch-2.10
ashutoshcipher opened a new pull request, #4244: URL: https://github.com/apache/hadoop/pull/4244 ### Description of PR Backport YARN-10538 to branch-2.10 * JIRA: YARN-9
[jira] [Work logged] (HADOOP-18069) CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client
[ https://issues.apache.org/jira/browse/HADOOP-18069?focusedWorklogId=763584&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763584 ] ASF GitHub Bot logged work on HADOOP-18069: --- Author: ASF GitHub Bot Created on: 28/Apr/22 15:16 Start Date: 28/Apr/22 15:16 Worklog Time Spent: 10m Work Description: aajisaka commented on code in PR #4229: URL: https://github.com/apache/hadoop/pull/4229#discussion_r860506282 ## LICENSE-binary: ## @@ -242,6 +242,7 @@ com.google.guava:listenablefuture:.0-empty-to-avoid-conflict-with-guava com.microsoft.azure:azure-storage:7.0.0 com.nimbusds:nimbus-jose-jwt:9.8.1 com.squareup.okhttp:okhttp:2.7.5 Review Comment: Would you remove the line `com.squareup.okhttp:okhttp:2.7.5`? Issue Time Tracking --- Worklog Id: (was: 763584) Time Spent: 3h 40m (was: 3.5h) > CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client > --- > > Key: HADOOP-18069 > URL: https://issues.apache.org/jira/browse/HADOOP-18069 > Project: Hadoop Common > Issue Type: Bug > Components: hdfs-client >Affects Versions: 3.3.1 >Reporter: Eugene Shinn (Truveta) >Priority: Major > Labels: pull-request-available > Time Spent: 3h 40m > Remaining Estimate: 0h > > Our static vulnerability scanner (Fortify On Demand) detected [NVD - > CVE-2021-0341 > (nist.gov)|https://nvd.nist.gov/vuln/detail/CVE-2021-0341#VulnChangeHistorySection] > in our application. We traced the vulnerability to a transitive dependency > coming from hadoop-hdfs-client, which depends on okhttp@2.7.5 > ([hadoop/pom.xml at trunk · apache/hadoop > (github.com)|https://github.com/apache/hadoop/blob/trunk/hadoop-project/pom.xml#L137]). > To resolve this issue, okhttp should be upgraded to 4.9.2+ (ref: > [CVE-2021-0341 · Issue #6724 · square/okhttp > (github.com)|https://github.com/square/okhttp/issues/6724]). 
[jira] [Created] (HADOOP-18218) make sure prefetching stream memory consumption scales
Steve Loughran created HADOOP-18218: --- Summary: make sure prefetching stream memory consumption scales Key: HADOOP-18218 URL: https://issues.apache.org/jira/browse/HADOOP-18218 Project: Hadoop Common Issue Type: Sub-task Components: fs/s3 Affects Versions: 3.3.4 Reporter: Steve Loughran a recurrent problem in cloud store IO is running out of memory because blocks are buffered in reads or writes. we need to make sure that data/memory is used in the prefetch code such that it works in processes with many worker threads (hive, spark)
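The concern in HADOOP-18218 — many worker threads each buffering prefetched blocks — is commonly addressed by gating buffer allocation on a shared bound. The sketch below is illustrative only, not the Hadoop prefetch implementation; the class name and sizes are invented for the example.

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.atomic.AtomicInteger;

/**
 * Illustrative sketch only (not Hadoop code): caps how many prefetch
 * buffers can be live at once, so heap use stays bounded no matter how
 * many worker threads (hive, spark executors) are reading.
 */
public class BoundedBufferPool {
  private final Semaphore permits;
  private final int bufferSize;
  private final AtomicInteger live = new AtomicInteger();

  public BoundedBufferPool(int maxBuffers, int bufferSize) {
    this.permits = new Semaphore(maxBuffers);
    this.bufferSize = bufferSize;
  }

  /** Blocks while maxBuffers buffers are already outstanding. */
  public byte[] acquire() {
    permits.acquireUninterruptibly();
    live.incrementAndGet();
    return new byte[bufferSize];
  }

  public void release(byte[] buffer) {
    live.decrementAndGet();
    permits.release();
  }

  public int liveBuffers() {
    return live.get();
  }

  public static void main(String[] args) {
    BoundedBufferPool pool = new BoundedBufferPool(2, 1024);
    byte[] a = pool.acquire();
    byte[] b = pool.acquire();
    // A third acquire() would block here until a buffer is released.
    pool.release(a);
    byte[] c = pool.acquire();
    System.out.println(pool.liveBuffers()); // prints 2
    pool.release(b);
    pool.release(c);
    System.out.println(pool.liveBuffers()); // prints 0
  }
}
```

With this shape, total buffer memory is capped at maxBuffers × bufferSize regardless of how many streams or threads a process opens, rather than growing with thread count.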
[jira] [Work logged] (HADOOP-18069) CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client
[ https://issues.apache.org/jira/browse/HADOOP-18069?focusedWorklogId=763576&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763576 ] ASF GitHub Bot logged work on HADOOP-18069: --- Author: ASF GitHub Bot Created on: 28/Apr/22 15:08 Start Date: 28/Apr/22 15:08 Worklog Time Spent: 10m Work Description: aajisaka commented on code in PR #4229: URL: https://github.com/apache/hadoop/pull/4229#discussion_r861007436 ## hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/web/oauth2/ConfRefreshTokenBasedAccessTokenProvider.java: ## @@ -103,36 +104,39 @@ public synchronized String getAccessToken() throws IOException { void refresh() throws IOException { try { - OkHttpClient client = new OkHttpClient(); - client.setConnectTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, - TimeUnit.MILLISECONDS); - client.setReadTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, -TimeUnit.MILLISECONDS); + OkHttpClient client = + new OkHttpClient.Builder().connectTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, + TimeUnit.MILLISECONDS) + .readTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, TimeUnit.MILLISECONDS) + .build(); Review Comment: Cool. Sorry my comment was wrong. Issue Time Tracking --- Worklog Id: (was: 763576) Time Spent: 3.5h (was: 3h 20m) > CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client > --- > > Key: HADOOP-18069 > URL: https://issues.apache.org/jira/browse/HADOOP-18069 > Project: Hadoop Common > Issue Type: Bug > Components: hdfs-client >Affects Versions: 3.3.1 >Reporter: Eugene Shinn (Truveta) >Priority: Major > Labels: pull-request-available > Time Spent: 3.5h > Remaining Estimate: 0h > > Our static vulnerability scanner (Fortify On Demand) detected [NVD - > CVE-2021-0341 > (nist.gov)|https://nvd.nist.gov/vuln/detail/CVE-2021-0341#VulnChangeHistorySection] > in our application. 
We traced the vulnerability to a transitive dependency > coming from hadoop-hdfs-client, which depends on okhttp@2.7.5 > ([hadoop/pom.xml at trunk · apache/hadoop > (github.com)|https://github.com/apache/hadoop/blob/trunk/hadoop-project/pom.xml#L137]). > To resolve this issue, okhttp should be upgraded to 4.9.2+ (ref: > [CVE-2021-0341 · Issue #6724 · square/okhttp > (github.com)|https://github.com/square/okhttp/issues/6724]).
[GitHub] [hadoop] aajisaka commented on a diff in pull request #4242: YARN-11116. Migrate Times util from SimpleDateFormat to thread-safe D…
aajisaka commented on code in PR #4242: URL: https://github.com/apache/hadoop/pull/4242#discussion_r860997327 ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestTimes.java: ## @@ -61,4 +67,15 @@ public void testFinishTimesAheadOfStartTimes() { elapsed = Times.elapsed(Long.MAX_VALUE, 0, true); Assert.assertEquals("Elapsed time is not -1", -1, elapsed); } + + @Test + public void validateISO() throws IOException { +SimpleDateFormat isoFormat = new SimpleDateFormat(ISO8601_DATE_FORMAT); +for (int i = 0; i < 1000; i++) { + long now = System.currentTimeMillis(); + String instant = Times.formatISO8601(now); + String date = isoFormat.format(new Date(now)); + Assert.assertEquals(date, instant); Review Comment: Thank you @jteagles. > isoFormat is using the old method of calculating the date and instant is being checked against isoFormat to ensure the same result has been achieved after the change. +1, thank you for your explanation. Sorry my comment was wrong. Current implementation (old/expected, new/actual) seems correct.
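For context on the migration being reviewed: `SimpleDateFormat` is mutable and not thread-safe, while `java.time.DateTimeFormatter` is immutable and safe to share. The sketch below mirrors the review's validation idea — format the same instant both ways and check they agree — but it is a standalone illustration, not the actual `Times.java` patch; the pattern string is an assumption matching YARN's `ISO8601_DATE_FORMAT` constant.

```java
import java.text.SimpleDateFormat;
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Date;

public class Iso8601Demo {
  // Assumed to match YARN's ISO8601_DATE_FORMAT constant.
  static final String PATTERN = "yyyy-MM-dd'T'HH:mm:ss.SSSZ";

  // Immutable and thread-safe: one shared instance suffices.
  static final DateTimeFormatter NEW_FORMAT =
      DateTimeFormatter.ofPattern(PATTERN).withZone(ZoneId.systemDefault());

  static String formatNew(long millis) {
    return NEW_FORMAT.format(Instant.ofEpochMilli(millis));
  }

  static String formatOld(long millis) {
    // Not thread-safe: a fresh SimpleDateFormat is needed per call
    // (or per thread), which is the overhead the migration removes.
    return new SimpleDateFormat(PATTERN).format(new Date(millis));
  }

  public static void main(String[] args) {
    long now = System.currentTimeMillis();
    // The review's check: old and new formatters must produce the same string.
    System.out.println(formatOld(now).equals(formatNew(now)));
  }
}
```

This is the same old-vs-new comparison strategy as the `validateISO` test in the diff: the legacy formatter serves as the expected value while the thread-safe formatter is the value under test.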
[GitHub] [hadoop] jteagles commented on a diff in pull request #4242: YARN-11116. Migrate Times util from SimpleDateFormat to thread-safe D…
jteagles commented on code in PR #4242: URL: https://github.com/apache/hadoop/pull/4242#discussion_r860986710 ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestTimes.java: ## @@ -61,4 +67,15 @@ public void testFinishTimesAheadOfStartTimes() { elapsed = Times.elapsed(Long.MAX_VALUE, 0, true); Assert.assertEquals("Elapsed time is not -1", -1, elapsed); } + + @Test + public void validateISO() throws IOException { +SimpleDateFormat isoFormat = new SimpleDateFormat(ISO8601_DATE_FORMAT); +for (int i = 0; i < 1000; i++) { + long now = System.currentTimeMillis(); + String instant = Times.formatISO8601(now); + String date = isoFormat.format(new Date(now)); + Assert.assertEquals(date, instant); Review Comment: Thanks for the quick reply, @aajisaka. Double checking on this one, since expected and actual are a bit confusing to me in this case. isoFormat is using the old method of calculating the date and instant is being checked against isoFormat to ensure the same result has been achieved after the change. I can see it either way. If you confirm, I will make the change you suggested. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
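The thread-safety point behind YARN-11116 can be shown in a small stand-alone sketch. This is an assumption-laden illustration, not the actual `Times` code: the pattern below (`yyyy-MM-dd'T'HH:mm:ss.SSSZ`) merely stands in for `ISO8601_DATE_FORMAT`, and `threadSafeFormat` stands in for `Times.formatISO8601`. The point is that a `DateTimeFormatter` is immutable and can be shared across threads, while a `SimpleDateFormat` instance must not be:

```java
import java.text.SimpleDateFormat;
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Date;
import java.util.TimeZone;

public class Iso8601Check {
    // Stand-in for ISO8601_DATE_FORMAT; the real constant may differ.
    static final String PATTERN = "yyyy-MM-dd'T'HH:mm:ss.SSSZ";

    // Old path: SimpleDateFormat is mutable, so a fresh instance is
    // needed per call (or per thread) to stay safe.
    static String legacyFormat(long millis) {
        SimpleDateFormat f = new SimpleDateFormat(PATTERN);
        f.setTimeZone(TimeZone.getTimeZone("UTC"));
        return f.format(new Date(millis));
    }

    // New path: DateTimeFormatter is immutable, so one shared constant
    // serves all threads.
    static final DateTimeFormatter ISO_FMT =
        DateTimeFormatter.ofPattern(PATTERN).withZone(ZoneId.of("UTC"));

    static String threadSafeFormat(long millis) {
        return ISO_FMT.format(Instant.ofEpochMilli(millis));
    }

    public static void main(String[] args) {
        // Mirrors the validateISO test: old path (expected) vs new path (actual).
        long now = System.currentTimeMillis();
        String expected = legacyFormat(now);
        String actual = threadSafeFormat(now);
        if (!expected.equals(actual)) {
            throw new AssertionError(expected + " != " + actual);
        }
    }
}
```

As in the `validateISO` test discussed above, the old formatter supplies the expected value and the new one supplies the actual value being checked.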
[GitHub] [hadoop] hadoop-yetus commented on pull request #4107: HDFS-16521. DFS API to retrieve slow datanodes
hadoop-yetus commented on PR #4107: URL: https://github.com/apache/hadoop/pull/4107#issuecomment-1112200478

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 57s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 9s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 25m 6s | | trunk passed |
| +1 :green_heart: | compile | 6m 23s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 6m 2s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 41s | | trunk passed |
| +1 :green_heart: | mvnsite | 4m 4s | | trunk passed |
| +1 :green_heart: | javadoc | 3m 24s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 4m 4s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 8m 8s | | trunk passed |
| +1 :green_heart: | shadedclient | 21m 24s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 3m 2s | | the patch passed |
| +1 :green_heart: | compile | 6m 3s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | cc | 6m 3s | | the patch passed |
| -1 :x: | javac | 6m 3s | [/results-compile-javac-hadoop-hdfs-project-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4107/9/artifact/out/results-compile-javac-hadoop-hdfs-project-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt) | hadoop-hdfs-project-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 generated 1 new + 651 unchanged - 0 fixed = 652 total (was 651) |
| +1 :green_heart: | compile | 5m 51s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | cc | 5m 51s | | the patch passed |
| -1 :x: | javac | 5m 51s | [/results-compile-javac-hadoop-hdfs-project-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4107/9/artifact/out/results-compile-javac-hadoop-hdfs-project-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-hdfs-project-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 1 new + 629 unchanged - 0 fixed = 630 total (was 629) |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 20s | | hadoop-hdfs-project: The patch generated 0 new + 455 unchanged - 1 fixed = 455 total (was 456) |
| +1 :green_heart: | mvnsite | 3m 16s | | the patch passed |
| +1 :green_heart: | javadoc | 2m 28s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 3m 16s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 7m 37s | | the patch passed |
| +1 :green_heart: | shadedclient | 20m 59s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 39s | | hadoop-hdfs-client in the patch passed. |
| -1 :x: | unit | 442m 24s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4107/9/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| -1 :x: | unit | 35m 44s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4107/9/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt) | hadoop-hdfs-rbf in the patch passed. |
| +1 :green_heart: | asflicense | 1m 18s | | The patch does not generate ASF License
[jira] [Work logged] (HADOOP-18175) test failures with prefetching s3a input stream
[ https://issues.apache.org/jira/browse/HADOOP-18175?focusedWorklogId=763503&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763503 ] ASF GitHub Bot logged work on HADOOP-18175: --- Author: ASF GitHub Bot Created on: 28/Apr/22 13:18 Start Date: 28/Apr/22 13:18 Worklog Time Spent: 10m Work Description: monthonk commented on code in PR #4212: URL: https://github.com/apache/hadoop/pull/4212#discussion_r860878110 ## hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/read/S3PrefetchingInputStream.java: ## @@ -103,8 +103,7 @@ public synchronized int available() throws IOException { */ @Override public synchronized long getPos() throws IOException { Review Comment: `this.inputStream.getPos()` below could throw it, so I have to pass it up here Issue Time Tracking --- Worklog Id: (was: 763503) Time Spent: 2h (was: 1h 50m) > test failures with prefetching s3a input stream > --- > > Key: HADOOP-18175 > URL: https://issues.apache.org/jira/browse/HADOOP-18175 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3, test >Affects Versions: 3.4.0 >Reporter: Steve Loughran >Assignee: Monthon Klongklaew >Priority: Major > Labels: pull-request-available > Time Spent: 2h > Remaining Estimate: 0h > > identify and fix all test regressions from the prefetching s3a input stream -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] monthonk commented on a diff in pull request #4212: HADOOP-18175. fix test failures with prefetching s3a input stream
monthonk commented on code in PR #4212: URL: https://github.com/apache/hadoop/pull/4212#discussion_r860878110 ## hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/read/S3PrefetchingInputStream.java: ## @@ -103,8 +103,7 @@ public synchronized int available() throws IOException { */ @Override public synchronized long getPos() throws IOException { Review Comment: `this.inputStream.getPos()` below could throw it, so I have to pass it up here -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18175) test failures with prefetching s3a input stream
[ https://issues.apache.org/jira/browse/HADOOP-18175?focusedWorklogId=763482&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763482 ] ASF GitHub Bot logged work on HADOOP-18175: --- Author: ASF GitHub Bot Created on: 28/Apr/22 12:44 Start Date: 28/Apr/22 12:44 Worklog Time Spent: 10m Work Description: ahmarsuhail commented on code in PR #4212: URL: https://github.com/apache/hadoop/pull/4212#discussion_r860843620 ## hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/read/S3PrefetchingInputStream.java: ## @@ -103,8 +103,7 @@ public synchronized int available() throws IOException { */ @Override public synchronized long getPos() throws IOException { Review Comment: this method no longer throws an IOException Issue Time Tracking --- Worklog Id: (was: 763482) Time Spent: 1h 50m (was: 1h 40m) > test failures with prefetching s3a input stream > --- > > Key: HADOOP-18175 > URL: https://issues.apache.org/jira/browse/HADOOP-18175 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3, test >Affects Versions: 3.4.0 >Reporter: Steve Loughran >Assignee: Monthon Klongklaew >Priority: Major > Labels: pull-request-available > Time Spent: 1h 50m > Remaining Estimate: 0h > > identify and fix all test regressions from the prefetching s3a input stream -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] ahmarsuhail commented on a diff in pull request #4212: HADOOP-18175. fix test failures with prefetching s3a input stream
ahmarsuhail commented on code in PR #4212: URL: https://github.com/apache/hadoop/pull/4212#discussion_r860843620 ## hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/read/S3PrefetchingInputStream.java: ## @@ -103,8 +103,7 @@ public synchronized int available() throws IOException { */ @Override public synchronized long getPos() throws IOException { Review Comment: this method no longer throws an IOException -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
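The signature question above comes down to checked-exception propagation: the wrapper's `getPos()` delegates to an inner stream whose own `getPos()` is declared to throw `IOException`, so the override must keep the `throws` clause. A minimal stand-alone sketch with hypothetical types (these are not Hadoop's actual stream interfaces):

```java
import java.io.IOException;

// Hypothetical stand-in for a positioned stream type.
interface Positioned {
    long getPos() throws IOException;
}

// Sketch of a delegating wrapper such as a prefetching stream.
class DelegatingStream implements Positioned {
    private final Positioned inner;

    DelegatingStream(Positioned inner) {
        this.inner = inner;
    }

    // Dropping "throws IOException" here would not compile, because the
    // delegated call below can throw the checked exception and nothing
    // in this method catches it.
    @Override
    public synchronized long getPos() throws IOException {
        return inner.getPos();
    }
}
```

This is why the method "has to pass it up": removing the declaration is only possible once no delegated call can throw.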
[GitHub] [hadoop] cndaimin commented on pull request #4104: HDFS-16520. Improve EC pread: avoid potential reading whole block
cndaimin commented on PR #4104: URL: https://github.com/apache/hadoop/pull/4104#issuecomment-1112151903 @ferhui The failed test `hadoop.hdfs.TestRollingUpgrade` seems unrelated, and it runs successfully on my local PC: ![image](https://user-images.githubusercontent.com/4527219/165753419-a864d4e6-20fa-4866-96ee-509aba2832a3.png) -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Work logged] (HADOOP-18069) CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client
[ https://issues.apache.org/jira/browse/HADOOP-18069?focusedWorklogId=763446&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763446 ] ASF GitHub Bot logged work on HADOOP-18069: --- Author: ASF GitHub Bot Created on: 28/Apr/22 11:42 Start Date: 28/Apr/22 11:42 Worklog Time Spent: 10m Work Description: ashutoshcipher commented on code in PR #4229: URL: https://github.com/apache/hadoop/pull/4229#discussion_r860788203 ## hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/web/oauth2/ConfRefreshTokenBasedAccessTokenProvider.java: ## @@ -103,36 +104,39 @@ public synchronized String getAccessToken() throws IOException { void refresh() throws IOException { try { - OkHttpClient client = new OkHttpClient(); - client.setConnectTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, - TimeUnit.MILLISECONDS); - client.setReadTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, -TimeUnit.MILLISECONDS); + OkHttpClient client = + new OkHttpClient.Builder().connectTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, + TimeUnit.MILLISECONDS) + .readTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, TimeUnit.MILLISECONDS) + .build(); Review Comment: try-with-resources requires the resource to implement AutoCloseable, so the following gives a compile error: ``` try (OkHttpClient client = new OkHttpClient.Builder().connectTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, TimeUnit.MILLISECONDS) .readTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, TimeUnit.MILLISECONDS) .build()) { ``` That's why I added it on the `Response` instead, which is Closeable: ``` try (Response responseBody = client.newCall(request).execute()) { ``` Ref: https://square.github.io/okhttp/recipes/ Issue Time Tracking --- Worklog Id: (was: 763446) Time Spent: 3h 20m (was: 3h 10m) > CVE-2021-0341 in okhttp@2.7.5 detected in hdfs-client > --- > > Key: HADOOP-18069 > URL: https://issues.apache.org/jira/browse/HADOOP-18069 > Project: Hadoop Common > Issue Type: Bug > Components: hdfs-client >Affects 
Versions: 3.3.1 >Reporter: Eugene Shinn (Truveta) >Priority: Major > Labels: pull-request-available > Time Spent: 3h 20m > Remaining Estimate: 0h > > Our static vulnerability scanner (Fortify On Demand) detected [NVD - > CVE-2021-0341 > (nist.gov)|https://nvd.nist.gov/vuln/detail/CVE-2021-0341#VulnChangeHistorySection] > in our application. We traced the vulnerability to a transitive dependency > coming from hadoop-hdfs-client, which depends on okhttp@2.7.5 > ([hadoop/pom.xml at trunk · apache/hadoop > (github.com)|https://github.com/apache/hadoop/blob/trunk/hadoop-project/pom.xml#L137]). > To resolve this issue, okhttp should be upgraded to 4.9.2+ (ref: > [CVE-2021-0341 · Issue #6724 · square/okhttp > (github.com)|https://github.com/square/okhttp/issues/6724]). -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4229: HADOOP-18069. okhttp@2.7.5 to 4.9.3
ashutoshcipher commented on code in PR #4229: URL: https://github.com/apache/hadoop/pull/4229#discussion_r860788203 ## hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/web/oauth2/ConfRefreshTokenBasedAccessTokenProvider.java: ## @@ -103,36 +104,39 @@ public synchronized String getAccessToken() throws IOException { void refresh() throws IOException { try { - OkHttpClient client = new OkHttpClient(); - client.setConnectTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, - TimeUnit.MILLISECONDS); - client.setReadTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, -TimeUnit.MILLISECONDS); + OkHttpClient client = + new OkHttpClient.Builder().connectTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, + TimeUnit.MILLISECONDS) + .readTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, TimeUnit.MILLISECONDS) + .build(); Review Comment: try-with-resources requires the resource to implement AutoCloseable, so the following gives a compile error: ``` try (OkHttpClient client = new OkHttpClient.Builder().connectTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, TimeUnit.MILLISECONDS) .readTimeout(URLConnectionFactory.DEFAULT_SOCKET_TIMEOUT, TimeUnit.MILLISECONDS) .build()) { ``` That's why I added it on the `Response` instead, which is Closeable: ``` try (Response responseBody = client.newCall(request).execute()) { ``` Ref: https://square.github.io/okhttp/recipes/ -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] ferhui commented on pull request #4199: HDFS-14750. RBF: Support dynamic handler allocation in routers
ferhui commented on PR #4199: URL: https://github.com/apache/hadoop/pull/4199#issuecomment-1112074429 @kokonguyen191 Thanks. It's a good feature. @fengnanli could you please review this PR? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-16202) Enhance openFile() for better read performance against object stores
[ https://issues.apache.org/jira/browse/HADOOP-16202?focusedWorklogId=763426&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763426 ] ASF GitHub Bot logged work on HADOOP-16202: --- Author: ASF GitHub Bot Created on: 28/Apr/22 11:05 Start Date: 28/Apr/22 11:05 Worklog Time Spent: 10m Work Description: steveloughran commented on PR #4238: URL: https://github.com/apache/hadoop/pull/4238#issuecomment-1112072053 merged locally; closing Issue Time Tracking --- Worklog Id: (was: 763426) Time Spent: 22h 40m (was: 22.5h) > Enhance openFile() for better read performance against object stores > - > > Key: HADOOP-16202 > URL: https://issues.apache.org/jira/browse/HADOOP-16202 > Project: Hadoop Common > Issue Type: Improvement > Components: fs, fs/s3, tools/distcp >Affects Versions: 3.3.0 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.4 > > Time Spent: 22h 40m > Remaining Estimate: 0h > > The {{openFile()}} builder API lets us add new options when reading a file > Add an option {{"fs.s3a.open.option.length"}} which takes a long and allows > the length of the file to be declared. If set, *no check for the existence of > the file is issued when opening the file* > Also: withFileStatus() to take any FileStatus implementation, rather than > only S3AFileStatus -and not check that the path matches the path being > opened. Needed to support viewFS-style wrapping and mounting. > and Adopt where appropriate to stop clusters with S3A reads switched to > random IO from killing download/localization > * fs shell copyToLocal > * distcp > * IOUtils.copy -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-16202) Enhance openFile() for better read performance against object stores
[ https://issues.apache.org/jira/browse/HADOOP-16202?focusedWorklogId=763427&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763427 ] ASF GitHub Bot logged work on HADOOP-16202: --- Author: ASF GitHub Bot Created on: 28/Apr/22 11:05 Start Date: 28/Apr/22 11:05 Worklog Time Spent: 10m Work Description: steveloughran closed pull request #4238: HADOOP-16202. Enhanced openFile() -branch-3.3 backport URL: https://github.com/apache/hadoop/pull/4238 Issue Time Tracking --- Worklog Id: (was: 763427) Time Spent: 22h 50m (was: 22h 40m) > Enhance openFile() for better read performance against object stores > - > > Key: HADOOP-16202 > URL: https://issues.apache.org/jira/browse/HADOOP-16202 > Project: Hadoop Common > Issue Type: Improvement > Components: fs, fs/s3, tools/distcp >Affects Versions: 3.3.0 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.4 > > Time Spent: 22h 50m > Remaining Estimate: 0h > > The {{openFile()}} builder API lets us add new options when reading a file > Add an option {{"fs.s3a.open.option.length"}} which takes a long and allows > the length of the file to be declared. If set, *no check for the existence of > the file is issued when opening the file* > Also: withFileStatus() to take any FileStatus implementation, rather than > only S3AFileStatus -and not check that the path matches the path being > opened. Needed to support viewFS-style wrapping and mounting. > and Adopt where appropriate to stop clusters with S3A reads switched to > random IO from killing download/localization > * fs shell copyToLocal > * distcp > * IOUtils.copy -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] steveloughran closed pull request #4238: HADOOP-16202. Enhanced openFile() -branch-3.3 backport
steveloughran closed pull request #4238: HADOOP-16202. Enhanced openFile() -branch-3.3 backport URL: https://github.com/apache/hadoop/pull/4238 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] steveloughran commented on pull request #4238: HADOOP-16202. Enhanced openFile() -branch-3.3 backport
steveloughran commented on PR #4238: URL: https://github.com/apache/hadoop/pull/4238#issuecomment-1112072053 merged locally; closing -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-16965) Introduce StreamContext for Abfs Input and Output streams.
[ https://issues.apache.org/jira/browse/HADOOP-16965?focusedWorklogId=763425&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763425 ] ASF GitHub Bot logged work on HADOOP-16965: --- Author: ASF GitHub Bot Created on: 28/Apr/22 11:04 Start Date: 28/Apr/22 11:04 Worklog Time Spent: 10m Work Description: steveloughran commented on PR #4171: URL: https://github.com/apache/hadoop/pull/4171#issuecomment-1112071311 do you have a branch in your personal repo where you have cherrypicked all the changes you need and show that things are good test wise at the end of the chain? we don't need to do the patch by patch test and review if we are confident the final sequence is good Issue Time Tracking --- Worklog Id: (was: 763425) Time Spent: 4h (was: 3h 50m) > Introduce StreamContext for Abfs Input and Output streams. > -- > > Key: HADOOP-16965 > URL: https://issues.apache.org/jira/browse/HADOOP-16965 > Project: Hadoop Common > Issue Type: Improvement > Components: fs/azure >Reporter: Mukund Thakur >Assignee: Mukund Thakur >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 4h > Remaining Estimate: 0h > > The number of configuration keeps growing in AbfsOutputStream and > AbfsInputStream as we keep on adding new features. It is time to refactor the > configurations in a separate class like StreamContext and pass them around. > This is will improve the readability of code and reduce cherry-pick-backport > pain. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] steveloughran commented on pull request #4171: HADOOP-16965. Refactor abfs stream configuration. (#1956)
steveloughran commented on PR #4171: URL: https://github.com/apache/hadoop/pull/4171#issuecomment-1112071311 do you have a branch in your personal repo where you have cherry-picked all the changes you need, showing that things are good test-wise at the end of the chain? we don't need to do the patch-by-patch test and review if we are confident the final sequence is good
[GitHub] [hadoop] ferhui commented on pull request #4104: HDFS-16520. Improve EC pread: avoid potential reading whole block
ferhui commented on PR #4104: URL: https://github.com/apache/hadoop/pull/4104#issuecomment-1112070060 @cndaimin can you check and confirm whether the failed test is related to this PR?
[GitHub] [hadoop] steveloughran commented on pull request #4241: HDFS-16563. Namenode WebUI prints sensitive information on Token expiry
steveloughran commented on PR #4241: URL: https://github.com/apache/hadoop/pull/4241#issuecomment-997285 looks like there are some big assumptions about the nested stack trace coming back, so please restore that change and see what happens. if the issue is that toString leaks a secret, it should be fixed at that level, as it is likely to end up in logs. we don't want any output to expose secrets.
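Fixing the leak at the toString level, as suggested above, usually means printing only non-sensitive metadata and redacting the secret. A minimal sketch — the class and field names here are hypothetical, not Hadoop's actual token classes:

```java
// Illustrative sketch: redact the secret in toString so that stack traces,
// logs, and web UIs never echo the token password. Class and field names
// are hypothetical, not Hadoop's real delegation token implementation.
class TokenInfo {
    private final String kind;
    private final String service;
    private final byte[] password; // sensitive: must never be printed

    TokenInfo(String kind, String service, byte[] password) {
        this.kind = kind;
        this.service = service;
        this.password = password;
    }

    @Override
    public String toString() {
        // Only non-sensitive fields appear; the secret is replaced by a marker.
        return "TokenInfo{kind=" + kind
            + ", service=" + service
            + ", password=<redacted>}";
    }
}
```

With the redaction inside toString itself, callers that stringify the token for an exception message or a web page cannot leak the secret by accident.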
[jira] [Commented] (HADOOP-17882) distcp to use openFile() with sequential IO; ranges of reads
[ https://issues.apache.org/jira/browse/HADOOP-17882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17529346#comment-17529346 ]
Steve Loughran commented on HADOOP-17882: - this hasn't been done yet. proposed:
* pass in file length from file status
* ask for whole file when that's the algorithm
> distcp to use openFile() with sequential IO; ranges of reads
> Key: HADOOP-17882
> URL: https://issues.apache.org/jira/browse/HADOOP-17882
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: tools/distcp
> Affects Versions: 3.4.0
> Reporter: Steve Loughran
> Priority: Major
>
> once openFile adds standard options for sequential access, distcp should adopt them so as to enforce sequential reads on all uploads/backups
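The point of passing in the file length from the file status is that open() can then skip a fresh metadata probe against the store. A self-contained sketch of that builder idea — the class below is a hypothetical mini-model, not Hadoop's real FutureDataInputStreamBuilder, although the option key strings follow the `fs.option.openfile.*` naming that the openFile() work uses:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical mini-model of an openFile() builder. If the caller supplies
// the file length (e.g. from a FileStatus it already holds, as distcp does),
// build() can use it directly; otherwise a metadata probe (a HEAD request
// against an object store) would be needed to discover the length.
class OpenFileBuilder {
    private final Map<String, String> opts = new HashMap<>();
    private int probes; // counts simulated metadata probes

    OpenFileBuilder opt(String key, String value) {
        opts.put(key, value);
        return this;
    }

    /** Returns the file length, probing the store only when it is unknown. */
    long build() {
        String len = opts.get("fs.option.openfile.length");
        if (len != null) {
            return Long.parseLong(len); // length known up front: no probe
        }
        probes++; // simulated HEAD request
        return -1L; // unknown until the probe result arrives
    }

    int probeCount() { return probes; }
}
```

Usage would look like `builder.opt("fs.option.openfile.read.policy", "sequential").opt("fs.option.openfile.length", Long.toString(status.getLen())).build()` — the sequential read policy hints that the store should stream the whole range rather than optimize for random access.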
[jira] [Updated] (HADOOP-17882) distcp to use openFile() with sequential IO; ranges of reads
[ https://issues.apache.org/jira/browse/HADOOP-17882?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Steve Loughran updated HADOOP-17882: Parent Issue: HADOOP-18067 (was: HADOOP-16202)
> distcp to use openFile() with sequential IO; ranges of reads
> Key: HADOOP-17882
> URL: https://issues.apache.org/jira/browse/HADOOP-17882
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: tools/distcp
> Affects Versions: 3.4.0
> Reporter: Steve Loughran
> Priority: Major
>
> once openFile adds standard options for sequential access, distcp to adopt so as to enforce sequential reads on all uploads/backups
[jira] [Reopened] (HADOOP-17882) distcp to use openFile() with sequential IO; ranges of reads
[ https://issues.apache.org/jira/browse/HADOOP-17882?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Steve Loughran reopened HADOOP-17882:
> distcp to use openFile() with sequential IO; ranges of reads
> Key: HADOOP-17882
> URL: https://issues.apache.org/jira/browse/HADOOP-17882
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: tools/distcp
> Affects Versions: 3.4.0
> Reporter: Steve Loughran
> Priority: Major
>
> once openFile adds standard options for sequential access, distcp to adopt so as to enforce sequential reads on all uploads/backups
[GitHub] [hadoop] hadoop-yetus commented on pull request #4243: YARN-11117 check permission for LeveldbRMStateStore
hadoop-yetus commented on PR #4243: URL: https://github.com/apache/hadoop/pull/4243#issuecomment-895280

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 56s | | Docker mode activated. |
| | | | | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| | | | | _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 41m 25s | | trunk passed |
| +1 :green_heart: | compile | 1m 15s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 1m 5s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 0s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 12s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 3s | | trunk passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 0m 54s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 22s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 13s | | branch has no errors when building and testing our client artifacts. |
| | | | | _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 0s | | the patch passed |
| +1 :green_heart: | compile | 1m 6s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 1m 6s | | the patch passed |
| +1 :green_heart: | compile | 0m 55s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 55s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 46s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 2s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 47s | | the patch passed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 0m 44s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 12s | | the patch passed |
| +1 :green_heart: | shadedclient | 24m 24s | | patch has no errors when building and testing our client artifacts. |
| | | | | _ Other Tests _ |
| -1 :x: | unit | 103m 10s | [/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4243/2/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt) | hadoop-yarn-server-resourcemanager in the patch passed. |
| +0 :ok: | asflicense | 0m 37s | | ASF License check generated no output? |
| | | 212m 7s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.yarn.server.resourcemanager.webapp.TestRMWebServicesDelegationTokenAuthentication |
| | hadoop.yarn.server.resourcemanager.webapp.TestRMWebappAuthentication |
| | hadoop.yarn.server.resourcemanager.scheduler.capacity.TestApplicationPriority |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4243/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4243 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 4518c7d865fb 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 5d7ff02e5ef276bee31e802e300cbda2f70fedb6 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4243/2/testReport/ |
| Max. process+thread count | 883 (vs. ulimit of 5500) |
| modules | C