[GitHub] [hadoop] lzx404243 commented on pull request #2499: MAPREDUCE-7310. Clear the fileMap in JHEventHandlerForSigtermTest
lzx404243 commented on pull request #2499: URL: https://github.com/apache/hadoop/pull/2499#issuecomment-739132471 @jiwq Thanks for the review! Is there anything I can do on my end for this to be merged? This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #2521: HDFS-15711. Add Metrics to HttpFS Server.
hadoop-yetus commented on pull request #2521: URL: https://github.com/apache/hadoop/pull/2521#issuecomment-739118969

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 29s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 32m 59s | | trunk passed |
| +1 :green_heart: | compile | 0m 31s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 0m 29s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 27s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 40s | | trunk passed |
| +1 :green_heart: | shadedclient | 16m 5s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 30s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 0m 26s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 0m 48s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 0m 48s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 25s | | the patch passed |
| +1 :green_heart: | compile | 0m 23s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 0m 23s | | the patch passed |
| +1 :green_heart: | compile | 0m 21s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 0m 21s | | the patch passed |
| -0 :warning: | checkstyle | 0m 16s | [/diff-checkstyle-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/2/artifact/out/diff-checkstyle-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) | hadoop-hdfs-project/hadoop-hdfs-httpfs: The patch generated 2 new + 49 unchanged - 0 fixed = 51 total (was 49) |
| +1 :green_heart: | mvnsite | 0m 29s | | the patch passed |
| -1 :x: | whitespace | 0m 0s | [/whitespace-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/2/artifact/out/whitespace-eol.txt) | The patch has 4 line(s) that end in whitespace. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | shadedclient | 14m 41s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 25s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 0m 23s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | findbugs | 0m 48s | | the patch passed |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 5m 18s | | hadoop-hdfs-httpfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. |
| | | | 80m 34s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2521 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 5401478f2ba9 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 7dda804a1a7 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/2/testReport/ |
| Max. process+thread count | 716 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-httpfs U: hadoop-hdfs-project/hadoop-hdfs-httpfs |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/2/console |
| versions | git=2.17.1 maven=3.6.0
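The whitespace veto above points to `git apply --whitespace=fix`. As an illustrative sketch (file names are hypothetical, not from the patch under review), the same trailing whitespace can be spotted and stripped directly before re-submitting:

```shell
# Option 1 (what Yetus suggests): let git repair whitespace while applying a patch file:
#   git apply --whitespace=fix my-change.patch
# Option 2: find and strip trailing whitespace in a working file directly.
printf 'collector.start(); \nserver.init();\n' > /tmp/Demo.java  # line 1 ends in a space
grep -n '[[:space:]]$' /tmp/Demo.java      # report lines ending in whitespace
sed -i 's/[[:space:]]*$//' /tmp/Demo.java  # remove trailing whitespace in place
grep -n '[[:space:]]$' /tmp/Demo.java      # no output: the file is now clean
```

Either route clears the `-1 whitespace` vote on the next CI run.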
[jira] [Work logged] (HADOOP-16080) hadoop-aws does not work with hadoop-client-api
[ https://issues.apache.org/jira/browse/HADOOP-16080?focusedWorklogId=520417&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-520417 ]

ASF GitHub Bot logged work on HADOOP-16080:
-------------------------------------------

Author: ASF GitHub Bot
Created on: 05/Dec/20 03:53
Start Date: 05/Dec/20 03:53
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on pull request #2522: URL: https://github.com/apache/hadoop/pull/2522#issuecomment-739118562 (the mirrored Yetus report appears in full in the GitHub message below)
[GitHub] [hadoop] hadoop-yetus commented on pull request #2522: HADOOP-16080. hadoop-aws does not work with hadoop-client-api
hadoop-yetus commented on pull request #2522: URL: https://github.com/apache/hadoop/pull/2522#issuecomment-739118562

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Comment |
|:----:|----------:|:--------|:--------|
| +0 :ok: | reexec | 24m 48s | Docker mode activated. |
||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 2s | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | The patch appears to include 6 new or modified test files. |
||| _ branch-3.3 Compile Tests _ |
| +0 :ok: | mvndep | 3m 57s | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 28m 33s | branch-3.3 passed |
| +1 :green_heart: | compile | 15m 51s | branch-3.3 passed |
| +1 :green_heart: | checkstyle | 2m 43s | branch-3.3 passed |
| +1 :green_heart: | mvnsite | 4m 28s | branch-3.3 passed |
| +1 :green_heart: | shadedclient | 23m 7s | branch has no errors when building and testing our client artifacts. |
| -1 :x: | javadoc | 0m 40s | hadoop-aws in branch-3.3 failed. |
| +0 :ok: | spotbugs | 1m 7s | Used deprecated FindBugs config; considering switching to SpotBugs. |
| -1 :x: | findbugs | 0m 50s | hadoop-cloud-storage-project/hadoop-cos in branch-3.3 has 1 extant findbugs warnings. |
||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 26s | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 34s | the patch passed |
| +1 :green_heart: | compile | 14m 48s | the patch passed |
| +1 :green_heart: | javac | 14m 48s | the patch passed |
| -0 :warning: | checkstyle | 2m 42s | root: The patch generated 1 new + 101 unchanged - 0 fixed = 102 total (was 101) |
| +1 :green_heart: | mvnsite | 4m 26s | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 14m 42s | patch has no errors when building and testing our client artifacts. |
| -1 :x: | javadoc | 0m 42s | hadoop-aws in the patch failed. |
| -1 :x: | findbugs | 1m 23s | hadoop-tools/hadoop-aws generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) |
||| _ Other Tests _ |
| +1 :green_heart: | unit | 9m 33s | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 1m 36s | hadoop-aws in the patch passed. |
| +1 :green_heart: | unit | 1m 36s | hadoop-azure in the patch passed. |
| +1 :green_heart: | unit | 0m 37s | hadoop-aliyun in the patch passed. |
| +1 :green_heart: | unit | 0m 37s | hadoop-cos in the patch passed. |
| +1 :green_heart: | asflicense | 0m 55s | The patch does not generate ASF License warnings. |
| | | 176m 31s | |

| Reason | Tests |
|-------:|:------|
| FindBugs | module:hadoop-tools/hadoop-aws |
| | Exceptional return value of java.util.concurrent.ExecutorService.submit(Callable) ignored in org.apache.hadoop.fs.s3a.impl.StoreContext.submit(CompletableFuture, Callable) At StoreContext.java:[line 385] |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2522/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2522 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux fa52ec13fcb6 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-3.3 / 4628647 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~16.04-b01 |
| javadoc | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2522/1/artifact/out/branch-javadoc-hadoop-tools_hadoop-aws.txt |
| findbugs | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2522/1/artifact/out/branch-findbugs-hadoop-cloud-storage-project_hadoop-cos-warnings.html |
| checkstyle | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2522/1/artifact/out/diff-checkstyle-root.txt |
| javadoc | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2522/1/artifact/out/patch-javadoc-hadoop-tools_hadoop-aws.txt |
| findbugs | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2522/1/artifact/out/new-findbugs-hadoop-tools_hadoop-aws.html |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2522/1/testReport/ |
| Max. process+thread count | 1481 (vs.
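The new FindBugs warning above is about discarding the Future returned by ExecutorService.submit(). A minimal sketch of why that matters and the usual remedy (class and task names here are hypothetical, not the actual StoreContext code):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SubmitReturnValue {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Callable<String> failing = () -> { throw new IllegalStateException("boom"); };

        // Flagged pattern: the returned Future is discarded, so the
        // IllegalStateException thrown by the task is never observed.
        pool.submit(failing);

        // Remedy: keep the Future so the caller can surface the failure,
        // which arrives wrapped in an ExecutionException from get().
        Future<String> result = pool.submit(failing);
        try {
            result.get();
        } catch (ExecutionException e) {
            System.out.println("task failed: " + e.getCause().getMessage());
        }
        pool.shutdown();
    }
}
```

Running this prints the failure only for the second submission; the first one is lost, which is exactly the hazard the checker reports.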
[GitHub] [hadoop] hadoop-yetus commented on pull request #2521: HDFS-15711. Add Metrics to HttpFS Server.
hadoop-yetus commented on pull request #2521: URL: https://github.com/apache/hadoop/pull/2521#issuecomment-739106538

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 26m 20s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 34m 26s | | trunk passed |
| +1 :green_heart: | compile | 0m 31s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 0m 29s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 27s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 39s | | trunk passed |
| +1 :green_heart: | shadedclient | 16m 44s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 29s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 0m 27s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 0m 50s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 0m 49s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 25s | | the patch passed |
| +1 :green_heart: | compile | 0m 23s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 0m 23s | | the patch passed |
| +1 :green_heart: | compile | 0m 20s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 0m 20s | | the patch passed |
| -0 :warning: | checkstyle | 0m 17s | [/diff-checkstyle-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/1/artifact/out/diff-checkstyle-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) | hadoop-hdfs-project/hadoop-hdfs-httpfs: The patch generated 3 new + 49 unchanged - 0 fixed = 52 total (was 49) |
| +1 :green_heart: | mvnsite | 0m 29s | | the patch passed |
| -1 :x: | whitespace | 0m 0s | [/whitespace-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/1/artifact/out/whitespace-eol.txt) | The patch has 4 line(s) that end in whitespace. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | shadedclient | 15m 13s | | patch has no errors when building and testing our client artifacts. |
| -1 :x: | javadoc | 0m 24s | [/diff-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/1/artifact/out/diff-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt) | hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 generated 1 new + 55 unchanged - 0 fixed = 56 total (was 55) |
| -1 :x: | javadoc | 0m 23s | [/diff-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/1/artifact/out/diff-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt) | hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 generated 1 new + 55 unchanged - 0 fixed = 56 total (was 55) |
| -1 :x: | findbugs | 0m 50s | [/new-findbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2521/1/artifact/out/new-findbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.html) | hadoop-hdfs-project/hadoop-hdfs-httpfs generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 5m 19s | | hadoop-hdfs-httpfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 34s | | The patch does not generate ASF License warnings. |
| | | | 109m 8s | | |

| Reason | Tests |
|-------:|:------|
| FindBugs | module:hadoop-hdfs-project/hadoop-hdfs-httpfs |
| | Write to static field org.apache.hadoop.fs.http.s
[jira] [Work logged] (HADOOP-16080) hadoop-aws does not work with hadoop-client-api
[ https://issues.apache.org/jira/browse/HADOOP-16080?focusedWorklogId=520392&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-520392 ]

ASF GitHub Bot logged work on HADOOP-16080:
-------------------------------------------

Author: ASF GitHub Bot
Created on: 05/Dec/20 00:55
Start Date: 05/Dec/20 00:55
Worklog Time Spent: 10m

Work Description: sunchao opened a new pull request #2522: URL: https://github.com/apache/hadoop/pull/2522 (the full PR description appears in the GitHub message below)

Issue Time Tracking
-------------------

Worklog Id: (was: 520392)
Time Spent: 1h 50m (was: 1h 40m)

> hadoop-aws does not work with hadoop-client-api
> -----------------------------------------------
>
>                 Key: HADOOP-16080
>                 URL: https://issues.apache.org/jira/browse/HADOOP-16080
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs/s3
>    Affects Versions: 3.2.0, 3.1.1
>            Reporter: Keith Turner
>            Assignee: Chao Sun
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 3.2.2
>
>          Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> I attempted to use Accumulo and S3a with the following jars on the classpath.
> * hadoop-client-api-3.1.1.jar
> * hadoop-client-runtime-3.1.1.jar
> * hadoop-aws-3.1.1.jar
> This failed with the following exception.
> {noformat}
> Exception in thread "init" java.lang.NoSuchMethodError: org.apache.hadoop.util.SemaphoredDelegatingExecutor.<init>(Lcom/google/common/util/concurrent/ListeningExecutorService;IZ)V
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.create(S3AFileSystem.java:769)
> 	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1169)
> 	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1149)
> 	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1108)
> 	at org.apache.hadoop.fs.FileSystem.createNewFile(FileSystem.java:1413)
> 	at org.apache.accumulo.server.fs.VolumeManagerImpl.createNewFile(VolumeManagerImpl.java:184)
> 	at org.apache.accumulo.server.init.Initialize.initDirs(Initialize.java:479)
> 	at org.apache.accumulo.server.init.Initialize.initFileSystem(Initialize.java:487)
> 	at org.apache.accumulo.server.init.Initialize.initialize(Initialize.java:370)
> 	at org.apache.accumulo.server.init.Initialize.doInit(Initialize.java:348)
> 	at org.apache.accumulo.server.init.Initialize.execute(Initialize.java:967)
> 	at org.apache.accumulo.start.Main.lambda$execKeyword$0(Main.java:129)
> 	at java.lang.Thread.run(Thread.java:748)
> {noformat}
> The problem is that {{S3AFileSystem.create()}} looks for {{SemaphoredDelegatingExecutor(com.google.common.util.concurrent.ListeningExecutorService)}}, which does not exist in hadoop-client-api-3.1.1.jar. What does exist is {{SemaphoredDelegatingExecutor(org.apache.hadoop.shaded.com.google.common.util.concurrent.ListeningExecutorService)}}.
> To work around this issue I created a version of hadoop-aws-3.1.1.jar that relocated references to Guava.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[GitHub] [hadoop] sunchao opened a new pull request #2522: HADOOP-16080. hadoop-aws does not work with hadoop-client-api
sunchao opened a new pull request #2522: URL: https://github.com/apache/hadoop/pull/2522

(This is a backport from #2510)

This does the following:
- removes ListenableFuture as well as ListeningExecutorService from public interfaces, so that modules such as `hadoop-aws` and `hadoop-aliyun` can consume the class from hadoop-client-api without running into Guava conflicts.
- replaces error message template usages in `checkArgument/checkState/checkNotNull` with `String.format`. The former is generally not available in Guava version < 20, so this eliminates potential conflicts.
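The second change described above can be sketched in plain Java. This is an illustrative example only (the method and message below are hypothetical, not from the actual patch): the Guava-style message template is replaced by an explicit String.format and a direct IllegalArgumentException.

```java
public class PreconditionSketch {
    // Before (Guava), per the PR description:
    //   Preconditions.checkArgument(partSize > 0, "partSize %s must be positive", partSize);
    // After: no Guava dependency, message formatted with String.format.
    static void checkPositive(long partSize) {
        if (partSize <= 0) {
            throw new IllegalArgumentException(
                String.format("partSize %d must be positive", partSize));
        }
    }

    public static void main(String[] args) {
        checkPositive(8);  // valid argument: returns normally
        try {
            checkPositive(-1);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());  // "partSize -1 must be positive"
        }
    }
}
```

The formatting becomes eager rather than lazy, but it removes any dependence on which checkArgument overloads a given Guava version provides.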
[GitHub] [hadoop] amahussein opened a new pull request #2521: HDFS-15711. Add Metrics to HttpFS Server.
amahussein opened a new pull request #2521: URL: https://github.com/apache/hadoop/pull/2521

## NOTICE

Please create an issue in ASF JIRA before opening a pull request, and you need to set the title of the pull request which starts with the corresponding JIRA issue number. (e.g. HADOOP-X. Fix a typo in YYY.)

For more details, please see https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
[jira] [Work logged] (HADOOP-17408) Optimize NetworkTopology while sorting of block locations
[ https://issues.apache.org/jira/browse/HADOOP-17408?focusedWorklogId=520379&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-520379 ]

ASF GitHub Bot logged work on HADOOP-17408:
-------------------------------------------

Author: ASF GitHub Bot
Created on: 04/Dec/20 23:59
Start Date: 04/Dec/20 23:59
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on pull request #2514: URL: https://github.com/apache/hadoop/pull/2514#issuecomment-739083070 (the mirrored Yetus report appears in full in the GitHub message below)
[GitHub] [hadoop] hadoop-yetus commented on pull request #2514: HADOOP-17408. Optimize NetworkTopology while sorting of block locations.
hadoop-yetus commented on pull request #2514: URL: https://github.com/apache/hadoop/pull/2514#issuecomment-739083070

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 30m 25s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 13m 25s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 23m 26s | | trunk passed |
| +1 :green_heart: | compile | 21m 24s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 18m 6s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 2m 54s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 54s | | trunk passed |
| +1 :green_heart: | shadedclient | 23m 36s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 57s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 2m 58s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 3m 21s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 5m 36s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 23s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 6s | | the patch passed |
| +1 :green_heart: | compile | 20m 36s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 20m 36s | | the patch passed |
| +1 :green_heart: | compile | 18m 5s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 18m 5s | | the patch passed |
| +1 :green_heart: | checkstyle | 2m 51s | | root: The patch generated 0 new + 83 unchanged - 6 fixed = 83 total (was 89) |
| +1 :green_heart: | mvnsite | 2m 55s | | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 17m 20s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 55s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 2m 57s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | findbugs | 5m 54s | | the patch passed |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 9m 39s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 117m 34s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 57s | | The patch does not generate ASF License warnings. |
| | | | 348m 56s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2514/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2514 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 53798f82afdd 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 7dda804a1a7 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2514/2/testReport/ |
| Max. process+thread count | 3233 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2514/2/console |
| versions | git=2.17.1 maven=3.6.0 findbugs=4.0.6 |
| Powered by | Apache Yetus 0.13.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] ericbadger commented on pull request #2513: YARN-10494 CLI tool for docker-to-squashfs conversion (pure Java).
ericbadger commented on pull request #2513: URL: https://github.com/apache/hadoop/pull/2513#issuecomment-739079069 I'd be happy to do a review. It will likely take a while though. As you said, it's quite a bit of code. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] insideo commented on pull request #2513: YARN-10494 CLI tool for docker-to-squashfs conversion (pure Java).
insideo commented on pull request #2513: URL: https://github.com/apache/hadoop/pull/2513#issuecomment-739075300 Test failure appears to be unrelated. @ericbadger, would you be willing to do a review? I know it's a lot of code.
[GitHub] [hadoop] hadoop-yetus commented on pull request #2513: YARN-10494 CLI tool for docker-to-squashfs conversion (pure Java).
hadoop-yetus commented on pull request #2513: URL: https://github.com/apache/hadoop/pull/2513#issuecomment-739074860

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------|:-------:|
| +0 :ok: | reexec | 1m 7s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 3s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 1s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 69 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 34m 50s | | trunk passed |
| +1 :green_heart: | compile | 3m 2s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 2m 12s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 39s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 16s | | trunk passed |
| +1 :green_heart: | shadedclient | 20m 9s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 41s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 23s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 5m 44s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 5m 40s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 23s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 49s | | the patch passed |
| +1 :green_heart: | compile | 3m 5s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 3m 5s | | the patch passed |
| +1 :green_heart: | compile | 2m 15s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 2m 15s | | the patch passed |
| +1 :green_heart: | checkstyle | 0m 35s | | the patch passed |
| +1 :green_heart: | mvnsite | 2m 39s | | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | xml | 0m 5s | | The patch has no ill-formed XML file. |
| +1 :green_heart: | shadedclient | 16m 58s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 2m 4s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 46s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | findbugs | 6m 46s | | the patch passed |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 0m 29s | | hadoop-runc in the patch passed. |
| -1 :x: | unit | 54m 24s | [/patch-unit-hadoop-tools.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2513/5/artifact/out/patch-unit-hadoop-tools.txt) | hadoop-tools in the patch passed. |
| +1 :green_heart: | asflicense | 0m 34s | | The patch does not generate ASF License warnings. |
| | | 168m 37s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.tools.dynamometer.TestDynamometerInfra |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2513/5/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2513 |
| Optional Tests | dupname asflicense xml compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 71ca4a00ebfb 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 7dda804a1a7 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2513/5/testReport/ |
| Max. process+thread count | 958 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-runc hadoop-tools U: hadoop-tools |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2513/5/console |
| versions | git=2.17.1 maven=3.6.0 findbugs=4
[jira] [Commented] (HADOOP-17389) KMS should log full UGI principal
[ https://issues.apache.org/jira/browse/HADOOP-17389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17244273#comment-17244273 ]

Jim Brennan commented on HADOOP-17389:
--

Thanks [~aajisaka]! Any objection to cherry-picking this to other branch-3 branches?

> KMS should log full UGI principal
> -
>
> Key: HADOOP-17389
> URL: https://issues.apache.org/jira/browse/HADOOP-17389
> Project: Hadoop Common
> Issue Type: Improvement
> Reporter: Ahmed Hussein
> Assignee: Ahmed Hussein
> Priority: Major
> Labels: pull-request-available
> Fix For: 3.4.0
>
> Time Spent: 1h 20m
> Remaining Estimate: 0h
>
> [~daryn] reported that the kms-audit log only logs the short username:
> {{OK[op=GENERATE_EEK, key=key1, user=hdfs, accessCount=4206, interval=10427ms]}}
> In this example, it's impossible to tell which NN(s) requested EDEKs when they are all lumped together.

--
This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
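To make the aggregation problem above concrete, here is a small self-contained sketch (hypothetical names, not the actual KMS audit implementation) of why keying the kms-audit aggregation on the full UGI principal, rather than the short username, keeps per-NameNode GENERATE_EEK counts separate:

```java
import java.util.HashMap;
import java.util.Map;

public class KmsAuditAggregation {
    // Hypothetical sketch: build the key used to aggregate repeated audit events.
    // Keying on the short name collapses every NameNode running as "hdfs" into one
    // accessCount bucket; keying on the full principal keeps them apart.
    public static String auditKey(String op, String keyName, String user) {
        return op + "|" + keyName + "|" + user;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = new HashMap<>();
        // Two different NameNodes, both with short name "hdfs":
        String[] principals = {
            "hdfs/nn1.example.com@EXAMPLE.COM",
            "hdfs/nn2.example.com@EXAMPLE.COM"
        };
        for (String p : principals) {
            counts.merge(auditKey("GENERATE_EEK", "key1", p), 1, Integer::sum);      // full principal
            counts.merge(auditKey("GENERATE_EEK", "key1", "hdfs"), 1, Integer::sum); // short name
        }
        // Full principals produce two separate buckets; the short name lumps both
        // requests into a single bucket with count 2.
        System.out.println(counts);
    }
}
```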
[GitHub] [hadoop] hadoop-yetus commented on pull request #2513: YARN-10494 CLI tool for docker-to-squashfs conversion (pure Java).
hadoop-yetus commented on pull request #2513: URL: https://github.com/apache/hadoop/pull/2513#issuecomment-738994203

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------|:-------:|
| +0 :ok: | reexec | 1m 10s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 3s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 69 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 36m 18s | | trunk passed |
| +1 :green_heart: | compile | 3m 4s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 2m 12s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 40s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 17s | | trunk passed |
| +1 :green_heart: | shadedclient | 20m 26s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 41s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 23s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 5m 44s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 5m 40s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 25s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 49s | | the patch passed |
| +1 :green_heart: | compile | 3m 3s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 3m 3s | | the patch passed |
| +1 :green_heart: | compile | 2m 13s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 2m 13s | | the patch passed |
| +1 :green_heart: | checkstyle | 0m 35s | | the patch passed |
| +1 :green_heart: | mvnsite | 2m 37s | | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | xml | 0m 5s | | The patch has no ill-formed XML file. |
| +1 :green_heart: | shadedclient | 16m 56s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 2m 4s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 45s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | findbugs | 6m 44s | | the patch passed |
|||| _ Other Tests _ |
| -1 :x: | unit | 0m 30s | [/patch-unit-hadoop-tools_hadoop-runc.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2513/4/artifact/out/patch-unit-hadoop-tools_hadoop-runc.txt) | hadoop-runc in the patch passed. |
| -1 :x: | unit | 54m 23s | [/patch-unit-hadoop-tools.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2513/4/artifact/out/patch-unit-hadoop-tools.txt) | hadoop-tools in the patch passed. |
| +1 :green_heart: | asflicense | 0m 33s | | The patch does not generate ASF License warnings. |
| | | 169m 59s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.runc.squashfs.TestSquashFsInterop |
| | hadoop.runc.squashfs.TestSquashFsInterop |
| | hadoop.tools.dynamometer.TestDynamometerInfra |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2513/4/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2513 |
| Optional Tests | dupname asflicense xml compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 32dcfb00bb3d 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / e2c1268ebd5 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2513/4/testReport/ |
| Max. process+thread
[GitHub] [hadoop] sunchao merged pull request #2517: HDFS-15708. TestURLConnectionFactory fails by NoClassDefFoundError in branch-3.3 and branch-3.2
sunchao merged pull request #2517: URL: https://github.com/apache/hadoop/pull/2517
[GitHub] [hadoop] hadoop-yetus commented on pull request #2511: HDFS-15704. Mitigate lease monitor's rapid infinite loop.
hadoop-yetus commented on pull request #2511: URL: https://github.com/apache/hadoop/pull/2511#issuecomment-738932742

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------|:-------:|
| +0 :ok: | reexec | 1m 6s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 32m 54s | | trunk passed |
| +1 :green_heart: | compile | 1m 20s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 1m 16s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 48s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 22s | | trunk passed |
| +1 :green_heart: | shadedclient | 17m 11s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 54s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 24s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 3m 7s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 3m 5s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 10s | | the patch passed |
| +1 :green_heart: | compile | 1m 11s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 1m 11s | | hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 generated 0 new + 599 unchanged - 3 fixed = 599 total (was 602) |
| +1 :green_heart: | compile | 1m 4s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 1m 4s | | hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 generated 0 new + 583 unchanged - 3 fixed = 583 total (was 586) |
| +1 :green_heart: | checkstyle | 0m 40s | | hadoop-hdfs-project/hadoop-hdfs: The patch generated 0 new + 25 unchanged - 1 fixed = 25 total (was 26) |
| +1 :green_heart: | mvnsite | 1m 11s | | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 14m 53s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 48s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 22s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | findbugs | 3m 5s | | the patch passed |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 123m 51s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 44s | | The patch does not generate ASF License warnings. |
| | | 213m 5s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2511/4/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2511 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 6ffb41603865 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / e2c1268ebd5 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2511/4/testReport/ |
| Max. process+thread count | 3696 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2511/4/console |
| versions | git=2.17.1 maven=3.6.0 findbugs=4.0.6 |
| Powered by | Apache Yetus
[GitHub] [hadoop] hadoop-yetus commented on pull request #2516: HDFS-15707. NNTop counts don't add up as expected.
hadoop-yetus commented on pull request #2516: URL: https://github.com/apache/hadoop/pull/2516#issuecomment-738929524

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------|:-------:|
| +0 :ok: | reexec | 1m 11s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 35m 22s | | trunk passed |
| +1 :green_heart: | compile | 1m 20s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 1m 15s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 48s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 24s | | trunk passed |
| +1 :green_heart: | shadedclient | 19m 48s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 58s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 22s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 3m 49s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 3m 46s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 22s | | the patch passed |
| +1 :green_heart: | compile | 1m 27s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 1m 27s | | the patch passed |
| +1 :green_heart: | compile | 1m 14s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 1m 14s | | the patch passed |
| +1 :green_heart: | checkstyle | 0m 56s | | hadoop-hdfs-project/hadoop-hdfs: The patch generated 0 new + 25 unchanged - 4 fixed = 25 total (was 29) |
| +1 :green_heart: | mvnsite | 1m 50s | | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 22m 4s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 48s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 19s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | findbugs | 3m 20s | | the patch passed |
|||| _ Other Tests _ |
| -1 :x: | unit | 124m 57s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2516/2/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 36s | | The patch does not generate ASF License warnings. |
| | | 228m 44s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.server.datanode.TestNNHandlesBlockReportPerStorage |
| | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2516/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2516 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 84ab6657c249 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / e2c1268ebd5 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2516/2/testReport/ |
| Max. process+thread count | 2750 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2516/2/console |
| versions | git=2.17.1 maven=3.6.0 findbugs=4.0.6 |
| Powered by | Apa
[GitHub] [hadoop] ankit-kumar-25 commented on pull request #2519: YARN-10491: Fix deprecation warnings in SLSWebApp.java
ankit-kumar-25 commented on pull request #2519: URL: https://github.com/apache/hadoop/pull/2519#issuecomment-738925100 Hey @aajisaka, Thank you for your review. As suggested, I have added `StandardCharsets.UTF_8` instead of `Charset.forName(String)`. Can you please review? Thanks!
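As a standalone illustration of the swap described above (this is not the SLSWebApp code itself): `Charset.forName("UTF-8")` performs a runtime name lookup and can fail for a mistyped name, while `StandardCharsets.UTF_8` is a compile-time constant guaranteed to exist on every JVM; both resolve to the same charset.

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    public static void main(String[] args) {
        // Runtime lookup by name: works, but the string can be mistyped and
        // Charset.forName throws UnsupportedCharsetException for unknown names.
        Charset byLookup = Charset.forName("UTF-8");

        // Compile-time constant: always present, no lookup, no typo risk.
        Charset constant = StandardCharsets.UTF_8;

        // The two resolve to the same charset instance semantics.
        System.out.println(byLookup.equals(constant));
    }
}
```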
[GitHub] [hadoop] hadoop-yetus commented on pull request #2519: YARN-10491: Fix deprecation warnings in SLSWebApp.java
hadoop-yetus commented on pull request #2519: URL: https://github.com/apache/hadoop/pull/2519#issuecomment-738921278

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------|:-------:|
| +0 :ok: | reexec | 31m 7s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 34m 54s | | trunk passed |
| +1 :green_heart: | compile | 0m 26s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 0m 23s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 19s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 27s | | trunk passed |
| +1 :green_heart: | shadedclient | 17m 53s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 24s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 0m 23s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 0m 46s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 0m 44s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 24s | | the patch passed |
| +1 :green_heart: | compile | 0m 20s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 0m 20s | | hadoop-tools_hadoop-sls-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 generated 0 new + 0 unchanged - 6 fixed = 0 total (was 6) |
| +1 :green_heart: | compile | 0m 17s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 0m 17s | | hadoop-tools_hadoop-sls-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 generated 0 new + 0 unchanged - 6 fixed = 0 total (was 6) |
| +1 :green_heart: | checkstyle | 0m 12s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 20s | | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 16m 48s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 20s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 0m 19s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | findbugs | 0m 47s | | the patch passed |
|||| _ Other Tests _ |
| -1 :x: | unit | 11m 48s | [/patch-unit-hadoop-tools_hadoop-sls.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/2/artifact/out/patch-unit-hadoop-tools_hadoop-sls.txt) | hadoop-sls in the patch passed. |
| +1 :green_heart: | asflicense | 0m 29s | | The patch does not generate ASF License warnings. |
| | | 120m 59s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.yarn.sls.appmaster.TestAMSimulator |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2519 |
| JIRA Issue | YARN-10491 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux ba07bad2d4ee 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / e2c1268ebd5 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/2/testReport/ |
| Max. process+thread count | 513 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-sls U: hadoop-tools/hadoop-sls |
| Console outpu
[jira] [Updated] (HADOOP-15775) [JDK9] Add missing javax.activation-api dependency
[ https://issues.apache.org/jira/browse/HADOOP-15775?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiaoqiao He updated HADOOP-15775:
--
Fix Version/s: 3.2.3
               3.2.2

Thanks [~aajisaka]. Backported to branch-3.2.2 and branch-3.2.

> [JDK9] Add missing javax.activation-api dependency
> --
>
> Key: HADOOP-15775
> URL: https://issues.apache.org/jira/browse/HADOOP-15775
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: test
> Reporter: Akira Ajisaka
> Assignee: Akira Ajisaka
> Priority: Critical
> Fix For: 3.3.0, 3.2.2, 3.2.3
>
> Attachments: HADOOP-15775.01.patch, HADOOP-15775.02.patch, HADOOP-15775.03.patch, HADOOP-15775.04.patch, HADOOP-15775.05.patch, HADOOP-15775.06.patch
>
> Many unit tests fail due to missing java.activation module. This failure can be fixed by adding javax.activation-api as a third-party dependency.
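For context, a third-party dependency of this kind is declared in a module's pom.xml roughly as follows. This is an illustrative sketch only: the version and scope shown here are assumptions, not necessarily what the committed patch uses.

```xml
<!-- Illustrative: makes the javax.activation classes available on JDK 9+,
     where the java.activation module is no longer part of the JDK. -->
<dependency>
  <groupId>javax.activation</groupId>
  <artifactId>javax.activation-api</artifactId>
  <version>1.2.0</version> <!-- hypothetical version for illustration -->
</dependency>
```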
[GitHub] [hadoop] amahussein commented on pull request #2516: HDFS-15707. NNTop counts don't add up as expected.
amahussein commented on pull request #2516: URL: https://github.com/apache/hadoop/pull/2516#issuecomment-738810243

- I created a new jira, [HDFS-15710](https://issues.apache.org/jira/browse/HDFS-15710), for the `TestBlockTokenWithDFSStriped` failure.
- There is an existing jira for `TestUnderReplicatedBlocks`: [HDFS-9243](https://issues.apache.org/jira/browse/HDFS-9243)
[jira] [Commented] (HADOOP-17359) [Hadoop-Tools]S3A MultiObjectDeleteException after uploading a file
[ https://issues.apache.org/jira/browse/HADOOP-17359?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17244005#comment-17244005 ] Xun REN commented on HADOOP-17359:
----------------------------------
Hi [~ste...@apache.org], sorry, I haven't had time to work with Hadoop 3. Instead, I have modified the permissions on the parent directory to avoid the problem.

> [Hadoop-Tools] S3A MultiObjectDeleteException after uploading a file
> -------------------------------------------------------------------
>
> Key: HADOOP-17359
> URL: https://issues.apache.org/jira/browse/HADOOP-17359
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs/s3
> Affects Versions: 2.10.0
> Reporter: Xun REN
> Priority: Minor
>
> Hello,
>
> I am using org.apache.hadoop.fs.s3a.S3AFileSystem as the implementation for S3-related operations.
> When I upload a file to a path, it returns an error:
> {code:java}
> 20/11/05 11:49:13 ERROR s3a.S3AFileSystem: Partial failure of delete, 1 errors
> com.amazonaws.services.s3.model.MultiObjectDeleteException: One or more objects could not be deleted (Service: null; Status Code: 200; Error Code: null; Request ID: 767BEC034D0B5B8A; S3 Extended Request ID: JImfJY9hCl/QvninqT9aO+jrkmyRpRcceAg7t1lO936RfOg7izIom76RtpH+5rLqvmBFRx/++g8=; Proxy: null), S3 Extended Request ID: JImfJY9hCl/QvninqT9aO+jrkmyRpRcceAg7t1lO936RfOg7izIom76RtpH+5rLqvmBFRx/++g8=
>   at com.amazonaws.services.s3.AmazonS3Client.deleteObjects(AmazonS3Client.java:2287)
>   at org.apache.hadoop.fs.s3a.S3AFileSystem.deleteObjects(S3AFileSystem.java:1137)
>   at org.apache.hadoop.fs.s3a.S3AFileSystem.removeKeys(S3AFileSystem.java:1389)
>   at org.apache.hadoop.fs.s3a.S3AFileSystem.deleteUnnecessaryFakeDirectories(S3AFileSystem.java:2304)
>   at org.apache.hadoop.fs.s3a.S3AFileSystem.finishedWrite(S3AFileSystem.java:2270)
>   at org.apache.hadoop.fs.s3a.S3AFileSystem$WriteOperationHelper.writeSuccessful(S3AFileSystem.java:2768)
>   at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.close(S3ABlockOutputStream.java:371)
>   at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:74)
>   at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:108)
>   at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:69)
>   at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:128)
>   at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:488)
>   at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:410)
>   at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:342)
>   at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:277)
>   at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:262)
>   at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:327)
>   at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:299)
>   at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:257)
>   at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:281)
>   at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:265)
>   at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:228)
>   at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:285)
>   at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:119)
>   at org.apache.hadoop.fs.shell.Command.run(Command.java:175)
>   at org.apache.hadoop.fs.FsShell.run(FsShell.java:317)
>   at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>   at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>   at org.apache.hadoop.fs.FsShell.main(FsShell.java:380)
> 20/11/05 11:49:13 ERROR s3a.S3AFileSystem: bv/: "AccessDenied" - Access Denied
> {code}
> The problem is that Hadoop tries to create fake directories to map to S3 prefixes, and it cleans them up after the operation. The cleanup runs from the parent folder up to the root folder.
> If we don't grant the corresponding permission for some path, we hit this problem:
> [https://github.com/apache/hadoop/blob/rel/release-2.10.0/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3AFileSystem.java#L2296-L2301]
>
> During uploading, I don't see any "fake" directories being created. Why should we clean them up if they were never really created?
> It is the same for other operations such as rename or mkdir, where the "deleteUnnecessaryFakeDirectories" method is called.
> Maybe the solution is to check the delete permission before it calls the deleteObjects method.
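The cleanup described in the report, one directory marker per ancestor prefix, walking from the file's parent up to the bucket root, can be sketched as follows. This is an illustrative reconstruction only: the class `FakeDirKeys` and method `parentMarkerKeys` are invented for the sketch and are not the actual S3AFileSystem internals.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of how the "fake directory" keys deleted after a successful
// write are derived: walk from the object's parent prefix up to the
// bucket root, collecting one marker key (trailing '/') per level.
public class FakeDirKeys {
    static List<String> parentMarkerKeys(String objectKey) {
        List<String> keys = new ArrayList<>();
        int slash = objectKey.lastIndexOf('/');
        while (slash > 0) {
            String prefix = objectKey.substring(0, slash);
            keys.add(prefix + "/");   // directory markers end with '/'
            slash = prefix.lastIndexOf('/');
        }
        return keys;
    }

    public static void main(String[] args) {
        // For the "bv/" path from the log above:
        System.out.println(parentMarkerKeys("bv/data/part-0000"));
        // prints [bv/data/, bv/] -- deleting "bv/" needs delete
        // permission on that prefix, hence the AccessDenied.
    }
}
```

This makes the failure mode visible: the multi-object delete targets every ancestor prefix, so a missing delete permission anywhere on the path triggers the MultiObjectDeleteException even though no marker was actually created during the upload.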
[jira] [Commented] (HADOOP-17337) NetworkBinding has a runtime class dependency on a third-party shaded class
[ https://issues.apache.org/jira/browse/HADOOP-17337?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17243994#comment-17243994 ] Steve Loughran commented on HADOOP-17337:
-----------------------------------------
Checking up on this. I'd like to make this a blocker for 3.3.1, as it is bad news for lightweight Docker deployments.
# [~cwensel] do you know anyone who can work on this?
# What do people think would be the way to do it?

I'm thinking we'd need a PatchTheSocketFactory interface: try to load an implementation which does this to the shaded one; if that doesn't load, fall back to one which patches the unshaded one. That way even unshaded aws-s3 libraries would be able to switch to OpenSSL.

> NetworkBinding has a runtime class dependency on a third-party shaded class
> ---------------------------------------------------------------------------
>
> Key: HADOOP-17337
> URL: https://issues.apache.org/jira/browse/HADOOP-17337
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs/s3
> Affects Versions: 3.3.0
> Reporter: Chris Wensel
> Priority: Blocker
> Fix For: 3.3.1
>
> The hadoop-aws library has a dependency on 'com.amazonaws:aws-java-sdk-bundle', which in turn is a fat jar of all AWS SDK libraries and shaded dependencies.
>
> This dependency is 181MB.
>
> Some applications using the S3AFileSystem may be sensitive to having a large footprint; for example, an application built with Parquet and bundled into a Docker image.
>
> Typically, in prior Hadoop versions, the bundle was replaced by the specific AWS SDK dependencies, dropping the overall footprint.
>
> In 3.3 (and maybe prior versions) this strategy does not work because of the following exception:
> {code:java}
> java.lang.NoClassDefFoundError: com/amazonaws/thirdparty/apache/http/conn/socket/ConnectionSocketFactory
>   at org.apache.hadoop.fs.s3a.S3AUtils.initProtocolSettings(S3AUtils.java:1335)
>   at org.apache.hadoop.fs.s3a.S3AUtils.initConnectionSettings(S3AUtils.java:1290)
>   at org.apache.hadoop.fs.s3a.S3AUtils.createAwsConf(S3AUtils.java:1247)
>   at org.apache.hadoop.fs.s3a.DefaultS3ClientFactory.createS3Client(DefaultS3ClientFactory.java:61)
>   at org.apache.hadoop.fs.s3a.S3AFileSystem.bindAWSClient(S3AFileSystem.java:644)
>   at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:390)
>   at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3414)
>   at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:158)
>   at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3474)
>   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3442)
>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:524)
> {code}

--
This message was sent by Atlassian Jira (v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
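The fallback approach proposed in the comment above (try the shaded class first, then the unshaded one) boils down to ordered reflective class loading. A minimal sketch follows; the two ConnectionSocketFactory class names are the real shaded/unshaded pair from the stack trace, while the class `SocketFactoryBinding` and its `resolveFirst` helper are hypothetical, and `main` uses stand-in names so the demo runs without the AWS SDK on the classpath.

```java
// Sketch of "load shaded, fall back to unshaded" binding: try each
// candidate class name in order and return the first one that loads.
public class SocketFactoryBinding {
    // Real class names; shaded lives in aws-java-sdk-bundle only.
    static final String SHADED =
        "com.amazonaws.thirdparty.apache.http.conn.socket.ConnectionSocketFactory";
    static final String UNSHADED =
        "org.apache.http.conn.socket.ConnectionSocketFactory";

    /** Try each candidate in order; return the first that loads. */
    static Class<?> resolveFirst(String... names) throws ClassNotFoundException {
        ClassNotFoundException last = null;
        for (String name : names) {
            try {
                return Class.forName(name);
            } catch (ClassNotFoundException e) {
                last = e;  // remember, then try the next candidate
            }
        }
        throw last;
    }

    public static void main(String[] args) throws ClassNotFoundException {
        // In hadoop-aws this would be resolveFirst(SHADED, UNSHADED);
        // stand-in names keep the demo runnable without the AWS SDK.
        Class<?> c = resolveFirst("no.such.ShadedFactory", "java.lang.String");
        System.out.println(c.getName());  // prints java.lang.String
    }
}
```

The point of the design is that the choice of socket factory becomes a soft dependency: neither the shaded nor the unshaded class is referenced at compile time, so NetworkBinding no longer hard-fails when the 181MB bundle is swapped for the individual SDK jars.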
[jira] [Updated] (HADOOP-17337) NetworkBinding has a runtime class dependency on a third-party shaded class
[ https://issues.apache.org/jira/browse/HADOOP-17337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-17337:
------------------------------------
Priority: Blocker (was: Major)

> NetworkBinding has a runtime class dependency on a third-party shaded class
> ---------------------------------------------------------------------------
[jira] [Updated] (HADOOP-17337) NetworkBinding has a runtime class dependency on a third-party shaded class
[ https://issues.apache.org/jira/browse/HADOOP-17337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-17337:
------------------------------------
Fix Version/s: 3.3.1

> NetworkBinding has a runtime class dependency on a third-party shaded class
> ---------------------------------------------------------------------------
[jira] [Commented] (HADOOP-17359) [Hadoop-Tools]S3A MultiObjectDeleteException after uploading a file
[ https://issues.apache.org/jira/browse/HADOOP-17359?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17243993#comment-17243993 ] Steve Loughran commented on HADOOP-17359:
-----------------------------------------
[~renxunsaky] have you had any luck with an upgraded Hadoop version?

> [Hadoop-Tools] S3A MultiObjectDeleteException after uploading a file
> -------------------------------------------------------------------
[jira] [Commented] (HADOOP-17402) Add GCS FS impl reference to core-default.xml
[ https://issues.apache.org/jira/browse/HADOOP-17402?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17243991#comment-17243991 ] Steve Loughran commented on HADOOP-17402:
-----------------------------------------
I think we've always hoped to do the elegant, low-overhead service-load mechanism, but it hasn't happened. A core-default entry is simpler. It does require the implementation class not to change, though. What does the GCS team think?

> Add GCS FS impl reference to core-default.xml
> ---------------------------------------------
>
> Key: HADOOP-17402
> URL: https://issues.apache.org/jira/browse/HADOOP-17402
> Project: Hadoop Common
> Issue Type: Improvement
> Components: fs
> Reporter: Rafal Wojdyla
> Priority: Major
>
> Akin to the current S3 default configuration, add a GCS configuration, specifically to declare the GCS implementation. [GCS connector|https://cloud.google.com/dataproc/docs/concepts/connectors/cloud-storage].
> Has this not been done because the GCS connector is not part of the hadoop/ASF codebase, or is there any other blocker?
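For reference, a core-default.xml entry for GCS would presumably mirror the existing S3A one. The property name follows the fs.&lt;scheme&gt;.impl convention; the class below is the GCS connector's published FileSystem implementation, and this sketch assumes that name stays stable, which is exactly the caveat raised in the comment above:

```xml
<property>
  <name>fs.gs.impl</name>
  <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem</value>
  <description>The FileSystem implementation for gs: (Google Cloud Storage) URIs.</description>
</property>
```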
[jira] [Commented] (HADOOP-16775) DistCp reuses the same temp file within the task attempt for different files.
[ https://issues.apache.org/jira/browse/HADOOP-16775?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17243980#comment-17243980 ] Steve Loughran commented on HADOOP-16775:
-----------------------------------------
Note: consistent S3 renders this fix moot. Older releases are safe to use.

> DistCp reuses the same temp file within the task attempt for different files.
> -----------------------------------------------------------------------------
>
> Key: HADOOP-16775
> URL: https://issues.apache.org/jira/browse/HADOOP-16775
> Project: Hadoop Common
> Issue Type: Improvement
> Components: tools/distcp
> Affects Versions: 3.0.0
> Reporter: Amir Shenavandeh
> Assignee: Amir Shenavandeh
> Priority: Major
> Labels: DistCp, S3, hadoop-tools
> Fix For: 3.2.2
> Attachments: HADOOP-16775-v1.patch, HADOOP-16775.patch
>
> Hadoop DistCp reuses the same temp file name for all the files copied within each task attempt and then moves them to the target name, which is also a server-side copy. For copies to S3, this causes inconsistency, as S3 is read-after-write consistent only for brand-new objects; the contents of overwritten objects on S3 are also subject to inconsistency.
> To avoid this, we should randomize the temp file name, using a different name for each temp file.
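The fix described above, a freshly randomized temp name per copied file instead of one fixed name per task attempt, can be sketched like this. The `TempName` class and the `.distcp.tmp.` prefix are illustrative only, not DistCp's actual naming scheme.

```java
import java.util.UUID;

// Sketch: derive a unique temp-file name per copied file, so a copy to
// S3 never overwrites (and later server-side-copies) a stale object
// written earlier in the same task attempt.
public class TempName {
    static String randomTempName(String taskAttemptId) {
        return ".distcp.tmp." + taskAttemptId + "." + UUID.randomUUID();
    }

    public static void main(String[] args) {
        String a = randomTempName("attempt_001");
        String b = randomTempName("attempt_001");
        System.out.println(a.equals(b));  // prints false: every file gets its own name
    }
}
```

Because each temp object is brand new, every read of it after the write is covered by S3's (then) read-after-write guarantee, which is the whole point of the patch.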
[jira] [Commented] (HADOOP-16776) backport HADOOP-16775: distcp copies to s3 are randomly corrupted
[ https://issues.apache.org/jira/browse/HADOOP-16776?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17243979#comment-17243979 ] Steve Loughran commented on HADOOP-16776:
-----------------------------------------
Note: consistent S3 renders this fix moot. Older releases are safe to use.

> backport HADOOP-16775: distcp copies to s3 are randomly corrupted
> -----------------------------------------------------------------
>
> Key: HADOOP-16776
> URL: https://issues.apache.org/jira/browse/HADOOP-16776
> Project: Hadoop Common
> Issue Type: Improvement
> Components: tools/distcp
> Affects Versions: 2.8.0, 3.0.0, 2.10.0
> Reporter: Amir Shenavandeh
> Assignee: Amir Shenavandeh
> Priority: Blocker
> Labels: DistCp
> Fix For: 3.1.4, 2.10.1
> Attachments: HADOOP-16776-branch-2.8-001.patch, HADOOP-16776-branch-2.8-002.patch
>
> This is to backport HADOOP-16775 to the hadoop 2.8 branch.
[jira] [Work logged] (HADOOP-17290) ABFS: Add Identifiers to Client Request Header
[ https://issues.apache.org/jira/browse/HADOOP-17290?focusedWorklogId=520178&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-520178 ] ASF GitHub Bot logged work on HADOOP-17290:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 04/Dec/20 12:44
Start Date: 04/Dec/20 12:44
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on pull request #2520:
URL: https://github.com/apache/hadoop/pull/2520#issuecomment-738764426

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 33s | | Docker mode activated. |
| | | | | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 29 new or modified test files. |
| | | | | _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 32m 43s | | trunk passed |
| +1 :green_heart: | compile | 0m 39s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 0m 34s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 27s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 39s | | trunk passed |
| +1 :green_heart: | shadedclient | 16m 20s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 33s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 0m 29s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 0m 58s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 0m 57s | | trunk passed |
| -0 :warning: | patch | 1m 17s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
| | | | | _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 30s | | the patch passed |
| +1 :green_heart: | compile | 0m 29s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 0m 29s | | the patch passed |
| +1 :green_heart: | compile | 0m 25s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 0m 25s | | the patch passed |
| -0 :warning: | checkstyle | 0m 17s | [/diff-checkstyle-hadoop-tools_hadoop-azure.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2520/1/artifact/out/diff-checkstyle-hadoop-tools_hadoop-azure.txt) | hadoop-tools/hadoop-azure: The patch generated 34 new + 17 unchanged - 0 fixed = 51 total (was 17) |
| +1 :green_heart: | mvnsite | 0m 28s | | the patch passed |
| -1 :x: | whitespace | 0m 0s | [/whitespace-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2520/1/artifact/out/whitespace-eol.txt) | The patch has 2 line(s) that end in whitespace. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | shadedclient | 14m 39s | | patch has no errors when building and testing our client artifacts. |
| -1 :x: | javadoc | 0m 26s | [/diff-javadoc-javadoc-hadoop-tools_hadoop-azure-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2520/1/artifact/out/diff-javadoc-javadoc-hadoop-tools_hadoop-azure-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt) | hadoop-tools_hadoop-azure-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 generated 5 new + 17 unchanged - 0 fixed = 22 total (was 17) |
| -1 :x: | javadoc | 0m 24s | [/diff-javadoc-javadoc-hadoop-tools_hadoop-azure-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2520/1/artifact/out/diff-javadoc-javadoc-hadoop-tools_hadoop-azure-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt) | hadoop-tools_hadoop-azure-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 generated 5 new + 17 unchanged - 0 fixed = 22 total (was 17) |
| -1 :x: | findbugs | 1m 0s | [/new-findbugs-hadoop-tools_hadoop-azure.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2520/1/artifact/out/new-findbugs-hadoop-tools_hadoop-azure.html) | hadoop-tools/hadoop-azure generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) |
| | | | | _ Other Tests _ |
| +1 :green_heart: | unit | 1m 30s | | hadoop-azure in the patch passed. |
| -1 :x: | asflicense | 0m 34s | [/patch-asflicense-problems.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2520/1/artifact/out/patch-asflicense-problems.txt) | The patch genera
[GitHub] [hadoop] hadoop-yetus commented on pull request #2520: HADOOP-17290. ABFS: Add Identifiers to Client Request Header
hadoop-yetus commented on pull request #2520:
URL: https://github.com/apache/hadoop/pull/2520#issuecomment-738764426

:broken_heart: **-1 overall**
[jira] [Updated] (HADOOP-17290) ABFS: Add Identifiers to Client Request Header
[ https://issues.apache.org/jira/browse/HADOOP-17290?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-17290:
------------------------------------
Labels: abfsactive pull-request-available (was: abfsactive)

> ABFS: Add Identifiers to Client Request Header
> ----------------------------------------------
>
> Key: HADOOP-17290
> URL: https://issues.apache.org/jira/browse/HADOOP-17290
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/azure
> Affects Versions: 3.3.0
> Reporter: Sumangala Patki
> Priority: Major
> Labels: abfsactive, pull-request-available
> Time Spent: 10m
> Remaining Estimate: 0h
>
> Adding unique values to the client request header to assist in correlating requests.
[jira] [Work logged] (HADOOP-17290) ABFS: Add Identifiers to Client Request Header
[ https://issues.apache.org/jira/browse/HADOOP-17290?focusedWorklogId=520162&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-520162 ] ASF GitHub Bot logged work on HADOOP-17290:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 04/Dec/20 11:26
Start Date: 04/Dec/20 11:26
Worklog Time Spent: 10m

Work Description: sumangala-patki opened a new pull request #2520:
URL: https://github.com/apache/hadoop/pull/2520

Adding a set of identifiers to the `X_MS_CLIENT_REQUEST_ID` header to help correlate requests. This header contains IDs concatenated into a string and appears in the storage diagnostic logs. The clientRequestId GUID used to identify requests is generated uniquely for each HTTP request; the additional identifiers will help group and/or filter requests for analysis. Of these, the clientCorrelationId is a unique identifier that can be provided by the user. The rest are generated by the driver.

Two configs are introduced:
1. `fs.azure.client.correlation.id` - takes a String for clientCorrelationId (alphanumeric characters/hyphens, max length = 72)
2. `fs.azure.tracingcontext.format` - selects which IDs are included in the header [an enum: `SINGLE_ID_FORMAT` (existing format), `ALL_ID_FORMAT` (new default), `TWO_ID_FORMAT`]

Tests have been added to check the format of the header and validate the identifiers. The tests also ensure that an invalid config input does not result in request failure.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

Issue Time Tracking
-------------------
Worklog Id: (was: 520162)
Remaining Estimate: 0h
Time Spent: 10m

> ABFS: Add Identifiers to Client Request Header
> ----------------------------------------------
>
> Key: HADOOP-17290
> URL: https://issues.apache.org/jira/browse/HADOOP-17290
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/azure
> Affects Versions: 3.3.0
> Reporter: Sumangala Patki
> Priority: Major
> Labels: abfsactive
> Time Spent: 10m
> Remaining Estimate: 0h
>
> Adding unique values to the client request header to assist in correlating requests.
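The validation rules the PR describes (clientCorrelationId limited to alphanumerics and hyphens, length at most 72, and an invalid config value must not fail the request) can be sketched as follows. The class name, the ":" separator, and the fall-back-to-empty behavior are assumptions for illustration, not the ABFS driver's actual implementation.

```java
import java.util.UUID;
import java.util.regex.Pattern;

// Sketch of building the X_MS_CLIENT_REQUEST_ID value: a validated,
// user-supplied correlation ID joined with a per-request GUID.
public class TracingHeader {
    // Alphanumerics and hyphens only, at most 72 characters.
    static final Pattern VALID = Pattern.compile("[a-zA-Z0-9-]{0,72}");

    static String clientRequestIdHeader(String correlationId) {
        String id = (correlationId != null && VALID.matcher(correlationId).matches())
            ? correlationId
            : "";  // invalid config input must not fail the request
        return id + ":" + UUID.randomUUID();  // per-request clientRequestId
    }
}
```

Usage: `clientRequestIdHeader("my-job-42")` yields something like `my-job-42:<fresh GUID>`, while an invalid value such as `bad_id!` silently degrades to just the GUID portion, matching the PR's requirement that bad config never breaks a request.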
[GitHub] [hadoop] sumangala-patki opened a new pull request #2520: HADOOP-17290. ABFS: Add Identifiers to Client Request Header
sumangala-patki opened a new pull request #2520:
URL: https://github.com/apache/hadoop/pull/2520
[jira] [Commented] (HADOOP-15775) [JDK9] Add missing javax.activation-api dependency
[ https://issues.apache.org/jira/browse/HADOOP-15775?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17243928#comment-17243928 ]

Akira Ajisaka commented on HADOOP-15775:
----------------------------------------

Go for it!

> [JDK9] Add missing javax.activation-api dependency
> --------------------------------------------------
>
> Key: HADOOP-15775
> URL: https://issues.apache.org/jira/browse/HADOOP-15775
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: test
> Reporter: Akira Ajisaka
> Assignee: Akira Ajisaka
> Priority: Critical
> Fix For: 3.3.0
>
> Attachments: HADOOP-15775.01.patch, HADOOP-15775.02.patch,
> HADOOP-15775.03.patch, HADOOP-15775.04.patch, HADOOP-15775.05.patch,
> HADOOP-15775.06.patch
>
> Many unit tests fail due to the missing java.activation module. This failure
> can be fixed by adding javax.activation-api as a third-party dependency.
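For reference, a dependency like the one this issue describes is typically declared in a Maven `pom.xml` as below. The coordinates shown are the commonly used ones for the standalone `javax.activation-api` artifact; the exact version is an assumption here, so check the attached HADOOP-15775 patches for what was actually committed.

```xml
<dependency>
  <groupId>javax.activation</groupId>
  <artifactId>javax.activation-api</artifactId>
  <!-- version is an assumption; see the HADOOP-15775 patches for the real one -->
  <version>1.2.0</version>
</dependency>
```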
[GitHub] [hadoop] aajisaka commented on a change in pull request #2519: YARN-10491: Fix deprecation warnings in SLSWebApp.java
aajisaka commented on a change in pull request #2519:
URL: https://github.com/apache/hadoop/pull/2519#discussion_r535999377

## File path: hadoop-tools/hadoop-sls/src/main/java/org/apache/hadoop/yarn/sls/web/SLSWebApp.java
## @@ -54,6 +54,7 @@
 @Unstable
 public class SLSWebApp extends HttpServlet {
   private static final long serialVersionUID = 1905162041950251407L;
+  private static final Charset UTF8 = Charset.forName("UTF-8");

Review comment: `Charset.forName(String)` resolves the charset name at runtime and can throw exceptions for bad or unsupported names. We can avoid dealing with exceptions by using `StandardCharsets.UTF_8`.
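The reviewer's point can be shown in a small stand-alone snippet (stdlib only; nothing here is taken from the patch itself):

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class CharsetLookupDemo {
    public static void main(String[] args) {
        // Charset.forName(String) looks the name up at runtime and can throw
        // IllegalCharsetNameException or UnsupportedCharsetException
        // (both unchecked) for bad inputs.
        Charset byName = Charset.forName("UTF-8");

        // StandardCharsets.UTF_8 is a constant guaranteed to exist on every
        // JVM, so no exception handling is ever needed.
        Charset constant = StandardCharsets.UTF_8;

        // Both resolve to the same charset.
        System.out.println(byName.equals(constant)); // prints "true"
    }
}
```

The difference only matters for correctness hygiene, not behavior: both forms yield UTF-8, but the constant cannot fail and makes the intent explicit at compile time.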
[GitHub] [hadoop] hadoop-yetus commented on pull request #2519: YARN-10491: Fix deprecation warnings in SLSWebApp.java
hadoop-yetus commented on pull request #2519:
URL: https://github.com/apache/hadoop/pull/2519#issuecomment-738701112

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 2m 32s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 35m 30s | | trunk passed |
| +1 :green_heart: | compile | 0m 25s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 0m 24s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 20s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 27s | | trunk passed |
| +1 :green_heart: | shadedclient | 18m 2s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 24s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 0m 22s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 0m 47s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 0m 44s | | trunk passed |
|||| _ Patch Compile Tests _ |
| -1 :x: | mvninstall | 0m 15s | [/patch-mvninstall-hadoop-tools_hadoop-sls.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/1/artifact/out/patch-mvninstall-hadoop-tools_hadoop-sls.txt) | hadoop-sls in the patch failed. |
| -1 :x: | compile | 0m 16s | [/patch-compile-hadoop-tools_hadoop-sls-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/1/artifact/out/patch-compile-hadoop-tools_hadoop-sls-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt) | hadoop-sls in the patch failed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04. |
| -1 :x: | javac | 0m 16s | [/patch-compile-hadoop-tools_hadoop-sls-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/1/artifact/out/patch-compile-hadoop-tools_hadoop-sls-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt) | hadoop-sls in the patch failed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04. |
| -1 :x: | compile | 0m 15s | [/patch-compile-hadoop-tools_hadoop-sls-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/1/artifact/out/patch-compile-hadoop-tools_hadoop-sls-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt) | hadoop-sls in the patch failed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01. |
| -1 :x: | javac | 0m 15s | [/patch-compile-hadoop-tools_hadoop-sls-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/1/artifact/out/patch-compile-hadoop-tools_hadoop-sls-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt) | hadoop-sls in the patch failed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01. |
| +1 :green_heart: | checkstyle | 0m 12s | | the patch passed |
| -1 :x: | mvnsite | 0m 16s | [/patch-mvnsite-hadoop-tools_hadoop-sls.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/1/artifact/out/patch-mvnsite-hadoop-tools_hadoop-sls.txt) | hadoop-sls in the patch failed. |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 16m 57s | | patch has no errors when building and testing our client artifacts. |
| -1 :x: | javadoc | 0m 18s | [/patch-javadoc-hadoop-tools_hadoop-sls-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/1/artifact/out/patch-javadoc-hadoop-tools_hadoop-sls-jdkUbuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04.txt) | hadoop-sls in the patch failed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04. |
| -1 :x: | javadoc | 0m 19s | [/diff-javadoc-javadoc-hadoop-tools_hadoop-sls-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2519/1/artifact/out/diff-javadoc-javadoc-hadoop-tools_hadoop-sls-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01.txt) | hadoop-tools_hadoop-sls-jdkPrivateBuild-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 with JDK Private Build-1.8.0_275-8u275-b01-0
[GitHub] [hadoop] hadoop-yetus commented on pull request #2518: HDFS-15709. Socket file descriptor leak in StripedBlockChecksumRecons…
hadoop-yetus commented on pull request #2518:
URL: https://github.com/apache/hadoop/pull/2518#issuecomment-738694419

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 39s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 35m 24s | | trunk passed |
| +1 :green_heart: | compile | 1m 21s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | compile | 1m 15s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | checkstyle | 0m 48s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 22s | | trunk passed |
| +1 :green_heart: | shadedclient | 17m 52s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 55s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 23s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +0 :ok: | spotbugs | 3m 2s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 3m 0s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 9s | | the patch passed |
| +1 :green_heart: | compile | 1m 12s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javac | 1m 12s | | the patch passed |
| +1 :green_heart: | compile | 1m 6s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | javac | 1m 6s | | the patch passed |
| +1 :green_heart: | checkstyle | 0m 39s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 10s | | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 15m 1s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 47s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 |
| +1 :green_heart: | javadoc | 1m 24s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| +1 :green_heart: | findbugs | 3m 9s | | the patch passed |
|||| _ Other Tests _ |
| -1 :x: | unit | 101m 10s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2518/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. |
| | | 193m 26s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.server.balancer.TestBalancerWithMultipleNameNodes |
| | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
| | hadoop.hdfs.server.datanode.TestBPOfferService |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2518/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2518 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux bbc0cbb62bb1 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / e2c1268ebd5 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.18.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2518/1/testReport/ |
| Max. process+thread count | 4162 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2518/1/consol
[jira] [Commented] (HADOOP-15775) [JDK9] Add missing javax.activation-api dependency
[ https://issues.apache.org/jira/browse/HADOOP-15775?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17243890#comment-17243890 ]

Xiaoqiao He commented on HADOOP-15775:
--------------------------------------

Cherry-picked to branch-3.2 and verified locally; it passed, and ResourceManager started successfully. [~aajisaka], I will backport to branch-3.2 if there are no other comments shortly. Thanks.

> [JDK9] Add missing javax.activation-api dependency
> --------------------------------------------------
>
> Key: HADOOP-15775
> URL: https://issues.apache.org/jira/browse/HADOOP-15775
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: test
> Reporter: Akira Ajisaka
> Assignee: Akira Ajisaka
> Priority: Critical
> Fix For: 3.3.0
>
> Many unit tests fail due to the missing java.activation module. This failure
> can be fixed by adding javax.activation-api as a third-party dependency.
[GitHub] [hadoop] ankit-kumar-25 opened a new pull request #2519: YARN-10491: Fix deprecation warnings in SLSWebApp.java
ankit-kumar-25 opened a new pull request #2519:
URL: https://github.com/apache/hadoop/pull/2519

What? :: Fix deprecation warnings in SLSWebApp.java & TestSLSWebApp.java

-- `toString(URI uri)`: Deprecated. Use `toString(URI, Charset)` instead.
-- `readFileToString(File file)`: Deprecated. Use `readFileToString(File, Charset)` instead.

@aajisaka Can you please review this? Thanks!
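The fix this PR describes is a switch to the charset-accepting Commons IO overloads (`IOUtils.toString(URI, Charset)`, `FileUtils.readFileToString(File, Charset)`). The stand-alone sketch below illustrates the same explicit-charset pattern using only the JDK, so it does not assume commons-io on the classpath; the file name and contents are made up for the demonstration.

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class CharsetExplicitReadDemo {
    public static void main(String[] args) throws Exception {
        // Create a throwaway file to read back, standing in for the
        // HTML templates SLSWebApp loads.
        Path tmp = Files.createTempFile("sls-demo", ".html");
        Files.write(tmp, "<html></html>".getBytes(StandardCharsets.UTF_8));

        // JDK equivalent of FileUtils.readFileToString(file, StandardCharsets.UTF_8):
        // the charset is named explicitly instead of relying on the (deprecated,
        // platform-dependent) default-charset overload.
        String content = new String(Files.readAllBytes(tmp), StandardCharsets.UTF_8);

        System.out.println(content.equals("<html></html>")); // prints "true"
        Files.delete(tmp);
    }
}
```

The point of both the commons-io overloads and this JDK form is the same: naming the charset removes the dependence on the platform default encoding that made the old no-charset methods deprecation-worthy.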