[GitHub] [hadoop] hadoop-yetus commented on issue #1515: HDDS-2171. Dangling links in test report due to incompatible realpath

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1515: HDDS-2171. Dangling links in test report 
due to incompatible realpath
URL: https://github.com/apache/hadoop/pull/1515#issuecomment-534863601
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 147 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 37 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 31 | hadoop-ozone in trunk failed. |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 892 | branch has no errors when building and testing 
our client artifacts. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 39 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 34 | hadoop-ozone in the patch failed. |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | shellcheck | 0 | There were no new shellcheck issues. |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 727 | patch has no errors when building and testing 
our client artifacts. |
   ||| _ Other Tests _ |
   | -1 | unit | 30 | hadoop-hdds in the patch failed. |
   | -1 | unit | 28 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 36 | The patch does not generate ASF License warnings. |
   | | | 2152 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1515 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs |
   | uname | Linux 7a5ecd5be026 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 3f89084 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/artifact/out/patch-unit-hadoop-hdds.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/testReport/ |
   | Max. process+thread count | 412 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone U: hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/2/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] adoroszlai commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
adoroszlai commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534851943
 
 
   > I have spotbugs 3.1.12 installed using homebrew. Yet I get the same issue.
   
   `export PATH=${PATH}:/usr/local/Cellar/spotbugs/3.1.12/libexec/bin`
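   A slightly more defensive variant of the export above. The Cellar path is the
   one quoted in the comment and is version- and machine-specific (Homebrew may
   place it elsewhere), so this sketch checks the directory before extending
   `PATH` and then verifies that the launcher actually resolves:

   ```shell
   # Version- and machine-specific path (from the comment above); adjust as needed.
   SPOTBUGS_BIN="/usr/local/Cellar/spotbugs/3.1.12/libexec/bin"

   # Only extend PATH if the directory actually exists.
   if [ -d "${SPOTBUGS_BIN}" ]; then
     export PATH="${PATH}:${SPOTBUGS_BIN}"
   fi

   # Confirm the launcher now resolves.
   if command -v spotbugs >/dev/null 2>&1; then
     echo "spotbugs resolved to: $(command -v spotbugs)"
   else
     echo "spotbugs not on PATH"
   fi
   ```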




[jira] [Commented] (HADOOP-16598) Backport "HADOOP-16558 [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes" to all active branches

2019-09-24 Thread Hadoop QA (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16598?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16937400#comment-16937400
 ] 

Hadoop QA commented on HADOOP-16598:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 10m 
15s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green}  0m 
 0s{color} | {color:green} The patch appears to include 2 new or modified test 
files. {color} |
|| || || || {color:brown} branch-3.2 Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  2m 
18s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 21m 
45s{color} | {color:green} branch-3.2 passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 14m 
33s{color} | {color:green} branch-3.2 passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  4m  
6s{color} | {color:green} branch-3.2 passed {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 
54m 46s{color} | {color:green} branch has no errors when building and testing 
our client artifacts. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  3m 
29s{color} | {color:green} branch-3.2 passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
20s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  3m 
 8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 15m 
33s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} cc {color} | {color:green} 15m 
33s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 15m 
33s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  4m  
0s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} xml {color} | {color:green}  0m  
6s{color} | {color:green} The patch has no ill-formed XML file. {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 
12m 15s{color} | {color:green} patch has no errors when building and testing 
our client artifacts. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  3m 
38s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  0m 
21s{color} | {color:green} hadoop-project in the patch passed. {color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  9m 
14s{color} | {color:green} hadoop-common in the patch passed. {color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  1m 
55s{color} | {color:green} hadoop-hdfs-client in the patch passed. {color} |
| {color:red}-1{color} | {color:red} unit {color} | {color:red}104m 55s{color} 
| {color:red} hadoop-hdfs in the patch failed. {color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 17m  
8s{color} | {color:green} hadoop-hdfs-rbf in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
48s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black}243m 34s{color} | 
{color:black} {color} |
\\
\\
|| Reason || Tests ||
| Failed junit tests | hadoop.hdfs.TestLeaseRecovery2 |
|   | hadoop.hdfs.server.diskbalancer.TestDiskBalancer |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=19.03.0 Server=19.03.0 Image:yetus/hadoop:63396beab41 |
| JIRA Issue | HADOOP-16598 |
| JIRA Patch URL | 
https://issues.apache.org/jira/secure/attachment/12981256/HADOOP-16598-branch-3.2.patch
 |
| Optional Tests |  dupname  asflicense  compile  javac  javadoc  mvninstall  
mvnsite  unit  shadedclient  xml  cc  |
| uname | Linux 64a3c38f1ec6 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 

[GitHub] [hadoop] hadoop-yetus commented on issue #1519: HDDS-2174. Delete GDPR Encryption Key from metadata when a Key is deleted

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1519: HDDS-2174. Delete GDPR Encryption Key 
from metadata when a Key is deleted
URL: https://github.com/apache/hadoop/pull/1519#issuecomment-534846392
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 37 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 24 | Maven dependency ordering for branch |
   | -1 | mvninstall | 30 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 25 | hadoop-ozone in trunk failed. |
   | -1 | compile | 20 | hadoop-hdds in trunk failed. |
   | -1 | compile | 15 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 56 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 851 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 22 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 20 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 948 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 31 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 20 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 19 | Maven dependency ordering for patch |
   | -1 | mvninstall | 33 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 29 | hadoop-ozone in the patch failed. |
   | -1 | compile | 25 | hadoop-hdds in the patch failed. |
   | -1 | compile | 19 | hadoop-ozone in the patch failed. |
   | -1 | javac | 25 | hadoop-hdds in the patch failed. |
   | -1 | javac | 19 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 56 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 726 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 22 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 20 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 31 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 20 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 28 | hadoop-hdds in the patch failed. |
   | -1 | unit | 24 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 33 | The patch does not generate ASF License warnings. |
   | | | 2365 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1519 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 4fc39ac60747 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 3f89084 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1519/1/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | javac 

[GitHub] [hadoop] hadoop-yetus commented on issue #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1431: HDDS-1569 Support creating multiple 
pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#issuecomment-534846451
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 32 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ HDDS-1564 Compile Tests _ |
   | 0 | mvndep | 11 | Maven dependency ordering for branch |
   | -1 | mvninstall | 28 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | mvninstall | 22 | hadoop-ozone in HDDS-1564 failed. |
   | -1 | compile | 17 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | compile | 12 | hadoop-ozone in HDDS-1564 failed. |
   | +1 | checkstyle | 45 | HDDS-1564 passed |
   | +1 | mvnsite | 0 | HDDS-1564 passed |
   | +1 | shadedclient | 912 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 17 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | javadoc | 16 | hadoop-ozone in HDDS-1564 failed. |
   | 0 | spotbugs | 992 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 27 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | findbugs | 16 | hadoop-ozone in HDDS-1564 failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 13 | Maven dependency ordering for patch |
   | -1 | mvninstall | 30 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 24 | hadoop-ozone in the patch failed. |
   | -1 | compile | 19 | hadoop-hdds in the patch failed. |
   | -1 | compile | 15 | hadoop-ozone in the patch failed. |
   | -1 | javac | 19 | hadoop-hdds in the patch failed. |
   | -1 | javac | 15 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 24 | hadoop-hdds: The patch generated 0 new + 1 
unchanged - 2 fixed = 1 total (was 3) |
   | +1 | checkstyle | 26 | The patch passed checkstyle in hadoop-ozone |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 786 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 18 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 16 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 28 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 16 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 24 | hadoop-hdds in the patch failed. |
   | -1 | unit | 19 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 29 | The patch does not generate ASF License warnings. |
   | | | 2367 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1431 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 0be4f1abe9c4 4.15.0-54-generic #58-Ubuntu SMP Mon Jun 24 
10:55:24 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | HDDS-1564 / 7b5a5fe |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/9/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 

[GitHub] [hadoop] hadoop-yetus commented on issue #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1431: HDDS-1569 Support creating multiple 
pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#issuecomment-534846317
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 1831 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ HDDS-1564 Compile Tests _ |
   | 0 | mvndep | 20 | Maven dependency ordering for branch |
   | -1 | mvninstall | 31 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | mvninstall | 27 | hadoop-ozone in HDDS-1564 failed. |
   | -1 | compile | 20 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | compile | 16 | hadoop-ozone in HDDS-1564 failed. |
   | +1 | checkstyle | 61 | HDDS-1564 passed |
   | +1 | mvnsite | 0 | HDDS-1564 passed |
   | +1 | shadedclient | 848 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 22 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | javadoc | 19 | hadoop-ozone in HDDS-1564 failed. |
   | 0 | spotbugs | 945 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 31 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | findbugs | 20 | hadoop-ozone in HDDS-1564 failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 17 | Maven dependency ordering for patch |
   | -1 | mvninstall | 34 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 29 | hadoop-ozone in the patch failed. |
   | -1 | compile | 24 | hadoop-hdds in the patch failed. |
   | -1 | compile | 19 | hadoop-ozone in the patch failed. |
   | -1 | javac | 24 | hadoop-hdds in the patch failed. |
   | -1 | javac | 19 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 27 | hadoop-hdds: The patch generated 0 new + 1 
unchanged - 2 fixed = 1 total (was 3) |
   | +1 | checkstyle | 28 | The patch passed checkstyle in hadoop-ozone |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 695 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 20 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 20 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 31 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 20 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 28 | hadoop-hdds in the patch failed. |
   | -1 | unit | 23 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 33 | The patch does not generate ASF License warnings. |
   | | | 4124 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1431 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 39651ec46877 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | HDDS-1564 / 7b5a5fe |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/8/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 

[GitHub] [hadoop] dineshchitlangia opened a new pull request #1519: HDDS-2174. Delete GDPR Encryption Key from metadata when a Key is deleted

2019-09-24 Thread GitBox
dineshchitlangia opened a new pull request #1519: HDDS-2174. Delete GDPR 
Encryption Key from metadata when a Key is deleted
URL: https://github.com/apache/hadoop/pull/1519
 
 
   As advised, deleted the GDPR-related metadata from KeyInfo before moving it to 
deletedTable.
   Added/updated tests.
   
   P.S. The changes in KeyManagerImpl are not strictly needed, but I made them 
for the sake of sanity; they do no harm.





[jira] [Commented] (HADOOP-16138) hadoop fs mkdir / of nonexistent abfs container raises NPE

2019-09-24 Thread Ayush Saxena (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16138?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16937377#comment-16937377
 ] 

Ayush Saxena commented on HADOOP-16138:
---

Hi [~ste...@apache.org] [~gabor.bota],
I believe this change breaks {{TestHDFSCLI}} and {{TestDFSShell}}, since it 
changed the text of the exception:

{code:java}
-throw new PathNotFoundException(itemParentPath.toString());
+throw new PathNotFoundException(String.format(
+"mkdir failed for path: %s. Item parent path not found: %s.",
+itemPath.toString(), itemParentPath.toString()));
   }
{code}

For reference: 
https://builds.apache.org/job/PreCommit-HDFS-Build/27958/testReport/
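
The breakage is a golden-text mismatch: those CLI tests compare the command's 
error output against expected strings, so rewording the exception invalidates 
the expectations. A standalone sketch of the old vs. new message (illustrative 
paths only, plain {{String.format}}, no Hadoop classes):

{code:java}
public class MkdirMessageDemo {
    public static void main(String[] args) {
        // Illustrative paths; the real tests use their own fixtures.
        String itemPath = "/no/such/parent/child";
        String itemParentPath = "/no/such/parent";

        // Old exception text: only the parent path.
        String oldText = itemParentPath;

        // New exception text, mirroring the String.format call in the diff above.
        String newText = String.format(
            "mkdir failed for path: %s. Item parent path not found: %s.",
            itemPath, itemParentPath);

        // An expectation pinned to the old text no longer matches.
        System.out.println(oldText.equals(newText)); // prints: false
        System.out.println(newText);
    }
}
{code}

Updating the expected output in the test resources (or matching on a substring) 
would be the usual fix.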

> hadoop fs mkdir / of nonexistent abfs container raises NPE
> --
>
> Key: HADOOP-16138
> URL: https://issues.apache.org/jira/browse/HADOOP-16138
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/azure
>Affects Versions: 3.2.0
>Reporter: Steve Loughran
>Assignee: Gabor Bota
>Priority: Minor
>
> If you try to do a mkdir on the root of a nonexistent container, you get an 
> NPE
> {code}
> hadoop fs -mkdir  abfs://contain...@abfswales1.dfs.core.windows.net/  
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)




[GitHub] [hadoop] dineshchitlangia commented on issue #1519: HDDS-2174. Delete GDPR Encryption Key from metadata when a Key is deleted

2019-09-24 Thread GitBox
dineshchitlangia commented on issue #1519: HDDS-2174. Delete GDPR Encryption 
Key from metadata when a Key is deleted
URL: https://github.com/apache/hadoop/pull/1519#issuecomment-534838008
 
 
   /label ozone





[GitHub] [hadoop] hadoop-yetus commented on issue #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1431: HDDS-1569 Support creating multiple 
pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#issuecomment-534835906
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 35 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 25 new or modified test 
files. |
   ||| _ HDDS-1564 Compile Tests _ |
   | 0 | mvndep | 74 | Maven dependency ordering for branch |
   | -1 | mvninstall | 37 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | mvninstall | 29 | hadoop-ozone in HDDS-1564 failed. |
   | -1 | compile | 17 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | compile | 13 | hadoop-ozone in HDDS-1564 failed. |
   | +1 | checkstyle | 58 | HDDS-1564 passed |
   | +1 | mvnsite | 0 | HDDS-1564 passed |
   | +1 | shadedclient | 940 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 18 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | javadoc | 15 | hadoop-ozone in HDDS-1564 failed. |
   | 0 | spotbugs | 1022 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 29 | hadoop-hdds in HDDS-1564 failed. |
   | -1 | findbugs | 15 | hadoop-ozone in HDDS-1564 failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 24 | Maven dependency ordering for patch |
   | -1 | mvninstall | 30 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 24 | hadoop-ozone in the patch failed. |
   | -1 | compile | 20 | hadoop-hdds in the patch failed. |
   | -1 | compile | 14 | hadoop-ozone in the patch failed. |
   | -1 | javac | 20 | hadoop-hdds in the patch failed. |
   | -1 | javac | 14 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 49 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 791 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 17 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 16 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 27 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 16 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 25 | hadoop-hdds in the patch failed. |
   | -1 | unit | 19 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 30 | The patch does not generate ASF License warnings. |
   | | | 2502 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1431 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux d7ea5cf119ec 4.15.0-54-generic #58-Ubuntu SMP Mon Jun 24 10:55:24 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | HDDS-1564 / 7b5a5fe |
   | Default Java | 1.8.0_222 |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/branch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/branch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/branch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/branch-compile-hadoop-ozone.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/branch-javadoc-hadoop-hdds.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/branch-javadoc-hadoop-ozone.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/branch-findbugs-hadoop-hdds.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/branch-findbugs-hadoop-ozone.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/patch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/patch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/patch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1431/7/artifact/out/patch-compile-hadoop-ozone.txt |
   | javac | 

[GitHub] [hadoop] timmylicheng commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-24 Thread GitBox
timmylicheng commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#discussion_r327913168
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/safemode/HealthyPipelineSafeModeRule.java
 ##
 @@ -116,46 +115,46 @@ protected void process(PipelineReportFromDatanode
 // processed report event, we should not consider this pipeline report
 // from datanode again during threshold calculation.
 Preconditions.checkNotNull(pipelineReportFromDatanode);
-DatanodeDetails dnDetails = 
pipelineReportFromDatanode.getDatanodeDetails();
-if (!processedDatanodeDetails.contains(
-pipelineReportFromDatanode.getDatanodeDetails())) {
-
-  Pipeline pipeline;
-  PipelineReportsProto pipelineReport =
-  pipelineReportFromDatanode.getReport();
-
-  for (PipelineReport report : pipelineReport.getPipelineReportList()) {
-PipelineID pipelineID = PipelineID
-.getFromProtobuf(report.getPipelineID());
-try {
-  pipeline = pipelineManager.getPipeline(pipelineID);
-} catch (PipelineNotFoundException e) {
-  continue;
-}
-
-if (pipeline.getFactor() == HddsProtos.ReplicationFactor.THREE &&
-pipeline.getPipelineState() == Pipeline.PipelineState.OPEN) {
-  // If the pipeline is open state mean, all 3 datanodes are reported
-  // for this pipeline.
-  currentHealthyPipelineCount++;
-  getSafeModeMetrics().incCurrentHealthyPipelinesCount();
-}
+
+Pipeline pipeline;
+PipelineReportsProto pipelineReport =
+pipelineReportFromDatanode.getReport();
+
+for (PipelineReport report : pipelineReport.getPipelineReportList()) {
+  PipelineID pipelineID = PipelineID
+  .getFromProtobuf(report.getPipelineID());
+  try {
+pipeline = pipelineManager.getPipeline(pipelineID);
+  } catch (PipelineNotFoundException e) {
+continue;
   }
-  if (scmInSafeMode()) {
-SCMSafeModeManager.getLogger().info(
-"SCM in safe mode. Healthy pipelines reported count is {}, " +
-"required healthy pipeline reported count is {}",
-currentHealthyPipelineCount, healthyPipelineThresholdCount);
+
+  if (processedPipelineIDs.contains(pipelineID)) {
 
 Review comment:
   Good catch. Updated.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] timmylicheng commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-24 Thread GitBox
timmylicheng commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#discussion_r327913091
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/pipeline/PipelinePlacementPolicy.java
 ##
 @@ -80,7 +81,19 @@ public PipelinePlacementPolicy(
*/
   @VisibleForTesting
   boolean meetCriteria(DatanodeDetails datanodeDetails, long heavyNodeLimit) {
-return (nodeManager.getPipelinesCount(datanodeDetails) <= heavyNodeLimit);
+if (heavyNodeLimit == 0) {
+  // no limit applied.
+  return true;
+}
+boolean meet = (nodeManager.getPipelinesCount(datanodeDetails)
+<= heavyNodeLimit);
+if (!meet) {
+  LOG.info("Pipeline Placement: Doesn't meet the criteria to be viable " +
+  "node: " + datanodeDetails.getUuid().toString() + " Heaviness: " +
 
 Review comment:
   Sounds good. Updated.





[GitHub] [hadoop] timmylicheng commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-24 Thread GitBox
timmylicheng commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#discussion_r327913120
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/pipeline/SCMPipelineManager.java
 ##
 @@ -158,6 +158,8 @@ public synchronized Pipeline createPipeline(
   throw idEx;
 } catch (IOException ex) {
   metrics.incNumPipelineCreationFailed();
+  LOG.error("Pipeline creation failed due to exception: "
 
 Review comment:
   Updated.





[GitHub] [hadoop] hadoop-yetus commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534805110
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 45 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or modified tests.  Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 31 | Maven dependency ordering for branch |
   | -1 | mvninstall | 1320 | root in trunk failed. |
   | -1 | compile | 1040 | root in trunk failed. |
   | -1 | mvnsite | 249 | root in trunk failed. |
   | +1 | shadedclient | 753 | branch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 439 | root in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 34 | Maven dependency ordering for patch |
   | -1 | mvninstall | 1075 | root in the patch failed. |
   | -1 | mvninstall | 44 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 29 | common in the patch failed. |
   | -1 | mvninstall | 297 | container-service in the patch failed. |
   | -1 | mvninstall | 32 | server-scm in the patch failed. |
   | -1 | mvninstall | 37 | hadoop-ozone in the patch failed. |
   | -1 | mvninstall | 28 | common in the patch failed. |
   | -1 | mvninstall | 33 | csi in the patch failed. |
   | -1 | mvninstall | 31 | insight in the patch failed. |
   | -1 | mvninstall | 31 | ozone-manager in the patch failed. |
   | -1 | mvninstall | 32 | ozonefs in the patch failed. |
   | -1 | mvninstall | 32 | recon in the patch failed. |
   | -1 | mvninstall | 32 | s3gateway in the patch failed. |
   | -1 | mvninstall | 33 | tools in the patch failed. |
   | -1 | mvninstall | 28 | upgrade in the patch failed. |
   | -1 | compile | 984 | root in the patch failed. |
   | -1 | javac | 984 | root in the patch failed. |
   | -1 | mvnsite | 244 | root in the patch failed. |
   | -1 | shellcheck | 0 | The patch generated 4 new + 0 unchanged - 0 fixed = 4 total (was 0) |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | xml | 23 | The patch has no ill-formed XML file. |
   | +1 | shadedclient | 761 | patch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 435 | root in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 8623 | root in the patch failed. |
   | +1 | asflicense | 65 | The patch does not generate ASF License warnings. |
   | | | 17124 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.TestDFSShell |
   |   | hadoop.cli.TestHDFSCLI |
   |   | hadoop.hdfs.server.namenode.TestDecommissioningStatus |
   |   | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1513 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient xml shellcheck shelldocs |
   | uname | Linux afc910f89f90 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 66400c1 |
   | Default Java | 1.8.0_222 |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/branch-mvninstall-root.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/branch-compile-root.txt |
   | mvnsite | https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/branch-mvnsite-root.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/branch-javadoc-root.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/patch-mvninstall-root.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/patch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/patch-mvninstall-hadoop-hdds_common.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/3/artifact/out/patch-mvninstall-hadoop-hdds_container-service.txt |
   | mvninstall | 

[GitHub] [hadoop] hadoop-yetus commented on a change in pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
hadoop-yetus commented on a change in pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#discussion_r327893418
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/findbugs.sh
 ##
 @@ -16,16 +16,17 @@
 DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
 cd "$DIR/../../.." || exit 1
 
-mvn -B compile -fn findbugs:check -Dfindbugs.failOnError=false  -f 
pom.ozone.xml
+mvn -B compile spotbugs:spotbugs -f pom.ozone.xml
 
 REPORT_DIR=${OUTPUT_DIR:-"$DIR/../../../target/findbugs"}
 mkdir -p "$REPORT_DIR"
 REPORT_FILE="$REPORT_DIR/summary.txt"
 
 touch "$REPORT_FILE"
 
-find hadoop-ozone -name findbugsXml.xml -print0 | xargs -0 -n1 
convertXmlToText | tee -a "${REPORT_FILE}"
-find hadoop-hdds -name findbugsXml.xml -print0  | xargs -0 -n1 
convertXmlToText | tee -a "${REPORT_FILE}"
+find hadoop-hdds hadoop-ozone -name spotbugsXml.xml -print0 | xargs -0 
unionBugs -output ${REPORT_DIR}/summary.xml
+convertXmlToText ${REPORT_DIR}/summary.xml | tee -a "${REPORT_FILE}"
 
 Review comment:
   shellcheck:18: note: Double quote to prevent globbing and word splitting. [SC2086]
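   The SC2086 fix is to double-quote the flagged expansions such as `${REPORT_DIR}/summary.xml`. A minimal sketch of why the unquoted form misbehaves (not the actual patch; `count_args` and the sample path are illustrative):

```shell
# Sketch: unquoted expansion is split into words by the shell, so a
# REPORT_DIR containing a space yields two arguments instead of one.
REPORT_DIR="/tmp/spotbugs demo"   # hypothetical value with a space

# Helper that reports how many arguments it received.
count_args() { echo "$#"; }

count_args ${REPORT_DIR}/summary.xml    # unquoted: word-splits into 2 args
count_args "${REPORT_DIR}/summary.xml"  # quoted: stays a single arg, as shellcheck advises
```

   The same quoting applies to each `convertXmlToText`/`unionBugs` invocation the report flags.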
   





[GitHub] [hadoop] hadoop-yetus commented on a change in pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
hadoop-yetus commented on a change in pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#discussion_r327893424
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/findbugs.sh
 ##
 @@ -16,16 +16,17 @@
 DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
 cd "$DIR/../../.." || exit 1
 
-mvn -B compile -fn findbugs:check -Dfindbugs.failOnError=false  -f 
pom.ozone.xml
+mvn -B compile spotbugs:spotbugs -f pom.ozone.xml
 
 REPORT_DIR=${OUTPUT_DIR:-"$DIR/../../../target/findbugs"}
 mkdir -p "$REPORT_DIR"
 REPORT_FILE="$REPORT_DIR/summary.txt"
 
 touch "$REPORT_FILE"
 
-find hadoop-ozone -name findbugsXml.xml -print0 | xargs -0 -n1 
convertXmlToText | tee -a "${REPORT_FILE}"
-find hadoop-hdds -name findbugsXml.xml -print0  | xargs -0 -n1 
convertXmlToText | tee -a "${REPORT_FILE}"
+find hadoop-hdds hadoop-ozone -name spotbugsXml.xml -print0 | xargs -0 
unionBugs -output ${REPORT_DIR}/summary.xml
+convertXmlToText ${REPORT_DIR}/summary.xml | tee -a "${REPORT_FILE}"
+convertXmlToText -html:fancy-hist.xsl ${REPORT_DIR}/summary.xml 
${REPORT_DIR}/summary.html
 
 Review comment:
   shellcheck:39: note: Double quote to prevent globbing and word splitting. [SC2086]
   shellcheck:65: note: Double quote to prevent globbing and word splitting. [SC2086]
   





[GitHub] [hadoop] hadoop-yetus commented on a change in pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
hadoop-yetus commented on a change in pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#discussion_r327893410
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/findbugs.sh
 ##
 @@ -16,16 +16,17 @@
 DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
 cd "$DIR/../../.." || exit 1
 
-mvn -B compile -fn findbugs:check -Dfindbugs.failOnError=false  -f 
pom.ozone.xml
+mvn -B compile spotbugs:spotbugs -f pom.ozone.xml
 
 REPORT_DIR=${OUTPUT_DIR:-"$DIR/../../../target/findbugs"}
 mkdir -p "$REPORT_DIR"
 REPORT_FILE="$REPORT_DIR/summary.txt"
 
 touch "$REPORT_FILE"
 
-find hadoop-ozone -name findbugsXml.xml -print0 | xargs -0 -n1 
convertXmlToText | tee -a "${REPORT_FILE}"
-find hadoop-hdds -name findbugsXml.xml -print0  | xargs -0 -n1 
convertXmlToText | tee -a "${REPORT_FILE}"
+find hadoop-hdds hadoop-ozone -name spotbugsXml.xml -print0 | xargs -0 
unionBugs -output ${REPORT_DIR}/summary.xml
 
 Review comment:
   shellcheck:90: note: Double quote to prevent globbing and word splitting. [SC2086]
   





[jira] [Updated] (HADOOP-16598) Backport "HADOOP-16558 [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes" to all active branches

2019-09-24 Thread Duo Zhang (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16598?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Duo Zhang updated HADOOP-16598:
---
Attachment: HADOOP-16598-branch-3.2.patch

> Backport "HADOOP-16558 [COMMON+HDFS] use protobuf-maven-plugin to generate 
> protobuf classes" to all active branches
> ---
>
> Key: HADOOP-16598
> URL: https://issues.apache.org/jira/browse/HADOOP-16598
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Reporter: Duo Zhang
>Assignee: Duo Zhang
>Priority: Major
> Attachments: HADOOP-16598-branch-3.2.patch
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16598) Backport "HADOOP-16558 [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes" to all active branches

2019-09-24 Thread Duo Zhang (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16598?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Duo Zhang updated HADOOP-16598:
---
Status: Patch Available  (was: In Progress)

> Backport "HADOOP-16558 [COMMON+HDFS] use protobuf-maven-plugin to generate 
> protobuf classes" to all active branches
> ---
>
> Key: HADOOP-16598
> URL: https://issues.apache.org/jira/browse/HADOOP-16598
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Reporter: Duo Zhang
>Assignee: Duo Zhang
>Priority: Major
> Attachments: HADOOP-16598-branch-3.2.patch
>
>







[GitHub] [hadoop] Apache9 commented on issue #1512: HADOOP-16598. Backport "HADOOP-16558 [COMMON+HDFS] use protobuf-maven…

2019-09-24 Thread GitBox
Apache9 commented on issue #1512: HADOOP-16598. Backport "HADOOP-16558 [COMMON+HDFS] use protobuf-maven…
URL: https://github.com/apache/hadoop/pull/1512#issuecomment-534801839
 
 
   Oh, it seems we do not have a Jenkins file on branch-3.2... Let me upload a patch to the Jira...





[jira] [Updated] (HADOOP-16600) StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1

2019-09-24 Thread Duo Zhang (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Duo Zhang updated HADOOP-16600:
---
Assignee: Lisheng Sun
  Status: Patch Available  (was: Reopened)

> StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1
> -
>
> Key: HADOOP-16600
> URL: https://issues.apache.org/jira/browse/HADOOP-16600
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.2, 3.1.1, 3.1.0
>Reporter: Lisheng Sun
>Assignee: Lisheng Sun
>Priority: Major
> Attachments: HADOOP-16600.branch-3.1.v1.patch
>
>
> details see HADOOP-15398
> Problem: hadoop trunk compilation is failing
> Root Cause:
> compilation error is coming from 
> org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase. Compilation error is 
> "The method getArgumentAt(int, Class) is undefined for the 
> type InvocationOnMock".
> StagingTestBase is using getArgumentAt(int, Class) method 
> which is not available in mockito-all 1.8.5 version. getArgumentAt(int, 
> Class) method is available only from version 2.0.0-beta
> as follow code:
> {code:java}
> InitiateMultipartUploadRequest req = invocation.getArgumentAt(
> 0, InitiateMultipartUploadRequest.class);
> {code}
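A Mockito-1.8.5-compatible rewrite reads the raw argument array via InvocationOnMock.getArguments() (available in 1.x) and casts, instead of the 2.0.0-beta-only getArgumentAt(int, Class). The sketch below is illustrative only: InvocationLike is a hypothetical stand-in interface so the snippet runs without Mockito on the classpath, and the String argument stands in for InitiateMultipartUploadRequest.

```java
public class GetArgumentSketch {
    // Stand-in for org.mockito.invocation.InvocationOnMock: only the
    // getArguments() accessor, which exists in Mockito 1.8.5, is modeled.
    interface InvocationLike {
        Object[] getArguments();
    }

    static String firstArg(InvocationLike invocation) {
        // 1.8.5-compatible equivalent of
        // invocation.getArgumentAt(0, String.class): index into the raw
        // argument array and cast explicitly.
        return (String) invocation.getArguments()[0];
    }

    public static void main(String[] args) {
        InvocationLike inv = () -> new Object[] {"upload-request"};
        System.out.println(firstArg(inv));
    }
}
```

With real Mockito 1.8.5 the same pattern would be applied inside the Answer's answer(InvocationOnMock) method.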






[GitHub] [hadoop] LeonGao91 opened a new pull request #1518: HDFS-14678. Allow triggerBlockReport to a specific namenode (Backport from trunk)

2019-09-24 Thread GitBox
LeonGao91 opened a new pull request #1518: HDFS-14678. Allow triggerBlockReport to a specific namenode (Backport from trunk)
URL: https://github.com/apache/hadoop/pull/1518
 
 
   ## NOTICE
   
   Please create an issue in ASF JIRA before opening a pull request,
   and you need to set the title of the pull request which starts with
   the corresponding JIRA issue number. (e.g. HADOOP-X. Fix a typo in YYY.)
   For more details, please see https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
   





[GitHub] [hadoop] hadoop-yetus commented on issue #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#issuecomment-534799294
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 38 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test files. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 41 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 30 | hadoop-ozone in trunk failed. |
   | -1 | compile | 22 | hadoop-hdds in trunk failed. |
   | -1 | compile | 15 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 63 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 856 | branch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 23 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 25 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 963 | Used deprecated FindBugs config; considering switching to SpotBugs. |
   | -1 | findbugs | 33 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 21 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 36 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 29 | hadoop-ozone in the patch failed. |
   | -1 | compile | 24 | hadoop-hdds in the patch failed. |
   | -1 | compile | 20 | hadoop-ozone in the patch failed. |
   | -1 | javac | 24 | hadoop-hdds in the patch failed. |
   | -1 | javac | 20 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 41 | hadoop-ozone: The patch generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 708 | patch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 22 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 19 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 31 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 21 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 29 | hadoop-hdds in the patch failed. |
   | -1 | unit | 24 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 32 | The patch does not generate ASF License warnings. |
   | | | 2355 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1489 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux e8ede916717d 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / a346381 |
   | Default Java | 1.8.0_222 |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-compile-hadoop-ozone.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-javadoc-hadoop-hdds.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-javadoc-hadoop-ozone.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-findbugs-hadoop-hdds.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-findbugs-hadoop-ozone.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-compile-hadoop-ozone.txt |
   | javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-compile-hadoop-hdds.txt |
   | javac | 

[GitHub] [hadoop] bharatviswa504 commented on a change in pull request #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-24 Thread GitBox
bharatviswa504 commented on a change in pull request #1489: HDDS-2019. Handle 
Set DtService of token in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#discussion_r327880512
 
 

 ##
 File path: 
hadoop-ozone/s3gateway/src/main/java/org/apache/hadoop/ozone/s3/OzoneServiceProvider.java
 ##
 @@ -34,13 +41,45 @@
 
   private Text omServiceAdd;
 
+  private String omserviceID;
+
   @Inject
   private OzoneConfiguration conf;
 
   @PostConstruct
   public void init() {
-omServiceAdd = SecurityUtil.buildTokenService(OmUtils.
-getOmAddressForClients(conf));
+Collection serviceIdList =
+conf.getTrimmedStringCollection(OZONE_OM_SERVICE_IDS_KEY);
+if (serviceIdList.size() == 0) {
+  // Non-HA cluster
+  omServiceAdd = SecurityUtil.buildTokenService(OmUtils.
+  getOmAddressForClients(conf));
+} else {
+  // HA cluster
+  Collection serviceIds =
 
 Review comment:
   fixed it.





[GitHub] [hadoop] hadoop-yetus commented on issue #1517: HDDS-2169

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1517: HDDS-2169
URL: https://github.com/apache/hadoop/pull/1517#issuecomment-534790987
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 98 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 11 | Maven dependency ordering for branch |
   | -1 | mvninstall | 29 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 24 | hadoop-ozone in trunk failed. |
   | -1 | compile | 18 | hadoop-hdds in trunk failed. |
   | -1 | compile | 13 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 49 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 919 | branch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 18 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 16 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 1000 | Used deprecated FindBugs config; considering switching to SpotBugs. |
   | -1 | findbugs | 27 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 16 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 15 | Maven dependency ordering for patch |
   | -1 | mvninstall | 35 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 29 | hadoop-ozone in the patch failed. |
   | -1 | compile | 21 | hadoop-hdds in the patch failed. |
   | -1 | compile | 16 | hadoop-ozone in the patch failed. |
   | -1 | javac | 21 | hadoop-hdds in the patch failed. |
   | -1 | javac | 16 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 25 | hadoop-hdds: The patch generated 8 new + 0 unchanged - 0 fixed = 8 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 767 | patch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 18 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 17 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 28 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 16 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 24 | hadoop-hdds in the patch failed. |
   | -1 | unit | 20 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 28 | The patch does not generate ASF License warnings. |
   | | | 2450 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.0 Server=19.03.0 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1517 |
   | JIRA Issue | HDDS-2169 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 655817f52c57 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 6917754 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1517/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 

[GitHub] [hadoop] bharatviswa504 commented on issue #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-24 Thread GitBox
bharatviswa504 commented on issue #1489: HDDS-2019. Handle Set DtService of 
token in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#issuecomment-534790961
 
 
   Thank You @xiaoyuyao for the review.
   I have addressed review comments.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] bharatviswa504 commented on a change in pull request #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-24 Thread GitBox
bharatviswa504 commented on a change in pull request #1489: HDDS-2019. Handle 
Set DtService of token in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#discussion_r327880432
 
 

 ##
 File path: 
hadoop-ozone/s3gateway/src/main/java/org/apache/hadoop/ozone/s3/util/OzoneS3Util.java
 ##
 @@ -33,4 +42,29 @@ public static String getVolumeName(String userName) {
 Objects.requireNonNull(userName);
 return DigestUtils.md5Hex(userName);
   }
+
+  /**
+   * Generate service Name for token.
+   * @param configuration
+   * @param serviceId - ozone manager service ID
+   * @param omNodeIds - list of node ids for the given OM service.
+   * @return service Name.
+   */
+  public static String buildServiceNameForToken(
 
 Review comment:
   done
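   For context, the helper under review could be sketched roughly as
follows. This is a hedged illustration only: the class name, the
address-joining strategy, and the parameter shapes are assumptions
introduced for the example, not the actual patch code.

```java
import java.util.Collection;
import java.util.StringJoiner;

// Hypothetical sketch of a token service-name builder for OM HA:
// concatenate the RPC address of every OM node behind one logical
// service id into a single comma-separated token service string.
// All names here are assumptions, not the patch itself.
public final class ServiceNameSketch {
  public static String buildServiceNameForToken(
      String serviceId, Collection<String> omNodeAddresses) {
    StringJoiner joiner = new StringJoiner(",");
    for (String address : omNodeAddresses) {
      joiner.add(address); // e.g. "om1.example.com:9862"
    }
    return joiner.toString();
  }
}
```

   With node addresses ["om1:9862", "om2:9862"] this sketch would yield
"om1:9862,om2:9862".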





[GitHub] [hadoop] bharatviswa504 commented on a change in pull request #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-24 Thread GitBox
bharatviswa504 commented on a change in pull request #1489: HDDS-2019. Handle 
Set DtService of token in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#discussion_r327880447
 
 

 ##
 File path: 
hadoop-ozone/s3gateway/src/main/java/org/apache/hadoop/ozone/s3/OzoneServiceProvider.java
 ##
 @@ -34,13 +41,45 @@
 
   private Text omServiceAdd;
 
+  private String omserviceID;
+
   @Inject
   private OzoneConfiguration conf;
 
   @PostConstruct
   public void init() {
-omServiceAdd = SecurityUtil.buildTokenService(OmUtils.
-getOmAddressForClients(conf));
+Collection<String> serviceIdList =
+conf.getTrimmedStringCollection(OZONE_OM_SERVICE_IDS_KEY);
+if (serviceIdList.size() == 0) {
+  // Non-HA cluster
+  omServiceAdd = SecurityUtil.buildTokenService(OmUtils.
 
 Review comment:
   Existing one. Fixed in this PR.
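   The HA/non-HA branch quoted above can be sketched in isolation. This is
a hedged illustration under stated assumptions: the method name and the
fallback-address parameter are invented for the example and are not the
patch code.

```java
import java.util.Collection;
import java.util.Collections;

// Hypothetical sketch of the init() branching discussed above: an empty
// service-id list means a non-HA cluster, so the token service is the
// single OM address; otherwise the logical OM service id is used.
public final class TokenServiceSketch {
  public static String resolve(Collection<String> serviceIds,
      String singleOmAddress) {
    if (serviceIds.isEmpty()) {
      return singleOmAddress; // non-HA: one OM, one address
    }
    return serviceIds.iterator().next(); // HA: logical service id
  }

  public static void main(String[] args) {
    System.out.println(resolve(Collections.emptyList(), "om:9862"));
  }
}
```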





[GitHub] [hadoop] bharatviswa504 commented on a change in pull request #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-24 Thread GitBox
bharatviswa504 commented on a change in pull request #1489: HDDS-2019. Handle 
Set DtService of token in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#discussion_r327880447
 
 

 ##
 File path: 
hadoop-ozone/s3gateway/src/main/java/org/apache/hadoop/ozone/s3/OzoneServiceProvider.java
 ##
 @@ -34,13 +41,45 @@
 
   private Text omServiceAdd;
 
+  private String omserviceID;
+
   @Inject
   private OzoneConfiguration conf;
 
   @PostConstruct
   public void init() {
-omServiceAdd = SecurityUtil.buildTokenService(OmUtils.
-getOmAddressForClients(conf));
+Collection<String> serviceIdList =
+conf.getTrimmedStringCollection(OZONE_OM_SERVICE_IDS_KEY);
+if (serviceIdList.size() == 0) {
+  // Non-HA cluster
+  omServiceAdd = SecurityUtil.buildTokenService(OmUtils.
 
 Review comment:
   done





[GitHub] [hadoop] anuengineer edited a comment on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
anuengineer edited a comment on issue #1513: HDDS-2149. Replace FindBugs with 
SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534774435
 
 
   yes, it would be nice to get it working locally.
   
   I have spotbugs 3.1.12 installed using homebrew. Yet I get the same issue.
   `
   xargs: unionBugs: No such file or directory
   
   ./hadoop-ozone/dev-support/checks/findbugs.sh: line 28: convertXmlToText: 
command not found
   
   ./hadoop-ozone/dev-support/checks/findbugs.sh: line 29: convertXmlToText: 
command not found`





[GitHub] [hadoop] bharatviswa504 merged pull request #1509: HDDS-2168. TestOzoneManagerDoubleBufferWithOMResponse sometimes fails…

2019-09-24 Thread GitBox
bharatviswa504 merged pull request #1509: HDDS-2168. 
TestOzoneManagerDoubleBufferWithOMResponse sometimes fails…
URL: https://github.com/apache/hadoop/pull/1509
 
 
   





[GitHub] [hadoop] bharatviswa504 commented on issue #1509: HDDS-2168. TestOzoneManagerDoubleBufferWithOMResponse sometimes fails…

2019-09-24 Thread GitBox
bharatviswa504 commented on issue #1509: HDDS-2168. 
TestOzoneManagerDoubleBufferWithOMResponse sometimes fails…
URL: https://github.com/apache/hadoop/pull/1509#issuecomment-534786763
 
 
   I have committed this to trunk.





[GitHub] [hadoop] vivekratnavel commented on issue #1509: HDDS-2168. TestOzoneManagerDoubleBufferWithOMResponse sometimes fails…

2019-09-24 Thread GitBox
vivekratnavel commented on issue #1509: HDDS-2168. 
TestOzoneManagerDoubleBufferWithOMResponse sometimes fails…
URL: https://github.com/apache/hadoop/pull/1509#issuecomment-534786661
 
 
   The integration test failures are not related to the patch.





[GitHub] [hadoop] szetszwo opened a new pull request #1517: HDDS-2169

2019-09-24 Thread GitBox
szetszwo opened a new pull request #1517: HDDS-2169
URL: https://github.com/apache/hadoop/pull/1517
 
 
   Add some tests.





[GitHub] [hadoop] anuengineer commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
anuengineer commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534774435
 
 
   yes, it would be nice to get it working locally.
   
   I have spotbugs 3.1.12 installed using homebrew. Yet I get the same issue.
   `xargs: unionBugs: No such file or directory
   ./hadoop-ozone/dev-support/checks/findbugs.sh: line 28: convertXmlToText: 
command not found
   ./hadoop-ozone/dev-support/checks/findbugs.sh: line 29: convertXmlToText: 
command not found`





[GitHub] [hadoop] hadoop-yetus commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534769583
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 38 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 73 | Maven dependency ordering for branch |
   | -1 | mvninstall | 1099 | root in trunk failed. |
   | -1 | compile | 1046 | root in trunk failed. |
   | -1 | mvnsite | 274 | root in trunk failed. |
   | +1 | shadedclient | 804 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 450 | root in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 34 | Maven dependency ordering for patch |
   | -1 | mvninstall | 1073 | root in the patch failed. |
   | -1 | mvninstall | 44 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 29 | common in the patch failed. |
   | -1 | mvninstall | 31 | container-service in the patch failed. |
   | -1 | mvninstall | 34 | server-scm in the patch failed. |
   | -1 | mvninstall | 40 | hadoop-ozone in the patch failed. |
   | -1 | mvninstall | 27 | common in the patch failed. |
   | -1 | mvninstall | 34 | csi in the patch failed. |
   | -1 | mvninstall | 35 | insight in the patch failed. |
   | -1 | mvninstall | 31 | ozone-manager in the patch failed. |
   | -1 | mvninstall | 32 | ozonefs in the patch failed. |
   | -1 | mvninstall | 33 | recon in the patch failed. |
   | -1 | mvninstall | 32 | s3gateway in the patch failed. |
   | -1 | mvninstall | 33 | tools in the patch failed. |
   | -1 | mvninstall | 28 | upgrade in the patch failed. |
   | -1 | compile | 991 | root in the patch failed. |
   | -1 | javac | 991 | root in the patch failed. |
   | -1 | mvnsite | 250 | root in the patch failed. |
   | +1 | shellcheck | 1 | There were no new shellcheck issues. |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | xml | 23 | The patch has no ill-formed XML file. |
   | +1 | shadedclient | 766 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 443 | root in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 8720 | root in the patch failed. |
   | +1 | asflicense | 73 | The patch does not generate ASF License warnings. |
   | | | 16901 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.TestDFSShell |
   |   | hadoop.hdfs.server.namenode.ha.TestBootstrapAliasmap |
   |   | hadoop.cli.TestHDFSCLI |
   |   | hadoop.hdfs.TestMultipleNNPortQOP |
   |   | hadoop.hdfs.tools.TestDFSZKFailoverController |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1513 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient xml shellcheck shelldocs |
   | uname | Linux e86a421dc237 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / afa1006 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/branch-mvninstall-root.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/branch-compile-root.txt
 |
   | mvnsite | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/branch-mvnsite-root.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/branch-javadoc-root.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/patch-mvninstall-root.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/patch-mvninstall-hadoop-hdds_common.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/2/artifact/out/patch-mvninstall-hadoop-hdds_container-service.txt
 |
   | mvninstall | 

[GitHub] [hadoop] bharatviswa504 commented on a change in pull request #1486: HDDS-2158. Fixing Json Injection Issue in JsonUtils.

2019-09-24 Thread GitBox
bharatviswa504 commented on a change in pull request #1486: HDDS-2158. Fixing 
Json Injection Issue in JsonUtils.
URL: https://github.com/apache/hadoop/pull/1486#discussion_r327852770
 
 

 ##
 File path: 
hadoop-ozone/ozone-manager/src/main/java/org/apache/hadoop/ozone/web/ozShell/bucket/AddAclBucketHandler.java
 ##
 @@ -92,8 +92,9 @@ public Void call() throws Exception {
 boolean result = client.getObjectStore().addAcl(obj,
 OzoneAcl.parseAcl(acl));
 
-System.out.printf("%s%n", JsonUtils.toJsonStringWithDefaultPrettyPrinter(
-JsonUtils.toJsonString("Acl set successfully: " + result)));
+System.out.printf("%s%n", "Acl set successfully: " +
+JsonUtils.toJsonStringWithDefaultPrettyPrinter(result));
 
 Review comment:
   Here the result is true/false, so we can print it directly. Do we need 
toJsonStringWithDefaultPrettyPrinter here? Previously this was called with 
"Acl set successfully: " + result, but now with just result, so is it okay 
to print result directly?
   
   Same comment for all AclHandler classes.
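   The reviewer's point can be shown with a minimal sketch (the class and
method names below are hypothetical, not the patch code): a bare boolean
renders the same with or without a JSON pretty-printer, so printing it
directly suffices.

```java
// Hypothetical sketch: formatting the ACL result without a JSON
// pretty-printer, since a plain boolean needs no JSON escaping.
public final class AclOutputSketch {
  public static String format(boolean result) {
    return String.format("Acl set successfully: %s", result);
  }

  public static void main(String[] args) {
    System.out.println(format(true)); // prints "Acl set successfully: true"
  }
}
```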





[GitHub] [hadoop] hadoop-yetus commented on issue #1486: HDDS-2158. Fixing Json Injection Issue in JsonUtils.

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1486: HDDS-2158. Fixing Json Injection Issue 
in JsonUtils.
URL: https://github.com/apache/hadoop/pull/1486#issuecomment-534752541
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 73 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 42 | Maven dependency ordering for branch |
   | -1 | mvninstall | 35 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 24 | hadoop-ozone in trunk failed. |
   | -1 | compile | 18 | hadoop-hdds in trunk failed. |
   | -1 | compile | 13 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 52 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 993 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 22 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 20 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 1088 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 31 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 16 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 26 | Maven dependency ordering for patch |
   | -1 | mvninstall | 33 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 25 | hadoop-ozone in the patch failed. |
   | -1 | compile | 20 | hadoop-hdds in the patch failed. |
   | -1 | compile | 15 | hadoop-ozone in the patch failed. |
   | -1 | javac | 20 | hadoop-hdds in the patch failed. |
   | -1 | javac | 15 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 57 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 785 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 18 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 15 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 28 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 16 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 24 | hadoop-hdds in the patch failed. |
   | -1 | unit | 18 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 29 | The patch does not generate ASF License warnings. |
   | | | 2570 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.0 Server=19.03.0 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1486 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 885b88adb895 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 66400c1 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 

[GitHub] [hadoop] hadoop-yetus commented on issue #1511: HDDS-2162. Make Kerberos related configuration support HA style config.

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1511: HDDS-2162. Make Kerberos related 
configuration support HA style config.
URL: https://github.com/apache/hadoop/pull/1511#issuecomment-534750574
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 79 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 2 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 32 | Maven dependency ordering for branch |
   | -1 | mvninstall | 33 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 26 | hadoop-ozone in trunk failed. |
   | -1 | compile | 19 | hadoop-hdds in trunk failed. |
   | -1 | compile | 13 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 61 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 952 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 19 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 18 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 1044 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 30 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 19 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 15 | Maven dependency ordering for patch |
   | -1 | mvninstall | 31 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 28 | hadoop-ozone in the patch failed. |
   | -1 | compile | 20 | hadoop-hdds in the patch failed. |
   | -1 | compile | 15 | hadoop-ozone in the patch failed. |
   | -1 | javac | 20 | hadoop-hdds in the patch failed. |
   | -1 | javac | 15 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 54 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 786 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 18 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 17 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 27 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 17 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 25 | hadoop-hdds in the patch failed. |
   | -1 | unit | 19 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 29 | The patch does not generate ASF License warnings. |
   | | | 2514 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1511 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux cff901010f89 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 66400c1 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1511/2/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | javac 

[jira] [Commented] (HADOOP-9747) Reduce unnecessary UGI synchronization

2019-09-24 Thread Maxim Kolesnikov (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-9747?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16937184#comment-16937184
 ] 

Maxim Kolesnikov commented on HADOOP-9747:
--

Hello [~daryn], [~kihwal],

I see there was a long discussion about backporting this fix to branch-2, but 
it never happened.

May I ask why? Were there any major issues that prevented the backport? Is 
there any leftover work that could help resume it? From the thread I see that 
all patches were applied to Hadoop 3 only.

Thank you in advance.

> Reduce unnecessary UGI synchronization
> --
>
> Key: HADOOP-9747
> URL: https://issues.apache.org/jira/browse/HADOOP-9747
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: security
>Affects Versions: 0.23.0, 2.0.0-alpha, 3.0.0-alpha1
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
>Priority: Critical
> Fix For: 3.1.0, 3.0.3
>
> Attachments: HADOOP-9747-trunk-03.patch, HADOOP-9747-trunk-04.patch, 
> HADOOP-9747-trunk.01.patch, HADOOP-9747-trunk.02.patch, 
> HADOOP-9747.2.branch-2.patch, HADOOP-9747.2.trunk.patch, 
> HADOOP-9747.branch-2.patch, HADOOP-9747.trunk.patch
>
>
> Jstacks of heavily loaded NNs show up to dozens of threads blocking in the 
> UGI.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #568: HADOOP-15691 Add PathCapabilities to FS and FC to complement StreamCapabilities

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #568: HADOOP-15691 Add PathCapabilities to FS 
and FC to complement StreamCapabilities
URL: https://github.com/apache/hadoop/pull/568#issuecomment-534734470
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 70 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 9 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 69 | Maven dependency ordering for branch |
   | +1 | mvninstall | 1218 | trunk passed |
   | +1 | compile | 1192 | trunk passed |
   | +1 | checkstyle | 193 | trunk passed |
   | +1 | mvnsite | 298 | trunk passed |
   | +1 | shadedclient | 1356 | branch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 238 | trunk passed |
   | 0 | spotbugs | 45 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 493 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 23 | Maven dependency ordering for patch |
   | +1 | mvninstall | 207 | the patch passed |
   | +1 | compile | 1068 | the patch passed |
   | -1 | javac | 1068 | root generated 1 new + 1838 unchanged - 0 fixed = 1839 
total (was 1838) |
   | -0 | checkstyle | 180 | root: The patch generated 16 new + 600 unchanged - 
0 fixed = 616 total (was 600) |
   | +1 | mvnsite | 289 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 811 | patch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 251 | the patch passed |
   | +1 | findbugs | 540 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 577 | hadoop-common in the patch passed. |
   | +1 | unit | 126 | hadoop-hdfs-client in the patch passed. |
   | +1 | unit | 290 | hadoop-hdfs-httpfs in the patch passed. |
   | +1 | unit | 88 | hadoop-aws in the patch passed. |
   | +1 | unit | 83 | hadoop-azure in the patch passed. |
   | -1 | unit | 59 | hadoop-azure-datalake in the patch failed. |
   | +1 | asflicense | 46 | The patch does not generate ASF License warnings. |
   | | | 9598 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.fs.adl.live.TestAdlSdkConfiguration |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/9/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/568 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux c1b2c13bf9e5 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / afa1006 |
   | Default Java | 1.8.0_222 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/9/artifact/out/diff-compile-javac-root.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/9/artifact/out/diff-checkstyle-root.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/9/artifact/out/patch-unit-hadoop-tools_hadoop-azure-datalake.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/9/testReport/ |
   | Max. process+thread count | 1346 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common 
hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs-httpfs 
hadoop-tools/hadoop-aws hadoop-tools/hadoop-azure 
hadoop-tools/hadoop-azure-datalake U: . |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/9/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services




[GitHub] [hadoop] hadoop-yetus commented on issue #568: HADOOP-15691 Add PathCapabilities to FS and FC to complement StreamCapabilities

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #568: HADOOP-15691 Add PathCapabilities to FS 
and FC to complement StreamCapabilities
URL: https://github.com/apache/hadoop/pull/568#issuecomment-534732089
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 81 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 9 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 74 | Maven dependency ordering for branch |
   | +1 | mvninstall | 1214 | trunk passed |
   | +1 | compile | 1075 | trunk passed |
   | +1 | checkstyle | 179 | trunk passed |
   | +1 | mvnsite | 299 | trunk passed |
   | +1 | shadedclient | 1318 | branch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 241 | trunk passed |
   | 0 | spotbugs | 49 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 529 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 23 | Maven dependency ordering for patch |
   | +1 | mvninstall | 218 | the patch passed |
   | +1 | compile | 1037 | the patch passed |
   | -1 | javac | 1037 | root generated 1 new + 1838 unchanged - 0 fixed = 1839 
total (was 1838) |
   | -0 | checkstyle | 180 | root: The patch generated 16 new + 598 unchanged - 
0 fixed = 614 total (was 598) |
   | +1 | mvnsite | 290 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 803 | patch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 249 | the patch passed |
   | +1 | findbugs | 522 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 592 | hadoop-common in the patch passed. |
   | +1 | unit | 129 | hadoop-hdfs-client in the patch passed. |
   | +1 | unit | 289 | hadoop-hdfs-httpfs in the patch passed. |
   | +1 | unit | 79 | hadoop-aws in the patch passed. |
   | +1 | unit | 89 | hadoop-azure in the patch passed. |
   | -1 | unit | 61 | hadoop-azure-datalake in the patch failed. |
   | +1 | asflicense | 48 | The patch does not generate ASF License warnings. |
   | | | 9460 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.fs.adl.live.TestAdlSdkConfiguration |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/8/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/568 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 5160d3d3aa6c 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / afa1006 |
   | Default Java | 1.8.0_222 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/8/artifact/out/diff-compile-javac-root.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/8/artifact/out/diff-checkstyle-root.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/8/artifact/out/patch-unit-hadoop-tools_hadoop-azure-datalake.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/8/testReport/ |
   | Max. process+thread count | 1375 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common 
hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs-httpfs 
hadoop-tools/hadoop-aws hadoop-tools/hadoop-azure 
hadoop-tools/hadoop-azure-datalake U: . |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-568/8/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] adoroszlai commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
adoroszlai commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534726118
 
 
   > Also on my Mac, I get this error --
   > `xargs: convertXmlToText: No such file or directory`
   
   `convertXmlToText` is part of 
FindBugs/[SpotBugs](https://github.com/spotbugs/spotbugs/blob/master/spotbugs/src/scripts/standard/convertXmlToText).
 It is installed in the Docker image used by CI builds (I used the same for
testing).  It would be nice to get it working locally.
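A minimal sketch of what getting it working locally could look like, assuming a local SpotBugs install (the `SPOTBUGS_HOME` location below is an assumption, not something from this thread):

```shell
#!/usr/bin/env bash
# Hedged sketch: put convertXmlToText on the PATH from a local SpotBugs
# install.  SPOTBUGS_HOME below is an assumed install location; SpotBugs
# distributions ship the script under <install-dir>/bin.
SPOTBUGS_HOME="${SPOTBUGS_HOME:-$HOME/opt/spotbugs}"
if [ -x "$SPOTBUGS_HOME/bin/convertXmlToText" ]; then
  export PATH="$SPOTBUGS_HOME/bin:$PATH"
  echo "using convertXmlToText from $SPOTBUGS_HOME/bin"
else
  echo "convertXmlToText not found; install SpotBugs or use the CI docker image" >&2
fi
```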





[GitHub] [hadoop] adoroszlai commented on a change in pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
adoroszlai commented on a change in pull request #1513: HDDS-2149. Replace 
FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#discussion_r327807129
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/findbugs.sh
 ##
 @@ -16,16 +16,15 @@
 DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
 cd "$DIR/../../.." || exit 1
 
-mvn -B compile -fn findbugs:check -Dfindbugs.failOnError=false -f pom.ozone.xml
+mvn -B compile -fn spotbugs:check -Dspotbugs.failOnError=false -f pom.ozone.xml
 
 Review comment:
   Based on [this 
doc](https://spotbugs.readthedocs.io/en/latest/maven.html#goals-of-spotbugs-maven-plugin)
 I think `spotbugs:spotbugs` is the same as `spotbugs:check 
-Dspotbugs.failOnError=false`.





[GitHub] [hadoop] anuengineer commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
anuengineer commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534723820
 
 
   Also on my Mac, I get this error -- 
   `xargs: convertXmlToText: No such file or directory`





[GitHub] [hadoop] sahilTakiar commented on issue #963: HDFS-14564: Add libhdfs APIs for readFully; add readFully to ByteBufferPositionedReadable

2019-09-24 Thread GitBox
sahilTakiar commented on issue #963: HDFS-14564: Add libhdfs APIs for 
readFully; add readFully to ByteBufferPositionedReadable
URL: https://github.com/apache/hadoop/pull/963#issuecomment-534719995
 
 
   @jojochuang @smengcl any other comments?





[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327800711
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/pipeline/SCMPipelineMetrics.java
 ##
 @@ -94,6 +96,7 @@ public void getMetrics(MetricsCollector collector, boolean 
all) {
   }
 
   void createPerPipelineMetrics(Pipeline pipeline) {
+System.out.println("add pipeline " + pipeline.getId() + " to metrics map");
 
 Review comment:
   Debug statement?
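For reference, the usual replacement for such a leftover println is a guarded debug log. The sketch below uses the JDK logger purely to stay self-contained; the actual Ozone class would use its existing SLF4J `LOG`, and the pipeline id here is hypothetical.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Hedged sketch: turn a leftover System.out.println into a guarded debug
// log.  java.util.logging is used only to keep this self-contained; the
// real class would call its SLF4J LOG.debug(...) instead.
public class DebugLogSketch {
  private static final Logger LOG =
      Logger.getLogger(DebugLogSketch.class.getName());

  static String message(String pipelineId) {
    return "add pipeline " + pipelineId + " to metrics map";
  }

  public static void main(String[] args) {
    String pipelineId = "pipeline-1";  // hypothetical id
    // Instead of System.out.println(...):
    if (LOG.isLoggable(Level.FINE)) {  // ~ LOG.isDebugEnabled() in SLF4J
      LOG.fine(message(pipelineId));
    }
    System.out.println(message(pipelineId)); // prints "add pipeline pipeline-1 to metrics map"
  }
}
```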





[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327797492
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/pipeline/RatisPipelineProvider.java
 ##
 @@ -155,12 +147,25 @@ public Pipeline create(ReplicationFactor factor) throws 
IOException {
 
 Pipeline pipeline = Pipeline.newBuilder()
 .setId(PipelineID.randomId())
-.setState(PipelineState.OPEN)
+.setState(PipelineState.ALLOCATED)
 .setType(ReplicationType.RATIS)
 .setFactor(factor)
 .setNodes(dns)
 .build();
-initializePipeline(pipeline);
+
+// Send command to datanode to create pipeline
+final CreatePipelineCommand createCommand =
+new CreatePipelineCommand(pipeline.getId(), pipeline.getType(),
+factor, dns);
+
+dns.stream().forEach(node -> {
 
 Review comment:
   I see that we have done exactly the same as the current code, so no issues.





[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327797006
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/pipeline/RatisPipelineProvider.java
 ##
 @@ -155,12 +147,25 @@ public Pipeline create(ReplicationFactor factor) throws 
IOException {
 
 Pipeline pipeline = Pipeline.newBuilder()
 .setId(PipelineID.randomId())
-.setState(PipelineState.OPEN)
+.setState(PipelineState.ALLOCATED)
 .setType(ReplicationType.RATIS)
 .setFactor(factor)
 .setNodes(dns)
 .build();
-initializePipeline(pipeline);
+
+// Send command to datanode to create pipeline
+final CreatePipelineCommand createCommand =
+new CreatePipelineCommand(pipeline.getId(), pipeline.getType(),
+factor, dns);
+
+dns.stream().forEach(node -> {
 
 Review comment:
   Just a thought, probably need @nandakumar131  to weigh in too. 
   1. Since we are using the Ratis client in the DNs, we might be able to get
away with posting this command to just one of the datanodes.
   2. If we post this command to all DNs, we must be prepared to handle the
fact that 2 out of 3 creates in the DN will fail.
   3. Your current code is correct, since createPipeline inside the DN handles
the fact that createPipeline can fail.
   
   I am just flagging this information here so that the next code reviewer can
also reflect on it. No work or change needed.
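The trade-off in points 1 and 2 can be sketched with plain Java. None of these names are Ozone APIs; this only models the idea that when the create command is fanned out to every datanode, the caller should treat a lost create race as expected rather than as an error.

```java
import java.util.List;
import java.util.function.Predicate;

// Illustrative only -- none of these names are Ozone APIs.  Models the
// trade-off above: fan the create command out to all datanodes and
// tolerate the creates that fail, succeeding if at least one wins.
public class CreateFanoutSketch {

  static boolean postToAll(List<String> datanodes, Predicate<String> create) {
    int successes = 0;
    for (String dn : datanodes) {
      // A failed create on the remaining DNs is expected, not an error.
      if (create.test(dn)) {
        successes++;
      }
    }
    return successes >= 1;
  }

  public static void main(String[] args) {
    List<String> dns = List.of("dn1", "dn2", "dn3");
    // Simulate the case from point 2: only one of the three creates succeeds.
    System.out.println(postToAll(dns, dn -> dn.equals("dn1"))); // prints "true"
  }
}
```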





[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327794704
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/events/SCMEvents.java
 ##
 @@ -196,6 +198,14 @@
   public static final TypedEvent<SafeModeStatus> SAFE_MODE_STATUS =
   new TypedEvent<>(SafeModeStatus.class);
 
+  /**
+   * This event is triggered by CommandStatusReportHandler whenever a
+   * status for CreatePipeline SCMCommand is received.
+   */
+  public static final TypedEvent<CreatePipelineStatus>
+  CREATE_PIPELINE_STATUS =
+  new TypedEvent<>(CreatePipelineStatus.class, "Create_Pipeline_Status");
+
 
 Review comment:
   We don't need a destroyPipeline event since it is not used anywhere?





[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327794178
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/command/CommandStatusReportHandler.java
 ##
 @@ -55,13 +53,21 @@ public void onMessage(CommandStatusReportFromDatanode 
report,
 cmdStatusList.forEach(cmdStatus -> {
   LOGGER.trace("Emitting command status for id:{} type: {}", cmdStatus
   .getCmdId(), cmdStatus.getType());
-  if (cmdStatus.getType() == SCMCommandProto.Type.deleteBlocksCommand) {
+  switch (cmdStatus.getType()) {
+  case deleteBlocksCommand:
 if (cmdStatus.getStatus() == CommandStatus.Status.EXECUTED) {
   publisher.fireEvent(SCMEvents.DELETE_BLOCK_STATUS,
   new DeleteBlockStatus(cmdStatus));
 }
-  } else {
-LOGGER.debug("CommandStatus of type:{} not handled in " +
+break;
+  case createPipelineCommand:
+if (cmdStatus.getStatus() != CommandStatus.Status.PENDING) {
+  publisher.fireEvent(SCMEvents.CREATE_PIPELINE_STATUS,
+  new CreatePipelineStatus(cmdStatus));
+}
 
 Review comment:
   Do we need a handler for destroyPipeline?
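If such a handler were added, it would presumably mirror the createPipelineCommand branch above. Below is a self-contained sketch of that dispatch shape; the enum values, status names, and event strings are stand-ins, not the actual Ozone protobuf types or SCMEvents.

```java
// Sketch of the dispatch shape a pipeline-teardown branch might take,
// mirroring the createPipelineCommand case in the patch.  The enum and
// event names here are stand-ins, not Ozone types.
public class CommandStatusDispatchSketch {

  enum Type { deleteBlocksCommand, createPipelineCommand, closePipelineCommand }
  enum Status { PENDING, EXECUTED, FAILED }

  static String dispatch(Type type, Status status) {
    switch (type) {
      case createPipelineCommand:
        // Fire once the command has left the PENDING state.
        return status != Status.PENDING ? "CREATE_PIPELINE_STATUS" : "none";
      case closePipelineCommand:
        // Hypothetical mirror of the create branch for pipeline teardown.
        return status != Status.PENDING ? "CLOSE_PIPELINE_STATUS" : "none";
      default:
        return "none";
    }
  }

  public static void main(String[] args) {
    System.out.println(dispatch(Type.closePipelineCommand, Status.EXECUTED));
    // prints "CLOSE_PIPELINE_STATUS"
  }
}
```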





[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327792833
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/protocol/commands/CreatePipelineCommand.java
 ##
 @@ -0,0 +1,100 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations 
under
+ * the License.
+ */
+package org.apache.hadoop.ozone.protocol.commands;
+
+import com.google.common.base.Preconditions;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.protocol.proto.HddsProtos.ReplicationType;
+import org.apache.hadoop.hdds.protocol.proto.HddsProtos.ReplicationFactor;
+
+import java.util.List;
+import java.util.stream.Collectors;
+
+/**
+ * Asks datanode to create a pipeline.
+ */
+public class CreatePipelineCommand
+extends SCMCommand<CreatePipelineCommandProto> {
+
+  private final PipelineID pipelineID;
+  private final ReplicationFactor factor;
+  private final ReplicationType type;
+  private final List<DatanodeDetails> nodelist;
+
+  public CreatePipelineCommand(final PipelineID pipelineID,
+  final ReplicationType type, final ReplicationFactor factor,
+  final List<DatanodeDetails> datanodeList) {
+super();
+this.pipelineID = pipelineID;
+this.factor = factor;
+this.type = type;
+this.nodelist = datanodeList;
+  }
+
+  public CreatePipelineCommand(long cmdId, final PipelineID pipelineID,
+  final ReplicationType type, final ReplicationFactor factor,
+  final List<DatanodeDetails> datanodeList) {
+super(cmdId);
+this.pipelineID = pipelineID;
+this.factor = factor;
+this.type = type;
+this.nodelist = datanodeList;
+  }
+
+  /**
+   * Returns the type of this command.
+   *
+   * @return Type
+   */
+  @Override
+  public SCMCommandProto.Type getType() {
+return SCMCommandProto.Type.createPipelineCommand;
+  }
+
+  @Override
+  public CreatePipelineCommandProto getProto() {
+return CreatePipelineCommandProto.newBuilder()
+.setCmdId(getId())
+.setPipelineID(pipelineID.getProtobuf())
 
 Review comment:
   Since we set the pipeline ID, it might be easier to detect whether the
pipeline already exists.





[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327792351
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/container/common/statemachine/commandhandler/CreatePipelineCommandHandler.java
 ##
 @@ -0,0 +1,228 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations 
under
+ * the License.
+ */
+package org.apache.hadoop.ozone.container.common.statemachine.commandhandler;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineACKProto;
+import org.apache.hadoop.hdds.ratis.RatisHelper;
+import org.apache.hadoop.hdds.scm.ScmConfigKeys;
+import org.apache.hadoop.hdds.scm.client.HddsClientUtils;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.security.x509.SecurityConfig;
+import org.apache.hadoop.hdds.security.x509.certificate.client
+.CertificateClient;
+import org.apache.hadoop.io.MultipleIOException;
+import org.apache.hadoop.ozone.container.common.statemachine
+.SCMConnectionManager;
+import org.apache.hadoop.ozone.container.common.statemachine.StateContext;
+import org.apache.hadoop.ozone.container.ozoneimpl.OzoneContainer;
+import org.apache.hadoop.ozone.protocol.commands.CommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommand;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.SCMCommand;
+import org.apache.hadoop.util.Time;
+import org.apache.ratis.client.RaftClient;
+import org.apache.ratis.grpc.GrpcTlsConfig;
+import org.apache.ratis.protocol.NotLeaderException;
+import org.apache.ratis.protocol.RaftClientReply;
+import org.apache.ratis.protocol.RaftGroup;
+import org.apache.ratis.protocol.RaftGroupId;
+import org.apache.ratis.protocol.RaftPeer;
+import org.apache.ratis.retry.RetryPolicy;
+import org.apache.ratis.rpc.SupportedRpcType;
+import org.apache.ratis.util.TimeDuration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import java.util.concurrent.RejectedExecutionException;
+import java.util.concurrent.atomic.AtomicLong;
+import java.util.function.Consumer;
+import java.util.stream.Collectors;
+
+/**
+ * Handler for create pipeline command received from SCM.
+ */
+public class CreatePipelineCommandHandler implements CommandHandler {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(CreatePipelineCommandHandler.class);
+
+  private AtomicLong invocationCount = new AtomicLong(0);
+  private long totalTime;
+
+  /**
+   * Constructs a createPipelineCommand handler.
+   */
+  public CreatePipelineCommandHandler() {
+  }
+
+  /**
+   * Handles a given SCM command.
+   *
+   * @param command   - SCM Command
+   * @param ozoneContainer- Ozone Container.
+   * @param context   - Current Context.
+   * @param connectionManager - The SCMs that we are talking to.
+   */
+  @Override
+  public void handle(SCMCommand command, OzoneContainer ozoneContainer,
+  StateContext context, SCMConnectionManager connectionManager) {
+invocationCount.incrementAndGet();
+final long startTime = Time.monotonicNow();
+final DatanodeDetails dn = context.getParent()
+.getDatanodeDetails();
+final CreatePipelineCommandProto createCommand =
+((CreatePipelineCommand)command).getProto();
+final PipelineID pipelineID = PipelineID.getFromProtobuf(
+createCommand.getPipelineID());
+Collection<DatanodeDetails> peers =
+createCommand.getDatanodeList().stream()
+.map(DatanodeDetails::getFromProtoBuf)
+.collect(Collectors.toList());
+
+ 

[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327791852
 
 

 ##
 File path: 
[GitHub] [hadoop] anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r327791638
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/container/common/statemachine/commandhandler/CreatePipelineCommandHandler.java
 ##
 @@ -0,0 +1,228 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations under
+ * the License.
+ */
+package org.apache.hadoop.ozone.container.common.statemachine.commandhandler;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+    StorageContainerDatanodeProtocolProtos.CreatePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+    StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+    StorageContainerDatanodeProtocolProtos.CreatePipelineACKProto;
+import org.apache.hadoop.hdds.ratis.RatisHelper;
+import org.apache.hadoop.hdds.scm.ScmConfigKeys;
+import org.apache.hadoop.hdds.scm.client.HddsClientUtils;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.security.x509.SecurityConfig;
+import org.apache.hadoop.hdds.security.x509.certificate.client
+    .CertificateClient;
+import org.apache.hadoop.io.MultipleIOException;
+import org.apache.hadoop.ozone.container.common.statemachine
+    .SCMConnectionManager;
+import org.apache.hadoop.ozone.container.common.statemachine.StateContext;
+import org.apache.hadoop.ozone.container.ozoneimpl.OzoneContainer;
+import org.apache.hadoop.ozone.protocol.commands.CommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommand;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.SCMCommand;
+import org.apache.hadoop.util.Time;
+import org.apache.ratis.client.RaftClient;
+import org.apache.ratis.grpc.GrpcTlsConfig;
+import org.apache.ratis.protocol.NotLeaderException;
+import org.apache.ratis.protocol.RaftClientReply;
+import org.apache.ratis.protocol.RaftGroup;
+import org.apache.ratis.protocol.RaftGroupId;
+import org.apache.ratis.protocol.RaftPeer;
+import org.apache.ratis.retry.RetryPolicy;
+import org.apache.ratis.rpc.SupportedRpcType;
+import org.apache.ratis.util.TimeDuration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import java.util.concurrent.RejectedExecutionException;
+import java.util.concurrent.atomic.AtomicLong;
+import java.util.function.Consumer;
+import java.util.stream.Collectors;
+
+/**
+ * Handler for create pipeline command received from SCM.
+ */
+public class CreatePipelineCommandHandler implements CommandHandler {
+
+  private static final Logger LOG =
+      LoggerFactory.getLogger(CreatePipelineCommandHandler.class);
+
+  private AtomicLong invocationCount = new AtomicLong(0);
+  private long totalTime;
+
+  /**
+   * Constructs a createPipelineCommand handler.
+   */
+  public CreatePipelineCommandHandler() {
+  }
+
+  /**
+   * Handles a given SCM command.
+   *
+   * @param command           - SCM Command
+   * @param ozoneContainer    - Ozone Container.
+   * @param context           - Current Context.
+   * @param connectionManager - The SCMs that we are talking to.
+   */
+  @Override
+  public void handle(SCMCommand command, OzoneContainer ozoneContainer,
+      StateContext context, SCMConnectionManager connectionManager) {
+    invocationCount.incrementAndGet();
+    final long startTime = Time.monotonicNow();
+    final DatanodeDetails dn = context.getParent()
+        .getDatanodeDetails();
+    final CreatePipelineCommandProto createCommand =
+        ((CreatePipelineCommand) command).getProto();
+    final PipelineID pipelineID = PipelineID.getFromProtobuf(
+        createCommand.getPipelineID());
+    final Collection<DatanodeDetails> peers =
+        createCommand.getDatanodeList().stream()
+            .map(DatanodeDetails::getFromProtoBuf)
+            .collect(Collectors.toList());
+
+ 

[GitHub] [hadoop] hadoop-yetus commented on issue #1516: HADOOP-16599. Allow a SignerInitializer to be specified along with a

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1516: HADOOP-16599. Allow a SignerInitializer 
to be specified along with a
URL: https://github.com/apache/hadoop/pull/1516#issuecomment-534696847
 
 
   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|:--------|:-------:|
   | 0 | reexec | 39 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1050 | trunk passed |
   | +1 | compile | 35 | trunk passed |
   | +1 | checkstyle | 27 | trunk passed |
   | +1 | mvnsite | 40 | trunk passed |
   | +1 | shadedclient | 788 | branch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 30 | trunk passed |
   | 0 | spotbugs | 59 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 56 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 34 | the patch passed |
   | +1 | compile | 29 | the patch passed |
   | +1 | javac | 29 | the patch passed |
   | -0 | checkstyle | 20 | hadoop-tools/hadoop-aws: The patch generated 1 new 
+ 10 unchanged - 0 fixed = 11 total (was 10) |
   | +1 | mvnsite | 33 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 769 | patch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 26 | the patch passed |
   | +1 | findbugs | 62 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 69 | hadoop-aws in the patch passed. |
   | +1 | asflicense | 33 | The patch does not generate ASF License warnings. |
   | | | 3241 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1516/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1516 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 18ec9c9f4269 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / afa1006 |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1516/1/artifact/out/diff-checkstyle-hadoop-tools_hadoop-aws.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1516/1/testReport/ |
   | Max. process+thread count | 412 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1516/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Comment Edited] (HADOOP-16543) Cached DNS name resolution error

2019-09-24 Thread Fengnan Li (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16543?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16937082#comment-16937082
 ] 

Fengnan Li edited comment on HADOOP-16543 at 9/24/19 6:30 PM:
--

[~elgoiri] Sorry for coming back late. Right now we use DNS only in 
ResourceManager for the router, so we haven't run into this issue in production yet. 
But the general issue of DNS caching still exists. A mitigation in our 
current strategy is to give each host a DNS alias instead of using its host 
directly, and this at least gets rid of the trouble caused by host replacement.


was (Author: fengnanli):
[~elgoiri] Sorry for coming back late. Right now we use DNS only in 
ResourceManager for router so we haven't run into this issue in production yet. 
But the general issue of DNS caching still exists. A sort of mitigation of our 
current strategy is to give each host a DNS alias instead of using its host 
directly and this at least get rid of host replacement part.

> Cached DNS name resolution error
> 
>
> Key: HADOOP-16543
> URL: https://issues.apache.org/jira/browse/HADOOP-16543
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.2
>Reporter: Roger Liu
>Priority: Major
>
> In Kubernetes, a node may go down and then come back later with a 
> different IP address. Yarn clients which are already running will be unable 
> to rediscover the node after it comes back up due to caching the original IP 
> address. This is problematic for cases such as Spark HA on Kubernetes, as the 
> node containing the resource manager may go down and come back up, meaning 
> existing node managers must then also be restarted.
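The stale-IP behavior described above is aggravated by the JVM's own resolver cache, which by default can hold successful lookups for a long time. A minimal sketch, not from this thread, of the standard knob for bounding that cache (the two security property names are real Java networking properties; the class name is illustrative):

```java
import java.security.Security;

public class DnsCacheTtlExample {
    public static void main(String[] args) {
        // The JVM caches successful DNS lookups for
        // "networkaddress.cache.ttl" seconds; with a security manager the
        // default is "cache forever", which reproduces the stale-IP problem
        // described in this issue after a node comes back with a new address.
        Security.setProperty("networkaddress.cache.ttl", "30");
        // Failed lookups are cached separately under their own TTL.
        Security.setProperty("networkaddress.cache.negative.ttl", "5");

        System.out.println(Security.getProperty("networkaddress.cache.ttl"));
        System.out.println(Security.getProperty("networkaddress.cache.negative.ttl"));
    }
}
```

These properties must be set before the first lookup is performed (or in `java.security`), since already-cached entries are not evicted retroactively.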



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16543) Cached DNS name resolution error

2019-09-24 Thread Fengnan Li (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16543?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16937082#comment-16937082
 ] 

Fengnan Li commented on HADOOP-16543:
-

[~elgoiri] Sorry for coming back late. Right now we use DNS only in 
ResourceManager for the router, so we haven't run into this issue in production yet. 
But the general issue of DNS caching still exists. A mitigation in our 
current strategy is to give each host a DNS alias instead of using its host 
directly, and this at least gets rid of the host replacement part.

> Cached DNS name resolution error
> 
>
> Key: HADOOP-16543
> URL: https://issues.apache.org/jira/browse/HADOOP-16543
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.2
>Reporter: Roger Liu
>Priority: Major
>
> In Kubernetes, a node may go down and then come back later with a 
> different IP address. Yarn clients which are already running will be unable 
> to rediscover the node after it comes back up due to caching the original IP 
> address. This is problematic for cases such as Spark HA on Kubernetes, as the 
> node containing the resource manager may go down and come back up, meaning 
> existing node managers must then also be restarted.






[GitHub] [hadoop] hadoop-yetus commented on issue #1442: HADOOP-16570. S3A committers encounter scale issues

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1442: HADOOP-16570. S3A committers encounter 
scale issues
URL: https://github.com/apache/hadoop/pull/1442#issuecomment-534683787
 
 
   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|:--------|:-------:|
   | 0 | reexec | 72 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 9 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1217 | trunk passed |
   | +1 | compile | 31 | trunk passed |
   | +1 | checkstyle | 24 | trunk passed |
   | +1 | mvnsite | 35 | trunk passed |
   | +1 | shadedclient | 854 | branch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 26 | trunk passed |
   | 0 | spotbugs | 60 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 58 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 34 | the patch passed |
   | +1 | compile | 27 | the patch passed |
   | +1 | javac | 27 | the patch passed |
   | -0 | checkstyle | 19 | hadoop-tools/hadoop-aws: The patch generated 13 new 
+ 44 unchanged - 0 fixed = 57 total (was 44) |
   | +1 | mvnsite | 31 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 920 | patch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 26 | the patch passed |
   | +1 | findbugs | 75 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 82 | hadoop-aws in the patch passed. |
   | +1 | asflicense | 35 | The patch does not generate ASF License warnings. |
   | | | 3674 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/5/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1442 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 4a572df0bc0e 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / afa1006 |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/5/artifact/out/diff-checkstyle-hadoop-tools_hadoop-aws.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/5/testReport/ |
   | Max. process+thread count | 318 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/5/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] anuengineer commented on a change in pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1513: HDDS-2149. Replace 
FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#discussion_r327759427
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/findbugs.sh
 ##
 @@ -16,16 +16,15 @@
 DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
 cd "$DIR/../../.." || exit 1
 
-mvn -B compile -fn findbugs:check -Dfindbugs.failOnError=false  -f 
pom.ozone.xml
+mvn -B compile -fn spotbugs:check -Dspotbugs.failOnError=false -f pom.ozone.xml
 
 Review comment:
   should this be spotbugs:check or spotbugs:spotbugs ?? The spotbugs:check 
does not produce the report 
   
   Please look at the table in this link.
   https://spotbugs.github.io/spotbugs-maven-plugin/plugin-info.html
   
   Just wondering if it makes sense to save the report at all .. ? this is more 
of a question, than informed opinion.
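Per the plugin goal table linked above, `spotbugs:check` only runs the analysis and fails the build on found bugs, while `spotbugs:spotbugs` is the goal that produces the archivable XML report. A hedged sketch of running both (reusing the `-Dspotbugs.failOnError=false` flag from the patch; exact output paths depend on the plugin configuration):

```shell
# 1. Generate the per-module report the CI script could archive
#    (conventionally written to target/spotbugsXml.xml).
mvn -B compile spotbugs:spotbugs -f pom.ozone.xml

# 2. Then gate on the findings without aborting the whole reactor.
mvn -B spotbugs:check -Dspotbugs.failOnError=false -f pom.ozone.xml
```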





[GitHub] [hadoop] hadoop-yetus commented on a change in pull request #1515: HDDS-2171. Dangling links in test report due to incompatible realpath

2019-09-24 Thread GitBox
hadoop-yetus commented on a change in pull request #1515: HDDS-2171. Dangling 
links in test report due to incompatible realpath
URL: https://github.com/apache/hadoop/pull/1515#discussion_r327758756
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/_mvn_unit_report.sh
 ##
 @@ -16,6 +16,15 @@
 
 REPORT_DIR=${REPORT_DIR:-$PWD}
 
+_realpath() {
+  if realpath "$@" > /dev/null; then
+realpath "$@"
+  else
+local relative_to=$(realpath "${1/--relative-to=/}")
 
 Review comment:
   shellcheck:11: warning: Declare and assign separately to avoid masking 
return values. [SC2155]
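The SC2155 warning fires because `local var=$(cmd)` makes `local` the last command, so a failure of the command substitution is silently discarded. A minimal sketch of the declare-then-assign fix (the `demo` function is hypothetical, not the `_realpath` from the patch):

```shell
#!/usr/bin/env bash

demo() {
  # BAD (SC2155): 'local out=$(false)' would succeed even though the
  # substitution failed, because 'local' itself returns 0.

  # GOOD: declare first, assign second; the assignment's exit status
  # now reflects the command substitution and can be acted on.
  local out
  out=$(echo "hello") || return 1
  printf '%s\n' "$out"
}

demo
```

Applied to `_realpath`, the same split (`local relative_to` on one line, the `$(realpath …)` assignment on the next) silences the shellcheck finding without changing behavior.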
   





[GitHub] [hadoop] hadoop-yetus commented on issue #1515: HDDS-2171. Dangling links in test report due to incompatible realpath

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1515: HDDS-2171. Dangling links in test report 
due to incompatible realpath
URL: https://github.com/apache/hadoop/pull/1515#issuecomment-534679061
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|:--------|:-------:|
   | 0 | reexec | 131 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 69 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 34 | hadoop-ozone in trunk failed. |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 932 | branch has no errors when building and testing 
our client artifacts. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 38 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 35 | hadoop-ozone in the patch failed. |
   | +1 | mvnsite | 0 | the patch passed |
   | -1 | shellcheck | 1 | The patch generated 1 new + 1 unchanged - 0 fixed = 
2 total (was 1) |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 725 | patch has no errors when building and testing 
our client artifacts. |
   ||| _ Other Tests _ |
   | -1 | unit | 30 | hadoop-hdds in the patch failed. |
   | -1 | unit | 27 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 36 | The patch does not generate ASF License warnings. |
   | | | 2208 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1515 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs |
   | uname | Linux d302ca002388 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / afa1006 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | shellcheck | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/artifact/out/diff-patch-shellcheck.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/artifact/out/patch-unit-hadoop-hdds.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/testReport/ |
   | Max. process+thread count | 402 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone U: hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1515/1/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] anuengineer commented on issue #1510: HDDS-2170. Add Object IDs and Update ID to Volume Object

2019-09-24 Thread GitBox
anuengineer commented on issue #1510: HDDS-2170. Add Object IDs and Update ID 
to Volume Object
URL: https://github.com/apache/hadoop/pull/1510#issuecomment-534677425
 
 
   Thanks for the reviews, @xiaoyuyao  and @bharatviswa504. Appreciate your 
time and careful thought.





[GitHub] [hadoop] anuengineer merged pull request #1510: HDDS-2170. Add Object IDs and Update ID to Volume Object

2019-09-24 Thread GitBox
anuengineer merged pull request #1510: HDDS-2170. Add Object IDs and Update ID 
to Volume Object
URL: https://github.com/apache/hadoop/pull/1510
 
 
   





[GitHub] [hadoop] sidseth opened a new pull request #1516: HADOOP-16599. Allow a SignerInitializer to be specified along with a

2019-09-24 Thread GitBox
sidseth opened a new pull request #1516: HADOOP-16599. Allow a 
SignerInitializer to be specified along with a
URL: https://github.com/apache/hadoop/pull/1516
 
 
   Patch is missing some unit tests, which will get added soon. Posting early 
to solicit feedback on the interface additions - @steveloughran





[GitHub] [hadoop] steveloughran commented on issue #568: HADOOP-15691 Add PathCapabilities to FS and FC to complement StreamCapabilities

2019-09-24 Thread GitBox
steveloughran commented on issue #568: HADOOP-15691 Add PathCapabilities to FS 
and FC to complement StreamCapabilities
URL: https://github.com/apache/hadoop/pull/568#issuecomment-534671330
 
 
   thanks for the vote; just rebasing and retesting after s/schema/scheme/ in 
the markdown





[GitHub] [hadoop] steveloughran commented on a change in pull request #568: HADOOP-15691 Add PathCapabilities to FS and FC to complement StreamCapabilities

2019-09-24 Thread GitBox
steveloughran commented on a change in pull request #568: HADOOP-15691 Add 
PathCapabilities to FS and FC to complement StreamCapabilities
URL: https://github.com/apache/hadoop/pull/568#discussion_r327747667
 
 

 ##
 File path: 
hadoop-common-project/hadoop-common/src/site/markdown/filesystem/pathcapabilities.md
 ##
 @@ -0,0 +1,158 @@
+
+
+#  interface `PathCapabilities`
+
+The `PathCapabilities` interface provides a way to programmatically query the
+operations offered under a given path by an instance of `FileSystem`, 
`FileContext`
+or other implementing class.
+
+```java
+public interface PathCapabilities {
+  boolean hasPathCapability(Path path, String capability)
+  throws IOException;
+}
+```
+
+There are a number of goals here:
+
+1. Allow callers to probe for optional filesystem operations without actually
+having to invoke them.
+1. Allow filesystems with their own optional per-instance features to declare
+whether or not they are active for the specific instance.
+1. Allow for filesystem connectors which work with object stores to expose the
+fundamental difference in semantics of these stores (e.g.: files not visible
+until closed, file rename being `O(data)`, directory rename being non-atomic,
+etc.).
+
+### Available Capabilities
+
+Capabilities are defined as strings and split into "Common Capabilities"
+and non-standard ones for a specific store.
+
+The common capabilities are all defined under the prefix `fs.capability.`
+
+Consult the javadocs for `org.apache.hadoop.fs.CommonPathCapabilities` for 
these.
+
+
+Individual filesystems MAY offer their own set of capabilities which
+can be probed for. These MUST begin with `fs.` + the filesystem scheme +
+ `.capability`. For example `fs.s3a.capability.select.sql`.
+
+### `boolean hasPathCapability(path, capability)`
+
+Probe for the instance offering a specific capability under the
+given path.
+
+#### Postconditions
+
+```python
+if fs_supports_the_feature(path, capability):
+  return True
+else:
+  return False
+```
+
+Return: `True`, iff the specific capability is available.
+
+A filesystem instance *MUST NOT* return `True` for any capability unless it is
+known to be supported by that specific instance. As a result, if a caller
+probes for a capability then it can assume that the specific feature/semantics
+are available.
+
+If the probe returns `False` then it can mean one of:
+
+1. The capability is unknown.
+1. The capability is known, and known to be unavailable on this instance.
+1. The capability is known but this local class does not know if it is 
supported
+   under the supplied path.
+
+This predicate is intended to be low cost. If it requires remote calls other
+than path/link resolution, it SHOULD conclude that the availability
+of the feature is unknown and return `False`.
+
+The predicate MUST also be side-effect free.
+
+*Validity of paths*
+
+There is no requirement that the existence of the path must be checked;
+the parameter exists so that any filesystem which relays operations to other
+filesystems (e.g. `viewfs`) can resolve and relay it to the nested filesystem.
+Consider the call to be *relatively* lightweight.
+
+Because of this, it may be that while the filesystem declares that
+it supports a capability under a path, the actual invocation of the operation
+may fail for other reasons.
+
+As an example, while a filesystem may support `append()` under a path,
+if invoked on a directory, the call may fail.
+
+That is, for a path `root = new Path("/")`, the capability probe may succeed:
+
+```java
+fs.hasPathCapability(root, "fs.capability.append") == true
+```
+
+But a subsequent call to the operation on that specific path may fail,
+because the root path is a directory:
+
+```java
+fs.append(root)
+```
+
+
+Similarly, there is no checking that the caller has the permission to
+perform a specific operation: just because a feature is available on that
+path does not mean that the caller can execute the operation.
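
A caller therefore treats a `true` probe as necessary but not sufficient, and still handles failure of the operation itself. A self-contained sketch of that pattern follows; the `Fs` interface and helper are illustrative, not Hadoop's own API.

```java
import java.io.IOException;

// Sketch of the caller-side pattern: probe first, but still be
// prepared for the operation itself to fail. Illustrative types only.
interface Fs {
  boolean hasPathCapability(String path, String capability) throws IOException;
  void append(String path) throws IOException;
}

class ProbeThenCall {
  // Returns true only when the append actually happened.
  static boolean tryAppend(Fs fs, String path) throws IOException {
    if (!fs.hasPathCapability(path, "fs.capability.append")) {
      return false;               // feature absent: take a fallback path
    }
    try {
      fs.append(path);            // may still fail, e.g. on a directory
      return true;
    } catch (IOException rejected) {
      return false;               // capability present but call rejected
    }
  }
}
```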
 
 Review comment:
   not just that -it means that we can often avoid any round trip at all


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] steveloughran commented on a change in pull request #568: HADOOP-15691 Add PathCapabilities to FS and FC to complement StreamCapabilities

2019-09-24 Thread GitBox
steveloughran commented on a change in pull request #568: HADOOP-15691 Add 
PathCapabilities to FS and FC to complement StreamCapabilities
URL: https://github.com/apache/hadoop/pull/568#discussion_r327744896
 
 

 ##
 File path: 
hadoop-common-project/hadoop-common/src/site/markdown/filesystem/pathcapabilities.md
 ##
 @@ -0,0 +1,158 @@
+
+
+#  interface `PathCapabilities`
+
+The `PathCapabilities` interface provides a way to programmatically query the
+operations offered under a given path by an instance of `FileSystem`, `FileContext`
+or other implementing class.
+
+```java
+public interface PathCapabilities {
+  boolean hasPathCapability(Path path, String capability)
+  throws IOException;
+}
+```
+
+There are a number of goals here:
+
+1. Allow callers to probe for optional filesystem operations without actually
+having to invoke them.
+1. Allow filesystems with their own optional per-instance features to declare
+whether or not they are active for the specific instance.
+1. Allow for filesystem connectors which work with object stores to expose the
+fundamental differences in semantics of these stores (e.g.: files not visible
+until closed, file rename being `O(data)`, directory rename being non-atomic,
+etc.).
+
+### Available Capabilities
+
+Capabilities are defined as strings and split into "Common Capabilities"
+and non-standard ones for a specific store.
+
+The common capabilities are all defined under the prefix `fs.capability.`
+
+Consult the javadocs for `org.apache.hadoop.fs.CommonPathCapabilities` for these.
+
+
+Individual filesystems MAY offer their own set of capabilities which
+can be probed for. These MUST begin with `fs.` + the filesystem schema +
 
 Review comment:
   scheme -will fix





[GitHub] [hadoop] adoroszlai commented on issue #1515: HDDS-2171. Dangling links in test report due to incompatible realpath

2019-09-24 Thread GitBox
adoroszlai commented on issue #1515: HDDS-2171. Dangling links in test report 
due to incompatible realpath
URL: https://github.com/apache/hadoop/pull/1515#issuecomment-534663314
 
 
   /label ozone





[GitHub] [hadoop] adoroszlai opened a new pull request #1515: HDDS-2171. Dangling links in test report due to incompatible realpath

2019-09-24 Thread GitBox
adoroszlai opened a new pull request #1515: HDDS-2171. Dangling links in test 
report due to incompatible realpath
URL: https://github.com/apache/hadoop/pull/1515
 
 
   ## What changes were proposed in this pull request?
   
   Workaround for BusyBox's simplistic `realpath` command, which doesn't support the `--relative-to` option.
   
   https://issues.apache.org/jira/browse/HDDS-2171
   
   ## How was this patch tested?
   
   Created local `realpath` implementation with behavior similar to BusyBox's, 
and ran `_mvn_unit_report.sh`.  Also tried it with regular `realpath`.
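
For context, the missing feature can be emulated with POSIX tools for the common case where the target lives under the base. This is a hypothetical sketch of such a workaround, not the actual patch; the `relative_to` function name is invented for illustration.

```shell
# Illustrative only: emulate `realpath --relative-to=BASE TARGET` for
# targets under BASE, using only BusyBox-compatible tools.
relative_to() {
  base=$(realpath "$1") || return 1
  target=$(realpath "$2") || return 1
  case "$target" in
    "$base"/*) printf '%s\n' "${target#"$base"/}" ;;  # strip the base prefix
    "$base")   printf '.\n' ;;                        # target is the base itself
    *)         printf '%s\n' "$target" ;;             # not under base: absolute
  esac
}
```

The general case (targets outside the base, requiring `..` components) needs more work, which is why the full `--relative-to` option exists in GNU coreutils.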





[GitHub] [hadoop] virajith commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs while mounting external store

2019-09-24 Thread GitBox
virajith commented on a change in pull request #1478: HDFS-14856 Fetch file 
ACLs while mounting external store
URL: https://github.com/apache/hadoop/pull/1478#discussion_r327735706
 
 

 ##
 File path: 
hadoop-tools/hadoop-fs2img/src/main/java/org/apache/hadoop/hdfs/server/namenode/TreePath.java
 ##
 @@ -159,13 +180,17 @@ INode toFile(UGIResolver ugi, BlockResolver blk,
 
   INode toDirectory(UGIResolver ugi) {
 final FileStatus s = getFileStatus();
-ugi.addUser(s.getOwner());
-ugi.addGroup(s.getGroup());
+final AclStatus aclStatus = getAclStatus();
+long permissions = ugi.getPermissionsProto(s, aclStatus);
 INodeDirectory.Builder b = INodeDirectory.newBuilder()
 .setModificationTime(s.getModificationTime())
 .setNsQuota(DEFAULT_NAMESPACE_QUOTA)
 .setDsQuota(DEFAULT_STORAGE_SPACE_QUOTA)
-.setPermission(ugi.resolve(s));
+.setPermission(permissions);
+if (aclStatus != null) {
+  throw new UnsupportedOperationException(
 
 Review comment:
  Can you make this explicit (e.g., add javadoc for this method)?





[GitHub] [hadoop] steveloughran commented on issue #1442: HADOOP-16570. S3A committers leak threads on job/task commit.

2019-09-24 Thread GitBox
steveloughran commented on issue #1442: HADOOP-16570. S3A committers leak 
threads on job/task commit.
URL: https://github.com/apache/hadoop/pull/1442#issuecomment-534658407
 
 
   tested -s3a ireland w/ddb. not yet tested: all the way through spark





[GitHub] [hadoop] hadoop-yetus removed a comment on issue #1442: HADOOP-16570. S3A committers leak threads on job/task commit.

2019-09-24 Thread GitBox
hadoop-yetus removed a comment on issue #1442: HADOOP-16570. S3A committers 
leak threads on job/task commit.
URL: https://github.com/apache/hadoop/pull/1442#issuecomment-533697723
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 83 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 4 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1310 | trunk passed |
   | +1 | compile | 35 | trunk passed |
   | +1 | checkstyle | 28 | trunk passed |
   | +1 | mvnsite | 41 | trunk passed |
   | -1 | shadedclient | 109 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 29 | trunk passed |
   | 0 | spotbugs | 71 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 68 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 38 | the patch passed |
   | +1 | compile | 32 | the patch passed |
   | +1 | javac | 32 | the patch passed |
   | -0 | checkstyle | 22 | hadoop-tools/hadoop-aws: The patch generated 10 new 
+ 40 unchanged - 0 fixed = 50 total (was 40) |
   | +1 | mvnsite | 34 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 32 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 25 | the patch passed |
   | +1 | findbugs | 77 | the patch passed |
   ||| _ Other Tests _ |
   | -1 | unit | 72 | hadoop-aws in the patch failed. |
   | +1 | asflicense | 24 | The patch does not generate ASF License warnings. |
   | | | 2125 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.fs.s3a.commit.staging.TestStagingCommitter |
   |   | hadoop.fs.s3a.commit.staging.TestStagingPartitionedJobCommit |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1442 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux b164b6d10a2f 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 3f223be |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/artifact/out/diff-checkstyle-hadoop-tools_hadoop-aws.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/artifact/out/patch-unit-hadoop-tools_hadoop-aws.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/testReport/ |
   | Max. process+thread count | 327 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] elek closed pull request #1507: HDDS-2167. Hadoop31-mr acceptance test is failing due to the shading

2019-09-24 Thread GitBox
elek closed pull request #1507: HDDS-2167. Hadoop31-mr acceptance test is 
failing due to the shading
URL: https://github.com/apache/hadoop/pull/1507
 
 
   





[GitHub] [hadoop] elek commented on issue #1507: HDDS-2167. Hadoop31-mr acceptance test is failing due to the shading

2019-09-24 Thread GitBox
elek commented on issue #1507: HDDS-2167. Hadoop31-mr acceptance test is 
failing due to the shading
URL: https://github.com/apache/hadoop/pull/1507#issuecomment-534623116
 
 
   Merged to the trunk. Thanks the review @arp7 





[GitHub] [hadoop] hadoop-yetus commented on issue #1499: HDDS-1738. Add nullable annotation for OMResponse classes.

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1499: HDDS-1738. Add nullable annotation for 
OMResponse classes.
URL: https://github.com/apache/hadoop/pull/1499#issuecomment-534609623
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 47 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 35 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 29 | hadoop-ozone in trunk failed. |
   | -1 | compile | 22 | hadoop-hdds in trunk failed. |
   | -1 | compile | 15 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 62 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 850 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 22 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 17 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 944 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 30 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 19 | hadoop-ozone in trunk failed. |
   | -0 | patch | 979 | Used diff version of patch file. Binary files and 
potentially other changes not applied. Please rebase and squash commits if 
necessary. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 34 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 29 | hadoop-ozone in the patch failed. |
   | -1 | compile | 23 | hadoop-hdds in the patch failed. |
   | -1 | compile | 19 | hadoop-ozone in the patch failed. |
   | -1 | javac | 23 | hadoop-hdds in the patch failed. |
   | -1 | javac | 19 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 29 | hadoop-ozone: The patch generated 6 new + 0 
unchanged - 0 fixed = 6 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 789 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 22 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 20 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 30 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 21 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 29 | hadoop-hdds in the patch failed. |
   | -1 | unit | 23 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 33 | The patch does not generate ASF License warnings. |
   | | | 2410 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1499 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux e46c74a77441 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 51c64b3 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1499/2/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | compile | 

[GitHub] [hadoop] hadoop-yetus commented on issue #1514: HDDS-2072. Make StorageContainerLocationProtocolService message based

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1514: HDDS-2072. Make 
StorageContainerLocationProtocolService message based
URL: https://github.com/apache/hadoop/pull/1514#issuecomment-534607525
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 38 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ HDDS-2067 Compile Tests _ |
   | 0 | mvndep | 26 | Maven dependency ordering for branch |
   | -1 | mvninstall | 30 | hadoop-hdds in HDDS-2067 failed. |
   | -1 | mvninstall | 24 | hadoop-ozone in HDDS-2067 failed. |
   | -1 | compile | 21 | hadoop-hdds in HDDS-2067 failed. |
   | -1 | compile | 15 | hadoop-ozone in HDDS-2067 failed. |
   | +1 | checkstyle | 48 | HDDS-2067 passed |
   | +1 | mvnsite | 0 | HDDS-2067 passed |
   | +1 | shadedclient | 849 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 19 | hadoop-hdds in HDDS-2067 failed. |
   | -1 | javadoc | 14 | hadoop-ozone in HDDS-2067 failed. |
   | 0 | spotbugs | 932 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 26 | hadoop-hdds in HDDS-2067 failed. |
   | -1 | findbugs | 20 | hadoop-ozone in HDDS-2067 failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 32 | Maven dependency ordering for patch |
   | -1 | mvninstall | 35 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 32 | hadoop-ozone in the patch failed. |
   | -1 | compile | 30 | hadoop-hdds in the patch failed. |
   | -1 | compile | 19 | hadoop-ozone in the patch failed. |
   | -1 | cc | 30 | hadoop-hdds in the patch failed. |
   | -1 | cc | 19 | hadoop-ozone in the patch failed. |
   | -1 | javac | 30 | hadoop-hdds in the patch failed. |
   | -1 | javac | 19 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 29 | hadoop-hdds: The patch generated 0 new + 0 
unchanged - 1 fixed = 0 total (was 1) |
   | +1 | checkstyle | 31 | The patch passed checkstyle in hadoop-ozone |
   | +1 | mvnsite | 1 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 690 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 21 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 20 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 31 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 20 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 29 | hadoop-hdds in the patch failed. |
   | -1 | unit | 22 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 32 | The patch does not generate ASF License warnings. |
   | | | 2345 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1514 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle cc |
   | uname | Linux 6b010f356e16 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | HDDS-2067 / e4d4fca |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1514/1/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 

[GitHub] [hadoop] elek opened a new pull request #1514: HDDS-2072. Make StorageContainerLocationProtocolService message based

2019-09-24 Thread GitBox
elek opened a new pull request #1514: HDDS-2072. Make 
StorageContainerLocationProtocolService message based
URL: https://github.com/apache/hadoop/pull/1514
 
 
   It depends on HDDS-2067





[GitHub] [hadoop] anuengineer commented on a change in pull request #1510: HDDS-2170. Add Object IDs and Update ID to Volume Object

2019-09-24 Thread GitBox
anuengineer commented on a change in pull request #1510: HDDS-2170. Add Object 
IDs and Update ID to Volume Object
URL: https://github.com/apache/hadoop/pull/1510#discussion_r327595783
 
 

 ##
 File path: 
hadoop-ozone/common/src/main/java/org/apache/hadoop/ozone/om/helpers/OmVolumeArgs.java
 ##
 @@ -188,6 +237,29 @@ public int hashCode() {
 private long quotaInBytes;
 private Map metadata;
 private OmOzoneAclMap aclMap;
+private long objectID;
+private long updateID;
+
+/**
+ * Sets the Object ID for this Object.
+ * Object ID are unique and immutable identifier for each object in the
+ * System.
+ * @param objectID - long
+ */
+public void setObjectID(long objectID) {
+  this.objectID = objectID;
 
 Review comment:
   Object ID represents a unique Identity for an object.  Update ID represents 
a token that is useful to detect if an object has been changed after a read has 
been performed. For example, with update IDs I can cache an object and make 
changes, and when I apply I can detect if an object has been updated by any 
other thread.
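
The scheme described in the comment, detecting a concurrent change by comparing a previously read update ID against the current one, can be sketched in a few lines. This is an illustrative class, not the Ozone code.

```java
// Toy sketch of the update-ID pattern: objectId is a permanent identity,
// updateId is a change token for optimistic concurrency detection.
class Versioned {
  final long objectId;      // unique, immutable identity of the object
  volatile long updateId;   // bumped on every successful modification

  Versioned(long objectId) {
    this.objectId = objectId;
  }

  // Apply a change only if nobody modified the object since our read;
  // returns false when the cached view turned out to be stale.
  synchronized boolean applyIfUnchanged(long expectedUpdateId, Runnable change) {
    if (updateId != expectedUpdateId) {
      return false;         // another writer got there first
    }
    change.run();
    updateId++;             // record this modification
    return true;
  }
}
```

A reader caches `updateId` alongside the object; at apply time, a mismatch means another thread updated the object and the cached view must be refreshed.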





[jira] [Updated] (HADOOP-16601) Add support for hardware crc32 of nativetask checksums on aarch64 arch

2019-09-24 Thread MacChen01 (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MacChen01 updated HADOOP-16601:
---
Attachment: HADOOP-16601.patch

> Add support for hardware crc32 of nativetask checksums on aarch64 arch
> --
>
> Key: HADOOP-16601
> URL: https://issues.apache.org/jira/browse/HADOOP-16601
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: common
>Affects Versions: 3.2.1
>Reporter: MacChen01
>Priority: Major
>  Labels: performance
> Fix For: site
>
> Attachments: HADOOP-16601.patch
>
>
> Add support for aarch64 CRC instructions in nativetask module, optimize the 
> CRC32 and CRC32C.
> Use the benchmark tools : nttest , the improvement is quite substantial:  
> *CRC32 Zlib polynomial 0x04C11DB7*
> |KeyValueType-IO|Before(MB/s)|After(MB/s)|Improvement|
> |TextType-Write|425.98|602.92|+42%|
> |TextType-Read|796.06|1716.59|+116%|
> |BytesType-Write|474.25|686.84|+45%|
> |BytesType-Read|844.96|1955.03|+131%|
> |UnknownType-Write|434.84|608.81|+40%|
> |UnknownType-Read|805.76|1733.82|+115%|
>  
>   
>  *CRC32C  Castagnoli polynomial 0x1EDC6F41*
>  
> |KeyValueType-IO|Before(MB/s)|After(MB/s)|Improvement|
> |TextType-Write|423.39|606.55|+43%|
> |TextType-Read|799.20|1783.28|+123%|
> |BytesType-Write|473.95|696.47|+47%|
> |BytesType-Read|846.30|2018.06|+138%|
> |UnknownType-Write|434.07|612.31|+41%|
> |UnknownType-Read|807.16|1783.95|+121%|



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16601) Add support for hardware crc32 of nativetask checksums on aarch64 arch

2019-09-24 Thread MacChen01 (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MacChen01 updated HADOOP-16601:
---
Description: 
Add support for aarch64 CRC instructions in nativetask module, optimize the 
CRC32 and CRC32C.

Use the benchmark tools : nttest , the improvement is quite substantial:  

*CRC32 Zlib polynomial 0x04C11DB7*
|KeyValueType-IO|Before(MB/s)|After(MB/s)|Improvement|
|TextType-Write|425.98|602.92|+42%|
|TextType-Read|796.06|1716.59|+116%|
|BytesType-Write|474.25|686.84|+45%|
|BytesType-Read|844.96|1955.03|+131%|
|UnknownType-Write|434.84|608.81|+40%|
|UnknownType-Read|805.76|1733.82|+115%|

 
  
 *CRC32C  Castagnoli polynomial 0x1EDC6F41*
 
|KeyValueType-IO|Before(MB/s)|After(MB/s)|Improvement|
|TextType-Write|423.39|606.55|+43%|
|TextType-Read|799.20|1783.28|+123%|
|BytesType-Write|473.95|696.47|+47%|
|BytesType-Read|846.30|2018.06|+138%|
|UnknownType-Write|434.07|612.31|+41%|
|UnknownType-Read|807.16|1783.95|+121%|

  was:
Add support for aarch64 CRC instructions in nativetask module, optimize the 
CRC32 and CRC32C.

Use the benchmark tools : nttest , the improvement is quite substantial:  

*CRC32 Zlib polynomial 0x04C11DB7*
|KeyValueType|IO|before(M/s)|after(M/s)|improvement|
|TextType|Write|425.98|602.92|+42%|
|Read|796.06|1716.59|+116%|
|BytesType|Write|474.25|686.84|+45%|
|Read|844.96|1955.03|+131%|
|UnknownType|Write|434.84|608.81|+40%|
|Read|805.76|1733.82|+115%|
 
 
*CRC32C  Castagnoli polynomial 0x1EDC6F41*
|KeyValueType|IO|before(M/s)|after(M/s)|improvement|
|TextType|Write|423.39|606.55|+43%|
|Read|799.20|1783.28|+123%|
|BytesType|Write|473.95|696.47|+47%|
|Read|846.30|2018.06|+138%|
|UnknownType|Write|434.07|612.31|+41%|
|Read|807.16|1783.95|+121%|


> Add support for hardware crc32 of nativetask checksums on aarch64 arch
> --
>
> Key: HADOOP-16601
> URL: https://issues.apache.org/jira/browse/HADOOP-16601
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: common
>Affects Versions: 3.2.1
>Reporter: MacChen01
>Priority: Major
>  Labels: performance
> Fix For: site
>
>
> Add support for aarch64 CRC instructions in nativetask module, optimize the 
> CRC32 and CRC32C.
> Use the benchmark tools : nttest , the improvement is quite substantial:  
> *CRC32 Zlib polynomial 0x04C11DB7*
> |KeyValueType-IO|Before(MB/s)|After(MB/s)|Improvement|
> |TextType-Write|425.98|602.92|+42%|
> |TextType-Read|796.06|1716.59|+116%|
> |BytesType-Write|474.25|686.84|+45%|
> |BytesType-Read|844.96|1955.03|+131%|
> |UnknownType-Write|434.84|608.81|+40%|
> |UnknownType-Read|805.76|1733.82|+115%|
>  
>   
>  *CRC32C  Castagnoli polynomial 0x1EDC6F41*
>  
> |KeyValueType-IO|Before(MB/s)|After(MB/s)|Improvement|
> |TextType-Write|423.39|606.55|+43%|
> |TextType-Read|799.20|1783.28|+123%|
> |BytesType-Write|473.95|696.47|+47%|
> |BytesType-Read|846.30|2018.06|+138%|
> |UnknownType-Write|434.07|612.31|+41%|
> |UnknownType-Read|807.16|1783.95|+121%|






[jira] [Updated] (HADOOP-16601) Add support for hardware crc32 of nativetask checksums on aarch64 arch

2019-09-24 Thread MacChen01 (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MacChen01 updated HADOOP-16601:
---
Description: 
Add support for aarch64 CRC instructions in nativetask module, optimize the 
CRC32 and CRC32C.

Use the benchmark tools : nttest , the improvement is quite substantial:  

*CRC32 Zlib polynomial 0x04C11DB7*
|KeyValueType|IO|before(M/s)|after(M/s)|improvement|
|TextType|Write|425.98|602.92|+42%|
|TextType|Read|796.06|1716.59|+116%|
|BytesType|Write|474.25|686.84|+45%|
|BytesType|Read|844.96|1955.03|+131%|
|UnknownType|Write|434.84|608.81|+40%|
|UnknownType|Read|805.76|1733.82|+115%|
 
 
*CRC32C  Castagnoli polynomial 0x1EDC6F41*
|KeyValueType|IO|before(M/s)|after(M/s)|improvement|
|TextType|Write|423.39|606.55|+43%|
|TextType|Read|799.20|1783.28|+123%|
|BytesType|Write|473.95|696.47|+47%|
|BytesType|Read|846.30|2018.06|+138%|
|UnknownType|Write|434.07|612.31|+41%|
|UnknownType|Read|807.16|1783.95|+121%|

  was:
Add support for aarch64 CRC instructions in the nativetask module to optimize 
CRC32 and CRC32C.

Using the benchmark tool nttest, the improvement is quite substantial:
 
|ChecksumType|KeyValueType|IO|before(M/s)|after(M/s)|improvement|
|CRC32|TextType|Write|425.98|602.92|42%|
|CRC32|TextType|Read|796.06|1716.59|116%|
|CRC32|BytesType|Write|474.25|686.84|45%|
|CRC32|BytesType|Read|844.96|1955.03|131%|
|CRC32|UnknownType|Write|434.84|608.81|40%|
|CRC32|UnknownType|Read|805.76|1733.82|115%|
|CRC32C|TextType|Write|423.39|606.55|43%|
|CRC32C|TextType|Read|799.20|1783.28|123%|
|CRC32C|BytesType|Write|473.95|696.47|47%|
|CRC32C|BytesType|Read|846.30|2018.06|138%|
|CRC32C|UnknownType|Write|434.07|612.31|41%|
|CRC32C|UnknownType|Read|807.16|1783.95|121%|


> Add support for hardware crc32 of nativetask checksums on aarch64 arch
> --
>
> Key: HADOOP-16601
> URL: https://issues.apache.org/jira/browse/HADOOP-16601
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: common
>Affects Versions: 3.2.1
>Reporter: MacChen01
>Priority: Major
>  Labels: performance
> Fix For: site
>
>
> Add support for aarch64 CRC instructions in the nativetask module to 
> optimize CRC32 and CRC32C.
> Using the benchmark tool nttest, the improvement is quite substantial:
> *CRC32 Zlib polynomial 0x04C11DB7*
> |KeyValueType|IO|before(M/s)|after(M/s)|improvement|
> |TextType|Write|425.98|602.92|+42%|
> |TextType|Read|796.06|1716.59|+116%|
> |BytesType|Write|474.25|686.84|+45%|
> |BytesType|Read|844.96|1955.03|+131%|
> |UnknownType|Write|434.84|608.81|+40%|
> |UnknownType|Read|805.76|1733.82|+115%|
>  
>  
> *CRC32C  Castagnoli polynomial 0x1EDC6F41*
> |KeyValueType|IO|before(M/s)|after(M/s)|improvement|
> |TextType|Write|423.39|606.55|+43%|
> |TextType|Read|799.20|1783.28|+123%|
> |BytesType|Write|473.95|696.47|+47%|
> |BytesType|Read|846.30|2018.06|+138%|
> |UnknownType|Write|434.07|612.31|+41%|
> |UnknownType|Read|807.16|1783.95|+121%|






[jira] [Updated] (HADOOP-16601) Add support for hardware crc32 of nativetask checksums on aarch64 arch

2019-09-24 Thread MacChen01 (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MacChen01 updated HADOOP-16601:
---
Description: 
Add support for aarch64 CRC instructions in the nativetask module to optimize 
CRC32 and CRC32C.

Using the benchmark tool nttest, the improvement is quite substantial:
 
|ChecksumType|KeyValueType|IO|before(M/s)|after(M/s)|improvement|
|CRC32|TextType|Write|425.98|602.92|42%|
|CRC32|TextType|Read|796.06|1716.59|116%|
|CRC32|BytesType|Write|474.25|686.84|45%|
|CRC32|BytesType|Read|844.96|1955.03|131%|
|CRC32|UnknownType|Write|434.84|608.81|40%|
|CRC32|UnknownType|Read|805.76|1733.82|115%|
|CRC32C|TextType|Write|423.39|606.55|43%|
|CRC32C|TextType|Read|799.20|1783.28|123%|
|CRC32C|BytesType|Write|473.95|696.47|47%|
|CRC32C|BytesType|Read|846.30|2018.06|138%|
|CRC32C|UnknownType|Write|434.07|612.31|41%|
|CRC32C|UnknownType|Read|807.16|1783.95|121%|

  was:
Add support for aarch64 CRC instructions in the nativetask module to optimize 
CRC32 and CRC32C.

Using the benchmark tool nttest, the improvement is quite substantial:
|ChecksumType|KeyValueType|IO|Before(M/s)|After(M/s)|Improvement|
|CRC32|TextType|Write|425.98|602.92|+42%|
|CRC32|TextType|Read|796.06|1716.59|+116%|
|CRC32|BytesType|Write|474.25|686.84|+45%|
|CRC32|BytesType|Read|844.96|1955.03|+131%|
|CRC32|UnknownType|Write|434.84|608.81|+40%|
|CRC32|UnknownType|Read|805.76|1733.82|+115%|
|CRC32C|TextType|Write|423.39|606.55|+43%|
|CRC32C|TextType|Read|799.20|1783.28|+123%|
|CRC32C|BytesType|Write|473.95|696.47|+47%|
|CRC32C|BytesType|Read|846.30|2018.06|+138%|
|CRC32C|UnknownType|Write|434.07|612.31|+41%|
|CRC32C|UnknownType|Read|807.16|1783.95|+121%|


> Add support for hardware crc32 of nativetask checksums on aarch64 arch
> --
>
> Key: HADOOP-16601
> URL: https://issues.apache.org/jira/browse/HADOOP-16601
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: common
>Affects Versions: 3.2.1
>Reporter: MacChen01
>Priority: Major
>  Labels: performance
> Fix For: site
>
>
> Add support for aarch64 CRC instructions in the nativetask module to 
> optimize CRC32 and CRC32C.
> Using the benchmark tool nttest, the improvement is quite substantial:
>  
> |ChecksumType|KeyValueType|IO|before(M/s)|after(M/s)|improvement|
> |CRC32|TextType|Write|425.98|602.92|42%|
> |CRC32|TextType|Read|796.06|1716.59|116%|
> |CRC32|BytesType|Write|474.25|686.84|45%|
> |CRC32|BytesType|Read|844.96|1955.03|131%|
> |CRC32|UnknownType|Write|434.84|608.81|40%|
> |CRC32|UnknownType|Read|805.76|1733.82|115%|
> |CRC32C|TextType|Write|423.39|606.55|43%|
> |CRC32C|TextType|Read|799.20|1783.28|123%|
> |CRC32C|BytesType|Write|473.95|696.47|47%|
> |CRC32C|BytesType|Read|846.30|2018.06|138%|
> |CRC32C|UnknownType|Write|434.07|612.31|41%|
> |CRC32C|UnknownType|Read|807.16|1783.95|121%|






[jira] [Updated] (HADOOP-16600) StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1

2019-09-24 Thread Lisheng Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lisheng Sun updated HADOOP-16600:
-
Attachment: HADOOP-16600.branch-3.1.v1.patch

> StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1
> -
>
> Key: HADOOP-16600
> URL: https://issues.apache.org/jira/browse/HADOOP-16600
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.0, 3.1.1, 3.1.2
>Reporter: Lisheng Sun
>Priority: Major
> Attachments: HADOOP-16600.branch-3.1.v1.patch
>
>
> For details, see HADOOP-15398.
> Problem: Hadoop trunk compilation is failing.
> Root cause: the compilation error comes from 
> org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
> getArgumentAt(int, Class) is undefined for the type InvocationOnMock".
> StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
> available in mockito-all 1.8.5; it is available only from version 
> 2.0.0-beta, as in the following code:
> {code:java}
> InitiateMultipartUploadRequest req = invocation.getArgumentAt(
> 0, InitiateMultipartUploadRequest.class);
> {code}
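The quoted snippet only compiles against Mockito 2.0.0-beta or later, where InvocationOnMock gained a typed argument accessor (later 2.x releases renamed it getArgument(int)). To illustrate what the missing method does — index into the recorded arguments and cast to the expected type — here is a self-contained sketch; the Invocation interface below is a hypothetical stand-in, not Mockito's real class:

```java
public class TypedArgumentDemo {
    /** Illustrative stand-in for Mockito's InvocationOnMock (not the real API). */
    interface Invocation {
        Object[] getArguments();

        /** What Mockito 2.x's getArgument(int) does: index into the
         *  recorded arguments and cast to the caller's expected type. */
        @SuppressWarnings("unchecked")
        default <T> T getArgument(int index) {
            return (T) getArguments()[index];
        }
    }

    public static void main(String[] args) {
        Invocation invocation = () -> new Object[] {"upload-request", 42};
        // Under mockito-all 1.8.5 only getArguments() exists, forcing an
        // explicit cast at every call site; 2.x folds the cast into the
        // accessor, which is why the test base fails to compile on 1.8.5.
        String req = invocation.getArgument(0);
        Integer part = invocation.getArgument(1);
        System.out.println(req + ":" + part); // prints upload-request:42
    }
}
```

On branch-3.1 the practical options are pinning a Mockito 2.x artifact or rewriting the call sites against the 1.8.5 API.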






[jira] [Created] (HADOOP-16601) Add support for hardware crc32 of nativetask checksums on aarch64 arch

2019-09-24 Thread MacChen01 (Jira)
MacChen01 created HADOOP-16601:
--

 Summary: Add support for hardware crc32 of nativetask checksums on 
aarch64 arch
 Key: HADOOP-16601
 URL: https://issues.apache.org/jira/browse/HADOOP-16601
 Project: Hadoop Common
  Issue Type: Improvement
  Components: common
Affects Versions: 3.2.1
Reporter: MacChen01
 Fix For: site


Add support for aarch64 CRC instructions in the nativetask module to optimize 
CRC32 and CRC32C.

Using the benchmark tool nttest, the improvement is quite substantial:
|ChecksumType|KeyValueType|IO|Before(M/s)|After(M/s)|Improvement|
|CRC32|TextType|Write|425.98|602.92|+42%|
|CRC32|TextType|Read|796.06|1716.59|+116%|
|CRC32|BytesType|Write|474.25|686.84|+45%|
|CRC32|BytesType|Read|844.96|1955.03|+131%|
|CRC32|UnknownType|Write|434.84|608.81|+40%|
|CRC32|UnknownType|Read|805.76|1733.82|+115%|
|CRC32C|TextType|Write|423.39|606.55|+43%|
|CRC32C|TextType|Read|799.20|1783.28|+123%|
|CRC32C|BytesType|Write|473.95|696.47|+47%|
|CRC32C|BytesType|Read|846.30|2018.06|+138%|
|CRC32C|UnknownType|Write|434.07|612.31|+41%|
|CRC32C|UnknownType|Read|807.16|1783.95|+121%|






[GitHub] [hadoop] steveloughran commented on a change in pull request #568: HADOOP-15691 Add PathCapabilities to FS and FC to complement StreamCapabilities

2019-09-24 Thread GitBox
steveloughran commented on a change in pull request #568: HADOOP-15691 Add 
PathCapabilities to FS and FC to complement StreamCapabilities
URL: https://github.com/apache/hadoop/pull/568#discussion_r327565927
 
 

 ##
 File path: 
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/AbstractFileSystem.java
 ##
 @@ -1371,4 +1373,16 @@ public boolean equals(Object other) {
 new CompletableFuture<>(), () -> open(path, bufferSize));
   }
 
+  public boolean hasPathCapability(final Path path,
+  final String capability)
 
 Review comment:
   > Was curious about examples where different paths in the same FS would have 
different capabilities.
   
  Files in HDFS encryption zones behave differently; viewfs relays things, and 
any DFS whose mount points may have different semantics can do it. Oh, and WASB 
has an option for a special path where leases need to be acquired before 
renames - HBase needs that.
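The per-path probe under review (hasPathCapability, HADOOP-15691) can be illustrated with a self-contained sketch. The mount table, paths, and capability names below are hypothetical; a real filesystem would resolve capabilities through its own mount or encryption-zone metadata rather than a static map:

```java
import java.util.Map;
import java.util.Set;

public class PathCapabilityDemo {
    // Hypothetical mount table: each mount prefix advertises its own
    // capabilities, mirroring how encryption zones or viewfs mount points
    // can give different paths in one FS different semantics.
    static final Map<String, Set<String>> CAPS_BY_MOUNT = Map.of(
            "/secure", Set.of("fs.capability.encryption.zone"),
            "/wasb",   Set.of("fs.capability.lease.before.rename"));

    /** True iff some mount prefix of the path advertises the capability. */
    static boolean hasPathCapability(String path, String capability) {
        return CAPS_BY_MOUNT.entrySet().stream()
                .filter(e -> path.startsWith(e.getKey()))
                .anyMatch(e -> e.getValue().contains(capability));
    }

    public static void main(String[] args) {
        System.out.println(hasPathCapability(
                "/secure/data", "fs.capability.encryption.zone")); // true
        System.out.println(hasPathCapability(
                "/plain/data", "fs.capability.encryption.zone"));  // false
    }
}
```

Callers probe before relying on a behavior (here, whether a path sits in an encryption zone) instead of hard-coding per-scheme assumptions.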


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16600) StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1

2019-09-24 Thread Lisheng Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lisheng Sun updated HADOOP-16600:
-
Description: 
For details, see HADOOP-15398.
Problem: Hadoop trunk compilation is failing.
Root cause: the compilation error comes from 
org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
getArgumentAt(int, Class) is undefined for the type InvocationOnMock".

StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
available in mockito-all 1.8.5; it is available only from version 2.0.0-beta.


{code:java}
InitiateMultipartUploadRequest req = invocation.getArgumentAt(
0, InitiateMultipartUploadRequest.class);
{code}


  was:
For details, see HADOOP-15398.
Problem: Hadoop trunk compilation is failing.
Root cause: the compilation error comes from 
org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
getArgumentAt(int, Class) is undefined for the type InvocationOnMock".

StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
available in mockito-all 1.8.5; it is available only from version 2.0.0-beta.


> StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1
> -
>
> Key: HADOOP-16600
> URL: https://issues.apache.org/jira/browse/HADOOP-16600
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.0, 3.1.1, 3.1.2
>Reporter: Lisheng Sun
>Priority: Major
>
> For details, see HADOOP-15398.
> Problem: Hadoop trunk compilation is failing.
> Root cause: the compilation error comes from 
> org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
> getArgumentAt(int, Class) is undefined for the type InvocationOnMock".
> StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
> available in mockito-all 1.8.5; it is available only from version 2.0.0-beta.
> {code:java}
> InitiateMultipartUploadRequest req = invocation.getArgumentAt(
> 0, InitiateMultipartUploadRequest.class);
> {code}






[jira] [Updated] (HADOOP-16600) StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1

2019-09-24 Thread Lisheng Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lisheng Sun updated HADOOP-16600:
-
Description: 
For details, see HADOOP-15398.
Problem: Hadoop trunk compilation is failing.
Root cause: the compilation error comes from 
org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
getArgumentAt(int, Class) is undefined for the type InvocationOnMock".

StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
available in mockito-all 1.8.5; it is available only from version 
2.0.0-beta, as in the following code:
{code:java}
InitiateMultipartUploadRequest req = invocation.getArgumentAt(
0, InitiateMultipartUploadRequest.class);
{code}


  was:
For details, see HADOOP-15398.
Problem: Hadoop trunk compilation is failing.
Root cause: the compilation error comes from 
org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
getArgumentAt(int, Class) is undefined for the type InvocationOnMock".

StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
available in mockito-all 1.8.5; it is available only from version 2.0.0-beta.


{code:java}
InitiateMultipartUploadRequest req = invocation.getArgumentAt(
0, InitiateMultipartUploadRequest.class);
{code}



> StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1
> -
>
> Key: HADOOP-16600
> URL: https://issues.apache.org/jira/browse/HADOOP-16600
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.0, 3.1.1, 3.1.2
>Reporter: Lisheng Sun
>Priority: Major
>
> For details, see HADOOP-15398.
> Problem: Hadoop trunk compilation is failing.
> Root cause: the compilation error comes from 
> org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
> getArgumentAt(int, Class) is undefined for the type InvocationOnMock".
> StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
> available in mockito-all 1.8.5; it is available only from version 
> 2.0.0-beta, as in the following code:
> {code:java}
> InitiateMultipartUploadRequest req = invocation.getArgumentAt(
> 0, InitiateMultipartUploadRequest.class);
> {code}






[jira] [Resolved] (HADOOP-16600) StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1

2019-09-24 Thread Lisheng Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lisheng Sun resolved HADOOP-16600.
--
Resolution: Duplicate

> StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1
> -
>
> Key: HADOOP-16600
> URL: https://issues.apache.org/jira/browse/HADOOP-16600
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.0, 3.1.1, 3.1.2
>Reporter: Lisheng Sun
>Priority: Major
>
> For details, see HADOOP-15398.
> Problem: Hadoop trunk compilation is failing.
> Root cause: the compilation error comes from 
> org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
> getArgumentAt(int, Class) is undefined for the type InvocationOnMock".
> StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
> available in mockito-all 1.8.5; it is available only from version 2.0.0-beta.






[jira] [Reopened] (HADOOP-16600) StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1

2019-09-24 Thread Lisheng Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lisheng Sun reopened HADOOP-16600:
--

> StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1
> -
>
> Key: HADOOP-16600
> URL: https://issues.apache.org/jira/browse/HADOOP-16600
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.0, 3.1.1, 3.1.2
>Reporter: Lisheng Sun
>Priority: Major
>
> For details, see HADOOP-15398.
> Problem: Hadoop trunk compilation is failing.
> Root cause: the compilation error comes from 
> org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
> getArgumentAt(int, Class) is undefined for the type InvocationOnMock".
> StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
> available in mockito-all 1.8.5; it is available only from version 2.0.0-beta.






[jira] [Updated] (HADOOP-16600) StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1

2019-09-24 Thread Lisheng Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lisheng Sun updated HADOOP-16600:
-
Description: 
For details, see HADOOP-15398.
Problem: Hadoop trunk compilation is failing.
Root cause: the compilation error comes from 
org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
getArgumentAt(int, Class) is undefined for the type InvocationOnMock".

StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
available in mockito-all 1.8.5; it is available only from version 2.0.0-beta.

> StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1
> -
>
> Key: HADOOP-16600
> URL: https://issues.apache.org/jira/browse/HADOOP-16600
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.1.0, 3.1.1, 3.1.2
>Reporter: Lisheng Sun
>Priority: Major
>
> For details, see HADOOP-15398.
> Problem: Hadoop trunk compilation is failing.
> Root cause: the compilation error comes from 
> org.apache.hadoop.fs.s3a.commit.staging.StagingTestBase: "The method 
> getArgumentAt(int, Class) is undefined for the type InvocationOnMock".
> StagingTestBase uses the getArgumentAt(int, Class) method, which is not 
> available in mockito-all 1.8.5; it is available only from version 2.0.0-beta.






[jira] [Created] (HADOOP-16600) StagingTestBase uses methods not available in Mockito 1.8.5 in branch-3.1

2019-09-24 Thread Lisheng Sun (Jira)
Lisheng Sun created HADOOP-16600:


 Summary: StagingTestBase uses methods not available in Mockito 
1.8.5 in branch-3.1
 Key: HADOOP-16600
 URL: https://issues.apache.org/jira/browse/HADOOP-16600
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 3.1.2, 3.1.1, 3.1.0
Reporter: Lisheng Sun









[GitHub] [hadoop] hadoop-yetus commented on issue #1501: HDDS-2067. Create generic service facade with tracing/metrics/logging support

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1501: HDDS-2067. Create generic service facade 
with tracing/metrics/logging support
URL: https://github.com/apache/hadoop/pull/1501#issuecomment-534516212
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 45 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 79 | Maven dependency ordering for branch |
   | -1 | mvninstall | 59 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 35 | hadoop-ozone in trunk failed. |
   | -1 | compile | 20 | hadoop-hdds in trunk failed. |
   | -1 | compile | 15 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 69 | trunk passed |
   | +1 | mvnsite | 1 | trunk passed |
   | +1 | shadedclient | 1115 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 24 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 20 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 1219 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 37 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 19 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 29 | Maven dependency ordering for patch |
   | -1 | mvninstall | 35 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 28 | hadoop-ozone in the patch failed. |
   | -1 | compile | 22 | hadoop-hdds in the patch failed. |
   | -1 | compile | 16 | hadoop-ozone in the patch failed. |
   | -1 | javac | 22 | hadoop-hdds in the patch failed. |
   | -1 | javac | 16 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 29 | hadoop-hdds: The patch generated 1 new + 0 
unchanged - 0 fixed = 1 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 925 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 23 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 22 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 33 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 21 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 27 | hadoop-hdds in the patch failed. |
   | -1 | unit | 22 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 30 | The patch does not generate ASF License warnings. |
   | | | 2950 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1501 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux ffd4595ba58a 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 51c64b3 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/3/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 

[GitHub] [hadoop] hadoop-yetus commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534514299
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 0 | Docker mode activated. |
   | -1 | patch | 11 | https://github.com/apache/hadoop/pull/1513 does not 
apply to trunk. Rebase required? Wrong Branch? See 
https://wiki.apache.org/hadoop/HowToContribute for help. |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1513/1/console |
   | versions | git=2.17.1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] hadoop-yetus commented on issue #1505: HDDS-2166. Some RPC metrics are missing from SCM prometheus endpoint

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1505: HDDS-2166. Some RPC metrics are missing 
from SCM prometheus endpoint
URL: https://github.com/apache/hadoop/pull/1505#issuecomment-534512941
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 38 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 29 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 30 | hadoop-ozone in trunk failed. |
   | -1 | compile | 21 | hadoop-hdds in trunk failed. |
   | -1 | compile | 14 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 60 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 859 | branch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 23 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 20 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 964 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 34 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 21 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 35 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 32 | hadoop-ozone in the patch failed. |
   | -1 | compile | 25 | hadoop-hdds in the patch failed. |
   | -1 | compile | 19 | hadoop-ozone in the patch failed. |
   | -1 | javac | 25 | hadoop-hdds in the patch failed. |
   | -1 | javac | 19 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 57 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 697 | patch has no errors when building and testing 
our client artifacts. |
   | -1 | javadoc | 23 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 18 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 31 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 16 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 27 | hadoop-hdds in the patch failed. |
   | -1 | unit | 20 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 34 | The patch does not generate ASF License warnings. |
   | | | 2301 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1505 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux e88a1b57f239 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 51c64b3 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/branch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/branch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/branch-javadoc-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/branch-findbugs-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/patch-mvninstall-hadoop-hdds.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/patch-compile-hadoop-hdds.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javadoc | 

[GitHub] [hadoop] hadoop-yetus commented on issue #1504: HDDS-2068. Make StorageContainerDatanodeProtocolService message based

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1504: HDDS-2068. Make 
StorageContainerDatanodeProtocolService message based
URL: https://github.com/apache/hadoop/pull/1504#issuecomment-534513053
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 41 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test files. |
   ||| _ HDDS-2067 Compile Tests _ |
   | 0 | mvndep | 48 | Maven dependency ordering for branch |
   | -1 | mvninstall | 31 | hadoop-hdds in HDDS-2067 failed. |
   | -1 | mvninstall | 24 | hadoop-ozone in HDDS-2067 failed. |
   | -1 | compile | 21 | hadoop-hdds in HDDS-2067 failed. |
   | -1 | compile | 14 | hadoop-ozone in HDDS-2067 failed. |
   | +1 | checkstyle | 50 | HDDS-2067 passed |
   | +1 | mvnsite | 0 | HDDS-2067 passed |
   | +1 | shadedclient | 833 | branch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 23 | hadoop-hdds in HDDS-2067 failed. |
   | -1 | javadoc | 20 | hadoop-ozone in HDDS-2067 failed. |
   | 0 | spotbugs | 936 | Used deprecated FindBugs config; considering switching to SpotBugs. |
   | -1 | findbugs | 34 | hadoop-hdds in HDDS-2067 failed. |
   | -1 | findbugs | 21 | hadoop-ozone in HDDS-2067 failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 31 | Maven dependency ordering for patch |
   | -1 | mvninstall | 36 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 31 | hadoop-ozone in the patch failed. |
   | -1 | compile | 23 | hadoop-hdds in the patch failed. |
   | -1 | compile | 18 | hadoop-ozone in the patch failed. |
   | -1 | cc | 23 | hadoop-hdds in the patch failed. |
   | -1 | cc | 18 | hadoop-ozone in the patch failed. |
   | -1 | javac | 23 | hadoop-hdds in the patch failed. |
   | -1 | javac | 18 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 58 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 682 | patch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 20 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 20 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 29 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 20 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 27 | hadoop-hdds in the patch failed. |
   | -1 | unit | 24 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 32 | The patch does not generate ASF License warnings. |
   | | | 2355 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1504 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle cc |
   | uname | Linux 038d92767dd8 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | HDDS-2067 / e4d4fca |
   | Default Java | 1.8.0_222 |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/branch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/branch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/branch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/branch-compile-hadoop-ozone.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/branch-javadoc-hadoop-hdds.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/branch-javadoc-hadoop-ozone.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/branch-findbugs-hadoop-hdds.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/branch-findbugs-hadoop-ozone.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/patch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/patch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1504/2/artifact/out/patch-compile-hadoop-hdds.txt |
   | compile | 

[GitHub] [hadoop] adoroszlai commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
adoroszlai commented on issue #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513#issuecomment-534512731
 
 
   /label ozone


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] adoroszlai opened a new pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs

2019-09-24 Thread GitBox
adoroszlai opened a new pull request #1513: HDDS-2149. Replace FindBugs with SpotBugs
URL: https://github.com/apache/hadoop/pull/1513
 
 
   ## What changes were proposed in this pull request?
   
   Replace FindBugs with [SpotBugs](https://spotbugs.github.io), as FindBugs is no longer maintained.
   
   https://issues.apache.org/jira/browse/HDDS-2149
   
   ## How was this patch tested?
   
   ```
   $ mvn -f pom.ozone.xml clean
   $ hadoop-ozone/dev-support/checks/findbugs.sh
   ...
   [INFO] BUILD SUCCESS
   ```
   
   Also verified that it catches violations introduced temporarily:
   
   ```
   ...
   [INFO] Build failures were ignored.
   H C Eq: org.apache.hadoop.hdds.HddsUtils.equals(Object) always returns false  At HddsUtils.java:[line 98]
   M D RV: Return value of Object.toString() ignored, but method has no side effect  At HddsUtils.java:[line 94]
   ```
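   For reference, the two reported bug patterns correspond to code shaped roughly like the following. This is a hypothetical `BugExamples` class illustrating the warning categories, not the actual code that was temporarily added to `HddsUtils`:

   ```java
   // Sketch of the two SpotBugs patterns from the report above:
   //   Eq - equals(Object) that always returns false
   //   RV - return value of toString() ignored with no side effect
   public class BugExamples {
     @Override
     public boolean equals(Object obj) {
       return false; // Eq: no instance is ever equal, not even to itself
     }

     @Override
     public int hashCode() {
       return 0; // kept consistent with equals to avoid a separate HE warning
     }

     static void ignoreToString(Object o) {
       o.toString(); // RV: result discarded, call has no side effect
     }

     public static void main(String[] args) {
       BugExamples a = new BugExamples();
       System.out.println(a.equals(a)); // prints false
       ignoreToString(a);
     }
   }
   ```

   Both patterns compile cleanly, which is exactly why a static analyzer such as SpotBugs is needed to flag them.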





[GitHub] [hadoop] hadoop-yetus commented on issue #1501: HDDS-2067. Create generic service facade with tracing/metrics/logging support

2019-09-24 Thread GitBox
hadoop-yetus commented on issue #1501: HDDS-2067. Create generic service facade with tracing/metrics/logging support
URL: https://github.com/apache/hadoop/pull/1501#issuecomment-534504404
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 37 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 63 | Maven dependency ordering for branch |
   | -1 | mvninstall | 31 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 25 | hadoop-ozone in trunk failed. |
   | -1 | compile | 21 | hadoop-hdds in trunk failed. |
   | -1 | compile | 15 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 49 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 845 | branch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 21 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 20 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 942 | Used deprecated FindBugs config; considering switching to SpotBugs. |
   | -1 | findbugs | 31 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 20 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 29 | Maven dependency ordering for patch |
   | -1 | mvninstall | 34 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 30 | hadoop-ozone in the patch failed. |
   | -1 | compile | 24 | hadoop-hdds in the patch failed. |
   | -1 | compile | 19 | hadoop-ozone in the patch failed. |
   | -1 | javac | 24 | hadoop-hdds in the patch failed. |
   | -1 | javac | 19 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 56 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 721 | patch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 21 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 20 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 31 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 19 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 29 | hadoop-hdds in the patch failed. |
   | -1 | unit | 23 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 33 | The patch does not generate ASF License warnings. |
   | | | 2409 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1501 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 160ecd00f83e 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 8f1a135 |
   | Default Java | 1.8.0_222 |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/branch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/branch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/branch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/branch-compile-hadoop-ozone.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/branch-javadoc-hadoop-hdds.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/branch-javadoc-hadoop-ozone.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/branch-findbugs-hadoop-hdds.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/branch-findbugs-hadoop-ozone.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/patch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/patch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/patch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/patch-compile-hadoop-ozone.txt |
   | javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-1501/2/artifact/out/patch-compile-hadoop-hdds.txt |
   | 

[GitHub] [hadoop] n-marion commented on issue #1484: [HADOOP-16590] - LoginModule Classes and Principal Classes deprecated in IBM Java.

2019-09-24 Thread GitBox
n-marion commented on issue #1484: [HADOOP-16590] - LoginModule Classes and Principal Classes deprecated in IBM Java.
URL: https://github.com/apache/hadoop/pull/1484#issuecomment-534495801
 
 
   Is there any insight into what errors occurred during the build? I didn't have any. As for tests, I tried to find existing tests for this code path, but didn't find any. However, I have tested this fix inside our own build of Hadoop for Apache Spark, and it resolved the `unable to login` error.




