[GitHub] [hadoop] tasanuma merged pull request #4156: HDFS-16457. Make fs.getspaceused.classname reconfigurable (apache#4069)

2022-04-10 Thread GitBox


tasanuma merged PR #4156:
URL: https://github.com/apache/hadoop/pull/4156
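For context on the key named in the title: fs.getspaceused.classname selects the implementation used to measure per-volume space usage, and this change (per the title) makes the key reconfigurable on the DataNode at runtime. Below is a minimal, purely illustrative Java sketch of setting the key statically; the sketch is an assumption for illustration, not the patch itself.

```java
// Illustrative only: sets the configuration key named in the PR title.
import org.apache.hadoop.conf.Configuration;

public class SpaceUsedConfigExample {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // Select the class used to measure space usage, e.g. the du-based
    // org.apache.hadoop.fs.DU implementation.
    conf.set("fs.getspaceused.classname", "org.apache.hadoop.fs.DU");
    System.out.println(conf.get("fs.getspaceused.classname"));
  }
}
```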





[jira] [Resolved] (HADOOP-18178) Upgrade jackson to 2.13.2 and jackson-databind to 2.13.2.2

2022-04-10 Thread Akira Ajisaka (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-18178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Akira Ajisaka resolved HADOOP-18178.

Fix Version/s: 3.3.3
   Resolution: Fixed

Backported to branch-3.3.

> Upgrade jackson to 2.13.2 and jackson-databind to 2.13.2.2
> --
>
> Key: HADOOP-18178
> URL: https://issues.apache.org/jira/browse/HADOOP-18178
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: PJ Fanning
>Assignee: PJ Fanning
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0, 3.3.3
>
>  Time Spent: 3h
>  Remaining Estimate: 0h
>
> https://github.com/FasterXML/jackson-databind/issues/2816






[jira] [Work logged] (HADOOP-18178) Upgrade jackson to 2.13.2 and jackson-databind to 2.13.2.2

2022-04-10 Thread ASF GitHub Bot (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-18178?focusedWorklogId=755096&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-755096 ]

ASF GitHub Bot logged work on HADOOP-18178:
---

Author: ASF GitHub Bot
Created on: 11/Apr/22 05:58
Start Date: 11/Apr/22 05:58
Worklog Time Spent: 10m 
  Work Description: aajisaka merged PR #4147:
URL: https://github.com/apache/hadoop/pull/4147




Issue Time Tracking
---

Worklog Id: (was: 755096)
Time Spent: 3h  (was: 2h 50m)

> Upgrade jackson to 2.13.2 and jackson-databind to 2.13.2.2
> --
>
> Key: HADOOP-18178
> URL: https://issues.apache.org/jira/browse/HADOOP-18178
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: PJ Fanning
>Assignee: PJ Fanning
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>  Time Spent: 3h
>  Remaining Estimate: 0h
>
> https://github.com/FasterXML/jackson-databind/issues/2816






[GitHub] [hadoop] aajisaka merged pull request #4147: HADOOP-18178. Upgrade jackson to 2.13.2 and jackson-databind to 2.13.…

2022-04-10 Thread GitBox


aajisaka merged PR #4147:
URL: https://github.com/apache/hadoop/pull/4147





[jira] [Work logged] (HADOOP-18178) Upgrade jackson to 2.13.2 and jackson-databind to 2.13.2.2

2022-04-10 Thread ASF GitHub Bot (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-18178?focusedWorklogId=755095&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-755095 ]

ASF GitHub Bot logged work on HADOOP-18178:
---

Author: ASF GitHub Bot
Created on: 11/Apr/22 05:56
Start Date: 11/Apr/22 05:56
Worklog Time Spent: 10m 
  Work Description: aajisaka commented on PR #4147:
URL: https://github.com/apache/hadoop/pull/4147#issuecomment-1094580681

   Failed tests
   
   ### Ran successfully on my local machine
   hadoop.yarn.server.nodemanager.amrmproxy.TestFederationInterceptor
   hadoop.hdfs.server.balancer.TestBalancerWithHANameNodes
   hadoop.hdfs.TestDecommissionWithStriped
   
   ### Reproduced even without the patch
   hadoop.yarn.applications.distributedshell.TestDistributedShell
   hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer -> Filed HDFS-16536
   hadoop.hdfs.server.federation.router.TestRouterRpcMultiDestination
   hadoop.hdfs.server.federation.router.TestRouterRpc
   
   




Issue Time Tracking
---

Worklog Id: (was: 755095)
Time Spent: 2h 50m  (was: 2h 40m)

> Upgrade jackson to 2.13.2 and jackson-databind to 2.13.2.2
> --
>
> Key: HADOOP-18178
> URL: https://issues.apache.org/jira/browse/HADOOP-18178
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: PJ Fanning
>Assignee: PJ Fanning
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>  Time Spent: 2h 50m
>  Remaining Estimate: 0h
>
> https://github.com/FasterXML/jackson-databind/issues/2816






[GitHub] [hadoop] aajisaka commented on pull request #4147: HADOOP-18178. Upgrade jackson to 2.13.2 and jackson-databind to 2.13.…

2022-04-10 Thread GitBox


aajisaka commented on PR #4147:
URL: https://github.com/apache/hadoop/pull/4147#issuecomment-1094580681

   Failed tests
   
   ### Ran successfully on my local machine
   hadoop.yarn.server.nodemanager.amrmproxy.TestFederationInterceptor
   hadoop.hdfs.server.balancer.TestBalancerWithHANameNodes
   hadoop.hdfs.TestDecommissionWithStriped
   
   ### Reproduced even without the patch
   hadoop.yarn.applications.distributedshell.TestDistributedShell
   hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer -> Filed HDFS-16536
   hadoop.hdfs.server.federation.router.TestRouterRpcMultiDestination
   hadoop.hdfs.server.federation.router.TestRouterRpc
   
   





[GitHub] [hadoop] hadoop-yetus commented on pull request #4158: HDFS-16535. SlotReleaser should reuse the domain socket based on socket paths

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #4158:
URL: https://github.com/apache/hadoop/pull/4158#issuecomment-1094567208

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 39s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  39m  2s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   1m  2s |  |  trunk passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  compile  |   0m 54s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 36s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   1m  1s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 50s |  |  trunk passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 39s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   2m 46s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  22m 15s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 50s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 53s |  |  the patch passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javac  |   0m 53s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 46s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 46s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 19s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 50s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 34s |  |  the patch passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 31s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   2m 30s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  21m 47s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 24s |  |  hadoop-hdfs-client in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 37s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 100m 52s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4158/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4158 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell |
   | uname | Linux 1d56d1eec633 4.15.0-169-generic #177-Ubuntu SMP Thu Feb 3 
10:50:38 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 45f926da74533ed348d007df2ee2a042357ccf0a |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4158/1/testReport/ |
   | Max. process+thread count | 546 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-client U: 
hadoop-hdfs-project/hadoop-hdfs-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4158/1/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   



[GitHub] [hadoop] hadoop-yetus commented on pull request #4157: HDFS-16474. Make HDFS tail tool cross platform

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #4157:
URL: https://github.com/apache/hadoop/pull/4157#issuecomment-1094556606

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 58s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 5 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  24m 42s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   4m  9s |  |  trunk passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  compile  |   4m  8s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  mvnsite  |   0m 25s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  56m 38s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 14s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   4m  6s |  |  the patch passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  cc  |   4m  6s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   4m  6s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   4m  6s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   4m  1s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  cc  |   4m  1s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   4m  1s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   4m  1s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 16s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  22m 42s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  88m 12s |  |  hadoop-hdfs-native-client in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 29s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 179m 58s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4157 |
   | Optional Tests | dupname asflicense compile cc mvnsite javac unit 
codespell golang |
   | uname | Linux 04de66e00b85 4.15.0-162-generic #170-Ubuntu SMP Mon Oct 18 
11:38:05 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 0cc0e067973923efc901d76f24a739d06b0ed4b7 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/testReport/ |
   | Max. process+thread count | 522 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: 
hadoop-hdfs-project/hadoop-hdfs-native-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/console |
   | versions | git=2.25.1 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[jira] [Work logged] (HADOOP-18088) Replace log4j 1.x with reload4j

2022-04-10 Thread ASF GitHub Bot (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-18088?focusedWorklogId=755079&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-755079 ]

ASF GitHub Bot logged work on HADOOP-18088:
---

Author: ASF GitHub Bot
Created on: 11/Apr/22 03:59
Start Date: 11/Apr/22 03:59
Worklog Time Spent: 10m 
  Work Description: iwasakims commented on PR #4151:
URL: https://github.com/apache/hadoop/pull/4151#issuecomment-1094517723

   I found no issues during manual testing on my local security-enabled pseudo-distributed cluster, including KMS and HttpFS.




Issue Time Tracking
---

Worklog Id: (was: 755079)
Time Spent: 7h  (was: 6h 50m)

> Replace log4j 1.x with reload4j
> ---
>
> Key: HADOOP-18088
> URL: https://issues.apache.org/jira/browse/HADOOP-18088
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Wei-Chiu Chuang
>Assignee: Wei-Chiu Chuang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.2.4, 3.3.3
>
>  Time Spent: 7h
>  Remaining Estimate: 0h
>
> As proposed in the dev mailing list 
> (https://lists.apache.org/thread/fdzkv80mzkf3w74z9120l0k0rc3v7kqk) let's 
> replace log4j 1 with reload4j in the maintenance releases (i.e. 3.3.x, 3.2.x 
> and 2.10.x)
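For readers unfamiliar with reload4j: it is a maintained fork of log4j 1.2.x that keeps the org.apache.log4j package names, so the swap described above is a dependency change rather than a source change. A minimal sketch (illustrative class name) of caller code that continues to compile and run unchanged after the swap:

```java
// Same log4j 1.x API; after the dependency swap it is served by reload4j.
import org.apache.log4j.Logger;

public class Log4jCompatExample {
  private static final Logger LOG = Logger.getLogger(Log4jCompatExample.class);

  public static void main(String[] args) {
    LOG.info("Hello from the unchanged org.apache.log4j API");
  }
}
```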






[GitHub] [hadoop] iwasakims commented on pull request #4151: HADOOP-18088. Replace log4j 1.x with reload4j.

2022-04-10 Thread GitBox


iwasakims commented on PR #4151:
URL: https://github.com/apache/hadoop/pull/4151#issuecomment-1094517723

   I found no issues during manual testing on my local security-enabled pseudo-distributed cluster, including KMS and HttpFS.





[GitHub] [hadoop] cndaimin commented on pull request #4078: HDFS-16510. Fix EC decommission when rack is not enough

2022-04-10 Thread GitBox


cndaimin commented on PR #4078:
URL: https://github.com/apache/hadoop/pull/4078#issuecomment-1094513479

   @Jing9 @jojochuang @sodonnel @ayushtkn Do you have time to take a look at this? I think it's a problem we need to fix in EC. Thanks a lot.





[GitHub] [hadoop] stiga-huang opened a new pull request, #4158: HDFS-16535. SlotReleaser should reuse the domain socket based on socket paths

2022-04-10 Thread GitBox


stiga-huang opened a new pull request, #4158:
URL: https://github.com/apache/hadoop/pull/4158

   
   
   ### Description of PR
   HDFS-13639 improves the performance of short-circuit shm slot releasing by reusing the domain socket that the client previously used to send the release request to the DataNode.
   
   This works well when only one DataNode is co-located with the client (true in most production environments). However, if we launch multiple DataNodes on a machine (usually for testing, e.g. Impala's end-to-end tests), the release request can be sent to the wrong DataNode. See IMPALA-11234 for an example.
   
   We should only reuse the domain socket when it corresponds to the same socket path.
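   A minimal, hypothetical sketch of that reuse rule (the class and method names below are invented for illustration; this is not the actual patch, which works with Hadoop's domain-socket and short-circuit cache classes):
   
   ```java
   // Cache the release socket per DataNode domain-socket path and reuse it only
   // when the path matches, so a release request cannot go to a different
   // DataNode that happens to run on the same machine.
   import java.util.Map;
   import java.util.concurrent.ConcurrentHashMap;
   
   class SlotReleaserSocketCacheSketch {
     /** Placeholder for a connected UNIX domain socket. */
     static final class DomainSocketHandle {
       final String path;
       DomainSocketHandle(String path) { this.path = path; }
     }
   
     // One cached release socket per DataNode domain-socket path.
     private final Map<String, DomainSocketHandle> socketsByPath = new ConcurrentHashMap<>();
   
     /** Reuse the cached socket only when it was opened for the same socket path. */
     DomainSocketHandle getOrConnect(String dataNodeSocketPath) {
       return socketsByPath.computeIfAbsent(dataNodeSocketPath, DomainSocketHandle::new);
     }
   }
   ```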
   
   ### How was this patch tested?
   
   I deployed the fix in my Impala minicluster with multiple DataNodes launched on my machine and verified that the error logs for slot release failures disappeared.
   
   ### For code changes:
   
   - [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?





[GitHub] [hadoop] hadoop-yetus commented on pull request #4157: HDFS-16474. Make HDFS tail tool cross platform

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #4157:
URL: https://github.com/apache/hadoop/pull/4157#issuecomment-1094469578

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  15m 19s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 5 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  32m 51s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   3m 31s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 25s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  68m 14s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 14s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   3m 24s |  |  the patch passed  |
   | +1 :green_heart: |  cc  |   3m 24s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   3m 24s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   3m 24s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 20s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  31m 29s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  92m  7s |  |  hadoop-hdfs-native-client in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 29s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 214m  1s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4157 |
   | Optional Tests | dupname asflicense compile cc mvnsite javac unit 
codespell golang |
   | uname | Linux 2eca3456a886 4.15.0-162-generic #170-Ubuntu SMP Mon Oct 18 
11:38:05 UTC 2021 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 0cc0e067973923efc901d76f24a739d06b0ed4b7 |
   | Default Java | Debian-11.0.14+9-post-Debian-1deb10u1 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/testReport/ |
   | Max. process+thread count | 614 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: 
hadoop-hdfs-project/hadoop-hdfs-native-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/console |
   | versions | git=2.20.1 maven=3.6.0 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] singer-bin commented on pull request #4156: HDFS-16457. Make fs.getspaceused.classname reconfigurable (apache#4069)

2022-04-10 Thread GitBox


singer-bin commented on PR #4156:
URL: https://github.com/apache/hadoop/pull/4156#issuecomment-1094459124

   apache#4069 has been submitted to branch-3.3. Please review; the error doesn't seem to be related to my code. @tasanuma





[GitHub] [hadoop] liubingxing commented on a diff in pull request #4032: HDFS-16484. [SPS]: Fix an infinite loop bug in SPSPathIdProcessor thread

2022-04-10 Thread GitBox


liubingxing commented on code in PR #4032:
URL: https://github.com/apache/hadoop/pull/4032#discussion_r846883348


##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/namenode/sps/BlockStorageMovementNeeded.java:
##
@@ -232,6 +233,7 @@ public synchronized void clearQueuesWithNotification() {
 public void run() {
   LOG.info("Starting SPSPathIdProcessor!.");
   Long startINode = null;
+  int retryCount = 0;

Review Comment:
   Thanks @tasanuma 
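   For context, a hypothetical sketch of the bounded-retry idea behind the added counter (names and structure are invented for illustration; this is not the actual HDFS-16484 patch):
   
   ```java
   // Bound how many times the same path id is retried so the SPS path-processing
   // thread cannot spin forever on a single stuck entry.
   class SpsRetrySketch {
     private static final int MAX_RETRY = 3;
   
     /** Process one pending path id, giving up after MAX_RETRY failed attempts. */
     void processWithBoundedRetry(long pathId) {
       int retryCount = 0;
       while (!tryProcess(pathId)) {
         if (++retryCount >= MAX_RETRY) {
           // Drop the entry instead of retrying indefinitely.
           break;
         }
       }
     }
   
     private boolean tryProcess(long pathId) {
       // Placeholder for the real work of scheduling block storage movement.
       return true;
     }
   }
   ```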






[jira] [Work logged] (HADOOP-18088) Replace log4j 1.x with reload4j

2022-04-10 Thread ASF GitHub Bot (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-18088?focusedWorklogId=755065&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-755065 ]

ASF GitHub Bot logged work on HADOOP-18088:
---

Author: ASF GitHub Bot
Created on: 11/Apr/22 00:35
Start Date: 11/Apr/22 00:35
Worklog Time Spent: 10m 
  Work Description: iwasakims commented on PR #4151:
URL: https://github.com/apache/hadoop/pull/4151#issuecomment-1094426560

   Test failures related to wasb are reproducible even without the patch. I could not reproduce the other failures on my local machine.




Issue Time Tracking
---

Worklog Id: (was: 755065)
Time Spent: 6h 50m  (was: 6h 40m)

> Replace log4j 1.x with reload4j
> ---
>
> Key: HADOOP-18088
> URL: https://issues.apache.org/jira/browse/HADOOP-18088
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Wei-Chiu Chuang
>Assignee: Wei-Chiu Chuang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.2.4, 3.3.3
>
>  Time Spent: 6h 50m
>  Remaining Estimate: 0h
>
> As proposed in the dev mailing list 
> (https://lists.apache.org/thread/fdzkv80mzkf3w74z9120l0k0rc3v7kqk) let's 
> replace log4j 1 with reload4j in the maintenance releases (i.e. 3.3.x, 3.2.x 
> and 2.10.x)






[GitHub] [hadoop] iwasakims commented on pull request #4151: HADOOP-18088. Replace log4j 1.x with reload4j.

2022-04-10 Thread GitBox


iwasakims commented on PR #4151:
URL: https://github.com/apache/hadoop/pull/4151#issuecomment-1094426560

   Test failures related to wasb are reproducible even without the patch. I could not reproduce the other failures on my local machine.





[jira] [Work logged] (HADOOP-18088) Replace log4j 1.x with reload4j

2022-04-10 Thread ASF GitHub Bot (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-18088?focusedWorklogId=755062&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-755062 ]

ASF GitHub Bot logged work on HADOOP-18088:
---

Author: ASF GitHub Bot
Created on: 11/Apr/22 00:07
Start Date: 11/Apr/22 00:07
Worklog Time Spent: 10m 
  Work Description: iwasakims commented on PR #4151:
URL: https://github.com/apache/hadoop/pull/4151#issuecomment-1094418162

   The failure of TestClassUtil is relevant. I missed including that change when backporting.




Issue Time Tracking
---

Worklog Id: (was: 755062)
Time Spent: 6h 40m  (was: 6.5h)

> Replace log4j 1.x with reload4j
> ---
>
> Key: HADOOP-18088
> URL: https://issues.apache.org/jira/browse/HADOOP-18088
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Wei-Chiu Chuang
>Assignee: Wei-Chiu Chuang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.2.4, 3.3.3
>
>  Time Spent: 6h 40m
>  Remaining Estimate: 0h
>
> As proposed in the dev mailing list 
> (https://lists.apache.org/thread/fdzkv80mzkf3w74z9120l0k0rc3v7kqk) let's 
> replace log4j 1 with reload4j in the maintenance releases (i.e. 3.3.x, 3.2.x 
> and 2.10.x)






[GitHub] [hadoop] iwasakims commented on pull request #4151: HADOOP-18088. Replace log4j 1.x with reload4j.

2022-04-10 Thread GitBox


iwasakims commented on PR #4151:
URL: https://github.com/apache/hadoop/pull/4151#issuecomment-1094418162

   The failure of TestClassUtil is relevant. I missed including that change when backporting.





[GitHub] [hadoop] hadoop-yetus commented on pull request #4157: HDFS-16474. Make HDFS tail tool cross platform

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #4157:
URL: https://github.com/apache/hadoop/pull/4157#issuecomment-1094388605

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  25m 49s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 5 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  25m 12s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   3m 50s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 37s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  52m 44s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   3m 41s |  |  the patch passed  |
   | +1 :green_heart: |  cc  |   3m 41s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   3m 41s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   3m 41s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 22s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  22m 37s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  94m 23s |  |  hadoop-hdfs-native-client in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 38s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 203m  6s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4157 |
   | Optional Tests | dupname asflicense compile cc mvnsite javac unit 
codespell golang |
   | uname | Linux b196e434c4e4 4.15.0-162-generic #170-Ubuntu SMP Mon Oct 18 
11:38:05 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 0cc0e067973923efc901d76f24a739d06b0ed4b7 |
   | Default Java | Red Hat, Inc.-1.8.0_312-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/testReport/ |
   | Max. process+thread count | 523 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: 
hadoop-hdfs-project/hadoop-hdfs-native-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/console |
   | versions | git=2.27.0 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[jira] [Work logged] (HADOOP-17843) Support IPV6 with IP for internal and external communication

2022-04-10 Thread ASF GitHub Bot (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-17843?focusedWorklogId=755052&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-755052 ]

ASF GitHub Bot logged work on HADOOP-17843:
---

Author: ASF GitHub Bot
Created on: 10/Apr/22 21:45
Start Date: 10/Apr/22 21:45
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #3290:
URL: https://github.com/apache/hadoop/pull/3290#issuecomment-1094376403

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   1m  1s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ HADOOP-17800 Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 43s |  |  Maven dependency ordering for branch  |
   | -1 :x: |  mvninstall  |  23m 43s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/branch-mvninstall-root.txt)
 |  root in HADOOP-17800 failed.  |
   | -1 :x: |  compile  |  14m 25s | 
[/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in HADOOP-17800 failed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.  |
   | -1 :x: |  compile  |  12m 11s | 
[/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in HADOOP-17800 failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  checkstyle  |   3m 52s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  mvnsite  |   2m 35s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  javadoc  |   2m  5s |  |  HADOOP-17800 passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 27s |  |  HADOOP-17800 passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m 50s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  shadedclient  |  16m 24s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 21s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m 11s |  |  the patch passed  |
   | -1 :x: |  compile  |  14m 14s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  javac  |  14m 14s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  compile  |  12m  7s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -1 :x: |  javac  |  12m  7s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   3m 42s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 1 new + 141 unchanged - 0 fixed = 142 total (was 
141)  |
   | +1 :green_heart: |  mvnsite  |   2m 24s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 55s |  |  the patch passed with JDK 

[GitHub] [hadoop] hadoop-yetus commented on pull request #3290: HADOOP-17843. Support IPV6 with IP for internal and external communication

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #3290:
URL: https://github.com/apache/hadoop/pull/3290#issuecomment-1094376403

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   1m  1s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ HADOOP-17800 Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 43s |  |  Maven dependency ordering for branch  |
   | -1 :x: |  mvninstall  |  23m 43s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/branch-mvninstall-root.txt)
 |  root in HADOOP-17800 failed.  |
   | -1 :x: |  compile  |  14m 25s | 
[/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in HADOOP-17800 failed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.  |
   | -1 :x: |  compile  |  12m 11s | 
[/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in HADOOP-17800 failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  checkstyle  |   3m 52s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  mvnsite  |   2m 35s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  javadoc  |   2m  5s |  |  HADOOP-17800 passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 27s |  |  HADOOP-17800 passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m 50s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  shadedclient  |  16m 24s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 21s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m 11s |  |  the patch passed  |
   | -1 :x: |  compile  |  14m 14s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  javac  |  14m 14s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  compile  |  12m  7s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -1 :x: |  javac  |  12m  7s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   3m 42s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/5/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 1 new + 141 unchanged - 0 fixed = 142 total (was 
141)  |
   | +1 :green_heart: |  mvnsite  |   2m 24s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 55s |  |  the patch passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 26s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   6m  2s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  16m 14s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 366m 54s | 

[jira] [Work logged] (HADOOP-17843) Support IPV6 with IP for internal and external communication

2022-04-10 Thread ASF GitHub Bot (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-17843?focusedWorklogId=755048&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-755048 ]

ASF GitHub Bot logged work on HADOOP-17843:
---

Author: ASF GitHub Bot
Created on: 10/Apr/22 20:02
Start Date: 10/Apr/22 20:02
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #3290:
URL: https://github.com/apache/hadoop/pull/3290#issuecomment-1094358940

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  12m 21s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ HADOOP-17800 Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 50s |  |  Maven dependency ordering for branch  |
   | -1 :x: |  mvninstall  |  21m  0s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/branch-mvninstall-root.txt)
 |  root in HADOOP-17800 failed.  |
   | -1 :x: |  compile  |  13m 15s | 
[/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in HADOOP-17800 failed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.  |
   | -1 :x: |  compile  |  11m 30s | 
[/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in HADOOP-17800 failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  checkstyle  |   3m 30s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  mvnsite  |   4m  3s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  javadoc  |   3m  8s |  |  HADOOP-17800 passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   4m  9s |  |  HADOOP-17800 passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   7m 48s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  shadedclient  |  13m 54s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 27s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   3m 11s |  |  the patch passed  |
   | -1 :x: |  compile  |  13m  4s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  javac  |  13m  4s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  compile  |  11m 27s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -1 :x: |  javac  |  11m 27s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   3m 18s |  |  root: The patch generated 
0 new + 165 unchanged - 1 fixed = 165 total (was 166)  |
   | +1 :green_heart: |  mvnsite  |   3m 52s |  |  the patch passed  |
   | -1 :x: |  javadoc  |   0m 52s | 

[GitHub] [hadoop] hadoop-yetus commented on pull request #3290: HADOOP-17843. Support IPV6 with IP for internal and external communication

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #3290:
URL: https://github.com/apache/hadoop/pull/3290#issuecomment-1094358940

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  12m 21s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ HADOOP-17800 Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 50s |  |  Maven dependency ordering for branch  |
   | -1 :x: |  mvninstall  |  21m  0s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/branch-mvninstall-root.txt)
 |  root in HADOOP-17800 failed.  |
   | -1 :x: |  compile  |  13m 15s | 
[/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in HADOOP-17800 failed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.  |
   | -1 :x: |  compile  |  11m 30s | 
[/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in HADOOP-17800 failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  checkstyle  |   3m 30s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  mvnsite  |   4m  3s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  javadoc  |   3m  8s |  |  HADOOP-17800 passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   4m  9s |  |  HADOOP-17800 passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   7m 48s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  shadedclient  |  13m 54s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 27s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   3m 11s |  |  the patch passed  |
   | -1 :x: |  compile  |  13m  4s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  javac  |  13m  4s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  compile  |  11m 27s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -1 :x: |  javac  |  11m 27s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   3m 18s |  |  root: The patch generated 
0 new + 165 unchanged - 1 fixed = 165 total (was 166)  |
   | +1 :green_heart: |  mvnsite  |   3m 52s |  |  the patch passed  |
   | -1 :x: |  javadoc  |   0m 52s | 
[/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/7/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  hadoop-common in the patch failed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.  |
   | +1 :green_heart: |  javadoc  |   3m 57s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   8m 13s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  14m 11s |  | 

[jira] [Work logged] (HADOOP-17843) Support IPV6 with IP for internal and external communication

2022-04-10 Thread ASF GitHub Bot (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-17843?focusedWorklogId=755046&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-755046 ]

ASF GitHub Bot logged work on HADOOP-17843:
---

Author: ASF GitHub Bot
Created on: 10/Apr/22 19:32
Start Date: 10/Apr/22 19:32
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #3290:
URL: https://github.com/apache/hadoop/pull/3290#issuecomment-1094354228

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 45s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ HADOOP-17800 Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 47s |  |  Maven dependency ordering for branch  |
   | -1 :x: |  mvninstall  |  21m 37s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/branch-mvninstall-root.txt)
 |  root in HADOOP-17800 failed.  |
   | -1 :x: |  compile  |  14m 23s | 
[/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in HADOOP-17800 failed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.  |
   | -1 :x: |  compile  |  11m 48s | 
[/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in HADOOP-17800 failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  checkstyle  |   3m 33s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  mvnsite  |   2m 50s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  javadoc  |   2m 12s |  |  HADOOP-17800 passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 36s |  |  HADOOP-17800 passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   6m  3s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  shadedclient  |  13m 55s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 28s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m 14s |  |  the patch passed  |
   | -1 :x: |  compile  |  14m 12s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  javac  |  14m 12s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  compile  |  12m 20s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -1 :x: |  javac  |  12m 20s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  blanks  |   0m  1s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   3m 25s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 1 new + 140 unchanged - 0 fixed = 141 total (was 
140)  |
   | +1 :green_heart: |  mvnsite  |   2m 33s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m  3s |  |  the patch passed with JDK 

[GitHub] [hadoop] hadoop-yetus commented on pull request #3290: HADOOP-17843. Support IPV6 with IP for internal and external communication

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #3290:
URL: https://github.com/apache/hadoop/pull/3290#issuecomment-1094354228

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 45s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ HADOOP-17800 Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 47s |  |  Maven dependency ordering for branch  |
   | -1 :x: |  mvninstall  |  21m 37s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/branch-mvninstall-root.txt)
 |  root in HADOOP-17800 failed.  |
   | -1 :x: |  compile  |  14m 23s | 
[/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/branch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in HADOOP-17800 failed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.  |
   | -1 :x: |  compile  |  11m 48s | 
[/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in HADOOP-17800 failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  checkstyle  |   3m 33s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  mvnsite  |   2m 50s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  javadoc  |   2m 12s |  |  HADOOP-17800 passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 36s |  |  HADOOP-17800 passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   6m  3s |  |  HADOOP-17800 passed  |
   | +1 :green_heart: |  shadedclient  |  13m 55s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 28s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m 14s |  |  the patch passed  |
   | -1 :x: |  compile  |  14m 12s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  javac  |  14m 12s | 
[/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/patch-compile-root-jdkUbuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04.txt)
 |  root in the patch failed with JDK Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04. 
 |
   | -1 :x: |  compile  |  12m 20s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -1 :x: |  javac  |  12m 20s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | +1 :green_heart: |  blanks  |   0m  1s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   3m 25s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3290/6/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 1 new + 140 unchanged - 0 fixed = 141 total (was 
140)  |
   | +1 :green_heart: |  mvnsite  |   2m 33s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m  3s |  |  the patch passed with JDK 
Ubuntu-11.0.14.1+1-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 30s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   6m  6s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  14m 37s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  | 236m 33s |  |  

[GitHub] [hadoop] hadoop-yetus commented on pull request #4156: HDFS-16457.Make fs.getspaceused.classname reconfigurable (apache#4069)

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #4156:
URL: https://github.com/apache/hadoop/pull/4156#issuecomment-1094353628

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   6m 55s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ branch-3.3 Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  35m 57s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  compile  |   1m 16s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  checkstyle  |   0m 54s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  mvnsite  |   1m 23s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  javadoc  |   1m 36s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  spotbugs  |   3m 22s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  shadedclient  |  26m  1s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   1m 13s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   1m  3s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   1m  3s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 41s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   1m 10s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 24s |  |  the patch passed  |
   | +1 :green_heart: |  spotbugs  |   3m 10s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  25m 12s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 187m 21s | 
[/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4156/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt)
 |  hadoop-hdfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 51s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 296m 54s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.server.namenode.ha.TestHAAppend |
   |   | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
   |   | hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer |
   |   | hadoop.hdfs.server.namenode.ha.TestStandbyCheckpoints |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4156/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4156 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell |
   | uname | Linux 69ab364732ba 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 
23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | branch-3.3 / 3fe04d35a361b838f4e39906e32b0d3de4683531 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4156/1/testReport/ |
   | Max. process+thread count | 3703 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs U: 
hadoop-hdfs-project/hadoop-hdfs |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4156/1/console |
   | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4157: HDFS-16474. Make HDFS tail tool cross platform

2022-04-10 Thread GitBox


hadoop-yetus commented on PR #4157:
URL: https://github.com/apache/hadoop/pull/4157#issuecomment-1094350860

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  47m 58s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 5 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  42m 30s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   3m 40s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 25s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  69m 13s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 14s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   3m 38s |  |  the patch passed  |
   | +1 :green_heart: |  cc  |   3m 38s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   3m 38s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   3m 38s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 15s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  22m 20s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  93m 39s |  |  hadoop-hdfs-native-client in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 240m  0s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4157 |
   | Optional Tests | dupname asflicense compile cc mvnsite javac unit 
codespell golang |
   | uname | Linux 1719e126d85a 4.15.0-162-generic #170-Ubuntu SMP Mon Oct 18 
11:38:05 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 0cc0e067973923efc901d76f24a739d06b0ed4b7 |
   | Default Java | Red Hat, Inc.-1.8.0_322-b06 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/testReport/ |
   | Max. process+thread count | 559 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: 
hadoop-hdfs-project/hadoop-hdfs-native-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4157/1/console |
   | versions | git=2.9.5 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] Hexiaoqiao commented on pull request #4141: HDFS-16534. Split FsDatasetImpl from block pool locks to volume grain locks.

2022-04-10 Thread GitBox


Hexiaoqiao commented on PR #4141:
URL: https://github.com/apache/hadoop/pull/4141#issuecomment-1094299342

   Please fix the checkstyle issues as Yetus suggests: 
[/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4141/2/artifact/out/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] Hexiaoqiao commented on a diff in pull request #4141: HDFS-16534. Split FsDatasetImpl from block pool locks to volume grain locks.

2022-04-10 Thread GitBox


Hexiaoqiao commented on code in PR #4141:
URL: https://github.com/apache/hadoop/pull/4141#discussion_r846794147


##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/fsdataset/impl/FsDatasetImpl.java:
##
@@ -2888,9 +2920,6 @@ static ReplicaRecoveryInfo initReplicaRecoveryImpl(String 
bpid, ReplicaMap map,
   Block block, long recoveryId)
   throws IOException, MustStopExistingWriter {
 final ReplicaInfo replica = map.get(bpid, block.getBlockId());
-LOG.info("initReplicaRecovery: " + block + ", recoveryId=" + recoveryId

Review Comment:
   Not sure why this log info is deleted here; if there is no further 
consideration, I would just suggest keeping it. 



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/fsdataset/impl/FsDatasetImpl.java:
##
@@ -2875,7 +2900,14 @@ static ReplicaRecoveryInfo initReplicaRecovery(String 
bpid, ReplicaMap map,
lockManager) throws IOException {
 while (true) {
   try {
-try (AutoCloseDataSetLock l = 
lockManager.writeLock(LockLevel.BLOCK_POOl, bpid)) {
+ReplicaInfo replica = map.get(bpid, block.getBlockId());
+LOG.info("initReplicaRecovery: " + block + ", recoveryId=" + recoveryId
++ ", replica=" + replica);
+if (replica == null) {

Review Comment:
   I think it is a good choice to move this statement before the LOG.info.



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/fsdataset/impl/FsDatasetImpl.java:
##
@@ -629,6 +634,9 @@ public void removeVolumes(
 synchronized (this) {
   for (String storageUuid : storageToRemove) {
 storageMap.remove(storageUuid);
+for (String bp : volumeMap.getBlockPoolList()) {

Review Comment:
   Do we need to synchronize this segment? IMO, the lock removal is thread safe 
here, right?



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/fsdataset/impl/FsDatasetImpl.java:
##
@@ -524,6 +527,8 @@ public void addVolume(final StorageLocation location,
 
 for (final NamespaceInfo nsInfo : nsInfos) {
   String bpid = nsInfo.getBlockPoolID();
+  String vol = fsVolume.getStorageID();
+  lockManager.addLock(LockLevel.VOLUME, bpid, vol);

Review Comment:
   Is this a duplicate action here, since `activateVolume` already invokes the same 
logic to add the volume/blockpool level lock? 



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/fsdataset/impl/FsDatasetImpl.java:
##
@@ -2888,9 +2920,6 @@ static ReplicaRecoveryInfo initReplicaRecoveryImpl(String 
bpid, ReplicaMap map,
   Block block, long recoveryId)
   throws IOException, MustStopExistingWriter {
 final ReplicaInfo replica = map.get(bpid, block.getBlockId());
-LOG.info("initReplicaRecovery: " + block + ", recoveryId=" + recoveryId

Review Comment:
   Ah, just noticed that the same log info is added at `initReplicaRecovery`; it 
makes sense to me if you want to move this log out of the lock logic.



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/fsdataset/impl/FsDatasetImpl.java:
##
@@ -2860,7 +2878,14 @@ ReplicaRecoveryInfo initReplicaRecovery(String bpid, 
ReplicaMap map,
   Block block, long recoveryId, long xceiverStopTimeout) throws 
IOException {
 while (true) {
   try {
-try (AutoCloseDataSetLock l = 
lockManager.writeLock(LockLevel.BLOCK_POOl, bpid)) {
+ReplicaInfo replica = map.get(bpid, block.getBlockId());
+LOG.info("initReplicaRecovery: " + block + ", recoveryId=" + recoveryId
++ ", replica=" + replica);
+if (replica == null) {

Review Comment:
   I think it is a good choice to move this statement before the LOG.info.



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/fsdataset/impl/FsDatasetImpl.java:
##
@@ -1887,12 +1896,12 @@ public ReplicaHandler createTemporary(StorageType 
storageType,
   false);
 }
 long startHoldLockTimeMs = Time.monotonicNow();
-try (AutoCloseableLock lock = lockManager.writeLock(LockLevel.BLOCK_POOl,
-b.getBlockPoolId())) {
-  FsVolumeReference ref = volumes.getNextVolume(storageType, storageId, b
-  .getNumBytes());
-  FsVolumeImpl v = (FsVolumeImpl) ref.getVolume();
-  ReplicaInPipeline newReplicaInfo;
+FsVolumeReference ref = volumes.getNextVolume(storageType, storageId, b

Review Comment:
   Not sure if this is thread safe without any `ReadWriteLock`.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [hadoop] GauthamBanasandra opened a new pull request, #4157: HDFS-16474. Make HDFS tail tool cross platform

2022-04-10 Thread GitBox


GauthamBanasandra opened a new pull request, #4157:
URL: https://github.com/apache/hadoop/pull/4157

   
   
   ### Description of PR
   The source files for `hdfs_tail` use `getopt` for parsing the command line 
arguments. `getopt` is available only on Linux and thus isn't cross-platform. We 
need to replace `getopt` with `boost::program_options` to make these tools 
cross-platform.
   
   
   ### How was this patch tested?
   In progress.
   
   ### For code changes:
   
   - [x] Does the title or this PR starts with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] singer-bin opened a new pull request, #4156: HDFS-16457.Make fs.getspaceused.classname reconfigurable (apache#4069)

2022-04-10 Thread GitBox


singer-bin opened a new pull request, #4156:
URL: https://github.com/apache/hadoop/pull/4156

   Cherry-pick https://github.com/apache/hadoop/pull/4069 to branch-3.3.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] virajjasani commented on pull request #4107: HDFS-16521. DFS API to retrieve slow datanodes

2022-04-10 Thread GitBox


virajjasani commented on PR #4107:
URL: https://github.com/apache/hadoop/pull/4107#issuecomment-1094233257

   To provide more insight, 
[FanOutOneBlockAsyncDFSOutput](https://github.com/apache/hbase/blob/master/hbase-asyncfs/src/main/java/org/apache/hadoop/hbase/io/asyncfs/FanOutOneBlockAsyncDFSOutput.java)
 in HBase currently has to rely on its own way of marking and excluding slow 
nodes while 1) creating pipelines and 2) handling acks, based on factors like 
the data length of the packet, the processing time since the last ack timestamp, 
whether the flush to replicas is finished, etc. If it can use a slownode API from 
HDFS to exclude nodes appropriately while writing a block, a lot of its own 
post-ack computation of slow nodes can be _saved_ or _improved_, or, based on 
further experiments, we could find a _better solution_ to manage slow node 
detection logic both in HDFS and HBase. However, in order to collect more data 
points and run more POCs around this area, we should at least expect HDFS to 
provide an API that lets downstream projects efficiently use slownode info for 
such critical low-latency use-cases (like writing WALs).
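   
   As a rough sketch of how a downstream writer might consume such an API once it 
is exposed (the `DistributedFileSystem#getSlowDatanodeStats()` name below is only 
assumed for illustration; the final name and signature are whatever this PR ends 
up shipping):
   
   ```java
   import java.util.HashSet;
   import java.util.Set;
   
   import org.apache.hadoop.conf.Configuration;
   import org.apache.hadoop.fs.FileSystem;
   import org.apache.hadoop.hdfs.DistributedFileSystem;
   import org.apache.hadoop.hdfs.protocol.DatanodeInfo;
   
   public class SlowNodeAwareWriter {
     public static void main(String[] args) throws Exception {
       Configuration conf = new Configuration();
       try (FileSystem fs = FileSystem.get(conf)) {
         if (fs instanceof DistributedFileSystem) {
           DistributedFileSystem dfs = (DistributedFileSystem) fs;
           // Assumed API from this PR; name and signature are illustrative only.
           DatanodeInfo[] slowNodes = dfs.getSlowDatanodeStats();
           Set<String> excluded = new HashSet<>();
           for (DatanodeInfo dn : slowNodes) {
             excluded.add(dn.getXferAddr());
           }
           // A writer like FanOutOneBlockAsyncDFSOutput could seed its own
           // excluded-node cache with this set before creating a pipeline,
           // instead of deriving slowness purely from post-ack timing.
           System.out.println("Slow datanodes to avoid: " + excluded);
         }
       }
     }
   }
   ```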
   
   cc @jojochuang @saintstack 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ayushtkn commented on a diff in pull request #4138: HDFS-16479. EC: NameNode should not send a reconstruction work when the source datanodes are insufficient

2022-04-10 Thread GitBox


ayushtkn commented on code in PR #4138:
URL: https://github.com/apache/hadoop/pull/4138#discussion_r846742614


##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/blockmanagement/BlockManager.java:
##
@@ -2163,6 +2163,17 @@ BlockReconstructionWork scheduleReconstruction(BlockInfo 
block,
   return null;
 }
 
+// skip if source datanodes for reconstructing ec block are not enough
+if (block.isStriped()) {
+  BlockInfoStriped stripedBlock = (BlockInfoStriped) block;
+  int cellsNum = (int) ((stripedBlock.getNumBytes() - 1) / 
stripedBlock.getCellSize() + 1);
+  int minRequiredSources = Math.min(cellsNum, 
stripedBlock.getDataBlockNum());
+  if (minRequiredSources > srcNodes.length) {
+LOG.debug("Block {} cannot be reconstructed due to shortage of source 
datanodes ", block);
+return null;

Review Comment:
   Should we increment the metrics before returning ``null``?
   ```
   NameNode.getNameNodeMetrics().incNumTimesReReplicationNotScheduled();
   ```



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/blockmanagement/BlockManager.java:
##
@@ -2163,6 +2163,17 @@ BlockReconstructionWork scheduleReconstruction(BlockInfo 
block,
   return null;
 }
 
+// skip if source datanodes for reconstructing ec block are not enough
+if (block.isStriped()) {
+  BlockInfoStriped stripedBlock = (BlockInfoStriped) block;
+  int cellsNum = (int) ((stripedBlock.getNumBytes() - 1) / 
stripedBlock.getCellSize() + 1);
+  int minRequiredSources = Math.min(cellsNum, 
stripedBlock.getDataBlockNum());

Review Comment:
   Is this logic the same as `BlockInfoStriped.getRealDataBlockNum()`? Can we use or 
extract the logic from there, or do some refactoring there? I am just trying to 
see if we can keep the logic in one place, so that if there is some issue in the 
logic, changing it in one place fixes all the places.
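   
   As a self-contained illustration of the arithmetic in the hunk above (the RS-3-2 
layout and the 1 MiB cell size below are assumed example values, not taken from 
the patch):
   
   ```java
   public class EcMinSourcesExample {
     public static void main(String[] args) {
       // Assumed for illustration: RS-3-2 (3 data + 2 parity), 1 MiB cells.
       final long cellSize = 1024L * 1024L;
       final int dataBlockNum = 3;
   
       // A 1.5 MiB striped block only spans 2 cells.
       final long numBytes = 1536L * 1024L;
       final int cellsNum = (int) ((numBytes - 1) / cellSize + 1);      // 2
       final int minRequiredSources = Math.min(cellsNum, dataBlockNum); // 2
   
       // Per the proposed check, scheduleReconstruction() returns null (skips
       // the work) whenever fewer than minRequiredSources non-busy source
       // datanodes are available.
       System.out.println("cellsNum=" + cellsNum
           + ", minRequiredSources=" + minRequiredSources);
     }
   }
   ```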



##
hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/blockmanagement/TestBlockManager.java:
##
@@ -852,6 +852,101 @@ public void 
testChooseSrcDNWithDupECInDecommissioningNode() throws Exception {
 0, numReplicas.redundantInternalBlocks());
   }
 
+  @Test
+  public void testSkipReconstructionWithManyBusyNodes() {
+long blockId = -9223372036854775776L; // real ec block id
+// RS-3-2 EC policy
+ErasureCodingPolicy ecPolicy =
+SystemErasureCodingPolicies.getPolicies().get(1);
+
+// striped blockInfo: 3 data blocks + 2 parity blocks
+Block aBlock = new Block(blockId, ecPolicy.getCellSize() * 
ecPolicy.getNumDataUnits(), 0);
+BlockInfoStriped aBlockInfoStriped = new BlockInfoStriped(aBlock, 
ecPolicy);
+
+// create 4 storageInfo, which means 1 block is missing
+DatanodeStorageInfo ds1 = DFSTestUtil.createDatanodeStorageInfo(
+"storage1", "1.1.1.1", "rack1", "host1");
+DatanodeStorageInfo ds2 = DFSTestUtil.createDatanodeStorageInfo(
+"storage2", "2.2.2.2", "rack2", "host2");
+DatanodeStorageInfo ds3 = DFSTestUtil.createDatanodeStorageInfo(
+"storage3", "3.3.3.3", "rack3", "host3");
+DatanodeStorageInfo ds4 = DFSTestUtil.createDatanodeStorageInfo(
+"storage4", "4.4.4.4", "rack4", "host4");
+
+// link block with storage
+aBlockInfoStriped.addStorage(ds1, aBlock);
+aBlockInfoStriped.addStorage(ds2, new Block(blockId + 1, 0, 0));
+aBlockInfoStriped.addStorage(ds3, new Block(blockId + 2, 0, 0));
+aBlockInfoStriped.addStorage(ds4, new Block(blockId + 3, 0, 0));
+
+addEcBlockToBM(blockId, ecPolicy);
+aBlockInfoStriped.setBlockCollectionId(mockINodeId);
+
+// reconstruction should be scheduled
+BlockReconstructionWork work = 
bm.scheduleReconstruction(aBlockInfoStriped, 3);
+assertNotNull(work);
+
+// simulate the 2 nodes reach maxReplicationStreams
+for(int i = 0; i < bm.maxReplicationStreams; i++){
+  ds3.getDatanodeDescriptor().incrementPendingReplicationWithoutTargets();
+  ds4.getDatanodeDescriptor().incrementPendingReplicationWithoutTargets();
+}
+
+// reconstruction should be skipped since the number of non-busy nodes are 
not enough
+work = bm.scheduleReconstruction(aBlockInfoStriped, 3);
+assertNull(work);
+  }
+
+  @Test
+  public void testSkipReconstructionWithManyBusyNodes2() {
+long blockId = -9223372036854775776L; // real ec block id
+// RS-3-2 EC policy
+ErasureCodingPolicy ecPolicy =
+SystemErasureCodingPolicies.getPolicies().get(1);
+
+// striped blockInfo: 2 data blocks + 2 paritys

Review Comment:
   typo `paritys`



##
hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/blockmanagement/TestBlockManager.java:
##
@@ -852,6 +852,101 @@ public void 
testChooseSrcDNWithDupECInDecommissioningNode() throws Exception {
 0, numReplicas.redundantInternalBlocks());
   }
 
+  @Test
+  public void