[GitHub] [hadoop] hadoop-yetus commented on pull request #5674: HDFS-17020. RBF: mount table addAll should print failed records in std error
hadoop-yetus commented on PR #5674: URL: https://github.com/apache/hadoop/pull/5674#issuecomment-1556567594 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 17m 30s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | buf | 0m 0s | | buf was not available. | | +0 :ok: | buf | 0m 0s | | buf was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 36m 49s | | trunk passed | | +1 :green_heart: | compile | 0m 37s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 0m 34s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 0m 30s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 38s | | trunk passed | | +1 :green_heart: | javadoc | 0m 41s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 27s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 1m 28s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 49s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 30s | | the patch passed | | +1 :green_heart: | compile | 0m 31s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | cc | 0m 32s | | the patch passed | | +1 :green_heart: | javac | 0m 31s | | the patch passed | | +1 :green_heart: | compile | 0m 27s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | cc | 0m 27s | | the patch passed | | +1 :green_heart: | javac | 0m 27s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 17s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 30s | | the patch passed | | +1 :green_heart: | javadoc | 0m 27s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 21s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 1m 19s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 48s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 21m 44s | | hadoop-hdfs-rbf in the patch passed. | | +1 :green_heart: | asflicense | 0m 34s | | The patch does not generate ASF License warnings. 
| | | | 135m 49s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5674/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5674 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets cc buflint bufcompat | | uname | Linux 21fe891e972e 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 9753f1c37aca937f03c002c598c9f9a75c75bbdc | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5674/3/testReport/ | | Max. process+thread count | 2438 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project/hadoop-hdfs-rbf | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5674/3/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] mudit-97 commented on pull request #5680: MAPREDUCE-7438 : Support removal of only selective node states in untracked removal flow
mudit-97 commented on PR #5680: URL: https://github.com/apache/hadoop/pull/5680#issuecomment-1556562975 Created new PR in Yarn Project: https://github.com/apache/hadoop/pull/5681 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] mudit-97 opened a new pull request, #5681: YARN-11497 : Support removal of only selective node states in untracked removal flow
mudit-97 opened a new pull request, #5681: URL: https://github.com/apache/hadoop/pull/5681 ASF JIRA: https://issues.apache.org/jira/browse/YARN-11497 Allows a config to remove only those untracked nodes that have selective node states. Config: yarn.resourcemanager.node-removal-untracked.node-selective-states-to-remove If YARN untracked-node removal is enabled, this config controls which node states are eligible for removal. If an untracked node does not have one of these states, the node is skipped for removal. If this config value is set to empty, all node states are eligible for removal. NodeState is an enum: org.apache.hadoop.yarn.api.records.NodeState How was this patch tested? Added 1 new UT in TestResourceTrackerService.java
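The selective-state check described above can be sketched as follows. This is a self-contained illustration only: `UntrackedNodeFilter`, its constructor, and the trimmed-down `NodeState` enum are hypothetical stand-ins for the actual ResourceTrackerService logic and for org.apache.hadoop.yarn.api.records.NodeState.

```java
import java.util.EnumSet;
import java.util.Set;

public class UntrackedNodeFilter {
    // Simplified stand-in for org.apache.hadoop.yarn.api.records.NodeState.
    enum NodeState { NEW, RUNNING, UNHEALTHY, DECOMMISSIONED, LOST, REBOOTED, SHUTDOWN }

    private final Set<NodeState> removableStates;

    // An empty set mirrors an empty config value: every state is eligible for removal.
    UntrackedNodeFilter(Set<NodeState> removableStates) {
        this.removableStates = removableStates;
    }

    // An untracked node is removed only if its state is one of the configured states.
    boolean shouldRemove(NodeState state) {
        return removableStates.isEmpty() || removableStates.contains(state);
    }

    public static void main(String[] args) {
        UntrackedNodeFilter selective =
            new UntrackedNodeFilter(EnumSet.of(NodeState.DECOMMISSIONED, NodeState.LOST));
        System.out.println(selective.shouldRemove(NodeState.DECOMMISSIONED)); // prints true
        System.out.println(selective.shouldRemove(NodeState.RUNNING));        // prints false

        // Empty config value: all states eligible.
        UntrackedNodeFilter all = new UntrackedNodeFilter(EnumSet.noneOf(NodeState.class));
        System.out.println(all.shouldRemove(NodeState.RUNNING));              // prints true
    }
}
```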
[GitHub] [hadoop] hadoop-yetus commented on pull request #5423: HDFS-16882. RBF: Add cache hit rate metric in MountTableResolver#getDestinationForPath
hadoop-yetus commented on PR #5423: URL: https://github.com/apache/hadoop/pull/5423#issuecomment-1556559339 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 11m 17s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-3.3 Compile Tests _ | | +1 :green_heart: | mvninstall | 40m 5s | | branch-3.3 passed | | +1 :green_heart: | compile | 0m 38s | | branch-3.3 passed | | +1 :green_heart: | checkstyle | 0m 35s | | branch-3.3 passed | | +1 :green_heart: | mvnsite | 0m 44s | | branch-3.3 passed | | +1 :green_heart: | javadoc | 1m 7s | | branch-3.3 passed | | +1 :green_heart: | spotbugs | 1m 33s | | branch-3.3 passed | | +1 :green_heart: | shadedclient | 24m 32s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 33s | | the patch passed | | +1 :green_heart: | compile | 0m 30s | | the patch passed | | +1 :green_heart: | javac | 0m 30s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 19s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 34s | | the patch passed | | +1 :green_heart: | javadoc | 0m 51s | | the patch passed | | +1 :green_heart: | spotbugs | 1m 18s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 59s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 8s | | hadoop-hdfs-rbf in the patch passed. 
| | +1 :green_heart: | asflicense | 0m 37s | | The patch does not generate ASF License warnings. | | | | 128m 25s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5423/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5423 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 356cbf6f6e57 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / b721f8de078a47f847f228ad15b92972739fc451 | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~18.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5423/2/testReport/ | | Max. process+thread count | 2718 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project/hadoop-hdfs-rbf | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5423/2/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #5674: HDFS-17020. RBF: mount table addAll should print failed records in std error
hadoop-yetus commented on PR #5674: URL: https://github.com/apache/hadoop/pull/5674#issuecomment-1556547632 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 33s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | buf | 0m 0s | | buf was not available. | | +0 :ok: | buf | 0m 0s | | buf was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 33m 38s | | trunk passed | | +1 :green_heart: | compile | 0m 40s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 0m 38s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 0m 34s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 43s | | trunk passed | | +1 :green_heart: | javadoc | 0m 47s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 33s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 1m 30s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 55s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 30s | | the patch passed | | +1 :green_heart: | compile | 0m 30s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | cc | 0m 30s | | the patch passed | | +1 :green_heart: | javac | 0m 30s | | the patch passed | | +1 :green_heart: | compile | 0m 29s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | cc | 0m 29s | | the patch passed | | +1 :green_heart: | javac | 0m 29s | | the patch passed | | +1 :green_heart: | blanks | 0m 1s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 18s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 31s | | the patch passed | | +1 :green_heart: | javadoc | 0m 27s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 23s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 1m 18s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 24s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 20m 35s | | hadoop-hdfs-rbf in the patch passed. | | +1 :green_heart: | asflicense | 0m 40s | | The patch does not generate ASF License warnings. 
| | | | 109m 11s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5674/4/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5674 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets cc buflint bufcompat | | uname | Linux 6d115bf3b8ce 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 9753f1c37aca937f03c002c598c9f9a75c75bbdc | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5674/4/testReport/ | | Max. process+thread count | 2404 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project/hadoop-hdfs-rbf | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5674/4/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[jira] [Commented] (HADOOP-18207) Introduce hadoop-logging module
[ https://issues.apache.org/jira/browse/HADOOP-18207?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724751#comment-17724751 ] ASF GitHub Bot commented on HADOOP-18207: - virajjasani commented on code in PR #5503: URL: https://github.com/apache/hadoop/pull/5503#discussion_r1199923215 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/log/LogLevel.java: ## @@ -349,7 +348,7 @@ public void doGet(HttpServletRequest request, HttpServletResponse response } if (GenericsUtil.isLog4jLogger(logName)) { Review Comment: Thanks @Apache9 for raising this point btw. let me paste my [comment](https://issues.apache.org/jira/browse/HADOOP-16206?focusedCommentId=17702846=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17702846) from parent jira, the jira has too many comments so can be easily ignored :) _FYI, we use some of the core log4j APIs specifically meant to be used for properties based config and log4j2 has this support removed as they were meant to be private usage. Moreover, we also programmatically set monitor interval for dynamic log4j file changes in httpfs server. This is also no longer supported as that is also meant to be kept in config only (properties, xml, json, yaml etc)._ _I have started this thread with log4j dev/user group for the same https://lists.apache.org/thread/4l7oyk84jpj6br0sn4ymofdcbgfxmtqp_ _So far, the only recommended solution provided by them is to configure monitor interval in log4j properties only (i.e. 
user provided custom log4j properties, in case of httpfs Server), and use Configurator API for the properties file specific usage we have._

> Introduce hadoop-logging module
> ---
>
> Key: HADOOP-18207
> URL: https://issues.apache.org/jira/browse/HADOOP-18207
> Project: Hadoop Common
> Issue Type: Sub-task
> Reporter: Duo Zhang
> Assignee: Viraj Jasani
> Priority: Major
> Labels: pull-request-available
>
> There are several goals here:
> # Provide the ability to change log level, get log level, etc.
> # Place all the appender implementation(?)
> # Hide the real logging implementation.
> # Later we could remove all the log4j references in other hadoop module.
> # Move as much log4j usage to the module as possible.

-- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
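For context on the monitor-interval point above: log4j2 reads the reload interval from the configuration file itself rather than from a programmatic setter. A minimal log4j2 properties fragment might look like the following (the 30-second value is purely illustrative):

```properties
# log4j2.properties -- top-level key; log4j2 rechecks the file on this interval
monitorInterval = 30
```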
[jira] [Commented] (HADOOP-18207) Introduce hadoop-logging module
[ https://issues.apache.org/jira/browse/HADOOP-18207?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724750#comment-17724750 ] ASF GitHub Bot commented on HADOOP-18207: - virajjasani commented on code in PR #5503: URL: https://github.com/apache/hadoop/pull/5503#discussion_r1199921484 ## hadoop-common-project/hadoop-common/src/main/conf/log4j.properties: ## @@ -299,7 +299,7 @@ log4j.appender.NMAUDIT.MaxBackupIndex=${nm.audit.log.maxbackupindex} yarn.ewma.cleanupInterval=300 yarn.ewma.messageAgeLimitSeconds=86400 yarn.ewma.maxUniqueMessages=250 -log4j.appender.EWMA=org.apache.hadoop.yarn.util.Log4jWarningErrorMetricsAppender +log4j.appender.EWMA=org.apache.hadoop.logging.appenders.Log4jWarningErrorMetricsAppender Review Comment: that's because with this PR, all appenders (including `Log4jWarningErrorMetricsAppender`) are moved to hadoop-logging ## hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/conf/TestConfiguration.java: ## @@ -220,27 +219,21 @@ public void testFinalWarnings() throws Exception { InputStream in2 = new ByteArrayInputStream(bytes2); // Attach our own log appender so we can verify output -TestAppender appender = new TestAppender(); -final Logger logger = Logger.getRootLogger(); -logger.addAppender(appender); +LogCapturer logCapturer = LogCapturer.captureLogs(LoggerFactory.getLogger("root")); Review Comment: correct, this is the right way ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/log/LogLevel.java: ## @@ -349,7 +348,7 @@ public void doGet(HttpServletRequest request, HttpServletResponse response } if (GenericsUtil.isLog4jLogger(logName)) { Review Comment: yes unfortunately, we will still have some log4j references outside of hadoop-logging, moving all references is extremely difficult. i am planning to get upgraded to log4j2 with this change and then eventually we can try moving all usages to hadoop-logging. 
With this PR, we have almost 95% of log4j references moved to hadoop-logging.
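The LogCapturer pattern referenced in the review above (attach an in-memory appender to a logger, then assert on what was logged) can be illustrated with a self-contained sketch. Hadoop's actual LogCapturer lives in the test utilities and wraps the slf4j/log4j binding; the class below uses java.util.logging and is only an analogue of the idea, not the real API.

```java
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class LogCaptureSketch {
    // Minimal in-memory handler: the same idea as attaching a test appender.
    static class CapturingHandler extends Handler {
        private final StringBuilder output = new StringBuilder();
        @Override public void publish(LogRecord record) {
            output.append(record.getMessage()).append('\n');
        }
        @Override public void flush() {}
        @Override public void close() {}
        String getOutput() { return output.toString(); }
    }

    public static void main(String[] args) {
        Logger logger = Logger.getLogger("test");
        logger.setUseParentHandlers(false); // keep output in memory only
        CapturingHandler capture = new CapturingHandler();
        logger.addHandler(capture);

        logger.log(Level.WARNING, "deprecated final parameter overridden");

        // Assert on the captured text instead of parsing console output.
        System.out.println(capture.getOutput().contains("final parameter")); // prints true
        logger.removeHandler(capture);
    }
}
```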
[GitHub] [hadoop] hfutatzhanghb commented on pull request #5597: HDFS-16993. Datanode supports configure TopN DatanodeNetworkCounts
hfutatzhanghb commented on PR #5597: URL: https://github.com/apache/hadoop/pull/5597#issuecomment-1556463308 > have triggered the build again, if the build is green will merge this today Sir, thanks for your reply.
[GitHub] [hadoop] hfutatzhanghb closed pull request #5600: HDFS-16994. NumTimedOutPendingReconstructions metrics should not be accumulated
hfutatzhanghb closed pull request #5600: HDFS-16994. NumTimedOutPendingReconstructions metrics should not be accumulated URL: https://github.com/apache/hadoop/pull/5600
[GitHub] [hadoop] hfutatzhanghb commented on pull request #5600: HDFS-16994. NumTimedOutPendingReconstructions metrics should not be accumulated
hfutatzhanghb commented on PR #5600: URL: https://github.com/apache/hadoop/pull/5600#issuecomment-1556462000 > First it is a behaviour change, hence incompatible. > > Second, I don't find any reason we shouldn't accumulate, it is just the metrics telling since last failover/restart how many blocks timedout in pending reconstruction stage. No relation of getNumTimedOut, there can be multiple consumers as well @ayushtkn, thanks sir, I agree with you. I will close this PR.
[GitHub] [hadoop] ayushtkn commented on pull request #5597: HDFS-16993. Datanode supports configure TopN DatanodeNetworkCounts
ayushtkn commented on PR #5597: URL: https://github.com/apache/hadoop/pull/5597#issuecomment-1556461742 Have triggered the build again; if the build is green, will merge this today.
[GitHub] [hadoop] hfutatzhanghb commented on pull request #5597: HDFS-16993. Datanode supports configure TopN DatanodeNetworkCounts
hfutatzhanghb commented on PR #5597: URL: https://github.com/apache/hadoop/pull/5597#issuecomment-1556455626 Force-pushed to fix the checkstyle problems. @ayushtkn Hi, sir. Could you please help review this code and push this PR forward? Thanks a lot.
[GitHub] [hadoop] virajjasani commented on a diff in pull request #5674: HDFS-17020. RBF: mount table addAll should print failed records in std error
virajjasani commented on code in PR #5674: URL: https://github.com/apache/hadoop/pull/5674#discussion_r1199906322 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/router/TestRouterAdminCLI.java: ## @@ -1918,6 +1920,9 @@ public void testAddMultipleMountPointsFailure() throws Exception { "-faulttolerant"}; // mount points were already added assertNotEquals(0, ToolRunner.run(admin, argv)); + +assertTrue("The error message should return failed entries", +err.toString().contains("Cannot add mount points: [0SLASH0testAddMultiMountPoints-01")); Review Comment: Sounds good, that will be more readable to operators. Let me make this change.
[GitHub] [hadoop] hfutatzhanghb commented on pull request #5643: HDFS-17003. Erasure coding: invalidate wrong block after reporting bad blocks from datanode
hfutatzhanghb commented on PR #5643: URL: https://github.com/apache/hadoop/pull/5643#issuecomment-1556424290 @Hexiaoqiao Sir, thanks for your suggestions. 1. The failed UT passed in my local environment. 2. I have fixed the checkstyle problems. Thanks again.
[GitHub] [hadoop] hadoop-yetus commented on pull request #5680: MAPREDUCE-7438 : Support removal of only selective node states in untracked removal flow
hadoop-yetus commented on PR #5680: URL: https://github.com/apache/hadoop/pull/5680#issuecomment-1556297518 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 51s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 16m 9s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 19m 41s | | trunk passed | | +1 :green_heart: | compile | 6m 54s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 6m 24s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 1m 50s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 3s | | trunk passed | | +1 :green_heart: | javadoc | 3m 1s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 2m 47s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 6m 1s | | trunk passed | | +1 :green_heart: | shadedclient | 21m 29s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 54s | | the patch passed | | +1 :green_heart: | compile | 6m 21s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 6m 21s | | the patch passed | | +1 :green_heart: | compile | 6m 18s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 6m 18s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 39s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5680/2/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn.txt) | hadoop-yarn-project/hadoop-yarn: The patch generated 6 new + 197 unchanged - 0 fixed = 203 total (was 197) | | +1 :green_heart: | mvnsite | 2m 44s | | the patch passed | | +1 :green_heart: | javadoc | 2m 39s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 2m 34s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 6m 9s | | the patch passed | | +1 :green_heart: | shadedclient | 21m 49s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 1m 10s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 41s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 102m 15s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 0m 58s | | The patch does not generate ASF License warnings. 
| | | | 253m 54s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5680/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5680 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux 31a180c82301 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 19804ca9ef9ff95a0d5a8e49ab9dfc5aab6369bf | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results |
[GitHub] [hadoop] hadoop-yetus commented on pull request #5561: HDFS-16983. Whether checking path access permissions should be decided by dfs.permissions.enabled in concat operation
hadoop-yetus commented on PR #5561: URL: https://github.com/apache/hadoop/pull/5561#issuecomment-1556286723 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 17m 38s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | -1 :x: | mvninstall | 34m 56s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5561/2/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. | | +1 :green_heart: | compile | 1m 18s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 1m 8s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 1m 4s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 19s | | trunk passed | | +1 :green_heart: | javadoc | 1m 9s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 1m 28s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 3m 25s | | trunk passed | | +1 :green_heart: | shadedclient | 25m 33s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 9s | | the patch passed | | +1 :green_heart: | compile | 1m 13s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 1m 13s | | the patch passed | | +1 :green_heart: | compile | 1m 4s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 1m 4s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 55s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 12s | | the patch passed | | +1 :green_heart: | javadoc | 0m 53s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 1m 23s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 3m 19s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 15s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 224m 21s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5561/2/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch failed. | | +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings.
| | | | 348m 43s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.server.datanode.TestDirectoryScanner | | | hadoop.hdfs.TestRollingUpgrade | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5561/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5561 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux f869479f92f9 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / d0130a25759590bcc249948fa3a7bb8f00c415a0 | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5561/2/testReport/ | | Max. process+thread count | 2277 (vs. ulimit of 5500) | | modules | C:
[GitHub] [hadoop] hadoop-yetus commented on pull request #5676: YARN-6648. BackPort [GPG] Add SubClusterCleaner in Global Policy Generator.
hadoop-yetus commented on PR #5676: URL: https://github.com/apache/hadoop/pull/5676#issuecomment-1556249581 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 41s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 24m 25s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 22m 46s | | trunk passed | | +1 :green_heart: | compile | 8m 18s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 7m 37s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 1m 47s | | trunk passed | | +1 :green_heart: | mvnsite | 5m 41s | | trunk passed | | +1 :green_heart: | javadoc | 6m 15s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 5m 16s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 17m 43s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 15s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 4m 48s | | the patch passed | | +1 :green_heart: | compile | 7m 36s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 7m 36s | | the patch passed | | +1 :green_heart: | compile | 6m 56s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 6m 55s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 43s | | the patch passed | | +1 :green_heart: | mvnsite | 5m 25s | | the patch passed | | +1 :green_heart: | javadoc | 5m 42s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 4m 51s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 18m 20s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 30s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 234m 30s | | hadoop-yarn in the patch passed. | | +1 :green_heart: | unit | 1m 18s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 46s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 3m 26s | | hadoop-yarn-server-common in the patch passed. | | +1 :green_heart: | unit | 0m 37s | | hadoop-yarn-server-globalpolicygenerator in the patch passed. | | +1 :green_heart: | asflicense | 1m 4s | | The patch does not generate ASF License warnings. 
| | | | 454m 15s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5676/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5676 | | Optional Tests | dupname asflicense codespell detsecrets xmllint compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle | | uname | Linux 8d6c41859c83 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 9742423dd4aaad62d8c0d24eac33206ef2ad2367 | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5676/3/testReport/ | | Max. process+thread count | 2775 (vs. ulimit of
[GitHub] [hadoop] ashutoshcipher commented on pull request #5028: MAPREDUCE-7419. Upgrade Junit 4 to 5 in hadoop-mapreduce-client-common
ashutoshcipher commented on PR #5028: URL: https://github.com/apache/hadoop/pull/5028#issuecomment-1556234576 > Good to go from my side. I don't think the test failures are related, double check once. If all safe can revert the changes done in the pom and licence to trigger the tests. We can merge post that. Thanks @ayushtkn for reviewing it. I checked as well, and the test failures don't look related. I have also reverted the pom and licence changes that were made to trigger the tests. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #5680: MAPREDUCE-7438 : Support removal of only selective node states in untracked removal flow
hadoop-yetus commented on PR #5680: URL: https://github.com/apache/hadoop/pull/5680#issuecomment-1556228795 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 53s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 54s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 19m 48s | | trunk passed | | +1 :green_heart: | compile | 6m 53s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 6m 23s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 1m 50s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 59s | | trunk passed | | +1 :green_heart: | javadoc | 3m 0s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 2m 47s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 5m 59s | | trunk passed | | +1 :green_heart: | shadedclient | 21m 48s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 28s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 52s | | the patch passed | | +1 :green_heart: | compile | 6m 19s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 6m 19s | | the patch passed | | +1 :green_heart: | compile | 6m 20s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 6m 20s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 38s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5680/1/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn.txt) | hadoop-yarn-project/hadoop-yarn: The patch generated 6 new + 197 unchanged - 0 fixed = 203 total (was 197) | | +1 :green_heart: | mvnsite | 2m 45s | | the patch passed | | +1 :green_heart: | javadoc | 2m 39s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 2m 31s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 6m 6s | | the patch passed | | +1 :green_heart: | shadedclient | 21m 47s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 1m 11s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 38s | | hadoop-yarn-common in the patch passed. | | -1 :x: | unit | 102m 3s | [/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5680/1/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt) | hadoop-yarn-server-resourcemanager in the patch failed.
| | +1 :green_heart: | asflicense | 0m 57s | | The patch does not generate ASF License warnings. | | | | 253m 16s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.yarn.server.resourcemanager.reservation.TestCapacityOverTimePolicy | | | hadoop.yarn.server.resourcemanager.TestResourceTrackerService | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5680/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5680 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux 1194573d80af 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64
[jira] [Resolved] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Gautham Banasandra resolved HADOOP-18746. - Fix Version/s: 3.4.0 Target Version/s: 3.4.0 Resolution: Fixed Merged PR https://github.com/apache/hadoop/pull/5679 to trunk. > Install Python 3 for Windows 10 docker image > > > Key: HADOOP-18746 > URL: https://issues.apache.org/jira/browse/HADOOP-18746 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 > Environment: Windows 10 >Reporter: Gautham Banasandra >Assignee: Gautham Banasandra >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > > Currently, mvnsite build phase fails due to the following error - > {code} > [INFO] --< org.apache.hadoop:hadoop-common > >--- > [INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT > [11/114] > [INFO] [ jar > ]- > [INFO] > [INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common --- > [INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown (includes = > [UnixShellAPI.md], excludes = []) > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = > [configuration.xsl, core-default.xml], excludes = []) > [INFO] > [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common --- > tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to > 'test-patch.sh': No such file or directory > tar: Exiting with failure status due to previous errors > /usr/bin/env: 'python3': No such file or directory > {code} > Thus, we need to install Python 3 in the Windows 10 Hadoop builder docker > image to fix this. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724682#comment-17724682 ] ASF GitHub Bot commented on HADOOP-18746: - GauthamBanasandra merged PR #5679: URL: https://github.com/apache/hadoop/pull/5679 > Install Python 3 for Windows 10 docker image > > > Key: HADOOP-18746 > URL: https://issues.apache.org/jira/browse/HADOOP-18746 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 > Environment: Windows 10 >Reporter: Gautham Banasandra >Assignee: Gautham Banasandra >Priority: Major > Labels: pull-request-available > > Currently, mvnsite build phase fails due to the following error - > {code} > [INFO] --< org.apache.hadoop:hadoop-common > >--- > [INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT > [11/114] > [INFO] [ jar > ]- > [INFO] > [INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common --- > [INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown (includes = > [UnixShellAPI.md], excludes = []) > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = > [configuration.xsl, core-default.xml], excludes = []) > [INFO] > [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common --- > tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to > 'test-patch.sh': No such file or directory > tar: Exiting with failure status due to previous errors > /usr/bin/env: 'python3': No such file or directory > {code} > Thus, we need to install Python 3 in the Windows 10 Hadoop builder docker > image to fix this. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724683#comment-17724683 ] ASF subversion and git services commented on HADOOP-18746: -- Commit afe850ca2c283791890ba3450f2c2ae2559186d9 in hadoop's branch refs/heads/trunk from Gautham B A [ https://gitbox.apache.org/repos/asf?p=hadoop.git;h=afe850ca2c2 ] HADOOP-18746. Install Python 3 for Windows 10 docker image (#5679) * This PR installs Python 3.10.11 for Windows 10 Docker image to fix the issue with building mvnsite. * After installing Python 3.10.11, it creates the hardlink python -> python3 as required by the script. > Install Python 3 for Windows 10 docker image > > > Key: HADOOP-18746 > URL: https://issues.apache.org/jira/browse/HADOOP-18746 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 > Environment: Windows 10 >Reporter: Gautham Banasandra >Assignee: Gautham Banasandra >Priority: Major > Labels: pull-request-available > > Currently, mvnsite build phase fails due to the following error - > {code} > [INFO] --< org.apache.hadoop:hadoop-common > >--- > [INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT > [11/114] > [INFO] [ jar > ]- > [INFO] > [INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common --- > [INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown (includes = > [UnixShellAPI.md], excludes = []) > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = > [configuration.xsl, core-default.xml], excludes = []) > [INFO] > [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common --- > tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to > 'test-patch.sh': No such file or directory > tar: Exiting with failure status due to previous errors > /usr/bin/env: 'python3': No such file or directory > {code} > Thus, 
we need to install Python 3 in the Windows 10 Hadoop builder docker > image to fix this. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
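The change described in the commit message above could be sketched as a Dockerfile fragment like the following. This is a hypothetical sketch, not the actual PR content: the download URL, installer flags, and install paths are assumptions; only the Python version (3.10.11) and the python/python3 hardlink idea come from the commit message.

```dockerfile
# Hypothetical sketch: install Python 3.10.11 on the Windows 10 builder
# image so that Yetus scripts invoking "python3" stop failing.
RUN powershell -Command \
    "Invoke-WebRequest -Uri 'https://www.python.org/ftp/python/3.10.11/python-3.10.11-amd64.exe' \
        -OutFile 'C:\\python-installer.exe'; \
     Start-Process 'C:\\python-installer.exe' \
        -ArgumentList '/quiet InstallAllUsers=1 PrependPath=1' -Wait"

# The Windows installer only provides python.exe, while the build scripts
# call python3; create a hardlink so both names resolve (paths assumed).
RUN powershell -Command \
    "New-Item -ItemType HardLink \
        -Path 'C:\\Program Files\\Python310\\python3.exe' \
        -Target 'C:\\Program Files\\Python310\\python.exe'"
```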
[GitHub] [hadoop] GauthamBanasandra merged pull request #5679: HADOOP-18746. Install Python 3 for Windows 10 docker image
GauthamBanasandra merged PR #5679: URL: https://github.com/apache/hadoop/pull/5679 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] lfxy commented on pull request #5561: HDFS-16983. Whether checking path access permissions should be decided by dfs.permissions.enabled in concat operation
lfxy commented on PR #5561: URL: https://github.com/apache/hadoop/pull/5561#issuecomment-1556199483 @Hexiaoqiao OK, I have added the NPE check. Thanks for your review. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
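For context, the behavior HDFS-16983 discusses is gating concat's per-path access check on `dfs.permissions.enabled`. A minimal standalone sketch of that gate follows; the class and method names here are illustrative stand-ins, not Hadoop's actual FSNamesystem internals, and the null guard merely mirrors the kind of NPE check mentioned in the comment above.

```java
// Illustrative sketch (not real Hadoop code): skip per-source access
// checks entirely when dfs.permissions.enabled is false.
public class ConcatPermissionGate {
    private final boolean permissionsEnabled; // dfs.permissions.enabled

    public ConcatPermissionGate(boolean permissionsEnabled) {
        this.permissionsEnabled = permissionsEnabled;
    }

    /** Returns true when every source path may be concatenated. */
    public boolean mayConcat(String[] srcs) {
        if (!permissionsEnabled) {
            return true; // permission checking disabled cluster-wide
        }
        for (String src : srcs) {
            if (src == null || !hasWriteAccess(src)) {
                return false; // null guard: avoid an NPE on bad input
            }
        }
        return true;
    }

    // Stand-in for a real checkPathAccess(...) style call.
    private boolean hasWriteAccess(String src) {
        return !src.startsWith("/restricted");
    }
}
```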
[GitHub] [hadoop] hfutatzhanghb commented on pull request #5423: HDFS-16882. RBF: Add cache hit rate metric in MountTableResolver#getDestinationForPath
hfutatzhanghb commented on PR #5423: URL: https://github.com/apache/hadoop/pull/5423#issuecomment-1556177990 @ayushtkn, Sir, could you please help review this code? Thanks. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724657#comment-17724657 ] ASF GitHub Bot commented on HADOOP-18746: - ayushtkn commented on code in PR #5679: URL: https://github.com/apache/hadoop/pull/5679#discussion_r1199764494 ## dev-support/docker/Dockerfile_windows_10: ## @@ -108,6 +108,12 @@ RUN powershell Copy-Item -Path "C:\LibXXHash\usr\bin\*.dll" -Destination "C:\Pro RUN powershell Copy-Item -Path "C:\LibZStd\usr\bin\*.dll" -Destination "C:\Program` Files\Git\usr\bin" RUN powershell Copy-Item -Path "C:\RSync\usr\bin\*" -Destination "C:\Program` Files\Git\usr\bin" +# Install Python 3.10.11. Review Comment: nopes, just wanted to know, if we have any challenges with the higher versions. Good with me > Install Python 3 for Windows 10 docker image > > > Key: HADOOP-18746 > URL: https://issues.apache.org/jira/browse/HADOOP-18746 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 > Environment: Windows 10 >Reporter: Gautham Banasandra >Assignee: Gautham Banasandra >Priority: Major > Labels: pull-request-available > > Currently, mvnsite build phase fails due to the following error - > {code} > [INFO] --< org.apache.hadoop:hadoop-common > >--- > [INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT > [11/114] > [INFO] [ jar > ]- > [INFO] > [INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common --- > [INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown (includes = > [UnixShellAPI.md], excludes = []) > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = > [configuration.xsl, core-default.xml], excludes = []) > [INFO] > [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common --- > tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to > 'test-patch.sh': No such file or 
directory > tar: Exiting with failure status due to previous errors > /usr/bin/env: 'python3': No such file or directory > {code} > Thus, we need to install Python 3 in the Windows 10 Hadoop builder docker > image to fix this. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] ayushtkn commented on a diff in pull request #5679: HADOOP-18746. Install Python 3 for Windows 10 docker image
ayushtkn commented on code in PR #5679: URL: https://github.com/apache/hadoop/pull/5679#discussion_r1199764494 ## dev-support/docker/Dockerfile_windows_10: ## @@ -108,6 +108,12 @@ RUN powershell Copy-Item -Path "C:\LibXXHash\usr\bin\*.dll" -Destination "C:\Pro RUN powershell Copy-Item -Path "C:\LibZStd\usr\bin\*.dll" -Destination "C:\Program` Files\Git\usr\bin" RUN powershell Copy-Item -Path "C:\RSync\usr\bin\*" -Destination "C:\Program` Files\Git\usr\bin" +# Install Python 3.10.11. Review Comment: nopes, just wanted to know, if we have any challenges with the higher versions. Good with me -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] mudit1289 opened a new pull request, #5680: MAPREDUCE-7438 : Support removal of only selective node states in untracked removal flow
mudit1289 opened a new pull request, #5680: URL: https://github.com/apache/hadoop/pull/5680 ASF JIRA: https://issues.apache.org/jira/browse/MAPREDUCE-7438 Adds a config to restrict untracked-node removal to nodes in selected node states. Config: yarn.resourcemanager.node-removal-untracked.node-selective-states-to-remove If YARN untracked-node removal is enabled, this config controls which node states are eligible for removal. If an untracked node is not in one of these states, the node will be skipped for removal. If this config value is set to empty, all node states will be eligible for removal. NodeState is an enum: org.apache.hadoop.yarn.api.records.NodeState ### How was this patch tested? Added 1 new UT in TestResourceTrackerService.java -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
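Based on the PR description above, wiring up the new setting might look like the following yarn-site.xml fragment. This is a sketch: only the node-selective-states property name comes from this PR; the timeout property shown alongside it is the pre-existing untracked-removal setting and is included here as an assumption, as is the example value.

```xml
<!-- Sketch of a yarn-site.xml fragment for the feature described above. -->
<configuration>
  <property>
    <!-- Existing setting (assumed): remove untracked nodes after this
         many milliseconds. -->
    <name>yarn.resourcemanager.node-removal-untracked.timeout-ms</name>
    <value>60000</value>
  </property>
  <property>
    <!-- New in MAPREDUCE-7438: only untracked nodes in these NodeState
         values are removed; an empty value makes every state eligible. -->
    <name>yarn.resourcemanager.node-removal-untracked.node-selective-states-to-remove</name>
    <value>DECOMMISSIONED,SHUTDOWN</value>
  </property>
</configuration>
```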
[GitHub] [hadoop] GauthamBanasandra commented on a diff in pull request #5679: HADOOP-18746. Install Python 3 for Windows 10 docker image
GauthamBanasandra commented on code in PR #5679: URL: https://github.com/apache/hadoop/pull/5679#discussion_r1199758698 ## dev-support/docker/Dockerfile_windows_10: ## @@ -108,6 +108,12 @@ RUN powershell Copy-Item -Path "C:\LibXXHash\usr\bin\*.dll" -Destination "C:\Pro RUN powershell Copy-Item -Path "C:\LibZStd\usr\bin\*.dll" -Destination "C:\Program` Files\Git\usr\bin" RUN powershell Copy-Item -Path "C:\RSync\usr\bin\*" -Destination "C:\Program` Files\Git\usr\bin" +# Install Python 3.10.11. Review Comment: I normally choose a version that's less recent so that it's mature and has got the appropriate fixes (if any that were discovered upon release). 1. `3.10.11` -> Would've gone through 11 iterations after release, hopefully containing the bug fixes from the time when `3.10.0` was released. 2. `3.12.0` -> Is just the first release and could potentially have some bugs. This is just a practice that I follow. I don't mind bumping up the version to `3.12.0` if you insist. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724645#comment-17724645 ] ASF GitHub Bot commented on HADOOP-18746: - GauthamBanasandra commented on code in PR #5679: URL: https://github.com/apache/hadoop/pull/5679#discussion_r1199758698 ## dev-support/docker/Dockerfile_windows_10: ## @@ -108,6 +108,12 @@ RUN powershell Copy-Item -Path "C:\LibXXHash\usr\bin\*.dll" -Destination "C:\Pro RUN powershell Copy-Item -Path "C:\LibZStd\usr\bin\*.dll" -Destination "C:\Program` Files\Git\usr\bin" RUN powershell Copy-Item -Path "C:\RSync\usr\bin\*" -Destination "C:\Program` Files\Git\usr\bin" +# Install Python 3.10.11. Review Comment: I normally choose a version that's less recent so that it's mature and has got the appropriate fixes (if any that were discovered upon release). 1. `3.10.11` -> Would've gone through 11 iterations after release, hopefully containing the bug fixes from the time when `3.10.0` was released. 2. `3.12.0` -> Is just the first release and could potentially have some bugs. This is just a practice that I follow. I don't mind bumping up the version to `3.12.0` if you insist. 
> Install Python 3 for Windows 10 docker image > > > Key: HADOOP-18746 > URL: https://issues.apache.org/jira/browse/HADOOP-18746 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 > Environment: Windows 10 >Reporter: Gautham Banasandra >Assignee: Gautham Banasandra >Priority: Major > Labels: pull-request-available > > Currently, mvnsite build phase fails due to the following error - > {code} > [INFO] --< org.apache.hadoop:hadoop-common > >--- > [INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT > [11/114] > [INFO] [ jar > ]- > [INFO] > [INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common --- > [INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown (includes = > [UnixShellAPI.md], excludes = []) > [INFO] Deleting > C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = > [configuration.xsl, core-default.xml], excludes = []) > [INFO] > [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common --- > tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to > 'test-patch.sh': No such file or directory > tar: Exiting with failure status due to previous errors > /usr/bin/env: 'python3': No such file or directory > {code} > Thus, we need to install Python 3 in the Windows 10 Hadoop builder docker > image to fix this. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #5636: YARN-11492. Improve createJerseyClient#setConnectTimeout Code.
hadoop-yetus commented on PR #5636: URL: https://github.com/apache/hadoop/pull/5636#issuecomment-1556169512 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 37s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | -1 :x: | mvninstall | 43m 46s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5636/6/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. | | +1 :green_heart: | compile | 0m 32s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 0m 28s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 0m 33s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 34s | | trunk passed | | +1 :green_heart: | javadoc | 0m 45s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 29s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 1m 6s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 47s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 22s | | the patch passed | | +1 :green_heart: | compile | 0m 24s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 0m 24s | | the patch passed | | +1 :green_heart: | compile | 0m 21s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 0m 21s | | the patch passed | | +1 :green_heart: | blanks | 0m 1s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 17s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 22s | | the patch passed | | +1 :green_heart: | javadoc | 0m 23s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 20s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 0m 54s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 29s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 0m 29s | | hadoop-yarn-server-router in the patch passed. | | +1 :green_heart: | asflicense | 0m 37s | | The patch does not generate ASF License warnings. 
| | | | 97m 21s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5636/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5636 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux e4c3d2204b76 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / faf8f3dbba3fefe49446a98f92845bf905e3d619 | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5636/6/testReport/ | | Max. process+thread count | 562 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5636/6/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] ayushtkn commented on pull request #5028: MAPREDUCE-7419. Upgrade Junit 4 to 5 in hadoop-mapreduce-client-common
ayushtkn commented on PR #5028: URL: https://github.com/apache/hadoop/pull/5028#issuecomment-1556159707 Good to go from my side. I don't think the test failures are related, but double-check once. If all is safe, you can revert the changes made to the pom and licence to trigger the tests. We can merge after that.
[GitHub] [hadoop] hfutatzhanghb commented on pull request #5353: HDFS-16909. Improve ReplicaMap#mergeAll method.
hfutatzhanghb commented on PR #5353: URL: https://github.com/apache/hadoop/pull/5353#issuecomment-1556158848 Thanks sir, thanks shuyan.
[jira] [Commented] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724629#comment-17724629 ] ASF GitHub Bot commented on HADOOP-18746: - ayushtkn commented on code in PR #5679: URL: https://github.com/apache/hadoop/pull/5679#discussion_r1199750794 ## dev-support/docker/Dockerfile_windows_10: ## @@ -108,6 +108,12 @@ RUN powershell Copy-Item -Path "C:\LibXXHash\usr\bin\*.dll" -Destination "C:\Pro RUN powershell Copy-Item -Path "C:\LibZStd\usr\bin\*.dll" -Destination "C:\Program` Files\Git\usr\bin" RUN powershell Copy-Item -Path "C:\RSync\usr\bin\*" -Destination "C:\Program` Files\Git\usr\bin" +# Install Python 3.10.11. Review Comment: curious any specific reason for using 3.10.11? If I see here: https://www.python.org/ftp/python/ we have 3.12.0 as well
[GitHub] [hadoop] ayushtkn commented on a diff in pull request #5679: HADOOP-18746. Install Python 3 for Windows 10 docker image
ayushtkn commented on code in PR #5679: URL: https://github.com/apache/hadoop/pull/5679#discussion_r1199750794 ## dev-support/docker/Dockerfile_windows_10: ## @@ -108,6 +108,12 @@ RUN powershell Copy-Item -Path "C:\LibXXHash\usr\bin\*.dll" -Destination "C:\Pro RUN powershell Copy-Item -Path "C:\LibZStd\usr\bin\*.dll" -Destination "C:\Program` Files\Git\usr\bin" RUN powershell Copy-Item -Path "C:\RSync\usr\bin\*" -Destination "C:\Program` Files\Git\usr\bin" +# Install Python 3.10.11. Review Comment: curious any specific reason for using 3.10.11? If I see here: https://www.python.org/ftp/python/ we have 3.12.0 as well
[GitHub] [hadoop] Hexiaoqiao commented on a diff in pull request #5561: HDFS-16983. Whether checking path access permissions should be decided by dfs.permissions.enabled in concat operation
Hexiaoqiao commented on code in PR #5561: URL: https://github.com/apache/hadoop/pull/5561#discussion_r1199749665 ## hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/namenode/FSDirConcatOp.java: ## @@ -121,7 +121,7 @@ private static INodeFile[] verifySrcFiles(FSDirectory fsd, String[] srcs, for(String src : srcs) { final INodesInPath iip = fsd.resolvePath(pc, src, DirOp.WRITE); // permission check for srcs - if (pc != null) { + if (fsd.isPermissionEnabled()) { Review Comment: `if (fsd.isPermissionEnabled())` should be `if (pc != null && fsd.isPermissionEnabled())` here?
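The reviewer's point can be shown with a minimal, self-contained sketch. The class and method below are hypothetical stand-ins, not the real HDFS FSDirConcatOp code: combining the null check with the permission flag preserves the old null-safety of `pc != null` while still honoring dfs.permissions.enabled.

```java
// Hypothetical stand-in for the guard discussed above; not the real HDFS classes.
class PermissionGuardSketch {

    // Mirrors the reviewer's proposed condition: run the permission check only
    // when a permission checker exists AND permission checking is enabled.
    static boolean shouldCheckPermissions(Object permissionChecker,
                                          boolean permissionsEnabled) {
        return permissionChecker != null && permissionsEnabled;
    }

    public static void main(String[] args) {
        // A null checker must never trigger a permission check (avoids an NPE).
        System.out.println(shouldCheckPermissions(null, true));          // false
        // Both conditions hold: the check runs.
        System.out.println(shouldCheckPermissions(new Object(), true));  // true
        // Permissions disabled: the check is skipped.
        System.out.println(shouldCheckPermissions(new Object(), false)); // false
    }
}
```

Checking only `fsd.isPermissionEnabled()` would dereference a null `pc` when permissions are enabled but no checker was supplied, which is exactly the case the original `pc != null` guard protected against.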
[GitHub] [hadoop] Hexiaoqiao commented on pull request #5353: HDFS-16909. Improve ReplicaMap#mergeAll method.
Hexiaoqiao commented on PR #5353: URL: https://github.com/apache/hadoop/pull/5353#issuecomment-1556150811 Committed to trunk. Thanks @hfutatzhanghb and @zhangshuyan0 .
[GitHub] [hadoop] Hexiaoqiao merged pull request #5353: HDFS-16909. Improve ReplicaMap#mergeAll method.
Hexiaoqiao merged PR #5353: URL: https://github.com/apache/hadoop/pull/5353
[GitHub] [hadoop] Hexiaoqiao commented on pull request #5643: HDFS-17003. Erasure coding: invalidate wrong block after reporting bad blocks from datanode
Hexiaoqiao commented on PR #5643: URL: https://github.com/apache/hadoop/pull/5643#issuecomment-1556147416 > Please use a capital letter at the beginning of the sentences and period at the end of it for all annotation. Just a suggestion to improve all added comments following this convention, if necessary (including some other PRs you have submitted). Please also check the checkstyle warnings and the failed unit tests to see whether they are related to this change. Thanks.
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #5676: YARN-6648. BackPort [GPG] Add SubClusterCleaner in Global Policy Generator.
slfan1989 commented on code in PR #5676: URL: https://github.com/apache/hadoop/pull/5676#discussion_r1199742041 ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api/src/main/java/org/apache/hadoop/yarn/conf/YarnConfiguration.java: ## @@ -4326,6 +4326,24 @@ public static boolean isAclEnabled(Configuration conf) { public static final boolean DEFAULT_ROUTER_WEBAPP_PARTIAL_RESULTS_ENABLED = false; + private static final String FEDERATION_GPG_PREFIX = + FEDERATION_PREFIX + "gpg."; + + // The number of threads to use for the GPG scheduled executor service + public static final String GPG_SCHEDULED_EXECUTOR_THREADS = + FEDERATION_GPG_PREFIX + "scheduled.executor.threads"; + public static final int DEFAULT_GPG_SCHEDULED_EXECUTOR_THREADS = 10; + + // The interval at which the subcluster cleaner runs, -1 means disabled + public static final String GPG_SUBCLUSTER_CLEANER_INTERVAL_MS = + FEDERATION_GPG_PREFIX + "subcluster.cleaner.interval-ms"; + public static final long DEFAULT_GPG_SUBCLUSTER_CLEANER_INTERVAL_MS = -1; + + // The expiration time for a subcluster heartbeat, default is 30 minutes + public static final String GPG_SUBCLUSTER_EXPIRATION_MS = + FEDERATION_GPG_PREFIX + "subcluster.heartbeat.expiration-ms"; + public static final long DEFAULT_GPG_SUBCLUSTER_EXPIRATION_MS = 180; Review Comment: Thank you for helping to review the code, I will improve the code. ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-globalpolicygenerator/src/main/java/org/apache/hadoop/yarn/server/globalpolicygenerator/subclustercleaner/SubClusterCleaner.java: ## @@ -0,0 +1,112 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hadoop.yarn.server.globalpolicygenerator.subclustercleaner; + +import java.util.Date; +import java.util.Map; + +import org.apache.commons.lang.time.DurationFormatUtils; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.yarn.conf.YarnConfiguration; +import org.apache.hadoop.yarn.server.federation.store.records.SubClusterId; +import org.apache.hadoop.yarn.server.federation.store.records.SubClusterInfo; +import org.apache.hadoop.yarn.server.federation.store.records.SubClusterState; +import org.apache.hadoop.yarn.server.globalpolicygenerator.GPGContext; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +/** + * The sub-cluster cleaner is one of the GPG's services that periodically checks + * the membership table in FederationStateStore and mark sub-clusters that have + * not sent a heartbeat in certain amount of time as LOST. + */ +public class SubClusterCleaner implements Runnable { + + private static final Logger LOG = + LoggerFactory.getLogger(SubClusterCleaner.class); + + private GPGContext gpgContext; + private long heartbeatExpirationMillis; + + /** + * The sub-cluster cleaner runnable is invoked by the sub cluster cleaner + * service to check the membership table and remove sub clusters that have not + * sent a heart beat in some amount of time. + * + * @param conf configuration. + * @param gpgContext GPGContext. 
+ */ + public SubClusterCleaner(Configuration conf, GPGContext gpgContext) { +this.heartbeatExpirationMillis = +conf.getLong(YarnConfiguration.GPG_SUBCLUSTER_EXPIRATION_MS, Review Comment: I will improve the code.
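For reference, the three constants added in the diff above would correspond to yarn-site.xml entries like the following. This sketch assumes FEDERATION_PREFIX resolves to "yarn.federation." (as in trunk's YarnConfiguration); the values shown are only illustrative, not the patch's defaults.

```xml
<!-- Illustrative values; property names derived from the diff above,
     assuming FEDERATION_PREFIX = "yarn.federation." -->
<property>
  <name>yarn.federation.gpg.scheduled.executor.threads</name>
  <value>10</value>
</property>
<property>
  <!-- -1 disables the sub-cluster cleaner -->
  <name>yarn.federation.gpg.subcluster.cleaner.interval-ms</name>
  <value>-1</value>
</property>
<property>
  <!-- heartbeat expiration of 30 minutes, expressed in milliseconds -->
  <name>yarn.federation.gpg.subcluster.heartbeat.expiration-ms</name>
  <value>1800000</value>
</property>
```

Note that 30 minutes is 1,800,000 ms, which is the mismatch the reviewer is flagging against the quoted default of 180.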
[jira] [Commented] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724603#comment-17724603 ] ASF GitHub Bot commented on HADOOP-18746: - hadoop-yetus commented on PR #5679: URL: https://github.com/apache/hadoop/pull/5679#issuecomment-1556128063 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 16m 56s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 16m 6s | | Maven dependency ordering for branch | | +1 :green_heart: | shadedclient | 38m 23s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | shadedclient | 19m 20s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 0m 37s | | The patch does not generate ASF License warnings. | | | | 78m 27s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5679/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5679 | | Optional Tests | dupname asflicense | | uname | Linux f698310145f3 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 631c73959fc311d01665de491c461b48fdbb619a | | Max. process+thread count | 557 (vs. 
ulimit of 5500) | | modules | C: U: | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5679/1/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #5679: HADOOP-18746. Install Python 3 for Windows 10 docker image
hadoop-yetus commented on PR #5679: URL: https://github.com/apache/hadoop/pull/5679#issuecomment-1556128063 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 16m 56s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 16m 6s | | Maven dependency ordering for branch | | +1 :green_heart: | shadedclient | 38m 23s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | shadedclient | 19m 20s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 0m 37s | | The patch does not generate ASF License warnings. | | | | 78m 27s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5679/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5679 | | Optional Tests | dupname asflicense | | uname | Linux f698310145f3 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 631c73959fc311d01665de491c461b48fdbb619a | | Max. process+thread count | 557 (vs. ulimit of 5500) | | modules | C: U: | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5679/1/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[jira] [Commented] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724596#comment-17724596 ] ASF GitHub Bot commented on HADOOP-18746: - hadoop-yetus commented on PR #5679: URL: https://github.com/apache/hadoop/pull/5679#issuecomment-1556112446 (!) A patch to the testing environment has been detected. Re-executing against the patched versions to perform further tests. The console is at https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5679/1/console in case of problems.
[GitHub] [hadoop] hadoop-yetus commented on pull request #5679: HADOOP-18746. Install Python 3 for Windows 10 docker image
hadoop-yetus commented on PR #5679: URL: https://github.com/apache/hadoop/pull/5679#issuecomment-1556112446 (!) A patch to the testing environment has been detected. Re-executing against the patched versions to perform further tests. The console is at https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5679/1/console in case of problems.
[jira] [Commented] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17724595#comment-17724595 ] ASF GitHub Bot commented on HADOOP-18746: - GauthamBanasandra opened a new pull request, #5679: URL: https://github.com/apache/hadoop/pull/5679 ### Description of PR Currently, mvnsite build phase fails due to the following error - ``` [INFO] --< org.apache.hadoop:hadoop-common >--- [INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT [11/114] [INFO] [ jar ]- [INFO] [INFO] -
[jira] [Updated] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-18746: Labels: pull-request-available (was: )
[GitHub] [hadoop] GauthamBanasandra opened a new pull request, #5679: HADOOP-18746. Install Python 3 for Windows 10 docker image
GauthamBanasandra opened a new pull request, #5679: URL: https://github.com/apache/hadoop/pull/5679

### Description of PR

Currently, the mvnsite build phase fails due to the following error -

```
[INFO] --< org.apache.hadoop:hadoop-common >---
[INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT [11/114]
[INFO] [ jar ]-
[INFO]
[INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common ---
[INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target
[INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown (includes = [UnixShellAPI.md], excludes = [])
[INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = [configuration.xsl, core-default.xml], excludes = [])
[INFO]
[INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common ---
tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to 'test-patch.sh': No such file or directory
tar: Exiting with failure status due to previous errors
/usr/bin/env: 'python3': No such file or directory
```

This PR installs Python 3 in the Windows 10 Hadoop builder docker image to fix this.

### How was this patch tested?

Jenkins CI. Build results -
1. Without this PR - [patch-mvnsite-root-fail.txt](https://github.com/apache/hadoop/files/11523971/patch-mvnsite-root-fail.txt)
2. With this PR - [patch-mvnsite-root-pass.txt](https://github.com/apache/hadoop/files/11523970/patch-mvnsite-root-pass.txt)

### For code changes:

- [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
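The thread does not show the Dockerfile change from PR #5679 itself. As a purely hypothetical sketch of the kind of change described (base image tag, file path, and the use of Chocolatey are all assumptions, not confirmed by this thread), installing Python 3 in a Windows container Dockerfile might look like:

```dockerfile
# Hypothetical sketch -- NOT the actual change from PR #5679.
# Assumes a Windows base image and that Chocolatey is already installed
# earlier in the builder image.
FROM mcr.microsoft.com/windows:20H2

SHELL ["powershell", "-NoProfile", "-Command"]

# Install Python 3 so that tools invoked as `/usr/bin/env python3`
# (e.g. shelldocs during the mvnsite phase) can resolve an interpreter.
RUN choco install python3 --yes --no-progress
```

The key point from the issue is only that a `python3` executable must be resolvable inside the builder image; how it is installed is an implementation detail.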
[jira] [Created] (HADOOP-18746) Install Python 3 for Windows 10 docker image
Gautham Banasandra created HADOOP-18746:

Summary: Install Python 3 for Windows 10 docker image
Key: HADOOP-18746
URL: https://issues.apache.org/jira/browse/HADOOP-18746
Project: Hadoop Common
Issue Type: Bug
Components: build
Affects Versions: 3.4.0
Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra

Currently, the mvnsite build phase fails during the exec-maven-plugin shelldocs step with the following error (full log in the PR description above) -

{code}
tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to 'test-patch.sh': No such file or directory
tar: Exiting with failure status due to previous errors
/usr/bin/env: 'python3': No such file or directory
{code}

Thus, we need to install Python 3 in the Windows 10 Hadoop builder docker image to fix this.
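The failure boils down to `/usr/bin/env` not finding a `python3` on the `PATH`. A generic diagnostic sketch (not part of the Hadoop build scripts) that reproduces the same lookup shelldocs relies on:

```shell
#!/bin/sh
# Check whether `python3` resolves the same way `/usr/bin/env python3` would.
if command -v python3 >/dev/null 2>&1; then
  echo "python3 resolved to: $(command -v python3)"
else
  # This is the condition behind the shelldocs failure above.
  echo "python3 missing: /usr/bin/env python3 would fail with 'No such file or directory'" >&2
fi
```

Running this inside the builder image before and after the fix makes the effect of installing Python 3 directly observable.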
[jira] [Updated] (HADOOP-18746) Install Python 3 for Windows 10 docker image
[ https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Gautham Banasandra updated HADOOP-18746: Language: powershell Docker (was: powershell)
[GitHub] [hadoop] hadoop-yetus commented on pull request #5028: MAPREDUCE-7419. Upgrade Junit 4 to 5 in hadoop-mapreduce-client-common
hadoop-yetus commented on PR #5028: URL: https://github.com/apache/hadoop/pull/5028#issuecomment-1556097662

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 17m 18s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s | | xmllint was not available. |
| +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 12 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 15m 40s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 22m 27s | | trunk passed |
| +1 :green_heart: | compile | 17m 24s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | compile | 15m 50s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | checkstyle | 4m 3s | | trunk passed |
| +1 :green_heart: | mvnsite | 20m 47s | | trunk passed |
| +1 :green_heart: | javadoc | 8m 23s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 6m 55s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +0 :ok: | spotbugs | 0m 16s | | branch/hadoop-project no spotbugs output file (spotbugsXml.xml) |
| +1 :green_heart: | shadedclient | 52m 35s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 24s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 20m 1s | | the patch passed |
| +1 :green_heart: | compile | 16m 49s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javac | 16m 49s | | the patch passed |
| +1 :green_heart: | compile | 15m 45s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | javac | 15m 45s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 3m 55s | | root: The patch generated 0 new + 47 unchanged - 35 fixed = 47 total (was 82) |
| +1 :green_heart: | mvnsite | 13m 23s | | the patch passed |
| +1 :green_heart: | shellcheck | 0m 0s | | No new issues. |
| +1 :green_heart: | javadoc | 8m 11s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 7m 1s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +0 :ok: | spotbugs | 0m 16s | | hadoop-project has no data from spotbugs |
| +1 :green_heart: | shadedclient | 53m 4s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| -1 :x: | unit | 755m 4s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5028/7/artifact/out/patch-unit-root.txt) | root in the patch passed. |
| +1 :green_heart: | asflicense | 1m 17s | | The patch does not generate ASF License warnings. |
| | | 1084m 54s | | |

| Reason | Tests |
|------:|:------|
| Failed junit tests | hadoop.mapreduce.v2.TestUberAM |
| | hadoop.mapreduce.v2.TestMRJobsWithProfiler |
| | hadoop.mapreduce.v2.TestMRJobs |
| | hadoop.yarn.server.resourcemanager.reservation.TestCapacityOverTimePolicy |
| | hadoop.yarn.server.applicationhistoryservice.webapp.TestAHSWebServices |
| | hadoop.hdfs.server.datanode.TestDirectoryScanner |
| | hadoop.fs.http.server.TestHttpFSServerNoXAttrs |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5028/7/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/5028 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle shellcheck shelldocs |
| uname | Linux 4970ce9b1e2a 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git