[jira] [Commented] (HADOOP-18761) Revert HADOOP-18535 because mysql-connector-java is GPL
[ https://issues.apache.org/jira/browse/HADOOP-18761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730814#comment-17730814 ]

ASF GitHub Bot commented on HADOOP-18761:
-----------------------------------------

omalley commented on PR #5724:
URL: https://github.com/apache/hadoop/pull/5724#issuecomment-1584012490

   -1 we don't need this patch and it wasn't done correctly.

> Revert HADOOP-18535 because mysql-connector-java is GPL
> -------------------------------------------------------
>
>                 Key: HADOOP-18761
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18761
>             Project: Hadoop Common
>          Issue Type: Task
>            Reporter: Wei-Chiu Chuang
>            Priority: Blocker
>              Labels: pull-request-available
>
> While preparing for 3.3.6 RC, I realized the mysql-connector-java dependency added by HADOOP-18535 is GPL licensed.
> Source: https://github.com/mysql/mysql-connector-j/blob/release/8.0/LICENSE
> See legal discussion at LEGAL-423.
> I looked at the original jira and github PR and I don't think the license issue was noticed.
> Is it possible to get rid of the mysql connector dependency? As far as I can tell the dependency is very limited.
> If not, I guess I'll have to revert the commits for now.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] omalley closed pull request #5724: HADOOP-18761. Revert HADOOP-18535 because mysql-connector-java is GPL
omalley closed pull request #5724: HADOOP-18761. Revert HADOOP-18535 because mysql-connector-java is GPL
URL: https://github.com/apache/hadoop/pull/5724

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
[jira] [Commented] (HADOOP-18761) Revert HADOOP-18535 because mysql-connector-java is GPL
[ https://issues.apache.org/jira/browse/HADOOP-18761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730815#comment-17730815 ]

ASF GitHub Bot commented on HADOOP-18761:
-----------------------------------------

omalley closed pull request #5724: HADOOP-18761. Revert HADOOP-18535 because mysql-connector-java is GPL
URL: https://github.com/apache/hadoop/pull/5724
[GitHub] [hadoop] omalley commented on pull request #5724: HADOOP-18761. Revert HADOOP-18535 because mysql-connector-java is GPL
omalley commented on PR #5724:
URL: https://github.com/apache/hadoop/pull/5724#issuecomment-1584012490

   -1 we don't need this patch and it wasn't done correctly.
[jira] [Updated] (HADOOP-18762) Add Qiniu Cloud Kodo File System Implementation
[ https://issues.apache.org/jira/browse/HADOOP-18762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Zhiqiang Zhang updated HADOOP-18762:
------------------------------------
    Attachment: Qiniu-Kodo-Integrated.pdf

> Add Qiniu Cloud Kodo File System Implementation
> -----------------------------------------------
>
>                 Key: HADOOP-18762
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18762
>             Project: Hadoop Common
>          Issue Type: New Feature
>            Reporter: Zhiqiang Zhang
>            Priority: Major
>              Labels: pull-request-available
>         Attachments: Qiniu-Kodo-Integrated.pdf
>
> Qiniu Kodo is an unstructured data storage management platform developed by Qiniu Cloud Storage that supports both center and edge storage. The platform has been validated by a large user base over many years and is widely applied to massive-data management scenarios by many cloud service users in China. However, the Apache Hadoop project currently lacks a connector that supports Kodo directly through Hadoop/Spark.
> The purpose of this project is to integrate Kodo into Hadoop/Spark so that users can operate on Kodo through the Hadoop/Spark APIs without additional learning costs.
[jira] [Updated] (HADOOP-18762) Add Qiniu Cloud Kodo File System Implementation
[ https://issues.apache.org/jira/browse/HADOOP-18762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Zhiqiang Zhang updated HADOOP-18762:
------------------------------------
    Language: (was: Java)
[jira] [Updated] (HADOOP-18762) Add Qiniu Cloud Kodo File System Implementation
[ https://issues.apache.org/jira/browse/HADOOP-18762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated HADOOP-18762:
------------------------------------
    Labels: pull-request-available (was: )
[GitHub] [hadoop] zhangzqs opened a new pull request, #5725: HADOOP-18762. Add Qiniu Cloud Kodo File System Implementation
zhangzqs opened a new pull request, #5725:
URL: https://github.com/apache/hadoop/pull/5725

   ### Description of PR
   See (HADOOP-18762)[https://issues.apache.org/jira/browse/HADOOP-18762]

   ### How was this patch tested?

   ### For code changes:

   - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
[jira] [Commented] (HADOOP-18762) Add Qiniu Cloud Kodo File System Implementation
[ https://issues.apache.org/jira/browse/HADOOP-18762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730807#comment-17730807 ]

ASF GitHub Bot commented on HADOOP-18762:
-----------------------------------------

zhangzqs opened a new pull request, #5725:
URL: https://github.com/apache/hadoop/pull/5725
[jira] [Updated] (HADOOP-18762) Add Qiniu Cloud Kodo File System Implementation
[ https://issues.apache.org/jira/browse/HADOOP-18762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Zhiqiang Zhang updated HADOOP-18762:
------------------------------------
    Summary: Add Qiniu Cloud Kodo File System Implementation (was: Qiniu Cloud Kodo File System Implementation)
[jira] [Created] (HADOOP-18762) Qiniu Cloud Kodo File System Implementation
Zhiqiang Zhang created HADOOP-18762:
---------------------------------------
             Summary: Qiniu Cloud Kodo File System Implementation
                 Key: HADOOP-18762
                 URL: https://issues.apache.org/jira/browse/HADOOP-18762
             Project: Hadoop Common
          Issue Type: New Feature
            Reporter: Zhiqiang Zhang
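Connectors of this kind are typically wired into Hadoop through an `fs.<scheme>.impl` property, so that a Kodo URI can be used anywhere a Hadoop `FileSystem` path is accepted. A minimal sketch of such a configuration, assuming a hypothetical scheme `kodo` and implementation class name (neither is taken from the PR; the actual keys would come from the connector's documentation):

```xml
<!-- core-site.xml sketch: property name and class are illustrative assumptions -->
<configuration>
  <property>
    <name>fs.kodo.impl</name>
    <value>org.apache.hadoop.fs.qiniu.kodo.QiniuKodoFileSystem</value>
  </property>
</configuration>
```

With a mapping like this in place, commands such as `hadoop fs -ls kodo://bucket/path` would resolve to the connector class, which is what "operate on Kodo through the Hadoop/Spark APIs" amounts to in practice.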
[GitHub] [hadoop] yl09099 commented on a diff in pull request #5716: YARN-11506.The formatted yarn queue list is displayed on CLI
yl09099 commented on code in PR #5716:
URL: https://github.com/apache/hadoop/pull/5716#discussion_r1223807168

##########
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/pom.xml:
##########
@@ -175,6 +175,12 @@
       <groupId>org.jline</groupId>
       <artifactId>jline</artifactId>
     </dependency>
+    <dependency>
+      <groupId>com.blinkfox</groupId>
+      <artifactId>mini-table</artifactId>
+      <version>1.0.0</version>
+    </dependency>

Review Comment:
   Thanks for the advice. I'll check it out
[GitHub] [hadoop] ayushtkn commented on a diff in pull request #5696: HDFS-16946. Fix getTopTokenRealOwners to return String
ayushtkn commented on code in PR #5696:
URL: https://github.com/apache/hadoop/pull/5696#discussion_r1223781601

##########
hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/security/TestRouterSecurityManager.java:
##########
@@ -259,6 +272,39 @@ private static String[] getUserGroupForTesting() {
     return groupsForTesting;
   }

+  @Test
+  public void testGetTopTokenRealOwners() throws Exception {
+    // Create conf and start routers with only an RPC service
+    Configuration conf = initSecurity();
+
+    Configuration routerConf = new RouterConfigBuilder()
+        .metrics()
+        .rpc()
+        .build();
+    conf.addResource(routerConf);
+
+    Router router = initializeAndStartRouter(conf);
+
+    // Create credentials
+    UserGroupInformation ugi = UserGroupInformation.createUserForTesting("router", getUserGroupForTesting());
+    Credentials creds = RouterSecurityManager.createCredentials(router, ugi, "some_renewer");

Review Comment:
   assigning to ``creds`` is not required maybe
[GitHub] [hadoop] ayushtkn commented on a diff in pull request #5592: HADOOP-18718. Fix several maven build warnings
ayushtkn commented on code in PR #5592:
URL: https://github.com/apache/hadoop/pull/5592#discussion_r1223777426

##########
hadoop-hdfs-project/hadoop-hdfs-rbf/pom.xml:
##########
@@ -251,10 +248,10 @@

Review Comment:
   Was supposed to use target, right? From the description
   ```
   Use target instead of tasks.
   ```
   Other places you moved tasks -> target only. Any specific reason here?
[jira] [Commented] (HADOOP-18718) Fix several maven build warnings
[ https://issues.apache.org/jira/browse/HADOOP-18718?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730788#comment-17730788 ]

ASF GitHub Bot commented on HADOOP-18718:
-----------------------------------------

ayushtkn commented on code in PR #5592:
URL: https://github.com/apache/hadoop/pull/5592#discussion_r1223777426

> Fix several maven build warnings
> --------------------------------
>
>                 Key: HADOOP-18718
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18718
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 3.4.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
>
> {code}
> [WARNING] 'build.plugins.plugin.version' for org.cyclonedx:cyclonedx-maven-plugin is missing.
> {code}
> {code}
> [WARNING] Unknown keyword additionalItems - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
> {code}
> {code}
> [WARNING] 'build.plugins.plugin.version' for org.codehaus.mojo:findbugs-maven-plugin is missing
> {code}
> {code}
> [WARNING] Parameter 'requiresOnline' is unknown for plugin 'exec-maven-plugin:1.3.1:exec (pre-dist)'
> {code}
> {code}
> [WARNING] Parameter 'destDir' (user property 'destDir') is deprecated: No reason given
> {code}
> {code}
> [WARNING] Parameter 'tasks' is deprecated: Use target instead
> [WARNING] Parameter tasks is deprecated, use target instead
> {code}
> {code}
> [WARNING] Parameter 'systemProperties' is deprecated: Use systemPropertyVariables instead.
> {code}
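The `tasks`-vs-`target` warning above refers to maven-antrun-plugin, which deprecated its `<tasks>` configuration element in favor of `<target>`; the Ant content inside is unchanged. A schematic before/after (the `<echo>` task is only a placeholder, not taken from the Hadoop poms):

```xml
<!-- maven-antrun-plugin configuration: old (deprecated) form -->
<configuration>
  <tasks>
    <echo message="run"/>
  </tasks>
</configuration>

<!-- equivalent new form: same Ant tasks, renamed wrapper element -->
<configuration>
  <target>
    <echo message="run"/>
  </target>
</configuration>
```

This is why the review comment asks whether one spot was left on `tasks` while the rest of the patch moved to `target`: the fix is a pure element rename with no behavioral change.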
[jira] [Commented] (HADOOP-18761) Revert HADOOP-18535 because mysql-connector-java is GPL
[ https://issues.apache.org/jira/browse/HADOOP-18761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730787#comment-17730787 ]

Shilun Fan commented on HADOOP-18761:
-------------------------------------

[~weichiu] Thank you for the reminder! With [~elgoiri]'s help, I'm working on improving the code for YARN Federation. We use MySQL in that feature, so I will verify the impact of the related dependencies and respond as soon as possible.
[jira] [Updated] (HADOOP-18761) Revert HADOOP-18535 because mysql-connector-java is GPL
[ https://issues.apache.org/jira/browse/HADOOP-18761?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated HADOOP-18761:
------------------------------------
    Labels: pull-request-available (was: )
[jira] [Commented] (HADOOP-18761) Revert HADOOP-18535 because mysql-connector-java is GPL
[ https://issues.apache.org/jira/browse/HADOOP-18761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730777#comment-17730777 ]

ASF GitHub Bot commented on HADOOP-18761:
-----------------------------------------

jojochuang opened a new pull request, #5724:
URL: https://github.com/apache/hadoop/pull/5724

   ### Description of PR
   See HADOOP-18761.

   ### How was this patch tested?
   No test. Revert only.

   ### For code changes:

   - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
[GitHub] [hadoop] jojochuang opened a new pull request, #5724: HADOOP-18761. Revert HADOOP-18535 because mysql-connector-java is GPL
jojochuang opened a new pull request, #5724:
URL: https://github.com/apache/hadoop/pull/5724

   ### Description of PR
   See HADOOP-18761.

   ### How was this patch tested?
   No test. Revert only.
[jira] [Created] (HADOOP-18761) Revert HADOOP-18535 because mysql-connector-java is GPL
Wei-Chiu Chuang created HADOOP-18761:
----------------------------------------
             Summary: Revert HADOOP-18535 because mysql-connector-java is GPL
                 Key: HADOOP-18761
                 URL: https://issues.apache.org/jira/browse/HADOOP-18761
             Project: Hadoop Common
          Issue Type: Task
            Reporter: Wei-Chiu Chuang
[jira] [Commented] (HADOOP-18760) 3.3.6 Release NOTICE and LICENSE file update
[ https://issues.apache.org/jira/browse/HADOOP-18760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730776#comment-17730776 ]

Wei-Chiu Chuang commented on HADOOP-18760:
------------------------------------------

We can't use mysql connector as it's GPL. Either get rid of it or revert HADOOP-18535.

> 3.3.6 Release NOTICE and LICENSE file update
> --------------------------------------------
>
>                 Key: HADOOP-18760
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18760
>             Project: Hadoop Common
>          Issue Type: Task
>    Affects Versions: 3.3.6
>            Reporter: Wei-Chiu Chuang
>            Assignee: Wei-Chiu Chuang
>            Priority: Blocker
>
> As far as I can tell looking at hadoop-project/pom.xml, the only differences between 3.3.5 and 3.3.6 from a dependency point of view are mysql connector (HADOOP-18535) and derby (HADOOP-18535, HADOOP-18693).
> Json-smart, snakeyaml, jetty, and jettison are updated in LICENSE-binary already. grizzly was used in test scope only, so its removal doesn't matter.
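"Getting rid of it" in Maven terms means either deleting the `<dependency>` entry outright or, if the connector arrives transitively through another module, excluding it there. A sketch of the exclusion form (the enclosing dependency is a hypothetical placeholder; only the `mysql:mysql-connector-java` coordinates are real):

```xml
<!-- Illustrative pom.xml fragment: keep a module but drop the GPL-licensed
     mysql-connector-java it would otherwise pull in transitively. -->
<dependency>
  <groupId>org.example</groupId>          <!-- hypothetical module -->
  <artifactId>example-module</artifactId> <!-- hypothetical module -->
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Either way, the ASF category-X rule is the same: GPL-licensed artifacts cannot be distributed in an Apache release, which is why this ticket is a release blocker.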
[jira] [Assigned] (HADOOP-18760) 3.3.6 Release NOTICE and LICENSE file update
[ https://issues.apache.org/jira/browse/HADOOP-18760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wei-Chiu Chuang reassigned HADOOP-18760:
----------------------------------------
    Assignee: Wei-Chiu Chuang
[jira] [Created] (HADOOP-18760) 3.3.6 Release NOTICE and LICENSE file update
Wei-Chiu Chuang created HADOOP-18760:
----------------------------------------
             Summary: 3.3.6 Release NOTICE and LICENSE file update
                 Key: HADOOP-18760
                 URL: https://issues.apache.org/jira/browse/HADOOP-18760
             Project: Hadoop Common
          Issue Type: Task
    Affects Versions: 3.3.6
            Reporter: Wei-Chiu Chuang
[jira] [Commented] (HADOOP-18207) Introduce hadoop-logging module
[ https://issues.apache.org/jira/browse/HADOOP-18207?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730766#comment-17730766 ]

ASF GitHub Bot commented on HADOOP-18207:
-----------------------------------------

hadoop-yetus commented on PR #5717:
URL: https://github.com/apache/hadoop/pull/5717#issuecomment-1583604621

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 1s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 2s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 1s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 81 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 17m 59s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 23m 12s | | trunk passed |
| +1 :green_heart: | compile | 17m 38s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | compile | 15m 52s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | checkstyle | 4m 9s | | trunk passed |
| +1 :green_heart: | mvnsite | 20m 3s | | trunk passed |
| +1 :green_heart: | javadoc | 16m 14s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 14m 21s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +0 :ok: | spotbugs | 0m 33s | | branch/hadoop-project no spotbugs output file (spotbugsXml.xml) |
| +1 :green_heart: | shadedclient | 23m 39s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 26s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 15m 24s | | the patch passed |
| +1 :green_heart: | compile | 17m 1s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javac | 17m 1s | | the patch passed |
| +1 :green_heart: | compile | 15m 53s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | javac | 15m 53s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 4m 3s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5717/3/artifact/out/results-checkstyle-root.txt) | root: The patch generated 30 new + 1170 unchanged - 43 fixed = 1200 total (was 1213) |
| +1 :green_heart: | mvnsite | 20m 39s | | the patch passed |
| +1 :green_heart: | javadoc | 16m 59s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 15m 40s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +0 :ok: | spotbugs | 0m 17s | | hadoop-project has no data from spotbugs |
| +1 :green_heart: | shadedclient | 23m 58s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 0m 20s | | hadoop-project in the patch passed. |
| +1 :green_heart: | unit | 0m 22s | | hadoop-logging in the patch passed. |
| +1 :green_heart: | unit | 0m 30s | | hadoop-minikdc in the patch passed. |
| +1 :green_heart: | unit | 3m 10s | | hadoop-auth in the patch passed. |
| +1 :green_heart: | unit | 0m 21s | | hadoop-auth-examples in the patch passed. |
| +1 :green_heart: | unit | 18m 25s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 3m 32s | | hadoop-kms in the patch passed. |
| +1 :green_heart: | unit | 25m 32s | | hadoop-common-project in the patch passed. |
| +1 :green_heart: | unit | 2m 23s | | hadoop-hdfs-client in the patch passed. |
| -1 :x: | unit | 238m 48s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5717/3/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | unit | 240m 4s | | hadoop-yarn in the patch passed. |
| +1 :green_heart: | unit | 5m 39s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | unit | 3m 36s | | hadoop-yarn-server-common in the patch passed. |
| +1 :green_heart: | unit | 4m 58s | | hadoop-yarn-server-applicationhistoryservice in the patch passed. |
| +1 :green_heart: | unit | 101m 38s |
[GitHub] [hadoop] hadoop-yetus commented on pull request #5700: HDFS-17030. Limit wait time for getHAServiceState in ObserverReaderProxy
hadoop-yetus commented on PR #5700:
URL: https://github.com/apache/hadoop/pull/5700#issuecomment-1583560536

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 49s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 19m 32s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 22m 50s | | trunk passed |
| +1 :green_heart: | compile | 5m 48s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | compile | 5m 36s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | checkstyle | 1m 19s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 12s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 45s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 2m 6s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 6m 1s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 48s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 23s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 56s | | the patch passed |
| +1 :green_heart: | compile | 5m 47s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javac | 5m 47s | | the patch passed |
| +1 :green_heart: | compile | 5m 30s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | javac | 5m 30s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 8s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 59s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 28s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 1m 59s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 5m 59s | | the patch passed |
| +1 :green_heart: | shadedclient | 25m 48s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 16s | | hadoop-hdfs-client in the patch passed. |
| -1 :x: | unit | 223m 24s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/19/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 49s | | The patch does not generate ASF License warnings. |
| | | | 371m 22s | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.server.datanode.TestDirectoryScanner |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/19/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/5700 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 16a5ef6f944a 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 54867487cb36504fec1f2885e6ea6233d9556610 |
| Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/19/testReport/ |
| Max. process+thread count | 2102 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-client |
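HDFS-17030's title describes bounding how long a caller waits on getHAServiceState. As a hedged, generic sketch of that pattern (illustrative names, not the actual ObserverReaderProxy code), a blocking query can be capped by submitting it to an executor and bounding `Future.get()` with a timeout:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Generic sketch of the technique named in HDFS-17030's title: cap a slow
// state query with a timeout instead of blocking the caller indefinitely.
// Class and method names here are illustrative, not HDFS source code.
public class BoundedCall {
    public static String getStateWithTimeout(Callable<String> query,
                                             long timeoutMs,
                                             ExecutorService pool) {
        Future<String> f = pool.submit(query);
        try {
            return f.get(timeoutMs, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            f.cancel(true);   // give up on the slow query; state is unknown
            return "UNKNOWN";
        } catch (InterruptedException | ExecutionException e) {
            return "UNKNOWN";
        }
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // A fast query returns within the bound; a slow one falls back.
        System.out.println(getStateWithTimeout(() -> "OBSERVER", 500, pool));
        System.out.println(getStateWithTimeout(() -> {
            Thread.sleep(1000);
            return "OBSERVER";
        }, 50, pool));
        pool.shutdownNow();
    }
}
```

The trade-off is the usual one for timeout-bounded probes: a proxy that cannot learn a node's HA state in time must treat the state as unknown rather than stall client requests.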
[GitHub] [hadoop] hadoop-yetus commented on pull request #5700: HDFS-17030. Limit wait time for getHAServiceState in ObserverReaderProxy
hadoop-yetus commented on PR #5700:
URL: https://github.com/apache/hadoop/pull/5700#issuecomment-1583542874

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 32s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 19m 34s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 20m 10s | | trunk passed |
| +1 :green_heart: | compile | 5m 16s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | compile | 5m 6s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | checkstyle | 1m 16s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 14s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 51s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 2m 15s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 5m 37s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 19s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 28s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 50s | | the patch passed |
| +1 :green_heart: | compile | 5m 0s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javac | 5m 0s | | the patch passed |
| +1 :green_heart: | compile | 4m 52s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | javac | 4m 52s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 5s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 56s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 26s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 2m 1s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 5m 35s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 14s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 22s | | hadoop-hdfs-client in the patch passed. |
| +1 :green_heart: | unit | 203m 35s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 1m 1s | | The patch does not generate ASF License warnings. |
| | | | 339m 44s | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/20/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/5700 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 1878df179392 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / cf48c697a219614460e7fbca240ff63e9bd7e111 |
| Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/20/testReport/ |
| Max. process+thread count | 3219 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/20/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
[jira] [Commented] (HADOOP-18750) Spark History Server 3.3.1 fails to starts with Hadoop 3.3.x
[ https://issues.apache.org/jira/browse/HADOOP-18750?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730756#comment-17730756 ]

ASF GitHub Bot commented on HADOOP-18750:
-----------------------------------------

jojochuang commented on PR #5695:
URL: https://github.com/apache/hadoop/pull/5695#issuecomment-1583472070

I checked out the branch and ran the exact same command as the precommit test:

`mvn verify -fae --batch-mode -am -pl hadoop-client-modules/hadoop-client-check-invariants -pl hadoop-client-modules/hadoop-client-check-test-invariants -pl hadoop-client-modules/hadoop-client-integration-tests -Dtest=NoUnitTests -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dspotbugs.skip=true`

The failure is that a required class cannot be found on the classpath:

> [INFO] Running org.apache.hadoop.example.ITUseMiniCluster
> [ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 19.938 s <<< FAILURE! - in org.apache.hadoop.example.ITUseMiniCluster
> [ERROR] useWebHDFS(org.apache.hadoop.example.ITUseMiniCluster) Time elapsed: 19.632 s <<< ERROR!
> java.lang.NoClassDefFoundError: javax/servlet/Servlet
>     at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:493)
>     at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:150)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:1064)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:875)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:1130)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:1105)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1879)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1397)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1166)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:1039)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:971)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:591)
>     at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:530)
>     at org.apache.hadoop.example.ITUseMiniCluster.clusterUp(ITUseMiniCluster.java:77)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>     at org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
>     at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
>     at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>     at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>     at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>     at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>     at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
>     at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
>     at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
>     at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> Caused by: java.lang.ClassNotFoundException: javax.servlet.Servlet
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
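A NoClassDefFoundError like the one above means a class that was present at compile time is absent from the runtime classpath, here because the servlet API is no longer shaded into the client jar. A small generic probe (not part of Hadoop's test suite) can check whether a given class is loadable:

```java
// Minimal probe: report whether a class named on the command line is loadable
// from the current classpath, e.g. javax.servlet.Servlet after a shading
// change. Generic sketch, not part of Hadoop's test suite.
public class ClasspathProbe {
    public static boolean isLoadable(String className) {
        try {
            // initialize=false: we only care about presence, not static init
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String name = args.length > 0 ? args[0] : "javax.servlet.Servlet";
        System.out.println(name + (isLoadable(name) ? " present" : " missing"));
    }
}
```

Listing the shaded jar's contents (for example `jar tf` piped through `grep javax/servlet`) is a complementary way to see whether the classes were packaged at all.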
[jira] [Commented] (HADOOP-18756) CachingBlockManager to use AtomicBoolean for closed flag
[ https://issues.apache.org/jira/browse/HADOOP-18756?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730711#comment-17730711 ]

ASF GitHub Bot commented on HADOOP-18756:
-----------------------------------------

hadoop-yetus commented on PR #5718:
URL: https://github.com/apache/hadoop/pull/5718#issuecomment-1583302372

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 33s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 36m 19s | | trunk passed |
| +1 :green_heart: | compile | 15m 30s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | compile | 14m 16s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | checkstyle | 1m 15s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 37s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 17s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 0m 52s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 2m 39s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 28s | | branch has no errors when building and testing our client artifacts. |
| -0 :warning: | patch | 22m 50s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 52s | | the patch passed |
| +1 :green_heart: | compile | 14m 58s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javac | 14m 58s | | the patch passed |
| +1 :green_heart: | compile | 14m 26s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | javac | 14m 26s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 7s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 34s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 7s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 0m 51s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 2m 37s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 20s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 18m 34s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 4s | | The patch does not generate ASF License warnings. |
| | | | 177m 53s | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5718/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/5718 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 82134ed0d52e 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 7f48042efed52119a991381a1cfc196bb98ddd38 |
| Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5718/3/testReport/ |
| Max. process+thread count | 1925 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5718/3/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
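HADOOP-18756's title names the technique being applied: guard the closed flag with an AtomicBoolean so that close() is idempotent under concurrent callers. A hedged sketch of that pattern (generic, not the actual S3A CachingBlockManager source):

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Generic sketch of the pattern in HADOOP-18756's title: an AtomicBoolean
// closed flag makes close() idempotent and race-free without a lock.
// Illustrative only, not the actual CachingBlockManager code.
class ClosableResource {
    private final AtomicBoolean closed = new AtomicBoolean(false);

    public boolean isClosed() {
        return closed.get();
    }

    /** Returns true only for the single caller that actually performed the close. */
    public boolean close() {
        if (!closed.compareAndSet(false, true)) {
            return false; // another thread already closed us
        }
        // release buffers / cancel in-flight prefetches exactly once here
        return true;
    }
}
```

compareAndSet guarantees that even if many threads race into close(), exactly one sees the false-to-true transition and runs the cleanup, while readers of isClosed() always see the latest value without synchronization.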
[jira] [Updated] (HADOOP-18718) Fix several maven build warnings
[ https://issues.apache.org/jira/browse/HADOOP-18718?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ayush Saxena updated HADOOP-18718: -- Target Version/s: 3.4.0 (was: 3.4.0, 3.3.9, 3.3.6) > Fix several maven build warnings > > > Key: HADOOP-18718 > URL: https://issues.apache.org/jira/browse/HADOOP-18718 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Minor > Labels: pull-request-available > > {code} > [WARNING] 'build.plugins.plugin.version' for > org.cyclonedx:cyclonedx-maven-plugin is missing. > {code} > {code} > [WARNING] Unknown keyword additionalItems - you should define your own Meta > Schema. If the keyword is irrelevant for validation, just use a > NonValidationKeyword > {code} > {code} > [WARNING] 'build.plugins.plugin.version' for > org.codehaus.mojo:findbugs-maven-plugin is missing > {code} > {code} > [WARNING] Parameter 'requiresOnline' is unknown for plugin > 'exec-maven-plugin:1.3.1:exec (pre-dist)' > {code} > {code} > [WARNING] Parameter 'destDir' (user property 'destDir') is deprecated: No > reason given > {code} > {code} > [WARNING] Parameter 'tasks' is deprecated: Use target instead > [WARNING] Parameter tasks is deprecated, use target instead > {code} > {code} > [WARNING] Parameter 'systemProperties' is deprecated: Use > systemPropertyVariables instead. > {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
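The first warning quoted in the issue ('build.plugins.plugin.version' ... is missing) is the kind that goes away once an explicit plugin version is pinned. A hedged sketch of what such a POM entry looks like (the version number shown here is illustrative only, not necessarily the one the patch actually pinned):

```xml
<!-- Sketch only: pinning an explicit version for a plugin silences the
     "'build.plugins.plugin.version' for ... is missing" Maven warning. -->
<plugin>
  <groupId>org.cyclonedx</groupId>
  <artifactId>cyclonedx-maven-plugin</artifactId>
  <version>2.7.9</version> <!-- illustrative version, an assumption -->
</plugin>
```

Pinning versions also makes builds reproducible, since Maven otherwise resolves an unversioned plugin to whatever release is current.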
[jira] [Updated] (HADOOP-18487) protobuf-2.5.0 dependencies => provided
[ https://issues.apache.org/jira/browse/HADOOP-18487?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wei-Chiu Chuang updated HADOOP-18487: - Target Version/s: 3.3.7 (was: 3.3.6) > protobuf-2.5.0 dependencies => provided > --- > > Key: HADOOP-18487 > URL: https://issues.apache.org/jira/browse/HADOOP-18487 > Project: Hadoop Common > Issue Type: Improvement > Components: build, ipc >Affects Versions: 3.3.4 >Reporter: Steve Loughran >Priority: Major > Labels: pull-request-available > > uses of protobuf 2.5 and RpcEnginej have been deprecated since 3.3.0 in > HADOOP-17046 > while still keeping those files around (for a long time...), how about we > make the protobuf 2.5.0 export off hadoop common and hadoop-hdfs *provided*, > rather than *compile* > that way, if apps want it for their own apis, they have to explicitly ask for > it, but at least our own scans don't break. > i have no idea what will happen to the rest of the stack at this point, it > will be "interesting" to see -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
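The compile-to-provided switch proposed above is a one-line change in the dependency declaration. A hedged sketch (the protobuf-java coordinates come from the issue; the exact POM location inside the Hadoop build is an assumption):

```xml
<!-- Sketch only: "provided" keeps protobuf 2.5 on the build classpath
     but stops exporting it transitively, so downstream apps that want
     it must declare it explicitly. -->
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <version>2.5.0</version>
  <scope>provided</scope>
</dependency>
```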
[jira] [Resolved] (HADOOP-11219) [Umbrella] Upgrade to netty 4
[ https://issues.apache.org/jira/browse/HADOOP-11219?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wei-Chiu Chuang resolved HADOOP-11219. -- Fix Version/s: 3.4.0 Resolution: Fixed This is done. Would be nice to cherrypick the YARN shuffle handler change into branch-3.3. But at least this is finished in trunk so we can get rid of netty3 in Hadoop 3.4.0 > [Umbrella] Upgrade to netty 4 > - > > Key: HADOOP-11219 > URL: https://issues.apache.org/jira/browse/HADOOP-11219 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Haohui Mai >Assignee: Haohui Mai >Priority: Major > Fix For: 3.4.0 > > > This is an umbrella jira to track the effort of upgrading to Netty 4. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18750) Spark History Server 3.3.1 fails to starts with Hadoop 3.3.x
[ https://issues.apache.org/jira/browse/HADOOP-18750?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730675#comment-17730675 ] ASF GitHub Bot commented on HADOOP-18750: - hadoop-yetus commented on PR #5695: URL: https://github.com/apache/hadoop/pull/5695#issuecomment-1583191500 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 35s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 21m 25s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 21m 1s | | trunk passed | | +1 :green_heart: | compile | 0m 26s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 0m 23s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | mvnsite | 0m 50s | | trunk passed | | +1 :green_heart: | javadoc | 0m 49s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 42s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | shadedclient | 21m 19s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 3m 1s | | the patch passed | | +1 :green_heart: | compile | 0m 16s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 0m 16s | | the patch passed | | +1 :green_heart: | compile | 0m 15s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 0m 15s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 34s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | javadoc | 0m 28s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 30s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | -1 :x: | shadedclient | 20m 43s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 0m 19s | | hadoop-client-api in the patch passed. | | +1 :green_heart: | unit | 0m 18s | | hadoop-client-check-invariants in the patch passed. | | +1 :green_heart: | asflicense | 0m 38s | | The patch does not generate ASF License warnings. 
| | | | 96m 10s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5695 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs | | uname | Linux 53cf23effe2f 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / ef9defadf710531709f2a7a060d7373b3f792b5f | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/3/testReport/ | | Max. process+thread count | 728 (vs. ulimit of 5500) | | modules | C: hadoop-client-modules/hadoop-client-api hadoop-client-modules/hadoop-client-check-invariants U: hadoop-client-modules | | Console output |
[GitHub] [hadoop] hadoop-yetus commented on pull request #5695: HADOOP-18750. Remove javax/servlet shading in hadoop-client-api
hadoop-yetus commented on PR #5695: URL: https://github.com/apache/hadoop/pull/5695#issuecomment-1583191500 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 35s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 21m 25s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 21m 1s | | trunk passed | | +1 :green_heart: | compile | 0m 26s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 0m 23s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | mvnsite | 0m 50s | | trunk passed | | +1 :green_heart: | javadoc | 0m 49s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 42s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | shadedclient | 21m 19s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 3m 1s | | the patch passed | | +1 :green_heart: | compile | 0m 16s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 0m 16s | | the patch passed | | +1 :green_heart: | compile | 0m 15s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 0m 15s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 34s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | javadoc | 0m 28s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 30s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | -1 :x: | shadedclient | 20m 43s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 0m 19s | | hadoop-client-api in the patch passed. | | +1 :green_heart: | unit | 0m 18s | | hadoop-client-check-invariants in the patch passed. | | +1 :green_heart: | asflicense | 0m 38s | | The patch does not generate ASF License warnings. 
| | | | 96m 10s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5695 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs | | uname | Linux 53cf23effe2f 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / ef9defadf710531709f2a7a060d7373b3f792b5f | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/3/testReport/ | | Max. process+thread count | 728 (vs. ulimit of 5500) | | modules | C: hadoop-client-modules/hadoop-client-api hadoop-client-modules/hadoop-client-check-invariants U: hadoop-client-modules | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/3/console | | versions | git=2.25.1 maven=3.6.3 shellcheck=0.7.0 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. --
[GitHub] [hadoop] hadoop-yetus commented on pull request #5722: YARN-11504. [Federation] YARN Federation Supports Non-HA mode.
hadoop-yetus commented on PR #5722: URL: https://github.com/apache/hadoop/pull/5722#issuecomment-1583178177 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 43s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 18m 14s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 20m 34s | | trunk passed | | +1 :green_heart: | compile | 7m 12s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 6m 37s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 1m 49s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 55s | | trunk passed | | +1 :green_heart: | javadoc | 1m 56s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 1m 49s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 4m 10s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 2s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 16s | | the patch passed | | +1 :green_heart: | compile | 6m 43s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 6m 43s | | the patch passed | | +1 :green_heart: | compile | 6m 46s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 6m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 40s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 52s | | the patch passed | | +1 :green_heart: | javadoc | 1m 38s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 1m 35s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 4m 17s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 21s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 1m 10s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 43s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 56s | | The patch does not generate ASF License warnings. 
| | | | 147m 27s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5722/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5722 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux dcf6767e76a5 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / e7eb7fc186fc7369d799801747881b8f015a9fb0 | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5722/3/testReport/ | | Max. process+thread count | 555 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common U: hadoop-yarn-project/hadoop-yarn | |
[jira] [Updated] (HADOOP-18718) Fix several maven build warnings
[ https://issues.apache.org/jira/browse/HADOOP-18718?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wei-Chiu Chuang updated HADOOP-18718: - Target Version/s: 3.4.0, 3.3.9, 3.3.6 (was: 3.4.0, 3.3.9) > Fix several maven build warnings > > > Key: HADOOP-18718 > URL: https://issues.apache.org/jira/browse/HADOOP-18718 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Minor > Labels: pull-request-available > > {code} > [WARNING] 'build.plugins.plugin.version' for > org.cyclonedx:cyclonedx-maven-plugin is missing. > {code} > {code} > [WARNING] Unknown keyword additionalItems - you should define your own Meta > Schema. If the keyword is irrelevant for validation, just use a > NonValidationKeyword > {code} > {code} > [WARNING] 'build.plugins.plugin.version' for > org.codehaus.mojo:findbugs-maven-plugin is missing > {code} > {code} > [WARNING] Parameter 'requiresOnline' is unknown for plugin > 'exec-maven-plugin:1.3.1:exec (pre-dist)' > {code} > {code} > [WARNING] Parameter 'destDir' (user property 'destDir') is deprecated: No > reason given > {code} > {code} > [WARNING] Parameter 'tasks' is deprecated: Use target instead > [WARNING] Parameter tasks is deprecated, use target instead > {code} > {code} > [WARNING] Parameter 'systemProperties' is deprecated: Use > systemPropertyVariables instead. > {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18655) Upgrade Kerby to 2.0.3 due to CVE-2023-25613
[ https://issues.apache.org/jira/browse/HADOOP-18655?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730668#comment-17730668 ] Wei-Chiu Chuang commented on HADOOP-18655: -- I'm trying to understand if it's a real problem for the 3.3.x release. [~ste...@apache.org] [~rohit.kumar] AFAIK this is an issue when using Kerby as an LDAP server. But we don't use Kerby as an LDAP server in Hadoop (we do use Kerby as a KDC server in tests, though), so IMO this shouldn't block the 3.3.6 release. > Upgrade Kerby to 2.0.3 due to CVE-2023-25613 > > > Key: HADOOP-18655 > URL: https://issues.apache.org/jira/browse/HADOOP-18655 > Project: Hadoop Common > Issue Type: Task > Components: build >Affects Versions: 3.3.4 >Reporter: Rohit Kumar >Assignee: Rohit Kumar >Priority: Blocker > Labels: pull-request-available > Fix For: 3.4.0 > > > An LDAP Injection vulnerability exists in the LdapIdentityBackend of Apache > Kerby before 2.0.3. > CVSSv3 Score:- 9.8(Critical) > [https://nvd.nist.gov/vuln/detail/CVE-2023-25613] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18655) Upgrade Kerby to 2.0.3 due to CVE-2023-25613
[ https://issues.apache.org/jira/browse/HADOOP-18655?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wei-Chiu Chuang updated HADOOP-18655: - Target Version/s: 3.3.9 (was: 3.3.6) > Upgrade Kerby to 2.0.3 due to CVE-2023-25613 > > > Key: HADOOP-18655 > URL: https://issues.apache.org/jira/browse/HADOOP-18655 > Project: Hadoop Common > Issue Type: Task > Components: build >Affects Versions: 3.3.4 >Reporter: Rohit Kumar >Assignee: Rohit Kumar >Priority: Blocker > Labels: pull-request-available > Fix For: 3.4.0 > > > An LDAP Injection vulnerability exists in the LdapIdentityBackend of Apache > Kerby before 2.0.3. > CVSSv3 Score:- 9.8(Critical) > [https://nvd.nist.gov/vuln/detail/CVE-2023-25613] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18718) Fix several maven build warnings
[ https://issues.apache.org/jira/browse/HADOOP-18718?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730667#comment-17730667 ] ASF GitHub Bot commented on HADOOP-18718: - dongjoon-hyun commented on PR #5592: URL: https://github.com/apache/hadoop/pull/5592#issuecomment-1583141301 Thank you, @GauthamBanasandra . > Fix several maven build warnings > > > Key: HADOOP-18718 > URL: https://issues.apache.org/jira/browse/HADOOP-18718 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Minor > Labels: pull-request-available > > {code} > [WARNING] 'build.plugins.plugin.version' for > org.cyclonedx:cyclonedx-maven-plugin is missing. > {code} > {code} > [WARNING] Unknown keyword additionalItems - you should define your own Meta > Schema. If the keyword is irrelevant for validation, just use a > NonValidationKeyword > {code} > {code} > [WARNING] 'build.plugins.plugin.version' for > org.codehaus.mojo:findbugs-maven-plugin is missing > {code} > {code} > [WARNING] Parameter 'requiresOnline' is unknown for plugin > 'exec-maven-plugin:1.3.1:exec (pre-dist)' > {code} > {code} > [WARNING] Parameter 'destDir' (user property 'destDir') is deprecated: No > reason given > {code} > {code} > [WARNING] Parameter 'tasks' is deprecated: Use target instead > [WARNING] Parameter tasks is deprecated, use target instead > {code} > {code} > [WARNING] Parameter 'systemProperties' is deprecated: Use > systemPropertyVariables instead. > {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] dongjoon-hyun commented on pull request #5592: HADOOP-18718. Fix several maven build warnings
dongjoon-hyun commented on PR #5592: URL: https://github.com/apache/hadoop/pull/5592#issuecomment-1583141301 Thank you, @GauthamBanasandra . -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18718) Fix several maven build warnings
[ https://issues.apache.org/jira/browse/HADOOP-18718?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730661#comment-17730661 ] ASF GitHub Bot commented on HADOOP-18718: - GauthamBanasandra commented on code in PR #5592: URL: https://github.com/apache/hadoop/pull/5592#discussion_r1223396260 ## hadoop-project-dist/pom.xml: ## @@ -106,7 +106,7 @@ ${maven.compile.source} ${maven.compile.encoding} ${project.build.directory}/site - ${project.build.directory}/api + ${project.build.directory}/api Review Comment: Makes sense. Changing to `outputDirectory` would be redundant. > Fix several maven build warnings > > > Key: HADOOP-18718 > URL: https://issues.apache.org/jira/browse/HADOOP-18718 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Minor > Labels: pull-request-available > > {code} > [WARNING] 'build.plugins.plugin.version' for > org.cyclonedx:cyclonedx-maven-plugin is missing. > {code} > {code} > [WARNING] Unknown keyword additionalItems - you should define your own Meta > Schema. If the keyword is irrelevant for validation, just use a > NonValidationKeyword > {code} > {code} > [WARNING] 'build.plugins.plugin.version' for > org.codehaus.mojo:findbugs-maven-plugin is missing > {code} > {code} > [WARNING] Parameter 'requiresOnline' is unknown for plugin > 'exec-maven-plugin:1.3.1:exec (pre-dist)' > {code} > {code} > [WARNING] Parameter 'destDir' (user property 'destDir') is deprecated: No > reason given > {code} > {code} > [WARNING] Parameter 'tasks' is deprecated: Use target instead > [WARNING] Parameter tasks is deprecated, use target instead > {code} > {code} > [WARNING] Parameter 'systemProperties' is deprecated: Use > systemPropertyVariables instead. 
> {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] GauthamBanasandra commented on a diff in pull request #5592: HADOOP-18718. Fix several maven build warnings
GauthamBanasandra commented on code in PR #5592: URL: https://github.com/apache/hadoop/pull/5592#discussion_r1223396260 ## hadoop-project-dist/pom.xml: ## @@ -106,7 +106,7 @@ ${maven.compile.source} ${maven.compile.encoding} ${project.build.directory}/site - ${project.build.directory}/api + ${project.build.directory}/api Review Comment: Makes sense. Changing to `outputDirectory` would be redundant. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18655) Upgrade Kerby to 2.0.3 due to CVE-2023-25613
[ https://issues.apache.org/jira/browse/HADOOP-18655?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wei-Chiu Chuang updated HADOOP-18655: - Target Version/s: 3.3.6 Priority: Blocker (was: Minor) > Upgrade Kerby to 2.0.3 due to CVE-2023-25613 > > > Key: HADOOP-18655 > URL: https://issues.apache.org/jira/browse/HADOOP-18655 > Project: Hadoop Common > Issue Type: Task > Components: build >Affects Versions: 3.3.4 >Reporter: Rohit Kumar >Assignee: Rohit Kumar >Priority: Blocker > Labels: pull-request-available > Fix For: 3.4.0 > > > An LDAP Injection vulnerability exists in the LdapIdentityBackend of Apache > Kerby before 2.0.3. > CVSSv3 Score:- 9.8(Critical) > [https://nvd.nist.gov/vuln/detail/CVE-2023-25613] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18718) Fix several maven build warnings
[ https://issues.apache.org/jira/browse/HADOOP-18718?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wei-Chiu Chuang updated HADOOP-18718: - Target Version/s: 3.4.0, 3.3.9 > Fix several maven build warnings > > > Key: HADOOP-18718 > URL: https://issues.apache.org/jira/browse/HADOOP-18718 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.4.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Minor > Labels: pull-request-available > > {code} > [WARNING] 'build.plugins.plugin.version' for > org.cyclonedx:cyclonedx-maven-plugin is missing. > {code} > {code} > [WARNING] Unknown keyword additionalItems - you should define your own Meta > Schema. If the keyword is irrelevant for validation, just use a > NonValidationKeyword > {code} > {code} > [WARNING] 'build.plugins.plugin.version' for > org.codehaus.mojo:findbugs-maven-plugin is missing > {code} > {code} > [WARNING] Parameter 'requiresOnline' is unknown for plugin > 'exec-maven-plugin:1.3.1:exec (pre-dist)' > {code} > {code} > [WARNING] Parameter 'destDir' (user property 'destDir') is deprecated: No > reason given > {code} > {code} > [WARNING] Parameter 'tasks' is deprecated: Use target instead > [WARNING] Parameter tasks is deprecated, use target instead > {code} > {code} > [WARNING] Parameter 'systemProperties' is deprecated: Use > systemPropertyVariables instead. > {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18756) CachingBlockManager to use AtomicBoolean for closed flag
[ https://issues.apache.org/jira/browse/HADOOP-18756?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730655#comment-17730655 ] ASF GitHub Bot commented on HADOOP-18756: - virajjasani commented on code in PR #5718: URL: https://github.com/apache/hadoop/pull/5718#discussion_r1223347046 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/impl/prefetch/SingleFilePerBlockCache.java: ## @@ -333,37 +335,33 @@ protected Path getCacheFilePath(final Configuration conf, @Override public void close() throws IOException { -if (closed) { - return; -} - -closed = true; +if (closed.compareAndSet(false, true)) { + LOG.info(getStats()); Review Comment: agree, it is noisy, let me downgrade it to debug > CachingBlockManager to use AtomicBoolean for closed flag > > > Key: HADOOP-18756 > URL: https://issues.apache.org/jira/browse/HADOOP-18756 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.9 >Reporter: Steve Loughran >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > > The {{CachingBlockManager}} uses the boolean field {{closed}} in various > operations, including a do/while loop. To ensure the flag is correctly > updated across threads, it needs to move to an atomic boolean.
[GitHub] [hadoop] virajjasani commented on a diff in pull request #5718: HADOOP-18756. S3A prefetch - CachingBlockManager to use AtomicBoolean for closed flag
virajjasani commented on code in PR #5718: URL: https://github.com/apache/hadoop/pull/5718#discussion_r1223347046 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/impl/prefetch/SingleFilePerBlockCache.java: ## @@ -333,37 +335,33 @@ protected Path getCacheFilePath(final Configuration conf, @Override public void close() throws IOException { -if (closed) { - return; -} - -closed = true; +if (closed.compareAndSet(false, true)) { + LOG.info(getStats()); Review Comment: agree, it is noisy, let me downgrade it to debug
[jira] [Commented] (HADOOP-18073) Upgrade AWS SDK to v2
[ https://issues.apache.org/jira/browse/HADOOP-18073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730653#comment-17730653 ] ASF GitHub Bot commented on HADOOP-18073: - steveloughran merged PR #5707: URL: https://github.com/apache/hadoop/pull/5707 > Upgrade AWS SDK to v2 > - > > Key: HADOOP-18073 > URL: https://issues.apache.org/jira/browse/HADOOP-18073 > Project: Hadoop Common > Issue Type: Task > Components: auth, fs/s3 >Affects Versions: 3.3.1 >Reporter: xiaowei sun >Assignee: Ahmar Suhail >Priority: Major > Labels: pull-request-available > Attachments: Upgrading S3A to SDKV2.pdf > > > This task tracks upgrading Hadoop's AWS connector S3A from AWS SDK for Java > V1 to AWS SDK for Java V2. > Original use case: > {quote}We would like to access s3 with AWS SSO, which is supported in > software.amazon.awssdk:sdk-core:2.*. > In particular, from > [https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html], > when to set 'fs.s3a.aws.credentials.provider', it must be > "com.amazonaws.auth.AWSCredentialsProvider". We would like to support > "software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider" which > supports AWS SSO, so users only need to authenticate once. > {quote}
[GitHub] [hadoop] steveloughran merged pull request #5707: HADOOP-18073. Upgrade S3A in 3.3 branch to AWS SDK V2.
steveloughran merged PR #5707: URL: https://github.com/apache/hadoop/pull/5707
[jira] [Commented] (HADOOP-18756) CachingBlockManager to use AtomicBoolean for closed flag
[ https://issues.apache.org/jira/browse/HADOOP-18756?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730651#comment-17730651 ] ASF GitHub Bot commented on HADOOP-18756: - steveloughran commented on code in PR #5718: URL: https://github.com/apache/hadoop/pull/5718#discussion_r1223339252 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/impl/prefetch/SingleFilePerBlockCache.java: ## @@ -333,37 +335,33 @@ protected Path getCacheFilePath(final Configuration conf, @Override public void close() throws IOException { -if (closed) { - return; -} - -closed = true; +if (closed.compareAndSet(false, true)) { + LOG.info(getStats()); Review Comment: could we log this at debug; prefetching is fairly noisy right now ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/impl/prefetch/SingleFilePerBlockCache.java: ## @@ -258,27 +260,23 @@ protected Path getCacheFilePath(final Configuration conf, @Override public void close() throws IOException { -if (closed) { - return; -} - -closed = true; - -LOG.info(getStats()); -int numFilesDeleted = 0; - -for (Entry entry : blocks.values()) { - try { -Files.deleteIfExists(entry.path); -prefetchingStatistics.blockRemovedFromFileCache(); -numFilesDeleted++; - } catch (IOException e) { -LOG.debug("Failed to delete cache file {}", entry.path, e); +if (closed.compareAndSet(false, true)) { + LOG.info(getStats()); + int numFilesDeleted = 0; + + for (Entry entry : blocks.values()) { +try { + Files.deleteIfExists(entry.path); + prefetchingStatistics.blockRemovedFromFileCache(); + numFilesDeleted++; +} catch (IOException e) { + LOG.debug("Failed to delete cache file {}", entry.path, e); +} } -} -if (numFilesDeleted > 0) { - LOG.info("Deleted {} cache files", numFilesDeleted); + if (numFilesDeleted > 0) { +LOG.info("Deleted {} cache files", numFilesDeleted); Review Comment: downgrade to debug and remove the if() test; we can log the zero count too > CachingBlockManager to use AtomicBoolean 
for closed flag > > > Key: HADOOP-18756 > URL: https://issues.apache.org/jira/browse/HADOOP-18756 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.9 >Reporter: Steve Loughran >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > > The {{CachingBlockManager}} uses the boolean field {{closed}} in various > operations, including a do/while loop. To ensure the flag is correctly > updated across threads, it needs to move to an atomic boolean.
[GitHub] [hadoop] steveloughran commented on a diff in pull request #5718: HADOOP-18756. S3A prefetch - CachingBlockManager to use AtomicBoolean for closed flag
steveloughran commented on code in PR #5718: URL: https://github.com/apache/hadoop/pull/5718#discussion_r1223339252 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/impl/prefetch/SingleFilePerBlockCache.java: ## @@ -333,37 +335,33 @@ protected Path getCacheFilePath(final Configuration conf, @Override public void close() throws IOException { -if (closed) { - return; -} - -closed = true; +if (closed.compareAndSet(false, true)) { + LOG.info(getStats()); Review Comment: could we log this at debug; prefetching is fairly noisy right now ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/impl/prefetch/SingleFilePerBlockCache.java: ## @@ -258,27 +260,23 @@ protected Path getCacheFilePath(final Configuration conf, @Override public void close() throws IOException { -if (closed) { - return; -} - -closed = true; - -LOG.info(getStats()); -int numFilesDeleted = 0; - -for (Entry entry : blocks.values()) { - try { -Files.deleteIfExists(entry.path); -prefetchingStatistics.blockRemovedFromFileCache(); -numFilesDeleted++; - } catch (IOException e) { -LOG.debug("Failed to delete cache file {}", entry.path, e); +if (closed.compareAndSet(false, true)) { + LOG.info(getStats()); + int numFilesDeleted = 0; + + for (Entry entry : blocks.values()) { +try { + Files.deleteIfExists(entry.path); + prefetchingStatistics.blockRemovedFromFileCache(); + numFilesDeleted++; +} catch (IOException e) { + LOG.debug("Failed to delete cache file {}", entry.path, e); +} } -} -if (numFilesDeleted > 0) { - LOG.info("Deleted {} cache files", numFilesDeleted); + if (numFilesDeleted > 0) { +LOG.info("Deleted {} cache files", numFilesDeleted); Review Comment: downgrade to debug and remove the if() test; we can log the zero count too -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
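The pattern the HADOOP-18756 review above converges on can be shown in isolation: an `AtomicBoolean` "closed" flag with `compareAndSet` makes `close()` idempotent and thread-safe, because exactly one caller flips the flag from false to true and runs the cleanup body. This is a minimal self-contained sketch, not the Hadoop `SingleFilePerBlockCache` code itself; the class and method names are hypothetical stand-ins:

```java
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for a cache-like resource whose close() must run
// its cleanup (delete cache files, log stats) exactly once, even when
// close() is invoked repeatedly or from several threads at once.
class CacheLikeResource implements AutoCloseable {

    private final AtomicBoolean closed = new AtomicBoolean(false);
    private final AtomicInteger cleanups = new AtomicInteger();

    @Override
    public void close() {
        // Only the caller that wins the false -> true transition performs
        // cleanup; every later (or racing) caller returns immediately.
        // This replaces the racy "if (closed) return; closed = true;" idiom.
        if (closed.compareAndSet(false, true)) {
            cleanups.incrementAndGet(); // stand-in for the real cleanup work
        }
    }

    int cleanupCount() {
        return cleanups.get();
    }
}
```

Calling `close()` twice on the same instance leaves `cleanupCount()` at 1, which is the behavior the plain boolean field could not guarantee under concurrency.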
[jira] [Commented] (HADOOP-18750) Spark History Server 3.3.1 fails to starts with Hadoop 3.3.x
[ https://issues.apache.org/jira/browse/HADOOP-18750?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730648#comment-17730648 ] ASF GitHub Bot commented on HADOOP-18750: - hadoop-yetus commented on PR #5695: URL: https://github.com/apache/hadoop/pull/5695#issuecomment-1583029528 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 36s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 20m 56s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 21m 56s | | trunk passed | | +1 :green_heart: | compile | 0m 24s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 0m 22s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | mvnsite | 0m 50s | | trunk passed | | +1 :green_heart: | javadoc | 0m 48s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 43s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | shadedclient | 23m 57s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 3m 43s | | the patch passed | | +1 :green_heart: | compile | 0m 16s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 0m 16s | | the patch passed | | +1 :green_heart: | compile | 0m 16s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 0m 16s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 32s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | javadoc | 0m 30s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 28s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | -1 :x: | shadedclient | 22m 16s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 0m 17s | | hadoop-client-api in the patch passed. | | +1 :green_heart: | unit | 0m 16s | | hadoop-client-check-invariants in the patch passed. | | +1 :green_heart: | asflicense | 0m 36s | | The patch does not generate ASF License warnings. 
| | | | 101m 19s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5695 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs | | uname | Linux 1c67920e37cf 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / ec5e60d5426c2953dd48138af8879b46326f47db | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/2/testReport/ | | Max. process+thread count | 560 (vs. ulimit of 5500) | | modules | C: hadoop-client-modules/hadoop-client-api hadoop-client-modules/hadoop-client-check-invariants U: hadoop-client-modules | | Console output |
[GitHub] [hadoop] hadoop-yetus commented on pull request #5695: HADOOP-18750. Remove javax/servlet shading in hadoop-client-api
hadoop-yetus commented on PR #5695: URL: https://github.com/apache/hadoop/pull/5695#issuecomment-1583029528 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 36s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 20m 56s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 21m 56s | | trunk passed | | +1 :green_heart: | compile | 0m 24s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 0m 22s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | mvnsite | 0m 50s | | trunk passed | | +1 :green_heart: | javadoc | 0m 48s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 43s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | shadedclient | 23m 57s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 3m 43s | | the patch passed | | +1 :green_heart: | compile | 0m 16s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 0m 16s | | the patch passed | | +1 :green_heart: | compile | 0m 16s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 0m 16s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 32s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | javadoc | 0m 30s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 28s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | -1 :x: | shadedclient | 22m 16s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 0m 17s | | hadoop-client-api in the patch passed. | | +1 :green_heart: | unit | 0m 16s | | hadoop-client-check-invariants in the patch passed. | | +1 :green_heart: | asflicense | 0m 36s | | The patch does not generate ASF License warnings. 
| | | | 101m 19s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5695 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs | | uname | Linux 1c67920e37cf 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / ec5e60d5426c2953dd48138af8879b46326f47db | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/2/testReport/ | | Max. process+thread count | 560 (vs. ulimit of 5500) | | modules | C: hadoop-client-modules/hadoop-client-api hadoop-client-modules/hadoop-client-check-invariants U: hadoop-client-modules | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/2/console | | versions | git=2.25.1 maven=3.6.3 shellcheck=0.7.0 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. --
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #5722: YARN-11504. [Federation] YARN Federation Supports Non-HA mode.
slfan1989 commented on code in PR #5722: URL: https://github.com/apache/hadoop/pull/5722#discussion_r1223299403 ## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/client/RMProxy.java: ## @@ -116,6 +116,28 @@ protected static T createRMProxy(final Configuration configuration, return newProxyInstance(conf, protocol, instance, retryPolicy); } + /** + * Currently, used by NodeManagers only. + * + * @param configuration configuration. + * @param protocol protocol. + * @param instance RMProxy instance. + * @return RMProxy. + * @param Generic T. + * @throws IOException io error occur. + */ + protected static T createRMProxyFederation(final Configuration configuration, + final Class protocol, RMProxy instance) throws IOException { +YarnConfiguration yarnConf = +(configuration instanceof YarnConfiguration) ? (YarnConfiguration) configuration : +new YarnConfiguration(configuration); +if(isFederationNonHAEnabled(yarnConf)){ + RetryPolicy retryPolicy = createRetryPolicy(yarnConf, true); + return newProxyInstance(yarnConf, protocol, instance, retryPolicy); +} +return createRMProxy(configuration, protocol, instance); Review Comment: I will add unit tests as soon as possible.
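The first statement of `createRMProxyFederation` in the diff above uses a common idiom: reuse the object when it is already the richer type, otherwise wrap it once, so the common path pays for no copy. A self-contained sketch with hypothetical stand-in classes (`Conf`/`YarnConf` are illustrative names, not the Hadoop `Configuration`/`YarnConfiguration` types):

```java
// Hypothetical stand-ins for the base and derived configuration types.
class Conf { }

class YarnConf extends Conf {
    final Conf source;      // wrapped configuration, when constructed from one
    YarnConf()            { this.source = this; }
    YarnConf(Conf source) { this.source = source; }
}

class ConfUtil {
    // The "downcast or re-wrap" idiom from the RMProxy diff: if the caller
    // already passed the derived type, hand it straight back; otherwise
    // wrap the base instance exactly once.
    static YarnConf asYarnConf(Conf conf) {
        return (conf instanceof YarnConf) ? (YarnConf) conf : new YarnConf(conf);
    }
}
```

When the caller already holds a `YarnConf`, `asYarnConf` returns the same instance; a plain `Conf` gets wrapped into a new `YarnConf` that remembers its source.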
[jira] [Commented] (HADOOP-18718) Fix several maven build warnings
[ https://issues.apache.org/jira/browse/HADOOP-18718?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730612#comment-17730612 ] ASF GitHub Bot commented on HADOOP-18718: - hadoop-yetus commented on PR #5592: URL: https://github.com/apache/hadoop/pull/5592#issuecomment-1582818110 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 48s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 18m 27s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 19m 53s | | trunk passed | | +1 :green_heart: | compile | 15m 35s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 14m 19s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | mvnsite | 18m 47s | | trunk passed | | +1 :green_heart: | javadoc | 8m 28s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 6m 58s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | shadedclient | 122m 28s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 43s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 22m 55s | | the patch passed | | +1 :green_heart: | compile | 15m 6s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 15m 6s | | the patch passed | | +1 :green_heart: | compile | 14m 24s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 14m 24s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 12m 15s | | the patch passed | | +1 :green_heart: | javadoc | 7m 49s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 6m 49s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | shadedclient | 48m 17s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 792m 17s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5592/6/artifact/out/patch-unit-root.txt) | root in the patch passed. | | +1 :green_heart: | asflicense | 1m 34s | | The patch does not generate ASF License warnings. 
| | | | 1021m 58s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.yarn.server.timeline.webapp.TestTimelineWebServices | | | hadoop.mapreduce.v2.TestMRJobsWithProfiler | | | hadoop.mapreduce.v2.TestMRJobs | | | hadoop.mapreduce.v2.TestUberAM | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5592/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5592 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint | | uname | Linux d3874830e04c 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c5ef56970bec44c6299c973cc7b313bc6561229c | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5592/6/testReport/
[GitHub] [hadoop] hadoop-yetus commented on pull request #5592: HADOOP-18718. Fix several maven build warnings
hadoop-yetus commented on PR #5592: URL: https://github.com/apache/hadoop/pull/5592#issuecomment-1582818110 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 48s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 18m 27s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 19m 53s | | trunk passed | | +1 :green_heart: | compile | 15m 35s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 14m 19s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | mvnsite | 18m 47s | | trunk passed | | +1 :green_heart: | javadoc | 8m 28s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 6m 58s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | shadedclient | 122m 28s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 43s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 22m 55s | | the patch passed | | +1 :green_heart: | compile | 15m 6s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 15m 6s | | the patch passed | | +1 :green_heart: | compile | 14m 24s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 14m 24s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 12m 15s | | the patch passed | | +1 :green_heart: | javadoc | 7m 49s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 6m 49s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | shadedclient | 48m 17s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 792m 17s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5592/6/artifact/out/patch-unit-root.txt) | root in the patch passed. | | +1 :green_heart: | asflicense | 1m 34s | | The patch does not generate ASF License warnings. 
| | | | 1021m 58s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.yarn.server.timeline.webapp.TestTimelineWebServices | | | hadoop.mapreduce.v2.TestMRJobsWithProfiler | | | hadoop.mapreduce.v2.TestMRJobs | | | hadoop.mapreduce.v2.TestUberAM | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5592/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5592 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint | | uname | Linux d3874830e04c 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c5ef56970bec44c6299c973cc7b313bc6561229c | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5592/6/testReport/ | | Max. process+thread count | 3152 (vs. ulimit of 5500) | | modules | C: hadoop-project-dist hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common
[GitHub] [hadoop] hadoop-yetus commented on pull request #5696: HDFS-16946. Fix getTopTokenRealOwners to return String
hadoop-yetus commented on PR #5696: URL: https://github.com/apache/hadoop/pull/5696#issuecomment-1582766074 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 45s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 47m 17s | | trunk passed | | +1 :green_heart: | compile | 0m 36s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 0m 34s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 0m 30s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 38s | | trunk passed | | +1 :green_heart: | javadoc | 0m 42s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 27s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 1m 31s | | trunk passed | | +1 :green_heart: | shadedclient | 24m 14s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 29s | | the patch passed | | +1 :green_heart: | compile | 0m 31s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 0m 31s | | the patch passed | | +1 :green_heart: | compile | 0m 27s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 0m 27s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 0m 16s | [/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs-rbf.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5696/11/artifact/out/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs-rbf.txt) | hadoop-hdfs-project/hadoop-hdfs-rbf: The patch generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) | | +1 :green_heart: | mvnsite | 0m 30s | | the patch passed | | +1 :green_heart: | javadoc | 0m 26s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 21s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 1m 20s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 50s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 21m 43s | | hadoop-hdfs-rbf in the patch passed. | | +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. 
| | | | 130m 9s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5696/11/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5696 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 699598558ce7 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / a21b79bb95d94b0090fa763f442676f5d5ffdd6b | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5696/11/testReport/ | | Max. process+thread count | 2426 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project/hadoop-hdfs-rbf | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5696/11/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This
[GitHub] [hadoop] hadoop-yetus commented on pull request #5700: HDFS-17030. Limit wait time for getHAServiceState in ObserverReaderProxy
hadoop-yetus commented on PR #5700: URL: https://github.com/apache/hadoop/pull/5700#issuecomment-1582602740 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 0s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 17m 37s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 22m 53s | | trunk passed | | +1 :green_heart: | compile | 5m 57s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 5m 40s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 1m 20s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 14s | | trunk passed | | +1 :green_heart: | javadoc | 1m 48s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 2m 8s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 6m 1s | | trunk passed | | +1 :green_heart: | shadedclient | 25m 35s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 24s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 52s | | the patch passed | | +1 :green_heart: | compile | 5m 43s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 5m 43s | | the patch passed | | +1 :green_heart: | compile | 5m 29s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 5m 29s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 10s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 59s | | the patch passed | | +1 :green_heart: | javadoc | 1m 32s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 1m 56s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 5m 56s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 41s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 18s | | hadoop-hdfs-client in the patch passed. | | -1 :x: | unit | 226m 31s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/18/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. 
| | | | 372m 45s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.TestRollingUpgrade | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/18/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5700 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 82cae10efc75 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / d7364efcb81dd4283b596640a393016fa290ccc4 | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5700/18/testReport/ | | Max. process+thread count | 2448 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs U:
[GitHub] [hadoop] NishthaShah commented on a diff in pull request #5696: HDFS-16946. Fix getTopTokenRealOwners to return String
NishthaShah commented on code in PR #5696: URL: https://github.com/apache/hadoop/pull/5696#discussion_r1222939776 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/security/TestRouterSecurityManager.java: ## @@ -81,6 +87,17 @@ public static void createMockSecretManager() throws IOException { @Rule public ExpectedException exceptionRule = ExpectedException.none(); + private Router initializeAndStartRouter(Configuration configuration) { +Router router = new Router(); +try { + router.init(configuration); + router.start(); +} catch (MetricsException e) { + //do nothing Review Comment: @ayushtkn Are you running via the command line and still seeing the failure? I see consistently successful behaviour when I run the full test from the command line; via IDE debug/run it passes intermittently for me (3/5 passed). Edit: with the latest commit, the flaky behaviour is eliminated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] Hexiaoqiao commented on pull request #5408: HDFS-16898. Remove write lock for processCommandFromActor of DataNode to reduce impact on heartbeat.
Hexiaoqiao commented on PR #5408: URL: https://github.com/apache/hadoop/pull/5408#issuecomment-1582450852 @hfutatzhanghb Please check whether the failed unit tests are related to this change.
[GitHub] [hadoop] NishthaShah commented on a diff in pull request #5696: HDFS-16946. Fix getTopTokenRealOwners to return String
NishthaShah commented on code in PR #5696: URL: https://github.com/apache/hadoop/pull/5696#discussion_r1222881620 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/security/TestRouterSecurityManager.java: ## @@ -81,6 +87,17 @@ public static void createMockSecretManager() throws IOException { @Rule public ExpectedException exceptionRule = ExpectedException.none(); + private Router initializeAndStartRouter(Configuration configuration) { +Router router = new Router(); +try { + router.init(configuration); + router.start(); +} catch (MetricsException e) { + //do nothing Review Comment: Sure, I can explore DefaultMetricsSystem.setMiniClusterMode(true). For me, when running via the command line, testNotRunningSecretManager never fails. But yes, when I run/debug via the IDE configurations, it fails. Edit: via the IDE it fails sometimes, not all the time.
[jira] [Resolved] (HADOOP-18752) Change fs.s3a.directory.marker.retention to "keep"
[ https://issues.apache.org/jira/browse/HADOOP-18752?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran resolved HADOOP-18752. - Fix Version/s: 3.4.0 Resolution: Fixed > Change fs.s3a.directory.marker.retention to "keep" > -- > > Key: HADOOP-18752 > URL: https://issues.apache.org/jira/browse/HADOOP-18752 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.5 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > > Change the default value of "fs.s3a.directory.marker.retention" to keep; > update docs to match. > maybe include with HADOOP-17802 so we don't blow up with fewer markers being > created. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
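For reference, the default change described in the issue above corresponds to an ordinary Hadoop configuration entry. A minimal sketch follows; the property name and the value "keep" are taken from the issue itself, while the surrounding XML framing and comment wording are illustrative:

```xml
<property>
  <name>fs.s3a.directory.marker.retention</name>
  <value>keep</value>
  <!-- "keep" retains surplus directory markers rather than deleting them;
       this issue changes the shipped default from "delete" to "keep". -->
</property>
```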
[jira] [Commented] (HADOOP-18748) Configuration.get is slow
[ https://issues.apache.org/jira/browse/HADOOP-18748?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730509#comment-17730509 ] ASF GitHub Bot commented on HADOOP-18748: - hadoop-yetus commented on PR #5685: URL: https://github.com/apache/hadoop/pull/5685#issuecomment-1582390168 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 34s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 39m 8s | | trunk passed | | +1 :green_heart: | compile | 15m 40s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 14m 24s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 1m 14s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 37s | | trunk passed | | +1 :green_heart: | javadoc | 1m 14s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 51s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 2m 36s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 20s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 51s | | the patch passed | | +1 :green_heart: | compile | 14m 55s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 14m 55s | | the patch passed | | +1 :green_heart: | compile | 14m 33s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 14m 33s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 7s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 36s | | the patch passed | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 51s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 2m 33s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 27s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 33s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 3s | | The patch does not generate ASF License warnings. 
| | | | 180m 50s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5685/14/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5685 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 0048ebbdc570 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c44a6a5e793ab665c80da3b9c1f8fb82379dd486 | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5685/14/testReport/ | | Max. process+thread count | 1263 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5685/14/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
[jira] [Commented] (HADOOP-18752) Change fs.s3a.directory.marker.retention to "keep"
[ https://issues.apache.org/jira/browse/HADOOP-18752?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730508#comment-17730508 ] ASF GitHub Bot commented on HADOOP-18752: - steveloughran merged PR #5689: URL: https://github.com/apache/hadoop/pull/5689 > Change fs.s3a.directory.marker.retention to "keep" > -- > > Key: HADOOP-18752 > URL: https://issues.apache.org/jira/browse/HADOOP-18752 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.5 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > > Change the default value of "fs.s3a.directory.marker.retention" to keep; > update docs to match. > maybe include with HADOOP-17802 so we don't blow up with fewer markers being > created.
[GitHub] [hadoop] hadoop-yetus commented on pull request #5685: HADOOP-18748 Optimize Configuration.handleDeprecation
hadoop-yetus commented on PR #5685: URL: https://github.com/apache/hadoop/pull/5685#issuecomment-1582390168 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 34s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 39m 8s | | trunk passed | | +1 :green_heart: | compile | 15m 40s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | compile | 14m 24s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | checkstyle | 1m 14s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 37s | | trunk passed | | +1 :green_heart: | javadoc | 1m 14s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 51s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 2m 36s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 20s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 51s | | the patch passed | | +1 :green_heart: | compile | 14m 55s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javac | 14m 55s | | the patch passed | | +1 :green_heart: | compile | 14m 33s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | javac | 14m 33s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 7s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 36s | | the patch passed | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 | | +1 :green_heart: | javadoc | 0m 51s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | +1 :green_heart: | spotbugs | 2m 33s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 27s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 33s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 3s | | The patch does not generate ASF License warnings. 
| | | | 180m 50s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5685/14/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5685 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 0048ebbdc570 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c44a6a5e793ab665c80da3b9c1f8fb82379dd486 | | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5685/14/testReport/ | | Max. process+thread count | 1263 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5685/14/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] steveloughran merged pull request #5689: HADOOP-18752. Change fs.s3a.directory.marker.retention to "keep"
steveloughran merged PR #5689: URL: https://github.com/apache/hadoop/pull/5689
[jira] [Commented] (HADOOP-18073) Upgrade AWS SDK to v2
[ https://issues.apache.org/jira/browse/HADOOP-18073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730497#comment-17730497 ] ASF GitHub Bot commented on HADOOP-18073: - steveloughran commented on PR #5707: URL: https://github.com/apache/hadoop/pull/5707#issuecomment-1582343411 +1, merging > Upgrade AWS SDK to v2 > - > > Key: HADOOP-18073 > URL: https://issues.apache.org/jira/browse/HADOOP-18073 > Project: Hadoop Common > Issue Type: Task > Components: auth, fs/s3 >Affects Versions: 3.3.1 >Reporter: xiaowei sun >Assignee: Ahmar Suhail >Priority: Major > Labels: pull-request-available > Attachments: Upgrading S3A to SDKV2.pdf > > > This task tracks upgrading Hadoop's AWS connector S3A from AWS SDK for Java > V1 to AWS SDK for Java V2. > Original use case: > {quote}We would like to access s3 with AWS SSO, which is supported in > software.amazon.awssdk:sdk-core:2.*. > In particular, from > [https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html], > when to set 'fs.s3a.aws.credentials.provider', it must be > "com.amazonaws.auth.AWSCredentialsProvider". We would like to support > "software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider" which > supports AWS SSO, so users only need to authenticate once. > {quote}
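The credential-provider setting quoted in the issue above is an ordinary S3A configuration entry. A sketch of the SSO-capable setup the reporter is asking for follows; the property name and both class names are taken from the issue text, and whether the v2 value is accepted depends on the SDK v2 upgrade work itself:

```xml
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <!-- With SDK v1, the value had to implement
       com.amazonaws.auth.AWSCredentialsProvider; the request is for the
       connector to also accept v2 providers such as the one below. -->
  <value>software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider</value>
</property>
```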
[GitHub] [hadoop] steveloughran commented on pull request #5707: HADOOP-18073. Upgrade S3A in 3.3 branch to AWS SDK V2.
steveloughran commented on PR #5707: URL: https://github.com/apache/hadoop/pull/5707#issuecomment-1582343411 +1, merging
[GitHub] [hadoop] Hexiaoqiao commented on pull request #5643: HDFS-17003. Erasure Coding: invalidate wrong block after reporting bad blocks from datanode
Hexiaoqiao commented on PR #5643: URL: https://github.com/apache/hadoop/pull/5643#issuecomment-1582282121 Committed to trunk. Thanks @hfutatzhanghb for your contributions, and @zhangshuyan0, @sodonnel for the reviews!
[GitHub] [hadoop] Hexiaoqiao merged pull request #5643: HDFS-17003. Erasure Coding: invalidate wrong block after reporting bad blocks from datanode
Hexiaoqiao merged PR #5643: URL: https://github.com/apache/hadoop/pull/5643
[GitHub] [hadoop] ayushtkn commented on a diff in pull request #5696: HDFS-16946. Fix getTopTokenRealOwners to return String
ayushtkn commented on code in PR #5696: URL: https://github.com/apache/hadoop/pull/5696#discussion_r1222765730

## hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/security/TestRouterSecurityManager.java:

## @@ -81,6 +87,17 @@ public static void createMockSecretManager() throws IOException {
   @Rule
   public ExpectedException exceptionRule = ExpectedException.none();

+  private Router initializeAndStartRouter(Configuration configuration) {
+    Router router = new Router();
+    try {
+      router.init(configuration);
+      router.start();
+    } catch (MetricsException e) {
+      // do nothing

Review Comment: I don't think we need this try-catch logic at all; put
```
DefaultMetricsSystem.setMiniClusterMode(true);
```
in the ``BeforeClass`` method instead.

And can you run the test locally as well? For me, if ``testNotRunningSecretManager`` runs after your test, it fails:
```
java.lang.AssertionError: Expecting org.apache.hadoop.service.ServiceStateException with text Failed to create SecretManager but got : Expected to find 'Failed to create SecretManager' but got unexpected exception: org.apache.hadoop.service.ServiceStateException: org.apache.hadoop.security.KerberosAuthException: failure to login: for principal: router/localh...@example.com from keytab /Users/ayushsaxena/code/hadoop-code/hadoop/hadoop-hdfs-project/hadoop-hdfs-rbf/target/test/data/SecurityConfUtil/test.keytab javax.security.auth.login.LoginException: Integrity check on decrypted field failed (31) - PREAUTH_FAILED
	at org.apache.hadoop.service.ServiceStateException.convert(ServiceStateException.java:105)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:174)
	at org.apache.hadoop.hdfs.server.federation.security.TestRouterSecurityManager.lambda$testNotRunningSecretManager$1(TestRouterSecurityManager.java:327)
	at org.apache.hadoop.test.LambdaTestUtils.lambda$intercept$0(LambdaTestUtils.java:534)
	at org.apache.hadoop.test.LambdaTestUtils.intercept(LambdaTestUtils.java:498)
	at org.apache.hadoop.test.LambdaTestUtils.intercept(LambdaTestUtils.java:529)
	at org.apache.hadoop.hdfs.server.federation.security.TestRouterSecurityManager.testNotRunningSecretManager(TestRouterSecurityManager.java:326)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:258)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
	at com.intellij.rt.junit.IdeaTestRunner$Repeater$1.execute(IdeaTestRunner.java:38)
	at com.intellij.rt.execution.junit.TestsRepeater.repeat(TestsRepeater.java:11)
	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:35)
	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:235)
	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
Caused by: org.apache.hadoop.security.KerberosAuthException: failure to login: for principal: router/localh...@example.com from keytab
```
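ayushtkn's suggestion would look roughly like the sketch below. It assumes Hadoop's test dependencies (`DefaultMetricsSystem`, `Router`, JUnit 4) on the classpath and is illustrative, not the actual patch:

```java
// Sketch only: replaces the try-catch in initializeAndStartRouter with
// one-time mini-cluster metrics setup in @BeforeClass.
@BeforeClass
public static void setUpMetrics() {
  // In mini-cluster mode the metrics system tolerates repeated source
  // registration across tests, so router.init() no longer throws
  // MetricsException and the catch block becomes unnecessary.
  DefaultMetricsSystem.setMiniClusterMode(true);
}

private Router initializeAndStartRouter(Configuration configuration) {
  Router router = new Router();
  router.init(configuration);
  router.start();
  return router;
}
```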
[GitHub] [hadoop] hadoop-yetus commented on pull request #5715: HDFS-17037. Consider nonDfsUsed when running balancer.
hadoop-yetus commented on PR #5715: URL: https://github.com/apache/hadoop/pull/5715#issuecomment-1582218750

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 51s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 18m 38s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 20m 4s | | trunk passed |
| +1 :green_heart: | compile | 5m 18s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | compile | 5m 8s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | checkstyle | 1m 19s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 13s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 48s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 2m 16s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 5m 37s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 25s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 49s | | the patch passed |
| +1 :green_heart: | compile | 5m 14s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javac | 5m 14s | | the patch passed |
| +1 :green_heart: | compile | 4m 55s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | javac | 4m 55s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 6s | | hadoop-hdfs-project: The patch generated 0 new + 143 unchanged - 1 fixed = 143 total (was 144) |
| +1 :green_heart: | mvnsite | 1m 56s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 28s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 1m 57s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 5m 29s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 41s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 24s | | hadoop-hdfs-client in the patch passed. |
| +1 :green_heart: | unit | 255m 54s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 51s | | The patch does not generate ASF License warnings. |
| | | | 391m 50s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5715/5/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/5715 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 7d1894527cad 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 86c4919792e86c08b90626583ad880fbe89edbbc |
| Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5715/5/testReport/ |
| Max. process+thread count | 2898 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5715/5/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache
[GitHub] [hadoop] ayushtkn merged pull request #5708: HDFS-17035. FsVolumeImpl#getActualNonDfsUsed may return negative value.
ayushtkn merged PR #5708: URL: https://github.com/apache/hadoop/pull/5708
[GitHub] [hadoop] mehakmeet commented on a diff in pull request #5519: MAPREDUCE-7435. Manifest Committer OOM on abfs
mehakmeet commented on code in PR #5519: URL: https://github.com/apache/hadoop/pull/5519#discussion_r1222648403

## hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/lib/output/committer/manifest/impl/EntryFileIO.java:

## @@ -63,7 +63,18 @@ public class EntryFileIO {
   private static final Logger LOG = LoggerFactory.getLogger(
       EntryFileIO.class);

-  public static final int WRITER_SHUTDOWN_TIMEOUT = 60;
+  /**
+   * How long should the writer shutdown take?
+   */
+  public static final int WRITER_SHUTDOWN_TIMEOUT_SECONDS = 60;
+
+  /**
+   * How long should trying to queue a write block before giving up
+   * with an error?
+   * This is a safety feature to ensure that if something has gone wrong
+   * in the queue code the job fails with an error rather than just hangs.
+   */
+  public static final int WRITER_QUEUE_PUT_TIMEOUT_MINUTES = 10;

Review Comment: Sorry, I think I missed these constants being added. Shouldn't they be configurable, as a fallback, so that the values are easy to change if they ever cause problems? Then again, if a write waits this long, we can probably assume the job is just hanging. Your call on whether to make them configurable.
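The "constant with a configurable override" pattern mehakmeet asks about can be sketched in plain Java. This is a self-contained illustration, not the manifest committer's actual code: the key name `manifest.committer.writer.queue.put.timeout.minutes` is hypothetical, and Hadoop itself would read such an option through `Configuration` rather than `java.util.Properties`.

```java
import java.util.Properties;

// Sketch: a compiled-in default that a configuration key may override,
// so the timeout is changeable in the field without a rebuild.
public class TimeoutConfig {
    // Same default as the constant discussed in the review.
    static final int DEFAULT_QUEUE_PUT_TIMEOUT_MINUTES = 10;
    // Hypothetical key name, for illustration only.
    static final String QUEUE_PUT_TIMEOUT_KEY =
        "manifest.committer.writer.queue.put.timeout.minutes";

    static int queuePutTimeoutMinutes(Properties conf) {
        String v = conf.getProperty(QUEUE_PUT_TIMEOUT_KEY);
        if (v == null) {
            // No override configured: fall back to the constant.
            return DEFAULT_QUEUE_PUT_TIMEOUT_MINUTES;
        }
        return Integer.parseInt(v.trim());
    }

    public static void main(String[] args) {
        Properties conf = new Properties();
        System.out.println(queuePutTimeoutMinutes(conf));   // default
        conf.setProperty(QUEUE_PUT_TIMEOUT_KEY, "3");
        System.out.println(queuePutTimeoutMinutes(conf));   // override
    }
}
```

The trade-off raised in the comment is real: a configurable timeout adds an option users can misuse, while a generous hard-coded one mostly serves as a hang detector.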
[GitHub] [hadoop] hadoop-yetus commented on pull request #5722: YARN-11504. [Federation] YARN Federation Supports Non-HA mode.
hadoop-yetus commented on PR #5722: URL: https://github.com/apache/hadoop/pull/5722#issuecomment-1582135821

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 36s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 1s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 17m 55s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 21m 24s | | trunk passed |
| +1 :green_heart: | compile | 8m 11s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | compile | 7m 29s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | checkstyle | 2m 2s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 50s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 55s | | trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 1m 45s | | trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 4m 21s | | trunk passed |
| +1 :green_heart: | shadedclient | 24m 59s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 28s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 12s | | the patch passed |
| +1 :green_heart: | compile | 7m 8s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javac | 7m 8s | | the patch passed |
| +1 :green_heart: | compile | 7m 29s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | javac | 7m 29s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 44s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 47s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 42s | | the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 |
| +1 :green_heart: | javadoc | 1m 38s | | the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| +1 :green_heart: | spotbugs | 4m 31s | | the patch passed |
| +1 :green_heart: | shadedclient | 24m 59s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 1m 9s | | hadoop-yarn-api in the patch passed. |
| +1 :green_heart: | unit | 5m 48s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | asflicense | 0m 55s | | The patch does not generate ASF License warnings. |
| | | | 155m 47s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5722/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/5722 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
| uname | Linux a50cedb6cd1d 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 160a4210ca3cf831cac177ecf815a6038aa6213c |
| Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5722/2/testReport/ |
| Max. process+thread count | 648 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common U: hadoop-yarn-project/hadoop-yarn |
| |
[jira] [Commented] (HADOOP-18748) Configuration.get is slow
[ https://issues.apache.org/jira/browse/HADOOP-18748?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17730442#comment-17730442 ] ASF GitHub Bot commented on HADOOP-18748: - alkis commented on PR #5685: URL: https://github.com/apache/hadoop/pull/5685#issuecomment-1582106658

> ok, style failures
>
> ```
> ./hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java:754: Properties overlay = getOverlay();:16: 'overlay' hides a field. [HiddenField]
> ./hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/conf/ConfigurationBenchmark.java:1:package org.apache.hadoop.conf;public class ConfigurationBenchmark {:31: ';' is not followed by whitespace. [WhitespaceAfter]
> ```
>
> the overlay one needs new variable name.
>
> the ConfigurationBenchmark one looks like one of the PRs added a file, which somehow is still around for the style checking.
>
> Probably the strategy there is actually do a squash commit and forced write, so we are down to a single patch. I know, it's not "elegant" but it ensures that there's no memory of a transient file

Fixed the style failures and rewrote the history to sidestep the accidental addition of the file.

> Configuration.get is slow
> -------------------------
>
> Key: HADOOP-18748
> URL: https://issues.apache.org/jira/browse/HADOOP-18748
> Project: Hadoop Common
> Issue Type: Improvement
> Components: conf
> Affects Versions: 3.3.5
> Reporter: Alkis Evlogimenos
> Priority: Major
> Labels: pull-request-available
>
> `Configuration.get` is slow mainly because of the overhead of
> `handleDeprecation` and eager creation of `overlay` even when null.

-- This message was sent by Atlassian Jira (v8.20.10#820010)
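The JIRA description blames `Configuration.get`'s cost partly on "eager creation of `overlay` even when null". The optimization idea can be shown in a minimal self-contained sketch; this is an illustration of the pattern, not Hadoop's actual `Configuration` code:

```java
import java.util.Properties;

// Sketch: avoid allocating the overlay Properties on the hot read path.
public class LazyOverlayConfig {
    private final Properties props = new Properties();
    private Properties overlay; // created lazily, only on first overlay write

    public String get(String name) {
        // Read path: consult the overlay only if it already exists,
        // instead of allocating one just to find the key absent.
        if (overlay != null) {
            String v = overlay.getProperty(name);
            if (v != null) {
                return v;
            }
        }
        return props.getProperty(name);
    }

    public void set(String name, String value) {
        props.setProperty(name, value);
    }

    public void setOverlay(String name, String value) {
        // Write path: the allocation happens here, at most once.
        if (overlay == null) {
            overlay = new Properties();
        }
        overlay.setProperty(name, value);
    }
}
```

A `get`-heavy workload that never writes an overlay then pays only a null check per lookup, rather than a `Properties` allocation.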
[GitHub] [hadoop] alkis commented on pull request #5685: HADOOP-18748 Optimize Configuration.handleDeprecation
alkis commented on PR #5685: URL: https://github.com/apache/hadoop/pull/5685#issuecomment-1582106658

> ok, style failures
>
> ```
> ./hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java:754: Properties overlay = getOverlay();:16: 'overlay' hides a field. [HiddenField]
> ./hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/conf/ConfigurationBenchmark.java:1:package org.apache.hadoop.conf;public class ConfigurationBenchmark {:31: ';' is not followed by whitespace. [WhitespaceAfter]
> ```
>
> the overlay one needs new variable name.
>
> the ConfigurationBenchmark one looks like one of the PRs added a file, which somehow is still around for the style checking.
>
> Probably the strategy there is actually do a squash commit and forced write, so we are down to a single patch. I know, it's not "elegant" but it ensures that there's no memory of a transient file

Fixed the style failures and rewrote the history to sidestep the accidental addition of the file.
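The quoted `[HiddenField]` checkstyle warning is fixed by renaming the local so it no longer shadows the `overlay` field. The class below is a made-up, self-contained illustration of that rename, not Hadoop's `Configuration` code:

```java
import java.util.Properties;

// Sketch of the checkstyle HiddenField fix: rename the local variable
// so it does not shadow the 'overlay' field.
public class HiddenFieldExample {
    private final Properties overlay = new Properties();

    public HiddenFieldExample(String key, String value) {
        overlay.setProperty(key, value);
    }

    // Before (flagged): Properties overlay = getOverlay(); // hides a field
    public String get(String key) {
        Properties overlayProps = getOverlay(); // renamed local: warning gone
        return overlayProps.getProperty(key);
    }

    private Properties getOverlay() {
        return overlay;
    }
}
```

The behavior is unchanged; only the local variable's name differs, which is why such checkstyle fixes are safe to fold into the same patch.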