[GitHub] [hadoop] hfutatzhanghb commented on pull request #3976: HDFS-16452. msync RPC should send to active namenode directly
hfutatzhanghb commented on pull request #3976: URL: https://github.com/apache/hadoop/pull/3976#issuecomment-1050618080 Hi @xkrogen, thanks a lot. I agree with you. I will try to figure out a better solution, and we can discuss it later. Thanks very much.
[GitHub] [hadoop] virajjasani commented on pull request #4028: HDFS-16481. Provide support to set Http and Rpc ports in MiniJournalCluster
virajjasani commented on pull request #4028: URL: https://github.com/apache/hadoop/pull/4028#issuecomment-1050610723 @ferhui @goiri could you please review this PR?
[GitHub] [hadoop] liubingxing opened a new pull request #4032: HDFS-16484. [SPS]: Fix an infinite loop bug in SPSPathIdProcessor thread
liubingxing opened a new pull request #4032: URL: https://github.com/apache/hadoop/pull/4032 JIRA: [HDFS-16484](https://issues.apache.org/jira/browse/HDFS-16484) In the SPSPathIdProcessor thread, if it gets an inodeId whose path does not exist, the thread enters an infinite loop and cannot work normally. ![image](https://user-images.githubusercontent.com/2844826/155667810-b2d0e158-72d3-4a66-8d20-fce268ce113f.png)
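As an illustration of the failure mode described in this PR, here is a minimal sketch of one way such a loop can be broken: an inode ID whose path can no longer be resolved is retried a bounded number of times and then dropped. All names (`PathIdProcessorSketch`, `PathResolver`, `MAX_RETRIES`) are invented for illustration; this is not the actual SPSPathIdProcessor code or the actual patch.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

/** Illustrative only: not the actual SPSPathIdProcessor code. */
class PathIdProcessorSketch {

  /** Hypothetical hook for turning an inode ID back into a path. */
  interface PathResolver {
    String getPath(long inodeId);
  }

  private static final int MAX_RETRIES = 3;
  private final Map<Long, Integer> retries = new HashMap<>();

  void drainQueue(Queue<Long> pendingInodeIds, PathResolver resolver) {
    while (!pendingInodeIds.isEmpty()) {
      long inodeId = pendingInodeIds.poll();
      if (resolver.getPath(inodeId) == null) {
        // The file was deleted after its ID was queued. Retry a bounded
        // number of times, then drop the ID instead of spinning on it.
        int attempts = retries.merge(inodeId, 1, Integer::sum);
        if (attempts < MAX_RETRIES) {
          pendingInodeIds.add(inodeId); // try again on a later pass
        } else {
          retries.remove(inodeId);      // give up on the stale ID
        }
        continue;
      }
      // ... satisfy the storage policy for the resolved path ...
      retries.remove(inodeId);
    }
  }
}
```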
[GitHub] [hadoop] tomscut opened a new pull request #4031: HDFS-16371. Exclude slow disks when choosing volume (#3753)
tomscut opened a new pull request #4031: URL: https://github.com/apache/hadoop/pull/4031 Cherry-pick [#3753](https://github.com/apache/hadoop/pull/3753) to branch-3.3 .
[jira] [Work logged] (HADOOP-15566) Support OpenTelemetry
[ https://issues.apache.org/jira/browse/HADOOP-15566?focusedWorklogId=732858&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-732858 ] ASF GitHub Bot logged work on HADOOP-15566: --- Author: ASF GitHub Bot Created on: 25/Feb/22 04:56 Start Date: 25/Feb/22 04:56 Worklog Time Spent: 10m Work Description: kiran-maturi commented on pull request #3445: URL: https://github.com/apache/hadoop/pull/3445#issuecomment-1050521577 @ndimiduk @cijothomas @iwasakims can you please help with the review Issue Time Tracking --- Worklog Id: (was: 732858) Time Spent: 9.5h (was: 9h 20m) > Support OpenTelemetry > - > > Key: HADOOP-15566 > URL: https://issues.apache.org/jira/browse/HADOOP-15566 > Project: Hadoop Common > Issue Type: New Feature > Components: metrics, tracing >Affects Versions: 3.1.0 >Reporter: Todd Lipcon >Assignee: Siyao Meng >Priority: Major > Labels: pull-request-available, security > Attachments: HADOOP-15566-WIP.1.patch, HADOOP-15566.000.WIP.patch, > OpenTelemetry Support Scope Doc v2.pdf, OpenTracing Support Scope Doc.pdf, > Screen Shot 2018-06-29 at 11.59.16 AM.png, ss-trace-s3a.png > > Time Spent: 9.5h > Remaining Estimate: 0h > > The HTrace incubator project has voted to retire itself and won't be making > further releases. The Hadoop project currently has various hooks with HTrace. > It seems in some cases (eg HDFS-13702) these hooks have had measurable > performance overhead. Given these two factors, I think we should consider > removing the HTrace integration. If there is someone willing to do the work, > replacing it with OpenTracing might be a better choice since there is an > active community.
[GitHub] [hadoop] kiran-maturi commented on pull request #3445: HADOOP-15566 Opentelemetry changes using java agent
kiran-maturi commented on pull request #3445: URL: https://github.com/apache/hadoop/pull/3445#issuecomment-1050521577 @ndimiduk @cijothomas @iwasakims can you please help with the review
[GitHub] [hadoop] hadoop-yetus commented on pull request #4028: HDFS-16481. Provide support to set Http and Rpc ports in MiniJournalCluster
hadoop-yetus commented on pull request #4028: URL: https://github.com/apache/hadoop/pull/4028#issuecomment-1050482214 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 51s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 2s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 41m 47s | | trunk passed | | +1 :green_heart: | compile | 2m 6s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 1m 55s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 16s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 1s | | trunk passed | | +1 :green_heart: | javadoc | 1m 27s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 54s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 4m 16s | | trunk passed | | +1 :green_heart: | shadedclient | 32m 39s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 52s | | the patch passed | | +1 :green_heart: | compile | 1m 59s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 1m 59s | | the patch passed | | +1 :green_heart: | compile | 1m 40s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 1m 40s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 3s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 35s | | the patch passed | | +1 :green_heart: | javadoc | 1m 2s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 30s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 35s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 38s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 356m 37s | | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 0m 39s | | The patch does not generate ASF License warnings. 
| | | | 485m 9s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4028 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux a8a1c7074c87 4.15.0-163-generic #171-Ubuntu SMP Fri Nov 5 11:55:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 5833956b6f9d4c3116897d027efe6d4cabdbd1c9 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/3/testReport/ | | Max. process+thread count | 2083 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/3/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] tomscut commented on pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
tomscut commented on pull request #3828: URL: https://github.com/apache/hadoop/pull/3828#issuecomment-1050392857 Thanks @tasanuma for reviewing and merging it.
[jira] [Commented] (HADOOP-16254) Add proxy address in IPC connection
[ https://issues.apache.org/jira/browse/HADOOP-16254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17497809#comment-17497809 ] Owen O'Malley commented on HADOOP-16254: [~ayushtkn], yeah I hadn't found that jira, so thank you. Of course, using the caller context will work; the only major downside is that if the user sets a caller context that is close to the limit, it could cause bytes to get dropped. We might want to pick a shorter lead string (4 bytes?) to minimize that chance. (Or bump up the default limit by 50 bytes?) > Add proxy address in IPC connection > --- > > Key: HADOOP-16254 > URL: https://issues.apache.org/jira/browse/HADOOP-16254 > Project: Hadoop Common > Issue Type: New Feature > Components: ipc >Reporter: Xiaoqiao He >Assignee: Xiaoqiao He >Priority: Blocker > Attachments: HADOOP-16254.001.patch, HADOOP-16254.002.patch, > HADOOP-16254.004.patch, HADOOP-16254.005.patch, HADOOP-16254.006.patch, > HADOOP-16254.007.patch > > > In order to support data locality of RBF, we need to add new field about > client hostname in the RPC headers of Router protocol calls. > clientHostname represents hostname of client and forward by Router to > Namenode to support data locality friendly. See more [RBF Data Locality > Design|https://issues.apache.org/jira/secure/attachment/12965092/RBF%20Data%20Locality%20Design.pdf] > in HDFS-13248 and [maillist > vote|http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201904.mbox/%3CCAF3Ajax7hGxvowg4K_HVTZeDqC5H=3bfb7mv5sz5mgvadhv...@mail.gmail.com%3E].
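To make the trade-off above concrete, here is a hedged sketch of carrying the proxied client's address in the caller context behind a short lead string. It assumes the org.apache.hadoop.ipc.CallerContext API (getCurrent/setCurrent, getContext, and a Builder taking the context string) behaves as described here; the "cip:" tag and the helper class are invented for illustration and are not the proposed implementation.

```java
import org.apache.hadoop.ipc.CallerContext;

/**
 * Hypothetical helper: the Router tags the real client address onto the
 * caller context before forwarding a call to the NameNode. The "cip:" tag
 * and the class itself are invented for illustration.
 */
final class ClientAddressTagger {

  // A short lead string keeps the overhead small, so a user-supplied
  // context that is already near the configured length limit loses as
  // few trailing bytes as possible.
  private static final String TAG = "cip:";

  static void tagClientAddress(String clientAddress) {
    CallerContext current = CallerContext.getCurrent();
    String base = (current == null || current.getContext() == null)
        ? "" : current.getContext() + ",";
    CallerContext.setCurrent(
        new CallerContext.Builder(base + TAG + clientAddress).build());
  }

  private ClientAddressTagger() {
  }
}
```

The shorter the lead string, the fewer bytes are taken away from a user-supplied context that is already close to the configured length limit.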
[jira] [Resolved] (HADOOP-18139) Allow configuration of zookeeper server principal
[ https://issues.apache.org/jira/browse/HADOOP-18139?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Owen O'Malley resolved HADOOP-18139. Fix Version/s: 3.4.0 3.3.3 Resolution: Fixed Thanks for the review, Íñigo! > Allow configuration of zookeeper server principal > - > > Key: HADOOP-18139 > URL: https://issues.apache.org/jira/browse/HADOOP-18139 > Project: Hadoop Common > Issue Type: Improvement > Components: auth >Reporter: Owen O'Malley >Assignee: Owen O'Malley >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.3 > > Time Spent: 20m > Remaining Estimate: 0h > > Allow configuration of zookeeper server principal. > This would allow the Router to specify the principal.
[jira] [Work logged] (HADOOP-18139) Allow configuration of zookeeper server principal
[ https://issues.apache.org/jira/browse/HADOOP-18139?focusedWorklogId=732759&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-732759 ] ASF GitHub Bot logged work on HADOOP-18139: --- Author: ASF GitHub Bot Created on: 24/Feb/22 23:02 Start Date: 24/Feb/22 23:02 Worklog Time Spent: 10m Work Description: omalley closed pull request #4024: URL: https://github.com/apache/hadoop/pull/4024 Issue Time Tracking --- Worklog Id: (was: 732759) Time Spent: 20m (was: 10m) > Allow configuration of zookeeper server principal > - > > Key: HADOOP-18139 > URL: https://issues.apache.org/jira/browse/HADOOP-18139 > Project: Hadoop Common > Issue Type: Improvement > Components: auth >Reporter: Owen O'Malley >Assignee: Owen O'Malley >Priority: Major > Labels: pull-request-available > Time Spent: 20m > Remaining Estimate: 0h > > Allow configuration of zookeeper server principal. > This would allow the Router to specify the principal.
[GitHub] [hadoop] omalley closed pull request #4024: HADOOP-18139. Allow configuration of zookeeper server principal.
omalley closed pull request #4024: URL: https://github.com/apache/hadoop/pull/4024
[jira] [Commented] (HADOOP-8521) Port StreamInputFormat to new Map Reduce API
[ https://issues.apache.org/jira/browse/HADOOP-8521?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17497774#comment-17497774 ] Giuseppe Valente commented on HADOOP-8521: -- I'm getting this stack trace trying to use an input format based on the new API. Is it supposed to work?

Exception in thread "main" java.lang.RuntimeException: class org.apache.hadoop.hbase.mapreduce.HFileInputFormat not org.apache.hadoop.mapred.InputFormat
  at org.apache.hadoop.conf.Configuration.setClass(Configuration.java:2712)
  at org.apache.hadoop.mapred.JobConf.setInputFormat(JobConf.java:700)
  at org.apache.hadoop.streaming.StreamJob.setJobConf(StreamJob.java:798)
  at org.apache.hadoop.streaming.StreamJob.run(StreamJob.java:126)
  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
  at org.apache.hadoop.streaming.HadoopStreaming.main(HadoopStreaming.java:50)
  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.base/java.lang.reflect.Method.invoke(Method.java:564)
  at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
  at org.apache.hadoop.util.RunJar.main(RunJar.java:232)

> Port StreamInputFormat to new Map Reduce API > > > Key: HADOOP-8521 > URL: https://issues.apache.org/jira/browse/HADOOP-8521 > Project: Hadoop Common > Issue Type: Improvement > Components: streaming >Affects Versions: 0.23.0 >Reporter: madhukara phatak >Assignee: madhukara phatak >Priority: Major > Fix For: 3.0.0-alpha1 > > Attachments: HADOOP-8521-1.patch, HADOOP-8521-2.patch, > HADOOP-8521-3.patch, HADOOP-8521.patch > > > As of now , hadoop streaming uses old Hadoop M/R API. This JIRA ports it to > the new M/R API.
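Judging from the stack trace, the rejection is reached via JobConf.setInputFormat, which still requires a class implementing the old org.apache.hadoop.mapred.InputFormat interface. The sketch below is a rough, illustrative reconstruction of that type check, not the exact Hadoop source:

```java
import org.apache.hadoop.mapred.InputFormat;

/**
 * Rough reconstruction of the check behind the error above; not the exact
 * Hadoop source. An input format written only against the new
 * org.apache.hadoop.mapreduce API fails this test, so streaming rejects it
 * at job-setup time.
 */
final class OldApiInputFormatCheck {

  static void requireOldApi(Class<?> candidate) {
    if (!InputFormat.class.isAssignableFrom(candidate)) {
      // Matches the shape of the message in the stack trace:
      // "class X not org.apache.hadoop.mapred.InputFormat"
      throw new RuntimeException(
          "class " + candidate.getName() + " not " + InputFormat.class.getName());
    }
  }

  private OldApiInputFormatCheck() {
  }
}
```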
[jira] [Updated] (HADOOP-18139) Allow configuration of zookeeper server principal
[ https://issues.apache.org/jira/browse/HADOOP-18139?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-18139: Labels: pull-request-available (was: ) > Allow configuration of zookeeper server principal > - > > Key: HADOOP-18139 > URL: https://issues.apache.org/jira/browse/HADOOP-18139 > Project: Hadoop Common > Issue Type: Improvement > Components: auth >Reporter: Owen O'Malley >Assignee: Owen O'Malley >Priority: Major > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > Allow configuration of zookeeper server principal. > This would allow the Router to specify the principal.
[jira] [Work logged] (HADOOP-18139) Allow configuration of zookeeper server principal
[ https://issues.apache.org/jira/browse/HADOOP-18139?focusedWorklogId=732734&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-732734 ] ASF GitHub Bot logged work on HADOOP-18139: --- Author: ASF GitHub Bot Created on: 24/Feb/22 22:14 Start Date: 24/Feb/22 22:14 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #4024: URL: https://github.com/apache/hadoop/pull/4024#issuecomment-1050315985 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 32s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ branch-3.3 Compile Tests _ | | +1 :green_heart: | mvninstall | 40m 37s | | branch-3.3 passed | | +1 :green_heart: | compile | 24m 1s | | branch-3.3 passed | | +1 :green_heart: | checkstyle | 1m 4s | | branch-3.3 passed | | +1 :green_heart: | mvnsite | 2m 9s | | branch-3.3 passed | | +1 :green_heart: | javadoc | 2m 2s | | branch-3.3 passed | | +1 :green_heart: | spotbugs | 3m 7s | | branch-3.3 passed | | +1 :green_heart: | shadedclient | 31m 36s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 9s | | the patch passed | | +1 :green_heart: | compile | 23m 21s | | the patch passed | | +1 :green_heart: | javac | 23m 21s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 58s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 52s | | the patch passed | | +1 :green_heart: | javadoc | 1m 51s | | the patch passed | | +1 :green_heart: | spotbugs | 3m 4s | | the patch passed | | +1 :green_heart: | shadedclient | 30m 14s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 10s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 48s | | The patch does not generate ASF License warnings. | | | | 184m 55s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4024/4/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4024 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 79363f83193d 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / fc4bf4a7d00b7b719cdc876a7932d824f5e28a7f | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4024/4/testReport/ | | Max. process+thread count | 2575 (vs. 
ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4024/4/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 732734) Remaining Estimate: 0h Time Spent: 10m > Allow configuration of zookeeper server principal > - > > Key: HADOOP-18139 > URL: https://issues.apache.org/jira/browse/HADOOP-18139 > Project: Hadoop Common > Issue Type: Improvement >
[GitHub] [hadoop] hadoop-yetus commented on pull request #4024: HADOOP-18139. Allow configuration of zookeeper server principal.
hadoop-yetus commented on pull request #4024: URL: https://github.com/apache/hadoop/pull/4024#issuecomment-1050315985 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 32s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ branch-3.3 Compile Tests _ | | +1 :green_heart: | mvninstall | 40m 37s | | branch-3.3 passed | | +1 :green_heart: | compile | 24m 1s | | branch-3.3 passed | | +1 :green_heart: | checkstyle | 1m 4s | | branch-3.3 passed | | +1 :green_heart: | mvnsite | 2m 9s | | branch-3.3 passed | | +1 :green_heart: | javadoc | 2m 2s | | branch-3.3 passed | | +1 :green_heart: | spotbugs | 3m 7s | | branch-3.3 passed | | +1 :green_heart: | shadedclient | 31m 36s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 9s | | the patch passed | | +1 :green_heart: | compile | 23m 21s | | the patch passed | | +1 :green_heart: | javac | 23m 21s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 58s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 52s | | the patch passed | | +1 :green_heart: | javadoc | 1m 51s | | the patch passed | | +1 :green_heart: | spotbugs | 3m 4s | | the patch passed | | +1 :green_heart: | shadedclient | 30m 14s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 10s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 48s | | The patch does not generate ASF License warnings. | | | | 184m 55s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4024/4/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4024 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 79363f83193d 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / fc4bf4a7d00b7b719cdc876a7932d824f5e28a7f | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4024/4/testReport/ | | Max. process+thread count | 2575 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4024/4/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
[jira] [Work logged] (HADOOP-18143) toString method of RpcCall throws IllegalArgumentException
[ https://issues.apache.org/jira/browse/HADOOP-18143?focusedWorklogId=732720&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-732720 ] ASF GitHub Bot logged work on HADOOP-18143: --- Author: ASF GitHub Bot Created on: 24/Feb/22 21:59 Start Date: 24/Feb/22 21:59 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #4030: URL: https://github.com/apache/hadoop/pull/4030#issuecomment-1050305846 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 47s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 33m 43s | | trunk passed | | +1 :green_heart: | compile | 23m 53s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 20m 46s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 8s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 40s | | trunk passed | | +1 :green_heart: | javadoc | 1m 14s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 42s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 28s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 39s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 0s | | the patch passed | | +1 :green_heart: | compile | 23m 1s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 23m 1s | | the patch passed | | +1 :green_heart: | compile | 20m 37s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 37s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 4s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 39s | | the patch passed | | +1 :green_heart: | javadoc | 1m 9s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 40s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 43s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 44s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 6s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 55s | | The patch does not generate ASF License warnings. 
| | | | 204m 43s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4030/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4030 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux ecc3a7dae72f 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / efdfab807f4dffd48dfd1c1e4d0af5ffbdc64136 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4030/1/testReport/ | | Max. process+thread count | 1251 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4030/1/console | | v
[GitHub] [hadoop] hadoop-yetus commented on pull request #4030: HADOOP-18143. toString method of RpcCall throws IllegalArgumentException
hadoop-yetus commented on pull request #4030: URL: https://github.com/apache/hadoop/pull/4030#issuecomment-1050305846 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 47s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 33m 43s | | trunk passed | | +1 :green_heart: | compile | 23m 53s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 20m 46s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 8s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 40s | | trunk passed | | +1 :green_heart: | javadoc | 1m 14s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 42s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 28s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 39s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 0s | | the patch passed | | +1 :green_heart: | compile | 23m 1s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 23m 1s | | the patch passed | | +1 :green_heart: | compile | 20m 37s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 37s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 4s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 39s | | the patch passed | | +1 :green_heart: | javadoc | 1m 9s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 40s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 43s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 44s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 6s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 55s | | The patch does not generate ASF License warnings. 
| | | | 204m 43s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4030/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4030 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux ecc3a7dae72f 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / efdfab807f4dffd48dfd1c1e4d0af5ffbdc64136 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4030/1/testReport/ | | Max. process+thread count | 1251 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4030/1/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For querie
[GitHub] [hadoop] hadoop-yetus commented on pull request #4028: HDFS-16481. Provide support to set Http and Rpc ports in MiniJournalCluster
hadoop-yetus commented on pull request #4028: URL: https://github.com/apache/hadoop/pull/4028#issuecomment-1050170528 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 50s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | -1 :x: | mvninstall | 20m 36s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. | | -1 :x: | compile | 0m 27s | [/branch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | hadoop-hdfs in trunk failed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04. | | -1 :x: | compile | 0m 27s | [/branch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-hdfs in trunk failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | -0 :warning: | checkstyle | 0m 26s | [/buildtool-branch-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/buildtool-branch-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt) | The patch fails to run checkstyle in hadoop-hdfs | | -1 :x: | mvnsite | 0m 28s | [/branch-mvnsite-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/branch-mvnsite-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in trunk failed. | | -1 :x: | javadoc | 0m 28s | [/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | hadoop-hdfs in trunk failed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04. | | -1 :x: | javadoc | 0m 27s | [/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-hdfs in trunk failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | -1 :x: | spotbugs | 0m 28s | [/branch-spotbugs-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/branch-spotbugs-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in trunk failed. | | +1 :green_heart: | shadedclient | 3m 17s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | -1 :x: | mvninstall | 0m 22s | [/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch failed. | | -1 :x: | compile | 0m 22s | [/patch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/patch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | hadoop-hdfs in the patch failed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04. | | -1 :x: | javac | 0m 22s | [/patch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/patch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | hadoop-hdfs in the patch failed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04. | | -1 :x: | compile | 0m 23s | [/patch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/2/artifact/out/patch-compile-hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-1.8.0_31
[jira] [Updated] (HADOOP-18139) Allow configuration of zookeeper server principal
[ https://issues.apache.org/jira/browse/HADOOP-18139?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Íñigo Goiri updated HADOOP-18139: - Description: Allow configuration of zookeeper server principal. This would allow the Router to specify the principal. > Allow configuration of zookeeper server principal > - > > Key: HADOOP-18139 > URL: https://issues.apache.org/jira/browse/HADOOP-18139 > Project: Hadoop Common > Issue Type: Improvement > Components: auth >Reporter: Owen O'Malley >Assignee: Owen O'Malley >Priority: Major > > Allow configuration of zookeeper server principal. > This would allow the Router to specify the principal.
[jira] [Updated] (HADOOP-18139) Allow configuration of zookeeper server principal
[ https://issues.apache.org/jira/browse/HADOOP-18139?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Íñigo Goiri updated HADOOP-18139: - Summary: Allow configuration of zookeeper server principal (was: RBF: Allow configuration of zookeeper server principal in router) > Allow configuration of zookeeper server principal > - > > Key: HADOOP-18139 > URL: https://issues.apache.org/jira/browse/HADOOP-18139 > Project: Hadoop Common > Issue Type: Improvement > Components: auth >Reporter: Owen O'Malley >Assignee: Owen O'Malley >Priority: Major >
[jira] [Commented] (HADOOP-18142) Increase precommit job timeout from 24 hr to 30 hr
[ https://issues.apache.org/jira/browse/HADOOP-18142?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17497613#comment-17497613 ] Viraj Jasani commented on HADOOP-18142: --- Thanks [~kgyrtkirk] for the nice suggestions! If you would like to take up this work, please feel free to go ahead and create a PR. > Increase precommit job timeout from 24 hr to 30 hr > -- > > Key: HADOOP-18142 > URL: https://issues.apache.org/jira/browse/HADOOP-18142 > Project: Hadoop Common > Issue Type: Task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > > As per some recent precommit build results, full build QA is not getting > completed in 24 hr (recent example > [here|https://github.com/apache/hadoop/pull/4000] where more than 5 builds > timed out after 24 hr). We should increase it to 30 hr.
[jira] [Work logged] (HADOOP-18131) Upgrade maven enforcer plugin and relevant dependencies
[ https://issues.apache.org/jira/browse/HADOOP-18131?focusedWorklogId=732556&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-732556 ] ASF GitHub Bot logged work on HADOOP-18131: --- Author: ASF GitHub Bot Created on: 24/Feb/22 18:11 Start Date: 24/Feb/22 18:11 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #4000: URL: https://github.com/apache/hadoop/pull/4000#issuecomment-1050125099 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 53s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 56s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 26m 8s | | trunk passed | | +1 :green_heart: | compile | 27m 0s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 22m 23s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 3m 56s | | trunk passed | | +1 :green_heart: | mvnsite | 28m 13s | | trunk passed | | +1 :green_heart: | javadoc | 9m 12s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 9m 43s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 0m 25s | | branch/hadoop-minicluster no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 26s | | branch/hadoop-tools/hadoop-tools-dist no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 27s | | branch/hadoop-client-modules/hadoop-client-api no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 26s | | branch/hadoop-client-modules/hadoop-client-runtime no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 27s | | branch/hadoop-client-modules/hadoop-client-check-invariants no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 25s | | branch/hadoop-client-modules/hadoop-client-minicluster no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 28s | | branch/hadoop-client-modules/hadoop-client-check-test-invariants no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 27s | | branch/hadoop-client-modules/hadoop-client-integration-tests no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 26s | | branch/hadoop-cloud-storage-project/hadoop-cloud-storage no spotbugs output file (spotbugsXml.xml) | | +1 :green_heart: | shadedclient | 63m 58s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 40s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 79m 31s | | the patch passed | | +1 :green_heart: | compile | 28m 49s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 28m 49s | | the patch passed | | +1 :green_heart: | compile | 25m 42s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 25m 42s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 4m 13s | | the patch passed | | +1 :green_heart: | mvnsite | 25m 16s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 2m 0s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 10m 7s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 9m 42s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 0m 20s | | hadoop-minicluster has no data from spotbugs | | +0 :ok: | spotbugs | 0m 28s | | hadoop-tools/hadoop-tools-dis
[GitHub] [hadoop] hadoop-yetus commented on pull request #4000: HADOOP-18131. Upgrade maven enforcer plugin and relevant dependencies
hadoop-yetus commented on pull request #4000: URL: https://github.com/apache/hadoop/pull/4000#issuecomment-1050125099 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 53s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 56s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 26m 8s | | trunk passed | | +1 :green_heart: | compile | 27m 0s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 22m 23s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 3m 56s | | trunk passed | | +1 :green_heart: | mvnsite | 28m 13s | | trunk passed | | +1 :green_heart: | javadoc | 9m 12s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 9m 43s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 0m 25s | | branch/hadoop-minicluster no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 26s | | branch/hadoop-tools/hadoop-tools-dist no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 27s | | branch/hadoop-client-modules/hadoop-client-api no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 26s | | branch/hadoop-client-modules/hadoop-client-runtime no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 27s | | branch/hadoop-client-modules/hadoop-client-check-invariants no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 25s | | branch/hadoop-client-modules/hadoop-client-minicluster no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 28s | | branch/hadoop-client-modules/hadoop-client-check-test-invariants no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 27s | | branch/hadoop-client-modules/hadoop-client-integration-tests no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 26s | | branch/hadoop-cloud-storage-project/hadoop-cloud-storage no spotbugs output file (spotbugsXml.xml) | | +1 :green_heart: | shadedclient | 63m 58s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 40s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 79m 31s | | the patch passed | | +1 :green_heart: | compile | 28m 49s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 28m 49s | | the patch passed | | +1 :green_heart: | compile | 25m 42s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 25m 42s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 4m 13s | | the patch passed | | +1 :green_heart: | mvnsite | 25m 16s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 2m 0s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 10m 7s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 9m 42s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 0m 20s | | hadoop-minicluster has no data from spotbugs | | +0 :ok: | spotbugs | 0m 28s | | hadoop-tools/hadoop-tools-dist has no data from spotbugs | | +0 :ok: | spotbugs | 0m 28s | | hadoop-client-modules/hadoop-client-api has no data from spotbugs | | +0 :ok: | spotbugs | 0m 25s | | hadoop-client-modules/hadoop-client-runtime has no data from spotbugs | | +0 :ok: | spotbugs | 0m 24s | | hadoop-client-modules/hadoop-client-check-invariants has no data from spotbugs | | +0 :ok: | spotbugs | 0m 31s | |
[GitHub] [hadoop] xkrogen commented on pull request #3976: HDFS-16452. msync RPC should send to active namenode directly
xkrogen commented on pull request #3976: URL: https://github.com/apache/hadoop/pull/3976#issuecomment-1050121895 I agree with you that the current logic has RPCs that are theoretically unnecessary. I would be very open to some approach that allows `ObserverReadProxyProvider` to share information with its `failoverProxy` instance (either direction, from ORPP to `failoverProxy` or from `failoverProxy` to ORPP, could be useful) to eliminate unnecessary work. But we need to make sure it is layered cleanly, so that `failoverProxy` and ORPP both continue to respect their duties: `failoverProxy` is responsible for finding/contacting the Active NN (including for `msync`), and ORPP is responsible for finding/contacting Observer NNs.
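A purely illustrative sketch of the layering described in this comment, with invented names (it is not the real ObserverReadProxyProvider): the observer-read layer selects Observer NameNodes for reads and delegates msync, like any other call that must reach the Active, to the failover proxy beneath it.

```java
/** Invented names for illustration; not the real ObserverReadProxyProvider code. */
interface NameNodeCalls {
  void msync();
  Object read(String src);
}

class ObserverReadLayerSketch implements NameNodeCalls {

  private final NameNodeCalls failoverProxy; // finds/contacts the Active NN
  private final NameNodeCalls observerProxy; // finds/contacts Observer NNs

  ObserverReadLayerSketch(NameNodeCalls failoverProxy, NameNodeCalls observerProxy) {
    this.failoverProxy = failoverProxy;
    this.observerProxy = observerProxy;
  }

  @Override
  public void msync() {
    // msync must reach the Active, so it is delegated to the failover
    // layer rather than this layer probing NameNodes itself.
    failoverProxy.msync();
  }

  @Override
  public Object read(String src) {
    // Reads are served by an Observer chosen by this layer.
    return observerProxy.read(src);
  }
}
```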
[GitHub] [hadoop] omalley commented on pull request #4024: Hadoop 18139
omalley commented on pull request #4024: URL: https://github.com/apache/hadoop/pull/4024#issuecomment-1050121255 I looked into adding unit tests, but the test zookeeper instance that the unit tests use doesn't support security, so it can't test this patch, which only matters if the zookeeper service is running with kerberos authentication.
[GitHub] [hadoop] hadoop-yetus commented on pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
hadoop-yetus commented on pull request #3828: URL: https://github.com/apache/hadoop/pull/3828#issuecomment-1050094467 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 49s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 2s | | trunk passed | | +1 :green_heart: | compile | 1m 30s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 1m 19s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 2s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 31s | | trunk passed | | +1 :green_heart: | javadoc | 1m 2s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 31s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 14s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 52s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 27s | | the patch passed | | +1 :green_heart: | compile | 1m 35s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 1m 35s | | the patch passed | | +1 :green_heart: | compile | 1m 22s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 1m 22s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 0m 59s | [/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/8/artifact/out/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs-project/hadoop-hdfs: The patch generated 2 new + 136 unchanged - 2 fixed = 138 total (was 138) | | +1 :green_heart: | mvnsite | 1m 30s | | the patch passed | | +1 :green_heart: | javadoc | 1m 1s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 32s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 44s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 43s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 35m 43s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/8/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. | | +0 :ok: | asflicense | 0m 28s | | ASF License check generated no output? 
| | | | 140m 22s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.TestDFSStorageStateRecovery | | | hadoop.hdfs.TestReadStripedFileWithMissingBlocks | | | hadoop.hdfs.TestGetBlocks | | | hadoop.hdfs.TestReadStripedFileWithDNFailure | | | hadoop.hdfs.client.impl.TestBlockReaderLocal | | | hadoop.hdfs.web.TestWebHDFSForHA | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/8/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3828 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux c937a0d6c41b 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 9acc63a94679eafb9ae8e57108890794d0293685 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/8/testReport/ | | Max. process+thr
[jira] [Commented] (HADOOP-18143) toString method of RpcCall throws IllegalArgumentException
[ https://issues.apache.org/jira/browse/HADOOP-18143?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17497588#comment-17497588 ] András Győri commented on HADOOP-18143: --- Created a PR for it, in which I wrap the affected functions with a mutex. As getRequestHeader is more like getOrCreateRequestHeader, I did not see the benefit of dividing the lock as separate read/write locks (as in both methods we would just end up using the write lock anyway). Not sure whether it is acceptable in terms of performance. cc [~vinayakumarb] > toString method of RpcCall throws IllegalArgumentException > -- > > Key: HADOOP-18143 > URL: https://issues.apache.org/jira/browse/HADOOP-18143 > Project: Hadoop Common > Issue Type: Bug >Reporter: András Győri >Assignee: András Győri >Priority: Critical > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > We have observed breaking tests such as TestApplicationACLs. We have located > the root cause, which is HADOOP-18082. It seems that there is a concurrency > issue within ProtobufRpcEngine2. When using a debugger, the missing fields > are there, hence the suspicion of concurrency problem. The stack trace: > {noformat} > java.lang.IllegalArgumentException > at java.nio.Buffer.position(Buffer.java:244) > at > org.apache.hadoop.ipc.RpcWritable$ProtobufWrapper.readFrom(RpcWritable.java:131) > at org.apache.hadoop.ipc.RpcWritable$Buffer.getValue(RpcWritable.java:232) > at > org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.getRequestHeader(ProtobufRpcEngine2.java:645) > at > org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.toString(ProtobufRpcEngine2.java:663) > at java.lang.String.valueOf(String.java:3425) > at java.lang.StringBuilder.append(StringBuilder.java:516) > at org.apache.hadoop.ipc.Server$RpcCall.toString(Server.java:1328) > at java.lang.String.valueOf(String.java:3425) > at java.lang.StringBuilder.append(StringBuilder.java:516) > at org.apache.hadoop.ipc.Server$Handler.run(Server.java:3097){noformat} -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
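A minimal sketch of the locking approach described in the comment above, using a hypothetical stand-in class rather than the real ProtobufRpcEngine2.RpcProtobufRequest: because getRequestHeader behaves like getOrCreateRequestHeader (the first call parses and caches the header), both it and toString take the same intrinsic lock, which is why splitting it into separate read/write locks would not help.

```java
// Hypothetical simplification for illustration only; field and class names are
// not the actual Hadoop code. It shows "wrap the affected methods with one mutex"
// around a lazily-parsed request header.
public class LazyRequestHeader {
  private final byte[] wireBytes;   // serialized header as received off the wire
  private Object header;            // parsed form, created on first access

  public LazyRequestHeader(byte[] wireBytes) {
    this.wireBytes = wireBytes;
  }

  // Effectively "get or create": the first call mutates shared state, so every
  // caller, readers included, must hold the same lock.
  public synchronized Object getRequestHeader() {
    if (header == null) {
      header = parse(wireBytes);
    }
    return header;
  }

  @Override
  public synchronized String toString() {
    // Intrinsic locks are reentrant, so this nested call is safe.
    return "request header = " + getRequestHeader();
  }

  private Object parse(byte[] bytes) {
    // Placeholder for the real protobuf parsing step.
    return new String(bytes, java.nio.charset.StandardCharsets.UTF_8);
  }
}
```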
[jira] [Updated] (HADOOP-18143) toString method of RpcCall throws IllegalArgumentException
[ https://issues.apache.org/jira/browse/HADOOP-18143?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-18143: Labels: pull-request-available (was: ) > toString method of RpcCall throws IllegalArgumentException > -- > > Key: HADOOP-18143 > URL: https://issues.apache.org/jira/browse/HADOOP-18143 > Project: Hadoop Common > Issue Type: Bug >Reporter: András Győri >Assignee: András Győri >Priority: Critical > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > We have observed breaking tests such as TestApplicationACLs. We have located > the root cause, which is HADOOP-18082. It seems that there is a concurrency > issue within ProtobufRpcEngine2. When using a debugger, the missing fields > are there, hence the suspicion of concurrency problem. The stack trace: > {noformat} > java.lang.IllegalArgumentException > at java.nio.Buffer.position(Buffer.java:244) > at > org.apache.hadoop.ipc.RpcWritable$ProtobufWrapper.readFrom(RpcWritable.java:131) > at org.apache.hadoop.ipc.RpcWritable$Buffer.getValue(RpcWritable.java:232) > at > org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.getRequestHeader(ProtobufRpcEngine2.java:645) > at > org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.toString(ProtobufRpcEngine2.java:663) > at java.lang.String.valueOf(String.java:3425) > at java.lang.StringBuilder.append(StringBuilder.java:516) > at org.apache.hadoop.ipc.Server$RpcCall.toString(Server.java:1328) > at java.lang.String.valueOf(String.java:3425) > at java.lang.StringBuilder.append(StringBuilder.java:516) > at org.apache.hadoop.ipc.Server$Handler.run(Server.java:3097){noformat} -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18143) toString method of RpcCall throws IllegalArgumentException
[ https://issues.apache.org/jira/browse/HADOOP-18143?focusedWorklogId=732519&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-732519 ] ASF GitHub Bot logged work on HADOOP-18143: --- Author: ASF GitHub Bot Created on: 24/Feb/22 17:29 Start Date: 24/Feb/22 17:29 Worklog Time Spent: 10m Work Description: 9uapaw opened a new pull request #4030: URL: https://github.com/apache/hadoop/pull/4030 ### Description of PR ### How was this patch tested? ### For code changes: - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 732519) Remaining Estimate: 0h Time Spent: 10m > toString method of RpcCall throws IllegalArgumentException > -- > > Key: HADOOP-18143 > URL: https://issues.apache.org/jira/browse/HADOOP-18143 > Project: Hadoop Common > Issue Type: Bug >Reporter: András Győri >Assignee: András Győri >Priority: Critical > Time Spent: 10m > Remaining Estimate: 0h > > We have observed breaking tests such as TestApplicationACLs. We have located > the root cause, which is HADOOP-18082. It seems that there is a concurrency > issue within ProtobufRpcEngine2. When using a debugger, the missing fields > are there, hence the suspicion of concurrency problem. The stack trace: > {noformat} > java.lang.IllegalArgumentException > at java.nio.Buffer.position(Buffer.java:244) > at > org.apache.hadoop.ipc.RpcWritable$ProtobufWrapper.readFrom(RpcWritable.java:131) > at org.apache.hadoop.ipc.RpcWritable$Buffer.getValue(RpcWritable.java:232) > at > org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.getRequestHeader(ProtobufRpcEngine2.java:645) > at > org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.toString(ProtobufRpcEngine2.java:663) > at java.lang.String.valueOf(String.java:3425) > at java.lang.StringBuilder.append(StringBuilder.java:516) > at org.apache.hadoop.ipc.Server$RpcCall.toString(Server.java:1328) > at java.lang.String.valueOf(String.java:3425) > at java.lang.StringBuilder.append(StringBuilder.java:516) > at org.apache.hadoop.ipc.Server$Handler.run(Server.java:3097){noformat} -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] 9uapaw opened a new pull request #4030: HADOOP-18143. toString method of RpcCall throws IllegalArgumentException
9uapaw opened a new pull request #4030: URL: https://github.com/apache/hadoop/pull/4030 ### Description of PR ### How was this patch tested? ### For code changes: - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] simbadzina commented on a change in pull request #4024: Hadoop 18139
simbadzina commented on a change in pull request #4024: URL: https://github.com/apache/hadoop/pull/4024#discussion_r814105513 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/curator/ZKCuratorManager.java ## @@ -428,4 +434,26 @@ public void setData(String path, byte[] data, int version) .forPath(path, data)); } } + + public static class HadoopZookeeperFactory implements ZookeeperFactory { +private final String zkPrincipal; + +public HadoopZookeeperFactory(String zkPrincipal) { + this.zkPrincipal = zkPrincipal; +} + +@Override +public ZooKeeper newZooKeeper(String connectString, int sessionTimeout, + Watcher watcher, boolean canBeReadOnly +) throws Exception { + ZKClientConfig zkClientConfig = new ZKClientConfig(); + if (zkPrincipal != null) { +LOG.info("Configuring zookeeper client to use {}", zkPrincipal); Review comment: Can you add "as the server principal" at the end of the log message so that it's clear what the value is being used for. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
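A small, self-contained sketch of the wording the reviewer is asking for; the class below is hypothetical (the real change would land in ZKCuratorManager's HadoopZookeeperFactory), and it assumes SLF4J on the classpath, which Hadoop already uses.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Illustrative class only; just the log message text matters here.
public class PrincipalLogExample {
  private static final Logger LOG =
      LoggerFactory.getLogger(PrincipalLogExample.class);

  static void logConfiguredPrincipal(String zkPrincipal) {
    if (zkPrincipal != null) {
      // Now states what the value is used for, as requested in the review.
      LOG.info("Configuring zookeeper client to use {} as the server principal",
          zkPrincipal);
    }
  }

  public static void main(String[] args) {
    logConfiguredPrincipal("zookeeper/host.example.com@EXAMPLE.COM");
  }
}
```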
[GitHub] [hadoop] simbadzina commented on a change in pull request #4024: Hadoop 18139
simbadzina commented on a change in pull request #4024: URL: https://github.com/apache/hadoop/pull/4024#discussion_r814105002 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/ZKDelegationTokenSecretManager.java ## @@ -52,9 +52,11 @@ import org.apache.hadoop.classification.InterfaceAudience.Private; import org.apache.hadoop.classification.InterfaceStability.Unstable; import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.CommonConfigurationKeys; Review comment: Unused import. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-15245) S3AInputStream.skip() to use lazy seek
[ https://issues.apache.org/jira/browse/HADOOP-15245?focusedWorklogId=732507&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-732507 ] ASF GitHub Bot logged work on HADOOP-15245: --- Author: ASF GitHub Bot Created on: 24/Feb/22 17:13 Start Date: 24/Feb/22 17:13 Worklog Time Spent: 10m Work Description: dannycjones commented on a change in pull request #3927: URL: https://github.com/apache/hadoop/pull/3927#discussion_r814094303 ## File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/statistics/impl/EmptyS3AStatisticsContext.java ## @@ -317,6 +322,9 @@ public long getInputPolicy() { return 0; } +@Override +public long getSkipOperations() { return 0; } Review comment: This line introduces a new checkstyle violation, can you drop the return on to a new line? ``` ./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/statistics/impl/EmptyS3AStatisticsContext.java:326: public long getSkipOperations() { return 0; }:37: '{' at column 37 should have line break after. [LeftCurly] ``` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 732507) Time Spent: 1h 10m (was: 1h) > S3AInputStream.skip() to use lazy seek > -- > > Key: HADOOP-15245 > URL: https://issues.apache.org/jira/browse/HADOOP-15245 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.1.0 >Reporter: Steve Loughran >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > the default skip() does a read and discard of all bytes, no matter how far > ahead the skip is. This is very inefficient if the skip() is being done on > S3A random IO, though exactly what to do when in sequential mode. > Proposed: > * add an optimized version of S3AInputStream.skip() which does a lazy seek, > which itself will decided when to skip() vs issue a new GET. > * add some more instrumentation to measure how often this gets used -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] dannycjones commented on a change in pull request #3927: HADOOP-15245. S3AInputStream.skip() to use lazy seek
dannycjones commented on a change in pull request #3927: URL: https://github.com/apache/hadoop/pull/3927#discussion_r814094303 ## File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/statistics/impl/EmptyS3AStatisticsContext.java ## @@ -317,6 +322,9 @@ public long getInputPolicy() { return 0; } +@Override +public long getSkipOperations() { return 0; } Review comment: This line introduces a new checkstyle violation, can you drop the return on to a new line? ``` ./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/statistics/impl/EmptyS3AStatisticsContext.java:326: public long getSkipOperations() { return 0; }:37: '{' at column 37 should have line break after. [LeftCurly] ``` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
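A minimal illustration of the formatting fix being requested; the types below are hypothetical stand-ins (the real method lives in EmptyS3AStatisticsContext), and the only point is that the method body moves off the line with the opening brace so the LeftCurly rule passes.

```java
// Hypothetical interface/class pair for illustration only.
interface SkipStatistics {
  long getSkipOperations();
}

class EmptySkipStatistics implements SkipStatistics {
  // Flagged form:   public long getSkipOperations() { return 0; }
  // Accepted form:  line break after the opening brace, as below.
  @Override
  public long getSkipOperations() {
    return 0;
  }
}
```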
[GitHub] [hadoop] tasanuma commented on pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
tasanuma commented on pull request #3828: URL: https://github.com/apache/hadoop/pull/3828#issuecomment-1050029461 Merged. Thanks for your contribution, @tomscut! -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] tasanuma merged pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
tasanuma merged pull request #3828: URL: https://github.com/apache/hadoop/pull/3828 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18143) toString method of RpcCall throws IllegalArgumentException
[ https://issues.apache.org/jira/browse/HADOOP-18143?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] András Győri updated HADOOP-18143: -- Description: We have observed breaking tests such as TestApplicationACLs. We have located the root cause, which is HADOOP-18082. It seems that there is a concurrency issue within ProtobufRpcEngine2. When using a debugger, the missing fields are there, hence the suspicion of concurrency problem. The stack trace: {noformat} java.lang.IllegalArgumentException at java.nio.Buffer.position(Buffer.java:244) at org.apache.hadoop.ipc.RpcWritable$ProtobufWrapper.readFrom(RpcWritable.java:131) at org.apache.hadoop.ipc.RpcWritable$Buffer.getValue(RpcWritable.java:232) at org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.getRequestHeader(ProtobufRpcEngine2.java:645) at org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.toString(ProtobufRpcEngine2.java:663) at java.lang.String.valueOf(String.java:3425) at java.lang.StringBuilder.append(StringBuilder.java:516) at org.apache.hadoop.ipc.Server$RpcCall.toString(Server.java:1328) at java.lang.String.valueOf(String.java:3425) at java.lang.StringBuilder.append(StringBuilder.java:516) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:3097){noformat} > toString method of RpcCall throws IllegalArgumentException > -- > > Key: HADOOP-18143 > URL: https://issues.apache.org/jira/browse/HADOOP-18143 > Project: Hadoop Common > Issue Type: Bug >Reporter: András Győri >Assignee: András Győri >Priority: Critical > > We have observed breaking tests such as TestApplicationACLs. We have located > the root cause, which is HADOOP-18082. It seems that there is a concurrency > issue within ProtobufRpcEngine2. When using a debugger, the missing fields > are there, hence the suspicion of concurrency problem. The stack trace: > {noformat} > java.lang.IllegalArgumentException > at java.nio.Buffer.position(Buffer.java:244) > at > org.apache.hadoop.ipc.RpcWritable$ProtobufWrapper.readFrom(RpcWritable.java:131) > at org.apache.hadoop.ipc.RpcWritable$Buffer.getValue(RpcWritable.java:232) > at > org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.getRequestHeader(ProtobufRpcEngine2.java:645) > at > org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest.toString(ProtobufRpcEngine2.java:663) > at java.lang.String.valueOf(String.java:3425) > at java.lang.StringBuilder.append(StringBuilder.java:516) > at org.apache.hadoop.ipc.Server$RpcCall.toString(Server.java:1328) > at java.lang.String.valueOf(String.java:3425) > at java.lang.StringBuilder.append(StringBuilder.java:516) > at org.apache.hadoop.ipc.Server$Handler.run(Server.java:3097){noformat} -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Created] (HADOOP-18143) toString method of RpcCall is intermittently t
András Győri created HADOOP-18143: - Summary: toString method of RpcCall is intermittently t Key: HADOOP-18143 URL: https://issues.apache.org/jira/browse/HADOOP-18143 Project: Hadoop Common Issue Type: Bug Reporter: András Győri Assignee: András Győri -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18143) toString method of RpcCall throws IllegalArgumentException
[ https://issues.apache.org/jira/browse/HADOOP-18143?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] András Győri updated HADOOP-18143: -- Summary: toString method of RpcCall throws IllegalArgumentException (was: toString method of RpcCall is intermittently t) > toString method of RpcCall throws IllegalArgumentException > -- > > Key: HADOOP-18143 > URL: https://issues.apache.org/jira/browse/HADOOP-18143 > Project: Hadoop Common > Issue Type: Bug >Reporter: András Győri >Assignee: András Győri >Priority: Critical > -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4028: HDFS-16481. Provide support to set Http and Rpc ports in MiniJournalCluster
hadoop-yetus commented on pull request #4028: URL: https://github.com/apache/hadoop/pull/4028#issuecomment-1050004062 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 5s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 36m 34s | | trunk passed | | +1 :green_heart: | compile | 1m 47s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 1m 37s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 9s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 39s | | trunk passed | | +1 :green_heart: | javadoc | 1m 4s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 32s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 4m 7s | | trunk passed | | +1 :green_heart: | shadedclient | 28m 57s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 20s | | the patch passed | | +1 :green_heart: | compile | 1m 26s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 1m 26s | | the patch passed | | +1 :green_heart: | compile | 1m 25s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 1m 25s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 0m 53s | [/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/1/artifact/out/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs-project/hadoop-hdfs: The patch generated 2 new + 2 unchanged - 0 fixed = 4 total (was 2) | | +1 :green_heart: | mvnsite | 1m 31s | | the patch passed | | +1 :green_heart: | javadoc | 1m 0s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 31s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 45s | | the patch passed | | +1 :green_heart: | shadedclient | 26m 45s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 376m 16s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. 
| | | | 492m 43s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.server.mover.TestMover | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4028 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 1e8e24d1b39f 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 948305f889420b9b502a9777879227178171cf7b | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/1/testReport/ | | Max. process+thread count | 2135 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4028/1/console | | versions |
[GitHub] [hadoop] GauthamBanasandra opened a new pull request #4029: [Do not commit] Debug CI issues
GauthamBanasandra opened a new pull request #4029: URL: https://github.com/apache/hadoop/pull/4029 ### Description of PR I'm reverting the changes done till the last known successful run. I'll abandon this PR once my debugging is complete. Please don't merge this. ### How was this patch tested? ### For code changes: - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-17198) Support S3 Access Points
[ https://issues.apache.org/jira/browse/HADOOP-17198?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-17198: Description: Improve VPC integration by supporting access points for buckets https://docs.aws.amazon.com/AmazonS3/latest/dev/access-points.html *important*: when backporting, always include as followup patches * HADOOP-17951 * HADOOP-18085 was: Improve VPC integration by supporting access points for buckets https://docs.aws.amazon.com/AmazonS3/latest/dev/access-points.html *important*: when backporting, always include HADOOP-17951 as the followup patch > Support S3 Access Points > > > Key: HADOOP-17198 > URL: https://issues.apache.org/jira/browse/HADOOP-17198 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.0 >Reporter: Steve Loughran >Assignee: Bogdan Stolojan >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.2 > > Time Spent: 16.5h > Remaining Estimate: 0h > > Improve VPC integration by supporting access points for buckets > https://docs.aws.amazon.com/AmazonS3/latest/dev/access-points.html > *important*: when backporting, always include as followup patches > * HADOOP-17951 > * HADOOP-18085 -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-17946) Update commons-lang to 3.12.0
[ https://issues.apache.org/jira/browse/HADOOP-17946?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-17946: Summary: Update commons-lang to 3.12.0 (was: Update commons-lang to latest 3.x) > Update commons-lang to 3.12.0 > - > > Key: HADOOP-17946 > URL: https://issues.apache.org/jira/browse/HADOOP-17946 > Project: Hadoop Common > Issue Type: Task >Reporter: Sean Busbey >Assignee: Renukaprasad C >Priority: Minor > Labels: pull-request-available > Fix For: 3.4.0, 3.3.2 > > Time Spent: 1h 20m > Remaining Estimate: 0h > > our commons-lang3 dependency is currently 3.7, which is nearly 4 years old. > latest right now is 3.12 and there are at least some fixes that would make us > more robust on JDKs newer than openjdk8 (e.g. LANG-1384. [release notes > indicate 3.9 is the first to support > jdk11|https://commons.apache.org/proper/commons-lang/changes-report.html]). -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18142) Increase precommit job timeout from 24 hr to 30 hr
[ https://issues.apache.org/jira/browse/HADOOP-18142?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17497431#comment-17497431 ] Zoltan Haindrich commented on HADOOP-18142: --- [~vjasani] I've noticed this ticket by chance and would like to mention a few things I've done in Hive to avoid much the same issues: * [use rateLimit|https://github.com/apache/hive/blob/af013246100be85675d18e6dcfcea7f202bc8d2c/Jenkinsfile#L21] to avoid building the same PR multiple times a day; this naturally adds a 6 hour wait before the next build would start * use a global lock to [limit the number|https://github.com/apache/hive/blob/af013246100be85675d18e6dcfcea7f202bc8d2c/Jenkinsfile#L150] of concurrently running builds * [disable concurrent builds|https://github.com/apache/hive/blob/af013246100be85675d18e6dcfcea7f202bc8d2c/Jenkinsfile#L23] as there is no point running the tests for someone who pushed new changes while a build was still running => the contributor will most likely push more commits anyway, which could launch even more builds... not starting a new build means one run can pick up multiple trigger events while the executing one is still running * auto-kill the build in case the PR was updated while it was waiting/running, by calling [this method|https://github.com/apache/hive/blob/master/Jenkinsfile#L30-L45] at a few key points in the build > Increase precommit job timeout from 24 hr to 30 hr > -- > > Key: HADOOP-18142 > URL: https://issues.apache.org/jira/browse/HADOOP-18142 > Project: Hadoop Common > Issue Type: Task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > > As per some recent precommit build results, full build QA is not getting > completed in 24 hr (recent example > [here|https://github.com/apache/hadoop/pull/4000] where more than 5 builds > timed out after 24 hr). We should increase it to 30 hr. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18091) S3A auditing leaks memory through ThreadLocal references
[ https://issues.apache.org/jira/browse/HADOOP-18091?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-18091: Description: {{ActiveAuditManagerS3A}} uses thread locals to map to active audit spans, which (because they are wrapped) include back reference to the audit manager instance and the config it was created with. these *do not* get cleaned up when the FS instance is closed. if you have a long lived process creating and destroying many FS instances, then memory gets used up. This fix moves off threadlocal into a map of weak references. while a strong reference is held` (for example in the s3a entry point method) then the references will always resolve. but if those are released then when a GC is triggered these weak references will not be retained, so not use up memory other than entries in the the ha!sh map. the map is held by the s3a auditing integration, so when the fs is closed, everything is freed up. was: {{ActiveAuditManagerS3A}} uses thread locals to map to active audit spans, which (because they are wrapped) include back reference to the audit manager instance and the config it was created with. these *do not* get cleaned up when the FS instance is closed. if you have a long lived process creating and destroying many FS instances, then memory gets used up. l > S3A auditing leaks memory through ThreadLocal references > > > Key: HADOOP-18091 > URL: https://issues.apache.org/jira/browse/HADOOP-18091 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.2 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Fix For: 3.3.3 > > Time Spent: 7h 40m > Remaining Estimate: 0h > > {{ActiveAuditManagerS3A}} uses thread locals to map to active audit spans, > which (because they are wrapped) include back reference to the audit manager > instance and the config it was created with. > these *do not* get cleaned up when the FS instance is closed. > if you have a long lived process creating and destroying many FS instances, > then memory gets used up. > This fix moves off threadlocal into a map of weak references. while a strong > reference is held` (for example in the s3a entry point method) then the > references will always resolve. but if those are released then when a GC is > triggered these weak references will not be retained, so not use up memory > other than entries in the the ha!sh map. the map is held by the s3a auditing > integration, so when the fs is closed, everything is freed up. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
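A minimal sketch of the "map of weak references instead of ThreadLocal" idea the description outlines; this is not the actual ActiveAuditManagerS3A code, just the shape of it under simplifying assumptions: spans resolve while a caller holds a strong reference, become collectable once nothing does, and the whole map is released when the owning instance is closed.

```java
import java.lang.ref.WeakReference;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical, simplified span store; names and structure are illustrative only.
public class WeakSpanStore<S> {
  // Keyed by thread id; values are weak, so a span can be collected once no
  // caller holds a strong reference to it and a GC runs.
  private final Map<Long, WeakReference<S>> spansByThread = new ConcurrentHashMap<>();

  public void setActiveSpan(S span) {
    spansByThread.put(Thread.currentThread().getId(), new WeakReference<>(span));
  }

  public S getActiveSpan() {
    WeakReference<S> ref = spansByThread.get(Thread.currentThread().getId());
    return ref == null ? null : ref.get();  // null if unset or already collected
  }

  // Called from the owner's close(): drop every entry held by this instance.
  public void clear() {
    spansByThread.clear();
  }
}
```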
[jira] [Work logged] (HADOOP-18128) outputstream.md typo issue
[ https://issues.apache.org/jira/browse/HADOOP-18128?focusedWorklogId=732336&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-732336 ] ASF GitHub Bot logged work on HADOOP-18128: --- Author: ASF GitHub Bot Created on: 24/Feb/22 14:22 Start Date: 24/Feb/22 14:22 Worklog Time Spent: 10m Work Description: ted12138 commented on pull request #4025: URL: https://github.com/apache/hadoop/pull/4025#issuecomment-1049909702 > There are some other typos(such as `implmentation`, `satisifed`), please modify them together. Thank you. ok, got it. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 732336) Time Spent: 40m (was: 0.5h) > outputstream.md typo issue > -- > > Key: HADOOP-18128 > URL: https://issues.apache.org/jira/browse/HADOOP-18128 > Project: Hadoop Common > Issue Type: Improvement >Reporter: leo sun >Assignee: leo sun >Priority: Major > Labels: pull-request-available > Attachments: image-2022-02-17-10-53-01-704.png > > Time Spent: 40m > Remaining Estimate: 0h > > There is a typo issue in outputstream.md on the branch – trunk > !image-2022-02-17-10-53-01-704.png! -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] ted12138 commented on pull request #4025: HADOOP-18128. Fix typo issues of outputstream.md
ted12138 commented on pull request #4025: URL: https://github.com/apache/hadoop/pull/4025#issuecomment-1049909702 > There are some other typos(such as `implmentation`, `satisifed`), please modify them together. Thank you. ok, got it. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4003: HDFS-16462. Make HDFS get tool cross platform
hadoop-yetus commented on pull request #4003: URL: https://github.com/apache/hadoop/pull/4003#issuecomment-1049897824 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 79m 5s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 43m 59s | | trunk passed | | +1 :green_heart: | compile | 4m 33s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 32s | | trunk passed | | -1 :x: | shadedclient | 76m 4s | | branch has errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 17s | | the patch passed | | +1 :green_heart: | compile | 4m 34s | | the patch passed | | +1 :green_heart: | cc | 4m 34s | | the patch passed | | +1 :green_heart: | golang | 4m 34s | | the patch passed | | +1 :green_heart: | javac | 4m 34s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 21s | | the patch passed | | -1 :x: | shadedclient | 22m 3s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 20m 7s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-native-client.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/7/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-native-client.txt) | hadoop-hdfs-native-client in the patch failed. | | +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. | | | | 205m 35s | | | | Reason | Tests | |---:|:--| | Failed CTEST tests | test_test_libhdfs_ops_hdfs_static | | | test_test_libhdfs_threaded_hdfs_static | | | test_test_libhdfs_zerocopy_hdfs_static | | | test_test_native_mini_dfs | | | test_libhdfs_threaded_hdfspp_test_shim_static | | | test_hdfspp_mini_dfs_smoke_hdfspp_test_shim_static | | | libhdfs_mini_stress_valgrind_hdfspp_test_static | | | memcheck_libhdfs_mini_stress_valgrind_hdfspp_test_static | | | test_libhdfs_mini_stress_hdfspp_test_shim_static | | | test_hdfs_ext_hdfspp_test_shim_static | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/7/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4003 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell golang | | uname | Linux 28ef380f6fa1 4.15.0-163-generic #171-Ubuntu SMP Fri Nov 5 11:55:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c29a2316e9e54e9471b6ddafa3b9af6a913705bf | | Default Java | Red Hat, Inc.-1.8.0_322-b06 | | CTEST | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/7/artifact/out/patch-hadoop-hdfs-project_hadoop-hdfs-native-client-ctest.txt | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/7/testReport/ | | Max. process+thread count | 596 (vs. 
ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/7/console | | versions | git=2.9.5 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4003: HDFS-16462. Make HDFS get tool cross platform
hadoop-yetus commented on pull request #4003: URL: https://github.com/apache/hadoop/pull/4003#issuecomment-1049872050 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 38m 40s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 36m 29s | | trunk passed | | +1 :green_heart: | compile | 4m 30s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 32s | | trunk passed | | +1 :green_heart: | shadedclient | 68m 30s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 24s | | the patch passed | | +1 :green_heart: | compile | 4m 22s | | the patch passed | | +1 :green_heart: | cc | 4m 22s | | the patch passed | | +1 :green_heart: | golang | 4m 22s | | the patch passed | | +1 :green_heart: | javac | 4m 22s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 18s | | the patch passed | | +1 :green_heart: | shadedclient | 24m 44s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 1m 45s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-native-client.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/8/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-native-client.txt) | hadoop-hdfs-native-client in the patch failed. | | +1 :green_heart: | asflicense | 0m 41s | | The patch does not generate ASF License warnings. | | | | 141m 46s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/8/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4003 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell golang | | uname | Linux 457a702d7080 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c29a2316e9e54e9471b6ddafa3b9af6a913705bf | | Default Java | Red Hat, Inc.-1.8.0_322-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/8/testReport/ | | Max. process+thread count | 548 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4003/8/console | | versions | git=2.9.5 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3785: YARN-11042. Fix testQueueSubmitWithACLsEnabledWithQueueMapping in Tes…
hadoop-yetus commented on pull request #3785: URL: https://github.com/apache/hadoop/pull/3785#issuecomment-1049782417 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 51s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 35m 32s | | trunk passed | | +1 :green_heart: | compile | 1m 7s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 0m 57s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 0m 44s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 5s | | trunk passed | | +1 :green_heart: | javadoc | 0m 52s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 44s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 4s | | trunk passed | | +1 :green_heart: | shadedclient | 24m 49s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 55s | | the patch passed | | +1 :green_heart: | compile | 0m 59s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 0m 59s | | the patch passed | | +1 :green_heart: | compile | 0m 50s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 50s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 39s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 55s | | the patch passed | | +1 :green_heart: | javadoc | 0m 40s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 37s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 3s | | the patch passed | | +1 :green_heart: | shadedclient | 26m 14s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 165m 9s | [/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3785/8/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt) | hadoop-yarn-server-resourcemanager in the patch passed. | | -1 :x: | asflicense | 0m 44s | [/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3785/8/artifact/out/results-asflicense.txt) | The patch generated 1 ASF License warnings. 
| | | | 267m 33s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.yarn.server.resourcemanager.TestApplicationACLs | | | hadoop.yarn.server.resourcemanager.scheduler.fair.TestFairSchedulerQueueACLs | | | hadoop.yarn.server.resourcemanager.scheduler.capacity.TestCapacitySchedulerQueueACLs | | | hadoop.yarn.server.resourcemanager.TestRMAdminService | | | hadoop.yarn.server.resourcemanager.security.TestClientToAMTokens | | | hadoop.yarn.server.resourcemanager.scheduler.capacity.TestApplicationPriorityACLs | | | hadoop.yarn.server.resourcemanager.security.TestAMRMTokens | | | hadoop.yarn.server.resourcemanager.TestClientRMService | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3785/8/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3785 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux ce0e188e841d 4.15.0-163-generic #171-Ubuntu SMP Fri Nov 5 11:55:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c237aa2b23e6ce84ea66d2057f89c397b6c5a3bd | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi
[GitHub] [hadoop] tomscut commented on pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
tomscut commented on pull request #3828: URL: https://github.com/apache/hadoop/pull/3828#issuecomment-1049765538 https://user-images.githubusercontent.com/55134131/155515733-fda10a2c-19fd-4817-8a25-ca27761ddafe.png Hi @tasanuma, those failed unit tests seem related to OOM. Do I need to trigger the build again? Thanks. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] pjfanning commented on pull request #3980: Hadoop-17563 bouncycastle 1.70
pjfanning commented on pull request #3980: URL: https://github.com/apache/hadoop/pull/3980#issuecomment-1049745859 hadoop-auth builds ok for me locally - not sure why it fails in the CI build -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
hadoop-yetus commented on pull request #3828: URL: https://github.com/apache/hadoop/pull/3828#issuecomment-1049745388 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 52s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 43s | | trunk passed | | +1 :green_heart: | compile | 1m 28s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 1m 18s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 1s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 29s | | trunk passed | | +1 :green_heart: | javadoc | 1m 5s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 30s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 14s | | trunk passed | | +1 :green_heart: | shadedclient | 25m 2s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 31s | | the patch passed | | +1 :green_heart: | compile | 1m 40s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 1m 40s | | the patch passed | | +1 :green_heart: | compile | 1m 28s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 1m 28s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 1s | [/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/7/artifact/out/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs-project/hadoop-hdfs: The patch generated 2 new + 136 unchanged - 2 fixed = 138 total (was 138) | | +1 :green_heart: | mvnsite | 1m 33s | | the patch passed | | +1 :green_heart: | javadoc | 1m 3s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 41s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 4m 15s | | the patch passed | | +1 :green_heart: | shadedclient | 27m 36s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 17m 2s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/7/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. | | +0 :ok: | asflicense | 0m 29s | | ASF License check generated no output? 
| | | | 125m 57s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.TestReadStripedFileWithMissingBlocks | | | hadoop.hdfs.client.impl.TestBlockReaderFactory | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/7/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3828 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 8f8069a8fa46 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 08411f93d27c75a23015aee9658f8d7376c5c95b | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/7/testReport/ | | Max. process+thread count | 931 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs | | Console output | https://ci-hadoop.apache.org/job/had
[GitHub] [hadoop] steveloughran commented on pull request #3980: Hadoop-17563 bouncycastle 1.70
steveloughran commented on pull request #3980: URL: https://github.com/apache/hadoop/pull/3980#issuecomment-1049728581 @mehakmeet @mukund-thakur ? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hfutatzhanghb commented on pull request #3976: HDFS-16452. msync RPC should send to active namenode directly
hfutatzhanghb commented on pull request #3976: URL: https://github.com/apache/hadoop/pull/3976#issuecomment-1049649888

@xkrogen, thanks a lot for the careful reply. The code below is the constructor of ObserverReadProxyProvider; yes, the failoverProxy is an instance of ConfiguredFailoverProxyProvider.

```java
public ObserverReadProxyProvider(
    Configuration conf, URI uri, Class<T> xface, HAProxyFactory<T> factory) {
  this(conf, uri, xface, factory,
      new ConfiguredFailoverProxyProvider<>(conf, uri, xface, factory));
}
```

In our implementation we determine the active namenode proxy and use it to perform msync. This does not affect the subsequent read RPCs, which still go to an observer namenode, because the logic looks like this:

```java
if (observerReadEnabled && shouldFindObserver() && isRead(method)) {
  // our change: choose the active NN to perform msync
  if (!msynced) {
    initializeMsync();
  } else {
    autoMsyncIfNecessary();
  }
  // ...
  // the code below still chooses an observer NN for the read RPC
  for (int i = 0; i < nameNodeProxies.size(); i++) {
    NNProxyInfo<T> current = getCurrentProxy();
    HAServiceState currState = current.getCachedState();
    if (currState != HAServiceState.OBSERVER) {
      if (currState == HAServiceState.ACTIVE) {
        activeCount++;
      } else if (currState == HAServiceState.STANDBY) {
        standbyCount++;
      } else if (currState == null) {
        unreachableCount++;
      }
      changeProxy(current);
      continue;
    }
    try {
      retVal = method.invoke(current.proxy, args);
      lastProxy = current;
      return retVal;
    } catch (InvocationTargetException ite) {
      // ...
    }
  }
}
```

As the code above shows, entering the if block means a read RPC is being invoked. We first confirm which namenode is active and use it to perform msync; after the msync RPC we take an observer namenode proxy from nameNodeProxies to serve the read request. There is really no need to send msync to an observer namenode and then fail over; that is a waste of network round trips.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
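To make the routing described in the comment above concrete, here is a minimal, hypothetical sketch of how msync could be pointed at the active namenode from inside ObserverReadProxyProvider<T>. The method names getActiveProxy and msyncAgainstActive, the direct cast to ClientProtocol, and the fallback behaviour are illustrative assumptions, not the actual change proposed in PR #3976.

```java
import java.io.IOException;

import org.apache.hadoop.ha.HAServiceProtocol.HAServiceState;
import org.apache.hadoop.hdfs.protocol.ClientProtocol;

// Sketch only: these methods are written as if they lived inside
// ObserverReadProxyProvider<T>, reusing its existing fields
// nameNodeProxies (the list of NNProxyInfo<T>) and failoverProxy.

/** Return the proxy whose cached HA state is ACTIVE, or null if none is known. */
private NNProxyInfo<T> getActiveProxy() {
  for (NNProxyInfo<T> pi : nameNodeProxies) {
    if (pi.getCachedState() == HAServiceState.ACTIVE) {
      return pi;
    }
  }
  return null;
}

/** Send msync straight to the active NN instead of probing an observer first. */
private void msyncAgainstActive() throws IOException {
  NNProxyInfo<T> active = getActiveProxy();
  if (active != null) {
    // Direct msync to the active namenode; no observer round trip, no failover.
    ((ClientProtocol) active.proxy).msync();
  } else {
    // Fall back to the wrapped failover provider (the ConfiguredFailoverProxyProvider
    // mentioned above), which locates the active namenode via the normal failover path.
    ((ClientProtocol) failoverProxy.getProxy().proxy).msync();
  }
}
```

The fallback branch simply reuses the wrapped ConfiguredFailoverProxyProvider discussed in the comment, so behaviour is unchanged when no cached ACTIVE state is available yet.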
[GitHub] [hadoop] hadoop-yetus commented on pull request #4027: YARN-11081. TestYarnConfigurationFields consistently keeps failing
hadoop-yetus commented on pull request #4027: URL: https://github.com/apache/hadoop/pull/4027#issuecomment-1049641624

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 46s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 33m 36s | | trunk passed |
| +1 :green_heart: | compile | 0m 46s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 0m 38s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 20s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 46s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 41s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 0m 34s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 10s | | trunk passed |
| +1 :green_heart: | shadedclient | 24m 25s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 41s | | the patch passed |
| +1 :green_heart: | compile | 0m 44s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 0m 44s | | the patch passed |
| +1 :green_heart: | compile | 0m 35s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 35s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 15s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 40s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 32s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 0m 29s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 6s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 27s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 0m 56s | | hadoop-yarn-api in the patch passed. |
| +1 :green_heart: | asflicense | 0m 33s | | The patch does not generate ASF License warnings. |
| | | | 93m 34s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4027/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4027 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 224aaf6f6225 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 07e2694fa791866c736e5d131c9a324857706902 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4027/1/testReport/ |
| Max. process+thread count | 692 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4027/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org ---
[GitHub] [hadoop] hadoop-yetus commented on pull request #4022: [Do not commit] Debug CI issues
hadoop-yetus commented on pull request #4022: URL: https://github.com/apache/hadoop/pull/4022#issuecomment-1049632508

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 50s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 35m 59s | | trunk passed |
| +1 :green_heart: | compile | 3m 30s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 23s | | trunk passed |
| +1 :green_heart: | shadedclient | 61m 58s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 14s | | the patch passed |
| +1 :green_heart: | compile | 3m 20s | | the patch passed |
| +1 :green_heart: | cc | 3m 20s | | the patch passed |
| +1 :green_heart: | golang | 3m 20s | | the patch passed |
| +1 :green_heart: | javac | 3m 20s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | mvnsite | 0m 15s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 7s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 44m 52s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-native-client.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4022/2/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-native-client.txt) | hadoop-hdfs-native-client in the patch failed. |
| +1 :green_heart: | asflicense | 0m 30s | | The patch does not generate ASF License warnings. |
| | | | 136m 18s | | |

| Reason | Tests |
|-------:|:------|
| Failed CTEST tests | test_test_libhdfs_ops_hdfs_static |
| | test_test_libhdfs_threaded_hdfs_static |
| | test_test_libhdfs_zerocopy_hdfs_static |
| | test_test_native_mini_dfs |
| | test_libhdfs_threaded_hdfspp_test_shim_static |
| | libhdfs_mini_stress_valgrind_hdfspp_test_static |
| | memcheck_libhdfs_mini_stress_valgrind_hdfspp_test_static |
| | test_libhdfs_mini_stress_hdfspp_test_shim_static |
| | test_hdfs_ext_hdfspp_test_shim_static |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4022/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4022 |
| Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell golang |
| uname | Linux cd9d74e12c88 4.15.0-162-generic #170-Ubuntu SMP Mon Oct 18 11:38:05 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 818b2c200cd1748f6353af1df304e2c21bd6f117 |
| Default Java | Red Hat, Inc.-1.8.0_322-b06 |
| CTEST | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4022/2/artifact/out/patch-hadoop-hdfs-project_hadoop-hdfs-native-client-ctest.txt |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4022/2/testReport/ |
| Max. process+thread count | 550 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4022/2/console |
| versions | git=2.9.5 maven=3.6.3 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] tomscut commented on pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
tomscut commented on pull request #3828: URL: https://github.com/apache/hadoop/pull/3828#issuecomment-1049629967 It looks like there are some errors. Let me retrigger it. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org