[jira] [Updated] (HADOOP-19066) AWS SDK V2 - Enabling FIPS should be allowed with central endpoint

2024-02-07 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19066?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated HADOOP-19066:

Labels: pull-request-available  (was: )

> AWS SDK V2 - Enabling FIPS should be allowed with central endpoint
> --
>
> Key: HADOOP-19066
> URL: https://issues.apache.org/jira/browse/HADOOP-19066
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.5.0, 3.4.1
>Reporter: Viraj Jasani
>Assignee: Viraj Jasani
>Priority: Major
>  Labels: pull-request-available
>
> FIPS support can be enabled by setting "fs.s3a.endpoint.fips". Since the SDK 
> treats overriding the endpoint and enabling FIPS as mutually exclusive, we 
> fail fast if fs.s3a.endpoint is set while FIPS support is enabled (details in 
> HADOOP-18975).
> Now, we no longer override the SDK endpoint for the central endpoint, since we 
> enable cross-region access instead (details in HADOOP-19044), but we still 
> fail fast if the endpoint is central and FIPS is enabled.
> Changes proposed:
>  * S3A to fail fast only if FIPS is enabled and a non-central endpoint is 
> configured.
>  * Tests to ensure the S3 bucket is accessible with default region us-east-2 
> via cross-region access (expected with the central endpoint).
>  * Document FIPS support with the central endpoint in connecting.html.
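
Editorial illustration, not part of the Jira description: a minimal sketch of the
client configuration this change is intended to permit, using
org.apache.hadoop.conf.Configuration. Treating "s3.amazonaws.com" as the central
endpoint here is an assumption for illustration only.

```java
import org.apache.hadoop.conf.Configuration;

public class FipsWithCentralEndpoint {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // Enable FIPS endpoints via the property named in this issue.
    conf.setBoolean("fs.s3a.endpoint.fips", true);
    // Assumption for illustration: "s3.amazonaws.com" is the central endpoint.
    // After this change, FIPS plus the central endpoint should be accepted;
    // FIPS plus any other (non-central) fs.s3a.endpoint should still fail fast.
    conf.set("fs.s3a.endpoint", "s3.amazonaws.com");
  }
}
```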



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19066) AWS SDK V2 - Enabling FIPS should be allowed with central endpoint

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815551#comment-17815551
 ] 

ASF GitHub Bot commented on HADOOP-19066:
-

virajjasani opened a new pull request, #6539:
URL: https://github.com/apache/hadoop/pull/6539

   Jira: HADOOP-19066




> AWS SDK V2 - Enabling FIPS should be allowed with central endpoint
> --
>
> Key: HADOOP-19066
> URL: https://issues.apache.org/jira/browse/HADOOP-19066
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.5.0, 3.4.1
>Reporter: Viraj Jasani
>Assignee: Viraj Jasani
>Priority: Major
>
> FIPS support can be enabled by setting "fs.s3a.endpoint.fips". Since the SDK 
> treats overriding the endpoint and enabling FIPS as mutually exclusive, we 
> fail fast if fs.s3a.endpoint is set while FIPS support is enabled (details in 
> HADOOP-18975).
> Now, we no longer override the SDK endpoint for the central endpoint, since we 
> enable cross-region access instead (details in HADOOP-19044), but we still 
> fail fast if the endpoint is central and FIPS is enabled.
> Changes proposed:
>  * S3A to fail fast only if FIPS is enabled and a non-central endpoint is 
> configured.
>  * Tests to ensure the S3 bucket is accessible with default region us-east-2 
> via cross-region access (expected with the central endpoint).
>  * Document FIPS support with the central endpoint in connecting.html.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[PR] HADOOP-19066. S3A: AWS SDK V2 - Enabling FIPS should be allowed with central endpoint [hadoop]

2024-02-07 Thread via GitHub


virajjasani opened a new pull request, #6539:
URL: https://github.com/apache/hadoop/pull/6539

   Jira: HADOOP-19066


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17362. RBF: Implement RouterObserverReadConfiguredFailoverProxyProvider [hadoop]

2024-02-07 Thread via GitHub


tasanuma commented on PR #6510:
URL: https://github.com/apache/hadoop/pull/6510#issuecomment-1933469403

   @simbadzina Thanks for your review! I will merge it in a couple of days.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19071) Update maven-surefire-plugin from 3.0.0 to 3.2.2

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815546#comment-17815546
 ] 

ASF GitHub Bot commented on HADOOP-19071:
-

hadoop-yetus commented on PR #6537:
URL: https://github.com/apache/hadoop/pull/6537#issuecomment-1933467016

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 21s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  shelldocs  |   0m  0s |  |  Shelldocs was not available.  |
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  31m 29s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 14s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 13s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  mvnsite  |   0m 16s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 17s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 14s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  19m 38s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 10s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m  8s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m  8s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m  8s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m  8s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  1s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 12s |  |  the patch passed  |
   | +1 :green_heart: |  shellcheck  |   0m  1s |  |  No new issues.  |
   | +1 :green_heart: |  javadoc  |   0m  9s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 11s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  19m 44s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   0m 11s |  |  hadoop-project in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 24s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   |  75m 58s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6537 |
   | Optional Tests | dupname asflicense mvnsite unit codespell detsecrets 
shellcheck shelldocs compile javac javadoc mvninstall shadedclient xmllint |
   | uname | Linux 4bbfeceb9378 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / a76f77e6adc0133b80f1753d459f03cf8d072f95 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/2/testReport/ |
   | Max. process+thread count | 558 (vs. ulimit of 5500) |
   | modules | C: hadoop-project U: hadoop-project |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/2/console |
   | versions | git=2.25.1 maven=3.6.3 shellcheck=0.7.0 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




> Update maven-surefire-plugin from 3.0.0 to 3.2.2

Re: [PR] HADOOP-19071. Update maven-surefire-plugin from 3.0.0 to 3.2.2. [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6537:
URL: https://github.com/apache/hadoop/pull/6537#issuecomment-1933467016

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 21s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  shelldocs  |   0m  0s |  |  Shelldocs was not available.  |
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  31m 29s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 14s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 13s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  mvnsite  |   0m 16s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 17s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 14s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  19m 38s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 10s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m  8s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m  8s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m  8s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m  8s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  1s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 12s |  |  the patch passed  |
   | +1 :green_heart: |  shellcheck  |   0m  1s |  |  No new issues.  |
   | +1 :green_heart: |  javadoc  |   0m  9s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 11s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  19m 44s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   0m 11s |  |  hadoop-project in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 24s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   |  75m 58s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6537 |
   | Optional Tests | dupname asflicense mvnsite unit codespell detsecrets 
shellcheck shelldocs compile javac javadoc mvninstall shadedclient xmllint |
   | uname | Linux 4bbfeceb9378 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / a76f77e6adc0133b80f1753d459f03cf8d072f95 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/2/testReport/ |
   | Max. process+thread count | 558 (vs. ulimit of 5500) |
   | modules | C: hadoop-project U: hadoop-project |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/2/console |
   | versions | git=2.25.1 maven=3.6.3 shellcheck=0.7.0 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure

[jira] [Commented] (HADOOP-14837) Handle S3A "glacier" data

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-14837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815535#comment-17815535
 ] 

ASF GitHub Bot commented on HADOOP-14837:
-

bpahuja commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1933450335

   Hi @steveloughran, I have addressed the checkstyle violations in the latest 
commit. 




> Handle S3A "glacier" data
> -
>
> Key: HADOOP-14837
> URL: https://issues.apache.org/jira/browse/HADOOP-14837
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.0.0-beta1
>Reporter: Steve Loughran
>Assignee: Bhavay Pahuja
>Priority: Minor
>  Labels: pull-request-available
>
> SPARK-21797 covers how, if you have AWS S3 set to copy some files to Glacier, 
> they appear in the listing but GETs fail, and so does everything else.
> We should think about how best to handle this.
> # report the failure better
> # if listings can identify files which are glaciated, then maybe we could have 
> an option to filter them out (see the sketch below)
> # test & see what happens
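
Editorial illustration of option 2 above, not from the issue: a rough sketch of
filtering glaciated objects out of a raw listing with the AWS SDK v2. The bucket
name is a placeholder and the set of storage classes to skip (GLACIER and
DEEP_ARCHIVE) is an assumption; in S3A the real change would live inside the
connector's listing code rather than a standalone program.

```java
import java.util.List;
import java.util.stream.Collectors;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;
import software.amazon.awssdk.services.s3.model.ObjectStorageClass;
import software.amazon.awssdk.services.s3.model.S3Object;

public class SkipGlacierListing {
  public static void main(String[] args) {
    try (S3Client s3 = S3Client.create()) {
      ListObjectsV2Request req = ListObjectsV2Request.builder()
          .bucket("example-bucket")   // placeholder bucket name
          .build();
      // Keep only objects whose GET would normally succeed without a restore.
      List<S3Object> readable = s3.listObjectsV2(req).contents().stream()
          .filter(o -> o.storageClass() != ObjectStorageClass.GLACIER
              && o.storageClass() != ObjectStorageClass.DEEP_ARCHIVE)
          .collect(Collectors.toList());
      readable.forEach(o -> System.out.println(o.key()));
    }
  }
}
```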



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-14837 : Support Read Restored Glacier Objects [hadoop]

2024-02-07 Thread via GitHub


bpahuja commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1933450335

   Hi @steveloughran, I have addressed the checkstyle violations in the latest 
commit. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-18342) Upgrade to Avro 1.11.1

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815534#comment-17815534
 ] 

ASF GitHub Bot commented on HADOOP-18342:
-

hadoop-yetus commented on PR #4854:
URL: https://github.com/apache/hadoop/pull/4854#issuecomment-1933446188

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 56s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  1s |  |  xmllint was not available.  |
   | +0 :ok: |  shelldocs  |   0m  1s |  |  Shelldocs was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 3 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 53s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  30m 52s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  16m 28s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |  14m 50s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   4m 14s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |  18m 36s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   8m 27s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   7m 36s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +0 :ok: |  spotbugs  |   0m 19s |  |  branch/hadoop-project no spotbugs 
output file (spotbugsXml.xml)  |
   | +0 :ok: |  spotbugs  |   0m 22s |  |  
branch/hadoop-client-modules/hadoop-client no spotbugs output file 
(spotbugsXml.xml)  |
   | +0 :ok: |  spotbugs  |   0m 21s |  |  
branch/hadoop-client-modules/hadoop-client-minicluster no spotbugs output file 
(spotbugsXml.xml)  |
   | +1 :green_heart: |  shadedclient  |  60m 40s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 51s |  |  Maven dependency ordering for patch  |
   | -1 :x: |  mvninstall  |  28m 30s | 
[/patch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-mvninstall-root.txt)
 |  root in the patch failed.  |
   | -1 :x: |  mvninstall  |   1m  7s | 
[/patch-mvninstall-hadoop-mapreduce-project.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-mvninstall-hadoop-mapreduce-project.txt)
 |  hadoop-mapreduce-project in the patch failed.  |
   | -1 :x: |  compile  |  14m 27s | 
[/patch-compile-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-compile-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  root in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  javac  |  14m 27s | 
[/patch-compile-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-compile-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  root in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  compile  |  13m 58s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | -1 :x: |  javac  |  13m 58s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   3m 55s | 
[/buildtool-patch-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/buildtool-patch-checkstyle-root.txt)
 |  The patch fails to run checkstyle in root  |
   | -1 :x: |  mvnsite  |   4m 23s | 
[/patch-mvnsite-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/

Re: [PR] HADOOP-18342: shaded avro jar [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #4854:
URL: https://github.com/apache/hadoop/pull/4854#issuecomment-1933446188

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 56s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  1s |  |  xmllint was not available.  |
   | +0 :ok: |  shelldocs  |   0m  1s |  |  Shelldocs was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 3 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 53s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  30m 52s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  16m 28s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |  14m 50s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   4m 14s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |  18m 36s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   8m 27s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   7m 36s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +0 :ok: |  spotbugs  |   0m 19s |  |  branch/hadoop-project no spotbugs 
output file (spotbugsXml.xml)  |
   | +0 :ok: |  spotbugs  |   0m 22s |  |  
branch/hadoop-client-modules/hadoop-client no spotbugs output file 
(spotbugsXml.xml)  |
   | +0 :ok: |  spotbugs  |   0m 21s |  |  
branch/hadoop-client-modules/hadoop-client-minicluster no spotbugs output file 
(spotbugsXml.xml)  |
   | +1 :green_heart: |  shadedclient  |  60m 40s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 51s |  |  Maven dependency ordering for patch  |
   | -1 :x: |  mvninstall  |  28m 30s | 
[/patch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-mvninstall-root.txt)
 |  root in the patch failed.  |
   | -1 :x: |  mvninstall  |   1m  7s | 
[/patch-mvninstall-hadoop-mapreduce-project.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-mvninstall-hadoop-mapreduce-project.txt)
 |  hadoop-mapreduce-project in the patch failed.  |
   | -1 :x: |  compile  |  14m 27s | 
[/patch-compile-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-compile-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  root in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  javac  |  14m 27s | 
[/patch-compile-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-compile-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  root in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  compile  |  13m 58s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | -1 :x: |  javac  |  13m 58s | 
[/patch-compile-root-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  root in the patch failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   3m 55s | 
[/buildtool-patch-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/buildtool-patch-checkstyle-root.txt)
 |  The patch fails to run checkstyle in root  |
   | -1 :x: |  mvnsite  |   4m 23s | 
[/patch-mvnsite-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4854/3/artifact/out/patch-mvnsite-root.txt)
 |  root in the patch failed.  |
   | +1 :green_heart: |  shellcheck  |   0m  0s |  |  No new issues.  |
   | -1 :x: |  javadoc  |   7m 10s | 
[/patch-javadoc-root-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt]

Re: [PR] HDFS-17299. Adding rack failure tolerance when creating a new file [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6513:
URL: https://github.com/apache/hadoop/pull/6513#issuecomment-1933421772

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 50s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 3 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 46s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  35m  1s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   6m 12s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   5m 48s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   1m 28s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   2m 19s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   1m 53s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   2m 19s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   6m  1s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  40m 13s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   1m  2s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 59s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   6m  0s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   6m  0s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   5m 37s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   5m 37s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   1m 21s | 
[/results-checkstyle-hadoop-hdfs-project.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6513/9/artifact/out/results-checkstyle-hadoop-hdfs-project.txt)
 |  hadoop-hdfs-project: The patch generated 16 new + 243 unchanged - 2 fixed = 
259 total (was 245)  |
   | +1 :green_heart: |  mvnsite  |   2m  3s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 33s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   2m  7s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   6m  5s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  39m 42s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 25s |  |  hadoop-hdfs-client in the patch 
passed.  |
   | -1 :x: |  unit  | 257m 45s | 
[/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6513/9/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt)
 |  hadoop-hdfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 43s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 446m  4s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.server.namenode.TestDiskspaceQuotaUpdate |
   |   | hadoop.hdfs.server.sps.TestExternalStoragePolicySatisfier |
   |   | hadoop.hdfs.TestDFSClientExcludedNodes |
   |   | hadoop.hdfs.server.namenode.ha.TestPipelinesFailover |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6513/9/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6513 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 41aa593ac88e 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 3a24ff0393874c3ce44928c8fe94a0ee6ed3ddeb |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+

[jira] [Comment Edited] (HADOOP-19070) Java 17 compile support

2024-02-07 Thread Benjamin Teke (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19070?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815521#comment-17815521
 ] 

Benjamin Teke edited comment on HADOOP-19070 at 2/8/24 6:00 AM:


[~slfan1989] thanks for reporting this! There is already some progress in 
HADOOP-17177 or its child HADOOP-18887, so probably we could move the subtasks 
there for easier tracking. 


was (Author: bteke):
[~slfan1989] thanks for reporting this! There is already some progress in 
HADOOP-17177, so probably we could move the subtasks there for easier tracking. 

> Java 17 compile support
> ---
>
> Key: HADOOP-19070
> URL: https://issues.apache.org/jira/browse/HADOOP-19070
> Project: Hadoop Common
>  Issue Type: Task
>Reporter: Shilun Fan
>Priority: Major
>
> We plan to make Hadoop compatible with Java 17. There is a lot of work to be 
> done, but we still aim to complete this task by 2024.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19070) Java 17 compile support

2024-02-07 Thread Benjamin Teke (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19070?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815521#comment-17815521
 ] 

Benjamin Teke commented on HADOOP-19070:


[~slfan1989] thanks for reporting this! There is already some progress in 
HADOOP-17177, so probably we could move the subtasks there for easier tracking. 

> Java 17 compile support
> ---
>
> Key: HADOOP-19070
> URL: https://issues.apache.org/jira/browse/HADOOP-19070
> Project: Hadoop Common
>  Issue Type: Task
>Reporter: Shilun Fan
>Priority: Major
>
> We plan to make Hadoop compatible with Java 17. There is a lot of work to be 
> done, but we still aim to complete this task by 2024.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19071. Update maven-surefire-plugin from 3.0.0 to 3.2.2. [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6537:
URL: https://github.com/apache/hadoop/pull/6537#issuecomment-1933404637

   (!) A patch to the testing environment has been detected. 
   Re-executing against the patched versions to perform further tests. 
   The console is at 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/2/console in 
case of problems.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19071) Update maven-surefire-plugin from 3.0.0 to 3.2.2

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815507#comment-17815507
 ] 

ASF GitHub Bot commented on HADOOP-19071:
-

hadoop-yetus commented on PR #6537:
URL: https://github.com/apache/hadoop/pull/6537#issuecomment-1933340199

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 20s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  1s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  33m  8s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 14s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 14s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  mvnsite  |   0m 16s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 19s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 15s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  53m 16s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m  9s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m  8s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m  8s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 10s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 10s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 11s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m  9s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 11s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | -1 :x: |  shadedclient  |   1m 34s |  |  patch has errors when building 
and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   0m  9s |  |  hadoop-project in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 19s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   |  59m 11s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6537 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient codespell detsecrets xmllint |
   | uname | Linux bf22f2dfefeb 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 9d0c44e21b0b96fba00bda5aa84a71bc9d00d9dd |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/1/testReport/ |
   | Max. process+thread count | 551 (vs. ulimit of 5500) |
   | modules | C: hadoop-project U: hadoop-project |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/1/console |
   | versions | git=2.25.1 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




> Update maven-surefire-plugin from 3.0.0 to 3.2.2  
> -
>
> Key: HADOOP-19071
> URL: https://issues.apache.org/jira/browse/HADOOP-19071
>   

Re: [PR] HADOOP-19071. Update maven-surefire-plugin from 3.0.0 to 3.2.2. [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6537:
URL: https://github.com/apache/hadoop/pull/6537#issuecomment-1933340199

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 20s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  1s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  33m  8s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 14s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 14s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  mvnsite  |   0m 16s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 19s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 15s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  53m 16s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m  9s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m  8s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m  8s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 10s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 10s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 11s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m  9s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 11s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | -1 :x: |  shadedclient  |   1m 34s |  |  patch has errors when building 
and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   0m  9s |  |  hadoop-project in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 19s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   |  59m 11s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6537 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient codespell detsecrets xmllint |
   | uname | Linux bf22f2dfefeb 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 9d0c44e21b0b96fba00bda5aa84a71bc9d00d9dd |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/1/testReport/ |
   | Max. process+thread count | 551 (vs. ulimit of 5500) |
   | modules | C: hadoop-project U: hadoop-project |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6537/1/console |
   | versions | git=2.25.1 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org

[jira] [Commented] (HADOOP-19059) update AWS SDK to support S3 Access Grants in S3A

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815506#comment-17815506
 ] 

ASF GitHub Bot commented on HADOOP-19059:
-

adnanhemani commented on PR #6538:
URL: https://github.com/apache/hadoop/pull/6538#issuecomment-1933326099

   Integration and Unit Test results:
   
   ```
   88665a0afd7e:hadoop-aws ahemani$ mvn verify -Dparallel-tests -Dprefetch 
-DtestsThreadCount=8
   Picked up JAVA_TOOL_OPTIONS: -Dlog4j2.formatMsgNoLookups=true
   [INFO] Scanning for projects...
   [INFO] 

   [INFO] Detecting the operating system and CPU architecture
   [INFO] 

   [INFO] os.detected.name: osx
   [INFO] os.detected.arch: x86_64
   [INFO] os.detected.bitness: 64
   [INFO] os.detected.version: 14.2
   [INFO] os.detected.version.major: 14
   [INFO] os.detected.version.minor: 2
   [INFO] os.detected.classifier: osx-x86_64
   [INFO] 
   [INFO] < org.apache.hadoop:hadoop-aws 
>
   [INFO] Building Apache Hadoop Amazon Web Services support 3.5.0-SNAPSHOT
   [INFO] [ jar 
]-
   ...
   [INFO] ---
   [INFO]  T E S T S
   [INFO] ---
   [INFO] Running org.apache.hadoop.fs.s3a.commit.TestMagicCommitPaths
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestDirectoryCommitterScale
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingDirectoryOutputCommitter
   [INFO] Running org.apache.hadoop.fs.s3a.commit.staging.TestStagingCommitter
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedFileListing
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedTaskCommit
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedJobCommit
   [INFO] Running org.apache.hadoop.fs.s3a.commit.staging.TestPaths
   [INFO] Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
1.954 s - in org.apache.hadoop.fs.s3a.commit.TestMagicCommitPaths
   [INFO] Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
3.744 s - in org.apache.hadoop.fs.s3a.commit.staging.TestPaths
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestAwsClientConfig
   [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.083 
s - in org.apache.hadoop.fs.s3a.impl.TestAwsClientConfig
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestHeaderProcessing
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestCreateFileBuilder
   [INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
3.112 s - in org.apache.hadoop.fs.s3a.impl.TestHeaderProcessing
   [INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.988 
s - in org.apache.hadoop.fs.s3a.impl.TestCreateFileBuilder
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestErrorTranslation
   [INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.945 
s - in org.apache.hadoop.fs.s3a.impl.TestErrorTranslation
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestS3AMultipartUploaderSupport
   [INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.851 
s - in org.apache.hadoop.fs.s3a.impl.TestS3AMultipartUploaderSupport
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestRequestFactory
   [INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
15.461 s - in 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingDirectoryOutputCommitter
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestDirectoryMarkerPolicy
   [INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.202 
s - in org.apache.hadoop.fs.s3a.impl.TestDirectoryMarkerPolicy
   [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.877 
s - in org.apache.hadoop.fs.s3a.impl.TestRequestFactory
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestOpenFileSupport
   [INFO] Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
1.266 s - in org.apache.hadoop.fs.s3a.impl.TestOpenFileSupport
   [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
20.129 s - in 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedFileListing
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestSDKStreamDrainer
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestNetworkBinding
   [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
20.905 s - in 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedJobCommit
   [INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
0.994 s - in org.apache.hadoop.fs.s3a.impl.TestSDKStreamDrainer
   [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0,

Re: [PR] HADOOP-19059. Update AWS Java SDK to 2.23.19 [hadoop]

2024-02-07 Thread via GitHub


adnanhemani commented on PR #6538:
URL: https://github.com/apache/hadoop/pull/6538#issuecomment-1933326099

   Integration and Unit Test results:
   
   ```
   88665a0afd7e:hadoop-aws ahemani$ mvn verify -Dparallel-tests -Dprefetch 
-DtestsThreadCount=8
   Picked up JAVA_TOOL_OPTIONS: -Dlog4j2.formatMsgNoLookups=true
   [INFO] Scanning for projects...
   [INFO] 

   [INFO] Detecting the operating system and CPU architecture
   [INFO] 

   [INFO] os.detected.name: osx
   [INFO] os.detected.arch: x86_64
   [INFO] os.detected.bitness: 64
   [INFO] os.detected.version: 14.2
   [INFO] os.detected.version.major: 14
   [INFO] os.detected.version.minor: 2
   [INFO] os.detected.classifier: osx-x86_64
   [INFO] 
   [INFO] < org.apache.hadoop:hadoop-aws 
>
   [INFO] Building Apache Hadoop Amazon Web Services support 3.5.0-SNAPSHOT
   [INFO] [ jar 
]-
   ...
   [INFO] ---
   [INFO]  T E S T S
   [INFO] ---
   [INFO] Running org.apache.hadoop.fs.s3a.commit.TestMagicCommitPaths
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestDirectoryCommitterScale
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingDirectoryOutputCommitter
   [INFO] Running org.apache.hadoop.fs.s3a.commit.staging.TestStagingCommitter
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedFileListing
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedTaskCommit
   [INFO] Running 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedJobCommit
   [INFO] Running org.apache.hadoop.fs.s3a.commit.staging.TestPaths
   [INFO] Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
1.954 s - in org.apache.hadoop.fs.s3a.commit.TestMagicCommitPaths
   [INFO] Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
3.744 s - in org.apache.hadoop.fs.s3a.commit.staging.TestPaths
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestAwsClientConfig
   [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.083 
s - in org.apache.hadoop.fs.s3a.impl.TestAwsClientConfig
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestHeaderProcessing
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestCreateFileBuilder
   [INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
3.112 s - in org.apache.hadoop.fs.s3a.impl.TestHeaderProcessing
   [INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.988 
s - in org.apache.hadoop.fs.s3a.impl.TestCreateFileBuilder
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestErrorTranslation
   [INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.945 
s - in org.apache.hadoop.fs.s3a.impl.TestErrorTranslation
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestS3AMultipartUploaderSupport
   [INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.851 
s - in org.apache.hadoop.fs.s3a.impl.TestS3AMultipartUploaderSupport
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestRequestFactory
   [INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
15.461 s - in 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingDirectoryOutputCommitter
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestDirectoryMarkerPolicy
   [INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.202 
s - in org.apache.hadoop.fs.s3a.impl.TestDirectoryMarkerPolicy
   [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.877 
s - in org.apache.hadoop.fs.s3a.impl.TestRequestFactory
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestOpenFileSupport
   [INFO] Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
1.266 s - in org.apache.hadoop.fs.s3a.impl.TestOpenFileSupport
   [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
20.129 s - in 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedFileListing
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestSDKStreamDrainer
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestNetworkBinding
   [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
20.905 s - in 
org.apache.hadoop.fs.s3a.commit.staging.TestStagingPartitionedJobCommit
   [INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
0.994 s - in org.apache.hadoop.fs.s3a.impl.TestSDKStreamDrainer
   [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.806 
s - in org.apache.hadoop.fs.s3a.impl.TestNetworkBinding
   [INFO] Running org.apache.hadoop.fs.s3a.impl.TestS3ExpressStorage
   [INFO] Running org.apache.hadoop.fs.s3a.TestListing
   [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0,

[jira] [Commented] (HADOOP-19059) update AWS SDK to support S3 Access Grants in S3A

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815505#comment-17815505
 ] 

ASF GitHub Bot commented on HADOOP-19059:
-

adnanhemani opened a new pull request, #6538:
URL: https://github.com/apache/hadoop/pull/6538

   
   
   ### Description of PR
   This upgrades the AWS Java SDK version to 2.23.19, which brings the S3 
Access Grants plugin into bundle.jar. We can then use that plugin to 
support S3 Access Grants natively.
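   
   A hedged sketch (not from this PR) of how the bundled plugin might be attached 
to an S3 client once the SDK upgrade lands. The plugin package/class name and the 
generic addPlugin(...) builder hook are assumptions based on the separately 
published aws-s3-accessgrants-plugin for SDK v2; verify them against the classes 
actually shipped in bundle 2.23.19.
   
   ```java
   import software.amazon.awssdk.services.s3.S3Client;
   // Assumed package/class name for the Access Grants plugin in the bundle.
   import software.amazon.s3.accessgrants.plugin.S3AccessGrantsPlugin;
   
   public class AccessGrantsClientSketch {
     public static void main(String[] args) {
       // Assumption: the client builder exposes addPlugin(...) and the plugin
       // offers a no-arg builder; check the bundled 2.23.19 API before use.
       try (S3Client s3 = S3Client.builder()
           .addPlugin(S3AccessGrantsPlugin.builder().build())
           .build()) {
         System.out.println("Created S3 client with plugin: " + s3.serviceName());
       }
     }
   }
   ```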
   
   ### How was this patch tested?
   Integration Tests ran against PDX (`us-west-2`). Results in next comment.
   
   ### For code changes:
   
   - [X] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [X] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [N/A] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [X] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   




> update AWS SDK to support S3 Access Grants in S3A
> -
>
> Key: HADOOP-19059
> URL: https://issues.apache.org/jira/browse/HADOOP-19059
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build, fs/s3
>Affects Versions: 3.4.0
>Reporter: Jason Han
>Assignee: Jason Han
>Priority: Minor
>  Labels: pull-request-available
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> In order to support S3 Access Grants 
> (https://aws.amazon.com/s3/features/access-grants/) in S3A, we need to 
> update the AWS SDK in the Hadoop package.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[PR] HADOOP-19059. Update AWS Java SDK to 2.23.19 [hadoop]

2024-02-07 Thread via GitHub


adnanhemani opened a new pull request, #6538:
URL: https://github.com/apache/hadoop/pull/6538

   
   
   ### Description of PR
   This upgrades the AWS Java SDK version to 2.23.19, which brings the S3 
Access Grants plugin into the bundle.jar. We can then use that plugin to 
support S3 Access Grants natively.
   
   ### How was this patch tested?
   Integration Tests ran against PDX (`us-west-2`). Results in next comment.
   
   ### For code changes:
   
   - [X] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [X] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [N/A] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [X] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19059) update AWS SDK to support S3 Access Grants in S3A

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815503#comment-17815503
 ] 

ASF GitHub Bot commented on HADOOP-19059:
-

adnanhemani commented on PR #6506:
URL: https://github.com/apache/hadoop/pull/6506#issuecomment-1933319453

   As per https://github.com/apache/hadoop/pull/6507#issuecomment-1933245232, 
I'm marking this PR as deprecated and will be opening a new one with a larger 
AWS Java SDK upgrade.




> update AWS SDK to support S3 Access Grants in S3A
> -
>
> Key: HADOOP-19059
> URL: https://issues.apache.org/jira/browse/HADOOP-19059
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build, fs/s3
>Affects Versions: 3.4.0
>Reporter: Jason Han
>Assignee: Jason Han
>Priority: Minor
>  Labels: pull-request-available
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> In order to support S3 Access 
> Grants(https://aws.amazon.com/s3/features/access-grants/) in S3A, we need to 
> update the AWS SDK in the hadoop package.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19059. Update AWS SDK to v2.23.7 [hadoop]

2024-02-07 Thread via GitHub


adnanhemani commented on PR #6506:
URL: https://github.com/apache/hadoop/pull/6506#issuecomment-1933319453

   As per https://github.com/apache/hadoop/pull/6507#issuecomment-1933245232, 
I'm marking this PR as deprecated and will be opening a new one with a larger 
AWS Java SDK upgrade.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17373. Fix connection leak in FolderRenamePending [hadoop]

2024-02-07 Thread via GitHub


xuzifu666 commented on PR #6534:
URL: https://github.com/apache/hadoop/pull/6534#issuecomment-1933310950

   > > Well, this may not belong to the HDFS module.
   > 
   > @zhangshuyan0 Thank you for your review! but I think this PR needs Steve’s 
help to review it.
   > 
   > @xuzifu666 Thanks for the contribution! we need to wait for the results of 
Steve's review.
   
   OK, thanks. @steveloughran, could you give the final review?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17373. Fix connection leak in FolderRenamePending [hadoop]

2024-02-07 Thread via GitHub


slfan1989 commented on PR #6534:
URL: https://github.com/apache/hadoop/pull/6534#issuecomment-1933302730

   > Well, this may not belong to the HDFS module.
   
   @zhangshuyan0 Thank you for your review! But I think this PR needs Steve’s 
help to review it.
   @xuzifu666 Thanks for the contribution! We need to wait for the results of 
Steve's review.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-19071) Update maven-surefire-plugin from 3.0.0 to 3.2.2

2024-02-07 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19071?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated HADOOP-19071:

Labels: pull-request-available  (was: )

> Update maven-surefire-plugin from 3.0.0 to 3.2.2  
> -
>
> Key: HADOOP-19071
> URL: https://issues.apache.org/jira/browse/HADOOP-19071
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.5.0
>Reporter: Shilun Fan
>Assignee: Shilun Fan
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[PR] HADOOP-19071. Update maven-surefire-plugin from 3.0.0 to 3.2.2. [hadoop]

2024-02-07 Thread via GitHub


slfan1989 opened a new pull request, #6537:
URL: https://github.com/apache/hadoop/pull/6537

   
   
   ### Description of PR
   
   JIRA: HADOOP-19071. Update maven-surefire-plugin from 3.0.0 to 3.2.2.
   
   ### How was this patch tested?
   
   
   ### For code changes:
   
   - [ ] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19071) Update maven-surefire-plugin from 3.0.0 to 3.2.2

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815497#comment-17815497
 ] 

ASF GitHub Bot commented on HADOOP-19071:
-

slfan1989 opened a new pull request, #6537:
URL: https://github.com/apache/hadoop/pull/6537

   
   
   ### Description of PR
   
   JIRA: HADOOP-19071. Update maven-surefire-plugin from 3.0.0 to 3.2.2.
   
   ### How was this patch tested?
   
   
   ### For code changes:
   
   - [ ] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   




> Update maven-surefire-plugin from 3.0.0 to 3.2.2  
> -
>
> Key: HADOOP-19071
> URL: https://issues.apache.org/jira/browse/HADOOP-19071
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.5.0
>Reporter: Shilun Fan
>Assignee: Shilun Fan
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17373. Fix connection leak in FolderRenamePending [hadoop]

2024-02-07 Thread via GitHub


zhangshuyan0 commented on PR #6534:
URL: https://github.com/apache/hadoop/pull/6534#issuecomment-1933299546

   Well, this may not belong to the HDFS module.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-18980) S3A credential provider remapping: make extensible

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18980?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815493#comment-17815493
 ] 

ASF GitHub Bot commented on HADOOP-18980:
-

hadoop-yetus commented on PR #6525:
URL: https://github.com/apache/hadoop/pull/6525#issuecomment-1933297866

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  18m 15s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  1s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 3 new or modified test files.  |
    _ branch-3.4 Compile Tests _ |
   | +0 :ok: |  mvndep  |   4m  6s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  44m 43s |  |  branch-3.4 passed  |
   | +1 :green_heart: |  compile  |  18m 21s |  |  branch-3.4 passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |  17m  5s |  |  branch-3.4 passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   5m 20s |  |  branch-3.4 passed  |
   | +1 :green_heart: |  mvnsite  |   2m 38s |  |  branch-3.4 passed  |
   | +1 :green_heart: |  javadoc  |   1m 46s |  |  branch-3.4 passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 31s |  |  branch-3.4 passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   3m 43s |  |  branch-3.4 passed  |
   | +1 :green_heart: |  shadedclient  |  37m 58s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  38m 23s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 32s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 24s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 14s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |  18m 14s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  16m 37s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |  16m 37s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  1s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   4m 40s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   2m 26s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 38s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 33s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   4m  4s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  38m 52s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  19m 14s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m  6s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 57s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 275m 14s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6525/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6525 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint 
|
   | uname | Linux 05b3a7686dc8 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | branch-3.4 / a0c3a901962ff28b55f67aecee162edad350ab9e |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://

Re: [PR] HADOOP-18980. S3A credential provider remapping: make extensible (#6406) [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6525:
URL: https://github.com/apache/hadoop/pull/6525#issuecomment-1933297866

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  18m 15s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  1s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 3 new or modified test files.  |
    _ branch-3.4 Compile Tests _ |
   | +0 :ok: |  mvndep  |   4m  6s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  44m 43s |  |  branch-3.4 passed  |
   | +1 :green_heart: |  compile  |  18m 21s |  |  branch-3.4 passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |  17m  5s |  |  branch-3.4 passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   5m 20s |  |  branch-3.4 passed  |
   | +1 :green_heart: |  mvnsite  |   2m 38s |  |  branch-3.4 passed  |
   | +1 :green_heart: |  javadoc  |   1m 46s |  |  branch-3.4 passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 31s |  |  branch-3.4 passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   3m 43s |  |  branch-3.4 passed  |
   | +1 :green_heart: |  shadedclient  |  37m 58s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  38m 23s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 32s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 24s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 14s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |  18m 14s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  16m 37s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |  16m 37s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  1s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   4m 40s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   2m 26s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 38s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 33s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   4m  4s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  38m 52s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  19m 14s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m  6s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 57s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 275m 14s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6525/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6525 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint 
|
   | uname | Linux 05b3a7686dc8 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | branch-3.4 / a0c3a901962ff28b55f67aecee162edad350ab9e |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6525/2/testReport/ |
   | Max. process+thread count | 3137 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws 
U: . |
   | Console output | 
https://ci-hadoop.a

[jira] [Created] (HADOOP-19071) Update maven-surefire-plugin from 3.0.0 to 3.2.2

2024-02-07 Thread Shilun Fan (Jira)
Shilun Fan created HADOOP-19071:
---

 Summary: Update maven-surefire-plugin from 3.0.0 to 3.2.2  
 Key: HADOOP-19071
 URL: https://issues.apache.org/jira/browse/HADOOP-19071
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: common
Affects Versions: 3.5.0
Reporter: Shilun Fan
Assignee: Shilun Fan






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-19070) Java 17 compile support

2024-02-07 Thread Shilun Fan (Jira)
Shilun Fan created HADOOP-19070:
---

 Summary: Java 17 compile support
 Key: HADOOP-19070
 URL: https://issues.apache.org/jira/browse/HADOOP-19070
 Project: Hadoop Common
  Issue Type: Task
Reporter: Shilun Fan


We plan to make Hadoop compatible with Java 17. There is a lot of work to be 
done, but we still aim to complete this task by the end of 2024.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19058) [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19058?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815483#comment-17815483
 ] 

ASF GitHub Bot commented on HADOOP-19058:
-

slfan1989 commented on PR #6531:
URL: https://github.com/apache/hadoop/pull/6531#issuecomment-1933275327

   > commented. I looked at surefire and it is at 3.2.5; probably time to 
upgrade. I propose it is done before doing this one, splitting them in two.
   > 
   > and we need java8 and java17 profiles so we can add other options in other 
places
   
   I agree with your points, but there are some additional details to be added:
   
   - I agree to establish profiles for Java 8 and Java 17, and I think we 
should also include Java 11 and Java 21.
   - There are still some pending tasks related to Java 11. We need to continue 
working on HADOOP-15984, specifically updating Jersey from version 1.19 to 2.x. 
Ideally, we should update it to Jersey 3.x. (I will start following this JIRA 
in the next 1-2 days)
   - We need to migrate from JUnit 4 to JUnit 5. Some progress has already been 
made in Yarn.
   - We should consider upgrading the version of the Maven Surefire Plugin. 
Currently, we are using `3.0.0-M1`, but I suggest upgrading it to `3.2.2`. I 
have noticed that some projects are using this version, and I will try it out.
   
   I will create a JIRA ticket for Java 17 compile support. I am aware that 
there is a lot of work to be done, and I hope that we can address the JDK 
support issue this year.




> [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn
> -
>
> Key: HADOOP-19058
> URL: https://issues.apache.org/jira/browse/HADOOP-19058
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Bilwa S T
>Assignee: Bilwa S T
>Priority: Major
>  Labels: pull-request-available
>
> Most of the UT's failed with below exception:
> Caused by: java.lang.ExceptionInInitializerError: Exception 
> java.lang.reflect.InaccessibleObjectException: Unable to make protected final 
> java.lang.Class 
> java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int) throws 
> java.lang.ClassFormatError accessible: module java.base does not "opens 
> java.lang" to unnamed module @d13f7c [in thread "Time-limited test"]



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19058. [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn [hadoop]

2024-02-07 Thread via GitHub


slfan1989 commented on PR #6531:
URL: https://github.com/apache/hadoop/pull/6531#issuecomment-1933275327

   > commented. I looked at surefire and it is at 3.2.5; probably time to 
upgrade. I propose it is done before doing this one, splitting them in two.
   > 
   > and we need java8 and java17 profiles so we can add other options in other 
places
   
   I agree with your points, but there are some additional details to be added:
   
   - I agree to establish profiles for Java 8 and Java 17, and I think we 
should also include Java 11 and Java 21.
   - There are still some pending tasks related to Java 11. We need to continue 
working on HADOOP-15984, specifically updating Jersey from version 1.19 to 2.x. 
Ideally, we should update it to Jersey 3.x. (I will start following this JIRA 
in the next 1-2 days)
   - We need to migrate from JUnit 4 to JUnit 5. Some progress has already been 
made in Yarn.
   - We should consider upgrading the version of the Maven Surefire Plugin. 
Currently, we are using `3.0.0-M1`, but I suggest upgrading it to `3.2.2`. I 
have noticed that some projects are using this version, and I will try it out.
   
   I will create a JIRA ticket for Java 17 compile support. I am aware that 
there is a lot of work to be done, and I hope that we can address the JDK 
support issue this year.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17373. Fix connection leak in FolderRenamePending [hadoop]

2024-02-07 Thread via GitHub


xuzifu666 commented on PR #6534:
URL: https://github.com/apache/hadoop/pull/6534#issuecomment-1933246860

   @steveloughran PTAL


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19050, Add Support for AWS S3 Access Grants [hadoop]

2024-02-07 Thread via GitHub


adnanhemani commented on PR #6507:
URL: https://github.com/apache/hadoop/pull/6507#issuecomment-1933245232

   Update: I've successfully influenced the S3 Access Grants team and the AWS 
Java SDK team to include the S3 Access Grants Plugin within the AWS Java SDK 
bundle. For that, we will require an SDK version upgrade. Will work on that 
first to ensure we don't require the reflection logic here.
   
   In the meantime, any review on the latest version of this code would be 
appreciated - it will help me make the logic here solid while we work on the 
AWS SDK upgrade in parallel.
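
   As background on the reflection logic mentioned above, here is a hedged sketch of the kind of optional, classpath-probing plugin loading that a bundled plugin makes unnecessary; the plugin class name, its `create()` factory and the location of `SdkPlugin` are assumptions to verify, not the PR's actual code:

```java
import software.amazon.awssdk.core.SdkPlugin;
import software.amazon.awssdk.services.s3.S3ClientBuilder;

final class OptionalAccessGrantsLoader {
  // Assumed fully-qualified plugin class name; verify against the SDK release in use.
  private static final String PLUGIN_CLASS =
      "software.amazon.awssdk.s3accessgrants.plugin.S3AccessGrantsPlugin";

  /** Attach the plugin only when its jar happens to be on the classpath. */
  static void maybeAttach(S3ClientBuilder builder) {
    try {
      Class<?> clazz = Class.forName(PLUGIN_CLASS);
      // Assumes a no-arg static factory; the real plugin may expose a builder instead.
      Object plugin = clazz.getMethod("create").invoke(null);
      builder.addPlugin((SdkPlugin) plugin);
    } catch (ClassNotFoundException e) {
      // Plugin jar not present: keep the default S3 client behaviour.
    } catch (ReflectiveOperationException e) {
      throw new IllegalStateException("Unable to load " + PLUGIN_CLASS, e);
    }
  }
}
```

   Once the plugin is part of bundle.jar, the same attachment can be a plain compile-time call to `addPlugin`, with no `Class.forName` probing.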


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19050) Add S3 Access Grants Support in S3A

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19050?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815467#comment-17815467
 ] 

ASF GitHub Bot commented on HADOOP-19050:
-

adnanhemani commented on PR #6507:
URL: https://github.com/apache/hadoop/pull/6507#issuecomment-1933245232

   Update: I've successfully influenced the S3 Access Grants team and the AWS 
Java SDK team to include the S3 Access Grants Plugin within the AWS Java SDK 
bundle. For that, we will require an SDK version upgrade. Will work on that 
first to ensure we don't require the reflection logic here.
   
   In the meantime, any review on the latest version of this code would be 
appreciated - it will help me make the logic here solid while we work on the 
AWS SDK upgrade in parallel.




> Add S3 Access Grants Support in S3A
> ---
>
> Key: HADOOP-19050
> URL: https://issues.apache.org/jira/browse/HADOOP-19050
> Project: Hadoop Common
>  Issue Type: New Feature
>  Components: fs/s3
>Affects Versions: 3.4.0
>Reporter: Jason Han
>Assignee: Jason Han
>Priority: Minor
>  Labels: pull-request-available
>
> Add support for S3 Access Grants 
> (https://aws.amazon.com/s3/features/access-grants/) in S3A.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17316 Add hadoop-compat-bench [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6535:
URL: https://github.com/apache/hadoop/pull/6535#issuecomment-1933237018

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 20s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +0 :ok: |  shelldocs  |   0m  0s |  |  Shelldocs was not available.  |
   | +1 :green_heart: |  @author  |   0m  1s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 13 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  31m 59s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   8m  7s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   7m 23s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   2m  0s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |  12m  9s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   4m 33s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   4m 50s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |  16m 29s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  34m 51s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 21s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |  17m  0s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   7m 47s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   7m 47s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   7m 16s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   7m 16s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  0s | 
[/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/blanks-eol.txt)
 |  The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix 
<>. Refer https://git-scm.com/docs/git-apply  |
   | +1 :green_heart: |  checkstyle  |   1m 56s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   6m 38s |  |  the patch passed  |
   | -1 :x: |  shellcheck  |   0m  0s | 
[/results-shellcheck.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/results-shellcheck.txt)
 |  The patch generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0)  |
   | +1 :green_heart: |  javadoc  |   4m 22s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   4m 57s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | -1 :x: |  spotbugs  |  16m 31s | 
[/new-spotbugs-root.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/new-spotbugs-root.html)
 |  root generated 19 new + 0 unchanged - 0 fixed = 19 total (was 0)  |
   | +1 :green_heart: |  shadedclient  |  19m 16s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 628m 32s | 
[/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/patch-unit-root.txt)
 |  root in the patch failed.  |
   | -1 :x: |  asflicense  |   0m 48s | 
[/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6535/1/artifact/out/results-asflicense.txt)
 |  The patch generated 18 ASF License warnings.  |
   |  |   | 826m 36s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | SpotBugs | module:root |
   |  |  Random object created and used only once in 
org.apache.hadoop.compat.AbstractHdfsCompatCase.getUniquePath(Path)  At 
AbstractHdfsCompatCase.java:only once in 
org.apache.hadoop.compat.AbstractHdfsCompatCase.getUniquePath(Path)  At 
AbstractHdfsCompatCase.java:[line 60] |
   |  |  org.apache.hadoop.compat.HdfsCompatEnvironment.getStoragePolicyNames() 
may expose internal representation by returning 
HdfsCompatEnvironment.defaultStoragePolicyNames  At 
HdfsCompatEnvironment.java:by returning 
HdfsCompatEnvironment.defaultStoragePolicyNames  At 
HdfsCompatEnvironment.java:[line 115] |
 

Re: [PR] YARN-11649. [Federation] Create Override for Maximum Resource Capability in getNewApplication [hadoop]

2024-02-07 Thread via GitHub


jchanggg commented on PR #6481:
URL: https://github.com/apache/hadoop/pull/6481#issuecomment-1933105891

   Fixed all the lint issues. Could you help review this change, @slfan1989? Thank you!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19069. Use hadoop-thirdparty 1.2.0. [hadoop]

2024-02-07 Thread via GitHub


slfan1989 commented on PR #6533:
URL: https://github.com/apache/hadoop/pull/6533#issuecomment-1933081495

   @Hexiaoqiao Can you help review this PR? Thank you very much!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17146.Use the dfsadmin -reconfig command to initiate reconfiguration on all decommissioning datanodes. [hadoop]

2024-02-07 Thread via GitHub


slfan1989 commented on PR #6504:
URL: https://github.com/apache/hadoop/pull/6504#issuecomment-1933082242

   LGTM.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19069) Use hadoop-thirdparty 1.2.0

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815451#comment-17815451
 ] 

ASF GitHub Bot commented on HADOOP-19069:
-

slfan1989 commented on PR #6533:
URL: https://github.com/apache/hadoop/pull/6533#issuecomment-1933081495

   @Hexiaoqiao Can you help review this PR? Thank you very much!




> Use hadoop-thirdparty 1.2.0
> ---
>
> Key: HADOOP-19069
> URL: https://issues.apache.org/jira/browse/HADOOP-19069
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Affects Versions: 3.4.0, 3.5.0
>Reporter: Shilun Fan
>Assignee: Shilun Fan
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] [NOT FOR MERGE] test with shaded protobuf-java 3.21 (snapshot) [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6350:
URL: https://github.com/apache/hadoop/pull/6350#issuecomment-1933069853

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 21s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  1s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m  8s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  19m 24s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   8m 27s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   8m 51s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  mvnsite  |  12m 51s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   4m 46s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   4m 53s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  92m 16s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 33s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |  17m 43s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   8m  9s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   8m  9s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   7m 29s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   7m 29s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   7m 22s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   4m 38s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   4m 58s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  36m 50s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 626m 50s | 
[/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6350/5/artifact/out/patch-unit-root.txt)
 |  root in the patch failed.  |
   | +1 :green_heart: |  asflicense  |   0m 50s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 794m 16s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.server.datanode.TestLargeBlockReport |
   |   | hadoop.hdfs.protocol.TestBlockListAsLongs |
   |   | 
hadoop.yarn.server.timelineservice.security.TestTimelineAuthFilterForV2 |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6350/5/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6350 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient codespell detsecrets xmllint |
   | uname | Linux 31c0ac568fad 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 368e341bac1c54423ed5c1082443ff790e31a66f |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6350/5/testReport/ |
   | Max. process+thread count | 4729 (vs. ulimit of 5500) |
   | modules | C: hadoop-project hadoop-common-project/hadoop-common 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api . U: . |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6350/5/console |
   | versions | git=2.25.1 maven=3.6.3 |
   | P

Re: [PR] HDFS-17374. EC: StripedBlockReader#newConnectedPeer should set SO_TIMEOUT and SO_KEEPALIVE [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6536:
URL: https://github.com/apache/hadoop/pull/6536#issuecomment-1932775446

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 34s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  41m 38s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   1m 19s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   1m 11s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   1m  9s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   1m 18s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   1m  5s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 35s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   3m 14s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  34m 20s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   1m  7s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   1m 11s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   1m 11s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   1m  5s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   1m  5s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 57s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   1m  9s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 52s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 28s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   3m 16s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  34m 40s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 261m  3s | 
[/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6536/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt)
 |  hadoop-hdfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 42s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 395m 31s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.TestWriteRead |
   |   | hadoop.hdfs.TestStripedFileAppend |
   |   | hadoop.hdfs.TestReadStripedFileWithDecodingCorruptData |
   |   | hadoop.hdfs.TestDFSStripedOutputStream |
   |   | hadoop.hdfs.TestErasureCodingPolicyWithSnapshotWithRandomECPolicy |
   |   | hadoop.hdfs.TestReconstructStripedFileWithRandomECPolicy |
   |   | hadoop.hdfs.TestDFSStripedInputStream |
   |   | hadoop.hdfs.server.datanode.TestDirectoryScanner |
   |   | hadoop.hdfs.TestFileAppend2 |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6536/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6536 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 6495c596dca5 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 8d2e864a1b26d60883428234882ccd9224a8cbdd |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results

[jira] [Commented] (HADOOP-19057) S3 public test bucket landsat-pds unreadable -needs replacement

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815398#comment-17815398
 ] 

ASF GitHub Bot commented on HADOOP-19057:
-

hadoop-yetus commented on PR #6515:
URL: https://github.com/apache/hadoop/pull/6515#issuecomment-1932653106

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 58s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 19 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 12s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  35m 30s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  19m 20s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |  18m 25s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   5m  8s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   2m 41s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   1m 47s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 29s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   4m  3s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  38m 29s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 30s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 27s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  17m 30s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |  17m 30s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  16m 59s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |  16m 59s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   4m 32s |  |  root: The patch generated 
0 new + 13 unchanged - 1 fixed = 13 total (was 14)  |
   | +1 :green_heart: |  mvnsite  |   2m 27s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 43s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 35s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   4m  8s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  38m 19s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  20m  0s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m  7s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 59s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 262m  9s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6515/7/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6515 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint 
xmllint |
   | uname | Linux 1ff8cf84b62a 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / a82c9ff1e06169500515e0cd53cbad4c305a6539 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6515/7/testReport/ |
   | Max. process+thre

Re: [PR] HADOOP-19057. S3A: Landsat bucket used in tests no longer accessible [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6515:
URL: https://github.com/apache/hadoop/pull/6515#issuecomment-1932653106

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 58s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 19 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 12s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  35m 30s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  19m 20s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |  18m 25s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   5m  8s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   2m 41s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   1m 47s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 29s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   4m  3s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  38m 29s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 30s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 27s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  17m 30s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |  17m 30s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  16m 59s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |  16m 59s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   4m 32s |  |  root: The patch generated 
0 new + 13 unchanged - 1 fixed = 13 total (was 14)  |
   | +1 :green_heart: |  mvnsite  |   2m 27s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 43s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   1m 35s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   4m  8s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  38m 19s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  20m  0s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m  7s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 59s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 262m  9s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6515/7/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6515 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint 
xmllint |
   | uname | Linux 1ff8cf84b62a 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / a82c9ff1e06169500515e0cd53cbad4c305a6539 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6515/7/testReport/ |
   | Max. process+thread count | 3137 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws 
U: . |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6515/7/console |
   | versions | git=2.25.1 maven=3.6.

Re: [PR] HDFS-17361. DiskBalancer: Query command support with multiple nodes [hadoop]

2024-02-07 Thread via GitHub


tasanuma commented on code in PR #6508:
URL: https://github.com/apache/hadoop/pull/6508#discussion_r1481790761


##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/diskbalancer/command/QueryCommand.java:
##
@@ -56,52 +66,72 @@ public QueryCommand(Configuration conf) {
   @Override
   public void execute(CommandLine cmd) throws Exception {
 LOG.info("Executing \"query plan\" command.");
+TextStringBuilder result = new TextStringBuilder();
 Preconditions.checkState(cmd.hasOption(DiskBalancerCLI.QUERY));
 verifyCommandOptions(DiskBalancerCLI.QUERY, cmd);
-String nodeName = cmd.getOptionValue(DiskBalancerCLI.QUERY);
-Preconditions.checkNotNull(nodeName);
-nodeName = nodeName.trim();
-String nodeAddress = nodeName;
-
-// if the string is not name:port format use the default port.
-if (!nodeName.matches("[^\\:]+:[0-9]{2,5}")) {
-  int defaultIPC = NetUtils.createSocketAddr(
-  getConf().getTrimmed(DFSConfigKeys.DFS_DATANODE_IPC_ADDRESS_KEY,
-  DFSConfigKeys.DFS_DATANODE_IPC_ADDRESS_DEFAULT)).getPort();
-  nodeAddress = nodeName + ":" + defaultIPC;
-  LOG.debug("Using default data node port :  {}", nodeAddress);
+String nodeVal = cmd.getOptionValue(DiskBalancerCLI.QUERY);
+Preconditions.checkNotNull(nodeVal);
+nodeVal = nodeVal.trim();
+Set<String> resultSet = new TreeSet<>();
+String[] nodes = nodeVal.split(",");
+if (nodes.length == 0) {
+  String warnMsg = "The number of input nodes is 0. "
+  + "Please input the valid nodes.";
+  throw new DiskBalancerException(warnMsg,
+  DiskBalancerException.Result.INVALID_NODE);
 }
 
-ClientDatanodeProtocol dataNode = getDataNodeProxy(nodeAddress);
-try {
-  DiskBalancerWorkStatus workStatus = dataNode.queryDiskBalancerPlan();
-  System.out.printf("Plan File: %s%nPlan ID: %s%nResult: %s%n",
-  workStatus.getPlanFile(),
-  workStatus.getPlanID(),
-  workStatus.getResult().toString());
+Collections.addAll(resultSet, nodes);
+String outputLine = String.format(
+"Get current status of the diskbalancer for DataNode(s). "
++ "These DataNode(s) are parsed from '%s'.", nodeVal);
+recordOutput(result, outputLine);
+for (String nodeName : resultSet) {
+  // if the string is not name:port format use the default port.
+  String nodeAddress = nodeName;
+  if (!nodeName.matches("[^\\:]+:[0-9]{2,5}")) {
+int defaultIPC = NetUtils.createSocketAddr(
+getConf().getTrimmed(DFSConfigKeys.DFS_DATANODE_IPC_ADDRESS_KEY,
+DFSConfigKeys.DFS_DATANODE_IPC_ADDRESS_DEFAULT)).getPort();
+nodeAddress = nodeName + ":" + defaultIPC;
+LOG.debug("Using default data node port :  {}", nodeAddress);
+  }
 
-  if (cmd.hasOption(DiskBalancerCLI.VERBOSE)) {
-System.out.printf("%s", workStatus.currentStateString());
+  ClientDatanodeProtocol dataNode = getDataNodeProxy(nodeAddress);
+  try {
+DiskBalancerWorkStatus workStatus = dataNode.queryDiskBalancerPlan();
+outputLine = String.format("DataNode: %s%nPlan File: %s%nPlan ID: 
%s%nResult: %s%n",
+nodeAddress,
+workStatus.getPlanFile(),
+workStatus.getPlanID(),
+workStatus.getResult().toString());
+result.append(outputLine);
+if (cmd.hasOption(DiskBalancerCLI.VERBOSE)) {
+  outputLine = String.format("%s", workStatus.currentStateString());
+  result.append(outputLine);
+}
+result.append(System.lineSeparator());
+  } catch (DiskBalancerException ex) {
+LOG.error("Query plan failed by {}", nodeAddress, ex);
+throw ex;
   }
-} catch (DiskBalancerException ex) {
-  LOG.error("Query plan failed.", ex);
-  throw ex;
 }
+getPrintStream().println(result.toString());

Review Comment:
   ```suggestion
   getPrintStream().println(result);
   ```



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/diskbalancer/command/QueryCommand.java:
##
@@ -56,52 +66,72 @@ public QueryCommand(Configuration conf) {
   @Override
   public void execute(CommandLine cmd) throws Exception {
 LOG.info("Executing \"query plan\" command.");
+TextStringBuilder result = new TextStringBuilder();
 Preconditions.checkState(cmd.hasOption(DiskBalancerCLI.QUERY));
 verifyCommandOptions(DiskBalancerCLI.QUERY, cmd);
-String nodeName = cmd.getOptionValue(DiskBalancerCLI.QUERY);
-Preconditions.checkNotNull(nodeName);
-nodeName = nodeName.trim();
-String nodeAddress = nodeName;
-
-// if the string is not name:port format use the default port.
-if (!nodeName.matches("[^\\:]+:[0-9]{2,5}")) {
-  int defaultIPC = NetUtils.createSocketAddr(
-  getConf().getTrimmed(DFSConfigKeys.DFS_DATANODE_IPC_ADDRESS_KEY,
-  DFSConfi

[jira] [Commented] (HADOOP-18830) S3A: Cut S3 Select

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18830?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815359#comment-17815359
 ] 

ASF GitHub Bot commented on HADOOP-18830:
-

hadoop-yetus commented on PR #6523:
URL: https://github.com/apache/hadoop/pull/6523#issuecomment-1932467536

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   4m 30s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 11 new or modified test files.  |
    _ branch-3.3 Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  33m 31s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  compile  |   0m 24s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  checkstyle  |   0m 21s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  mvnsite  |   0m 28s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  spotbugs  |   0m 51s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  shadedclient  |  21m 49s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 21s |  |  the patch passed  |
   | -1 :x: |  javac  |   0m 21s | 
[/results-compile-javac-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6523/3/artifact/out/results-compile-javac-hadoop-tools_hadoop-aws.txt)
 |  hadoop-tools_hadoop-aws generated 9 new + 23 unchanged - 2 fixed = 32 total 
(was 25)  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 12s |  |  hadoop-tools/hadoop-aws: 
The patch generated 0 new + 6 unchanged - 8 fixed = 6 total (was 14)  |
   | +1 :green_heart: |  mvnsite  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 17s |  |  the patch passed  |
   | +1 :green_heart: |  spotbugs  |   0m 45s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  20m 31s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m  3s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 24s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   |  89m 37s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6523/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6523 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint 
xmllint |
   | uname | Linux 24fd4bed0c7b 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | branch-3.3 / cd085ad42dc71041bdfacc9982b2affe49ea515d |
   | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~18.04-b09 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6523/3/testReport/ |
   | Max. process+thread count | 553 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6523/3/console |
   | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




> S3A: Cut S3 Select
> --
>
> Key: HADOOP-18830
> URL: https://issues.apache.org/jira/browse/HADOOP-18830
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.4.0
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> getting s3 select to work with the

Re: [PR] HADOOP-18830. Cut S3 Select (#6144) [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6523:
URL: https://github.com/apache/hadoop/pull/6523#issuecomment-1932467536

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   4m 30s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 11 new or modified test files.  |
    _ branch-3.3 Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  33m 31s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  compile  |   0m 24s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  checkstyle  |   0m 21s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  mvnsite  |   0m 28s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  spotbugs  |   0m 51s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  shadedclient  |  21m 49s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 21s |  |  the patch passed  |
   | -1 :x: |  javac  |   0m 21s | 
[/results-compile-javac-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6523/3/artifact/out/results-compile-javac-hadoop-tools_hadoop-aws.txt)
 |  hadoop-tools_hadoop-aws generated 9 new + 23 unchanged - 2 fixed = 32 total 
(was 25)  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 12s |  |  hadoop-tools/hadoop-aws: 
The patch generated 0 new + 6 unchanged - 8 fixed = 6 total (was 14)  |
   | +1 :green_heart: |  mvnsite  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 17s |  |  the patch passed  |
   | +1 :green_heart: |  spotbugs  |   0m 45s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  20m 31s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m  3s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 24s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   |  89m 37s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6523/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6523 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint 
xmllint |
   | uname | Linux 24fd4bed0c7b 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | branch-3.3 / cd085ad42dc71041bdfacc9982b2affe49ea515d |
   | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~18.04-b09 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6523/3/testReport/ |
   | Max. process+thread count | 553 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6523/3/console |
   | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-19045) S3A: pass request timeouts down to sdk clients

2024-02-07 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19045?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran updated HADOOP-19045:

Description: 
s3a client timeout settings are getting down to http client, but not sdk 
timeouts, so you can't have a longer timeout than the default. This surfaces in 
the inability to tune the timeouts for CreateSession calls even now the latest 
SDK does pick it up

The default value of {{fs.s3a.connection.request.timeout}} is now 60s; if an S3 
store takes longer than this to return then the operation will be reported by 
the SDK as a timeout.

  was:s3a client timeout settings are getting down to http client, but not sdk 
timeouts, so you can't have a longer timeout than the default. This surfaces in 
the inability to tune the timeouts for CreateSession calls even now the latest 
SDK does pick it up


> S3A: pass request timeouts down to sdk clients
> --
>
> Key: HADOOP-19045
> URL: https://issues.apache.org/jira/browse/HADOOP-19045
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.4.0
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> s3a client timeout settings are getting down to http client, but not sdk 
> timeouts, so you can't have a longer timeout than the default. This surfaces 
> in the inability to tune the timeouts for CreateSession calls even now the 
> latest SDK does pick it up
> The default value of {{fs.s3a.connection.request.timeout}} is now 60s; if an 
> S3 store takes longer than this to return then the operation will be reported 
> by the SDK as a timeout.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-19045) S3A: pass request timeouts down to sdk clients

2024-02-07 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19045?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran resolved HADOOP-19045.
-
Resolution: Fixed

> S3A: pass request timeouts down to sdk clients
> --
>
> Key: HADOOP-19045
> URL: https://issues.apache.org/jira/browse/HADOOP-19045
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.4.0
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> s3a client timeout settings are getting down to http client, but not sdk 
> timeouts, so you can't have a longer timeout than the default. This surfaces 
> in the inability to tune the timeouts for CreateSession calls even now the 
> latest SDK does pick it up



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-18980. S3A credential provider remapping: make extensible (#6406) [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on PR #6525:
URL: https://github.com/apache/hadoop/pull/6525#issuecomment-1932273541

   Afraid another change has broken your merge; please rebase and force push.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-14837) Handle S3A "glacier" data

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-14837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815330#comment-17815330
 ] 

ASF GitHub Bot commented on HADOOP-14837:
-

steveloughran commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1932206238

   @bpahuja can you avoid rebasing and force-pushing unless some other change has 
broken the merge? It breaks the "changes since your last review" feature. If 
someone has stepped on the code, don't be afraid to rebase; I generally squash 
all my changes first to make the rebase easier.




> Handle S3A "glacier" data
> -
>
> Key: HADOOP-14837
> URL: https://issues.apache.org/jira/browse/HADOOP-14837
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.0.0-beta1
>Reporter: Steve Loughran
>Assignee: Bhavay Pahuja
>Priority: Minor
>  Labels: pull-request-available
>
> SPARK-21797 covers how if you have AWS S3 set to copy some files to glacier, 
> they appear in the listing but GETs fail, and so does everything else
> We should think about how best to handle this.
> # report better
> # if listings can identify files which are glaciated then maybe we could have 
> an option to filter them out
> # test & see what happens
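As a rough sketch of the "filter them out" idea above, using AWS SDK v2 listing types; this is illustrative only and not the actual S3A implementation.

```java
// Sketch: drop Glacier-tier objects from a listing so GETs against them are
// never attempted. Uses AWS SDK v2 types; not the real S3A code path.
import java.util.EnumSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

import software.amazon.awssdk.services.s3.model.ObjectStorageClass;
import software.amazon.awssdk.services.s3.model.S3Object;

public final class GlacierListingFilter {
  private static final Set<ObjectStorageClass> GLACIER_CLASSES =
      EnumSet.of(ObjectStorageClass.GLACIER, ObjectStorageClass.DEEP_ARCHIVE);

  private GlacierListingFilter() {
  }

  /** Return only the objects that are not stored in a Glacier tier. */
  public static List<S3Object> skipGlacier(List<S3Object> objects) {
    return objects.stream()
        .filter(o -> !GLACIER_CLASSES.contains(o.storageClass()))
        .collect(Collectors.toList());
  }
}
```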



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-14837 : Support Read Restored Glacier Objects [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1932206238

   @bpahuja can you avoid rebasing and force-pushing unless some other change has 
broken the merge? It breaks the "changes since your last review" feature. If 
someone has stepped on the code, don't be afraid to rebase; I generally squash 
all my changes first to make the rebase easier.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-14837) Handle S3A "glacier" data

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-14837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815328#comment-17815328
 ] 

ASF GitHub Bot commented on HADOOP-14837:
-

steveloughran commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1932200850

   Checkstyle failures: we all get those, and sometimes we have to say "better 
if broken".
   ```
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/api/RequestFactory.java:256:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/impl/RequestFactoryImpl.java:617:
.optionalObjectAttributes(OptionalObjectAttributes.RESTORE_STATUS) // 
Optional Attribute to get the Restored Status of the Glacier Objects: Line is 
longer than 100 characters (found 146). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/impl/StoreContext.java:419:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/impl/StoreContext.java:420:
   * Return the S3ObjectStorageClassFilter object for S3A, whose value is set 
according to the config {@code fs.s3a.glacier.read.restored.objects}: Line is 
longer than 100 characters (found 146). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Listing.java:467:
// Handle Glacier Storage Class objects based on the config 
fs.s3a.glacier.read.restored.objects value set: Line is longer than 100 
characters (found 114). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Listing.java:468:
if ( s3ObjectStorageClassFilter.getFilter().apply(s3Object) &&  
acceptor.accept(keyPath, s3Object) && filter.accept(keyPath)) {: Line is longer 
than 100 characters (found 135). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Listing.java:468:
if ( s3ObjectStorageClassFilter.getFilter().apply(s3Object) &&  
acceptor.accept(keyPath, s3Object) && filter.accept(keyPath)) {:12: '(' is 
followed by whitespace. [ParenPad]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:30:
 * {@link S3ObjectStorageClassFilter} will filter the S3 files based on the 
{@code fs.s3a.glacier.read.restored.objects} configuration set in {@link 
S3AFileSystem}: Line is longer than 100 characters (found 163). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:32:
 * {@code READ_ALL}: Retrieval of Glacier files will fail with 
InvalidObjectStateException: The operation is not valid for the object's 
storage class.: Line is longer than 100 characters (found 150). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:33:
 * {@code SKIP_ALL_GLACIER}: If this value is set then this will ignore any S3 
Objects which are tagged with Glacier storage classes and retrieve the others.: 
Line is longer than 100 characters (found 157). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:34:
 * {@code READ_RESTORED_GLACIER_OBJECTS}: If this value is set then restored 
status of the Glacier object will be checked, if restored the objects would be 
read like normal S3 objects else they will be ignored as the objects would not 
have been retrieved from the S3 Glacier.: Line is longer than 100 characters 
(found 275). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:42:
  private static final Set GLACIER_STORAGE_CLASSES = 
Sets.newHashSet(ObjectStorageClass.GLACIER, ObjectStorageClass.DEEP_ARCHIVE);: 
Line is longer than 100 characters (found 150). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:50:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:59:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:68:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/list/ITestS3AReadRestoredGlacierObjects.java:70:
  private final int MAX_RETRIES = 100;:21: Name 'MAX_RETRIES' must match 
pattern '^[a-z][a-zA-Z0-9]*$'. [MemberName]
   
./hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/list/ITestS3AReadRestoredGlacierObjects.java:71:
  private final int RETRY_DELAY_MS = 5000;:21: Name 'RETRY_DELAY_MS' must match 
pattern '^[a-z][a-zA-Z0-9]*$'. [MemberName]
   
./hadoop-tools/hadoop-aws/src/test/java/org/apa

Re: [PR] HADOOP-14837 : Support Read Restored Glacier Objects [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1932200850

   Checkstyle failures: we all get those, and sometimes we have to say "better 
if broken".
   ```
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/api/RequestFactory.java:256:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/impl/RequestFactoryImpl.java:617:
.optionalObjectAttributes(OptionalObjectAttributes.RESTORE_STATUS) // 
Optional Attribute to get the Restored Status of the Glacier Objects: Line is 
longer than 100 characters (found 146). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/impl/StoreContext.java:419:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/impl/StoreContext.java:420:
   * Return the S3ObjectStorageClassFilter object for S3A, whose value is set 
according to the config {@code fs.s3a.glacier.read.restored.objects}: Line is 
longer than 100 characters (found 146). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Listing.java:467:
// Handle Glacier Storage Class objects based on the config 
fs.s3a.glacier.read.restored.objects value set: Line is longer than 100 
characters (found 114). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Listing.java:468:
if ( s3ObjectStorageClassFilter.getFilter().apply(s3Object) &&  
acceptor.accept(keyPath, s3Object) && filter.accept(keyPath)) {: Line is longer 
than 100 characters (found 135). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Listing.java:468:
if ( s3ObjectStorageClassFilter.getFilter().apply(s3Object) &&  
acceptor.accept(keyPath, s3Object) && filter.accept(keyPath)) {:12: '(' is 
followed by whitespace. [ParenPad]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:30:
 * {@link S3ObjectStorageClassFilter} will filter the S3 files based on the 
{@code fs.s3a.glacier.read.restored.objects} configuration set in {@link 
S3AFileSystem}: Line is longer than 100 characters (found 163). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:32:
 * {@code READ_ALL}: Retrieval of Glacier files will fail with 
InvalidObjectStateException: The operation is not valid for the object's 
storage class.: Line is longer than 100 characters (found 150). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:33:
 * {@code SKIP_ALL_GLACIER}: If this value is set then this will ignore any S3 
Objects which are tagged with Glacier storage classes and retrieve the others.: 
Line is longer than 100 characters (found 157). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:34:
 * {@code READ_RESTORED_GLACIER_OBJECTS}: If this value is set then restored 
status of the Glacier object will be checked, if restored the objects would be 
read like normal S3 objects else they will be ignored as the objects would not 
have been retrieved from the S3 Glacier.: Line is longer than 100 characters 
(found 275). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:42:
  private static final Set GLACIER_STORAGE_CLASSES = 
Sets.newHashSet(ObjectStorageClass.GLACIER, ObjectStorageClass.DEEP_ARCHIVE);: 
Line is longer than 100 characters (found 150). [LineLength]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:50:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:59:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ObjectStorageClassFilter.java:68:
  /**: First sentence should end with a period. [JavadocStyle]
   
./hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/list/ITestS3AReadRestoredGlacierObjects.java:70:
  private final int MAX_RETRIES = 100;:21: Name 'MAX_RETRIES' must match 
pattern '^[a-z][a-zA-Z0-9]*$'. [MemberName]
   
./hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/list/ITestS3AReadRestoredGlacierObjects.java:71:
  private final int RETRY_DELAY_MS = 5000;:21: Name 'RETRY_DELAY_MS' must match 
pattern '^[a-z][a-zA-Z0-9]*$'. [MemberName]
   
./hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/list/ITestS3AReadRestoredGlacierObjects.java:84:
conf.set(STORAGE_CLASS, glacierClass); // Create Glacier objects:Storage 
Class:DEEP_ARCHIVE/GLACIER: Line is longer than 100 characters (found 103). 
[LineLength]
   
./hadoop-tools/hadoop-

[jira] [Commented] (HADOOP-19043) S3A: Regression: ITestS3AOpenCost fails on prefetch test runs

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19043?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815327#comment-17815327
 ] 

ASF GitHub Bot commented on HADOOP-19043:
-

steveloughran commented on PR #6465:
URL: https://github.com/apache/hadoop/pull/6465#issuecomment-1932193942

   Ready to go in; reviews welcome.
   
   @ahmarsuhail @mukund-thakur @HarshitGupta11 




> S3A: Regression: ITestS3AOpenCost fails on prefetch test runs
> -
>
> Key: HADOOP-19043
> URL: https://issues.apache.org/jira/browse/HADOOP-19043
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3, test
>Affects Versions: 3.4.0
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Minor
>  Labels: pull-request-available
>
> Getting test failures in the new ITestS3AOpenCost tests when run with 
> {{-Dprefetch}}
> Thought I'd tested this, but clearly not
> * class cast failures on asserts (fix: skip)
> * bytes read different in one test: (fix: identify and address)
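A minimal sketch of the "skip" part above, assuming the standard fs.s3a.prefetch.enabled switch; the helper name is made up and this is not the patch itself.

```java
// Sketch: skip assertions that only hold for the classic input stream when a
// test run has prefetching enabled. Helper name is illustrative only.
import static org.junit.Assume.assumeFalse;

import org.apache.hadoop.conf.Configuration;

public final class PrefetchTestGuard {
  private PrefetchTestGuard() {
  }

  public static void skipIfPrefetching(Configuration conf) {
    boolean prefetching = conf.getBoolean("fs.s3a.prefetch.enabled", false);
    assumeFalse("stream-layout asserts do not apply to prefetching streams",
        prefetching);
  }
}
```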



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19043. S3A: Regression: ITestS3AOpenCost fails on prefetch test runs [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on PR #6465:
URL: https://github.com/apache/hadoop/pull/6465#issuecomment-1932193942

   Ready to go in; reviews welcome.
   
   @ahmarsuhail @mukund-thakur @HarshitGupta11 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-18993) S3A: Add option fs.s3a.classloader.isolation (#6301)

2024-02-07 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran updated HADOOP-18993:

Description: 
In HADOOP-17372 the S3AFileSystem forces the configuration classloader to be 
the same as the one that loaded S3AFileSystem. This leads to the impossibility 
in Spark applications to load third party credentials providers as user jars.


The option fs.s3a.classloader.isolation (default: true) can be set to false to 
disable s3a classloader isolation;

This can assist in using custom credential providers and other extension points.




  was:
In HADOOP-17372 the S3AFileSystem forces the configuration classloader to be 
the same as the one that loaded S3AFileSystem. This leads to the impossibility 
in Spark applications to load third party credentials providers as user jars.

I propose to add a configuration key {{fs.s3a.extensions.isolated.classloader}} 
with a default value of {{true}} that if set to {{false}} will not perform the 
classloader set.


> S3A: Add option fs.s3a.classloader.isolation (#6301)
> 
>
> Key: HADOOP-18993
> URL: https://issues.apache.org/jira/browse/HADOOP-18993
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: hadoop-thirdparty
>Affects Versions: 3.3.6
>Reporter: Antonio Murgia
>Assignee: Antonio Murgia
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> In HADOOP-17372 the S3AFileSystem forces the configuration classloader to be 
> the same as the one that loaded S3AFileSystem. This leads to the 
> impossibility in Spark applications to load third party credentials providers 
> as user jars.
> The option fs.s3a.classloader.isolation (default: true) can be set to false 
> to disable s3a classloader isolation;
> This can assist in using custom credential providers and other extension 
> points.
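A sketch of how the option might be used, assuming only the key named above; the credential provider class and bucket are placeholders, not real artifacts.

```java
// Sketch: disable S3A classloader isolation so a credential provider shipped
// in a user jar can be loaded. com.example.MyCredentialsProvider and
// "example-bucket" are placeholders.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ClassloaderIsolationExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.setBoolean("fs.s3a.classloader.isolation", false);
    conf.set("fs.s3a.aws.credentials.provider", "com.example.MyCredentialsProvider");
    try (FileSystem fs = new Path("s3a://example-bucket/").getFileSystem(conf)) {
      System.out.println("loaded " + fs.getUri());
    }
  }
}
```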



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-18993) S3A: Add option fs.s3a.classloader.isolation (#6301)

2024-02-07 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran updated HADOOP-18993:

Component/s: fs/s3
 (was: hadoop-thirdparty)

> S3A: Add option fs.s3a.classloader.isolation (#6301)
> 
>
> Key: HADOOP-18993
> URL: https://issues.apache.org/jira/browse/HADOOP-18993
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: fs/s3
>Affects Versions: 3.3.6
>Reporter: Antonio Murgia
>Assignee: Antonio Murgia
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> In HADOOP-17372 the S3AFileSystem forces the configuration classloader to be 
> the same as the one that loaded S3AFileSystem. This leads to the 
> impossibility in Spark applications to load third party credentials providers 
> as user jars.
> The option fs.s3a.classloader.isolation (default: true) can be set to false 
> to disable s3a classloader isolation;
> This can assist in using custom credential providers and other extension 
> points.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-18993) S3A: Add option fs.s3a.classloader.isolation (#6301)

2024-02-07 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran updated HADOOP-18993:

Summary: S3A: Add option fs.s3a.classloader.isolation (#6301)  (was: Allow 
to not isolate S3AFileSystem classloader when needed)

> S3A: Add option fs.s3a.classloader.isolation (#6301)
> 
>
> Key: HADOOP-18993
> URL: https://issues.apache.org/jira/browse/HADOOP-18993
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: hadoop-thirdparty
>Affects Versions: 3.3.6
>Reporter: Antonio Murgia
>Assignee: Antonio Murgia
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> In HADOOP-17372 the S3AFileSystem forces the configuration classloader to be 
> the same as the one that loaded S3AFileSystem. This leads to the 
> impossibility in Spark applications to load third party credentials providers 
> as user jars.
> I propose to add a configuration key 
> {{fs.s3a.extensions.isolated.classloader}} with a default value of {{true}} 
> that if set to {{false}} will not perform the classloader set.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-18993) Allow to not isolate S3AFileSystem classloader when needed

2024-02-07 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran updated HADOOP-18993:

Fix Version/s: 3.4.1

> Allow to not isolate S3AFileSystem classloader when needed
> --
>
> Key: HADOOP-18993
> URL: https://issues.apache.org/jira/browse/HADOOP-18993
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: hadoop-thirdparty
>Affects Versions: 3.3.6
>Reporter: Antonio Murgia
>Assignee: Antonio Murgia
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> In HADOOP-17372 the S3AFileSystem forces the configuration classloader to be 
> the same as the one that loaded S3AFileSystem. This leads to the 
> impossibility in Spark applications to load third party credentials providers 
> as user jars.
> I propose to add a configuration key 
> {{fs.s3a.extensions.isolated.classloader}} with a default value of {{true}} 
> that if set to {{false}} will not perform the classloader set.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] Hadoop 18993 - target 3.4 [hadoop]

2024-02-07 Thread via GitHub


steveloughran merged PR #6529:
URL: https://github.com/apache/hadoop/pull/6529


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-18993) Allow to not isolate S3AFileSystem classloader when needed

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815322#comment-17815322
 ] 

ASF GitHub Bot commented on HADOOP-18993:
-

steveloughran commented on PR #6301:
URL: https://github.com/apache/hadoop/pull/6301#issuecomment-1932165399

   Thanks. Low-priority issue, as it'll just be FS leakage somewhere; the simplest 
fix would be to use closeAllForUGI() in setup to destroy any old instances around.
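
   A minimal sketch of that suggestion, assuming a JUnit-style setup method; not 
the actual test change.
   ```java
   // Sketch: in test setup, close any cached FileSystem instances held for the
   // current user, so instances leaked by earlier tests cannot interfere.
   import org.apache.hadoop.fs.FileSystem;
   import org.apache.hadoop.security.UserGroupInformation;
   import org.junit.Before;

   public class LeakedFsCleanupExample {
     @Before
     public void setup() throws Exception {
       FileSystem.closeAllForUGI(UserGroupInformation.getCurrentUser());
     }
   }
   ```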




> Allow to not isolate S3AFileSystem classloader when needed
> --
>
> Key: HADOOP-18993
> URL: https://issues.apache.org/jira/browse/HADOOP-18993
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: hadoop-thirdparty
>Affects Versions: 3.3.6
>Reporter: Antonio Murgia
>Assignee: Antonio Murgia
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.5.0
>
>
> In HADOOP-17372 the S3AFileSystem forces the configuration classloader to be 
> the same as the one that loaded S3AFileSystem. This leads to the 
> impossibility in Spark applications to load third party credentials providers 
> as user jars.
> I propose to add a configuration key 
> {{fs.s3a.extensions.isolated.classloader}} with a default value of {{true}} 
> that if set to {{false}} will not perform the classloader set.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-18993 Allow to not isolate S3AFileSystem classloader when needed [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on PR #6301:
URL: https://github.com/apache/hadoop/pull/6301#issuecomment-1932165399

   Thanks. Low-priority issue, as it'll just be FS leakage somewhere; the simplest 
fix would be to use closeAllForUGI() in setup to destroy any old instances around.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[PR] HDFS-17374. EC: StripedBlockReader#newConnectedPeer should set SO_TIMEOUT and SO_KEEPALIVE [hadoop]

2024-02-07 Thread via GitHub


hfutatzhanghb opened a new pull request, #6536:
URL: https://github.com/apache/hadoop/pull/6536

   ### Description of PR
   Refer to HDFS-17374.
   More details will follow; the change needs more time to finish. A rough sketch 
of the two socket options named in the title is shown below.
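   ```java
   // Sketch: set SO_TIMEOUT and SO_KEEPALIVE on the socket used for a connection.
   // Illustrative only; the real StripedBlockReader#newConnectedPeer change may differ.
   import java.net.InetSocketAddress;
   import java.net.Socket;

   public final class SocketOptionsExample {
     private SocketOptionsExample() {
     }

     public static Socket connect(InetSocketAddress addr, int timeoutMs) throws Exception {
       Socket sock = new Socket();
       sock.connect(addr, timeoutMs);
       sock.setSoTimeout(timeoutMs);  // SO_TIMEOUT: bound blocking reads
       sock.setKeepAlive(true);       // SO_KEEPALIVE: detect dead peers
       return sock;
     }
   }
   ```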
   
   ### How was this patch tested?
   
   
   ### For code changes:
   
   - [ ] Does the title or this PR starts with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17373. Fix connection leak in FolderRenamePending [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6534:
URL: https://github.com/apache/hadoop/pull/6534#issuecomment-1932003350

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  12m 56s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  42m  5s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 36s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 34s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 32s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 38s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 38s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 33s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  3s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  31m 59s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 26s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 28s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 28s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 28s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 25s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  3s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  32m 29s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 15s |  |  hadoop-azure in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 134m 54s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6534/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6534 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 5eae444d189b 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 649c40fe96cc84f17ff0d97949fc2b55b46772e9 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6534/2/testReport/ |
   | Max. process+thread count | 624 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6534/2/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.

Re: [PR] HDFS-17373. Fix connection leak in FolderRenamePending [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6534:
URL: https://github.com/apache/hadoop/pull/6534#issuecomment-1931980610

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 34s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  42m  6s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 37s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 34s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 31s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 39s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 38s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 32s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  4s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  33m 57s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 28s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 28s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 28s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 26s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 26s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 19s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 25s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  2s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  32m 14s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   1m 57s |  |  hadoop-azure in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 123m 48s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6534/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6534 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux ac3427661d95 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 649c40fe96cc84f17ff0d97949fc2b55b46772e9 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6534/1/testReport/ |
   | Max. process+thread count | 700 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6534/1/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.

[jira] [Commented] (HADOOP-19045) S3A: pass request timeouts down to sdk clients

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19045?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815271#comment-17815271
 ] 

ASF GitHub Bot commented on HADOOP-19045:
-

steveloughran merged PR #6532:
URL: https://github.com/apache/hadoop/pull/6532




> S3A: pass request timeouts down to sdk clients
> --
>
> Key: HADOOP-19045
> URL: https://issues.apache.org/jira/browse/HADOOP-19045
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.4.0
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> s3a client timeout settings are getting down to http client, but not sdk 
> timeouts, so you can't have a longer timeout than the default. This surfaces 
> in the inability to tune the timeouts for CreateSession calls even now the 
> latest SDK does pick it up



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19045. CreateSession Timeout - followup [hadoop]

2024-02-07 Thread via GitHub


steveloughran merged PR #6532:
URL: https://github.com/apache/hadoop/pull/6532


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19069) Use hadoop-thirdparty 1.2.0

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815270#comment-17815270
 ] 

ASF GitHub Bot commented on HADOOP-19069:
-

hadoop-yetus commented on PR #6533:
URL: https://github.com/apache/hadoop/pull/6533#issuecomment-1931908407

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  17m 15s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  1s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 19s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  35m 48s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  18m 36s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |  16m 45s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  mvnsite  |   3m 12s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 39s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   2m 12s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  | 131m 10s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 40s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 47s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  17m 23s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |  17m 23s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  16m 31s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |  16m 31s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   3m  9s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 32s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   2m  3s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  46m 55s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   0m 33s |  |  hadoop-project in the patch 
passed.  |
   | +1 :green_heart: |  unit  |  19m 14s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   1m 11s |  |  hadoop-yarn-api in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 56s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 258m 15s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6533/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6533 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient codespell detsecrets xmllint |
   | uname | Linux ebd4d47543cd 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 7930d5c7a2977ed071852034770d1881798f1a26 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6533/1/testReport/ |
   | Max. process+thread count | 1263 (vs. ulimit of 5500) |
   | modules | C: hadoop-project hadoop-common-project/hadoop-common 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api U: . |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/P

Re: [PR] HADOOP-19069. Use hadoop-thirdparty 1.2.0. [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6533:
URL: https://github.com/apache/hadoop/pull/6533#issuecomment-1931908407

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  17m 15s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  1s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 19s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  35m 48s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  18m 36s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |  16m 45s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  mvnsite  |   3m 12s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 39s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   2m 12s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  | 131m 10s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 40s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 47s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  17m 23s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |  17m 23s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  16m 31s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |  16m 31s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   3m  9s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 32s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   2m  3s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  shadedclient  |  46m 55s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   0m 33s |  |  hadoop-project in the patch 
passed.  |
   | +1 :green_heart: |  unit  |  19m 14s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   1m 11s |  |  hadoop-yarn-api in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 56s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 258m 15s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6533/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6533 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient codespell detsecrets xmllint |
   | uname | Linux ebd4d47543cd 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 7930d5c7a2977ed071852034770d1881798f1a26 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6533/1/testReport/ |
   | Max. process+thread count | 1263 (vs. ulimit of 5500) |
   | modules | C: hadoop-project hadoop-common-project/hadoop-common 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api U: . |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6533/1/console |
   | versions | git=2.25.1 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond 

[jira] [Updated] (HADOOP-19057) S3 public test bucket landsat-pds unreadable -needs replacement

2024-02-07 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19057?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran updated HADOOP-19057:

Description: 
The s3 test bucket used in hadoop-aws tests of S3 select and large file reads 
is no longer publicly accessible

{code}
java.nio.file.AccessDeniedException: landsat-pds: getBucketMetadata() on 
landsat-pds: software.amazon.awssdk.services.s3.model.S3Exception: null 
(Service: S3, Status Code: 403, Request ID: 06QNYQ9GND5STQ2S, Extended Request 
ID: 
O+u2Y1MrCQuuSYGKRAWHj/5LcDLuaFS8owNuXXWSJ0zFXYfuCaTVLEP351S/umti558eKlUqV6U=):null

{code}

* Because HADOOP-18830 has cut s3 select, all we need in 3.4.1+ is a large file 
for some reading tests
* changing the default value disables s3 select tests on older releases
* if fs.s3a.scale.test.csvfile is set to " " then other tests which need it 
will be skipped

Proposed
* we locate a new large file under the (requester pays) s3a://usgs-landsat/ 
bucket . All releases with HADOOP-18168 can use this
* update 3.4.1 source to use this; document it
* do something similar for 3.3.9 + maybe even cut s3 select there too.
* document how to use it on older releases with requester-pays support
* document how to completely disable it on older releases.

h2. How to fix (most) landsat test failures on older releases

add this to your auth-keys.xml file. Expect some failures in a few tests 
with hard-coded references to the bucket (assumed role delegation tokens)

{code}
  <property>
    <name>fs.s3a.scale.test.csvfile</name>
    <value>s3a://noaa-cors-pds/raw/2023/017/ohfh/OHFH017d.23_.gz</value>
    <description>file used in scale tests</description>
  </property>

  <property>
    <name>fs.s3a.bucket.noaa-cors-pds.endpoint.region</name>
    <value>us-east-1</value>
  </property>

  <property>
    <name>fs.s3a.bucket.noaa-isd-pds.multipart.purge</name>
    <value>false</value>
    <description>Don't try to purge uploads in the read-only bucket, as
      it will only create log noise.</description>
  </property>

  <property>
    <name>fs.s3a.bucket.noaa-isd-pds.probe</name>
    <value>0</value>
    <description>Let's postpone existence checks to the first IO operation</description>
  </property>

  <property>
    <name>fs.s3a.bucket.noaa-isd-pds.audit.add.referrer.header</name>
    <value>false</value>
    <description>Do not add the referrer header</description>
  </property>

  <property>
    <name>fs.s3a.bucket.noaa-isd-pds.prefetch.block.size</name>
    <value>128k</value>
    <description>Use a small prefetch size so tests fetch multiple blocks</description>
  </property>
{code}


  was:
The s3 test bucket used in hadoop-aws tests of S3 select and large file reads 
is no longer publicly accessible

{code}
java.nio.file.AccessDeniedException: landsat-pds: getBucketMetadata() on 
landsat-pds: software.amazon.awssdk.services.s3.model.S3Exception: null 
(Service: S3, Status Code: 403, Request ID: 06QNYQ9GND5STQ2S, Extended Request 
ID: 
O+u2Y1MrCQuuSYGKRAWHj/5LcDLuaFS8owNuXXWSJ0zFXYfuCaTVLEP351S/umti558eKlUqV6U=):null

{code}

* Because HADOOP-18830 has cut s3 select, all we need in 3.4.1+ is a large file 
for some reading tests
* changing the default value disables s3 select tests on older releases
* if fs.s3a.scale.test.csvfile is set to " " then other tests which need it 
will be skipped

Proposed
* we locate a new large file under the (requester pays) s3a://usgs-landsat/ 
bucket . All releases with HADOOP-18168 can use this
* update 3.4.1 source to use this; document it
* do something similar for 3.3.9 + maybe even cut s3 select there too.
* document how to use it on older releases with requester-pays support
* document how to completely disable it on older releases.


> S3 public test bucket landsat-pds unreadable -needs replacement
> ---
>
> Key: HADOOP-19057
> URL: https://issues.apache.org/jira/browse/HADOOP-19057
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3, test
>Affects Versions: 3.4.0, 3.2.4, 3.3.9, 3.3.6, 3.5.0
>Reporter: Steve Loughran
>Priority: Critical
>  Labels: pull-request-available
>
> The s3 test bucket used in hadoop-aws tests of S3 select and large file reads 
> is no longer publicly accessible
> {code}
> java.nio.file.AccessDeniedException: landsat-pds: getBucketMetadata() on 
> landsat-pds: software.amazon.awssdk.services.s3.model.S3Exception: null 
> (Service: S3, Status Code: 403, Request ID: 06QNYQ9GND5STQ2S, Extended 
> Request ID: 
> O+u2Y1MrCQuuSYGKRAWHj/5LcDLuaFS8owNuXXWSJ0zFXYfuCaTVLEP351S/umti558eKlUqV6U=):null
> {code}
> * Because HADOOP-18830 has cut s3 select, all we need in 3.4.1+ is a large 
> file for some reading tests
> * changing the default value disables s3 select tests on older releases
> * if fs.s3a.scale.test.csvfile is set to " " then other tests which need it 
> will be skipped
> Proposed
> * we locate a new large file under the (requester pays) s3a://usgs-landsat/ 
> bucket . All releases with HADOOP-18168 can use this
> * update 3.4.1 source to use this; document it
> * do something similar for 3.3.9 + maybe even cut s3 select there too.
> * document how to use it on older releases with requester-pays support
> * document how to completely 

Re: [PR] Hadoop 18993 - target 3.4 [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on PR #6529:
URL: https://github.com/apache/hadoop/pull/6529#issuecomment-1931901281

   yeah, landsat bucket failures. see #6523. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19058) [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19058?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815264#comment-17815264
 ] 

ASF GitHub Bot commented on HADOOP-19058:
-

steveloughran commented on PR #6531:
URL: https://github.com/apache/hadoop/pull/6531#issuecomment-1931896429

   Commented. I looked at surefire and it is at 3.2.5; probably time to 
upgrade. I propose doing that upgrade before this one, splitting the work in two.
   
   And we need java8 and java17 profiles so we can add other options in other 
places




> [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn
> -
>
> Key: HADOOP-19058
> URL: https://issues.apache.org/jira/browse/HADOOP-19058
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Bilwa S T
>Assignee: Bilwa S T
>Priority: Major
>  Labels: pull-request-available
>
> Most of the UT's failed with below exception:
> Caused by: java.lang.ExceptionInInitializerError: Exception 
> java.lang.reflect.InaccessibleObjectException: Unable to make protected final 
> java.lang.Class 
> java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int) throws 
> java.lang.ClassFormatError accessible: module java.base does not "opens 
> java.lang" to unnamed module @d13f7c [in thread "Time-limited test"]



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19058. [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on PR #6531:
URL: https://github.com/apache/hadoop/pull/6531#issuecomment-1931896429

   Commented. I looked at surefire and it is at 3.2.5; probably time to 
upgrade. I propose doing that upgrade before this one, splitting the work in two.
   
   And we need java8 and java17 profiles so we can add other options in other 
places
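   
   As a rough sketch of what such a JDK-activated profile could look like (the element 
   layout and the extraJavaTestArgs property name are assumptions, not the actual 
   hadoop-project pom):
   
   {code}
   <profile>
     <id>java17</id>
     <activation>
       <jdk>[17,)</jdk>
     </activation>
     <properties>
       <!-- hypothetical property that a surefire argLine could pick up -->
       <extraJavaTestArgs>--add-opens java.base/java.lang=ALL-UNNAMED</extraJavaTestArgs>
     </properties>
   </profile>
   {code}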


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[PR] HDFS-17316 Add hadoop-compat-bench [hadoop]

2024-02-07 Thread via GitHub


HanFreedom opened a new pull request, #6535:
URL: https://github.com/apache/hadoop/pull/6535

   ### Description of PR
   
   A new hadoop-compat-bench module which is directly related to HDFS-17316.
   
   ### How was this patch tested?
   
   Unit test
   
   ### For code changes:
   
   - [ ] Does the title or this PR starts with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?  YES
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?  Not 
applicable
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?  No new dependency
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?  Not applicable
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19058) [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19058?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815261#comment-17815261
 ] 

ASF GitHub Bot commented on HADOOP-19058:
-

steveloughran commented on code in PR #6531:
URL: https://github.com/apache/hadoop/pull/6531#discussion_r1481358429


##
hadoop-project/pom.xml:
##
@@ -168,7 +168,18 @@
 [3.3.0,)
 
> [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn
> -
>
> Key: HADOOP-19058
> URL: https://issues.apache.org/jira/browse/HADOOP-19058
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Bilwa S T
>Assignee: Bilwa S T
>Priority: Major
>  Labels: pull-request-available
>
> Most of the UT's failed with below exception:
> Caused by: java.lang.ExceptionInInitializerError: Exception 
> java.lang.reflect.InaccessibleObjectException: Unable to make protected final 
> java.lang.Class 
> java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int) throws 
> java.lang.ClassFormatError accessible: module java.base does not "opens 
> java.lang" to unnamed module @d13f7c [in thread "Time-limited test"]



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19058. [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on code in PR #6531:
URL: https://github.com/apache/hadoop/pull/6531#discussion_r1481358429


##
hadoop-project/pom.xml:
##
@@ -168,7 +168,18 @@
 [3.3.0,)
 
 
--Xmx2048m -XX:+HeapDumpOnOutOfMemoryError
+-Xmx2048m -XX:+HeapDumpOnOutOfMemoryError --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED

Review Comment:
   it's a big ongoing bit of work; AWS have been busier than the rest of 
us... we do test against later Java versions, but do the build/UT/IT against v8.
   
   I'd like to move 3.5 and then 3.4.x to Java 17+, where x is a release 
which comes out this year but is greater than 1
   
   
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19068) ITestS3AClosedFS.testClosedInstrumentation fails on trunk on Mac OS Silicon

2024-02-07 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19068?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815260#comment-17815260
 ] 

Steve Loughran commented on HADOOP-19068:
-

do a parallel run to see if that makes it "go away" - with more JVMs there may 
be less chance of the problem surfacing.

we always run that way just because it is so much faster... this problem may 
have been hidden for that reason
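
For reference, a typical parallel invocation of the hadoop-aws integration tests looks 
like the line below (flags as documented for the S3A test suite; the thread count is 
only illustrative):

{code}
mvn clean verify -pl :hadoop-aws -Dparallel-tests -DtestsThreadCount=8
{code}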

> ITestS3AClosedFS.testClosedInstrumentation fails on trunk on Mac OS Silicon
> ---
>
> Key: HADOOP-19068
> URL: https://issues.apache.org/jira/browse/HADOOP-19068
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs/s3, test
>Affects Versions: 3.5.0
>Reporter: Antonio Murgia
>Priority: Minor
>
> Running {{mvn clean verify -pl :hadoop-aws}} on my laptop against my s3 
> bucket in eu-west-1, the test {{ITestS3AClosedFS.testClosedInstrumentation}} 
> fails every time, while it passes correctly when run in isolation.
> It does not fail on branch-3.4; the attached stack trace is from commit 
> 9a7eeadaac818258b319cdb0dc19e9bb1e4fa11a.
> h4. aws endpoint:
> s3.eu-west-1.amazonaws.com
> h4. os:
> [INFO] os.detected.name: osx
> [INFO] os.detected.arch: aarch_64
> [INFO] os.detected.bitness: 64
> [INFO] os.detected.version: 14.3
> [INFO] os.detected.version.major: 14
> [INFO] os.detected.version.minor: 3
> [INFO] os.detected.classifier: osx-aarch_64
> h4. java -version:
> openjdk version "1.8.0_292"
> OpenJDK Runtime Environment (Zulu 8.54.0.21-CA-macos-aarch64) (build 
> 1.8.0_292-b10)
> OpenJDK 64-Bit Server VM (Zulu 8.54.0.21-CA-macos-aarch64) (build 25.292-b10, 
> mixed mode)
> h4. stack trace:
> {{[ERROR] Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 
> 3.706 s <<< FAILURE! - in org.apache.hadoop.fs.s3a.ITestS3AClosedFS}}
> {{[ERROR] 
> testClosedInstrumentation(org.apache.hadoop.fs.s3a.ITestS3AClosedFS) Time 
> elapsed: 0.447 s <<< FAILURE!}}
> {{org.junit.ComparisonFailure: [S3AInstrumentation.hasMetricSystem()] 
> expected:<[fals]e> but was:<[tru]e>}}
> {{at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)}}
> {{at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)}}
> {{at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)}}
> {{at 
> org.apache.hadoop.fs.s3a.ITestS3AClosedFS.testClosedInstrumentation(ITestS3AClosedFS.java:111)}}
> {{at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)}}
> {{at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)}}
> {{at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)}}
> {{at java.lang.reflect.Method.invoke(Method.java:498)}}
> {{at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)}}
> {{at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)}}
> {{at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)}}
> {{at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)}}
> {{at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)}}
> {{at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)}}
> {{at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:61)}}
> {{at 
> org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299)}}
> {{at 
> org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293)}}
> {{at java.util.concurrent.FutureTask.run(FutureTask.java:266)}}
> {{at java.lang.Thread.run(Thread.java:748)}}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] YARN-11387. [GPG] YARN GPG mistakenly deleted applicationid. [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6473:
URL: https://github.com/apache/hadoop/pull/6473#issuecomment-1931867203

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 35s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  44m 22s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 22s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 22s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 23s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 25s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 30s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 23s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   0m 43s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  36m 56s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 17s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 17s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 16s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 16s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 12s | 
[/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-globalpolicygenerator.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6473/4/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-globalpolicygenerator.txt)
 |  
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-globalpolicygenerator:
 The patch generated 10 new + 0 unchanged - 0 fixed = 10 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 19s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 17s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   0m 44s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  35m 27s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   0m 59s |  |  
hadoop-yarn-server-globalpolicygenerator in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 129m 37s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6473/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6473 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux a244f44d355e 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / fa157666cbc6cdc1894473c748d16682b8ee8e9b |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6473/4/testReport/ |
   | Max. process+thread count | 552 (vs. ulimit of 5500) |
   | modules | C: 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-globalpolicygenerator
 U: 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-globalpolicygenerator
 |
   | Consol

Re: [PR] HDFS-17354: Delay invoke clearStaleNamespacesInRouterStateIdContext during router start up [hadoop]

2024-02-07 Thread via GitHub


ThinkerLei commented on PR #6498:
URL: https://github.com/apache/hadoop/pull/6498#issuecomment-1931846902

   @zhangshuyan0 @simbadzina Added a test case; please help review again.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17354: Delay invoke clearStaleNamespacesInRouterStateIdContext during router start up [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6498:
URL: https://github.com/apache/hadoop/pull/6498#issuecomment-1931819876

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  12m 39s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  43m  6s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 39s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 37s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 30s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 42s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 42s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 31s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m 20s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  32m 59s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 32s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 29s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 31s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 28s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m 20s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  32m 48s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   9m 43s |  |  hadoop-hdfs-rbf in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 145m 39s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6498/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6498 |
   | JIRA Issue | HDFS-17354 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 3c4257fdeb6b 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 24fc113c036c1b4530eec123f788cdd0da72db49 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6498/2/testReport/ |
   | Max. process+thread count | 2572 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: 
hadoop-hdfs-project/hadoop-hdfs-rbf |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6498/2/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastruct

Re: [PR] HADOOP-14837 : Support Read Restored Glacier Objects [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1931817110

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 20s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  1s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  31m 45s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 25s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 20s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 21s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 25s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 18s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 21s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   0m 43s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  19m 36s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 17s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 19s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 19s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 16s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 16s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 12s | 
[/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/10/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  hadoop-tools/hadoop-aws: The patch generated 21 new + 6 unchanged - 0 fixed 
= 27 total (was 6)  |
   | +1 :green_heart: |  mvnsite  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m  9s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 17s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   0m 42s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  19m  1s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 28s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 23s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   |  81m 32s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/10/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6407 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint 
|
   | uname | Linux 8e4ff0d8df99 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / c5dcd5c4426135b7b054ed19f4bcf8e68d955d77 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/10/testReport/ |
   | Max. process+thread count | 559 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/10/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


--

[jira] [Commented] (HADOOP-14837) Handle S3A "glacier" data

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-14837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815249#comment-17815249
 ] 

ASF GitHub Bot commented on HADOOP-14837:
-

hadoop-yetus commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1931817110

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 20s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  1s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  31m 45s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 25s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 20s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 21s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 25s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 18s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 21s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   0m 43s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  19m 36s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 17s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 19s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 19s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 16s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 16s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 12s | 
[/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/10/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  hadoop-tools/hadoop-aws: The patch generated 21 new + 6 unchanged - 0 fixed 
= 27 total (was 6)  |
   | +1 :green_heart: |  mvnsite  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m  9s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 17s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   0m 42s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  19m  1s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 28s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 23s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   |  81m 32s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/10/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6407 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint 
|
   | uname | Linux 8e4ff0d8df99 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / c5dcd5c4426135b7b054ed19f4bcf8e68d955d77 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/10/testReport/ |
   | Max. process+thread count | 559 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
ht

Re: [PR] HDFS-17373. Fix connection leak in FolderRenamePending [hadoop]

2024-02-07 Thread via GitHub


xuzifu666 commented on PR #6534:
URL: https://github.com/apache/hadoop/pull/6534#issuecomment-1931780456

   @ayushtkn @slfan1989  PTAL


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[PR] HDFS-17373. Fix connection leak in FolderRenamePending [hadoop]

2024-02-07 Thread via GitHub


xuzifu666 opened a new pull request, #6534:
URL: https://github.com/apache/hadoop/pull/6534

   
   
   ### Description of PR
   There is currently a connection leak in FolderRenamePending when getting the file's 
bytes; this PR aims to fix it.
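   
   As a generic illustration of the close-on-read pattern such a fix typically uses 
   (editor's sketch with hypothetical names, not the actual FolderRenamePending code):
   
   {code}
   // Editor's sketch only -- hypothetical helper, not the actual FolderRenamePending code.
   import java.io.IOException;
   import org.apache.hadoop.fs.FSDataInputStream;
   import org.apache.hadoop.fs.FileSystem;
   import org.apache.hadoop.fs.Path;
   import org.apache.hadoop.io.IOUtils;
   
   public final class RenamePendingReader {
     private RenamePendingReader() {}
   
     /** Read the pending-rename file, closing the stream even if the read fails. */
     public static byte[] readBytes(FileSystem fs, Path renamePendingFile, int len)
         throws IOException {
       // try-with-resources guarantees the stream (and its underlying connection) is closed
       try (FSDataInputStream in = fs.open(renamePendingFile)) {
         byte[] bytes = new byte[len];
         IOUtils.readFully(in, bytes, 0, len);
         return bytes;
       }
     }
   }
   {code}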
   
   
   ### How was this patch tested?
   
   
   ### For code changes:
   
   - [ ] Does the title or this PR starts with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19044) AWS SDK V2 - Update S3A region logic

2024-02-07 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815233#comment-17815233
 ] 

Steve Loughran commented on HADOOP-19044:
-

As well as the FIPS stuff, we've hit some fun with regions which came out after 
2020: 
https://docs.aws.amazon.com/AmazonS3/latest/userguide/VirtualHosting.html#VirtualHostingBackwardsCompatibility

bq. For S3 buckets in Regions launched after March 20, 2019, the DNS server 
doesn't route your request directly to the AWS Region where your bucket 
resides. It returns an HTTP 400 Bad Request error instead.
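
A hedged example of the usual workaround for such buckets: pin the region explicitly so 
no DNS-based inference is needed. The key is the one discussed in this thread; the region 
value is only an example of a post-2019 region.

{code}
<property>
  <name>fs.s3a.endpoint.region</name>
  <value>eu-south-1</value>
  <description>Example only: explicitly declare the bucket's region for regions
    launched after March 2019, instead of relying on DNS-based inference.</description>
</property>
{code}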

> AWS SDK V2 - Update S3A region logic 
> -
>
> Key: HADOOP-19044
> URL: https://issues.apache.org/jira/browse/HADOOP-19044
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.4.0
>Reporter: Ahmar Suhail
>Assignee: Viraj Jasani
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.0, 3.4.1
>
>
> If both fs.s3a.endpoint & fs.s3a.endpoint.region are empty, Spark will set 
> fs.s3a.endpoint to 
> s3.amazonaws.com here:
> [https://github.com/apache/spark/blob/9a2f39318e3af8b3817dc5e4baf52e548d82063c/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala#L540]
>  
>  
> HADOOP-18908, updated the region logic such that if fs.s3a.endpoint.region is 
> set, or if a region can be parsed from fs.s3a.endpoint (which will happen in 
> this case, region will be US_EAST_1), cross region access is not enabled. 
> This will cause 400 errors if the bucket is not in US_EAST_1. 
>  
> Proposed: Updated the logic so that if the endpoint is the global 
> s3.amazonaws.com , cross region access is enabled.  
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] YARN-11387. [GPG] YARN GPG mistakenly deleted applicationid. [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6473:
URL: https://github.com/apache/hadoop/pull/6473#issuecomment-1931745994

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  17m 56s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  49m  4s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 30s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 24s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 28s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 30s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 37s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   0m 49s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  38m 22s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 17s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 17s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 16s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 16s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 13s | 
[/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-globalpolicygenerator.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6473/3/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-globalpolicygenerator.txt)
 |  
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-globalpolicygenerator:
 The patch generated 5 new + 0 unchanged - 0 fixed = 5 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 18s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 17s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   0m 44s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  39m 27s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  |   0m 58s | 
[/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-globalpolicygenerator.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6473/3/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-globalpolicygenerator.txt)
 |  hadoop-yarn-server-globalpolicygenerator in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 34s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 157m 21s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | 
hadoop.yarn.server.globalpolicygenerator.applicationcleaner.TestDefaultApplicationCleaner
 |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6473/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6473 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux a8d0e4abcb07 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / b66ad5192a6baee990f4bf2afa4424df2e9a9d71 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~

[jira] [Commented] (HADOOP-19057) S3 public test bucket landsat-pds unreadable -needs replacement

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815225#comment-17815225
 ] 

ASF GitHub Bot commented on HADOOP-19057:
-

steveloughran commented on PR #6515:
URL: https://github.com/apache/hadoop/pull/6515#issuecomment-1931710763

   The file s3a://noaa-cors-pds/raw/2023/017/ohfh/OHFH017d.23_.gz is longer; this 
should stop EOF regressions on the older branches.




> S3 public test bucket landsat-pds unreadable -needs replacement
> ---
>
> Key: HADOOP-19057
> URL: https://issues.apache.org/jira/browse/HADOOP-19057
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3, test
>Affects Versions: 3.4.0, 3.2.4, 3.3.9, 3.3.6, 3.5.0
>Reporter: Steve Loughran
>Priority: Critical
>  Labels: pull-request-available
>
> The s3 test bucket used in hadoop-aws tests of S3 select and large file reads 
> is no longer publicly accessible
> {code}
> java.nio.file.AccessDeniedException: landsat-pds: getBucketMetadata() on 
> landsat-pds: software.amazon.awssdk.services.s3.model.S3Exception: null 
> (Service: S3, Status Code: 403, Request ID: 06QNYQ9GND5STQ2S, Extended 
> Request ID: 
> O+u2Y1MrCQuuSYGKRAWHj/5LcDLuaFS8owNuXXWSJ0zFXYfuCaTVLEP351S/umti558eKlUqV6U=):null
> {code}
> * Because HADOOP-18830 has cut s3 select, all we need in 3.4.1+ is a large 
> file for some reading tests
> * changing the default value disables s3 select tests on older releases
> * if fs.s3a.scale.test.csvfile is set to " " then other tests which need it 
> will be skipped
> Proposed
> * we locate a new large file under the (requester pays) s3a://usgs-landsat/ 
> bucket. All releases with HADOOP-18168 can use this
> * update 3.4.1 source to use this; document it
> * do something similar for 3.3.9 + maybe even cut s3 select there too.
> * document how to use it on older releases with requester-pays support
> * document how to completely disable it on older releases.
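
A minimal sketch of the proposal above, not part of any committed patch: pointing the
scale-test source file at an object in a requester-pays bucket. The object path below is
a placeholder rather than the replacement actually chosen, and it assumes the
`fs.s3a.requester.pays.enabled` option documented for recent S3A releases.

```java
// Sketch only: configure the S3A scale/read tests to use a requester-pays
// object instead of the old landsat-pds CSV. The path is a placeholder.
import org.apache.hadoop.conf.Configuration;

public final class ScaleTestSourceConfig {
  private ScaleTestSourceConfig() {}

  public static Configuration useRequesterPaysSource(Configuration conf) {
    // Placeholder object; the real replacement file is picked in the PR.
    conf.set("fs.s3a.scale.test.csvfile",
        "s3a://usgs-landsat/path/to/large-file");
    // Requester-pays access must be enabled for the bucket holding the file
    // (assumes the fs.s3a.requester.pays.enabled switch of recent releases).
    conf.setBoolean("fs.s3a.requester.pays.enabled", true);
    return conf;
  }

  public static Configuration disableCsvTests(Configuration conf) {
    // Per the description above: a single space skips tests that need the file.
    conf.set("fs.s3a.scale.test.csvfile", " ");
    return conf;
  }
}
```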



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-19057. Landsat bucket deleted [hadoop]

2024-02-07 Thread via GitHub


steveloughran commented on PR #6515:
URL: https://github.com/apache/hadoop/pull/6515#issuecomment-1931710763

   The file s3a://noaa-cors-pds/raw/2023/017/ohfh/OHFH017d.23_.gz is longer; this 
should stop EOF regressions on the older branches.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-14837) Handle S3A "glacier" data

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-14837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815211#comment-17815211
 ] 

ASF GitHub Bot commented on HADOOP-14837:
-

bpahuja commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1931661739

   Build failed due to an OOM error (log below); the build passes locally.
   
   ```
   [INFO] 

   [INFO] BUILD FAILURE
   [INFO] 

   [INFO] Total time:  01:33 min
   [INFO] Finished at: 2024-02-07T08:13:12Z
   [INFO] 

   [ERROR] unable to create new native thread -> [Help 1]
   [ERROR] 
   [ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
   [ERROR] Re-run Maven using the -X switch to enable full debug logging.
   [ERROR] 
   [ERROR] For more information about the errors and possible solutions, please 
read the following articles:
   [ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
   
   ```




> Handle S3A "glacier" data
> -
>
> Key: HADOOP-14837
> URL: https://issues.apache.org/jira/browse/HADOOP-14837
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.0.0-beta1
>Reporter: Steve Loughran
>Assignee: Bhavay Pahuja
>Priority: Minor
>  Labels: pull-request-available
>
> SPARK-21797 covers how if you have AWS S3 set to copy some files to glacier, 
> they appear in the listing but GETs fail, and so does everything else.
> We should think about how best to handle this.
> # report better
> # if listings can identify files which are glaciated then maybe we could have 
> an option to filter them out
> # test & see what happens
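
As a purely illustrative sketch of option 2 in the list above (not the approach taken in
the linked PR), a listing could drop entries whose storage class reports as Glacier,
using the storage class exposed on AWS SDK v2 list results:

```java
// Sketch of filtering glaciated objects out of a listing. Illustrative only;
// it is not the logic used in PR #6407.
import java.util.List;
import java.util.stream.Collectors;
import software.amazon.awssdk.services.s3.model.ObjectStorageClass;
import software.amazon.awssdk.services.s3.model.S3Object;

public final class GlacierFilter {
  private GlacierFilter() {}

  /** Drop entries whose storage class is Glacier or Deep Archive. */
  public static List<S3Object> skipGlaciated(List<S3Object> objects) {
    return objects.stream()
        .filter(o -> o.storageClass() != ObjectStorageClass.GLACIER
            && o.storageClass() != ObjectStorageClass.DEEP_ARCHIVE)
        .collect(Collectors.toList());
  }
}
```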



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HADOOP-14837 : Support Read Restored Glacier Objects [hadoop]

2024-02-07 Thread via GitHub


bpahuja commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1931661739

   Build failed due to an OOM error (log below); the build passes locally.
   
   ```
   [INFO] 

   [INFO] BUILD FAILURE
   [INFO] 

   [INFO] Total time:  01:33 min
   [INFO] Finished at: 2024-02-07T08:13:12Z
   [INFO] 

   [ERROR] unable to create new native thread -> [Help 1]
   [ERROR] 
   [ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
   [ERROR] Re-run Maven using the -X switch to enable full debug logging.
   [ERROR] 
   [ERROR] For more information about the errors and possible solutions, please 
read the following articles:
   [ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
   
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



Re: [PR] HDFS-17365. EC: Add extra redunency configuration in checkStreamerFailures to prevent data loss. [hadoop]

2024-02-07 Thread via GitHub


hfutatzhanghb commented on PR #6517:
URL: https://github.com/apache/hadoop/pull/6517#issuecomment-1931654235

   > This is a good idea to me. I think we should allow different EC 
strategies to use different sizes of `extraStreamerRedunency`. Maybe a 
structure like `ReplaceDatanodeOnFailure`? Or maybe just include the EC schema 
name in the configuration key?
   
   Nice suggestion, sir. Give me some time to think about how to design it; I 
hope we can push this idea forward. Thanks a lot!
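   
   A purely hypothetical sketch of the second idea (carrying the EC policy name in the 
configuration key); the key names below are invented for illustration and do not exist 
in HDFS:
   
   ```java
   // Hypothetical only: per-EC-policy lookup of extra streamer redundancy.
   // Neither key name exists in HDFS; they only show the shape of the idea.
   import org.apache.hadoop.conf.Configuration;
   
   public final class ExtraRedundancyConf {
     private static final String BASE_KEY =
         "dfs.client.ec.write.extra.streamer.redundancy"; // invented key
   
     private ExtraRedundancyConf() {}
   
     /** Policy-specific value (e.g. "RS-6-3-1024k"), falling back to a global default. */
     public static int getExtraRedundancy(Configuration conf, String ecPolicyName) {
       int globalDefault = conf.getInt(BASE_KEY, 0);
       return conf.getInt(BASE_KEY + "." + ecPolicyName, globalDefault);
     }
   }
   ```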


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-14837) Handle S3A "glacier" data

2024-02-07 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-14837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17815153#comment-17815153
 ] 

ASF GitHub Bot commented on HADOOP-14837:
-

hadoop-yetus commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1931517312

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 23s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | -1 :x: |  mvninstall  |   1m 40s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-mvninstall-root.txt)
 |  root in trunk failed.  |
   | -1 :x: |  compile  |   0m 22s | 
[/branch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in trunk failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  compile  |   3m 21s | 
[/branch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  hadoop-aws in trunk failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | -0 :warning: |  checkstyle  |   0m 19s | 
[/buildtool-branch-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/buildtool-branch-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  The patch fails to run checkstyle in hadoop-aws  |
   | -1 :x: |  mvnsite  |   0m 21s | 
[/branch-mvnsite-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-mvnsite-hadoop-tools_hadoop-aws.txt)
 |  hadoop-aws in trunk failed.  |
   | -1 :x: |  javadoc  |   0m 22s | 
[/branch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in trunk failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  javadoc  |   0m  5s | 
[/branch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  hadoop-aws in trunk failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | +1 :green_heart: |  spotbugs  |   1m 45s |  |  trunk passed  |
   | -1 :x: |  shadedclient  |   4m 27s |  |  branch has errors when building 
and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | -1 :x: |  mvninstall  |   0m 20s | 
[/patch-mvninstall-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/patch-mvninstall-hadoop-tools_hadoop-aws.txt)
 |  hadoop-aws in the patch failed.  |
   | -1 :x: |  compile  |   0m 20s | 
[/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  javac  |   0m 20s | 
[/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | +1 :green_heart: |  compile  |   0m 30s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s 

Re: [PR] HADOOP-14837 : Support Read Restored Glacier Objects [hadoop]

2024-02-07 Thread via GitHub


hadoop-yetus commented on PR #6407:
URL: https://github.com/apache/hadoop/pull/6407#issuecomment-1931517312

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 23s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | -1 :x: |  mvninstall  |   1m 40s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-mvninstall-root.txt)
 |  root in trunk failed.  |
   | -1 :x: |  compile  |   0m 22s | 
[/branch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in trunk failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  compile  |   3m 21s | 
[/branch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  hadoop-aws in trunk failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | -0 :warning: |  checkstyle  |   0m 19s | 
[/buildtool-branch-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/buildtool-branch-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  The patch fails to run checkstyle in hadoop-aws  |
   | -1 :x: |  mvnsite  |   0m 21s | 
[/branch-mvnsite-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-mvnsite-hadoop-tools_hadoop-aws.txt)
 |  hadoop-aws in trunk failed.  |
   | -1 :x: |  javadoc  |   0m 22s | 
[/branch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in trunk failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  javadoc  |   0m  5s | 
[/branch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/branch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  hadoop-aws in trunk failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | +1 :green_heart: |  spotbugs  |   1m 45s |  |  trunk passed  |
   | -1 :x: |  shadedclient  |   4m 27s |  |  branch has errors when building 
and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | -1 :x: |  mvninstall  |   0m 20s | 
[/patch-mvninstall-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/patch-mvninstall-hadoop-tools_hadoop-aws.txt)
 |  hadoop-aws in the patch failed.  |
   | -1 :x: |  compile  |   0m 20s | 
[/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  javac  |   0m 20s | 
[/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | +1 :green_heart: |  compile  |   0m 30s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 20s | 
[/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6407/9/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.