[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-09-05 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17879678#comment-17879678
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

steveloughran commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-2332491384

   thanks. credited you there.
   
   also backported with testing onto branches 3.4 and 3.4.1; the next RC of 
that release is coming soon!




> S3A region logic to handle vpce and non standard endpoints 
> ---
>
> Key: HADOOP-18938
> URL: https://issues.apache.org/jira/browse/HADOOP-18938
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Affects Versions: 3.4.0
>Reporter: Ahmar Suhail
>Assignee: Shintaro Onuma
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.0
>
>
> For non-standard endpoints such as VPCE, the region parsing added in
> HADOOP-18908 doesn't work. This is expected, as that logic is only meant to be
> used for standard endpoints.
> If you are using a non-standard endpoint, check whether a region is also
> provided, else fail fast.
> Also update the documentation to explain the region and endpoint behaviour
> with SDK V2.
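
For illustration only, a minimal sketch (not from the patch) of the intent on the user side: pair a non-standard endpoint with an explicit region, since the region cannot be reliably parsed from such a hostname. The `vpce-...` endpoint id below is hypothetical.

```java
import org.apache.hadoop.conf.Configuration;

public class S3AVpceConfigSketch {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // Hypothetical S3 interface (VPC) endpoint; the "vpce-..." id is made up.
    conf.set("fs.s3a.endpoint",
        "https://bucket.vpce-0123456789abcdef0-example.s3.us-west-2.vpce.amazonaws.com");
    // With a non-standard endpoint, state the region explicitly rather than
    // relying on it being parsed from the hostname.
    conf.set("fs.s3a.endpoint.region", "us-west-2");
    System.out.println(conf.get("fs.s3a.endpoint.region"));
  }
}
```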



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-09-05 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17879555#comment-17879555
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

steveloughran commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-2331656550

   merged. 
   
   @shintaroonuma what is your Jira account, so we can assign the issue to you there too?







[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-08-29 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17877856#comment-17877856
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

steveloughran commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-2318561465

   sorry, forgotten about this...don't be afraid to remind me! 
   testing locally and will merge if good.







[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-04-25 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841054#comment-17841054
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

hadoop-yetus commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-2078647702

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m 01s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  spotbugs  |   0m 00s |  |  spotbugs executables are not 
available.  |
   | +0 :ok: |  codespell  |   0m 00s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m 00s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m 01s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m 00s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  | 108m 16s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   5m 41s |  |  trunk passed  |
   | +1 :green_heart: |  checkstyle  |   5m 21s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   6m 00s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   5m 40s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  | 173m 14s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   4m 11s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   2m 42s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   2m 42s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m 00s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   2m 25s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   2m 52s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 34s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  | 191m 32s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  asflicense  |   6m 49s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 501m 31s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | GITHUB PR | https://github.com/apache/hadoop/pull/6466 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | MINGW64_NT-10.0-17763 c92fbd9e5cae 3.4.10-87d57229.x86_64 
2024-02-14 20:17 UTC x86_64 Msys |
   | Build tool | maven |
   | Personality | /c/hadoop/dev-support/bin/hadoop.sh |
   | git revision | trunk / 97360ba71f24df4cfc2d44f2f05c1bee0129a968 |
   | Default Java | Azul Systems, Inc.-1.8.0_332-b09 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6466/1/testReport/
 |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6466/1/console
 |
   | versions | git=2.44.0.windows.1 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   







[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-02-09 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17816200#comment-17816200
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

hadoop-yetus commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-1936513805

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 31s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  41m 47s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 42s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 34s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 32s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 41s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 27s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 33s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  7s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  32m 25s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 33s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 33s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 26s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 26s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 19s | 
[/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/5/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  hadoop-tools/hadoop-aws: The patch generated 14 new + 2 unchanged - 0 fixed 
= 16 total (was 2)  |
   | +1 :green_heart: |  mvnsite  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 16s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  6s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  32m 18s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   3m  0s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 122m 58s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/5/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6466 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 2dd56b6a7570 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 97360ba71f24df4cfc2d44f2f05c1bee0129a968 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/5/testReport/ |
   | Max. process+thread count | 552 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/5/console |
   | versions | git=2.25.1

[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-02-09 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17816152#comment-17816152
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

hadoop-yetus commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-1936281849

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 31s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 2 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  41m 49s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 42s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 34s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 32s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 41s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 28s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 34s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  6s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  32m 15s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 34s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 34s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 26s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 26s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 20s | 
[/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/4/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  hadoop-tools/hadoop-aws: The patch generated 14 new + 2 unchanged - 0 fixed 
= 16 total (was 2)  |
   | +1 :green_heart: |  mvnsite  |   0m 31s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 15s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  6s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  32m 16s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 57s |  |  hadoop-aws in the patch passed. 
 |
   | -1 :x: |  asflicense  |   0m 35s | 
[/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/4/artifact/out/results-asflicense.txt)
 |  The patch generated 1 ASF License warnings.  |
   |  |   | 122m 52s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6466 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux e5feb12daa84 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 5580fa2f0d366ae6f9ba0379666f240be28dcb1e |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/4/testReport/ |
   | Max. process+thread count | 635 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console 

[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-02-09 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17816110#comment-17816110
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

shintaroonuma commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-1936098571

   Thanks for the comments, rebased and updated the pr.







[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-02-09 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17816108#comment-17816108
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

shintaroonuma commented on code in PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#discussion_r148232


##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -361,7 +366,15 @@ private static URI getS3Endpoint(String endpoint, final 
Configuration conf) {
*/
   private static Region getS3RegionFromEndpoint(String endpoint) {

Review Comment:
   Added some unit tests on endpoint parsing.








[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-02-09 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17816109#comment-17816109
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

shintaroonuma commented on code in PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#discussion_r148513


##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -82,6 +84,9 @@ public class DefaultS3ClientFactory extends Configured
 
   private static final String S3_SERVICE_NAME = "s3";
 
+  private static final Pattern VPC_ENDPOINT_PATTERN =
+  
Pattern.compile("^(?:.+\\.)?([a-z0-9-]+)\\.vpce\\.amazonaws\\.(?:com|com\\.cn)$");

Review Comment:
   GovCloud is amazonaws.com as well.
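
   For reference, a standalone sketch showing how the pattern quoted above extracts the region label from a VPCE hostname; the hostnames are made-up examples and the merged pattern may differ.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VpceRegionParseDemo {
  // Pattern as quoted in the review thread above.
  private static final Pattern VPC_ENDPOINT_PATTERN =
      Pattern.compile("^(?:.+\\.)?([a-z0-9-]+)\\.vpce\\.amazonaws\\.(?:com|com\\.cn)$");

  public static void main(String[] args) {
    String[] endpoints = {
        // Hypothetical VPCE hostnames (commercial and China partitions).
        "vpce-0123456789abcdef0-example.s3.us-west-2.vpce.amazonaws.com",
        "bucket.vpce-0123456789abcdef0-example.s3.cn-north-1.vpce.amazonaws.com.cn",
        // A standard endpoint: must not match the VPCE pattern.
        "s3.eu-west-1.amazonaws.com"
    };
    for (String endpoint : endpoints) {
      Matcher matcher = VPC_ENDPOINT_PATTERN.matcher(endpoint);
      if (matcher.find()) {
        // group(1) is the label immediately before ".vpce.amazonaws.com[.cn]",
        // i.e. the region in the hostnames above.
        System.out.println(endpoint + " -> region " + matcher.group(1));
      } else {
        System.out.println(endpoint + " -> no VPCE region; caller must supply one");
      }
    }
  }
}
```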








[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-30 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17812385#comment-17812385
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

steveloughran commented on code in PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#discussion_r1471504240


##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -82,6 +84,9 @@ public class DefaultS3ClientFactory extends Configured
 
   private static final String S3_SERVICE_NAME = "s3";
 
+  private static final Pattern VPC_ENDPOINT_PATTERN =
+  
Pattern.compile("^(?:.+\\.)?([a-z0-9-]+)\\.vpce\\.amazonaws\\.(?:com|com\\.cn)$");

Review Comment:
   does AWS GovCloud have a different pattern? or is it just .cn?








[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-30 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17812381#comment-17812381
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

steveloughran commented on code in PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#discussion_r1471473061


##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -361,6 +366,13 @@ private static URI getS3Endpoint(String endpoint, final 
Configuration conf) {
*/
   private static Region getS3RegionFromEndpoint(String endpoint) {
 
+// S3 VPC endpoint parsing
+Matcher matcher = VPC_ENDPOINT_PATTERN.matcher(endpoint);
+if(matcher.find()) {
+  LOG.debug("Endpoint {} is vpc endpoint; parsing", endpoint);
+  return Region.of(matcher.group(1));

Review Comment:
   could you log the group(1) value in the debug log so we can see exactly which one it is? Region and endpoint bindings are the cause of many of our support calls.
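
   A rough sketch of that suggestion applied to getS3RegionFromEndpoint(), reusing the LOG, VPC_ENDPOINT_PATTERN and Region names from the diff above; this is not the merged code.

```java
// Sketch only: log the parsed region as well as the endpoint, so support
// logs show exactly which region a VPCE hostname was mapped to.
Matcher matcher = VPC_ENDPOINT_PATTERN.matcher(endpoint);
if (matcher.find()) {
  String region = matcher.group(1);
  LOG.debug("Endpoint {} is a VPC endpoint; mapping to region {}", endpoint, region);
  return Region.of(region);
}
```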



##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -361,7 +366,15 @@ private static URI getS3Endpoint(String endpoint, final 
Configuration conf) {
*/
   private static Region getS3RegionFromEndpoint(String endpoint) {

Review Comment:
   this could be made package-private/visible for testing and then have some unit tests which will run every time yetus does its work. This should include a test which passes an endpoint which shouldn't match the regexp, and verifies that it is rejected.
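
   A sketch of the kind of unit test being asked for, assuming (hypothetically) that getS3RegionFromEndpoint() is made package-private and returns null when the endpoint matches nothing.

```java
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;

import org.junit.Test;

public class TestVpceRegionParsing {

  @Test
  public void testVpceEndpointYieldsRegion() {
    // Hypothetical VPCE hostname; the region label should be extracted.
    assertEquals("us-west-2",
        DefaultS3ClientFactory.getS3RegionFromEndpoint(
            "vpce-0123456789abcdef0-example.s3.us-west-2.vpce.amazonaws.com").id());
  }

  @Test
  public void testNonMatchingEndpointRejected() {
    // An internal host containing "vpce" must not be treated as a VPC endpoint.
    assertNull(DefaultS3ClientFactory.getS3RegionFromEndpoint(
        "internal.vpce.example.org"));
  }
}
```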








[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-26 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17811403#comment-17811403
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

hadoop-yetus commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-1912763510

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 34s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  1s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  41m 45s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 41s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 33s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 30s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 40s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 27s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 33s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  6s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  32m 32s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 27s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 32s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 19s | 
[/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/3/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  hadoop-tools/hadoop-aws: The patch generated 2 new + 2 unchanged - 0 fixed 
= 4 total (was 2)  |
   | +1 :green_heart: |  mvnsite  |   0m 31s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 14s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  3s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  32m  8s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 57s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 34s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 122m 31s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6466 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux fd8ff0a64fec 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 487921f916cd3ac039931de469e171a4cc6cffa5 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/3/testReport/ |
   | Max. process+thread count | 647 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/3/console |
   | versions | git=2.25.1 m

[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-26 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17811356#comment-17811356
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

shintaroonuma commented on code in PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#discussion_r1467937565


##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -361,6 +366,13 @@ private static URI getS3Endpoint(String endpoint, final 
Configuration conf) {
*/
   private static Region getS3RegionFromEndpoint(String endpoint) {
 
+// S3 VPC endpoint parsing
+Matcher matcher = VPC_ENDPOINT_PATTERN.matcher(endpoint);
+if(matcher.find()) {
+  LOG.debug("Endpoint {} is vpc endpoint; parsing", endpoint);

Review Comment:
   Updated the PR to match the entire endpoint, to avoid the confusion.








[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-25 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17811048#comment-17811048
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

steveloughran commented on code in PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#discussion_r1467048011


##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -361,6 +366,13 @@ private static URI getS3Endpoint(String endpoint, final 
Configuration conf) {
*/
   private static Region getS3RegionFromEndpoint(String endpoint) {
 
+// S3 VPC endpoint parsing
+Matcher matcher = VPC_ENDPOINT_PATTERN.matcher(endpoint);
+if(matcher.find()) {
+  LOG.debug("Endpoint {} is vpc endpoint; parsing", endpoint);
+  return Region.of(matcher.group(1));

Review Comment:
   add a debug log saying "mapping to vpce"



##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -361,6 +366,13 @@ private static URI getS3Endpoint(String endpoint, final 
Configuration conf) {
*/
   private static Region getS3RegionFromEndpoint(String endpoint) {
 
+// S3 VPC endpoint parsing
+Matcher matcher = VPC_ENDPOINT_PATTERN.matcher(endpoint);
+if(matcher.find()) {

Review Comment:
   nit, add a space after `if`



##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/DefaultS3ClientFactory.java:
##
@@ -361,6 +366,13 @@ private static URI getS3Endpoint(String endpoint, final 
Configuration conf) {
*/
   private static Region getS3RegionFromEndpoint(String endpoint) {
 
+// S3 VPC endpoint parsing
+Matcher matcher = VPC_ENDPOINT_PATTERN.matcher(endpoint);
+if(matcher.find()) {
+  LOG.debug("Endpoint {} is vpc endpoint; parsing", endpoint);

Review Comment:
   so this is going to match on anything with `.vpce.` in it, isn't it? I think it should include amazonaws.{com,com.cn} at the end, so that if someone ever sets up an internal host called vpce there's no confusion.








[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-19 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17808744#comment-17808744
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

hadoop-yetus commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-1900780612

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 30s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  41m 48s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 43s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  compile  |   0m 31s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   0m 30s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 39s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 27s |  |  trunk passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 32s |  |  trunk passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m  6s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  32m 13s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 31s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 32s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javac  |   0m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 26s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |   0m 26s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 21s | 
[/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/2/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  hadoop-tools/hadoop-aws: The patch generated 2 new + 2 unchanged - 0 fixed 
= 4 total (was 2)  |
   | +1 :green_heart: |  mvnsite  |   0m 33s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 15s |  |  the patch passed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04  |
   | +1 :green_heart: |  javadoc  |   0m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_392-8u392-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   1m 15s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  32m 42s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 59s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 122m 56s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.44 ServerAPI=1.44 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6466 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux b046b3667eb5 5.15.0-88-generic #98-Ubuntu SMP Mon Oct 2 
15:18:56 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / b32bb52913596a620774dc88af13671384e31767 |
   | Default Java | Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   | Multi-JDK versions | 
/usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04 
/usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_392-8u392-ga-1~20.04-b08 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/2/testReport/ |
   | Max. process+thread count | 555 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/2/console |
   | versions | git=2.25.1 m

[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-19 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17808704#comment-17808704
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

shintaroonuma commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-1900615534

   Thanks for the review! Added parsing of S3 VPC endpoints to help reduce the 
support calls. 







[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17808232#comment-17808232
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

hadoop-yetus commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-1898567392

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  13m 22s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | -1 :x: |  mvninstall  |   0m 21s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/branch-mvninstall-root.txt)
 |  root in trunk failed.  |
   | -1 :x: |  compile  |   5m 12s | 
[/branch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/branch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in trunk failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  compile  |   0m 22s | 
[/branch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/branch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  hadoop-aws in trunk failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | -0 :warning: |  checkstyle  |   0m 21s | 
[/buildtool-branch-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/buildtool-branch-checkstyle-hadoop-tools_hadoop-aws.txt)
 |  The patch fails to run checkstyle in hadoop-aws  |
   | -1 :x: |  mvnsite  |   0m 22s | 
[/branch-mvnsite-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/branch-mvnsite-hadoop-tools_hadoop-aws.txt)
 |  hadoop-aws in trunk failed.  |
   | -1 :x: |  javadoc  |   0m 22s | 
[/branch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/branch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in trunk failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  javadoc  |   0m 23s | 
[/branch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/branch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt)
 |  hadoop-aws in trunk failed with JDK Private 
Build-1.8.0_392-8u392-ga-1~20.04-b08.  |
   | -1 :x: |  spotbugs  |   0m 23s | 
[/branch-spotbugs-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/branch-spotbugs-hadoop-tools_hadoop-aws.txt)
 |  hadoop-aws in trunk failed.  |
   | +1 :green_heart: |  shadedclient  |   2m 39s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | -1 :x: |  mvninstall  |   0m 22s | 
[/patch-mvninstall-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/patch-mvninstall-hadoop-tools_hadoop-aws.txt)
 |  hadoop-aws in the patch failed.  |
   | -1 :x: |  compile  |   0m 22s | 
[/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  javac  |   0m 22s | 
[/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6466/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.txt)
 |  hadoop-aws in the patch failed with JDK 
Ubuntu-11.0.21+9-post-Ubuntu-0ubuntu120.04.  |
   | -1 :x: |  compile  |   0m 22s | 
[/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_392-8u392-ga-1~20.04-b08.txt](ht

[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17808224#comment-17808224
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

steveloughran commented on code in PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#discussion_r1457491066


##
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Constants.java:
##
@@ -242,6 +242,11 @@ private Constants() {
*/
   public static final String CENTRAL_ENDPOINT = "s3.amazonaws.com";
 
+  /**
+   * The vpc endpoint :{@value}.
+   */
+  public static final String VPC_ENDPOINT = "vpce.amazonaws.com";

Review Comment:
   1. move to InternalConstants
   2. is there a .cn equivalent?








[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2024-01-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17808205#comment-17808205
 ] 

ASF GitHub Bot commented on HADOOP-18938:
-

shintaroonuma opened a new pull request, #6466:
URL: https://github.com/apache/hadoop/pull/6466

   
   
   ### Description of PR
   JIRA: https://issues.apache.org/jira/browse/HADOOP-18938
   Skip region parsing of the endpoint when a VPC endpoint is configured. The previous endpoint parsing behaviour would parse VPC endpoints incorrectly, setting the region to "vpce".
   
   ### How was this patch tested?
   Tested with a bucket in eu-west-1 with `mvn -Dparallel-tests 
-DtestsThreadCount=16 clean verify`
   
   Failures observed:
   ```
   ITestS3ACommitterFactory.testEverything (testInvalidFileBinding())
   ITestS3AConfiguration.testRequestTimeout (Expected: 
https://issues.apache.org/jira/browse/HADOOP-19022)
   ```
   
   ### For code changes:
   
   - [x] Does the title or this PR starts with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [x] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   







[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2023-10-26 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17779995#comment-17779995
 ] 

Steve Loughran commented on HADOOP-18938:
-

this is why ITestS3AEndpointRegion.testWithVPCE fails:

{code}

ERROR] testWithVPCE(org.apache.hadoop.fs.s3a.ITestS3AEndpointRegion)  Time 
elapsed: 2.887 s  <<< FAILURE!
org.junit.ComparisonFailure: [Incorrect region set] expected:<"[us-west-2]"> 
but was:<"[vpce]">
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

{code}






[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2023-10-17 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17776243#comment-17776243
 ] 

Steve Loughran commented on HADOOP-18938:
-

fail fast may not be the right strategy, as the region could be set in AWS_REGION or the system property. So leave it to the SDK.
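
For context, a small sketch (not S3A code) of the SDK v2 region lookup that would still apply when no region comes from the configuration:

{code}
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.regions.providers.DefaultAwsRegionProviderChain;

public class SdkRegionResolutionSketch {
  public static void main(String[] args) {
    // The default chain consults the aws.region system property / AWS_REGION
    // environment variable, then profile files and EC2 instance metadata;
    // it throws SdkClientException if none of these yields a region.
    Region region = new DefaultAwsRegionProviderChain().getRegion();
    System.out.println("SDK-resolved region: " + region);
  }
}
{code}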




[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints

2023-10-16 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17775842#comment-17775842
 ] 

Steve Loughran commented on HADOOP-18938:
-

followup to HADOOP-18908




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org