[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=793623&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793623 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 21/Jul/22 09:16
Start Date: 21/Jul/22 09:16
Worklog Time Spent: 10m

Work Description: steveloughran merged PR #4572: URL: https://github.com/apache/hadoop/pull/4572

Issue Time Tracking
---
Worklog Id: (was: 793623)
Time Spent: 3h 50m (was: 3h 40m)

> S3AFileSystem removes Path when calling createS3Client
> --
>
> Key: HADOOP-18330
> URL: https://issues.apache.org/jira/browse/HADOOP-18330
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs/s3
> Affects Versions: 3.3.0, 3.3.1, 3.3.2, 3.3.3
> Reporter: Ashutosh Pant
> Assignee: Ashutosh Pant
> Priority: Minor
> Labels: pull-request-available
> Time Spent: 3h 50m
> Remaining Estimate: 0h
>
> When using Hadoop and Spark to read/write data from an S3 bucket such as
> s3a://bucket/path with a custom credentials provider, the path is removed
> from the s3a URI and the credentials provider fails because the full path
> is gone.
> In Spark 3.2, the client was created as:
> s3 = ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
>     .createS3Client(name, bucket, credentials);
> But in Spark 3.3.3 it is created as:
> s3 = ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
>     .createS3Client(getUri(), parameters);
> and getUri() strips the path from the s3a URI.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
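The bug report above hinges on the S3A filesystem's canonical URI keeping only the bucket authority, so a credentials provider that parses the object path out of the URI it is handed sees an empty path. A minimal standalone sketch (illustrative names, not Hadoop code) of that effect:

```java
import java.net.URI;

/**
 * Demonstrates why a path-sensitive credentials provider breaks when it is
 * handed a bucket-only URI instead of the full s3a path. The helper below
 * mimics the behaviour described in the issue: scheme + bucket kept, path
 * dropped. This is a sketch, not S3AFileSystem code.
 */
public class UriPathLoss {

    /** Build a canonical bucket URI: scheme and host only, path discarded. */
    static URI canonicalBucketUri(URI fullPath) {
        return URI.create(fullPath.getScheme() + "://" + fullPath.getHost());
    }

    public static void main(String[] args) {
        URI full = URI.create("s3a://bucket/path/to/table");
        URI handedToFactory = canonicalBucketUri(full);

        // The full URI still carries the object path...
        System.out.println(full.getPath());            // prints "/path/to/table"
        // ...but the canonical URI does not, so a provider that scopes
        // credentials by path has nothing to work with.
        System.out.println(handedToFactory.getPath()); // prints "" (empty)
    }
}
```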
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=793385&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793385 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 20/Jul/22 18:35
Start Date: 20/Jul/22 18:35
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4572: URL: https://github.com/apache/hadoop/pull/4572#issuecomment-1190622231

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|::|--:|:|::|:---:|
| +0 :ok: | reexec | 0m 51s | | Docker mode activated. |
|| _ Prechecks _ ||
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|| _ trunk Compile Tests _ ||
| +1 :green_heart: | mvninstall | 42m 5s | | trunk passed |
| +1 :green_heart: | compile | 1m 3s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 50s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 45s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 57s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 41s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 44s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 31s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 22s | | branch has no errors when building and testing our client artifacts. |
|| _ Patch Compile Tests _ ||
| +1 :green_heart: | mvninstall | 0m 39s | | the patch passed |
| +1 :green_heart: | compile | 0m 43s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 0m 43s | | the patch passed |
| +1 :green_heart: | compile | 0m 37s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 37s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 27s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 41s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 23s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 32s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 13s | | the patch passed |
| +1 :green_heart: | shadedclient | 20m 26s | | patch has no errors when building and testing our client artifacts. |
|| _ Other Tests _ ||
| +1 :green_heart: | unit | 2m 44s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 52s | | The patch does not generate ASF License warnings. |
| | | | 102m 46s | | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4572 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 32b996b7804d 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 1de79b6766c59f26890a2968c83a7e0ad85cf9e4 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/3/testReport/ |
| Max. process+thread count | 719 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/3/console |
| versions | git=2.25.1
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=793330&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793330 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 20/Jul/22 16:14
Start Date: 20/Jul/22 16:14
Worklog Time Spent: 10m

Work Description: steveloughran commented on code in PR #4572: URL: https://github.com/apache/hadoop/pull/4572#discussion_r925806836

##########
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ClientFactory.java:
##########

@@ -264,5 +270,26 @@ public S3ClientCreationParameters withHeader(
     public Map<String, String> getHeaders() {
       return headers;
     }
+
+    /**
+     * Get the full s3 path.
+     * added in HADOOP-18330
+     * @return path URI
+     */
+    public URI getPath() {

Review Comment: can you rename the getter/setter to pathUri for consistency?

Issue Time Tracking
---
Worklog Id: (was: 793330)
Time Spent: 3.5h (was: 3h 20m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=792897&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-792897 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 19/Jul/22 20:17
Start Date: 19/Jul/22 20:17
Worklog Time Spent: 10m

Work Description: ashutoshpant commented on PR #4572: URL: https://github.com/apache/hadoop/pull/4572#issuecomment-1189511230

> ok, which s3 endpoint did you run the tests against?

Just ran the tests against us-east-1

Issue Time Tracking
---
Worklog Id: (was: 792897)
Time Spent: 3h 20m (was: 3h 10m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=792889&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-792889 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 19/Jul/22 19:51
Start Date: 19/Jul/22 19:51
Worklog Time Spent: 10m

Work Description: steveloughran commented on PR #4572: URL: https://github.com/apache/hadoop/pull/4572#issuecomment-1189489423

ok, which s3 endpoint did you run the tests against?

Issue Time Tracking
---
Worklog Id: (was: 792889)
Time Spent: 3h 10m (was: 3h)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=792756&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-792756 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 19/Jul/22 15:22
Start Date: 19/Jul/22 15:22
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4572: URL: https://github.com/apache/hadoop/pull/4572#issuecomment-1189191525

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|::|--:|:|::|:---:|
| +0 :ok: | reexec | 0m 42s | | Docker mode activated. |
|| _ Prechecks _ ||
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|| _ trunk Compile Tests _ ||
| +1 :green_heart: | mvninstall | 38m 31s | | trunk passed |
| +1 :green_heart: | compile | 1m 2s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 54s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 52s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 2s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 49s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 51s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 33s | | trunk passed |
| +1 :green_heart: | shadedclient | 21m 2s | | branch has no errors when building and testing our client artifacts. |
|| _ Patch Compile Tests _ ||
| +1 :green_heart: | mvninstall | 0m 39s | | the patch passed |
| +1 :green_heart: | compile | 0m 42s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 0m 42s | | the patch passed |
| +1 :green_heart: | compile | 0m 37s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 37s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 27s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 41s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 25s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 32s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 13s | | the patch passed |
| +1 :green_heart: | shadedclient | 20m 27s | | patch has no errors when building and testing our client artifacts. |
|| _ Other Tests _ ||
| +1 :green_heart: | unit | 2m 40s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 51s | | The patch does not generate ASF License warnings. |
| | | | 98m 0s | | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4572 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux df2b3c78addb 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 19ee4f51782c0cd33de1ec1715613ecd5744ea25 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/2/testReport/ |
| Max. process+thread count | 732 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/2/console |
| versions | git=2.25.1
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=792542&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-792542 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 19/Jul/22 09:32
Start Date: 19/Jul/22 09:32
Worklog Time Spent: 10m

Work Description: steveloughran commented on PR #4572: URL: https://github.com/apache/hadoop/pull/4572#issuecomment-1188824862

Get yetus to stop failing the build, then run the integration tests against an S3 bucket, and then I will review.

Issue Time Tracking
---
Worklog Id: (was: 792542)
Time Spent: 2h 50m (was: 2h 40m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=791702&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-791702 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 16/Jul/22 22:29
Start Date: 16/Jul/22 22:29
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4572: URL: https://github.com/apache/hadoop/pull/4572#issuecomment-1186311338

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|::|--:|:|::|:---:|
| +0 :ok: | reexec | 0m 44s | | Docker mode activated. |
|| _ Prechecks _ ||
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|| _ trunk Compile Tests _ ||
| +1 :green_heart: | mvninstall | 37m 58s | | trunk passed |
| +1 :green_heart: | compile | 1m 0s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 55s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 53s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 1s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 49s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 51s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 36s | | trunk passed |
| +1 :green_heart: | shadedclient | 21m 5s | | branch has no errors when building and testing our client artifacts. |
|| _ Patch Compile Tests _ ||
| -1 :x: | mvninstall | 0m 25s | [/patch-mvninstall-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/1/artifact/out/patch-mvninstall-hadoop-tools_hadoop-aws.txt) | hadoop-aws in the patch failed. |
| -1 :x: | compile | 0m 24s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-aws in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | javac | 0m 24s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-aws in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | compile | 0m 24s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| -1 :x: | javac | 0m 24s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 0m 28s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/1/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) | hadoop-tools/hadoop-aws: The patch generated 1 new + 5 unchanged - 0 fixed = 6 total (was 5) |
| -1 :x: | mvnsite | 0m 26s | [/patch-mvnsite-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4572/1/artifact/out/patch-mvnsite-hadoop-tools_hadoop-aws.txt) | hadoop-aws in the patch failed. |
| +1 :green_heart: | javadoc | 0m 24s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| -1 :x: | javadoc | 0m 25s | [/patch-javadoc-hadoop-tools_hado
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=791690&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-791690 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 16/Jul/22 20:56
Start Date: 16/Jul/22 20:56
Worklog Time Spent: 10m

Work Description: ashutoshpant commented on PR #4557: URL: https://github.com/apache/hadoop/pull/4557#issuecomment-1186290248

> aah, I'd merged this and only then noticed this was against 3.3.3. reverted.
>
> Can you create a PR with the final patch applied to trunk? and test it (just tell us the endpoint, no need for the other details). then we can merge there and back in to branch-3.3
>
> the 3.3.3 branch is frozen; a fork was made earlier for a critical integration/cve release, which this doesn't qualify for...target the next release after that

Oops!! sorry! Just created a new PR: https://github.com/apache/hadoop/pull/4572

Issue Time Tracking
---
Worklog Id: (was: 791690)
Time Spent: 2.5h (was: 2h 20m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=791689&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-791689 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 16/Jul/22 20:54
Start Date: 16/Jul/22 20:54
Worklog Time Spent: 10m

Work Description: ashutoshpant opened a new pull request, #4572: URL: https://github.com/apache/hadoop/pull/4572

### Description of PR
Added a new parameter to S3ClientCreationParameters that returns the full s3a path.

### How was this patch tested?
Tested against the following endpoint: s3.us-east-1.amazonaws.com

### For code changes:
- [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?

Issue Time Tracking
---
Worklog Id: (was: 791689)
Time Spent: 2h 20m (was: 2h 10m)
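The PR described above adds the full s3a path to the client-creation parameters object. A hypothetical sketch of that builder pattern (this is not the real S3ClientCreationParameters class; names follow the reviewer's suggested pathUri convention and are illustrative only):

```java
import java.net.URI;

/**
 * Illustrative stand-in for a client-creation parameters object that
 * carries the full s3a path alongside the other settings, using the
 * fluent with/get accessor convention. Not Hadoop code.
 */
class ClientCreationParams {
    private URI pathUri;

    /** Set the full s3a path URI; returns this for call chaining. */
    public ClientCreationParams withPathUri(URI uri) {
        this.pathUri = uri;
        return this;
    }

    /** Get the full s3a path URI, e.g. s3a://bucket/path. */
    public URI getPathUri() {
        return pathUri;
    }
}

public class PathUriNaming {
    public static void main(String[] args) {
        // A client factory (or a custom credentials provider it calls)
        // can now read the full path from the parameters object even
        // though the filesystem URI itself is bucket-only.
        ClientCreationParams params = new ClientCreationParams()
            .withPathUri(URI.create("s3a://bucket/path"));
        System.out.println(params.getPathUri());
    }
}
```

With this shape, the bucket-only URI passed to createS3Client no longer matters to path-sensitive providers, since the full path travels in the parameters.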
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=791669&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-791669 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 16/Jul/22 16:56
Start Date: 16/Jul/22 16:56
Worklog Time Spent: 10m

Work Description: steveloughran commented on PR #4557: URL: https://github.com/apache/hadoop/pull/4557#issuecomment-1186239135

aah, I'd merged this and only then noticed this was against 3.3.3. Reverted.

Can you create a PR with the final patch applied to trunk? And test it (just tell us the endpoint, no need for the other details). Then we can merge there and back into branch-3.3.

The 3.3.3 branch is frozen; a fork was made earlier for a critical integration/CVE release, which this doesn't qualify for... target the next release after that.

Issue Time Tracking
---
Worklog Id: (was: 791669)
Time Spent: 2h 10m (was: 2h)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=791668&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-791668 ]

ASF GitHub Bot logged work on HADOOP-18330:
---
Author: ASF GitHub Bot
Created on: 16/Jul/22 16:52
Start Date: 16/Jul/22 16:52
Worklog Time Spent: 10m

Work Description: steveloughran merged PR #4557: URL: https://github.com/apache/hadoop/pull/4557

Issue Time Tracking
---
Worklog Id: (was: 791668)
Time Spent: 2h (was: 1h 50m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=791555&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-791555 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 15/Jul/22 19:02
Start Date: 15/Jul/22 19:02
Worklog Time Spent: 10m
Work Description: ashutoshpant commented on PR #4557: URL: https://github.com/apache/hadoop/pull/4557#issuecomment-1185824626

> changes look great; no more suggestions.
>
> now, testing diligence.
>
> Which aws region did you run the hadoop-aws integration tests against, what were the cli settings?
>
> this isn't just to put homework on you, the more people testing with different configs, the more likely we are to find failures before they ship.
>
> +1 pending those aws-test results

AWS Region: us-east-1
IT test command: mvn -Dparallel-tests -DtestsThreadCount=8 clean verify
Other test: mvn clean test
Settings used:
```xml
<property>
  <name>test.fs.s3a.name</name>
  <value>s3a://test-hadoop1/</value>
</property>
<property>
  <name>fs.contract.test.fs.s3a</name>
  <value>${test.fs.s3a.name}</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <description>AWS access key ID. Omit for IAM role-based authentication.</description>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <description>AWS secret key. Omit for IAM role-based authentication.</description>
</property>
<property>
  <name>fs.s3a.endpoint</name>
  <value>s3.us-east-1.amazonaws.com</value>
</property>
<property>
  <name>test.fs.s3a.encryption.enabled</name>
  <value>false</value>
</property>
<property>
  <name>test.fs.s3a.sts.enabled</name>
  <value>false</value>
</property>
```

Issue Time Tracking --- Worklog Id: (was: 791555) Time Spent: 1h 50m (was: 1h 40m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=790547&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-790547 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 13/Jul/22 18:27
Start Date: 13/Jul/22 18:27
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on PR #4557: URL: https://github.com/apache/hadoop/pull/4557#issuecomment-1183544043

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 11m 21s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|||| _ branch-3.3.3 Compile Tests _ |
| +1 :green_heart: | mvninstall | 35m 23s | | branch-3.3.3 passed |
| +1 :green_heart: | compile | 0m 56s | | branch-3.3.3 passed |
| +1 :green_heart: | checkstyle | 0m 52s | | branch-3.3.3 passed |
| +1 :green_heart: | mvnsite | 1m 3s | | branch-3.3.3 passed |
| +1 :green_heart: | javadoc | 0m 56s | | branch-3.3.3 passed |
| +1 :green_heart: | spotbugs | 1m 46s | | branch-3.3.3 passed |
| +1 :green_heart: | shadedclient | 25m 14s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 45s | | the patch passed |
| +1 :green_heart: | compile | 0m 38s | | the patch passed |
| +1 :green_heart: | javac | 0m 38s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 28s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 43s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 34s | | the patch passed |
| +1 :green_heart: | spotbugs | 1m 19s | | the patch passed |
| +1 :green_heart: | shadedclient | 24m 5s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 34s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 51s | | The patch does not generate ASF License warnings. |
| | | 109m 32s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4557 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux a8ef679543bd 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-3.3.3 / cc79f799b1297c688fab959a6bc68de0f8350499 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/2/testReport/ |
| Max. process+thread count | 676 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/2/console |
| versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

Issue Time Tracking --- Worklog Id: (was: 790547) Time Spent: 1h 40m (was: 1.5h)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=790546&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-790546 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 13/Jul/22 18:27
Start Date: 13/Jul/22 18:27
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on PR #4557: URL: https://github.com/apache/hadoop/pull/4557#issuecomment-1183543689

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 42s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|||| _ branch-3.3.3 Compile Tests _ |
| +1 :green_heart: | mvninstall | 38m 27s | | branch-3.3.3 passed |
| +1 :green_heart: | compile | 0m 56s | | branch-3.3.3 passed |
| +1 :green_heart: | checkstyle | 0m 51s | | branch-3.3.3 passed |
| +1 :green_heart: | mvnsite | 1m 0s | | branch-3.3.3 passed |
| +1 :green_heart: | javadoc | 0m 53s | | branch-3.3.3 passed |
| +1 :green_heart: | spotbugs | 1m 50s | | branch-3.3.3 passed |
| +1 :green_heart: | shadedclient | 27m 20s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 47s | | the patch passed |
| +1 :green_heart: | compile | 0m 40s | | the patch passed |
| +1 :green_heart: | javac | 0m 40s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 30s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 48s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 31s | | the patch passed |
| +1 :green_heart: | spotbugs | 1m 26s | | the patch passed |
| +1 :green_heart: | shadedclient | 26m 55s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 40s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 50s | | The patch does not generate ASF License warnings. |
| | | 107m 2s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4557 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 2b9752724c11 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-3.3.3 / cc79f799b1297c688fab959a6bc68de0f8350499 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/3/testReport/ |
| Max. process+thread count | 733 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/3/console |
| versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

Issue Time Tracking --- Worklog Id: (was: 790546) Time Spent: 1.5h (was: 1h 20m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=790493&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-790493 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 13/Jul/22 16:16
Start Date: 13/Jul/22 16:16
Worklog Time Spent: 10m
Work Description: steveloughran commented on code in PR #4557: URL: https://github.com/apache/hadoop/pull/4557#discussion_r920269270

## hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ClientFactory.java:
## @@ -264,5 +269,20 @@ public S3ClientCreationParameters withHeader(
    public Map getHeaders() { return headers; }
    +
    +public URI getPath() {

Review Comment: can you add javadoc, also say "added in HADOOP-18330" so anyone writing a client will be aware it is not always present.

Issue Time Tracking --- Worklog Id: (was: 790493) Time Spent: 1h 20m (was: 1h 10m)
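The review above concerns the new `getPath()` accessor; the underlying bug is that the URI handed to `createS3Client` is the canonicalized `scheme://bucket` form, so a per-path credentials provider never sees the object path. A minimal, self-contained sketch of that stripping behaviour, using only `java.net.URI` (the `toFsUri` helper here is a hypothetical stand-in for the canonicalization, not the actual Hadoop internals):

```java
import java.net.URI;

public class UriPathDemo {
    // Hypothetical stand-in for S3AFileSystem's canonicalization of the
    // filesystem URI: keep only the scheme and bucket (authority), drop the path.
    static URI toFsUri(URI fullPath) {
        return URI.create(fullPath.getScheme() + "://" + fullPath.getAuthority());
    }

    public static void main(String[] args) {
        URI full = URI.create("s3a://bucket/path/to/data");
        URI fsUri = toFsUri(full);
        // A credentials provider constructed from fsUri can no longer
        // see "/path/to/data".
        System.out.println(full.getPath());  // /path/to/data
        System.out.println(fsUri);           // s3a://bucket
        System.out.println(fsUri.getPath()); // empty string: the path is gone
    }
}
```

This is why the fix threads the full path through the client-creation parameters instead of relying on the filesystem URI argument.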
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=790218&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-790218 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 12/Jul/22 23:37
Start Date: 12/Jul/22 23:37
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on PR #4557: URL: https://github.com/apache/hadoop/pull/4557#issuecomment-1182599104

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 8m 19s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ branch-3.3.3 Compile Tests _ |
| +1 :green_heart: | mvninstall | 35m 19s | | branch-3.3.3 passed |
| +1 :green_heart: | compile | 0m 57s | | branch-3.3.3 passed |
| +1 :green_heart: | checkstyle | 0m 53s | | branch-3.3.3 passed |
| +1 :green_heart: | mvnsite | 1m 3s | | branch-3.3.3 passed |
| +1 :green_heart: | javadoc | 0m 56s | | branch-3.3.3 passed |
| +1 :green_heart: | spotbugs | 1m 46s | | branch-3.3.3 passed |
| +1 :green_heart: | shadedclient | 25m 4s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 46s | | the patch passed |
| +1 :green_heart: | compile | 0m 38s | | the patch passed |
| +1 :green_heart: | javac | 0m 38s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 28s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 45s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 33s | | the patch passed |
| +1 :green_heart: | spotbugs | 1m 20s | | the patch passed |
| +1 :green_heart: | shadedclient | 24m 12s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 42s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 52s | | The patch does not generate ASF License warnings. |
| | | 106m 59s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4557 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 8143d2669d56 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-3.3.3 / 74d5e3f853108277018667c79be60fa647e1156d |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/1/testReport/ |
| Max. process+thread count | 539 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4557/1/console |
| versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

Issue Time Tracking --- Worklog Id: (was: 790218) Time Spent: 1h 10m (was: 1h)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=790203&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-790203 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 12/Jul/22 21:49
Start Date: 12/Jul/22 21:49
Worklog Time Spent: 10m
Work Description: ashutoshpant commented on PR #4551: URL: https://github.com/apache/hadoop/pull/4551#issuecomment-1182532060

Closed due to error!! New PR out

Issue Time Tracking --- Worklog Id: (was: 790203) Time Spent: 50m (was: 40m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=790204&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-790204 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 12/Jul/22 21:49
Start Date: 12/Jul/22 21:49
Worklog Time Spent: 10m
Work Description: ashutoshpant closed pull request #4551: HADOOP-18330 URL: https://github.com/apache/hadoop/pull/4551
Issue Time Tracking --- Worklog Id: (was: 790204) Time Spent: 1h (was: 50m)
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=790202&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-790202 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 12/Jul/22 21:48
Start Date: 12/Jul/22 21:48
Worklog Time Spent: 10m
Work Description: ashutoshpant opened a new pull request, #4557: URL: https://github.com/apache/hadoop/pull/4557

First of all, sorry for the multiple PRs; I can't push from my device for security reasons and have to use https://github.dev/

### Description of PR
Added a new parameter object (pathUrl) that holds the full s3a path

### How was this patch tested?
- Ran all the tests successfully using `mvn clean compile package`.
- Used the jar from the above step to successfully read/write to an S3 bucket in us-east. Repeated this 3 times.

### For code changes:
- [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
Issue Time Tracking --- Worklog Id: (was: 790202) Time Spent: 40m (was: 0.5h)
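The fix described in the PR above threads the full s3a path through the client-creation parameter object rather than the canonicalized URI argument. A hedged sketch of that parameter-object pattern, using invented names (`ParamsDemo`, `ClientParams`, `withPathUri`) rather than the real `S3ClientCreationParameters` API:

```java
import java.net.URI;

public class ParamsDemo {
    // Illustrative parameter object in the style described in the PR:
    // fluent "with" setters accumulate optional settings, including the
    // full path URI that a custom credentials provider may want to inspect.
    // All names here are hypothetical, not Hadoop's actual API.
    static class ClientParams {
        private URI pathUri;

        ClientParams withPathUri(URI pathUri) {
            this.pathUri = pathUri;
            return this; // fluent style: calls can be chained
        }

        /** Full path URI; may be null when a caller does not set it. */
        URI getPathUri() {
            return pathUri;
        }
    }

    public static void main(String[] args) {
        ClientParams params = new ClientParams()
            .withPathUri(URI.create("s3a://bucket/path/to/data"));
        System.out.println(params.getPathUri().getPath()); // /path/to/data
    }
}
```

Because the setter returns `this`, new optional fields can be added without breaking existing `createS3Client(URI, parameters)` callers, which is why the review favoured extending the parameter object over changing the method signature again.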
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=790062&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-790062 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 12/Jul/22 14:57
Start Date: 12/Jul/22 14:57
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on PR #4551: URL: https://github.com/apache/hadoop/pull/4551#issuecomment-1181864553

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 38s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 37m 26s | | trunk passed |
| +1 :green_heart: | compile | 1m 1s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 54s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 53s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 1s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 48s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 51s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 33s | | trunk passed |
| +1 :green_heart: | shadedclient | 20m 54s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| -1 :x: | mvninstall | 0m 28s | [/patch-mvninstall-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/2/artifact/out/patch-mvninstall-hadoop-tools_hadoop-aws.txt) | hadoop-aws in the patch failed. |
| -1 :x: | compile | 0m 31s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/2/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-aws in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | javac | 0m 31s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/2/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-aws in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | compile | 0m 28s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/2/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| -1 :x: | javac | 0m 28s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/2/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 27s | | the patch passed |
| -1 :x: | mvnsite | 0m 29s | [/patch-mvnsite-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/2/artifact/out/patch-mvnsite-hadoop-tools_hadoop-aws.txt) | hadoop-aws in the patch failed. |
| +1 :green_heart: | javadoc | 0m 23s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 32s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| -1 :x: | spotbugs | 0m 29s | [/patch-spotbugs-hadoo
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=789796&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-789796 ]
ASF GitHub Bot logged work on HADOOP-18330:
Author: ASF GitHub Bot
Created on: 12/Jul/22 00:39
Start Date: 12/Jul/22 00:39
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on PR #4551: URL: https://github.com/apache/hadoop/pull/4551#issuecomment-1181186399

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 35s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 37m 38s | | trunk passed |
| +1 :green_heart: | compile | 0m 53s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 53s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 53s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 1s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 49s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 50s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 33s | | trunk passed |
| +1 :green_heart: | shadedclient | 21m 6s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| -1 :x: | mvninstall | 0m 40s | [/patch-mvninstall-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/1/artifact/out/patch-mvninstall-hadoop-tools_hadoop-aws.txt) | hadoop-aws in the patch failed. |
| -1 :x: | compile | 0m 31s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-aws in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | javac | 0m 31s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-aws in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | compile | 0m 27s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| -1 :x: | javac | 0m 27s | [/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/1/artifact/out/patch-compile-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-aws in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 0m 25s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/1/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) | hadoop-tools/hadoop-aws: The patch generated 2 new + 5 unchanged - 0 fixed = 7 total (was 5) |
| -1 :x: | mvnsite | 0m 27s | [/patch-mvnsite-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4551/1/artifact/out/patch-mvnsite-hadoop-tools_hadoop-aws.txt) | hadoop-aws in the patch failed. |
| +1 :green_heart: | javadoc | 0m 24s | | the patch passed with J
[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client
[ https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=789769&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-789769 ]

ASF GitHub Bot logged work on HADOOP-18330:
-------------------------------------------

Author: ASF GitHub Bot
Created on: 11/Jul/22 23:04
Start Date: 11/Jul/22 23:04
Worklog Time Spent: 10m
Work Description: ashutoshpant opened a new pull request, #4551:
URL: https://github.com/apache/hadoop/pull/4551

### Description of PR
Added the path to the S3 client creation parameters.

### How was this patch tested?
I only have an enterprise-restricted device, so I could not clone the repo to run the tests; the PR was prepared through the GitHub web editor.

### For code changes:
- [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?

Issue Time Tracking
-------------------
Worklog Id: (was: 789769)
Remaining Estimate: 0h
Time Spent: 10m

> S3AFileSystem removes Path when calling createS3Client
> ------------------------------------------------------
>
> Key: HADOOP-18330
> URL: https://issues.apache.org/jira/browse/HADOOP-18330
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs/s3
> Affects Versions: 3.3.0, 3.3.1, 3.3.2, 3.3.3
> Reporter: Ashutosh Pant
> Priority: Minor
> Labels: pull-request-available
> Time Spent: 10m
> Remaining Estimate: 0h
>
> When using Hadoop and Spark to read/write data from an S3 bucket such as s3a://bucket/path with a custom credentials provider, the path is stripped from the s3a URI, and the credentials provider fails because the full path is gone.
> In Spark 3.2, the client was created as:
>   s3 = ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
>       .createS3Client(name, bucket, credentials);
> but in Spark 3.3.3 it is created as:
>   s3 = ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
>       .createS3Client(getUri(), parameters);
> and getUri() removes the path from the s3a URI.
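The path loss described in the report can be illustrated with plain `java.net.URI`, independently of Hadoop. This is a standalone sketch, not Hadoop code: the class name and the `filesystemUri` helper are illustrative stand-ins for the way a canonical filesystem URI is built from scheme and bucket (authority) only, which is what discards the `/path` component that a path-sensitive credentials provider would need.

```java
import java.net.URI;

public class S3aUriPathLoss {

    // Illustrative: build a canonical filesystem URI from scheme + authority
    // only, the way a filesystem's getUri() reports "s3a://bucket" with no path.
    static URI filesystemUri(URI full) {
        return URI.create(full.getScheme() + "://" + full.getAuthority());
    }

    public static void main(String[] args) {
        URI full = URI.create("s3a://bucket/path/to/data");
        URI fsUri = filesystemUri(full);

        System.out.println(full.getPath());   // /path/to/data
        System.out.println(fsUri);            // s3a://bucket
        System.out.println(fsUri.getPath());  // empty string: the path is gone
    }
}
```

A credentials provider handed only `fsUri` therefore has no way to recover `/path/to/data`, which is why the patch adds the path to the client creation parameters instead of relying on the canonical URI.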