[ https://issues.apache.org/jira/browse/HADOOP-14081?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15871711#comment-15871711 ]
Rajesh Balamohan commented on HADOOP-14081:
-------------------------------------------

Thanks [~ste...@apache.org]. Here are the test results (region: S3 bucket in US east; tests were run from my laptop). The errors are due to socket timeouts (180 seconds). I checked ITestS3AContractGetFileStatus.teardown, which again failed due to a socket timeout.

{noformat}
Results :

Tests in error:
  ITestS3ContractOpen>AbstractFSContractTestBase.setup:193->AbstractFSContractTestBase.mkdirs:338 » SocketTimeout
  ITestS3AContractGetFileStatus.teardown:40->AbstractFSContractTestBase.teardown:204->AbstractFSContractTestBase.deleteTestDirInTeardown:213 »
  ITestS3AContractRootDir>AbstractContractRootDirectoryTest.testRmEmptyRootDirNonRecursive:116 » PathIO
  ITestS3NContractOpen>AbstractFSContractTestBase.setup:193->AbstractFSContractTestBase.mkdirs:338 » SocketTimeout

Tests run: 454, Failures: 0, Errors: 4, Skipped: 56
..
..
[INFO] Total time: 02:11 h
{noformat}

> S3A: Consider avoiding array copy in S3ABlockOutputStream (ByteArrayBlock)
> --------------------------------------------------------------------------
>
>                 Key: HADOOP-14081
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14081
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>            Reporter: Rajesh Balamohan
>            Assignee: Rajesh Balamohan
>            Priority: Minor
>         Attachments: HADOOP-14081.001.patch
>
> In {{S3ADataBlocks::ByteArrayBlock}}, data is copied whenever {{startUpload}}
> is called. It might be possible to directly access the byte[] array from
> ByteArrayOutputStream.
> Might have to extend ByteArrayOutputStream and create a method like
> getInputStream() which can return a ByteArrayInputStream. This would avoid
> an expensive array copy during large uploads.
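The approach proposed in the issue description could be sketched roughly as follows. This is a hypothetical illustration, not the attached patch: the class name {{DirectBufferOutputStream}} and the demo harness are invented for the example. It relies only on the fact that {{java.io.ByteArrayOutputStream}} exposes its internal buffer via the protected fields {{buf}} and {{count}}, so a subclass can wrap that buffer in a {{ByteArrayInputStream}} without the defensive copy that {{toByteArray()}} performs.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

/**
 * Hypothetical subclass that exposes the buffered data as an InputStream
 * without copying. toByteArray() would allocate and copy a new array;
 * here we wrap the internal buffer directly.
 */
class DirectBufferOutputStream extends ByteArrayOutputStream {
    DirectBufferOutputStream(int capacity) {
        super(capacity);
    }

    /**
     * Wrap the internal buffer directly; no array copy.
     * 'buf' and 'count' are protected fields of ByteArrayOutputStream.
     * Caller must not write to this stream while the InputStream is in use.
     */
    InputStream getInputStream() {
        return new ByteArrayInputStream(buf, 0, count);
    }
}

public class Demo {
    public static void main(String[] args) throws Exception {
        DirectBufferOutputStream out = new DirectBufferOutputStream(16);
        out.write("hello".getBytes(StandardCharsets.UTF_8));

        // Read back through the zero-copy view of the buffer.
        InputStream in = out.getInputStream();
        byte[] read = new byte[5];
        int n = in.read(read);
        System.out.println(n + " " + new String(read, StandardCharsets.UTF_8));
    }
}
```

The caveat in the comment above is the important design point: because the {{InputStream}} aliases the live buffer, the block must be effectively immutable once the upload starts, which matches how {{startUpload}} is used in {{S3ABlockOutputStream}}.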