[jira] [Updated] (COMPRESS-619) Large SevenZFile fails When Next Header Size is Greater than Max Int

2022-05-04 Thread Brian Miller (Jira)


 [ 
https://issues.apache.org/jira/browse/COMPRESS-619?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brian Miller updated COMPRESS-619:
--
Description: 
When reading a large file (42 GB), the following stack trace is produced:

 
{code:java}
java.io.IOException: Cannot handle nextHeaderSize 4102590414
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.assertFitsIntoNonNegativeInt(SevenZFile.java:2076) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.initializeArchive(SevenZFile.java:528) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.readHeaders(SevenZFile.java:474) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:343) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:136) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:376) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:364) ~[classes/:?] {code}
 

The file was produced using the SevenZOutputFile class and contains a large 
number of very small files all inserted using copy compression. It passes the 
7z tests and has the following statistics:

 
{code:java}
Files: 40872560
Size:       43708874326
Compressed: 47811464772
 {code}
It fails because the next-header size exceeds Integer.MAX_VALUE, so a ByteBuffer 
large enough to hold the header for the CRC check cannot be allocated.
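The arithmetic behind the failure can be shown in a short sketch (the size is the one from the stack trace above; the class name is made up for illustration):

```java
public class NextHeaderSizeLimit {
    public static void main(String[] args) {
        // The nextHeaderSize reported in the stack trace above.
        long nextHeaderSize = 4_102_590_414L;

        // ByteBuffer.allocate(int) takes an int, so a header larger than
        // Integer.MAX_VALUE cannot be read into a single buffer: the long
        // narrows to a negative value.
        System.out.println(nextHeaderSize > Integer.MAX_VALUE); // true
        System.out.println((int) nextHeaderSize);               // -192376882
    }
}
```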

  was: (the same description, except it said the file was produced using the 
SevenZFile class rather than SevenZOutputFile)


> Large SevenZFile fails When Next Header Size is Greater than Max Int
> 
>
> Key: COMPRESS-619
> URL: https://issues.apache.org/jira/browse/COMPRESS-619
> Project: Commons Compress
>  Issue Type: Bug
>  Components: Archivers
>Affects Versions: 1.21
>Reporter: Brian Miller
>Priority: Minor



--
This message was sent by Atlassian Jira
(v8.20.7#820007)


[jira] [Updated] (COMPRESS-619) Large SevenZFile fails When Next Header Size is Greater than Max Int

2022-05-04 Thread Brian Miller (Jira)


 [ 
https://issues.apache.org/jira/browse/COMPRESS-619?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brian Miller updated COMPRESS-619:
--
Description: (the same text as above, except the file is described as produced 
using the SevenZFile class)

  was: (the same text, with SevenZOutputFile)


[jira] [Comment Edited] (COMPRESS-619) Large SevenZFile fails When Next Header Size is Greater than Max Int

2022-04-26 Thread Brian Miller (Jira)


[ 
https://issues.apache.org/jira/browse/COMPRESS-619?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17528149#comment-17528149
 ] 

Brian Miller edited comment on COMPRESS-619 at 4/26/22 12:50 PM:
-

It seems that an approach like the one in the PR for COMPRESS-514 would allow 
the header to be read.


was (Author: JIRAUSER288490):
It seems like doing something that COMPRESS-514 is doing would allow the header 
to be read.
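A sketch of that direction, streaming the header through a small bounded buffer for the CRC check instead of allocating one ByteBuffer of the full header size (the class and method names below are hypothetical, not taken from the PR):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.SeekableByteChannel;
import java.util.zip.CRC32;

public final class HeaderCrc {

    /**
     * Computes the CRC32 of a header of arbitrary (long) size by reading
     * the channel in 64 KiB chunks, so no single buffer ever needs to be
     * larger than Integer.MAX_VALUE.
     */
    public static long crcOf(SeekableByteChannel channel, long offset, long size)
            throws IOException {
        CRC32 crc = new CRC32();
        ByteBuffer chunk = ByteBuffer.allocate(64 * 1024);
        channel.position(offset);
        long remaining = size;
        while (remaining > 0) {
            chunk.clear();
            // Never ask for more than is left in the header.
            chunk.limit((int) Math.min(chunk.capacity(), remaining));
            int read = channel.read(chunk);
            if (read < 0) {
                throw new IOException("Unexpected end of header");
            }
            crc.update(chunk.array(), 0, read);
            remaining -= read;
        }
        return crc.getValue();
    }
}
```

The same chunked loop could equally feed the header parser through an InputStream, which is roughly what reading the header as a stream rather than one in-memory buffer amounts to.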



[jira] [Commented] (COMPRESS-619) Large SevenZFile fails When Next Header Size is Greater than Max Int

2022-04-26 Thread Brian Miller (Jira)


[ 
https://issues.apache.org/jira/browse/COMPRESS-619?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17528149#comment-17528149
 ] 

Brian Miller commented on COMPRESS-619:
---

It seems that an approach like the one in COMPRESS-514 would allow the header 
to be read.



[jira] [Updated] (COMPRESS-619) Large SevenZFile fails When Next Header Size is Greater than Max Int

2022-04-22 Thread Brian Miller (Jira)


 [ 
https://issues.apache.org/jira/browse/COMPRESS-619?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brian Miller updated COMPRESS-619:
--
Summary: Large SevenZFile fails When Next Header Size is Greater than Max 
Int  (was: Large SevenZFile When Next Header Size is Greater than Max Int)



[jira] [Created] (COMPRESS-619) Large SevenZFile When Next Header Size is Greater than Max Int

2022-04-22 Thread Brian Miller (Jira)
Brian Miller created COMPRESS-619:
-

 Summary: Large SevenZFile When Next Header Size is Greater than 
Max Int
 Key: COMPRESS-619
 URL: https://issues.apache.org/jira/browse/COMPRESS-619
 Project: Commons Compress
  Issue Type: Bug
  Components: Archivers
Affects Versions: 1.21
Reporter: Brian Miller

