[ https://issues.apache.org/jira/browse/NIFI-10792?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17683937#comment-17683937 ]

Daniel Stieglitz commented on NIFI-10792:
-----------------------------------------

[~mayki] I just double-checked the Git history of that pom.xml file. To be 
precise, I believe 1.19.1 actually uses 5.22 while the latest 1.20.0-SNAPSHOT 
uses 5.23. Either way you should be okay per the bug report. 
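
For reference, the error quoted below names POI's IOUtils.setByteArrayMaxOverride() 
as the temporary workaround. A minimal sketch of how that override could be applied 
in code that calls POI directly; the class name and the 200 MB value are illustrative 
assumptions, not NiFi configuration:

{code:java}
import org.apache.poi.util.IOUtils;

public class PoiAllocationLimitWorkaround {
    public static void main(String[] args) {
        // By default POI rejects record allocations above roughly 100,000,000 bytes,
        // which is the limit hit in the error quoted below. Raising the override
        // allows larger records to be read; 200_000_000 here is an example value only.
        IOUtils.setByteArrayMaxOverride(200_000_000);

        // Any workbook parsing performed afterwards in the same JVM uses the higher limit.
    }
}
{code}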

> ConvertExcelToCSVProcessor : Failed to convert file over 10MB 
> --------------------------------------------------------------
>
>                 Key: NIFI-10792
>                 URL: https://issues.apache.org/jira/browse/NIFI-10792
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Core UI
>    Affects Versions: 1.17.0, 1.16.3, 1.18.0
>            Reporter: mayki
>            Priority: Critical
>              Labels: Excel, csv, processor
>             Fix For: 1.15.3
>
>
> Hello all,
> It seems all versions greater than 1.15.3 introduce a failure in the processor 
> *ConvertExcelToCSVProcessor* with this error:
> {code:java}
> Tried to allocate an array of length 101,695,141, but the maximum length for 
> this record type is 100,000,000. If the file is not corrupt or large, please 
> open an issue on bugzilla to request increasing the maximum allowable size 
> for this record type. As a temporary workaround, consider setting a higher 
> override value with IOUtils.setByteArrayMaxOverride() {code}
> I have tested with two different NiFi instances: version 1.15.3 ==> works OK.
> And since upgrading to 1.16, 1.17, 1.18 ==> the same processor *fails* with files 
> greater than 10MB.
> Could you help us correct this bug?



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
