[ 
https://issues.apache.org/jira/browse/CAMEL-20556?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17825919#comment-17825919
 ] 

Claus Ibsen commented on CAMEL-20556:
-------------------------------------

If you can, debug the code yourself and see what happens. Also check the code 
changes between 3.20.2 and 3.20.3, since you say that is where it started to 
fail for you.
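
A minimal standalone reproducer (just a sketch; the directory path, heap size 
and poll window are assumptions) that can be run under a small -Xmx on Unix to 
compare 3.20.2 and 3.20.3 could look like this:

{code:java}
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class FileConsumerOomRepro {
    public static void main(String[] args) throws Exception {
        // Run with e.g. -Xmx256m against a directory containing a file
        // several times bigger than the heap (path is an assumption).
        CamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                from("file:/data/in?noop=true")
                    .log("Received file: ${file:name}");
            }
        });
        context.start();
        Thread.sleep(60_000); // give the consumer time to poll and process
        context.stop();
    }
}
{code}

If this fails on 3.20.3 but not on 3.20.2 with the same file, it narrows the 
regression down to the camel-file changes between those two releases.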


> FileConsumer OutOfMemory for big files in Unix
> ----------------------------------------------
>
>                 Key: CAMEL-20556
>                 URL: https://issues.apache.org/jira/browse/CAMEL-20556
>             Project: Camel
>          Issue Type: Bug
>          Components: camel-file
>    Affects Versions: 3.20.9, 3.22.1
>            Reporter: Alexander Anpilov
>            Priority: Minor
>
> For clarity, I have tested and described the issue with two simple routes:
>  
> 1. *FileConsumer*
> {code:java}
> from("file:/path")
> .log("Received file: ${file:name}"); {code}
> 
> 2. *File pollEnrich*
> {code:java}
> from("direct:start")
> .setProperty("SOURCE_URI", simple("{{path_to_folder}}"))
> .pollEnrich().exchangeProperty("SOURCE_URI").timeout(60000)
> .log("Received file: ${file:name}"); {code}
> My process receives big files (more than 4x the Java Xmx) and runs as a 
> Spring Boot app on Kubernetes.
> After upgrading from camel-3.20.2 to camel-3.20.9, an OutOfMemory error 
> occurs on consume. I rolled back to camel-3.20.2 and checked the same files 
> again - everything is OK.
> It seems that FileConsumer loads the file content into memory instead of 
> using streams.
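> A diagnostic variant of the first route (just a sketch; the header name and 
> buffer size are arbitrary) that explicitly reads the body as an InputStream 
> may help narrow down whether the file is buffered by the consumer itself or 
> by a later body conversion:
> {code:java}
> from("file:/path")
>     .process(exchange -> {
>         // Ask Camel's type converter for a stream over the underlying file;
>         // this should not require the whole file to fit in memory.
>         try (java.io.InputStream in =
>                 exchange.getIn().getBody(java.io.InputStream.class)) {
>             byte[] buffer = new byte[8192];
>             long total = 0;
>             int read;
>             while ((read = in.read(buffer)) != -1) {
>                 total += read;
>             }
>             exchange.getIn().setHeader("bytesRead", total);
>         }
>     })
>     .log("Received file: ${file:name}, bytes read: ${header.bytesRead}"); {code}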
> The problem starts in camel-3.20.3 and is reproducible up to 3.20.9.
> *BUT:* I could not reproduce the problem on Windows with Camel Test, only in 
> the Unix prod environment. Maybe there are some camel-test effects...
> *UPD:* I have tested camel-3.22.1 and received OutOfMemory as well.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
