Hi
This is on the roadmap, with something called the resume strategy and Kamelets.
The resume strategy will then be implemented on components that can
resume from a specific point, e.g. a file/FTP consumer resuming from N
bytes into the file.
A Kamelet can then handle all of this for the Kafka connector, to use
as a source.
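To illustrate the kind of resumption such a strategy would enable, here is a generic sketch in plain `java.io` (not the actual Camel resume API; the class and method names are illustrative only): a consumer that has already committed N bytes of a file seeks past them and continues from there.

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class ResumeRead {
    // Read the remainder of a file starting at a previously stored byte offset.
    static String readFrom(Path file, long offset) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(file.toFile(), "r")) {
            raf.seek(offset);  // skip the bytes already consumed in a previous run
            byte[] rest = new byte[(int) (raf.length() - offset)];
            raf.readFully(rest);
            return new String(rest, StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("resume", ".txt");
        Files.writeString(tmp, "already-processed|new-data");
        // Pretend a previous run committed offset 18 (just past the '|').
        System.out.println(readFrom(tmp, 18));  // prints "new-data"
        Files.delete(tmp);
    }
}
```

The point is only that the component needs a durable offset (here, the hard-coded 18) to seek to; the real strategy would persist and look up that offset itself.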
Hello Raymond
Many thanks for your quick response.
1) How big are the files? Is there a specific threshold beyond which it
no longer works?
Files are between 250MB and 500MB
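For context, failures at these sizes are consistent with Kafka's default record size cap, which is roughly 1 MB. The cap can be raised, but hundreds of MB per record is far beyond what brokers are typically tuned for. The relevant settings (shown with illustrative values, not recommendations) are:

```properties
# Illustrative values only -- raising these toward 500 MB is generally discouraged.
# Broker-wide cap on record (batch) size:
message.max.bytes=10485760
# Per-topic override of the same cap:
max.message.bytes=10485760
# Producer-side request cap (must be >= the records being sent):
max.request.size=10485760
```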
2) What kind of error do you get, and where do you get it (on the Camel side
or the Kafka side)? Is it memory related?
I am
Hi Sergio,
Can you tell a bit more about your use case?
1) How big are the files? Is there a specific threshold beyond which it
no longer works?
2) What kind of error do you get, and where do you get it (on the Camel side
or the Kafka side)? Is it memory related?
3) What kind of files are you using (csv,
Hello
We are trying to read files using the following connector
https://github.com/apache/camel-kafka-connector-examples/tree/main/sftp/sftp-source
The connector sends the whole file to Kafka in a single record, which fails
with big files.
Is it possible to override the apply method to convert
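One thing worth noting about the `apply` route: a Kafka Connect transform (`org.apache.kafka.connect.transforms.Transformation`) maps one input record to one output record, so a plain SMT cannot fan one file out into many records; the split usually has to happen upstream (for example in a Camel `split()` step). The chunking itself is straightforward. A hedged, standalone sketch (class and method names are illustrative, not part of the connector's API):

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Chunker {
    // Split a large payload into fixed-size chunks, each small enough to
    // fit under the broker's record size limit; each chunk would then be
    // sent to Kafka as its own record.
    static List<byte[]> chunk(byte[] payload, int maxChunkBytes) {
        List<byte[]> chunks = new ArrayList<>();
        for (int start = 0; start < payload.length; start += maxChunkBytes) {
            int end = Math.min(start + maxChunkBytes, payload.length);
            chunks.add(Arrays.copyOfRange(payload, start, end));
        }
        return chunks;
    }

    public static void main(String[] args) {
        byte[] file = "0123456789".getBytes(StandardCharsets.UTF_8);
        List<byte[]> parts = chunk(file, 4);
        System.out.println(parts.size());  // 3 chunks: 4 + 4 + 2 bytes
    }
}
```

For line-oriented formats such as CSV, splitting on record boundaries (lines) rather than fixed byte counts would keep each Kafka record independently parseable.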