Hi Devs,
I have a requirement to read files from a directory whose subdirectories
contain a number of files. I am using the Camel File component for this.
I don't want to reprocess a file that has already been processed, so I am
using an idempotentConsumer backed by a Redis cache.
In the cache I store each file's checksum to identify duplicate files. The
problem is that after more than 1000 files have been read, the same files
(which were already read) are being read again and again, which I don't want.
Below is the snippet of the Camel route:
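The 1000-file threshold makes me wonder whether the file consumer's in-memory
idempotent cache (which I understand defaults to 1000 entries) is evicting old
keys, so older files look "new" again. A minimal sketch of that eviction
behavior, using a plain bounded LinkedHashMap (not Camel's actual class) just
to illustrate the symptom:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EvictionDemo {
    // Bounded insertion-order cache: once it holds cacheSize entries,
    // each new key evicts the oldest one.
    static Map<String, Boolean> boundedCache(int cacheSize) {
        return new LinkedHashMap<String, Boolean>(16, 0.75f, false) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, Boolean> eldest) {
                return size() > cacheSize;
            }
        };
    }

    public static void main(String[] args) {
        Map<String, Boolean> seen = boundedCache(1000); // assumed default size
        for (int i = 0; i < 1500; i++) {
            seen.put("file-" + i, Boolean.TRUE);
        }
        // The first 500 keys were evicted, so those files would be
        // treated as unseen and read again.
        System.out.println(seen.containsKey("file-0"));    // false
        System.out.println(seen.containsKey("file-1499")); // true
    }
}
```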
from("file://" + sourceLoc + "/?recursive=true&noop=true"
        + "&idempotent=true&idempotentKey=${file:name}-${file:modified}"
        + "&readLockRemoveOnCommit=false")
    .log("the read file is ${file:name}-${file:modified}")
    .filter().method(SourceMinAgeFilter.class, "accept")
    .process("checksumprocessor")
    .idempotentConsumer(header("fileCheckSum"),
            RedisIdempotentRepository.redisIdempotentRepository(
                    stringredisTemplate, redisStoragename))
        .eager(true)
        .skipDuplicate(false)
    .choice()
        .when(exchangeProperty(Exchange.DUPLICATE_MESSAGE).isEqualTo(true))
            .log("file ${file:onlyname} is a duplicate and hence skipped.")
        .otherwise()
            .process("fileprocessor")
            .log("processed file name from Source Folder ${file:onlyname}")
            .log("Total time to process a file : from Source Folder "
                    + (System.currentTimeMillis() - starttime) + " ms")
    .end();
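One option I am considering is pointing the file endpoint itself at the same
Redis-backed repository via the idempotentRepository option, so the
consumer-level duplicate check persists instead of using the default in-memory
cache. A sketch of the endpoint URI, assuming the repository is bound in the
Camel registry under the name "redisRepo" (a name I made up here):

```java
public class EndpointSketch {
    // Hypothetical helper: builds the file endpoint URI with the Redis-backed
    // repository (registered as "redisRepo") plugged into the consumer-level
    // idempotent check via idempotentRepository=#redisRepo.
    static String buildUri(String sourceLoc) {
        return "file://" + sourceLoc + "/?recursive=true&noop=true"
                + "&idempotent=true"
                + "&idempotentKey=${file:name}-${file:modified}"
                + "&idempotentRepository=#redisRepo" // persistent, unlike the default cache
                + "&readLockRemoveOnCommit=false";
    }

    public static void main(String[] args) {
        // "/data/in" is just a placeholder for the real source directory
        System.out.println(buildUri("/data/in"));
    }
}
```

Would that be the right way to make the consumer remember files across more
than 1000 entries, or is there a better option?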
Could you please let me know what needs to be done so that the same file is
not read again and again? With the Redis cache a duplicate is not reprocessed,
since its checksum is already stored, but I don't want an already-processed
file to be read from disk again at all.
Thanks and Regards
Deepak Anand