Thanks, Claus, for your response.
The same file can be present in different subdirectories, so I need a
checksum to identify duplicates.
How can I generate the checksum in the file endpoint and add it to the
idempotentRepository?
Maybe a silly question, but I am new to Camel. Earlier I was generating
the checksum of the file in a processor, adding it to a header, and
passing that to the idempotent consumer.
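
For illustration, this is roughly the checksum logic I had in that
processor. My guess (unconfirmed) is that a bean like this could instead
be referenced from the endpoint's idempotentKey through the simple
language bean function, e.g. &idempotentKey=${bean:checksumBean.checksum};
the bean name checksumBean is just a placeholder of mine:

    import java.io.File;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.security.MessageDigest;
    import org.springframework.stereotype.Component;

    // Computes an MD5 hex digest of a file's contents so identical files
    // in different subdirectories produce the same idempotent key
    @Component("checksumBean")
    public class ChecksumBean {

        public String checksum(File file) throws Exception {
            MessageDigest md = MessageDigest.getInstance("MD5");
            try (InputStream in = Files.newInputStream(file.toPath())) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    md.update(buf, 0, n);
                }
            }
            // Hex-encode the digest so it can serve as the idempotent key
            StringBuilder hex = new StringBuilder();
            for (byte b : md.digest()) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }
    }

(I am assuming Camel's parameter binding converts the GenericFile body to
a java.io.File here; I have not verified that this works at idempotentKey
evaluation time.)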

Thanks and Regards
Deepak Anand


On Tue, Jul 9, 2019 at 6:08 PM Claus Ibsen <claus.ib...@gmail.com> wrote:

> Set idempotentRepository on the file endpoint instead and remove the
> idempotent consumer from the route.
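>
> Something along these lines, as a rough sketch (the bean name #redisRepo
> is just an example; also, if I recall correctly, the default in-memory
> repository the file endpoint uses caps out around 1000 entries, which
> would explain why you see re-reads after roughly 1000 files):
>
>     import org.apache.camel.component.redis.processor.idempotent.RedisIdempotentRepository;
>     import org.springframework.context.annotation.Bean;
>     import org.springframework.context.annotation.Configuration;
>     import org.springframework.data.redis.core.StringRedisTemplate;
>
>     @Configuration
>     public class IdempotencyConfig {
>
>         // Same store name you already pass to the repository in your route
>         private static final String REDIS_STORAGE_NAME = "redisStoragename";
>
>         // Registered under the name "redisRepo" so the endpoint URI can
>         // reference it with #redisRepo
>         @Bean("redisRepo")
>         public RedisIdempotentRepository redisRepo(StringRedisTemplate stringredisTemplate) {
>             return RedisIdempotentRepository.redisIdempotentRepository(
>                     stringredisTemplate, REDIS_STORAGE_NAME);
>         }
>     }
>
> The consumer then skips already-seen files itself, so they are never
> read again:
>
>     from("file://" + sourceLoc + "/?recursive=true&noop=true"
>             + "&idempotent=true"
>             + "&idempotentRepository=#redisRepo"
>             + "&idempotentKey=${file:name}-${file:modified}")
>         .process("fileprocessor");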
>
> On Tue, Jul 9, 2019 at 7:01 PM Deepak Anand <deepakanand2...@gmail.com>
> wrote:
> >
> > Hi Devs,
> >
> > I have a requirement to read files from a directory that has a number
> > of files in its subdirectories. I am using the Camel file component for
> > it. I don't want to process a file that has been processed earlier, so
> > I am using an idempotentConsumer backed by a Redis cache. In the cache
> > I am adding a checksum to identify duplicate files. The problem is that
> > after reading more than 1000 files, the same files (which were read
> > earlier) are being read again and again, which I don't want.
> > Below is a snippet of the Camel route:
> >
> > from("file://" + sourceLoc + "/?recursive=true&noop=true"
> >
> +"&idempotent=true&idempotentKey=${file:name}-${file:modified}&readLockRemoveOnCommit=false")
> >                         .log("the read file is
> > ${file:name}-${file:modified} ")
> >                 .filter()
> >                 .method(SourceMinAgeFilter.class, "accept")
> >                 .process("checksumprocessor")
> >                 .idempotentConsumer(header("fileCheckSum"),
> >
> > RedisIdempotentRepository.redisIdempotentRepository(stringredisTemplate,
> > redisStoragename))
> >                 .eager(true)
> >                 .skipDuplicate(false)
> >                 .choice()
> >
> > .when(exchangeProperty(Exchange.DUPLICATE_MESSAGE).isEqualTo(true))
> >                 .log("file  ${file:onlyname} is a duplicate and hence
> > skipped.")
> >                 .otherwise()
> >                 .process("fileprocessor")
> >                 .log("processed file name from Source Folder" +
> > ("${file:onlyname}"))
> >                 .log("Total time to process a file : from Source Folder
> " +
> > (System.currentTimeMillis() - starttime) + " ms").end();
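> >
> > (As I understand it, skipDuplicate(false) lets duplicate exchanges
> > continue through the route with the Exchange.DUPLICATE_MESSAGE property
> > set to true, which is what the choice() above tests; by that point,
> > though, the file has already been read and checksummed.)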
> >
> > Could you please let me know what needs to be done so that the same
> > file is not read again and again? Because I am using the Redis cache, a
> > file with a matching checksum is not processed twice, but I don't want
> > a file that has already been processed to be read at all.
> >
> > Thanks and Regards
> > Deepak Anand
>
>
>
> --
> Claus Ibsen
> -----------------
> http://davsclaus.com @davsclaus
> Camel in Action 2: https://www.manning.com/ibsen2
>