luigiberrettini commented on pull request #3743:
URL: https://github.com/apache/kafka/pull/3743#issuecomment-698379057


   I saw that the 
[Sender](https://github.com/apache/kafka/blob/2.6.0/clients/src/main/java/org/apache/kafka/clients/producer/internals/Sender.java#L602)
 checks for an `Errors.DUPLICATE_SEQUENCE_NUMBER` error, but I was not able to find 
where this error is raised on the server side.
   
   It seems to me that duplicate detection relies on checking whether the sequence 
number exceeds `lastPersistedSeq + 1`.
   If that is the case:
    - why store the metadata for the last batches instead of relying only on the 
sequence number of the last message persisted in the log?
    - why limit `max.in.flight.requests.per.connection` to a maximum value 
of 5 if duplicates are still detected when no cached metadata is found (and therefore 
with any number of in-flight requests)?
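
   To make the question concrete, here is a minimal sketch of how I understand the broker-side check: the broker caches metadata for the last few appended batches per producer, flags an exact match against that cache as a duplicate, and rejects any batch whose first sequence does not continue `lastPersistedSeq`. All names and the structure here are my assumptions for illustration, not the actual Kafka implementation.

   ```java
   import java.util.ArrayDeque;
   import java.util.Deque;

   // Hypothetical sketch of broker-side sequence validation; names and
   // structure are assumptions, not Kafka's actual ProducerStateManager code.
   public class SequenceCheckSketch {
       enum Outcome { ACCEPTED, DUPLICATE, OUT_OF_ORDER }

       // Assumed cache depth, mirroring the in-flight limit of 5.
       static final int CACHED_BATCHES = 5;

       // Metadata retained for a recently appended batch.
       record BatchMeta(int firstSeq, int lastSeq) {}

       private final Deque<BatchMeta> recentBatches = new ArrayDeque<>();
       private int lastPersistedSeq = -1;

       Outcome append(int firstSeq, int lastSeq) {
           // A retried batch that was already persisted matches cached metadata exactly.
           for (BatchMeta b : recentBatches) {
               if (b.firstSeq() == firstSeq && b.lastSeq() == lastSeq) {
                   return Outcome.DUPLICATE;
               }
           }
           // Otherwise the batch must continue the sequence with no gap.
           if (firstSeq != lastPersistedSeq + 1) {
               return Outcome.OUT_OF_ORDER;
           }
           recentBatches.addLast(new BatchMeta(firstSeq, lastSeq));
           if (recentBatches.size() > CACHED_BATCHES) {
               recentBatches.removeFirst(); // evict the oldest cached batch
           }
           lastPersistedSeq = lastSeq;
           return Outcome.ACCEPTED;
       }

       public static void main(String[] args) {
           SequenceCheckSketch state = new SequenceCheckSketch();
           System.out.println(state.append(0, 9));   // ACCEPTED
           System.out.println(state.append(10, 19)); // ACCEPTED
           System.out.println(state.append(0, 9));   // DUPLICATE: retry of a cached batch
           System.out.println(state.append(30, 39)); // OUT_OF_ORDER: gap after seq 19
       }
   }
   ```

   Under this model, once a batch is evicted from the cache, a late retry of it would fail the `lastPersistedSeq + 1` check rather than being recognized as a duplicate, which is what my second question is about.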


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
