Can a high rate of producer creation/cleanup lead to memory leaks at the brokers?

2019-08-14 Thread Tianning Zhang
Dear all, I am using Amazon AWS Lambda functions to produce messages to a Kafka cluster. As I cannot control how frequently a Lambda function is initiated/invoked and I cannot share objects between invocations, I have to create a new Kafka producer for each invocation and clean it up after t
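A common workaround is to cache the producer at module level: Lambda freezes and reuses the execution environment between warm invocations, so module state survives and only cold starts pay the producer-creation cost. A minimal sketch of that caching pattern; `factory` is a hypothetical stand-in for a real producer constructor (e.g. kafka-python's `KafkaProducer(...)`), passed in so the pattern is testable without a broker:

```python
# Sketch: cache the producer at module level so warm Lambda invocations
# reuse it instead of creating and closing one per invocation.
# `factory` is a hypothetical stand-in for a real Kafka producer factory.

_producer = None

def get_producer(factory):
    """Create the producer on first use; reuse it on later invocations."""
    global _producer
    if _producer is None:
        _producer = factory()
    return _producer

def handler(event, context, factory):
    # One call per Lambda invocation. The producer outlives the call
    # because module-level state is retained across warm invocations.
    producer = get_producer(factory)
    # producer.send(...); producer.flush()  # flush before returning
    return producer
```

With this pattern the broker sees one long-lived connection per warm execution environment rather than one short-lived producer per invocation.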

RecordTooLargeException on 16M messages in Kafka?

2019-08-14 Thread l vic
My Kafka (1.0.0) producer errors out on large (16M) messages. ERROR Error when sending message to topic test with key: null, value: 16777239 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback) org.apache.kafka.common.errors.RecordTooLargeException: The message is
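The producer's default `max.request.size` is 1 MB, so a ~16 MB record is rejected client-side before it ever reaches the broker, and raising one setting alone is not enough: the broker, topic, replication, and consumer limits must all accommodate the larger record. A sketch of the settings involved; the value 20971520 (20 MB) is an illustrative choice safely above the 16777239-byte record from the error:

```properties
# Producer config: max size of a single produce request
max.request.size=20971520

# Broker config (server.properties): largest record batch the broker accepts
message.max.bytes=20971520
# Broker config: replication fetches must also fit the large record
replica.fetch.max.bytes=20971520

# Topic-level override (via kafka-configs): per-topic message size limit
max.message.bytes=20971520

# Consumer config: fetch buffer must be able to hold the record
max.partition.fetch.bytes=20971520
```

For messages this large it is also worth considering keeping the payload in external storage and sending only a reference through Kafka, since very large records increase broker memory pressure and replication latency.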