Thanks. I missed that part of the documentation. Appreciate your help. Regards.

On Mon, May 25, 2020 at 10:42 PM Jungtaek Lim <kabhwan.opensou...@gmail.com>
wrote:

> Hi,
>
> You need to add the prefix "kafka." to configurations that should be
> propagated to Kafka itself. Others are consumed by the Spark data source
> (the Kafka connector, in this case).
>
>
> https://spark.apache.org/docs/2.4.5/structured-streaming-kafka-integration.html#kafka-specific-configurations
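>
> A minimal sketch of the corrected writer, reusing the names from your
> snippet below (`df` and `config.outputBootstrapServer` are assumed from
> your code, not something Spark defines); note the "kafka." prefix on
> max.request.size:
>
> ```scala
> // Producer settings must carry the "kafka." prefix so Spark strips it
> // and forwards them to the underlying KafkaProducer; unprefixed keys
> // are interpreted by the Spark Kafka connector itself.
> df.writeStream
>   .format("kafka")
>   .option("kafka.bootstrap.servers", config.outputBootstrapServer)
>   .option("kafka.max.request.size", "10000000")
>   .start()
> ```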
>
> Hope this helps.
>
> Thanks,
> Jungtaek Lim (HeartSaVioR)
>
>
> On Tue, May 26, 2020 at 6:42 AM Something Something <
> mailinglist...@gmail.com> wrote:
>
>> I keep getting this error message:
>>
>>
>> "The message is 1169350 bytes when serialized which is larger than the
>> maximum request size you have configured with the max.request.size
>> configuration."
>>
>>
>>
>> As indicated in other posts, I am trying to set the “max.request.size”
>> configuration in the Producer as follows:
>>
>>
>> ---------------------
>>
>> .writeStream
>>
>> .format(*"kafka"*)
>>
>> .option(
>>
>>   *"kafka.bootstrap.servers"*,
>>
>>   conig.outputBootstrapServer
>>
>> )
>>
>> .option(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, *"10000000"*)
>>
>> ---------------------
>>
>>
>>
>> But this is not working. Am I setting this correctly? Is there a
>> different way to set this property under Spark Structured Streaming?
>>
>>
>> Please help. Thanks.
>>
