@Gwen
I am having a very similar issue: I am attempting to send a rather small
message and it is blowing up on me (my specific error is: Invalid receive
(size = 1347375956 larger than 104857600)). I tried to change the relevant
setting, but the broker thinks this particular request is about 1.3 GB (and
David's about 1.5 GB), and raising the limit that far just produces another
error saying there is not enough memory in the Java heap. Any insight here?
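
For reference, the limit in that error is the broker's
socket.request.max.bytes (default 104857600 bytes). A sketch of the knobs
involved, assuming a stock broker config -- exact defaults may differ by
Kafka version:

```properties
# server.properties -- broker-side request/message limits
socket.request.max.bytes=104857600   # the 100 MB cap from the error message
message.max.bytes=1000000            # per-message cap; raise together with
                                     # fetch.message.max.bytes on consumers

# kafka-server-start.sh -- broker heap. Raising socket.request.max.bytes
# toward ~1.3 GB cannot work under a 256 MB heap:
# export KAFKA_HEAP_OPTS="-Xmx256M -Xms128M"
```

Note the mismatch, though: the payloads here are tiny, so gigabyte-scale
sizes suggest the broker is misreading the request, not a limit that
genuinely needs raising.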

Specifically, I suspect the issue is indeed what Shayne said about encoding:
I am trying to use ApacheBench to send an HTTP POST request to a Kafka
broker, but it returns the error above. Do I have to format the data in
some way? That might be why I'm experiencing this issue.
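
A quick sanity check on that mysterious size (a minimal sketch; the value
comes straight from the error message above): the broker reads the first
four bytes of each request as a big-endian length field, and decoding
1347375956 that way yields the ASCII bytes of an HTTP method -- exactly
what a broker would see if ApacheBench connected to its port, since brokers
speak the Kafka binary protocol, not HTTP:

```python
import struct

# Kafka brokers read a 4-byte big-endian request-size prefix from each
# connection. An HTTP client's first bytes are the request line instead,
# so the broker interprets them as an absurdly large size.
size = 1347375956  # the "size" from the Invalid receive error
print(struct.pack(">i", size))  # -> b'POST'
```

If that is what is happening, the fix is to produce via a Kafka client
(or an HTTP proxy in front of Kafka) rather than posting raw HTTP to the
broker port.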


On Sun, Jul 12, 2015 at 6:35 AM, Shayne S <shaynest...@gmail.com> wrote:

> Your payload is so small that I suspect it's an encoding issue. Is your
> producer set to expect a byte array and you're passing a string? Or vice
> versa?
>
> On Sat, Jul 11, 2015 at 11:08 PM, David Montgomery <
> davidmontgom...@gmail.com> wrote:
>
> > I cant send this soooo simple payload using python.
> >
> > topic: topic-test-development
> > payload: {"utcdt": "2015-07-12T03:59:36", "ghznezzhmx": "apple"}
> >
> >
> > No handlers could be found for logger "kafka.conn"
> > Traceback (most recent call last):
> >   File "/home/ubuntu/workspace/feed-tests/tests/druid-adstar.py", line 81, in <module>
> >     test_send_data_to_realtimenode()
> >   File "/home/ubuntu/workspace/feed-tests/tests/druid-adstar.py", line 38, in test_send_data_to_realtimenode
> >     response = producer.send_messages(test_topic,test_payload)
> >   File "/usr/local/lib/python2.7/dist-packages/kafka/producer/simple.py", line 54, in send_messages
> >     topic, partition, *msg
> >   File "/usr/local/lib/python2.7/dist-packages/kafka/producer/base.py", line 349, in send_messages
> >     return self._send_messages(topic, partition, *msg)
> >   File "/usr/local/lib/python2.7/dist-packages/kafka/producer/base.py", line 390, in _send_messages
> >     fail_on_error=self.sync_fail_on_error
> >   File "/usr/local/lib/python2.7/dist-packages/kafka/client.py", line 480, in send_produce_request
> >     (not fail_on_error or not self._raise_on_response_error(resp))]
> >   File "/usr/local/lib/python2.7/dist-packages/kafka/client.py", line 247, in _raise_on_response_error
> >     raise resp
> > kafka.common.FailedPayloadsError
> >
> > Here is what is in my logs
> > [2015-07-12 03:29:58,103] INFO Closing socket connection to
> > /xxx.xxx.xxx.xxx due to invalid request: Request of length 1550939497 is
> > not valid, it is larger than the maximum size of 104857600 bytes.
> > (kafka.network.Processor)
> >
> >
> >
> > Server is 4 gigs of ram.
> >
> > I used export KAFKA_HEAP_OPTS=-Xmx256M -Xms128M in kafka-server-start.sh
> >
> > So.....why?
> >
>



-- 

Jiefu Gong
University of California, Berkeley | Class of 2017
B.A. Computer Science | College of Letters and Sciences

jg...@berkeley.edu | (925) 400-3427