@James: that was incredible. Thank you.
On Wed, Apr 26, 2017 at 9:53 PM, James Cheng wrote:
Ramya, Todd, Jiefu, David,
Sorry to drag up an ancient thread. I was looking for something in my email
archives, and ran across this, and I might have solved part of these mysteries.
I ran across this post that talked about seeing weirdly large allocations when incorrect requests are accidentally sent to a port, which would explain errors like:
Request of length 1550939497 is not valid, it is larger than the maximum size of 104857600 bytes
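A quick sanity check (an editor's illustration, not part of the original thread) supports this theory: reinterpret the impossible "size" as four big-endian bytes and it comes out as printable ASCII. In particular, 1347375956, the size in Ramya's error below, is exactly the start of an HTTP request line:

```python
import struct

# Reinterpret the bogus "request sizes" from this thread as big-endian bytes.
# 1347375956 is the ASCII for "POST": an HTTP request line read as a Kafka
# size prefix. 1550939497 decodes to b'\\qui', whose origin is less obvious.
for size in (1347375956, 1550939497):
    print(size, "->", struct.pack(">i", size))
# 1347375956 -> b'POST'
```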
Are you actually getting requests that are 1.3 GB in size, or is something
else happening, like someone trying to make HTTP requests against the Kafka
broker port?
-Todd
On Mon, Dec 12, 2016 at 4:19 AM, Ramya Ramamurthy <
ramyaramamur...@teledna.com> wrote:
We have got exactly the same problem.
Invalid receive (size = 1347375956 larger than 104857600).
When trying to increase the size, we get a Java OutOfMemoryError.
Did you find a workaround for this?
Thanks.
It could be a client error, but we're seeing it show up in Mirror Maker.
-Todd
On Tue, Jul 14, 2015 at 1:27 PM, JIEFU GONG wrote:
Got it, looks like I didn't understand the request process and am failing
to use AB properly. Thanks for the help everyone! I suspect you might be
running into a similar error, David.
On Tue, Jul 14, 2015 at 11:56 AM, Jay Kreps wrote:
This is almost certainly a client bug. Kafka's request format is size-delimited messages in the form
<4 byte size N><N bytes of message>
If the client sends a request with an invalid size, or sends a partial request, the server will see effectively random bytes from the next request as the size of the next message and report a nonsensical request size.
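Jay's framing description can be sketched in a few lines (an illustration, not Kafka's actual implementation): prefix each message with a 4-byte big-endian length, and see what happens when unframed bytes hit the same reader.

```python
import io
import struct

def frame(payload: bytes) -> bytes:
    """Prefix payload with a 4-byte big-endian size, as Kafka framing does."""
    return struct.pack(">i", len(payload)) + payload

def read_message(stream) -> bytes:
    """Read one size-delimited message: <4 byte size N><N bytes of message>."""
    (size,) = struct.unpack(">i", stream.read(4))
    return stream.read(size)

stream = io.BytesIO(frame(b"hello") + frame(b"world"))
assert read_message(stream) == b"hello"
assert read_message(stream) == b"world"

# If a client writes raw, unframed bytes instead (say, an HTTP request),
# the server interprets the first 4 bytes as the size:
bad = io.BytesIO(b"POST / HTTP/1.1\r\n")
(size,) = struct.unpack(">i", bad.read(4))
print(size)  # 1347375956 -- the very number in the errors above
```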
I am not familiar with Apache Bench. Can you share more details on
what you are doing?
On Tue, Jul 14, 2015 at 11:45 AM, JIEFU GONG wrote:
So I'm trying to make a request with a simple ASCII text file, but what's strange is that even if I change the file I send, or the contents of the file, I get the same error message, down to the specific number of bytes, which seems weird if I'm changing the content. Should I be using Avro wi…
This is interesting. We have seen something similar internally at LinkedIn with one particular topic (and Avro schema), and only once in a while. We've seen it happen 2 or 3 times so far. We had chalked it up to bad content in the message, figuring that the sender was doing something like sending a…
@Gwen
I am having a very, very similar issue where I am attempting to send a rather small message and it's blowing up on me (my specific error is: Invalid receive (size = 1347375956 larger than 104857600)). I tried to change the relevant settings, but it seems that this particular request is of 1340…
Your payload is so small that I suspect it's an encoding issue. Is your
producer set to expect a byte array and you're passing a string? Or vice
versa?
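To illustrate Gwen's point about bytes vs. strings (a generic sketch; David's actual producer code isn't shown in the thread): most Kafka client libraries expect the message value as raw bytes, so a JSON payload like the one below needs explicit serialization before sending.

```python
import json

# The payload from David's message later in the thread.
payload = {"utcdt": "2015-07-12T03:59:36", "ghznezzhmx": "apple"}

# A str is not what a bytes-oriented producer expects; encode explicitly.
value = json.dumps(payload).encode("utf-8")

assert isinstance(value, bytes)
print(len(value), "bytes ready to send")
```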
On Sat, Jul 11, 2015 at 11:08 PM, David Montgomery <
davidmontgom...@gmail.com> wrote:
Did you try setting message.max.bytes and replica.fetch.max.bytes to
values larger than the message you are trying to send?
From the error message, they should be at least 1550939497.
On Sat, Jul 11, 2015 at 10:14 PM, David Montgomery
wrote:
Hi
Below is my server.properties
I am not having an issue with consuming from my Kafka broker. I am
having an issue writing to my broker. One send bombs.
# limitations under the License.
# see kafka.server.KafkaConfig for additional details and defaults
# Serve
You need to configure the Kafka broker to allow you to send larger messages.
The relevant parameters are:
message.max.bytes (default: 1000000) – Maximum size of a message the
broker will accept. This has to be smaller than the consumer
fetch.message.max.bytes, or the broker will have messages that the
consumer cannot fetch, and the consumer will get stuck.
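Concretely, the settings Gwen describes might look like this in the broker and consumer configs (the 10 MB values are illustrative, not recommendations from the thread):

```properties
# broker: server.properties
# Largest message the broker will accept; must exceed your biggest message.
message.max.bytes=10485760
# Must be >= message.max.bytes so replicas can replicate large messages.
replica.fetch.max.bytes=10485760

# old-consumer config: must be >= message.max.bytes, or the consumer
# gets stuck on a message it cannot fetch.
fetch.message.max.bytes=10485760
```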
I can't send this simple payload using Python.
topic: topic-test-development
payload: {"utcdt": "2015-07-12T03:59:36", "ghznezzhmx": "apple"}
No handlers could be found for logger "kafka.conn"
Traceback (most recent call last):
File "/home/ubuntu/workspace/feed-tests/tests/druid-adstar.py