Re: Kafka Applicability - Large Messages

2016-03-14 Thread David Remy
Sent from mobile, please excuse typos.


 Original Message 
From: Ben Stopford 
Sent: Monday, March 14, 2016 08:19 AM
To: users@kafka.apache.org
Subject: Re: Kafka Applicability - Large Messages


Becket did a good talk at the last Kafka meetup on how LinkedIn handles the
large message problem.

http://www.slideshare.net/JiangjieQin/handle-large-messages-in-apache-kafka-58692297

> On 14 Mar 2016, at 09:42, Jens Rantil  wrote:
>
> Just making it more explicit: AFAIK, all Kafka consumers I've seen load the
> incoming messages into memory. Unless you make it possible to stream them to
> disk or something, you need to make sure your consumers have the available
> memory.
>
> Cheers,
> Jens
>
> On Fri, Mar 4, 2016 at 6:07 PM Cees de Groot <c...@pagerduty.com> wrote:
> 1GB sounds a tad steep; you may want to do some testing, as Kafka
> needs to be told that such large messages can arrive, and the broker will then
> pre-allocate buffers for them. Personally, I'd cap messages at a few megabytes;
> anything bigger can be dropped off in e.g. S3, and then you just queue a
> link for further processing.
>
> I'm not saying it's impossible; Kafka handles large messages better than
> most other tools out there, but you do want to do a test setup to make sure
> that it'll handle the sort of traffic you fling at it in any case.
>
> On Fri, Mar 4, 2016 at 4:26 AM, Mahesh Dharmasena <mahesh@gmail.com>
> wrote:
>
> > We have a client with several thousand stores which send and receive
> > messages to a main system that resides at the headquarters.
> >
> > A single store sends and receives around 50 to 100 messages per day.
> >
> > Average message size could be anywhere from 2KB to 1GB.
> >
> > Please let me know whether I can adopt Apache Kafka for this solution.
> >
> >
> > - Mahesh.
> >
>
>
>
> --
>
> *Cees de Groot*
> PRINCIPAL SOFTWARE ENGINEER
> pagerduty.com
> c...@pagerduty.com
> +1(416)435-4085
>
> --
> Henrik Hedvall
> Lead Designer
> henrik.hedv...@tink.se
> +46 72 505 57 59
>
> Tink AB
> Wallingatan 5
> 111 60 Stockholm, Sweden
> www.tink.se
>
>
>




Re: Kafka Applicability - Large Messages

2016-03-14 Thread Cees de Groot
On Mon, Mar 14, 2016 at 5:42 AM, Jens Rantil  wrote:

> Just making it more explicit: AFAIK, all Kafka consumers I've seen load
> the incoming messages into memory. Unless you make it possible to stream them
> to disk or something, you need to make sure your consumers have the available
> memory.
>
>
And to complete that picture - followers in a broker replica set are
technically also consumers in terms of memory behavior, etc.
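
Concretely, that means the broker's replica.fetch.max.bytes has to be at least
message.max.bytes, or follower replicas can't fetch the largest messages and
replication falls behind. A small sanity-check sketch; the property names are
real Kafka size settings, but the 10MB values are purely illustrative:

    // Hedged sketch: property names are real Kafka settings, values are assumptions.
    public class LargeMessageSettingsCheck {
        public static void main(String[] args) {
            long largestExpectedMessage = 10L * 1024 * 1024; // assumed 10 MB ceiling

            long messageMaxBytes      = 10L * 1024 * 1024; // broker:   message.max.bytes
            long replicaFetchMaxBytes = 10L * 1024 * 1024; // broker:   replica.fetch.max.bytes
            long producerMaxRequest   = 10L * 1024 * 1024; // producer: max.request.size
            long consumerMaxPartFetch = 10L * 1024 * 1024; // consumer: max.partition.fetch.bytes

            // The broker must accept the message, and every fetcher (follower
            // replicas included) must be able to pull it back out again.
            check(messageMaxBytes      >= largestExpectedMessage, "message.max.bytes");
            check(replicaFetchMaxBytes >= largestExpectedMessage, "replica.fetch.max.bytes");
            check(producerMaxRequest   >= largestExpectedMessage, "max.request.size");
            check(consumerMaxPartFetch >= largestExpectedMessage, "max.partition.fetch.bytes");
        }

        static void check(boolean ok, String setting) {
            if (!ok) throw new IllegalStateException(
                    setting + " is too small for the largest expected message");
        }
    }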


Re: Kafka Applicability - Large Messages

2016-03-14 Thread Ben Stopford
Becket did a good talk at the last Kafka meetup on how LinkedIn handles the
large message problem.

http://www.slideshare.net/JiangjieQin/handle-large-messages-in-apache-kafka-58692297
 


> On 14 Mar 2016, at 09:42, Jens Rantil  wrote:
> 
> Just making it more explicit: AFAIK, all Kafka consumers I've seen load the
> incoming messages into memory. Unless you make it possible to stream them to
> disk or something, you need to make sure your consumers have the available
> memory.
> 
> Cheers,
> Jens
> 
> On Fri, Mar 4, 2016 at 6:07 PM Cees de Groot wrote:
> 1GB sounds a tad steep; you may want to do some testing, as Kafka
> needs to be told that such large messages can arrive, and the broker will then
> pre-allocate buffers for them. Personally, I'd cap messages at a few megabytes;
> anything bigger can be dropped off in e.g. S3, and then you just queue a
> link for further processing.
> 
> I'm not saying it's impossible; Kafka handles large messages better than
> most other tools out there, but you do want to do a test setup to make sure
> that it'll handle the sort of traffic you fling at it in any case.
> 
> On Fri, Mar 4, 2016 at 4:26 AM, Mahesh Dharmasena
> wrote:
> 
> > We have a client with several thousand stores which send and receive
> > messages to a main system that resides at the headquarters.
> >
> > A single store sends and receives around 50 to 100 messages per day.
> >
> > Average message size could be anywhere from 2KB to 1GB.
> >
> > Please let me know whether I can adopt Apache Kafka for this solution.
> >
> >
> > - Mahesh.
> >
> 
> 
> 
> --
> 
> *Cees de Groot*
> PRINCIPAL SOFTWARE ENGINEER
> pagerduty.com
> c...@pagerduty.com
> +1(416)435-4085
>
> -- 
> Henrik Hedvall
> Lead Designer
> henrik.hedv...@tink.se 
> +46 72 505 57 59
> 
> Tink AB
> Wallingatan 5
> 111 60 Stockholm, Sweden 
> www.tink.se 
> 
> 
> 



Re: Kafka Applicability - Large Messages

2016-03-14 Thread Jens Rantil
Just making it more explicit: AFAIK, all Kafka consumers I've seen load
the incoming messages into memory. Unless you make it possible to stream them
to disk or something, you need to make sure your consumers have the available
memory.
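
As a rough illustration of what that means for the Java consumer:
max.partition.fetch.bytes has to be at least as large as the biggest message,
and worst-case memory is roughly that value times the number of partitions
assigned, so size the heap accordingly. The broker address, group, topic, and
the 10MB figure below are assumptions:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class LargeMessageConsumerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");    // assumption
            props.put("group.id", "store-messages");             // assumption
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.ByteArrayDeserializer");
            // Must be >= the largest message you expect; 10 MB here is illustrative.
            props.put("max.partition.fetch.bytes", Integer.toString(10 * 1024 * 1024));

            // Worst case the consumer can hold roughly
            // max.partition.fetch.bytes * (assigned partitions) in memory at once.
            try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("store-messages")); // assumed topic
                ConsumerRecords<String, byte[]> records = consumer.poll(1000);
                for (ConsumerRecord<String, byte[]> record : records) {
                    System.out.printf("offset=%d size=%d bytes%n",
                            record.offset(), record.value().length);
                }
            }
        }
    }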

Cheers,
Jens

On Fri, Mar 4, 2016 at 6:07 PM Cees de Groot  wrote:

> 1GB sounds a tad steep; you may want to do some testing, as Kafka
> needs to be told that such large messages can arrive, and the broker will then
> pre-allocate buffers for them. Personally, I'd cap messages at a few megabytes;
> anything bigger can be dropped off in e.g. S3, and then you just queue a
> link for further processing.
>
> I'm not saying it's impossible; Kafka handles large messages better than
> most other tools out there, but you do want to do a test setup to make sure
> that it'll handle the sort of traffic you fling at it in any case.
>
> On Fri, Mar 4, 2016 at 4:26 AM, Mahesh Dharmasena 
> wrote:
>
> > We have a client with several thousand stores which send and receive
> > messages to a main system that resides at the headquarters.
> >
> > A single store sends and receives around 50 to 100 messages per day.
> >
> > Average message size could be anywhere from 2KB to 1GB.
> >
> > Please let me know whether I can adopt Apache Kafka for this solution.
> >
> >
> > - Mahesh.
> >
>
>
>
> --
>
> *Cees de Groot*
> PRINCIPAL SOFTWARE ENGINEER
> pagerduty.com
> c...@pagerduty.com
> +1(416)435-4085
>
-- 
*Henrik Hedvall*
Lead Designer
henrik.hedv...@tink.se
+46 72 505 57 59

Tink AB
Wallingatan 5
111 60 Stockholm, Sweden
www.tink.se


Re: Kafka Applicability - Large Messages

2016-03-04 Thread Cees de Groot
1GB sounds a tad steep; you may want to do some testing, as Kafka
needs to be told that such large messages can arrive, and the broker will then
pre-allocate buffers for them. Personally, I'd cap messages at a few megabytes;
anything bigger can be dropped off in e.g. S3, and then you just queue a
link for further processing.
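
For reference, "telling Kafka" about large messages mostly comes down to a
handful of size settings that all have to cover the largest record:
message.max.bytes and replica.fetch.max.bytes on the broker (server.properties),
max.request.size on the producer, and max.partition.fetch.bytes on the consumer.
A minimal producer-side sketch, where the broker address, topic name, and the
10MB ceiling are illustrative assumptions:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class LargeMessageProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");  // assumption
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.ByteArraySerializer");
            // Producer-side ceiling for a single request; must cover the largest
            // message. The broker's message.max.bytes (and replica.fetch.max.bytes)
            // must be raised to match, otherwise the broker rejects the record.
            props.put("max.request.size", Integer.toString(10 * 1024 * 1024)); // 10 MB, illustrative
            // Give the producer enough buffer to hold such a record while sending.
            props.put("buffer.memory", Long.toString(64L * 1024 * 1024));      // illustrative

            byte[] payload = new byte[5 * 1024 * 1024]; // pretend 5 MB store message
            try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("store-messages", "store-42", payload)); // assumed topic/key
                producer.flush();
            }
        }
    }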

I'm not saying it's impossible; Kafka handles large messages better than
most other tools out there, but you do want to do a test setup to make sure
that it'll handle the sort of traffic you fling at it in any case.
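
And a sketch of the "drop it off in S3 and queue a link" pattern mentioned
above: park the payload in object storage and publish only a small reference
record for consumers to dereference. The uploadToObjectStore helper and the
bucket and topic names are placeholders, not a real storage API:

    import java.nio.charset.StandardCharsets;
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class ClaimCheckProducerSketch {

        // Placeholder: a real system would call its S3 (or other blob store)
        // client here and return the stored object's URI.
        static String uploadToObjectStore(String bucket, String key, byte[] payload) {
            return "s3://" + bucket + "/" + key; // pretend the upload happened
        }

        public static void main(String[] args) {
            byte[] bigPayload = new byte[100 * 1024 * 1024]; // pretend 100 MB file

            // 1. Park the large payload outside Kafka.
            String uri = uploadToObjectStore("store-uploads", "store-42/msg-001", bigPayload);

            // 2. Publish only a small reference message; consumers dereference it.
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumption
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.ByteArraySerializer");
            try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("store-message-links", "store-42",
                        uri.getBytes(StandardCharsets.UTF_8)));
            }
        }
    }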

On Fri, Mar 4, 2016 at 4:26 AM, Mahesh Dharmasena 
wrote:

> We have a client with several thousand stores which send and receive
> messages to a main system that resides at the headquarters.
>
> A single store sends and receives around 50 to 100 messages per day.
>
> Average message size could be anywhere from 2KB to 1GB.
>
> Please let me know whether I can adopt Apache Kafka for this solution.
>
>
> - Mahesh.
>



-- 

*Cees de Groot*
PRINCIPAL SOFTWARE ENGINEER
pagerduty.com
c...@pagerduty.com
+1(416)435-4085



Re: Kafka Applicability - Large Messages

2016-03-04 Thread Vinoth Chandar
I have used messages up to 20MB, and while not ideal, it works fairly well. But
if you are stepping into GBs of data, you may need to chunk them up and
reassemble.
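
A minimal sketch of that chunk-and-reassemble idea on the producer side; the
topic, message-ID scheme, and 512KB chunk size are assumptions, and a real
implementation also has to handle reassembly buffers, missing chunks, and
cleanup on the consumer side:

    import java.util.Arrays;
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class ChunkingProducerSketch {
        // 512 KB chunks stay safely under Kafka's default ~1 MB message limit.
        static final int CHUNK_SIZE = 512 * 1024;

        public static void main(String[] args) {
            byte[] payload = new byte[300 * 1024 * 1024]; // pretend 300 MB message
            String messageId = "store-42-msg-001";        // assumed unique ID

            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumption
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.ByteArraySerializer");

            int chunkCount = (payload.length + CHUNK_SIZE - 1) / CHUNK_SIZE;
            try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
                for (int i = 0; i < chunkCount; i++) {
                    int from = i * CHUNK_SIZE;
                    int to = Math.min(from + CHUNK_SIZE, payload.length);
                    // Encode "which message / which chunk / how many" in the key so a
                    // consumer can buffer chunks per messageId and reassemble once all
                    // chunkCount pieces arrive. Same key means same partition, so the
                    // chunks stay in order.
                    String key = messageId + "/" + i + "/" + chunkCount;
                    producer.send(new ProducerRecord<>("store-messages-chunks", key,
                            Arrays.copyOfRange(payload, from, to)));
                }
            }
        }
    }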


On Friday, March 4, 2016, Mahesh Dharmasena  wrote:

> We have a client with several thousand stores which send and receive
> messages to a main system that resides at the headquarters.
>
> A single store sends and receives around 50 to 100 messages per day.
>
> Average message size could be anywhere from 2KB to 1GB.
>
> Please let me know whether I can adopt Apache Kafka for this solution.
>
>
> - Mahesh.
>


Kafka Applicability - Large Messages

2016-03-04 Thread Mahesh Dharmasena
We have a client with several thousand stores which send and receive
messages to a main system that resides at the headquarters.

A single store sends and receives around 50 to 100 messages per day.

Average message size could be anywhere from 2KB to 1GB.

Please let me know whether I can adopt Apache Kafka for this solution.


- Mahesh.
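
For a rough sense of scale (taking 5,000 stores as a stand-in for "several
thousand"), the message rate is tiny; it is the 1GB upper bound that drives the
design, which is why the replies above point at capping sizes, chunking, or
storing a link. A back-of-envelope sketch:

    // Back-of-envelope only; 5,000 stores is an assumed stand-in for "several thousand".
    public class WorkloadEstimate {
        public static void main(String[] args) {
            long stores = 5_000;
            long messagesPerStorePerDay = 100;                      // upper end of 50-100
            long messagesPerDay = stores * messagesPerStorePerDay;  // 500,000
            double messagesPerSecond = messagesPerDay / 86_400.0;   // roughly 6 msg/s

            long worstCaseMessageBytes = 1L << 30;                  // 1 GB upper bound
            double worstCaseDailyBytes = (double) messagesPerDay * worstCaseMessageBytes;

            System.out.printf("~%.1f messages/s on average%n", messagesPerSecond);
            System.out.printf("worst case ~%.1f TB/day if every message hit 1 GB%n",
                    worstCaseDailyBytes / (1L << 40));
            // Takeaway: the message count is trivial for Kafka; the 1 GB tail is what
            // needs a claim-check or chunking strategy.
        }
    }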