Hi Stevo

Thanks for your comment.

On 16 May 2015 at 17:43, Stevo Slavić <ssla...@gmail.com> wrote:
> Nice, thanks for sharing!
>
> Is 30k msgs/sec publishing or push throughput? Will check, hopefully
> performance tests are included in sources.

30k msgs/sec is our regular traffic handled by Hermes on a 4-node
cluster: 30k messages per second published to the broker.
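
For context, publishing goes through the Hermes REST frontend, so a
producer is just an HTTP client. A minimal sketch in Java (the hostname,
topic name and payload below are made up, and the exact URL layout and
response details depend on your Hermes version and deployment):

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class HermesPublishSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical frontend host and topic name - adjust to your deployment.
            URL url = new URL("http://hermes-frontend.example.com/topics/group.user-events");
            byte[] body = "{\"eventType\":\"USER_CREATED\",\"userId\":42}"
                    .getBytes(StandardCharsets.UTF_8);

            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);
            try (OutputStream out = conn.getOutputStream()) {
                out.write(body);
            }

            // A 2xx status means the broker accepted the message; the exact
            // status codes and headers depend on the Hermes version.
            System.out.println("Response: " + conn.getResponseCode());
        }
    }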

>
> Does Hermes have same max number of topics limitations as Kafka or does it
> include a solution to have that aspect scalable as well?

I'm curious: how many topics/partitions become problematic for a Kafka
cluster? We have not encountered Kafka's limit on the maximum number of
topics in our pub/sub scenarios.

> On May 16, 2015 8:02 AM, "Marcin Kuthan" <marcin.kut...@gmail.com> wrote:
>
>> Hi Everyone
>>
>> Hermes is an asynchronous message broker built on top of Kafka. It
>> provides a reliable, fault-tolerant REST interface for message
>> publishing and an adaptive push mechanism for message delivery. Hermes
>> is used as a message broker for asynchronous communication between
>> microservices.
>>
>> Some of the main features:
>>
>> - Performance and scalability - Hermes in production handles up to
>> 30,000 msgs/sec with 99.9th percentile latency below 100 ms on a
>> 4-node cluster.
>>
>> - Reliability - Hermes is used for publishing sensitive data, such as
>> billing events, user account changes, etc. Hermes lets you define a
>> more reliable policy for these important events: require
>> acknowledgement from all Kafka replicas and increase request timeouts.
>>
>> - Push model - Receiving messages from Hermes is dead simple: you
>> just write one HTTP endpoint in your service. It’s up to Hermes to
>> create the Kafka consumer, redeliver messages, keep an eye on
>> throughput limits, etc.
>>
>> Feedback and comments are welcome. You can find the Hermes documentation at:
>> http://hermes-pubsub.readthedocs.org/en/latest/index.html
>>
>> Hermes is published under the Apache License:
>> https://github.com/allegro/hermes
>>
>> Best Regards,
>> Marcin
>>
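
To illustrate the push model from the announcement: a subscriber only
needs to expose a plain HTTP endpoint that Hermes can POST messages to.
A rough sketch using the JDK's built-in HTTP server (the port, path and
class name are made up; the exact request/response contract depends on
the Hermes version):

    import com.sun.net.httpserver.HttpServer;
    import java.io.InputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    public class HermesSubscriberSketch {
        public static void main(String[] args) throws Exception {
            // Endpoint a Hermes subscription could point at, e.g.
            // http://my-service:8080/events (URL and port are hypothetical).
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/events", exchange -> {
                try (InputStream in = exchange.getRequestBody()) {
                    String message = new String(in.readAllBytes(), StandardCharsets.UTF_8);
                    System.out.println("Received: " + message);
                }
                // Responding with 2xx tells Hermes the message was consumed;
                // other responses lead to redelivery by Hermes.
                exchange.sendResponseHeaders(200, -1);
                exchange.close();
            });
            server.start();
        }
    }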
