Not completely related to the topic of the question, but when I change the
configuration of one of my jobs, do I need to recompile it and push the new
tar.gz to HDFS, or is changing the deploy/samza config enough?
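For context, a hypothetical fragment of such a properties file (key names follow the standard Samza configuration; the job name, HDFS path, and values are made up):

```properties
# deploy/samza/config/my-job.properties (hypothetical example)
job.name=my-job

# Points at the job package on HDFS; this only needs to change (and the
# tar.gz be rebuilt and re-uploaded) when the job's code changes.
yarn.package.path=hdfs://namenode:8020/samza/my-job.tar.gz

# Tuning values like this are read when the job is submitted, so editing
# them and resubmitting should be enough -- no rebuild required.
task.commit.ms=60000
```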

Thanks


On Tue, Aug 12, 2014 at 11:23 PM, Telles Nobrega <[email protected]>
wrote:

> Hi, I'm running an experiment in which I'm supposed to run Samza with
> different input rates. I start at 420 messages/second and scale up to
> 33,200 messages/second.
>
> Can a single Kafka broker handle that many messages per second?
> Second, what is the best way to read that many messages into Samza? I have
> one job that receives these messages and another that reads the output of
> the first job and does some further processing. Is the best approach to use
> more containers and split the Kafka topics into partitions (the same number
> as containers), or is there a better way to do this?
>
> Thanks in advance,
>
> --
> ------------------------------------------
> Telles Mota Vidal Nobrega
> M.sc. Candidate at UFCG
> B.sc. in Computer Science at UFCG
> Software Engineer at OpenStack Project - HP/LSD-UFCG
>
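The partitioning idea in the quoted question can be sketched with a small simulation (a minimal sketch, assuming Kafka's default key-hash partitioning; Python's built-in `hash()` stands in for Kafka's actual hash function, and the message keys and partition count are made up):

```python
NUM_PARTITIONS = 8  # e.g. one partition per Samza container

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    # Kafka's default partitioner hashes the message key modulo the
    # partition count; hash() is only a stand-in for Kafka's hash here.
    return hash(key) % num_partitions

# Count how many of 33,200 messages land on each partition. Each container
# consumes a disjoint subset of partitions, so aggregate throughput scales
# with the number of partitions (and hence containers).
counts = {}
for i in range(33200):
    p = partition_for(f"msg-{i}")
    counts[p] = counts.get(p, 0) + 1
```

Every message is routed to exactly one of the `NUM_PARTITIONS` partitions, which is why matching the partition count to the container count spreads the load evenly.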



-- 
------------------------------------------
Telles Mota Vidal Nobrega
M.sc. Candidate at UFCG
B.sc. in Computer Science at UFCG
Software Engineer at OpenStack Project - HP/LSD-UFCG
