I have just proposed enabling Travis on a different thread. That should
help with this. (Having a separate machine would be best, but I don't know
how we could get one. I'll do the homework for this.)

On Oct 13, 2016 5:57 PM, "Lior Zeno" <liorz...@gmail.com> wrote:

> Maybe we should get an isolated environment? The CI environment might be
> shared among multiple users, which would add too much noise to the
> performance tests.
>
> > On Thu, Oct 13, 2016 at 6:53 PM, Balazs Donat Bessenyei <bes...@cloudera.com> wrote:
>
> > +1
> >
> > I think this is a good idea!
> >
> > How can I help with setting it up?
> >
> > On Oct 13, 2016 5:20 PM, "Lior Zeno" <liorz...@gmail.com> wrote:
> >
> > > Hi All,
> > >
> > > Monitoring Flume's performance over time is important for any
> > > production-level deployment. Benchmarking Flume on a nightly basis has
> > > the following advantages:
> > >
> > > * A better understanding of Flume's bottlenecks.
> > > * Allowing users to compare the performance of different solutions,
> > > such as Logstash and Fluentd.
> > > * A better understanding of the influence of recent commits on
> > > performance.
> > >
> > > Logstash already conducts various performance tests; more details are
> > > available here:
> > > http://logstash-benchmarks.elastic.co/
> > >
> > > I propose adding a few micro-benchmarks plotting Flume's TPS over time
> > > (of course, in the ideal case where neither the input nor the output
> > > bottlenecks the system), e.g. using the SeqGen source.
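> > >
> > > For illustration only, here is a minimal sketch of what such a
> > > benchmark agent's configuration could look like, assuming the seq
> > > source, a memory channel, and the null sink (the agent and component
> > > names below are made up):
> > >
> > >   # Hypothetical nightly-benchmark agent; all names are illustrative.
> > >   bench.sources = seqSrc
> > >   bench.channels = memCh
> > >   bench.sinks = nullSink
> > >
> > >   # Sequence generator source: produces events as fast as possible.
> > >   bench.sources.seqSrc.type = seq
> > >   bench.sources.seqSrc.channels = memCh
> > >
> > >   # In-memory channel, sized so it is unlikely to be the bottleneck.
> > >   bench.channels.memCh.type = memory
> > >   bench.channels.memCh.capacity = 100000
> > >   bench.channels.memCh.transactionCapacity = 1000
> > >
> > >   # Null sink: discards events so the output does not throttle the test.
> > >   bench.sinks.nullSink.type = null
> > >   bench.sinks.nullSink.channel = memCh
> > >
> > > TPS could then be derived from the number of events the sink processes
> > > over a fixed run length and plotted per nightly build.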
> > >
> > > Thoughts?
> > >
> > > Thanks
> > >
> >
>
