On Fri, Jun 10, 2016 at 4:19 PM, Boris Pavlovic wrote:

Morgan,

> When there were failures, the failures were both not looked at by the
> Rally team and were not performance-related at the time; it was Rally
> not able to be set up or run at all.

Prove this, or it's not true. I agree there were such situations (very
rarely, actually) and we were fixing them.
Lance,

I'm sharing just how it looked from my side.

I really support your idea (no matter what you pick for your tooling:
Rally, JMeter, etc.); it is very valuable, especially if it becomes a
voting job. This really should be done by someone.

Best regards,
Boris Pavlovic
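For context on what the Rally option above looks like in practice, a run against keystone is typically driven by a small task file. The sketch below is an illustrative example only: the scenario name `Authenticate.keystone` and the runner/context settings follow common rally-openstack sample conventions and are my assumption, not something specified in this thread.

```json
{
    "Authenticate.keystone": [
        {
            "runner": {
                "type": "constant",
                "times": 100,
                "concurrency": 10
            },
            "context": {
                "users": {
                    "tenants": 2,
                    "users_per_tenant": 2
                }
            }
        }
    ]
}
```

A task like this would be started with `rally task start task.json`, and the resulting report (latencies, failure counts) is the kind of output a voting job could compare across patches.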
On Fri, Jun 10, 2016 at 3:26 PM, Lance Bragstad wrote:
>
> 1. I care about performance. I just believe that a big hurdle has been
> finding infrastructure that allows us to run performance tests in a
> consistent manner. Dedicated infrastructure plays a big role in this,
> which is hard (if not impossible) to obtain in the gate…
Lance,

It is an amazing effort, and I wish you good luck with the Keystone
team. However, I faced some issues when I started a similar effort
about three years ago with Rally. Here are some points that will be
very useful for you:

1. I think that the Keystone team doesn't care about performance &…
Ok - here is what I have so far [0], and I admit there is still a bunch of
work to do [1]. I encourage folks to poke through the code and suggest
improvements via GitHub Issues. I've never really stood up third-party
testing before, so this is completely new to me and I'm open to feedback…
Excerpts from Brant Knudson's message of 2016-06-03 15:16:20 -0500:

> On Fri, Jun 3, 2016 at 2:35 PM, Lance Bragstad wrote:
>
> > Hey all,
> >
> > I have been curious about the impact of providing performance feedback
> > as part of the review process. From what I understand, keystone used to
> > have a…
From: Lance Bragstad [lbrags...@gmail.com]
Sent: Friday, June 03, 2016 1:57 PM
To: OpenStack Development Mailing List (not for usage questions)
Subject: Re: [openstack-dev] [keystone][all] Incorporating performance feedback
into the review process
Here is a list…
On Jun 3, 2016 13:16, "Brant Knudson" wrote:
>
> On Fri, Jun 3, 2016 at 2:35 PM, Lance Bragstad wrote:
>>
>> Hey all,
>>
>> I have been curious about the impact of providing performance feedback
>> as part of the review process. From what I understand, keystone used to
>> have a performance job that would…
Dedicated and isolated infrastructure is a must if we want consistent
performance numbers. If we can come up with a reasonable plan, I'd be
happy to ask for resources. Even with dedicated infrastructure we would
still have to keep in mind that it's a data point from a single provider
that hopefully…
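The point about consistent numbers generalizes: a single raw timing is noisy, so performance jobs usually report percentiles over many repeated samples so results stay comparable between runs and providers. A minimal, self-contained sketch of that idea in Python (the function name and workload are hypothetical, not from any tooling discussed in this thread):

```python
import statistics
import time

def latency_percentiles(fn, runs=50):
    """Call fn repeatedly and return p50/p90/p99 latency in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    # statistics.quantiles(n=100) yields 99 cut points: index 49 is the
    # median (p50), index 89 is p90, index 98 is p99.
    cuts = statistics.quantiles(samples, n=100)
    return {"p50": cuts[49], "p90": cuts[89], "p99": cuts[98]}

# Stand-in workload; a real job would time an API call, e.g. issuing a token.
print(latency_percentiles(lambda: sum(range(50_000))))
```

Tracking percentiles rather than a single average also keeps tail-latency regressions visible, which is usually what operators notice first.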
On Fri, Jun 3, 2016 at 2:35 PM, Lance Bragstad wrote:

> Hey all,
>
> I have been curious about the impact of providing performance feedback as
> part of the review process. From what I understand, keystone used to have
> a performance job that would run against proposed patches (I've only
> heard about…
On Fri, Jun 03, 2016 at 01:53:16PM -0600, Matt Fischer wrote:

> On Fri, Jun 3, 2016 at 1:35 PM, Lance Bragstad wrote:
>
> > Hey all,
> >
> > I have been curious about the impact of providing performance feedback
> > as part of the review process. From what I understand, keystone used to
> > have a performance…
On Fri, Jun 3, 2016 at 1:35 PM, Lance Bragstad wrote:

> Hey all,
>
> I have been curious about the impact of providing performance feedback as
> part of the review process. From what I understand, keystone used to have
> a performance job that would run against proposed patches (I've only
> heard about…
Hey all,

I have been curious about the impact of providing performance feedback as
part of the review process. From what I understand, keystone used to have
a performance job that would run against proposed patches (I've only heard
about it, so someone else will have to keep me honest about its timeframe…