Hi Daniel,

On Tue, Nov 3, 2020 at 9:42 AM Daniel Winkler <danielwink...@google.com> wrote:
>
> Hello Luiz,
>
> Thank you for the information. It is good to know that this tool is
> actively used and that there is a way to skip existing flaky tests.
> Just for clarification, is this a requirement to land the kernel
> changes, i.e. should I prioritize adding these tests immediately to
> move the process forward? Or can we land the changes based on the
> testing I have already done and I'll work on these tests in parallel?

We used to require updates to mgmt-tester, but it seems some of the
recent commands did not get tests. If we intend to have the CI test
kernel changes properly, I think we should start requiring at least
some basic testing. Obviously it will be hard to cover everything that
is affected by a new command, but the basic formatting, etc., we
should be able to test. Also, the tester supports the concept of 'Not
Run', which we can probably use for experimental commands.
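Just to illustrate what I mean with 'Not Run' (this is only a rough
sketch against the shared tester API, not the exact mgmt-tester
helpers, and experimental_feature_enabled() is a made-up placeholder):

/* Rough sketch only: real mgmt-tester tests drive an emulated
 * controller via the existing setup helpers, which are left out
 * here.  experimental_feature_enabled() is a placeholder. */

#include <stdbool.h>

#include "src/shared/tester.h"

static bool experimental_feature_enabled(void)
{
	/* Placeholder: a real test would check whether the
	 * experimental command is actually enabled on the kernel
	 * side before exercising it. */
	return false;
}

static void test_experimental_command(const void *test_data)
{
	if (!experimental_feature_enabled()) {
		/* Marks the test as 'Not Run' instead of failing it. */
		tester_test_abort();
		return;
	}

	/* Send the MGMT command and verify the response format here. */
	tester_test_passed();
}

int main(int argc, char *argv[])
{
	tester_init(&argc, &argv);

	tester_add("Experimental Command - Not Run sketch", NULL,
					NULL, test_experimental_command, NULL);

	return tester_run();
}

That way an experimental command still gets a test entry, it just
shows up as 'Not Run' until the feature is enabled by default.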

> Thanks,
> Daniel
>
> On Thu, Oct 29, 2020 at 5:04 PM Luiz Augusto von Dentz
> <luiz.de...@gmail.com> wrote:
> >
> > Hi Daniel,
> >
> > On Thu, Oct 29, 2020 at 3:25 PM Daniel Winkler <danielwink...@google.com> 
> > wrote:
> > >
> > > Hi Luiz,
> > >
> > > Thank you for the feedback regarding mgmt-tester. I intended to use
> > > the tool, but found that it had a very high rate of test failure even
> > > before I started adding new tests. If you have a strong preference for
> > > its use, I can look into it again but it may take some time. These
> > > changes were tested with manual and automated functional testing on
> > > our end.
> > >
> > > Please let me know your thoughts.
> >
> > Total: 406, Passed: 358 (88.2%), Failed: 43, Not Run: 5
> >
> > Looks like there are some 43 tests failing; we will need to fix
> > these, but that shouldn't prevent us from adding new ones either.
> > You can use -p to filter which tests to run if you want to avoid
> > these for now.
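To expand on the -p usage above: it takes a test name prefix, so you
can run just the tests you are adding while the known failures are
being sorted out, e.g. something along these lines (test name purely
illustrative):

    tools/mgmt-tester -p "Add Advertising - Success"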



-- 
Luiz Augusto von Dentz
