I agree more information is better; however, it is not obvious where that 
information will come from.


-        Test case run result history
Do we keep records of old runs in Colorado for Functest and Yardstick? If we do, 
let’s link them up.
If not, we can always re-run these tests on the frozen Colorado release and 
produce the results. Are we regularly running them now?
Also note that the Colorado release is frozen and the test cases are frozen, so 
this snapshot info may not be as relevant as it appears. However, I agree it 
will become more informative and valuable with a longer history.


-        Source code of test cases
Do we have links to the source code repos in the OpenStack, ODL/ONOS, and other upstreams?
Can someone involved in CI/CD pitch in?

Wenjing

From: SULLIVAN, BRYAN L [mailto:bs3...@att.com]
Sent: Friday, January 27, 2017 6:20 AM
To: Wenjing Chu <wenjing....@huawei.com>; Pierre Lynch <ply...@ixiacom.com>; 
Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: RE: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

More inline.

Thanks,
Bryan Sullivan | AT&T

From: Wenjing Chu [mailto:wenjing....@huawei.com]
Sent: Thursday, January 26, 2017 12:30 PM
To: SULLIVAN, BRYAN L <bs3...@att.com>; Pierre Lynch <ply...@ixiacom.com>; Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: RE: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

Hi Bryan

Hope my inline responses are still readable …
Thanks.

Wenjing

From: SULLIVAN, BRYAN L [mailto:bs3...@att.com]
Sent: Thursday, January 26, 2017 8:29 AM
To: Wenjing Chu <wenjing....@huawei.com>; Pierre Lynch <ply...@ixiacom.com>; Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: RE: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

More comments inline.

Thanks,
Bryan Sullivan | AT&T

From: Wenjing Chu [mailto:wenjing....@huawei.com]
Sent: Wednesday, January 25, 2017 3:00 PM
To: SULLIVAN, BRYAN L <bs3...@att.com>; Pierre Lynch <ply...@ixiacom.com>; Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: RE: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

Thanks Bryan. See my response inline below.

Wenjing

From: SULLIVAN, BRYAN L [mailto:bs3...@att.com]
Sent: Wednesday, January 25, 2017 11:32 AM
To: Wenjing Chu <wenjing....@huawei.com>; Pierre Lynch <ply...@ixiacom.com>; Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: RE: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

I posted some comments in Gerrit. Here are the main points I think we need 
alignment on:

1)      All tests proposed for inclusion in Dovetail will be added one by one, 
each in a separate commit.

Please follow the Gerrit tickets below and see if you can follow along. Test 
cases are organized into two levels for convenience: test areas and test cases. 
There will be one commit for each test case, and one commit for each test area 
(which groups the many test cases related to a functional area). The test case 
commit says we are good with how that test case is implemented. The test area 
commit says we agree those test cases ought to be included. Clear enough?
[bryan] A test reference should not be added to a test area until the test has 
been approved (e.g. verified by more than one committer/reviewer). I see one 
commit with about 40 tests referenced. Have all these been verified?

[wenjing] The act of “adding a test case to a test area” is reflected as adding 
a single line in the compliance_set.yml file. Using your example, I can add 40 
tests in a patch, meaning I “propose” (based on the earlier wiki discussion) 
that these 40 cases be included. The submitter is sending this proposal out for 
review. If you have an opinion about any of these, or simply want more 
explanation/clarification, please comment on the Gerrit patch. Based on those 
comments, the submitter can modify the patch to reflect the updated view and 
resubmit, until we are satisfied and approve with +1 (or disapprove with -1). 
Hope that is clear. These are patch reviews awaiting approval; they are not 
approved yet.
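
For illustration only (this is a guess at the file’s shape, not its actual 
contents; see the Gerrit links later in this thread for the real file), such a 
single-line addition might look like:

    # compliance_set.yml (hypothetical excerpt; schema and case names are assumptions)
    testcases_list:
      - dovetail.ipv6.tc001      # an existing entry
      - dovetail.sdnvpn.tc001    # the one line this patch proposes to add

Each proposed inclusion is one such line, so a 40-test patch is 40 added lines, 
each of which can be commented on individually in Gerrit.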

I’m not sure what you mean by “verified”. If you mean whether the software is 
tested, I think the answer is yes. If you mean whether the test case has been 
approved, no; a patch email is precisely a request for you to review it.

[bryan] It will be much more complicated to approve a block of tests for 
inclusion, rather than test-by-test, unless it is clarified that as a group 
they have been passing (e.g., by reference to Jenkins jobs where the tests have 
been running successfully as a group). A single large commit with a bunch of 
text strings, with no explanation as to what they are, where they have been 
tested, etc., is not the way we should manage the Dovetail test case list.


2)      The commit will include a link to the details of the test case (a 
script, or whatever else I would use to run the test for myself)

You can trace down to the source step by step, e.g. from test area to test 
case, then to Functest or Yardstick, and/or to OpenStack or another upstream, 
eventually reaching the source code in that upstream project.

To run a test, you would need a test environment/pod. I would think that 
running the dovetail tool, specifying the individual test cases you want to 
run, and examining the results is probably a good way to go. Maybe it would be 
good to write down a “how-to” cheat-sheet for this?
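
As a seed for that cheat-sheet, a run of a single test case might look roughly 
like this (the flag name and test case ID below are assumptions about the 
dovetail CLI, to be checked against the tool’s actual help output, not a 
confirmed interface):

    # Hypothetical sketch: run one test case on a prepared test pod
    # and then inspect the results it writes out.
    dovetail run --testcase dovetail.sdnvpn.tc001   # flag name assumed
    ls results/                                     # results location assumed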
[bryan] Not sure how that answered the comment. Rather than having to search 
for something that relates to the test case reference, it would be good for the 
commit message (for test cases) to contain a URL reference to the test source. 
That’s what I was referring to. We need to simplify the effort for reviewers, 
to encourage more active reviews and second opinions/testing on the tests. 
Regarding the test environment, that’s no problem; the test should clarify what 
is needed and how to run it. Needing an environment to do so is a given, and it 
should be spelled out by the test anyway.
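
As one possible shape for this (a hypothetical template, with placeholder URLs 
rather than real links), a test case commit message could carry the pointers 
reviewers need up front:

    Add test case dovetail.example.tc001

    Test source: https://git.opnfv.org/functest/tree/... (placeholder path)
    CI evidence: https://build.opnfv.org/ci/job/...     (placeholder job)
    Scenarios validated: os-nosdn-nofeature-ha          (example)

That way a reviewer can judge the proposal from the commit page alone.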

[wenjing] I’ll be happy to try a simpler way for us to review upstream code, so 
if anyone has a better idea, please suggest it here. But let me first make sure 
I understand what you are asking. If a piece of code is written in the Dovetail 
project, yes, your Gerrit patch will show everything. If we are only 
referencing a test case in Functest, you will only see the referencing line of 
source in Dovetail. I believe we’ve provided in that line a handle to locate 
the source in Functest (which can be many more lines or files). If Functest 
then references another test in OpenStack or ODL, the source info is buried in 
the CI/CD process and is ultimately outside OPNFV. We may not even have a ready 
copy to reference, other than maybe GitHub clones. Without full CI/CD 
integration between projects, it’s not obvious how one can precisely locate 
them. Let me ask around how this can be done. But before I embark on this, is 
that what you are looking for? Anything else I missed? I will look into the 
how-to doc on running individual test cases.

[bryan] I believe the request is clear; we need direct evidence that each test 
is passing in CI/CD. Having to hunt for (1) information about what the test is 
about, and (2) evidence that it is a regularly successful test in CI/CD 
(meaning that it is executed successfully on a regular basis), is an 
ineffective way to manage the approved tests. If we are going to use Gerrit for 
the review process (which I think is a bit too low-level a tool; I prefer 
JIRA), then the reviews need to be as self-sufficient as possible, with at most 
direct links to the test cases proposed for inclusion and to the evidence of 
regular successful runs in CI/CD.


3)      All tests need to be working under at least one scenario, and the more 
scenarios that have been validated (either explicitly or implicitly), the 
higher priority the test should get. “Implicit” means that a test validated on 
a basic scenario (e.g. nosdn) is implicitly validated on other scenarios for 
that installer. But explicit validation is of course best.

Thanks for highlighting the implicit cases: more “implicit” is “better”, 
because it means something works more “universally” rather than relying on 
special cases. I would again caution against the “more scenarios” metric, 
because it does not necessarily mean “larger applicability”. Sometimes it does, 
sometimes it doesn’t. Also note that we ought not count non-generic scenarios 
the same as generic ones. So let’s not be too numerical about it; the criterion 
should be the larger applicability scope. I made this point in one of my 
earlier emails as well, going through the scenarios in Colorado.
[bryan] Sure, more scenarios tested does not necessarily mean more broadly 
validated. But at least for the basic installer scenarios, validation across 
more installers should bump up the priority of the test.

[wenjing] agreed.


4)      The reviewers may require that they be able to duplicate the test 
validation before the commit is merged.

Please refer to 2) and see if you need anything else.


Thanks,
Bryan Sullivan | AT&T

From: opnfv-tech-discuss-boun...@lists.opnfv.org [mailto:opnfv-tech-discuss-boun...@lists.opnfv.org] On Behalf Of Wenjing Chu
Sent: Wednesday, January 25, 2017 11:26 AM
To: Pierre Lynch <ply...@ixiacom.com>; Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area


The process that we may already have been informally following is as follows. 
We work towards consensus on the majority of questions that arise within the 
Dovetail project. If there are open questions that we can’t resolve, we gather 
the relevant info and bring them to the TSC for decision. In the TSC review, 
Dovetail will present its proposed plan, plus any open issues, and ask for (a) 
approval of the proposal and (b) determination of the open questions, if any.
Does this sound like a good process to follow?

On the topic of scenario cleanup, the Dovetail team has been voicing that 
opinion for a long time, so I applaud and strongly support the effort to 
separate general vs. specific scenarios; it will help Dovetail tremendously 
going forward. However, also keep in mind that that work is slated for the D 
and E releases. Unfortunately, it can’t help with our immediate task of 
targeting the C release.

To join in the detailed review effort, please note that reviews of test areas 
and test cases are based on JIRA and Gerrit. For example,

These are for test areas (the file is compliance_set.yml):

https://gerrit.opnfv.org/gerrit/27493
https://gerrit.opnfv.org/gerrit/27219

And here is an example of a test case within a test area:

https://gerrit.opnfv.org/gerrit/27221
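
For orientation before opening the patch: a test case definition of this kind 
is essentially a small YAML file that names the case and points at the upstream 
implementation. A hypothetical sketch (the field names are assumptions; the 
Gerrit link above shows the real format):

    # Hypothetical sketch of a Dovetail test case definition
    dovetail.example.tc001:
      name: dovetail.example.tc001
      objective: one-line statement of what the case validates
      validate:
        type: functest               # which upstream framework implements it
        testcase: functest_case_id   # the handle used to locate the source in Functest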

These Gerrit links are also posted on the wiki for convenience: 
https://wiki.opnfv.org/display/dovetail/Dovetail+Test+Areas+and+Test+Cases
However, it’s a bit slow to refresh there since updating it is a manual 
process. I would still recommend you get on Gerrit. We are at the beginning of 
the review process, so it’s not too late. General questions or specific topics 
can of course still be raised on the mailing list or in meetings, but try to 
stay on Gerrit as much as you can. Let us know if you have any feedback. Thanks.

Regards
Wenjing

From: opnfv-tech-discuss-boun...@lists.opnfv.org [mailto:opnfv-tech-discuss-boun...@lists.opnfv.org] On Behalf Of Pierre Lynch
Sent: Wednesday, January 25, 2017 8:44 AM
To: Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

IMHO, getting agreement on what the scope of testing will be (features, etc.) 
should be pretty urgent. How should we go about it? Agree within the Dovetail 
team, then run it by the TSC to get their blessing? Should we consolidate this 
process with the current ongoing discussion on scenario consolidation, which 
led to the idea of generic versus specific scenarios? Dovetail would include 
generic scenarios, while specific scenarios would be excluded from Dovetail? It 
would provide uniformity…

I would expect that determining what’s in and what’s out could be a delicate 
process.

Thanks,

Pierre




On Jan 25, 2017, at 7:18 AM, Jose Lausuch <jose.laus...@ericsson.com> wrote:

Thanks Chris, that makes things clearer. But still, it is a broad statement and 
difficult to measure. I guess, as you say, the TSC has the final word when 
approving features to be verified/certified in Dovetail with existing tests. 
From a functional perspective, I can just provide an overview of how the tests 
were behaving when Colorado was released.

Regards,
Jose


From: Christopher Price
Sent: Wednesday, January 25, 2017 14:17
To: Jose Lausuch; Tianhongbo; Tim Irnich
Cc: 'TECH-DISCUSS OPNFV'
Subject: Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

Hi Jose,

The intent of this statement is that we should not attempt to establish 
compliance tests for features or capabilities that are unique to a very 
specific configuration or composition of components. The statement is intended 
to mean that we should focus our compliance efforts on “generally available” or 
“community relevant” use cases and features.

Again, I am not able to accurately articulate what that means or how to measure 
it; as such, we have a somewhat obtuse statement in the documentation.

This should be seen as a guideline to be applied by the development, testing 
and Dovetail teams around expectations for compliance testing. It would 
eventually be ratified or judged by the TSC, as they have the final say on the 
tests that are approved for compliance validation in a given Dovetail release.

Does that help?
I do believe we should formalize our governance, commit it into a repo, and 
have the TSC cast an approving eye over it as well, for good form. Then, if 
nothing else, we would have a more consistent view of our intention and needed 
approach.

/ Chris

From: <opnfv-tech-discuss-boun...@lists.opnfv.org> on behalf of Jose Angel Lausuch Sales <jose.laus...@ericsson.com>
Date: Wednesday, 25 January 2017 at 11:51
To: Tianhongbo <hongbo.tianhon...@huawei.com>, Tim Irnich <tim.irn...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

Hi Hongbo,


Test cases must pass on OPNFV reference deployments
•             Tests must not require a specific NFVi platform composition or 
installation tool

Can you please explain what exactly this statement means? By “installation 
tool” are we talking about the installers we have, or a specific and different 
tool to install a certain feature?

Adding Tim, who is the SDNVPN PTL.

Thanks,
Jose

From: Tianhongbo [mailto:hongbo.tianhon...@huawei.com]
Sent: Wednesday, January 25, 2017 01:46 AM
To: Jose Lausuch
Cc: 'TECH-DISCUSS OPNFV'
Subject: [dovetail]L3VPN for dovetail area

Hi Jose:

As you mentioned, there will be a more detailed discussion about L3VPN with the 
L3VPN team to check whether L3VPN can be included in the Dovetail area now.

There are some requirements on the Dovetail wiki page: 
https://wiki.opnfv.org/display/dovetail/Dovetail+Test+Case+Requirements

Look forward to your reply.

Best regards

hongbo
