Heh--I started this on Rave, so I'm +1.

Marlon

On 12/16/13 2:05 PM, Suresh Marru wrote:
> On Dec 16, 2013, at 1:51 PM, Saminda Wijeratne <samin...@gmail.com> wrote:
>
>> I was thinking of an actual checklist where we can check off / vote off once 
>> each test is done. Perhaps we can start with a simple spreadsheet with the 
>> tests specified by Raman added.
> +1. Here is an example from Rave: the template for Quality Assurance [1] and an 
> example [2].
>
> Bottom line, for at least a few days during the release process, we all should 
> become the QA Team.
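To make the checklist idea concrete: a minimal sketch of what such a spreadsheet could 
look like (the columns and rows below are only a suggestion, drawn from tests already 
mentioned in this thread):

    Feature / Test case                     | Tester | RC1 | RC2 | Notes / JIRA
    ----------------------------------------+--------+-----+-----+--------------
    5 & 10 minute tutorials (compose/run)   |        |     |     |
    Provenance-aware search                 |        |     |     |
    Derby & MySQL backends                  |        |     |     |
    XBaya JNLP distribution                 |        |     |     |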
>
> Currently, we are doing scripted testing like the 5 and 10 minute tutorials and grid 
> job submissions, and a lot of code still does not get touched. As an example, 
> provenance-aware search became nonfunctional and we did not notice it until Sanjaya 
> pointed it out. It would be useful if, randomly (or by coordination), we all tested 
> an RC against various features and then posted the results to the DISCUSS thread. 
> Otherwise, the releases just become pointing to a tag. We need to move from releases 
> being a formality to every release hardening the code. We have so much active 
> development, and if we turn some of that energy to testing and bug fixing, I think 
> our users will be happy with the outcome. 
>
> Suresh
> [1] - http://wiki.apache.org/rave/ReleaseManagement/QualityAssurance
> [2] - 
> http://wiki.apache.org/rave/ReleaseManagement/ReleaseSchedule/VerificationResults-0.11
>>
>> On Mon, Dec 16, 2013 at 10:24 AM, Chathuri Wimalasena <kamalas...@gmail.com> 
>> wrote:
>> There is a general checklist added by Raman [1], which covers the basic 
>> functionality. 
>>
>> Thanks..
>> Chathuri
>>
>> [1] 
>> https://cwiki.apache.org/confluence/display/AIRAVATA/Airavata+Release+Testing
>>
>>
>> On Mon, Dec 16, 2013 at 12:56 PM, Saminda Wijeratne <samin...@gmail.com> 
>> wrote:
>>
>>
>>
>> On Mon, Dec 16, 2013 at 9:28 AM, Suresh Marru <sma...@apache.org> wrote:
>> Thanks Amila for weighing in. Comments inline:
>>
>> On Dec 16, 2013, at 11:29 AM, Amila Jayasekara <thejaka.am...@gmail.com> 
>> wrote:
>>
>>> Hi Suresh,
>>>
>>> I have some comments inline.
>>>
>>>
>>> On Mon, Dec 16, 2013 at 10:53 AM, Suresh Marru <sma...@apache.org> wrote:
>>> Hi All,
>>>
>>> This is a very good question. Let's discuss these options so we are 
>>> consistent across releases.
>>>
>>> If we look at the way we are doing releases, we are calling a feature 
>>> freeze and code freeze and cutting a release. Most of the time, our build 
>>> is broken. The Jenkins statistics for Airavata are not looking good at all [1].
>>>
>>> There is something wrong with the Jenkins configuration. I tried to figure it 
>>> out some time back but was unable to do so. Even though builds are successful 
>>> on our local machines, they fail intermittently in Jenkins.
>>>
>>> We are barely fixing the build a day before the release, putting out an RC, 
>>> testing it, and releasing it in quick succession.
>>>
>>> This is not entirely true. For the past few months I have only experienced one 
>>> or two build breaks (maybe fewer). I build a couple of times per week. I 
>>> believe the build is usually stable, and with the integration tests passing we 
>>> always get a workable version. I know it's not good practice not to rely 
>>> on the build server, but committers have the personal discipline to keep the 
>>> build stable. Nevertheless, we must fix the Jenkins configuration issue.
>> Maybe we should put some focus on the Jenkins configuration? Any volunteers?
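As a side note, one way to get closer to what Jenkins sees is to build from a clean 
checkout against an empty local repository; a minimal sketch using only standard Maven 
flags (no Airavata-specific profiles assumed):

    # fresh checkout of the branch/tag under test, then:
    mvn -U -Dmaven.repo.local=/tmp/airavata-m2 clean install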
>>
>>> As we are seeing on user lists, we have users upgrading with every release. 
>>> I think we should increase the release quality.
>>>
>>> +1 for this.
>>>
>>> I would vote for at least 3 RCs per release. If we are not finding issues 
>>> in the first RC, I would say either the software has magically become too 
>>> good or we are not doing thorough testing. I suspect the latter.
>> How about we keep a checklist of release tests? I know we already send a 
>> mail on dev about what needs to be tested for each RC, but I think that is too 
>> abstract. For the core developers of Airavata I think there should be predefined 
>> test cases (a test document, if you will). Since we have several core 
>> developers on the list, we can at least decide on what must be tested and 
>> make sure that each test case is covered by at least one developer for an RC.
>>> I guess you mentioned this under the assumption that the build is not stable.
>> Half of my assumption is based on Jenkins, so if the builds are OK and Jenkins is 
>> reporting them wrongly, then we can alleviate the concern by fixing it.
>>
>>> I will propose the following; please counter it and let's agree on a process:
>>>
>>> * Let's post an RC1 as is (which means it will have a snapshot version). We 
>>> should all test this pack as much as possible, so it is more of a test candidate 
>>> than a release candidate. If it helps, we can use the name TC1. I am not 
>>> particular about the naming but am trying to emphasize the need for having 
>>> more RCs per release.
>>>
>>> I am not sure whether we really need a TC. The release manager should be 
>>> doing some verification on the RC before putting it out, therefore it 
>>> should be an RC. Anyhow, I am fine with having the TC concept and trying it out.
>> We probably should stick to RC, but I think the onus should not be on the RM 
>> to test it. They should coordinate and mobilize everyone to do the testing, 
>> including doing a bit more testing than others. But my point is, we should 
>> test, and the only way to do that is to put out a series of RCs and have focused 
>> testing.
>> A TC should be something internal, IMO. But when we are going for a release 
>> it should be alpha, beta, and then RC releases. I think it need not be 
>> mandatory for the RMs to do a pre-evaluation of the builds other than making 
>> sure all the unit tests and integration tests pass. Once an RC is confirmed 
>> to be of release quality, I think we can follow the actual release cycle from 
>> the trunk itself, since it is in a code freeze anyway.
>>
>> Suresh
>>
>>> What we really need is a set of verifiable test cases.
>>>
>>> Thank you
>>> Regards
>>> Amila
>>>
>>>
>>> * If we do not expose significant issues in RC/TC1, then we proceed with 
>>> RC2, which will follow the proper release process. But if a reasonable 
>>> number of issues are brought out, we need an RC2/TC2, again without following 
>>> the release process.
>>>
>>> * The key thing I am proposing is that we keep doing RC/TCs until we are all 
>>> sure the quality is good enough, with documented known issues. When we are 
>>> sure, then we proceed to an RC with the proper release process.
>>>
>>> So this will mean more testing, and twice (or more) the time everyone has 
>>> to spend testing, but I think it is worth it. This might also push us past the 
>>> 6-week release cycle, but I think we need to make that trade for quality 
>>> releases as we march towards 1.0.
>>>
>>> Suresh
>>> [1] - https://builds.apache.org/job/Apache%20Airavata/
>>>
>>>
>>> On Dec 15, 2013, at 4:28 PM, Lahiru Gunathilake <glah...@gmail.com> wrote:
>>>
>>>> Hi Chathuri,
>>>>
>>>> I think having a SNAPSHOT version in an RC is wrong. Every RC has to be 
>>>> like a release, and if it passes we just call a vote/discussion thread and do 
>>>> the release. If we do it with a snapshot and things go right, we then have to 
>>>> change the versions and test again. We could do the release just by changing 
>>>> the snapshot version without retesting, but that is wrong AFAICT.
>>>>
>>>> I remember making this mistake in an earlier release with the RC1 build. I think 
>>>> we can stick to the release management instructions on airavata.org.
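For reference, the version bump itself is cheap; what costs time is the re-test. A 
sketch of doing the bump by hand with the Maven versions plugin (the maven-release-plugin's 
release:prepare goal performs an equivalent bump automatically):

    # set the release version across all modules, then rebuild and re-test
    mvn versions:set -DnewVersion=0.11
    mvn clean install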
>>>>
>>>> Regards
>>>> Lahiru
>>>>
>>>>
>>>> On Fri, Dec 13, 2013 at 3:43 PM, Chathuri Wimalasena 
>>>> <kamalas...@gmail.com> wrote:
>>>> Hi All,
>>>>
>>>> Airavata 0.11 RC1[1] is ready for testing.
>>>>
>>>> Here are some pointers for testing:
>>>>       • Verify the fixed issues for this release [2]
>>>>       • Verify the basic workflow composition/execution/monitoring 
>>>>         scenarios from the Airavata 5 & 10 minute tutorials [3],[4]
>>>>       • Verify the Airavata client samples
>>>>       • Verify the stability with the Derby & MySQL backend databases
>>>>       • Verify that the XBaya JNLP distribution works
>>>>       • Verify deploying the Airavata server in a Tomcat distribution
>>>> Please report any issues [5] you encounter while testing. Thank you for 
>>>> your time in validating the release.
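In addition to the functional checks above, each tester can also verify the signatures 
and checksums of the artifacts under [1] before voting; a generic sketch (<artifact> is 
a placeholder for the actual file name, and the project's KEYS file must be imported 
into gpg first):

    gpg --verify <artifact>.asc <artifact>
    md5sum <artifact>     # compare the output against the published .md5 file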
>>>>
>>>> Regards,
>>>> Chathuri (On behalf of Airavata PMC)
>>>>
>>>> [1] https://dist.apache.org/repos/dist/dev/airavata/0.11/RC1/
>>>> [2] 
>>>> https://issues.apache.org/jira/browse/AIRAVATA-278?jql=project%20%3D%20AIRAVATA%20AND%20fixVersion%20%3D%20%220.11%22%20ORDER%20BY%20status%20DESC%2C%20priority%20DESC
>>>> [3] 
>>>> http://airavata.apache.org/documentation/tutorials/airavata-in-5-minutes.html
>>>> [4] 
>>>> http://airavata.apache.org/documentation/tutorials/airavata-in-10-minutes.html
>>>> [5] https://issues.apache.org/jira/browse/AIRAVATA
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> System Analyst Programmer
>>>> PTI Lab
>>>> Indiana University
>>>
>>
>>
>>
