Re: On (semi-)automated testing and improved workflow of LTS uploads

2019-07-13 Thread Raphael Hertzog
Hi,

On Tue, 09 Jul 2019, Jonas Meurer wrote:
> 1. Upload packages targeted at LTS suites to some dedicated place for
>automated testing
> 2. Run automatic tests (piuparts, autopkgtests, lintian?, ...)
> 3. If tests passed, publish the packages somewhere to do manual
>testing (and reviews)
> 4. (Optionally?) demand acknowledgement by a second (different) LTS
>developer
> 5. Automatically upload packages that got uploaded, passed tests and got
>second acknowledgement to the targeted LTS upload queue
> 
> While that would be very nice to have, it's probably a long way to go
> until we have such infrastructure.

If we never start to work in this direction, it will be a long way, yes
;-)

> There seems to be some agreement that the first step would be to run
> (semi-)automated tests (e.g. piuparts and autopkgtests) against the
> packages before uploading them to ${LTS}-security, i.e. point 2 of the
> list above.

Right, how do we make sure that this has been done? Shall we require that
the logs of autopkgtest/piuparts have been uploaded somewhere?
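One way to make the "logs uploaded somewhere" requirement concrete: a small helper that pairs each pre-upload QA command with the log file the uploader would attach to the update. The log directory, file naming and command options here are assumptions for illustration, not an agreed convention:

```python
# Hypothetical sketch: build the test commands and the log locations an
# uploader could attach to an LTS update for review.
from pathlib import Path

def qa_commands(changes_file, suite="jessie", logdir="qa-logs"):
    """Return (command, logfile) pairs for autopkgtest and piuparts."""
    name = Path(changes_file).stem
    log = Path(logdir)
    return [
        (["autopkgtest", changes_file, "--", "schroot", suite],
         log / f"{name}.autopkgtest.log"),
        (["piuparts", "--distribution", suite, changes_file],
         log / f"{name}.piuparts.log"),
    ]

for cmd, logfile in qa_commands("pdns_3.4.1-4+deb8u10_amd64.changes"):
    print(" ".join(cmd), "->", logfile)
```

Each command would be run with its output captured to the given log, and the logs uploaded alongside the update as the proof that the tests were run.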

That could be the start of the above infrastructure. Have a place where
we can create "updates" and upload artifacts related to each update.

> What are your thoughts on this? Do you think that we could implement
> most/all of the desired workflow using Salsa-CI/Gitlab-CI? Or would it
> be better to build it entirely independently of Salsa - e.g. implement
> it in dak?

I believe that the service should be separate and standalone. It should
not be part of dak either.

On Thu, 11 Jul 2019, Mike Gabriel wrote:
> > 2. Run automatic tests (piuparts, autopkgtests, lintian?, ...)
> 
> Maybe, but probably not lintian. The package maintainer might not have had
> their packages in shape lintian-wise at the time (old)oldstable (meaning
> the current Debian LTS) transitioned from testing to stable. Also, only
> using an old lintian would be appropriate: a recent lintian on old packages
> just creates too much noise (which we won't fix anyway).

Running lintian might be interesting... but only to compare the lintian
output of the current package with that of the updated package.
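The comparison could be as simple as diffing the tag sets of the two lintian runs. A minimal sketch of that diffing step, assuming the usual `X: package: tag ...` line format of lintian output (the parsing here is deliberately simplified):

```python
# Sketch: report only the lintian tags that are new in the updated package,
# so longstanding warnings in the old package don't create noise.
def lintian_tags(output):
    """Extract the set of (severity, tag) pairs from lintian output lines."""
    tags = set()
    for line in output.splitlines():
        parts = line.split()
        # lintian lines look like "W: package: some-tag extra info"
        if len(parts) >= 3 and parts[0].rstrip(":") in {"E", "W", "I", "P"}:
            tags.add((parts[0].rstrip(":"), parts[2]))
    return tags

def new_tags(old_output, new_output):
    """Tags present in the updated package but not in the current one."""
    return lintian_tags(new_output) - lintian_tags(old_output)
```

A reviewer would then only need to look at the output of `new_tags()` rather than the full lintian log of an old package.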

> > 4. (Optionally?) demand acknowledgement by a second (different) LTS
> >developer
> 
> Although demanding a second ACK adds an extra delay to our workflow, I sense
> that such a second pair of eyes peering at security patches might greatly
> improve the quality of the LTS work. Even if we don't come up with some
> auto-test engine, we should consider "peer-"reviewed uploads.

I'm not so sure about this. In some cases, reviewing the patch itself and
the updated code is hard, and I don't expect the reviewer to be willing
to invest that energy. But at least the reviewer can ensure that the
tests have been run, double-check whether we should add some autopkgtest
for extra safety, etc.
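Adding an autopkgtest for extra safety is cheap: it takes a `debian/tests/control` stanza plus an executable test script. A minimal example (the test name "smoke" is illustrative, not from any particular package):

```
# debian/tests/control
Tests: smoke
Depends: @
```

Here `Depends: @` pulls in the binary packages built from the source, and `debian/tests/smoke` would be any executable that exits non-zero on failure.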

> > What are your thoughts on this? Do you think that we could implement
> > most/all of the desired workflow using Salsa-CI/Gitlab-CI? Or would it
> > be better to build it entirely independently of Salsa - e.g. implement
> > it in dak?
> 
> Personally, I think that using Salsa for this adds an extra layer of
> complexity to the uploading workflow, because we have to pump all packages
> that we want to fix in LTS through GitLab.

So that means we have to build some automation for this... That way it
could even be run by the frontdesk when it does CVE triage and adds the
package to dla-needed.txt, and the repository would be ready for work
when a contributor wants to work on it.

> the paid contributors. Some package uploads may even be embargoed, so
> generally the LTS-team namespace on Salsa needs to be private (which
> excludes other contributors, including the usual package
> maintainers/uploaders, by default).

We can have a private sub-group for embargoed updates.

> As our intention is to operate on packages (not on upstream code in Git), so

We obviously work on source code... using git is not in opposition to
using an external infrastructure to review the updates. And the merge
request is the logical place where both parts meet.

Cheers,
-- 
Raphaël Hertzog ◈ Debian Developer

Support Debian LTS: https://www.freexian.com/services/debian-lts.html
Learn to master Debian: https://debian-handbook.info/get/



Re: On (semi-)automated testing and improved workflow of LTS uploads

2019-07-11 Thread Sylvain Beucler
Hi,

On 11/07/2019 15:20, Jonas Meurer wrote:
>> Many packages are packaged in Git already (probably on Salsa) and have a
>> repo location of their own. By applying GitLab-based CI to the
>> workflow, the LTS team would add an extra Git repo, just for the LTS
>> uploads done by the paid contributors.
> Yeah, I agree that this is a downside. An ideal workflow would probably
> look like this:
>
> 1a: for packages on salsa, fork the repo to lts-team/packages/
> 1a.1: if repo doesn't provide a 'jessie' branch, 'gbp import-dsc' the
>   jessie sources into a new branch
> 1b: for packages not on salsa, push the latest package version there
> 2: apply the package updating workflow, as discussed above
> 3a: file a merge request against the official package repo that asks
> to merge back the changes
>
> 99: whenever support for a LTS release ends, cleanup our team workspace
> and remove packages that no longer exist in the next LTS release(s).

I have my doubts about enforcing Git. Large packages such as Firefox can
take several GB once extracted and are time-consuming enough to handle
as-is, without an extra layer of storage on top, if you see what I mean?

Beware that packages stored in Git may not use git-buildpackage.

TBH I don't quite understand how Salsa-CI is meant to be used: will it
(re)build massive packages from source, or will we commit pre-built
packages?


> We could use a dedicated private subgroup and move the working repos
> there whenever we need to handle embargoed issues.

Yes, embargoed issues are very rare and special-casing them sounds
better than making everything private (especially from a transparency PoV).


> One advantage of Gitlab/Salsa is that reviewing changes is very
> convenient in the Gitlab UI, especially if we used merge requests for
> new security uploads and require a second developer to merge them.

Note that some issues are purely compilation-related (e.g. building the
source package with a more recent Debian release) and won't appear in a
Salsa source diff view.
debdiff is my authoritative source :)

Cheers!
Sylvain




Re: On (semi-)automated testing and improved workflow of LTS uploads

2019-07-11 Thread Jonas Meurer
Hello,

Mike Gabriel:
>> In the internal discussions, the following vision for an improved upload
>> workflow arose:
>>
>> 1. Upload packages targeted at LTS suites to some dedicated place for
>>    automated testing
> 
>> 2. Run automatic tests (piuparts, autopkgtests, lintian?, ...)
> 
> Maybe, but probably not lintian. The package maintainer might not have had
> their packages in shape lintian-wise at the time (old)oldstable (meaning
> the current Debian LTS) transitioned from testing to stable. Also, only
> using an old lintian would be appropriate: a recent lintian on old
> packages just creates too much noise (which we won't fix anyway).

The Salsa-CI runs lintian from the target suite against the packages.
While I agree that we should not put any effort into fixing longstanding
lintian warnings, running lintian per default and providing the output
for review seems sensible to me.

>> 4. (Optionally?) demand acknowledgement by a second (different) LTS
>>    developer
> 
> Although demanding a second ACK adds an extra delay to our workflow, I
> sense that such a second pair of eyes peering at security patches might
> greatly improve the quality of the LTS work. Even if we don't come up
> with some auto-test engine, we should consider "peer-"reviewed uploads.

I agree.

>> So far, two implementation approaches have been discussed:
>>
>> */ Build an own service that provides a dedicated upload queue (e.g.
>>    'lts-proposed-updates') which accepts uploads targeted at LTS suites,
>>    and processes the uploaded packages according to the workflow
>>    described above.
>>
>> */ Use Salsa-CI and their pipeline[2] for as much of the above proposal
>>    as possible.
>>
>> What are your thoughts on this? Do you think that we could implement
>> most/all of the desired workflow using Salsa-CI/Gitlab-CI? Or would it
>> be better to build it entirely independently of Salsa - e.g. implement
>> it in dak?
> 
> Personally, I think that using Salsa for this adds an extra layer of
> complexity to the uploading workflow, because we have to pump all
> packages that we want to fix in LTS through GitLab.

We need a place to upload them to (for tests and reviews) anyway. So
what's the advantage of having a dedicated service (and using resources
there) compared to using some of Salsa's resources for it?

To be honest, I don't consider the expected extra CPU cycles for Salsa
that high. The number of LTS security uploads by us (and, if the
Security Team used a similar solution, even the number of security
uploads by us and them) isn't that high compared to the number of
Debian packages hosted on Salsa already.

> Many packages are packaged in Git already (probably on Salsa) and have a
> repo location of their own. By applying GitLab-based CI to the
> workflow, the LTS team would add an extra Git repo, just for the LTS
> uploads done by the paid contributors.

Yeah, I agree that this is a downside. An ideal workflow would probably
look like this:

1a: for packages on salsa, fork the repo to lts-team/packages/
1a.1: if repo doesn't provide a 'jessie' branch, 'gbp import-dsc' the
  jessie sources into a new branch
1b: for packages not on salsa, push the latest package version there
2: apply the package updating workflow, as discussed above
3a: file a merge request against the official package repo that asks
to merge back the changes

99: whenever support for a LTS release ends, cleanup our team workspace
and remove packages that no longer exist in the next LTS release(s).
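The per-package decision in steps 1a/1b above could be sketched as a small planning function. The `lts-team/packages/` group and the 'jessie' branch handling are taken from the steps above; the command strings themselves are illustrative, not real tooling:

```python
# Sketch of steps 1a/1a.1/1b: decide how to prepare a working repo for a
# package, depending on whether it already lives on Salsa and whether its
# repo has a 'jessie' branch.
def prepare_repo_plan(pkg, on_salsa, has_jessie_branch):
    """Return the list of preparation actions for one package."""
    plan = []
    if on_salsa:
        # 1a: fork the existing repo into the team namespace
        plan.append(f"fork {pkg} to lts-team/packages/{pkg}")
        if not has_jessie_branch:
            # 1a.1: import the jessie sources into a new branch
            plan.append(f"gbp import-dsc {pkg}_*jessie*.dsc")
    else:
        # 1b: no upstream packaging repo on salsa; import the latest version
        plan.append(f"import latest {pkg} into lts-team/packages/{pkg}")
    return plan
```

Automation like this is also what would let the frontdesk prepare repos at CVE-triage time, so they are ready when a contributor picks up the package.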

> Some package uploads may even be
> embargoed, so generally the LTS-team namespace on Salsa needs to be
> private (which excludes other contributors, including the usual package
> maintainers/uploaders, by default).

We could use a dedicated private subgroup and move the working repos
there whenever we need to handle embargoed issues.

> As our intention is to operate on packages (not on upstream code in
> Git), I'd suggest deploying/extending some sort of
> setup/infrastructure that utilizes Debian means for auto-testing LTS
> package upload candidates. And I really love the idea of a review
> workflow for package uploads.

One advantage of Gitlab/Salsa is that reviewing changes is very
convenient in the Gitlab UI, especially if we used merge requests for
new security uploads and require a second developer to merge them.

Cheers
 jonas





Re: On (semi-)automated testing and improved workflow of LTS uploads

2019-07-11 Thread Guido Günther
Hi,
On Thu, Jul 11, 2019 at 11:15:34AM +, Mike Gabriel wrote:
[..snip..]
> Personally, I think that using Salsa for this adds an extra layer of
> complexity to the uploading workflow, because we have to pump all packages
> that we want to fix in LTS through GitLab.

On the plus side, Salsa/GitLab offers:

- already existing account management (including groups)
- the ability to leverage the work of the GitLab CI project
- an already existing commenting system, so all information would be in
  one place

It would basically allow us to start right away; worst case, an
additional worker would need to be added.
Cheers,
 -- Guido



Re: On (semi-)automated testing and improved workflow of LTS uploads

2019-07-11 Thread Mike Gabriel

Hi Jonas, hi all,

thanks for summarizing the discussion we had on the non-public paid  
LTS contributors' "mailing list".


On Tue, 09 Jul 2019 16:21:47 CEST, Jonas Meurer wrote:


> Hello,
>
> Some LTS members recently started discussing options for better
> (semi-)automated testing of LTS uploads and an improved upload workflow.
> I'll try to summarize the discussion in order to bring it to this public
> mailinglist. [1]
>
> The motivation for an improved package upload workflow basically is to
> lower the risk of (simple) regressions and improve the overall quality
> of LTS security uploads. The way to get there is to run (semi-)automated
> tests against packages before uploading them to ${LTS}-security and
> (optionally) enforce a second review+acknowledgement by another LTS
> developer.
>
> In the internal discussions, the following vision for an improved upload
> workflow arose:
>
> 1. Upload packages targeted at LTS suites to some dedicated place for
>    automated testing


Yep.


> 2. Run automatic tests (piuparts, autopkgtests, lintian?, ...)


Maybe, but probably not lintian. The package maintainer might not have had
their packages in shape lintian-wise at the time (old)oldstable (meaning
the current Debian LTS) transitioned from testing to stable. Also, only
using an old lintian would be appropriate: a recent lintian on old
packages just creates too much noise (which we won't fix anyway).



> 3. If tests passed, publish the packages somewhere to do manual
>    testing (and reviews)


If either step (2. or 3.) fails, we go back to 1.


> 4. (Optionally?) demand acknowledgement by a second (different) LTS
>    developer


Although demanding a second ACK adds an extra delay to our workflow, I  
sense that such a second pair of eyes peering at security patches  
might greatly improve the quality of the LTS work. Even if we don't  
come up with some auto-test engine, we should consider "peer-"reviewed  
uploads.



> 5. Automatically upload packages that got uploaded, passed tests and got
>    second acknowledgement to the targeted LTS upload queue


Yep.


> While that would be very nice to have, it's probably a long way to go
> until we have such infrastructure.
>
> There seems to be some agreement that the first step would be to run
> (semi-)automated tests (e.g. piuparts and autopkgtests) against the
> packages before uploading them to ${LTS}-security, i.e. point 2 of the
> list above.
>
> So far, two implementation approaches have been discussed:
>
> */ Build an own service that provides a dedicated upload queue (e.g.
>    'lts-proposed-updates') which accepts uploads targeted at LTS suites,
>    and processes the uploaded packages according to the workflow
>    described above.
>
> */ Use Salsa-CI and their pipeline[2] for as much of the above proposal
>    as possible.
>
> What are your thoughts on this? Do you think that we could implement
> most/all of the desired workflow using Salsa-CI/Gitlab-CI? Or would it
> be better to build it entirely independently of Salsa - e.g. implement
> it in dak?


Personally, I think that using Salsa for this adds an extra layer of
complexity to the uploading workflow, because we have to pump all
packages that we want to fix in LTS through GitLab.


Many packages are packaged in Git already (probably on Salsa) and have
a repo location of their own. By applying GitLab-based CI to the
workflow, the LTS team would add an extra Git repo, just for the LTS
uploads done by the paid contributors. Some package uploads may even
be embargoed, so generally the LTS-team namespace on Salsa needs to
be private (which excludes other contributors, including the usual
package maintainers/uploaders, by default).


As our intention is to operate on packages (not on upstream code in
Git), I'd suggest deploying/extending some sort of
setup/infrastructure that utilizes Debian means for auto-testing LTS
package upload candidates. And I really love the idea of a review
workflow for package uploads.


And, open question: Would such a workflow be an option for the  
security team's workflow, too?


Greets,
Mike
--

DAS-NETZWERKTEAM
c\o Technik- und Ökologiezentrum Eckernförde
Mike Gabriel, Marienthaler str. 17, 24340 Eckernförde
mobile: +49 (1520) 1976 148
landline: +49 (4351) 486 14 27

GnuPG Fingerprint: 9BFB AEE8 6C0A A5FF BF22  0782 9AF4 6B30 2577 1B31
mail: mike.gabr...@das-netzwerkteam.de, http://das-netzwerkteam.de





On (semi-)automated testing and improved workflow of LTS uploads

2019-07-09 Thread Jonas Meurer
Hello,

Some LTS members recently started discussing options for better
(semi-)automated testing of LTS uploads and an improved upload workflow.
I'll try to summarize the discussion in order to bring it to this public
mailinglist. [1]

The motivation for an improved package upload workflow basically is to
lower the risk of (simple) regressions and improve the overall quality
of LTS security uploads. The way to get there is to run (semi-)automated
tests against packages before uploading them to ${LTS}-security and
(optionally) enforce a second review+acknowledgement by another LTS
developer.

In the internal discussions, the following vision for an improved upload
workflow arose:

1. Upload packages targeted at LTS suites to some dedicated place for
   automated testing
2. Run automatic tests (piuparts, autopkgtests, lintian?, ...)
3. If tests passed, publish the packages somewhere to do manual
   testing (and reviews)
4. (Optionally?) demand acknowledgement by a second (different) LTS
   developer
5. Automatically upload packages that got uploaded, passed tests and got
   second acknowledgement to the targeted LTS upload queue
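The five steps above can be sketched as a small state machine, where an update advances only when tests pass and (optionally) a second LTS developer has acked it. The state names are illustrative assumptions, not an agreed design:

```python
# Minimal sketch of the proposed pipeline: upload -> automated tests ->
# manual review/ack -> LTS upload queue.
def next_state(state, tests_passed=False, acked=False, require_ack=True):
    """Advance one update through the (hypothetical) pipeline states."""
    if state == "uploaded":                    # step 1 done
        # step 2: automated tests gate everything else
        return "tested" if tests_passed else "rejected"
    if state == "tested":                      # step 3: published for review
        if not require_ack or acked:           # step 4: optional second ack
            return "queued"                    # step 5: LTS upload queue
        return "awaiting-ack"
    if state == "awaiting-ack":
        return "queued" if acked else "awaiting-ack"
    return state

# Walk a successful update through the pipeline
s = next_state("uploaded", tests_passed=True)  # -> "tested"
s = next_state(s)                              # -> "awaiting-ack"
s = next_state(s, acked=True)                  # -> "queued"
```

Making the ack optional (`require_ack=False`) corresponds to the "(Optionally?)" in step 4.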

While that would be very nice to have, it's probably a long way to go
until we have such infrastructure.

There seems to be some agreement that the first step would be to run
(semi-)automated tests (e.g. piuparts and autopkgtests) against the
packages before uploading them to ${LTS}-security, i.e. point 2 of the
list above.

So far, two implementation approaches have been discussed:

*/ Build an own service that provides a dedicated upload queue (e.g.
   'lts-proposed-updates') which accepts uploads targeted at LTS suites,
   and processes the uploaded packages according to the workflow
   described above.

*/ Use Salsa-CI and their pipeline[2] for as much of the above proposal
   as possible.

What are your thoughts on this? Do you think that we could implement
most/all of the desired workflow using Salsa-CI/Gitlab-CI? Or would it
be better to build it entirely independently of Salsa - e.g. implement
it in dak?

---

Regardless of our long-term approach, I decided to work on the second
approach, as I consider it desirable to have proper support for testing
Jessie LTS uploads using Salsa-CI as an intermediate target, and it
seems to be the much lower-hanging fruit to me.

So during the last two weeks, I spent some time on improving and testing
Jessie support in Salsa-CI. There are two pending merge requests [3, 4],
and once they are merged, we would be able to use a slightly modified[5]
version of the Salsa-CI pipeline for testing Jessie LTS package uploads.

Cheers
 jonas

PS: I put the Debian Security Team in the loop as I don't expect them to
read debian-lts and the topic might be interesting to them as well.
My hope is that if we build some QA solution for LTS uploads, that
solution can be used for Security Team uploads as well.

[1] To all others: please complement/correct me if you feel that I
missed or misframed anything.
[2] https://salsa.debian.org/salsa-ci-team/pipeline
[3] https://salsa.debian.org/salsa-ci-team/images/merge_requests/74
[4] https://salsa.debian.org/ci-team/debci/merge_requests/89
[5] https://salsa.debian.org/mejo/pdns/blob/jessie/debian/gitlab-ci.yml
(after MR's got merged, only lines 1-12 would be necessary)



