My logging changes [0][1][2][3] are getting closer to being mergeable
(the first has already been merged). Tony Thomas' Swift Mailer change
[4] is also progressing. Both sets of changes introduce the concept of
specifying external library dependencies, both required and suggested,
to mediawiki/core.git via composer.json. Composer can be used by
people directly consuming the git repository to install and manage
these dependencies. I gave an example set of usage instructions in the
commit message for my patch that introduced the dependency on PSR-3
[0]. In the production cluster, on Jenkins job runners and in the
tarball releases we will want a different solution.
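For reference, the direct-consumer workflow is roughly the following
(commands are illustrative; see the commit message in [0] for the
actual instructions):

```
git clone https://gerrit.wikimedia.org/r/mediawiki/core.git
cd core
composer install   # reads composer.json, populates $IP/vendor,
                   # and generates vendor/autoload.php
```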

My idea of how to deal with this is to create a new gerrit repository
(mediawiki/core/vendor.git?) that contains a composer.json file
similar to the one I had in patch set 7 of my first logging patch [5].
This composer.json file would be used to tell Composer the exact
versions of libraries to download. Someone would manually run Composer
in a checkout of this repository and then commit the downloaded
content, the composer.lock file, and the generated autoload.php to the
repository for review. We would then be able to branch and use this
repository as a git submodule in the wmf/1.2XwmfY branches that are
deployed to production and ensure that it is checked out along with
mw-core on the Jenkins nodes. By placing this submodule at $IP/vendor
in mw-core we would be mimicking the configuration that direct users
of Composer will experience. WebStart.php already includes
$IP/vendor/autoload.php when present, so integration with the rest of
mw-core should follow from that.

It would also be possible to add this repo to the tarballs for
distribution. There will probably need to be some adjustments to that
process, however, and the final result may be that release branches
update the mediawiki/core composer.json and provide a composer.lock
along with a pre-populated vendor directory. I would be glad to
participate in discussions of that use case, but we will have about 6
months before we need to solve it (and a new release management RFC to
resolve between now and then).

There are several use cases to consider for the general solution:

== Adding/updating a library ==
* Update composer.json in mediawiki/core/vendor.git
* Run `composer update` locally to download the library (and its dependencies)
* Run `composer dump-autoload --optimize` to generate an optimized autoload.php
* Commit changes
* Push changes for review in gerrit
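As a command-line sketch (the library name and version are
hypothetical):

```
# in a local checkout of mediawiki/core/vendor.git
$EDITOR composer.json              # e.g. add "psr/log": "1.0.0" to "require"
composer update
composer dump-autoload --optimize
git add -A
git commit -m 'Update psr/log to 1.0.0'
git review                         # push to gerrit for review
```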

== Hotfix for an external library ==
At some point we will run into a bug or missing feature in a
Composer-managed library that we need to work around with a patch.
Obviously we
will attempt to upstream any such fixes (otherwise what's the point of
this whole exercise?). To avoid blocking our production cluster, we
would fork the upstream repository, apply our patch locally, and
submit the patch upstream. While the patch was pending upstream
review we would use our locally patched version in production and on
Jenkins.

Composer provides a solution for this with its "repository" package
source. The Composer documentation actually gives this exact example
in its discussion of the "vcs" repository type [6]. We would create
a gerrit repository tracking the external library, add our patch(es),
adjust the composer.json file in mediawiki/core/vendor.git to
reference our fork, and finally run Composer in
mediawiki/core/vendor.git to pull in our patched version.
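Following the pattern from the Composer docs [6], the composer.json in
mediawiki/core/vendor.git might gain something like this (the fork
URL, package name, and branch are hypothetical):

```
{
    "repositories": [
        {
            "type": "vcs",
            "url": "https://gerrit.wikimedia.org/r/mediawiki/forks/monolog"
        }
    ],
    "require": {
        "monolog/monolog": "dev-wmf-fixes"
    }
}
```

Because the fork keeps the upstream package name, Composer searches
the custom repository before Packagist and installs the patched
branch.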

== Adding a locally developed library ==
The Platform Core team has been talking about extracting libraries
from mw-core and/or extensions to be published externally. This may be
done for any and all of the current $IP/includes/libs classes and
possibly other content from core such as FormatJson.

My idea for this would be to create a new gerrit repository for each
exported project. The project repo would contain a composer.json
manifest describing the project so that it could be published at
packagist.org like most Composer-installable libraries. In the
mediawiki/core/vendor.git composer.json file we would pull these
libraries just like any third-party developed library. This isn't
functionally much different from the way that we use git submodules
today. There is one extra level of indirection when a library is
changed: mediawiki/core/vendor.git will have to be updated with
the new library version before the hash for the git submodule of
mediawiki/core/vendor.git is updated in a deploy or release branch.
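A minimal composer.json manifest for such an exported library might
look like this (the package name and namespace are hypothetical;
FormatJson is just the example mentioned above):

```
{
    "name": "mediawiki/format-json",
    "description": "JSON formatting utilities extracted from MediaWiki core",
    "license": "GPL-2.0+",
    "autoload": {
        "psr-4": { "MediaWiki\\FormatJson\\": "src/" }
    }
}
```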

== wmf/1.2XwmfY branches ==
The make-wmf-branch script (found in mediawiki/tools/release.git) is
used to create the weekly release branches that are deployed by the
"train" on each Thursday. This script would be updated to branch the
new mediawiki/core/vendor.git repository and add the
version-appropriate branch as a submodule of mediawiki/core.git on
the wmf/* branches. This is functionally exactly what we do for
extensions today.
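The submodule step might look roughly like this (the branch name and
clone URL are illustrative):

```
# in a fresh checkout of the new wmf/1.24wmf1 branch of mediawiki/core.git
git submodule add -b wmf/1.24wmf1 \
    https://gerrit.wikimedia.org/r/mediawiki/core/vendor.git vendor
git commit -m 'Add vendor submodule for wmf/1.24wmf1'
```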

== Updating a deployment branch ==
SWAT deploys often carry bug fixes for extensions and core that can't
wait for the next train release. It is a near certainty that
mediawiki/core/vendor.git will have the same need. The process for
updating mediawiki/core/vendor.git will be almost the same as updating
an extension.

* Follow the adding/updating library or hotfix instructions to get the
changes merged into the mediawiki/core/vendor.git master branch.
* Cherry-pick the change into the proper deployment branch
* Merge the cherry-pick
* Update the git submodule for mediawiki/core/vendor.git in the
appropriate deployed branch
* Pull update to tin
* sync-dir to deploy to cluster
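Sketched out (the change, branch, and directory names are
hypothetical):

```
# in a checkout of mediawiki/core.git on the deployed branch
cd vendor
git fetch origin
git checkout origin/wmf/1.24wmf1   # the cherry-picked, merged commit
cd ..
git add vendor
git commit -m 'Update vendor for <bug number>'
# then pull the update to tin and sync, e.g.:
#   sync-dir php-1.24wmf1/vendor 'Update vendor for <bug number>'
```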

== Security fixes ==
This is a special case of upstreaming a patch. A security patch would
be applied directly on the deployed branch of
mediawiki/core/vendor.git as we would do for any extension. The
vulnerability and patch must then be submitted upstream in a
responsible manner and tracked for resolution.

== Jenkins ==
The Jenkins jobs that check out and run tests involving
mediawiki/core would need to be amended to also check out
mediawiki/core/vendor.git in the appropriate location before running
tests.
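Conceptually the job setup gains one step (the exact mechanics will
depend on how the jobs are defined):

```
# after checking out mediawiki/core into the workspace
git clone https://gerrit.wikimedia.org/r/mediawiki/core/vendor.git vendor
```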


What use cases did I miss? What other concerns do we have for this process?

[0]: https://gerrit.wikimedia.org/r/#/c/119939/
[1]: https://gerrit.wikimedia.org/r/#/c/119940/
[2]: https://gerrit.wikimedia.org/r/#/c/119941/
[3]: https://gerrit.wikimedia.org/r/#/c/119942/
[4]: https://gerrit.wikimedia.org/r/#/c/135290/
[5]: https://gerrit.wikimedia.org/r/#/c/119939/7/libs/composer.json,unified
[6]: https://getcomposer.org/doc/05-repositories.md#vcs

Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer            Boise, ID USA
irc: bd808                                        v:415.839.6885 x6855
