[Wikitech-l] Re: 🤿🚂 Diving Into Wikimedia Deployment Data

2022-02-16 Thread Risker
Thank you very much for sharing this data, Tyler (and to the team that
researched and analysed it, as well).  I think it shows that the train has
been pretty successful in mitigating the issues it was intended to
address.

I note the data points that show there has been a significant and clear
trend toward fewer comments per patch.  This would be worth investigating
further. Is the total number of reviews pretty consistent, or is it
increasing or decreasing?  Is it possible that developers have become more
proficient at writing patches to standard, and thus fewer comments are
required?  Or could it be that, because more time is invested in writing
patches (assuming that more patches = more time writing them), there is
less time for review?

I've always found the train to be very interesting, and in fact mentioned
it when being interviewed for a recently published article (in a positive
way).  I'm pleased and perhaps a bit relieved to see that the research has
borne out my impression of how it has made such a big difference in the
deployment process.

Risker/Anne

On Wed, 16 Feb 2022 at 06:01, Tyler Cipriani 
wrote:

> *tl;dr:* We have open data on Wikimedia production deployments. Read Diving
> into Deployment Data
> 
>  to
> learn more (or read on, I guess).
> _
>
> If you’ve ever experienced the pride of seeing your name on MediaWiki's
> contributor list, you've been involved in our deployment process.
>
> This realization inspires questions – *we have *📈 *data to answer those
> questions!*
>
>- We wrote a blog (Research folks did the hard parts): ⚙️Phabricator:
>Diving Into Our Deployment Data
>
> 
>- The data is open: 🦊 GitLab train-stats
>
>- Play with the live data (if you'd rather dive into SQL): 💾
>data.releng.team 
>
> Thanks!
>
> Tyler Cipriani (he/him/his)
> Engineering Manager, Release Engineering
> Wikimedia Foundation
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: [Cloud] [Cloud-announce] [IMPORTANT] Announcing Toolforge Debian Stretch Grid Engine deprecation

2022-02-16 Thread Roy Smith
From my perspective as a Toolforge user, one of the issues I see is that it's 
often not clear how to map the "friendly command line interface" onto concepts 
I already understand about the lower-level tools.

For example, the webservice script does some useful stuff.  But it wasn't 
clear exactly what it was doing, i.e. there was a lot of magic happening.  
While the magic is certainly an integral part of hiding the low-level details, 
it also obfuscates things.  Reading the webservice script wasn't much help; 
it's long and complicated, and mixes grid and k8s functionality in a way that 
further hides what's actually going on.

Anyway, all I'm really asking is that as the docs get written for the "friendly 
command line interface", you also include some explanation of what's happening 
behind the scenes.  For example, maybe add a --verbose option to all the tools 
which makes them print all the back-end commands they're executing, so

> webservice --backend=kubernetes python3.7 restart

might print:

> kubectl exec -i -t shell-1645020371 --container main-app -- /bin/bash

And then somebody who already understands kubectl would instantly understand 
what's happening.  It's not hard to guess the gist of what it must be 
doing, but having the details spelled out eliminates any doubt and enhances 
comprehension.
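
To make that concrete, here's a rough sketch of how such a --verbose flag could 
echo the back-end command before running it.  This is purely illustrative on my 
part (the function name and the example invocation are my own assumptions, not 
the actual webservice code):

import shlex
import subprocess
import sys

def run_backend(argv, verbose=False):
    # Hypothetical helper, not the real webservice code: when verbose,
    # print the exact back-end command (kubectl, grid tools, ...) before
    # executing it, so users can map the friendly CLI onto tools they
    # already know.
    if verbose:
        print("+ " + shlex.join(argv), file=sys.stderr)
    return subprocess.run(argv, check=True)

# Illustration only (assumed command):
# run_backend(["kubectl", "get", "pods", "-o", "wide"], verbose=True)
# would first print "+ kubectl get pods -o wide" and then run it.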

As another example, it took me a little bit to figure out that the "become" 
command doesn't do anything more magic than run sudo with a little sanity 
checking wrapped around it.  Fortunately, that script is simple enough that 
once I looked at it, it was obvious what it was doing.  But other parts of the 
"friendly command line interface" are rather more opaque.


> On Feb 15, 2022, at 11:42 AM, Seyram Komla Sapaty  
> wrote:
> 
> One of the most prominent missing features on Kubernetes was a friendly
> command line interface to schedule jobs (like jsub). We've been working
> on that, and have a beta-level interface that you can try today: the
> Toolforge jobs framework [4].

___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: new extension PageProperties

2022-02-16 Thread Zoran Dori
Hi,
the extension itself looks good to me. There are some changes needed, but I
will make a patch later.

Meanwhile, I found that there are two branches: main and master.

As the default branch on Gerrit is master, I would like to propose merging
the main branch into the master branch.

Since I'm not able to do this myself, I would like to ask the Gerrit
admins to do it.

Best regards,
Zoran

On Tue, 15 Feb 2022 at 13:27, Thomas  wrote:

> Hello,
> I have recently published the extension PageProperties
>
> https://www.mediawiki.org/wiki/Extension:PageProperties
>
> It is mainly a page properties aggregator where users can
> set the display title, language and content model of a page in one place,
> plus define SEO metadata and
> Semantic MediaWiki properties (provided that SMW is installed)
> without annotating them manually on the page.
>
> It partly develops the concept of "Enterprise Mediawiki" mentioned here
> https://phabricator.wikimedia.org/T149612
>
> (together with other extensions, either in beta status or not yet
> published, that I plan to publish progressively.)
>
> Technically, it uses the JavaScript tabs used by PreferencesFormOOUI,
> plus other pieces taken from the DisplayTitle extension (mentioned within
> the code) and from the special pages SpecialPageLanguage and
> SpecialChangeContentModel, and of course original code.
>
>
> Kind regards
> (Thomas)
>
>
>
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] 🤿🚂 Diving Into Wikimedia Deployment Data

2022-02-16 Thread Tyler Cipriani
*tl;dr:* We have open data on Wikimedia production deployments. Read Diving
into Deployment Data

to
learn more (or read on, I guess).
_

If you’ve ever experienced the pride of seeing your name on MediaWiki's
contributor list, you've been involved in our deployment process.

This realization inspires questions – *we have *📈 *data to answer those
questions!*

   - We wrote a blog (Research folks did the hard parts): ⚙️Phabricator:
   Diving Into Our Deployment Data
   

   - The data is open: 🦊 GitLab train-stats
   
   - Play with the live data (if you'd rather dive into SQL): 💾
   data.releng.team 

Thanks!

Tyler Cipriani (he/him/his)
Engineering Manager, Release Engineering
Wikimedia Foundation
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/