[Wikitech-l] Re: 🚀 TK-999 receives Web Perf Hero award!

2024-01-17 Thread Máté Szabó via Wikitech-l
Dear Derick and team,

Thank you for the award—I am humbled by the recognition. This was an 
interesting rabbit hole to dive into and I was happy to see the optimisation 
deliver such tangible results after the initial investigation was concluded. 
Looking forward to seeing the next interesting Perf Hero story!

Best regards,
Máté Szabó (he/him/his)
Staff Software Engineer
msz...@fandom.com


> On 16 Jan 2024, at 09:45, Derick Alangi  wrote:
> 
> Hi,
> 
> I'm happy to share that the next Web Perf Hero award (for 2023 Q4) goes to 
> TK-999, in recognition of their contributions, which made a huge positive 
> impact on the performance of Wikimedia software.
> 
> TK-999 worked on improving MediaWiki’s autoloading mechanism 
> <https://gerrit.wikimedia.org/r/c/853972> 
> (https://phabricator.wikimedia.org/T274041) specifically for classes which 
> have been namespaced, by including them in the classmap. This means our 
> autoloader will not spend significant amounts of time searching for files in 
> directories, but rather just do a direct lookup. This approach is
> particularly optimized for speed and, as a result, the share of Wikipedia
> backend requests completing within 50ms grew by 20%.
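The classmap technique described above can be illustrated with a minimal sketch. This is not MediaWiki's actual autoloader code; the class name and file path below are hypothetical examples.

```php
<?php
// Minimal illustration of classmap autoloading: resolving a class becomes a
// single array lookup instead of probing directories on each cache miss.
// The class name and path below are hypothetical, for illustration only.
spl_autoload_register( static function ( string $class ): void {
    static $classMap = [
        'MediaWiki\\Foo\\Bar' => __DIR__ . '/includes/Foo/Bar.php',
    ];
    if ( isset( $classMap[$class] ) ) {
        require $classMap[$class];
    }
} );
```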
> 
> This award is given once a quarter, and manifests as a Phabricator badge. 
> More information and past recipients: 
> https://wikitech.wikimedia.org/wiki/Web_Perf_Hero_award
> 
> Phabricator badge: https://phabricator.wikimedia.org/badges/view/17/
> 
> 
> -- Derick Alangi, on behalf of the MediaWiki Platform Team
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/


Re: [Wikitech-l] Fatal exception when calling replaceVariables() from parser hook, after upgrade to MW 1.34

2021-04-08 Thread Máté Szabó
Hi Mark,

Yeah, newChild() expects its arguments as a preprocessor node instance rather
than as the raw arguments array itself; such a node can be obtained by calling
Preprocessor::newPartNodeArray() with the given set of arguments.
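As a rough sketch (assuming the MediaWiki 1.34+ Preprocessor_Hash API, with $frame, $vars, $text and $parser standing in for the hook's arguments):

```php
// Build a preprocessor node from a plain name => value argument array,
// then use it to create a child frame for variable expansion.
// $frame, $vars, $text and $parser are the parser hook's arguments.
$argNodes = $frame->preprocessor->newPartNodeArray( $vars );
$newFrame = $frame->newChild( $argNodes, $frame->title );
return $parser->replaceVariables( $text, $newFrame );
```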

For Parsoid-PHP and the potential upgrade work it requires, it is probably 
something that will need to be addressed eventually, but from what I 
understand, there’s plenty of time until then, so it should not be an immediate 
concern. :)

Best,
Máté Szabó 
Sr. Software Engineer
he - him - his

Fandom Poland sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych

> On 8 Apr 2021, at 02:44, Mark Clements (HappyDog)  
> wrote:
> 
> "Máté Szabó"  wrote in message 
> news:80e88bac-ae9b-42ae-a0ba-834a39a7a...@wikia-inc.com...
> 
> 
> Hi Máté,
> 
> I know it's been a while, but I've only now found some time to work on this 
> in any depth.
> 
>> The DOM-based wikitext preprocessor (Preprocessor_DOM class and friends)
>> was deprecated in MediaWiki 1.34 and removed in MediaWiki 1.35 as part of
>> the Wikimedia Parsing Team's work around Parsoid-PHP.[1]
> 
> I guess that explains why things changed in MW 1.34, specifically.
> 
>> In the short/medium term, the easiest fix to keep your code working would 
>> be to
>> use the other preprocessor implementation (class Preprocessor_Hash and 
>> friends)
>> instead.
> 
> I think this is what I have now done.  The solution I implemented was to 
> replace the following line:
> 
>$NewFrame = new PPTemplateFrame_DOM($Frame->preprocessor,
>$Frame, array(), $Vars, $Frame->title);
> 
> With this:
> 
>if (is_a($Frame, "PPFrame_Hash"))
>$TemplateFrameClass = "PPTemplateFrame_Hash";
>else
>$TemplateFrameClass = "PPTemplateFrame_DOM";
> 
>$NewFrame = new $TemplateFrameClass($Frame->preprocessor,
>$Frame, array(), $Vars, $Frame->title);
> 
> This seems to work on both MW versions I am testing on (1.29 and 1.34) and 
> fits-in with your explanation, above.
> 
>> Since your code already has access to a PPFrame instance,
>> you can also try invoking its newChild() method to construct a
>> new child frame with your arguments, instead of creating the
>> instance directly.
> 
> I couldn't get this to work.  I needed to pass additional arguments into the 
> constructor, but got an error if I passed in an array of string => string 
> pairs and there was no documentation about how to convert such an array into 
> a format that the function would accept, so I gave up on this approach.
> 
>> In the long term, the legacy wikitext preprocessor will be removed, so
>> you may want to reach out to the Parsing Team[2] to find out how you
>> can make your code ready for Parsoid-PHP.
> 
> Based on that comment, I suspect that further upgrade work will be required 
> in due course, but at least I have solved the immediate problem for now!
> 
> Thanks for your help,
> 
> - Mark Clements
> (HappyDog)
> 
> 
> 
> 


Re: [Wikitech-l] Fatal exception when calling replaceVariables() from parser hook, after upgrade to MW 1.34

2021-01-04 Thread Máté Szabó
Hi,

The DOM-based wikitext preprocessor (Preprocessor_DOM class and friends) was 
deprecated in MediaWiki 1.34 and removed in MediaWiki 1.35 as part of the 
Wikimedia Parsing Team’s work around Parsoid-PHP.[1]
In the short/medium term, the easiest fix to keep your code working would be to 
use the other preprocessor implementation (class Preprocessor_Hash and friends) 
instead.
Since your code already has access to a PPFrame instance, you can also try 
invoking its newChild() method to construct a new child frame with your 
arguments, instead of creating the instance directly.

In the long term, the legacy wikitext preprocessor will be removed, so you may 
want to reach out to the Parsing Team[2] to find out how you can make your code 
ready for Parsoid-PHP.

Hope this helps.

—
[1] https://phabricator.wikimedia.org/T204945
[2] https://www.mediawiki.org/wiki/Parsing

Best,
Máté Szabó 
Sr. Software Engineer
he - him - his

Fandom Poland sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych

> On 29 Dec 2020, at 12:48, Mark Clements (HappyDog)  
> wrote:
> 
> Hi there,
> 
> I am investigating a breakage in my extension that has occurred in MW 1.34 
> but which didn't seem to be a problem on MW 1.29.  (I have not tested 
> interim versions to see where the issue first arises.)
> 
> One of the parser hooks in the extension needs to perform variable 
> expansion.  What is happening is a lot more complicated than this example, 
> but effectively
> 
>   {{{Foo}}}
> 
> should end up generating the following output, using variable expansion:
> 
>What the foo!
> 
> The semantics of variable handling need to follow the MW semantics, 
> including default values (possibly nested), parser functions, etc. therefore 
> it needs to use the MW parser to perform the expansion.
> 
> Assuming the arguments that MW passes into the parser hook are named $Text, 
> $Vars, $Parser and $Frame, the relevant code looks something like this 
> (again, a bit more complicated in practice):
> 
>$NewFrame = new PPTemplateFrame_DOM($Frame->preprocessor, $Frame,
>array(), $Vars, $Frame->title);
>return $Parser->replaceVariables($Text, $NewFrame);
> 
> 
> (I have included a more detailed listing of the code that I am using for 
> doing the parse at the end of this message.)
> 
> My code was working fine on MW 1.29 and earlier, but when I upgrade to 1.34 
> I am finding that I get a fatal exception thrown when my tag is encountered:
> 
>/index.php?title=Main_Page MWException
>from line 373 of ~\includes\parser\PPFrame_DOM.php:
>PPFrame_DOM::expand: Invalid parameter type
> 
> I have generated a backtrace and the top of the stack is as follows:
> 
> #0 ~\includes\parser\Parser.php(3330): PPFrame_DOM->expand(PPNode_Hash_Tree, 
> integer)
> #1 MyExtension.php (434): Parser->replaceVariables(string, 
> PPTemplateFrame_DOM)
> #2 ~\includes\parser\Parser.php(4293): MyExtensionParserHook(string, array, 
> Parser, PPTemplateFrame_Hash)
> 
> (The subsequent call stack entries are the parent functions you would expect 
> to see in that situation.)
> 
> Can anyone see why the above code would no longer work as it did on previous 
> versions?  What is the current recommended method for manually expanding 
> template variables from within a parser hook?
> 
> Kind regards,
> 
> - Mark Clements (HappyDog)
> 
> --
> Full example (with extension-specific code omitted):
> --
> 
>function MyExtensionParserHook($Text, $Vars, $Parser, $Frame) {
> 
>// 1) Manipulate $Text and $Vars
> 
>// (omitted)
> 
>// 2) Expand variables in the resulting text.
> 
>// Set up a new frame which mirrors the existing one but which also has 
> the
>// field values as arguments.
>// If we are already in a template frame, merge the field arguments with 
> the
>// existing template arguments first.
>if ($Frame instanceof PPTemplateFrame_DOM) {
>$NumberedArgs = $Frame->numberedArgs;
>$NamedArgs = array_merge($Frame->namedArgs, $Vars);
>}
>else {
>$NumberedArgs = array();
>$NamedArgs = $Vars;
>}
>$NewFrame = new PPTemplateFrame_DOM($Frame->preprocessor, $Frame,
>$NumberedArgs, $NamedArgs,
>$Frame->title);
> 
>// Perform a recursive parse on the input, using our newly 

Re: [Wikitech-l] SelectQueryBuilder

2020-05-22 Thread Máté Szabó
Hey Tim,

I really like this new feature—thanks for making it happen! It reminds me of 
the fluent SQL builder[1] that Owen presented at the 2014 Architecture 
Summit.[2] While integration with that library was ultimately rejected, I 
wonder if there are any useful concepts or features that could be cherry-picked 
from there to enhance the functionality of MediaWiki’s new query builder.

What do you think?

—
[1] https://github.com/Wikia/fluent-sql-php
[2] https://www.mediawiki.org/wiki/Architecture_Summit_2014/SQL_abstraction

Cheers,
Máté Szabó 
SOFTWARE ENGINEER
he - him - his

Fandom Poland sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych

> On 22 May 2020, at 03:37, Tim Starling  wrote:
> 
> SelectQueryBuilder is a new fluent interface for constructing database
> queries, which has been merged to master for release in MediaWiki
> 1.35. Please consider using it in new code.
> 
> SELECT page_id FROM page
> WHERE page_namespace=$namespace AND page_title=$title
> 
> becomes
> 
> $id = $db->newSelectQueryBuilder()
>   ->select( 'page_id' )
>   ->from( 'page' )
>   ->where( [
>  'page_namespace' => $namespace,
>  'page_title' => $title,
>   ] )
>   ->fetchField();
> 
> As explained on the design task T243051, SelectQueryBuilder was
> loosely based on the query builder in Doctrine, but I made an effort
> to respect existing MediaWiki conventions, to make migration easy.
> 
> SelectQueryBuilder is easy to use for simple cases, but has the most
> impact on readability when it is used for complex queries. That's why
> I chose to migrate the showIndirectLinks query in
> Special:WhatLinksHere as a pilot -- it was one of the gnarliest
> queries in core.
> 
> SelectQueryBuilder excels at building joins, including parenthesized
> (nested) joins and joins on subqueries.
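For instance, a join might be written along these lines. This is an illustrative sketch only; the tables, fields, and conditions are hypothetical and not from the original message.

```php
// Illustrative sketch of a join with SelectQueryBuilder (MediaWiki 1.35+);
// the tables, fields, and conditions here are hypothetical examples.
$titles = $db->newSelectQueryBuilder()
    ->select( 'page_title' )
    ->from( 'pagelinks' )
    ->join( 'page', null, 'page_id = pl_from' )
    ->where( [ 'page_namespace' => 0 ] )
    ->caller( __METHOD__ )
    ->fetchFieldValues();
```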
> 
> SelectQueryBuilder can be used as a structured alternative to the
> "query info" pattern, in which the parameters to Database::select()
> are stored in an associative array. It can convert to and from such
> arrays. As a pilot of this functionality, I converted ApiQueryBase to
> use a SelectQueryBuilder to store accumulated query info.
> 
> Check it out!
> 
> -- Tim Starling
> 
> 

Re: [Wikitech-l] Running JS for parser function preview in Visual Editor

2020-04-05 Thread Máté Szabó
Hi Jeroen,

I think you need to register a custom DM (DataModel)[1] and CE 
(ContentEditable)[2] node implementation for this extension tag inside 
VisualEditor to control their rendering and how the user can interact with 
them. Unfortunately, I am not sure if there is documentation (apart from the 
generated JSDoc) for these APIs, so I tend to refer to usage patterns in 
existing extensions when working with them. The Graph extension is an example 
of an extension that leverages these APIs to customise the behaviour of an 
extension tag that relies on JS libraries inside the VE.[3]

Hope this helps!

—
[1] https://doc.wikimedia.org/VisualEditor/master/#!/api/ve.dm.MWExtensionNode
[2] https://doc.wikimedia.org/VisualEditor/master/#!/api/ve.ce.MWExtensionNode
[3] 
https://github.com/wikimedia/mediawiki-extensions-Graph/tree/master/modules/ve-graph

Cheers,
Máté Szabó 
SOFTWARE ENGINEER

Fandom Poland sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych

> On 4 Apr 2020, at 23:46, Jeroen De Dauw  wrote:
> 
> Hey,
> 
> I am maintainer of the Maps extension for MediaWiki.
> https://github.com/JeroenDeDauw/Maps#maps
> 
> Recently I worked on improving integration with Visual Editor. Maps used to
> not show up as a grey box [0] since their initialization JavaScript was not
> being run. I got things working but am not happy with the solution and hope
> there is a better approach.
> 
> After some searching I found that if I ran the initialization code in a
> handler to the ve.activationComplete hook, the maps would get initialized
> as desired [1]. That is not a complete solution though, since you can edit
> the maps with Visual Editor [2]. This causes an API request that parses the
> new wikitext to HTML, which then replaces the old HTML of the initialized
> map. So initialization needs to happen again.
> 
> An hour or two searching through the docs and Visual Editor code did not
> yield any usable hook/event. (This was quite surprising to me. Has no one
> ever done something similar to what I am doing here?) So I settled on just
> running initialization once every second with setInterval(). I found the
> ve.deactivationComplete hook which I then used to stop the code running
> every second on saving and exiting the visual editor. It turns out that
> reopening the visual editor afterwards does not fire a new
> ve.activationComplete event, so I had to remove the termination of the
> repeating initialization.
> 
> Surely there is a better way to run my code once per second (ad infinitum)
> starting with Visual Editor activation? The perfect event for my usecase
> would be "visual editor rendered a parser function". A broader event (ie
> "something got rendered") would still be better than nothing.
> 
> You can see my current code here [3], including a few commented out bits
> which I left in so you can see some of what did not work.
> 
> [0]
> https://user-images.githubusercontent.com/146040/78461765-18d15780-76cc-11ea-82cd-eb69d0179fd7.png
> [1]
> https://user-images.githubusercontent.com/146040/78461769-21299280-76cc-11ea-9461-a2c343062482.png
> [2]
> https://user-images.githubusercontent.com/146040/78461779-369ebc80-76cc-11ea-9495-4a91a24a.png
> [3]
> https://github.com/JeroenDeDauw/Maps/blob/7.17.1/resources/leaflet/LeafletLoader.js#L24-L45
> 
> Cheers
> 
> --
> Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
> Professional wiki hosting and services: www.Professional.Wiki
> <https://Professional.Wiki>
> Entrepreneur | Software Crafter | Open Source | Wikimedia | Speaker
> ~=[,,_,,]:3

Re: [Wikitech-l] How to invalidate a ResourceLoader URL that depends on user-generated content?

2020-02-13 Thread Máté Szabó
Hey Roan,

Thank you for your response!

I did stumble upon ContentOverrideCallback. Unfortunately, it doesn’t seem to 
be an exact fit for this use case.
I think I will opt for manual URL purges to ensure timely updates in this
scenario.

Cheers,

Máté Szabó 
SOFTWARE ENGINEER

Fandom Poland sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych



[Wikitech-l] How to invalidate a ResourceLoader URL that depends on user-generated content?

2020-02-12 Thread Máté Szabó
Hi all,

I am working on a custom ResourceLoader module whose style contents are 
compiled from Sass stylesheets. These Sass stylesheets take as input 
user-defined theme variables that affect the output.
I am looking for a way to invalidate ResourceLoader URLs that include such 
style modules, when these variables change as a result of a user action,
primarily so that the user can see their own changes.

ResourceLoader will cache such style URLs for 5 minutes on the CDN and in the 
user’s browser cache, so by default the user would have to wait until their 
cached copy of the styles expires before receiving the updated version.
Accordingly, I explored whether following the documented steps to empty
one’s browser cache[1] after making changes would cause the browser to fetch 
the updated resource from the backend. Unfortunately, both Chrome and Firefox 
appear to send a “Cache-Control: no-cache” request header for the style URL if 
the page is hard refreshed, which seems to cause most CDNs (including WMF’s) to 
act as the source of truth and serve the asset from cache if it is already 
there, without triggering a backend fetch or conditional revalidation. In the 
case of a simple refresh, Chromium and derivatives will not trigger a 
conditional revalidation of resources including on a page since 2017,[2] so 
they just continue to use the asset that was already in the browser’s cache. 
Thus, the standard instructions for emptying one’s cache do not seem to be 
sufficient to allow the user to see the effect of their styling changes.

Given the above, the only option I see that would allow the user to see the 
updated styles after modifying input variables would be to append a version 
hash to the style URL included in the page HTML—something which ResourceLoader 
does not currently do.
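One hedged sketch of that idea, assuming the module can compute a hash of its compiled output ($compiledCss and $baseStyleUrl are hypothetical names introduced here for illustration):

```php
// Append a short content hash to the style URL emitted in the page HTML, so
// that any change to the compiled styles yields a new, uncached URL.
// wfAppendQuery() is MediaWiki's helper for adding query parameters to a URL;
// $baseStyleUrl and $compiledCss are hypothetical inputs.
$styleUrl = wfAppendQuery( $baseStyleUrl, [
    'version' => substr( hash( 'sha256', $compiledCss ), 0, 8 ),
] );
```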

Do you know of a better approach to this problem? Are there any examples in 
MediaWiki core or extensions that accomplish something similar? I have 
consulted the code for ResourceLoaderSiteStylesModule, but it does not seem to 
have any specific invalidation behaviour, nor does it use a version hash in 
style URLs in the page HTML, so it seems that any updates to it take 5 minutes 
to propagate.

Thanks in advance!

—
[1] https://en.wikipedia.org/wiki/Wikipedia:Bypass_your_cache
[2] https://bugs.chromium.org/p/chromium/issues/detail?id=505048

Cheers,

Máté Szabó 
SOFTWARE ENGINEER
+36 30 947 5903

Fandom Poland sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych



Re: [Wikitech-l] TechCom Radar 2019-09-18

2019-09-20 Thread Máté Szabó
Hi Daniel,

Thank you for the update. I was wondering, how does the new Front-end 
Architecture Working Group relate to the existing Front-end Standards Group?[1] 
Does it replace the previous group or do they operate independently?

—
[1] https://phabricator.wikimedia.org/project/view/1616/

Cheers,
Máté Szabó 
SOFTWARE ENGINEER
+36 30 947 5903

WIKIA sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych


> On 20 Sep 2019, at 13:20, Daniel Kinzler  wrote:
> 
> Hi All,
> 
> Here are the minutes from this week's TechCom meeting:
> 
> 
> * RFC Last Call until 2019-10-02: Drop IE6 and IE7 basic compatibility and
> security support <https://phabricator.wikimedia.org/T232563>. This means we 
> can
> remove code that works around issues with IE6 and IE7. If no concerns remain
> unaddressed by October 2nd, the RFC will be approved as proposed.
> 
> * Discussed: Kickoff of the Front-end Architecture Working Group (FAWG)
> <https://www.mediawiki.org/wiki/Front_End_Architecture_Working_Group>. The
> purpose of the Front-end Architecture Working Group is to propose an 
> architecture
> for a more modern front-end to MediaWiki, enabling a richer user experience.
> Efforts to modernize the user interface have often struggled with limitations
> imposed by MediaWiki core and the overall system architecture. FAWG is
> an effort by the Product and Technology departments to develop a mid-term plan
> to overcome these limitations.
> 
> * Discussed: Section headings with clickable anchor
> <https://phabricator.wikimedia.org/T18691>. This is about a "link to this
> section" feature pegged for deployment on WMF wikis. The RFC is to make sure
> that there are no performance issues with this change.
> 
> * IRC discussion next week: Core REST API namespace and version
> <https://phabricator.wikimedia.org/T232485>. This is about versioning 
> strategies
> for public APIs. As always, the meeting will take place on Freenode in the
> #wikimedia-office channel, at 21:00 UTC (2pm PDT, 23:00 CEST).
> 
> 
> You can also find our meeting minutes at
> <https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes>
> 
> See also the TechCom RFC board
> <https://phabricator.wikimedia.org/tag/mediawiki-rfcs/>.
> 
> If you prefer you can subscribe to our newsletter here
> <https://www.mediawiki.org/wiki/Newsletter:TechCom_Radar>
> 
> Cheers,
> Daniel
> 

Re: [Wikitech-l] Declaring methods final in classes

2019-08-27 Thread Máté Szabó
Hey,

My understanding is that mocking final methods and types is a limitation that 
is not specific to just PHPUnit, or indeed to PHP itself. Mockery, another 
established PHP mock object framework, relies on a workaround for mocking final 
methods and types that prevents testing code with type constraints or 
instanceof checks[1]. On the JVM, the popular mocking framework Mockito 
similarly has to rely on instrumentation to provide support for this use 
case[2].

Personally, I do not have enough context to judge whether there is added value 
in declaring those methods in e.g. WANObjectCache as final. It may have been 
intended to explicitly prevent subclassing and overriding by extension code, 
although this is just a guess. A question that could be posed is whether these 
benefits are worth the tradeoff in terms of reduced testability.
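The interface-extraction workaround mentioned in the quoted message below could look roughly like this. This is a sketch with hypothetical names, not actual MediaWiki code.

```php
// Type-hint against an interface so tests can mock it, while the concrete
// class keeps its methods final. All names here are hypothetical.
interface ObjectCacheInterface {
    public function get( string $key );
}

class FinalisedCache implements ObjectCacheInterface {
    final public function get( string $key ) {
        // ... real lookup logic would go here ...
        return null;
    }
}

// Consumers depend on the interface; PHPUnit can mock interfaces freely.
function readThrough( ObjectCacheInterface $cache, string $key ) {
    return $cache->get( $key );
}
```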

What do you think?

——
[1] http://docs.mockery.io/en/latest/reference/final_methods_classes.html
[2] 
https://static.javadoc.io/org.mockito/mockito-core/3.0.0/org/mockito/Mockito.html#39

Máté Szabó 
SOFTWARE ENGINEER
+36 30 947 5903

WIKIA sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych


> On 27 Aug 2019, at 20:56, Daimona  wrote:
> 
> Personally, I don't like these limitations in PHPUnit and the like. IMHO,
> they should never be a reason for changing good code. And sometimes,
> methods have to be final. I don't know about the specific case, though.
> 
> Anyway, some time ago I came across [1], which allows mocking final methods
> and classes. IIRC, it does that by removing the `final` keywords from the
> tokenized PHP code. I don't know how well it works, nor if it could degrade
> performance, but if it doesn't we could bring it in via composer.
> 
> 
> [1] - https://github.com/dg/bypass-finals
> 
> 
> Il giorno martedì 27 agosto 2019, Aryeh Gregor  ha scritto:
> 
>> I see that in some classes, like WANObjectCache, most methods are declared
>> final. Why is this? Is it an attempt to optimize?
>> 
>> The problem is that PHPUnit mocks can't touch final methods. Any ->method()
>> calls that try to do anything to them silently do nothing. This makes
>> writing tests harder.
>> 
>> If we really want these methods to be marked final for some reason, the
>> workaround for PHP is to make an interface that has all the desired
>> methods, have the class implement the interface, and make type hints all
>> refer to the interface instead of the class. But if there's no good reason
>> to declare the methods final to begin with, it's simplest to just drop it.
> 
> 
> 
> -- 
> https://meta.wikimedia.org/wiki/User:Daimona_Eaytoy
> "Daimona" is not my real name -- he/him

Re: [Wikitech-l] Dealing with composer dependencies in early MediaWiki initialization

2019-06-26 Thread Máté Szabó
Hey,

Looking at Setup.php, it seems to include the relevant items in the following 
order:
- DefaultSettings.php
- Composer autoloader
- LocalSettings.php or config callback

Could this allow us to initialise $wgServer in Setup.php, right after the 
Composer autoloader is included? It seems to me this would not break custom 
LocalSettings files that expect it to be set, as LocalSettings would not yet be 
included at that point. What do you think?
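A sketch of the proposed ordering (illustrative pseudocode against Setup.php, not an actual patch; the real file includes more steps between these):

```php
// Proposed ordering inside Setup.php (sketch): load the Composer autoloader
// before computing $wgServer, so WebRequest::detectServer() can use the
// librarised IP class; LocalSettings.php still sees $wgServer already set.
require_once "$IP/includes/DefaultSettings.php"; // leaves $wgServer = false
require_once "$IP/vendor/autoload.php";          // Composer autoloader
if ( $wgServer === false ) {
    $wgServer = WebRequest::detectServer();
}
require_once "$IP/LocalSettings.php";            // or the config callback
```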

Best

Máté Szabó 
SOFTWARE ENGINEER
+36 30 947 5903

WIKIA sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych


> On 26 Jun 2019, at 04:21, Kunal Mehta  wrote:
> 
> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA512
> 
> Hi,
> 
> I (with Reedy's help) recently started work on librarizing MediaWiki's
> IP class into a separate composer package (wikimedia/ip-utils[1]). The
> main motivation was so that the Parsoid PHP port could use it[2].
> 
> However, I ran into an unexpected hitch[3], as it seems we're using
> the IP class before the composer autoloader is even intialized. Here's
> the basic initialization in Setup.php:
> 
> - - AutoLoader.php (MediaWiki's)
> - - Defines.php
> - - DefaultSettings.php
>  - $wgServer = WebRequest::detectServer()
>- Calls IP::splitHostAndPort()
> - - GlobalFunctions.php
> - - vendor/autoload.php (composer's)
> 
> My understanding is that composer's autoloader runs late so extensions
> registering themselves using it can add their stuff to the necessary
> globals.
> 
> And we call WebRequest::detectServer() in DefaultSettings.php so that
> in LocalSettings.php people can use the value of $wgServer for other
> stuff.
> 
> I see 3 main ways to move forward:
> 
> 1. Move vendor/autoload.php earlier in Setup.php, potentially breaking
> extensions that still rely on composer autoloading for initialization.
> 2. Set $wgServer = false or something in DefaultSettings.php, and then
> fill it in later in Setup.php *after* the composer autoloader has been
> loaded, potentially breaking anyone relying on the value of $wgServer
> in LocalSettings.php.
> 3. (status quo) not librarize code that runs before composer
> autoloader initialization. :(
> 
> Advice/input welcome.
> 
> [1] https://packagist.org/packages/wikimedia/ip-utils
> [2]
> https://gerrit.wikimedia.org/g/mediawiki/services/parsoid/+/77064cfff717
> 6493a2828bb4f95f397dfce7d659/src/Utils/Title.php#46
> [3] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/519089/
> 
> - -- Legoktm
> -BEGIN PGP SIGNATURE-
> 
> iQIzBAEBCgAdFiEE2MtZ8F27ngU4xIGd8QX4EBsFJpsFAl0S1oQACgkQ8QX4EBsF
> Jpufrg/+J9RUUxRAtgJLEkyACE6GREis0eyEIZnWmMr3s9YpFPoqtWocFrUk6Wsn
> W7d9Oda/8CW0/d894gGMn8LWIj9oWq2gMPWzCVFpg8uu3r4967qxBp+ba29uMOJw
> Qpw6DhXtPvVAeUCy8P38Y5vM7TGmV+J1T5jDY21zimT1dRrJsI1KD+u/Ue3nYy/y
> B1ic3i7vJfhYErdhHgN98ETXfXOaDx4rgd2N7PLjVNx3IYCC8LNiR8wSLuydfdbk
> PLTT1bA2qi0h2wgcEr7Qtq9YstVotq8899rgKLtGDBwQi3qGNcdOgQGEMFDVfjfO
> CsiWocj6s4oc3ScVj+Eb9xtvIqhNx+oRbWE1vKd4TmtSdyzpv6xadV60tq5qNFEY
> I0cBDOWU5UFNHbvbyjK4dqIDEVhJ6LiEgLVBOj81U27s8mR4Dv/yFB3eac0ROk7p
> gaEeOjfhtVU558XfpEsmu1H05VJT3kXNxK8y0UQOjy11SErzsXv6vDzyzLDJM/W7
> WF0I4nyjeqVsBjLBN9li+5AnU3cAKVOCfZ+/aRYyg89Du//nJRjm+4lxnuPrGlaG
> ES/nVUnkDZ9Yc/xA1yacm3Ytx9hpoY1mIZgxxxveyeU1KsNXAZ2BOGA2T7kU4yUw
> Uyg+byYwI+1uVOjAVd3BInGV2R2/GmeIn9FOpthBaw8wcz0Y/8c=
> =tU4+
> -END PGP SIGNATURE-
> 

[Wikitech-l] Separating MediaWiki core unit and integration tests — help welcome!

2019-06-02 Thread Máté Szabó
Hey all,

The idea of separating PHPUnit unit and integration/system tests in MediaWiki 
core has been around for some time[1][2][3]. Currently, the tests assume the 
presence of valid MediaWiki settings and a database connection, meaning one 
must install & configure MediaWiki and an RDBMS in their local development 
environment to be able to run the tests. The fact we use a non-standard entry 
point (phpunit.php) also makes these tests incompatible with existing tooling 
such as IDE integrations.[4]

At the 2019 hackathon I worked with Amir Sarabadani, Michael Große and Kosta 
Harlan to perform some preliminary investigation into separating unit tests 
(that can be run without a database and MediaWiki configuration) into a 
separate PHPUnit configuration that could be run via the official phpunit 
binary. After some additional work, this has evolved into a patch that 
separates 5301 unit tests into a dedicated suite that can be executed via 
vendor/bin/phpunit in 15 seconds (on my machine!). By contrast, running the 
same 5301 tests via phpunit.php takes around 30 seconds.[5] Not using 
phpunit.php here also allows for integrating with e.g. Intellij/PHPStorm’s test 
execution and code coverage functionality—here’s a screenshot of the execution 
and coverage information of PathRouterTest via IntelliJ.[6] I feel that these 
benefits would be felt both by developers and CI maintainers—developers could 
iterate more rapidly by running the unit tests, and Jenkins would have to spend 
less time executing the test suite.
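With a separate configuration in place, running just the unit suite could look like this (illustrative commands; the exact configuration path depends on the final form of the patch):

```shell
# Run only the database-free unit tests via the standard PHPUnit entry point.
# The configuration path below is illustrative and may differ in the patch.
vendor/bin/phpunit -c tests/phpunit/unit-tests.xml

# By contrast, the legacy entry point requires a configured wiki and database:
php tests/phpunit/phpunit.php
```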

I’d like to thank everyone who has supported this enterprise so far—Amir 
(Ladsgroup) for creating a script to identify the initial set of tests that do 
not rely on a database[7][8] and providing assistance throughout, Kosta Harlan 
for highlighting this old and forgotten issue and bringing it to the hackathon, 
Bartosz Dziewoński (MatmaRex) for providing a solution to scoping issues around 
MediaWiki core files,[9] Michael Große for demonstrating the feasibility of 
this approach in the Wikibase extension test suite,[10] and James Forrester for 
reviewing the changes and outlining possible next steps.

However, the work is far from done yet! The patch is not yet merged, so any 
reviews, comments and suggestions would be very welcome there! And if it does 
get merged, we will have to think about how to bring this separation to 
extensions’ test suites as well as potentially port more core tests to the unit 
test suite. So if you have any ideas on how to improve this patch or would like 
to add to the next steps, don’t hesitate to leave a note :)

[1] https://phabricator.wikimedia.org/T89432
[2] https://phabricator.wikimedia.org/T87781
[3] https://phabricator.wikimedia.org/T84948
[4] 
https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Running_the_unit_tests
[5] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/513106
[6] https://phabricator.wikimedia.org/F29316124
[7] https://phabricator.wikimedia.org/T87781#5194643
[8] https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/510226/
[9] 
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/513106#message-32b2a171a7c86c35a572c4b14342030486659ab5
[10] https://gerrit.wikimedia.org/r/511035

Yours,
---
Máté Szabó 
Software Engineer
+36 30 947 5903

WIKIA sp. z o.o. z siedzibą w Poznaniu, ul. Abp. A. Baraniaka 6
Sąd Rejonowy Poznań – Nowe Miasto i Wilda w Poznaniu, VIII Wydział Gospodarczy 
Krajowego Rejestru Sądowego, KRS 254365
NIP: 5252358778
Kapitał zakładowy: 50.000,00 złotych


