Michael Dale wrote:
> That is part of the idea of centrally hosting reusable client-side 
> components so we control the jQuery version and plugin set. So a
> new version won't "come along" until it's been tested and
> integrated.

You can't host every client-side component in the world in a
subdirectory of the MediaWiki core. Not everyone has commit access to
it. Nobody can hope to properly test every MediaWiki extension.

Most extension developers write an extension for a particular site,
and distribute their code as-is for the benefit of other users. They
have no interest in integration with the core. If they find some
jQuery plugin on the web that defines an interface that conflicts with
MediaWiki, say jQuery.load() but with different parameters, they're
not going to be impressed when you tell them that to make it work with
MediaWiki, they need to rewrite the plugin and get it tested and
integrated.

Different modules should have separate namespaces. This is a key
property of large, maintainable systems of code.

> The nice thing about the way it's working right now is you can just
> turn off the script-loader and the system continues to work ... you
> can build a page that includes the JS and it "works"

The current system kind of works. It's not efficient or scalable and
it doesn't have many features.

> Having an export mode, scripts doing transformations, and dependency
> management output sounds complicated. I can imagine it ~sort of~
> working... but it seems much easier to go the other way around.

Sometimes complexity is necessary in the course of achieving other
goals, such as performance, features, and ease of use for extension
developers.

> I agree that the present system of parsing the top of the JavaScript
> file on every script-loader generation request is un-optimized. (The
> idea is that those script-loader generation calls happen rarely, but
> even still it should be cached at any number of levels, i.e. checking
> the file modification timestamp, writing out a PHP or serialized
> file, or storing it in any of the other cache levels we have
> available: memcached, database, etc.)

Actually it parses the whole of the JavaScript file, not just the top,
and it does so on every request that invokes WebStart.php, not just on
mwScriptLoader.php requests. I'm talking about
jsAutoloadLocalClasses.php, if that's not clear.
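
For what it's worth, the mtime-based caching you describe would look
roughly like this (just a sketch; parseJsClassList() and the cache
file layout are invented names for illustration, not anything in the
tree):

  $mtime = filemtime( $jsFile );
  $cached = @unserialize( @file_get_contents( $cacheFile ) );
  if ( !is_array( $cached ) || $cached['mtime'] !== $mtime ) {
      // Re-parse only when the JS file has actually changed
      $cached = array(
          'mtime'   => $mtime,
          'classes' => parseJsClassList( file_get_contents( $jsFile ) ),
      );
      file_put_contents( $cacheFile, serialize( $cached ) );
  }
  $classes = $cached['classes'];

That would confine the parsing cost to one request after each change,
though PHP would still be in the path of every page view.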

>> Have you looked at the profiling? On the Wikimedia app servers,
>> even the simplest MW request takes 23ms, and gen=js takes 46ms. A
>> static file like wikibits.js takes around 0.5ms. And that's with
>> APC. You say MW on small sites is OK, I think it's slow and
>> resource-intensive.
>> 
>> That's not to say I'm sold on the idea of a static file cache, it
>>  brings its own problems, which I listed.
>> 
> 
> Yea... but almost all script-loader requests will be cached. It does
> not need to check the DB or anything; it's just a key-file lookup
> (since script-loader requests pass a request key, either it's there
> in cache or it's not). It should be on par with the simplest MW
> request, which is substantially shorter than the round-trip time for
> getting each script individually, not to mention gzipping, which
> can't otherwise be easily enabled for 3rd-party installations.

I don't think that comparison can be made so lightly. For the server
operator, CPU time is much more expensive than time spent waiting for
the network. And I'm not proposing that the client fetches each script
individually; I'm proposing that scripts be concatenated and stored in
a cache file which is then referenced directly in the HTML.
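
In rough terms, something like this (only a sketch; the
$wgScriptCacheDirectory and $wgScriptCacheUrl settings are invented
names for the sake of the example):

  // Write the concatenated scripts out once, keyed on the source
  // files' modification times, then let the web server serve the
  // result as a plain static file with no PHP involved.
  $key = md5( implode( '|', array_map( 'filemtime', $scriptFiles ) ) );
  $cachePath = "$wgScriptCacheDirectory/$key.js";
  if ( !file_exists( $cachePath ) ) {
      $js = '';
      foreach ( $scriptFiles as $file ) {
          $js .= file_get_contents( $file ) . "\n";
      }
      file_put_contents( $cachePath, $js );
  }
  // The generated HTML then references the static file directly:
  $scriptTag = "<script type=\"text/javascript\" " .
      "src=\"$wgScriptCacheUrl/$key.js\"></script>";

After the first hit, requests for that combination of scripts never
touch PHP at all; the web server just serves the file.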

I'm aware of the gzip issue, I mentioned it in my original post.

> ...right... we would want to avoid lots of live hacks. But I think
> we want to avoid lots of live hacks anyway. A serious JavaScript bug
> would only affect the pages that were generated in those hours that
> the bug was present, not the 30 days that you're characterizing as
> the lag time of page generation.

Bugs don't only come from live hacks. Most bugs come to the site from
the developers who wrote the code in the first place, via subversion.

> Do you have stats on that?... it's surprising to me that pages are
> re-generated that rarely... How do central notice campaigns work?

$wgSquidMaxage is set to 31 days (2678400 seconds) for all wikis
except wikimediafoundation.org. It's necessary to have a very long
expiry time in order to fill the caches and achieve a high hit rate,
because Wikimedia's access pattern is very broad, with the "long tail"
dominating the request rate.
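
In configuration terms it's just the one setting, with the value
quoted above:

  $wgSquidMaxage = 2678400; // 31 days, all wikis except wikimediafoundation.org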

The CentralNotice extension was created to overcome this problem and
display short-lived messages. Aryeh described how it works.

-- Tim Starling

