Re: [Wikitech-l] wikipedia is one of the slower sites on the web
2010/7/30 Daniel Friesen <li...@nadir-seen-fire.com>:
> That's pretty much the purpose of the caching servers.

Yes, but I presume that a big advantage could come from having a simplified, unique, JS-free version of the pages online, completely devoid of user preferences, to avoid any need to parse them again when they are requested by different users with different preference profiles. Nevertheless, I say again: it's only a complete layman's idea.

-- Alex

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] wikipedia is one of the slower sites on the web
On Fri, Jul 30, 2010 at 6:23 AM, Aryeh Gregor <simetrical+wikil...@gmail.com> wrote:
> On Thu, Jul 29, 2010 at 4:07 PM, Strainu <strain...@gmail.com> wrote:
>> Could you please elaborate on that? Thanks.
>
> When pages are parsed, the parsed version is cached, since parsing can take a long time (sometimes 10 s). Some preferences change how pages are parsed, so different copies need to be stored based on those preferences. If these settings are all default for you, you'll be using the same parser cache copies as anonymous users, so you're extremely likely to get a parser cache hit. If any of them is non-default, you'll only get a parser cache hit if someone with your exact parser-related preferences viewed the page since it was last changed; otherwise it will have to be reparsed just for you, which will take a long time.

This is probably a bad thing. Could we add a logged-in-reader mode, for people who are infrequent contributors but wish to be logged in for the preferences? They could be served a slightly old cached version of the page when one is available for their preferences, e.g. if the cached version is less than a minute old. The downside is that if they see an error, it may already be fixed. On the other hand, if the page is being revised frequently, the same is likely to happen anyway: the text could be stale before it hits the wire due to parsing delay.

For pending changes, the preference 'Always show the latest accepted revision (if there is one) of a page by default' could be enabled by default. Was there any discussion about the default setting for this pref?

--
John Vandenberg
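The "serve a slightly stale copy" idea above can be sketched in a few lines. This is a minimal, self-contained illustration, not MediaWiki code: the function name, the cache-entry shape, and the one-minute threshold are all assumptions taken from the proposal.

```php
<?php
// Sketch of the "logged-in-reader" idea: serve a cached rendering if it
// is recent enough, otherwise reparse and refresh the cache. All names
// here are illustrative; this is not actual MediaWiki code.
function getPageHtml( array &$cache, string $key, int $maxStaleness, callable $reparse ): string {
    if ( isset( $cache[$key] )
        && ( time() - $cache[$key]['timestamp'] ) < $maxStaleness
    ) {
        // Slightly stale, but avoids a reparse that can take ~10 s.
        return $cache[$key]['html'];
    }
    // Slow path: reparse and store the fresh copy for later readers.
    $html = $reparse();
    $cache[$key] = [ 'html' => $html, 'timestamp' => time() ];
    return $html;
}
```

The trade-off is exactly the one noted above: a reader in this mode may briefly see an error that has already been fixed, bounded by `$maxStaleness`.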
Re: [Wikitech-l] MediaWiki version statistics
On Fri, Jul 30, 2010 at 7:20 AM, Max Semenik <maxsem.w...@gmail.com> wrote:
> There's already http://www.mediawiki.org/wiki/Extension:MWReleases, which does the server part of version checks for core; it could be tweaked to supply version information for extensions, too.

It's being rewritten, FYI.

-Chad
[Wikitech-l] MediaWiki version statistics
/me wrote:
> Last time I heard about it, it had huge problems with security and code quality. Did anything change positively in that area over the last several months? If s***c developers believe that all Tim's concerns have been addressed, they should resubmit it for review.

Sorry, as Jeroen noted, only SemanticForms had these problems. My bad.

--
Max Semenik ([[User:MaxSem]])
Re: [Wikitech-l] Caching, was: Re: wikipedia is one of the slower sites on the web
On Fri, Jul 30, 2010 at 4:13 AM, Domas Mituzas <midom.li...@gmail.com> wrote:
> So, we may have 1000x slower performance for our users because they don't really know about our caching internals. Our only hope is that most of them are also ignorant that those settings exist ;-) There'd of course be another workaround: precaching objects for every variation, at extremely high cost for relatively low impact. The alternative is either having a warning icon (which they'd be able to hide) whenever people are in slow-perf mode, or eliminating the choice (you know, the killing-features business, which quite often works really well!!! ;-)

Or we could just store an intermediate form in the parser cache, and apply the settings afterwards. For instance, one preference is "enable section edit links". If instead of outputting HTML, the parser stuck a string like \001SECTIONEDIT1\001 where the first section edit link goes, we could do preg_replace('/\001SECTIONEDIT(\d+)\001/', $replacement, $page), where $replacement = 'blah blah blah $1 blah blah blah' or '' according to user preference. Then we could use the same parser cache for everyone.

I think almost all, if not all, the parser-changing prefs could be implemented this way, with preg_replace_callback() at worst. So we don't have to remove features, probably. In fact, we can even add features, like {{USERNAME}}. It wouldn't work for {{#ifeq:{{USERNAME}}|Simetrical|You're awesome!|}} or anything, but it would be fine for "Hello, {{USERNAME}}, welcome to Wikipedia!" As long as we keep it down to preg_replace(), or better yet require it to be one big single-pass strtr() for all such settings, it should have no noticeable performance impact even if we add lots and lots of features like this.
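To make the placeholder scheme concrete, here is a minimal sketch of the post-processing pass described above. It is illustrative only, not MediaWiki code: the replacement markup and the function name are invented, and only the marker format (\001SECTIONEDIT1\001) comes from the proposal.

```php
<?php
// Sketch of the placeholder idea: the parser cache stores markers such
// as "\001SECTIONEDIT1\001", and a cheap post-processing pass expands
// them per user preference, so everyone shares one cached copy.
// (Illustrative only; the replacement HTML is hypothetical.)
function expandSectionEditLinks( string $cached, bool $showEditLinks ): string {
    // In the regex replacement, '$1' is the captured section number.
    $replacement = $showEditLinks
        ? '<span class="editsection">[edit section $1]</span>'
        : '';
    // \001 in the single-quoted pattern is an octal escape matching chr(1).
    return preg_replace( '/\001SECTIONEDIT(\d+)\001/', $replacement, $cached );
}
```

A single preg_replace() (or, as suggested above, one combined strtr() pass for all such preferences) is cheap compared to a full reparse, which is the whole point of the scheme.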
Re: [Wikitech-l] Storing data across requests
On Fri, Jul 30, 2010 at 5:32 PM, Aryeh Gregor <simetrical+wikil...@gmail.com> wrote:
> On Thu, Jul 29, 2010 at 6:07 PM, Platonides <platoni...@gmail.com> wrote:
>> Memcached* Our $_SESSION simply lives in memcached. So we could do
>>   $fake_session = $wgMemc->get( wfMemcKey( 'session', $session_id ) );
>>   $fake_session['upload_ok'] = true;
>>   $wgMemc->set( wfMemcKey( 'session', $session_id ), $fake_session, 3600 );
>
> This means that if a memcached server goes down, the information will be lost. The database is the correct place to put this. (Also the correct place to put sessions, for that matter . . .)

Also, in places where no memcached or equivalent is available (i.e. CACHE_NONE), this will not work. I think the loss of session data due to memcached breakage was found to be acceptable. Does anybody have some references for this?

Bryan
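The durability point above can be shown with a toy model. Plain arrays stand in for memcached and a database table here; nothing below is MediaWiki code, and the function names are invented.

```php
<?php
// Toy contrast: data written only to a volatile cache disappears when
// that cache restarts, while a database row survives. (Illustrative
// only; the "stores" are plain arrays, not real backends.)
function markUploadOk( array &$store, string $sessionId ): void {
    $store[$sessionId]['upload_ok'] = true;
}

// Simulate a memcached server restart: the whole store is wiped.
function flushVolatileStore( array &$store ): void {
    $store = [];
}
```

Whether that loss matters is exactly the question Bryan raises: for session data it may be acceptable; for upload state it may not be.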
Re: [Wikitech-l] Caching, was: Re: wikipedia is one of the slower sites on the web
Which of the preference settings are likely to cause this problem?

On Fri, Jul 30, 2010 at 12:18 PM, Aryeh Gregor <simetrical+wikil...@gmail.com> wrote:
> Or we could just store an intermediate form in the parser cache, and apply the settings afterwards. For instance, one preference is "enable section edit links". If instead of outputting HTML, the parser stuck a string like \001SECTIONEDIT1\001 where the first section edit link goes, we could replace it afterwards according to user preference. Then we could use the same parser cache for everyone. ..snip..
--
David Goodman, Ph.D, M.L.S.
http://en.wikipedia.org/wiki/User_talk:DGG
Re: [Wikitech-l] Request for comments: New Message class (2nd round)
On 30-07-2010 at 11:09 -0700, Neil Kandalgaonkar wrote:
> On 7/30/10 8:35 AM, Aryeh Gregor wrote:
>> Msg::get(). wfMsg()
>
> This seems like a minimal compromise. Personally I'd go all the way to M::get() or M(), but that would be a bit too obscure and would break existing conventions for MediaWiki source. It's worth favoring brevity more than usual in the message string function. Since it's used so often, a short name tends to increase overall comprehensibility of the code.

I could live with the class name Msg instead of Message. "get" is not particularly descriptive. wfMsgObj is only three letters longer than the string we use now, if people think that wfMessage is too long to type.

Ariel
Re: [Wikitech-l] Storing data across requests
Chad wrote:
> On Fri, Jul 30, 2010 at 3:57 PM, Platonides <platoni...@gmail.com> wrote:
>> Bryan Tong Minh wrote:
>>> Also, in places where no memcached or equivalent is available (i.e. CACHE_NONE), this will not work.
>> Then you could be using the objectcache table in the database.
>
> No, that's CACHE_DB. CACHE_NONE really means what it says.
>
> -Chad

There's no CACHE_NONE environment by itself, just sysadmins disabling the caching. So if $wgMainCacheType is CACHE_NONE, either block the feature, or treat it as CACHE_ANYTHING for features that require it. You know, everybody should have a writable DB for running MediaWiki...
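The fallback Platonides suggests can be sketched as a one-line policy. The constant names mirror MediaWiki's cache-type constants, but their values here and the selection function itself are assumptions for illustration.

```php
<?php
// Sketch of "treat CACHE_NONE as a DB-backed cache for features that
// need persistence". Constant names follow MediaWiki convention, but
// the values and the function are illustrative, not real MediaWiki code.
const CACHE_NONE = 0;
const CACHE_DB = 1;
const CACHE_MEMCACHED = 2;

function cacheTypeForSessionData( int $mainCacheType ): int {
    // If the sysadmin disabled caching, fall back to the objectcache
    // table in the database rather than silently losing the feature.
    return $mainCacheType === CACHE_NONE ? CACHE_DB : $mainCacheType;
}
```

This keeps the sysadmin's choice intact for everything else while guaranteeing the feature a writable backing store.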
Re: [Wikitech-l] MediaWiki version statistics
On Sat, Jul 31, 2010 at 12:28 AM, Jeroen De Dauw <jeroended...@gmail.com> wrote:
> Hey,
>
>> There's already http://www.mediawiki.org/wiki/Extension:MWReleases that does the server part of version checks for core; it could be tweaked to supply version information for extensions, too.
>
> Although that suffices for determining whether your version is up to date or not, it does not allow for actual update fetching and all the related stuff such as dependency resolution, or simply browsing through available extensions in the repository, as you have with WordPress.
>
>> When the system is to phone home for updates to the software, both core and extensions, it makes sense to integrate the LocalisationUpdate functionality and make it a more complete package.
>
> Yes, that makes a lot of sense. I was not aware this functionality existed, so I'm definitely going to have a look at it now.

I would highly recommend against having the update feature in there. We already strongly recommend against running as a DB user with certain admin rights, amongst other things; this feature will probably end up breaking more installs than it updates (and yes, I know WordPress has it, and I know how many times I've had to fix their botched updates), and not all installs would have the required modules that it needs (cURL/wget comes to mind on IIS setups, which some people use). Nor should we assign the update right, or show such messages, to the admin group by default, since most people who are admins are non-technical and will just click any bright button with a message along the lines of "omg update me now" without thinking about whether it will break something. (Perhaps we should un-deprecate the developer user group for this.)

-Peachey
Re: [Wikitech-l] MediaWiki version statistics
On Fri, Jul 30, 2010 at 11:44 PM, Jeroen De Dauw <jeroended...@gmail.com> wrote:
> ..snip..
>
> I totally agree here with Ryan. The idea is to make the repository where the version data is fetched from configurable, so it's possible to have distributors other than the WMF, and to turn off the feature entirely. I'm currently looking into the repository and package-fetching parts to allow for such dialling home. MediaWiki.org seems the obvious choice to host the main repository. There are many ways to then provide the needed data. Personally I think the best approach would be to install Semantic MediaWiki (yes, I used the s-word!) so data from the extension pages can be queried and shown in a distribution metadata format. That might require a small extension for some new special pages, and some scripts to collect other existing version data and put it into the wiki. Is it possible to get SMW onto MW.org? This would also finally be a proof of concept of SMW on a WMF wiki, which a lot of people have been waiting on for a long time now. With only a little over 3 weeks left in GSoC, I have little doubt that this project will not be finished in time, so any help in any form is definitely welcome.
>
> [0] https://secure.wikimedia.org/wikipedia/mediawiki/wiki/Deployment

I don't think on-wiki would be the best way to do this, especially for the extensions within our SVN, because you would have to list the core revision that each extension needs against the extension's version number; and for the ones we don't have in our SVN, you would need to store their download format (http/git/svn etc.) and location as well. You would also need to be vigilant and make sure people don't vandalize the information: for example, a spam version change could get entered and break someone's install.

-Peachey
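Whatever backend ends up serving it, the per-extension record both sides describe would need roughly these fields. The sketch below is purely hypothetical: every field name, version string, and URL is invented for illustration, and no such format was agreed on in the thread.

```php
<?php
// Hypothetical metadata record a repository might serve for one
// extension, covering the needs raised above: version checks, the
// core revision it requires, download format/location for code hosted
// outside the WMF SVN, and dependency resolution. All values invented.
$extensionInfo = [
    'name' => 'ExampleExtension',
    'version' => '1.2.0',
    'minMediaWiki' => '1.15',            // core version it needs
    'download' => [
        'type' => 'svn',                 // http/git/svn etc., as noted above
        'url' => 'http://svn.example.org/ExampleExtension/tags/1.2.0',
    ],
    'dependencies' => [
        'AnotherExtension' => '>= 0.3',  // resolved before installing
    ],
];
```

Peachey's vandalism concern applies to whichever store holds these records: a spammed `version` or `url` field would propagate straight into installs, so the data would need to be protected or signed.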