[Wikitech-l] browsing in phpmyadmin vs. charset

2009-03-14 Thread jidanni
Say one can successfully see the UTF-8 in an SQL dump with
$ mysqldump --default-character-set=latin1 my_database | less

OK, now how about browsing with phpMyAdmin? No matter what I set,
browser charset or various phpMyAdmin choices,
fields like 'ar_title varchar(255) latin1_bin'
come out as a mess.

I just want to browse them, not change or 'injure' them.

(That is, the varchars; phpMyAdmin won't let me browse the blobs.)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] browsing in phpmyadmin vs. charset

2009-03-14 Thread Chad
On Sat, Mar 14, 2009 at 2:02 AM,  jida...@jidanni.org wrote:
 Say one can successfully see the UTF-8 in an SQL dump with
 $ mysqldump --default-character-set=latin1 my_database | less

 OK, now how about browsing with phpMyAdmin? No matter what I set,
 browser charset or various phpMyAdmin choices,
 fields like 'ar_title varchar(255) latin1_bin'
 come out as a mess.

 I just want to browse them, not change or 'injure' them.

 (That is, the varchars; phpMyAdmin won't let me browse the blobs.)



Maybe try asking phpMyAdmin support? This has nothing
to do with MW specifically.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Understanding the meaning of “List of page titles”

2009-03-14 Thread Daniel Kinzler
Andrew Garrett schrieb:
 On Sat, Mar 14, 2009 at 9:34 AM, O. O. olson...@yahoo.com wrote:
 Andrew Garrett wrote:
 On Sat, Mar 14, 2009 at 9:26 AM, O. O. olson...@yahoo.com wrote:
The above link says that “only articles” and no redirects are in the
 namespace NS0. Also, Talk: pages are not included in NS0.
 Then, when the current English Wikipedia advertises 2,791,033 Articles,
 I cannot understand why the list of Titles contains 5,716,820 Titles. This
 is a little more than double?
 The larger number includes redirects, the smaller number doesn't.

 Then why does this http://en.wikipedia.org/wiki/Wikipedia:NS0 say that
 “Redirects” are not considered as Articles and hence are not in NS0?
 
 It doesn't say that, it says "Not all pages in the article namespace
 are considered to be articles", listing redirects as an example.

The terminology is indeed confusing. ns0 is the main namespace, which is used
for articles. But it also contains redirects. For the statistics, the software
tries to count "real" or "good" articles, defined as pages in ns0 that are not
redirects and contain at least one link. It may in the future even be
redefined to exclude disambiguation pages. The title list, however, contains
all pages in ns0.

Talk pages are in their own namespace, or rather, namespaces. Namespaces come
in pairs: the namespace itself (even id) and the corresponding talk namespace
(odd id).
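
(For anyone who wants to check this against a local database, a minimal
sketch of the count, assuming the standard page table; the "contains at
least one link" condition would additionally need a join against pagelinks:)

$ mysql my_database -e "
    SELECT COUNT(*) AS articles
    FROM page
    WHERE page_namespace = 0     -- main namespace (ns0) only
      AND page_is_redirect = 0;" -- exclude redirects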

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] research-oriented toolserver?

2009-03-14 Thread Daniel Kinzler
Morten Warncke-Wang schrieb:
 Hi all,
 
 Judging by the replies we think we've failed to communicate clearly
 some of the ideas we wanted to put forward, and we'd like to take the
 opportunity to try to clear that up.
 
 We did not want to narrow this down to be only about a third-party
 toolserver.  Before we initiated contact we noticed the need for
 adding more resources to the existing cluster.  Therefore we also had
 in mind the idea of augmenting the toolserver, rather than attempting to
 create a competitor for it.  For instance, this could allow the
 toolserver to also host applications requiring some amount of text
 crunching, which is currently not feasible as far as we can tell.

That would be excellent.

 Additionally, we think there could perhaps be two paths to account
 creation, one for Wikipedians and one for researchers. The research
 path would be laid out with clearer documentation on the requirements
 projects need to meet to fit the toolserver and on what the application
 should contain; combined with faster feedback, this would make the
 process easier for the researchers.

I think this should be done for all accounts. Why only researchers?

 We hope that this clears up some central points in our ideas
 surrounding a research oriented toolserver.  Currently we are
 exploring several ideas and this particular one might not become more
 than a thought and a thread on a mailing list.  Nonetheless perhaps
 there are thoughts here that can become more solid somewhere down the
 line.

In order to develop ideas, it would be useful to get some idea of what kind of
resources you think you can contribute, under what terms, and in what
timeframe. I know that talking money in public is usually a bad idea, especially
if the money isn't really there yet. If you like, contact me in private,
preferably at my office address, daniel.kinzler AT wikimedia.de. I'm
responsible for toolserver operations, so I suppose it's my job to look into
this.

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] research-oriented toolserver?

2009-03-14 Thread Daniel Kinzler
Brian schrieb:
 I think what the toolserver guys are saying is that they've got the
 data (e.g., a replica of the master database) and they are willing to
 expand operations to include larger-scale computations, and so yes
 they are willing to become more research oriented. They just need
 the extra hardware of course. I think it's difficult to estimate how
 much but here are some applications that I would like to make or see
 made sooner or later:
 
 * WikiBlame - A Lucene index of the history of all projects that can
 instantly find the authors of a pasted snippet. I'm not clear on the
 memory requirements of hosting an app like this after the index is
 created, but the index will be terabyte-size at 35% of the text dump.

Note that WikiTrust can do this too, and will probably go into testing soon. For
now, the database for WikiTrust will be off-site, but if it goes live on
Wikipedia, the hardware would be run at the main WMF cluster, and not on the
toolserver.

 * WikiBlame for images - an image similarity algorithm over all images
 in all projects that can find all places a given image is being used.
 I believe there is a one-time major cpu cost when first analyzing the
 images and then a much lesser realtime comparison cost. Again, the
 memory requirements of hosting such an app are unclear.

That would be very nice to have...

 * A vandalism classifier bot that uses the entire history of a wiki in
 order to predict whether the current edit is vandalism. Basically, a
 major extension of existing published work on automatically detecting
 vandalism, which only used several hundred edits. This would require
 major cpu resources for training but very little cost for real-time
 classification.

Pretty big for a toolserver project. But an excellent research topic!

 * Dumps, including extended dump formats such as a natural language
 parse of the full text of the recent version of a wiki made readily
 available for researchers.
 
 Finally, there are many worthwhile projects that have been presented
 at past Wikimanias or published in the literature that deserve to be
 kept up to date as the encyclopedia continues to grow. Permanent
 hosting for such projects would be a worthwhile goal, as would
 reaching out to these researchers. If the foundation can afford such
 an endeavor, the hardware cost is actually not that great. Perhaps
 datacenter fees are.

Please don't forget that the toolserver is NOT run by the Wikimedia Foundation.
It's run by Wikimedia Germany, which has maybe a tenth of the Foundation's
budget. If the Foundation is interested in supporting us further, that's great;
we just need to keep responsibilities clear: is the Foundation running a
project, or is the Foundation helping us (Wikimedia Germany) to run a project?...

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] not all tables need to be backed up

2009-03-14 Thread Daniel Kinzler
Chad schrieb:
 On Sat, Mar 14, 2009 at 1:33 AM,  jida...@jidanni.org wrote:
 Gentlemen, it occurred to me that under close examination one finds
 that when making a backup of one's wiki's database, some of the tables
 dumped have various degrees of temporariness, and thus, though needing
 to be present in a proper dump, could perhaps be emptied of their
 values, saving much space in the SQL.bz2 etc. file produced.

 Looking at the mysqldump man page, one finds no perfect options to do
 so, so instead makes one's own script:

 $ mysqldump my_database |
 perl -nwle 'BEGIN{$dontdump="wiki_(objectcache|searchindex)"}
 s/(^-- )(Dumping data for table `$dontdump`$)/$1NOT $2/;
 next if /^LOCK TABLES `$dontdump` WRITE;$/../^UNLOCK TABLES;$/;
 print;'

 Though not myself daring to make any recommendations on
 http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki#Tables
 I am still curious which tables can be emptied always,
 which can be emptied if one is willing to remember to run a
 maintenance script to resurrect their contents, etc.


 
 Really, the only 3 that are needed are page, revision and text.
 Of course, to keep old versions of stuff you'll need archive, oldimage
 and filearchive too.

That would be sufficient to keep page content. You would, however, probably also
want to keep the user and user_groups tables, and the interwiki table, which
cannot be restored and determines the interpretation of link prefixes. The log,
too, can't be restored, but whether you need it is another question. The image
table can generally be rebuilt by looking at the files, though the result may
not be exactly the same as the original.

I think it's better to put it this way: tables with "cache" in their name can
safely be truncated, and the same is true for the profiling table (if used at
all). Tables with "link" in their name can always be rebuilt, though it may take
a while; the same goes for the searchindex.
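
(A sketch of the rebuild step after restoring such a stripped dump, assuming
the standard maintenance scripts shipped with MediaWiki:)

$ cd /path/to/wiki
$ php maintenance/refreshLinks.php      # rebuilds the *links tables
$ php maintenance/rebuildtextindex.php  # rebuilds the searchindex table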

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] browsing in phpmyadmin vs. charset

2009-03-14 Thread Daniel Kinzler
Chad schrieb:
 On Sat, Mar 14, 2009 at 2:02 AM,  jida...@jidanni.org wrote:
 Say one can successfully see the UTF-8 in an SQL dump with
 $ mysqldump --default-character-set=latin1 my_database | less

 OK, now how about browsing with phpMyAdmin? No matter what I set,
 browser charset or various phpMyAdmin choices,
 fields like 'ar_title varchar(255) latin1_bin'
 come out as a mess.

 I just want to browse them, not change or 'injure' them.

 (That is, the varchars; phpMyAdmin won't let me browse the blobs.)

Well, it has to do with the fact that MW declares fields to be latin1 and then
fills them with UTF-8. I don't know if there is a way to tell phpMyAdmin to
ignore the charset info the database is providing.
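
(One ad-hoc workaround, a sketch using the archive table from the original
question: cast the column to binary and reinterpret the bytes as UTF-8, so
MySQL never applies a latin1-to-utf8 conversion of its own:)

$ mysql my_database -e "
    SELECT CONVERT(BINARY ar_title USING utf8) AS ar_title
    FROM archive LIMIT 10;"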

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Understanding the meaning of “List of page titles”

2009-03-14 Thread Tei
On Sat, Mar 14, 2009 at 8:46 AM, Daniel Kinzler dan...@brightbyte.de wrote:
 Andrew Garrett schrieb:
 On Sat, Mar 14, 2009 at 9:34 AM, O. O. olson...@yahoo.com wrote:
 Andrew Garrett wrote:
 On Sat, Mar 14, 2009 at 9:26 AM, O. O. olson...@yahoo.com wrote:
 The above link says that “only articles” and no redirects are in the
 namespace NS0. Also, Talk: pages are not included in NS0.
 Then, when the current English Wikipedia advertises 2,791,033 Articles,
 I cannot understand why the list of Titles contains 5,716,820 Titles. This
 is a little more than double?
 The larger number includes redirects, the smaller number doesn't.

 Then why does this http://en.wikipedia.org/wiki/Wikipedia:NS0 say that
 “Redirects” are not considered as Articles and hence are not in NS0?

 It doesn't say that, it says "Not all pages in the article namespace
 are considered to be articles", listing redirects as an example.

 The terminology is indeed confusing. ns0 is the main namespace, which is used
 for articles. But it also contains redirects. For the statistics, the software
 tries to count "real" or "good" articles, defined as pages in ns0 that are not
 redirects and contain at least one link. It may in the future even be
 redefined to exclude disambiguation pages. The title list, however, contains
 all pages in ns0.

 Talk pages are in their own namespace, or rather, namespaces. Namespaces come
 in pairs: the namespace itself (even id) and the corresponding talk namespace
 (odd id).

Plotting the number of articles could help an observer see the growth of a
wiki, but it is a bad number for seeing the death of a wiki.

But hey, maybe all the Wikimedia project wikis are just growing, so we
don't have this phenomenon right now; maybe in a few years we will see
some wasteland wikis: immense amounts of text that no one can maintain
(or is interested in maintaining), left on their own to suffer a
continuous degradation. Anyway, all our wikis are in their infancy, and
I am thinking 5+ years forward, while there are lots and lots of urgent
problems right now.

Please ignore this email.




-- 
--
ℱin del ℳensaje.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Understanding the meaning of “List of page titles”

2009-03-14 Thread Daniel Kinzler
 Plotting the number of articles could help an observer see the growth of a
 wiki, but it is a bad number for seeing the death of a wiki.

For this kind of analysis, check out WikiXRay and WikiTracer.

-- daniel


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Special page access and render other lists

2009-03-14 Thread David Di Biase
Hi there,

Two quick questions. I just created a new wiki special page and set up the
group of extension files and all. The first issue I'm noticing is that on the
extension page the name of the page is displaying as <bibliography>.
I can't figure out why, because all the config files have the proper
naming...

The second question I have is... I need to write a script that iterates
through all articles (or better yet, articles attached to a specific
category), sorts through them, extracts specific info and renders it on my
bibliography page. I can do the last part, but I'm having trouble looking
through the class list and supported hooks for something that will let me
retrieve a list of articles. Does someone have code that does this already,
or can you point me in the right direction?
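
(A minimal sketch of one way to pull the pages in a category with MediaWiki's
database abstraction layer, circa the 1.14 API; the category name is
hypothetical:)

$dbr = wfGetDB( DB_SLAVE );  // read-only replica connection
$res = $dbr->select(
    array( 'page', 'categorylinks' ),
    array( 'page_namespace', 'page_title' ),
    array( 'cl_to' => 'Bibliography_sources',  // hypothetical category
           'cl_from = page_id' ),
    __METHOD__
);
while ( $row = $dbr->fetchObject( $res ) ) {
    $title = Title::makeTitle( $row->page_namespace, $row->page_title );
    // ...load the article, extract the info, render it on the special page...
}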

Thanks!

Dave
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] implementing the Interlanguage extension

2009-03-14 Thread Amir E. Aharoni
Sorry about bugging the list about it, but can anyone please explain
the reason for not enabling the Interlanguage extension?

See bug 15607 -
https://bugzilla.wikimedia.org/show_bug.cgi?id=15607

I believe that enabling it will be very beneficial for many projects
and many people expressed their support of it. I am not saying that
there are no reasons to not enable it; maybe there is a good reason,
but i don't understand it. I also understand that there are many other
unsolved bugs, but this one seems to have a ready and rather simple
solution.

I am only sending it to raise the problem. If you know the answer, you
may comment at the bug page.

Thanks in advance.

-- 
Amir Elisha Aharoni

heb: http://haharoni.wordpress.com | eng: http://aharoni.wordpress.com
cat: http://aprenent.wordpress.com | rus: http://amire80.livejournal.com

We're living in pieces,
 I want to live in peace. - T. Moore

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Special page access and render other lists

2009-03-14 Thread Ahmad Sherif
Hi Dave

You should put this line in the execute function of the main class of your
special page (i.e. the class that extends SpecialPage):

wfLoadExtensionMessages('Your Special Page');

Also, you should add a message keyed with your special page name in lowercase, I think:
('yourspecialpage' => 'The Desired Name',)

And don't forget to load the messages file with

$wgExtensionMessagesFiles['SpecialPageName'] = $dir
.'SpecialPageName.i18n.php';


Ahmad
On Sat, Mar 14, 2009 at 8:20 PM, David Di Biase dave.dibi...@gmail.comwrote:

 Sorry, I'm slightly confused. Could you explain why I would be doing that?
 Also where the heck in the documentation does it say this? lol.

 Thanks for the response.

 Dave

 On Sat, Mar 14, 2009 at 1:59 PM, Angela bees...@gmail.com wrote:

  On Sun, Mar 15, 2009 at 2:27 AM, David Di Biase dave.dibi...@gmail.com
  wrote:
   Two quick questions. I just created a new wiki special page and set up the
   group of extension files and all. The first issue I'm noticing is that on the
   extension page the name of the page is displaying as <bibliography>.
 
  Try creating the [[MediaWiki:Bibliography]] on your wiki with the name
  of the page (you'll need to be logged in as an admin).
 
  Angela
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Special page access and render other lists

2009-03-14 Thread David Di Biase
Hi Ahmad,

I actually copied the four sample files and just replaced the references they
used with my extension name :-) so I believe I've got everything you
mentioned.  I've attached two for your viewing just in case I AM doing
something incorrectly.

Thanks,

Dave

On Sat, Mar 14, 2009 at 2:49 PM, Ahmad Sherif ahmad.m.she...@gmail.comwrote:

 Hi Dave

  You should put this line in the execute function of the main class of your
  special page (i.e. the class that extends SpecialPage):

  wfLoadExtensionMessages('Your Special Page');

  Also, you should add a message keyed with your special page name in
  lowercase, I think:
  ('yourspecialpage' => 'The Desired Name',)

 And don't forget to load the messages file with

 $wgExtensionMessagesFiles['SpecialPageName'] = $dir
 .'SpecialPageName.i18n.php';


 Ahmad
 On Sat, Mar 14, 2009 at 8:20 PM, David Di Biase dave.dibi...@gmail.com
 wrote:

  Sorry, I'm slightly confused. Could you explain why I would be doing
 that?
  Also where the heck in the documentation does it say this? lol.
 
  Thanks for the response.
 
  Dave
 
  On Sat, Mar 14, 2009 at 1:59 PM, Angela bees...@gmail.com wrote:
 
   On Sun, Mar 15, 2009 at 2:27 AM, David Di Biase 
 dave.dibi...@gmail.com
   wrote:
    Two quick questions. I just created a new wiki special page and set up the
    group of extension files and all. The first issue I'm noticing is that on the
    extension page the name of the page is displaying as <bibliography>.
  
   Try creating the [[MediaWiki:Bibliography]] on your wiki with the name
   of the page (you'll need to be logged in as an admin).
  
   Angela
  

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Foundation-l] Proposed revised attribution language

2009-03-14 Thread Magnus Manske
On Sat, Mar 14, 2009 at 3:25 PM, David Gerard dger...@gmail.com wrote:
 2009/3/14 MinuteElectron minuteelect...@googlemail.com:
 2009/3/14 David Gerard dger...@gmail.com:

 Here's an idea: nice URLs for the history. So we don't end up with
 stupid things peppered with ?, &, and = printed on mugs, travel
 guides, etc.
 e.g. http://en.wikipedia.org/history/Xenu for the history of
 http://en.wikipedia.org/wiki/Xenu .

 This is already possible in MediaWiki, using a feature called action
 paths. It simply needs Apache rewrites set up and a configuration
 variable within MediaWiki altered; there may be other implications in
 terms of internal organisation, robot functionality and caching,
 though.


 Oh, I know it's not hard (though mod_rewrite rules resemble alchemy
 more than anything deterministic or logical). So I suppose the
 question is: can we get this into the Wikimedia settings?

IIRC one reason to use wiki/ and w/ instead of direct URLs
(en.wikipedia.org/Xenu) was to allow for non-article data at a later
time (the other reason was to set noindex/nofollow rules). Looks like
we will use that space after all :-)

What might /really/ be cool would be
http://en.wikipedia.org/authors/Xenu

or even
http://en.wikipedia.org/main_authors/Xenu
filtering out minor contribs and IPs...
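
(For reference, a minimal sketch of the action-paths configuration being
discussed; the exact rewrite rule is illustrative only:)

# In LocalSettings.php:
$wgActionPaths['history'] = "$wgScriptPath/history/$1";

# Plus an Apache rewrite mapping the pretty URL back onto index.php, e.g.:
# RewriteRule ^history/(.*)$ /w/index.php?title=$1&action=history [L,QSA]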

Magnus

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Foundation-l] Proposed revised attribution language

2009-03-14 Thread David Gerard
2009/3/14 Magnus Manske magnusman...@googlemail.com:
 On Sat, Mar 14, 2009 at 3:25 PM, David Gerard dger...@gmail.com wrote:
 2009/3/14 MinuteElectron minuteelect...@googlemail.com:
 2009/3/14 David Gerard dger...@gmail.com:

 Here's an idea: nice URLs for the history. So we don't end up with
 stupid things peppered with ?, &, and = printed on mugs, travel
 guides, etc.
 e.g. http://en.wikipedia.org/history/Xenu for the history of
 http://en.wikipedia.org/wiki/Xenu .

 This is already possible in MediaWiki, using a feature called action

 Oh, I know it's not hard (though mod_rewrite rules resemble alchemy
 more than anything deterministic or logical). So I suppose the
 question is: can we get this into the Wikimedia settings?

 IIRC one reason to use wiki/ and w/ instead of direct URLs
 (en.wikipedia.org/Xenu) was to allow for non-article data at a later
 time (the other reason was to set noindex/nofollow rules). Looks like
 we will use that space after all :-)


Kewl! Submitted as
https://bugzilla.wikimedia.org/show_bug.cgi?id=17981 - comments
welcome. Any devs like it/dislike it? A Simple Matter of mod_rewrite
rules?

It'd be nice if it went into the base MediaWiki whenever short URLs
are enabled, but as long as the /history/ link works that's fine for
these purposes: to have reasonably obvious URLs that won't die in
speech.


 What might /really/ be cool would be
 http://en.wikipedia.org/authors/Xenu
 or even
 http://en.wikipedia.org/main_authors/Xenu
 filtering out minor contribs and IPs...


We can save that for another bug if this one is accepted ;-)


- d.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Foundation-l] Proposed revised attribution language

2009-03-14 Thread Thomas Dalton
2009/3/14 Magnus Manske magnusman...@googlemail.com:
 IIRC one reason to use wiki/ and w/ instead of direct URLs
 (en.wikipedia.org/Xenu) was to allow for non-article data at a later
 time (the other reason was to set noindex/nofollow rules). Looks like
 we will use that space after all :-)

That may be one reason, but I think the main reason is to avoid
problems with articles called things like index.php. /wiki/ is a
dummy directory; there's nothing actually there to conflict with. The
root directory has real files in it that need to be accessible.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Extension message loading (was: Special page access and render other lists)

2009-03-14 Thread Ilmari Karonen
Ahmad Sherif wrote:
 Hi Dave
 
 You should put this line in the execute function of the main class of your
 special page (i.e. the class that extends SpecialPage):
 
 wfLoadExtensionMessages('Your Special Page');
 
 Also, you should add a message keyed with your special page name in lowercase, I think:
 ('yourspecialpage' => 'The Desired Name',)
 
 And don't forget to load the messages file with
 
 $wgExtensionMessagesFiles['SpecialPageName'] = $dir
 .'SpecialPageName.i18n.php';

This is going off on a tangent a bit, but this isn't the first time that
our system for loading extension messages has struck me as awkward.  We
already have extensions specify the location of their message files via
$wgExtensionMessagesFiles, but currently there's no way for MediaWiki to
know what messages are in each file without loading them.

Does anyone see a reason why we shouldn't make (okay, allow and strongly
recommend) extensions list the names of the messages they define?  That
way the message loading code could just automatically pull in the right
message files as needed.  I'm envisioning something like:

$wgExtensionMessagesFiles['MyExtension'] = "$dir/MyExtension.i18n.php";
$wgExtensionMessages['MyExtension'] = array(
'myextension-desc',
'myextension-something',
'myextension-other',
'myextension-whatever',
);

I believe this would also allow us to fix several longstanding bugs, 
such as the fact that links to the MediaWiki: pages of extension 
messages currently appear as redlinks unless the message just happens to 
have been loaded.
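
(To make the idea concrete, a hypothetical sketch of the loader-side lookup;
the function name is invented and none of this exists in MediaWiki today:)

// Hypothetical: find which extension defines a message key, so that only
// that extension's i18n file needs to be loaded on demand.
function wfFindExtensionForMessage( $key ) {
    global $wgExtensionMessages, $wgExtensionMessagesFiles;
    foreach ( $wgExtensionMessages as $ext => $keys ) {
        if ( in_array( $key, $keys ) ) {
            return $wgExtensionMessagesFiles[$ext];  // file to load
        }
    }
    return null;  // not an extension message
}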

-- 
Ilmari Karonen


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] How often is the message cache rebuilt?

2009-03-14 Thread Robert Rohde
I had occasion to move (with the redirect suppressed) a MediaWiki: page
into the Template: namespace.  The intent was that the MediaWiki system message
should then fall back to its default behavior, while the page's
contents could still be used as a template under circumstances where
it was desirable to do so.

However, after several hours the system message is still giving the
same result and doesn't recognize that its contents were removed.

Obviously this is some sort of caching issue, perhaps because the
software doesn't know how to react properly to having a Mediawiki page
subject to a move.  Is this kind of lag likely to clear itself up in
short order?  How long does the message cache persist?

-Robert Rohde

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] research-oriented toolserver?

2009-03-14 Thread Brian
How will WikiTrust accomplish the WikiBlame function? I think I know
what WikiTrust is: http://trust.cse.ucsc.edu/

What gives it the ability to take a piece of wiki code
from the history of any wiki - totally out of context - and return
the authors?

On Sat, Mar 14, 2009 at 2:02 AM, Daniel Kinzler dan...@brightbyte.de wrote:
 Brian schrieb:
 * WikiBlame - A Lucene index of the history of all projects that can
 instantly find the authors of a pasted snippet. I'm not clear on the
 memory requirements of hosting an app like this after the index is
 created, but the index will be terabyte-size at 35% of the text dump.

 Note that WikiTrust can do this too, and will probably go into testing soon.
 For now, the database for WikiTrust will be off-site, but if it goes live on
 Wikipedia, the hardware would be run at the main WMF cluster, and not on the
 toolserver.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] not all tables need to be backed up

2009-03-14 Thread Aryeh Gregor
On Sat, Mar 14, 2009 at 1:33 AM,  jida...@jidanni.org wrote:
 Looking at the mysqldump man page, one finds no perfect options to do
 so, so instead makes one's own script:

 $ mysqldump my_database |
 perl -nwle 'BEGIN{$dontdump="wiki_(objectcache|searchindex)"}
 s/(^-- )(Dumping data for table `$dontdump`$)/$1NOT $2/;
 next if /^LOCK TABLES `$dontdump` WRITE;$/../^UNLOCK TABLES;$/;
 print;'

Why don't you just do:

$ mysqldump --ignore-table=my_database.wiki_objectcache
--ignore-table=my_database.wiki_searchindex my_database

Certainly you can skip objectcache.  searchindex can be rebuilt.  So
can all the *links tables, and the redirect table, and probably some
others.  Of course, rebuilding all these tables on backup restore
might take an awfully long time, which you need to weigh against the
convenience of not having to back them up.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Understanding the meaning of “List of page titles”

2009-03-14 Thread Aryeh Gregor
On Fri, Mar 13, 2009 at 6:26 PM, O. O. olson...@yahoo.com wrote:
 Thanks, Daniel. I had not understood the meaning of NS0. Anyway, I found
 the details of NS0 at http://en.wikipedia.org/wiki/Wikipedia:NS0
 However, this confuses me even more.

Pages on the English Wikipedia that start with any of the following
prefixes are *not* in the main namespace (ns0):

Talk:
User:
User talk:
Wikipedia:
Wikipedia talk:
File:
File talk:
MediaWiki:
MediaWiki talk:
Template:
Template talk:
Help:
Help talk:
Category:
Category talk:
Portal:
Portal talk:
Special:

All pages that do not start with one of these special prefixes are
automatically in namespace 0.  To check the namespace number of a page
if you're uncertain, you can view the page source and check the body
element's classes.  Namespace 0 pages will have the class "ns-0".
Other pages will have some other number; for instance, Talk: pages
will have "ns-1", because Talk: is namespace 1.  User: is 2, User
talk: is 3, etc.
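
(A quick command-line version of that check, a sketch that lists the
distinct ns-* classes found in a page's HTML:)

$ curl -s 'http://en.wikipedia.org/wiki/Talk:Xenu' | grep -o 'ns-[0-9]*' | sort -u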

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How often is the message cache rebuilt?

2009-03-14 Thread David Gerard
2009/3/14 Robert Rohde raro...@gmail.com:

 Obviously this is some sort of caching issue, perhaps because the
 software doesn't know how to react properly to having a Mediawiki page
 subject to a move.  Is this kind of lag likely to clear itself up in
 short order?  How long does the message cache persist?


In my experience? Several hours, if it's a *really* popular piece of
interface, e.g. the en:wp copyright notice. The en:wp sidebar doesn't
seem to take quite as long, oddly enough.


- d.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How often is the message cache rebuilt?

2009-03-14 Thread Platonides
Robert Rohde wrote:
 I had occasion to move (with the redirect suppressed) a MediaWiki: page
 into the Template: namespace.  The intent was that the MediaWiki system message
 should then fall back to its default behavior, while the page's
 contents could still be used as a template under circumstances where
 it was desirable to do so.
 
 However, after several hours the system message is still giving the
 same result and doesn't recognize that its contents were removed.
 
 Obviously this is some sort of caching issue, perhaps because the
 software doesn't know how to react properly to having a Mediawiki page
 subject to a move.  Is this kind of lag likely to clear itself up in
 short order?  How long does the message cache persist?
 
 -Robert Rohde

Have you tried purging the MediaWiki: page?
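
(A purge can be requested directly via the URL; the message name here is
hypothetical:)

http://example.org/w/index.php?title=MediaWiki:Yourmessage&action=purge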


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How often is the message cache rebuilt?

2009-03-14 Thread Tim Starling
Robert Rohde wrote:
 I had occasion to move (with the redirect suppressed) a MediaWiki: page
 into the Template: namespace.  The intent was that the MediaWiki system message
 should then fall back to its default behavior, while the page's
 contents could still be used as a template under circumstances where
 it was desirable to do so.
 
 However, after several hours the system message is still giving the
 same result and doesn't recognize that its contents were removed.
 
 Obviously this is some sort of caching issue, perhaps because the
 software doesn't know how to react properly to having a Mediawiki page
 subject to a move.  Is this kind of lag likely to clear itself up in
 short order?  How long does the message cache persist?

This is a bug. There is code in Title::moveTo() to update the message
cache but it appears to be totally wrong and broken.

The message cache expiry time is 24 hours.

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l