Re: [Wikitech-l] implementing the Interlanguage extension

2009-03-16 Thread Nicolas Dumazet
>> If we're going to create a special tool, there's no point in stopping so
>> short that we still have to rely on somebody to run client-side bots to make
>> it work properly.

I think that it is a request to start looking into ways of handling
interwiki links from the inside, rather than continuing to rely on bots
to update the pages.

The Interlanguage extension is a nice first step towards bringing into
core all the work that we are currently doing with interwiki bots. But
why would we rely, yet again, on client-side bots to update anything?
Agreed, the bot work would be smaller with this system, but perhaps this
new type of bot update could simply be turned into Jobs and put in the
job queue.

At the very least, the idea is to find a workflow that lives on the
server side as much as possible, instead of saying that client-side
bots will handle the update tasks.
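
Purely as an illustration of that job-queue idea (this is not code from
the Interlanguage extension; the class name, job name and update logic
below are invented):

  <?php
  // Hypothetical sketch: a job that refreshes a page's interlanguage
  // links on the server side instead of waiting for a client-side bot.
  class InterlanguageUpdateJob extends Job {
      public function __construct( $title, $params ) {
          parent::__construct( 'interlanguageUpdate', $title, $params );
      }

      public function run() {
          // Here the extension would refetch the link data for
          // $this->title and update whatever storage it uses.
          wfDebugLog( 'interlanguage', 'Updating links for ' .
              $this->title->getPrefixedText() );
          return true;
      }
  }

  // Registration and queueing, e.g. from a hook fired on page save:
  // $wgJobClasses['interlanguageUpdate'] = 'InterlanguageUpdateJob';
  // Job::batchInsert( array( new InterlanguageUpdateJob( $title, array() ) ) );

A hook fired when interlanguage links change could then queue one such
job per affected page, and the normal job runners would do the
propagation.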

2009/3/17 Nikola Smolenski :
> On Saturday 14 March 2009 23:20:01, Amir E. Aharoni wrote:
>> 2009/3/15 Andrew Garrett :
>> > On Sun, Mar 15, 2009 at 4:10 AM, Amir E. Aharoni wrote:
>> >> Sorry about bugging the list about it, but can anyone please explain
>> >> the reason for not enabling the Interlanguage extension?
>> >>
>> >> See bug 15607 -
>> >> https://bugzilla.wikimedia.org/show_bug.cgi?id=15607
>> >
>> > In general, extensions with this status haven't been implemented
>> > because they haven't been reviewed by a highly experienced developer.
>>
>> Brion wrote a few comments there, but I didn't understand what exactly
>> the problem is.
>
> Seconded. Brion's comments are:
>
>> I don't quite understand what this extension is meant to accomplish, or how
>> the workflow is envisioned. What are the user interface and performance
>> considerations?
>
>> The lack of automatic updates seems less than ideal, as does the multiple
>> fetching of link data over the HTTP API on every page render. Management UI
>> by manual editing of offsite pages looks pretty ugly; what could be done to
>> improve this?
>

>
> I don't fully understand them.
>



-- 
Nicolas Dumazet — NicDumZ [ nɪk.d̪ymz ]


Re: [Wikitech-l] implementing the Interlanguage extension

2009-03-16 Thread Nikola Smolenski
On Saturday 14 March 2009 23:20:01, Amir E. Aharoni wrote:
> 2009/3/15 Andrew Garrett :
> > On Sun, Mar 15, 2009 at 4:10 AM, Amir E. Aharoni wrote:
> >> Sorry about bugging the list about it, but can anyone please explain
> >> the reason for not enabling the Interlanguage extension?
> >>
> >> See bug 15607 -
> >> https://bugzilla.wikimedia.org/show_bug.cgi?id=15607
> >
> > In general, extensions with this status haven't been implemented
> > because they haven't been reviewed by a highly experienced developer.
>
> Brion wrote a few comments there, but I didn't understand what exactly
> the problem is.

Seconded. Brion's comments are:

> I don't quite understand what this extension is meant to accomplish, or how
> the workflow is envisioned. What are the user interface and performance
> considerations?  

> The lack of automatic updates seems less than ideal, as does the multiple
> fetching of link data over the HTTP API on every page render. Management UI
> by manual editing of offsite pages looks pretty ugly; what could be done to
> improve this?

> If we're going to create a special tool, there's no point in stopping so
> short that we still have to rely on somebody to run client-side bots to make
> it work properly. We've got a site with shared databases and the ability to
> queue updates automatically on the server side... take advantage of it!   

I don't fully understand them.


Re: [Wikitech-l] Problems with the recent version of Cite Extension

2009-03-16 Thread O. O.
Gerard Meijssen wrote:
> Hoi,
> What revision number does the working version of Cite for REL1_14_0 have?
> Thanks,
>   GerardM
> 

I don’t know what “REL1_14_0 of Cite” is.
Sorry, I am new to this.
O. O.



Re: [Wikitech-l] Problems with the recent version of Cite Extension

2009-03-16 Thread Gerard Meijssen
Hoi,
What revision number does the working version of Cite for REL1_14_0 have?
Thanks,
  GerardM

2009/3/17 O. O. 

> I have installed MediaWiki 1.14.0 http://www.mediawiki.org/wiki/Download
> and am trying to get the Cite Extension
> http://www.mediawiki.org/wiki/Extension:Cite version 1.14.0 to work.
>
> When accessing the Main_Page I get the error:
>
> Fatal error: Call to undefined method
> ParserOptions::getIsSectionPreview() in
> /var/www/wiki2/extensions/Cite/Cite_body.php on line 699
>
> I can however get the 1.13.0 version of the Cite Extension to work.
>
> Is this the correct place to be asking for help on the Cite Extension –
> or is there some other mailing list or newsgroup for this?
>
> Thanks,
> O. O.
>
>

[Wikitech-l] Problems with the recent version of Cite Extension

2009-03-16 Thread O. O.
I have installed MediaWiki 1.14.0 http://www.mediawiki.org/wiki/Download
and am trying to get the Cite Extension
http://www.mediawiki.org/wiki/Extension:Cite version 1.14.0 to work.

When accessing the Main_Page I get the error:

Fatal error: Call to undefined method 
ParserOptions::getIsSectionPreview() in 
/var/www/wiki2/extensions/Cite/Cite_body.php on line 699

I can however get the 1.13.0 version of the Cite Extension to work.
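
(For reference: the undefined method means this Cite revision calls a
ParserOptions method that the 1.14 core does not provide. A guard along
the following lines, a hypothetical sketch rather than the actual fix,
is one way to paper over it locally until a Cite revision matching 1.14
is installed.)

  <?php
  // Hypothetical sketch only, not the real Cite code: fall back
  // gracefully when ParserOptions lacks getIsSectionPreview(),
  // as it does in MediaWiki 1.14.
  function citeExampleIsSectionPreview( $popts ) {
      return method_exists( $popts, 'getIsSectionPreview' )
          ? $popts->getIsSectionPreview()
          : false; // treat it as a normal, non-preview render
  }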

Is this the correct place to be asking for help on the Cite Extension – 
or is there some other mailing list or newsgroup for this?

Thanks,
O. O.




Re: [Wikitech-l] statistics of relevant functions/lines

2009-03-16 Thread Ilmari Karonen
Maggi Federico wrote:
> Hello,
>   thanks for your quick contribution. By the way, I was browsing around  
> the repository and, besides its hugeness, I couldn't really find the
> classic /branches /trunk /tags structure.

Have you read http://www.mediawiki.org/wiki/Manual:Download_from_SVN ?

Also, you can browse the source code at 
http://svn.wikimedia.org/viewvc/mediawiki/ .  Note that the core code is 
found under the somewhat confusingly named trunk/phase3 directory (or 
tags/REL_x_yy/phase3 for tagged releases).  Direct link: 
http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/

-- 
Ilmari Karonen



Re: [Wikitech-l] statistics of relevant functions/lines

2009-03-16 Thread Ilmari Karonen
Maggi Federico wrote:
> 
> The functions/lines I am interested in are those somehow involved in  
> handling HTTP request/session/response parameters. Of course, I allow  
> a certain roughness in the analysis.
> 
> I am trying to build a list of such functions/lines. What would you  
> suggest to include? What would you grep for if you want to estimate  
> the functions that are, even indirectly, related to processing of HTTP  
> requests/sessions/responses?

Does "HTTP responses" include content generation?  I'm going to assume 
not, since otherwise something most of the codebase will qualify.

The WebRequest instance describing the current request is kept in the 
global variable $wgRequest -- you may want to grep for that, or for 
methods specific to the WebRequest class.  You'll find that a lot of 
code uses that class to access things like URL parameters, though.

There's also a WebResponse class (available via $wgRequest->response()), 
but it doesn't really do much (just duplicates the PHP header() function 
and provides a wrapper around setcookie() which seems to be called from 
exactly one place in the code).  Anyway, you should catch any uses of it 
by grepping for $wgRequest, but you should also grep for the standard 
PHP header() and setcookie() functions.

The WebRequest class also provides functions for accessing session data, 
but they don't actually seem to be used -- everything just accesses the 
PHP superglobal $_SESSION array directly.
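
As a rough sketch of the call sites such a grep would match (the
parameter, cookie and session key names below are invented examples,
not taken from MediaWiki):

  <?php
  // Typical MediaWiki request/response/session call sites.
  function exampleRequestHandling() {
      global $wgRequest;

      // Request parameters go through the WebRequest wrapper:
      $action = $wgRequest->getVal( 'action', 'view' );
      $curid  = $wgRequest->getInt( 'curid' );

      // Response headers and cookies go through WebResponse:
      $wgRequest->response()->header( 'Cache-Control: private' );
      $wgRequest->response()->setcookie( 'exampleCookie', '1' );

      // Session data is mostly read and written via the superglobal:
      $_SESSION['wsExampleKey'] = $action . $curid;
  }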

Also, note that all this is true for the main UI code.  The API code 
(found in api.php and under includes/api) and the AJAX code 
(includes/Ajax*.php) may do things somewhat differently.

-- 
Ilmari Karonen



Re: [Wikitech-l] Understanding the meaning of “List of page titles”

2009-03-16 Thread O. O.
Aryeh Gregor wrote:
> 
> Page titles cannot begin with these prefixes, so I deliberately
> omitted them.  What I said is correct.

Thanks guys for clarifying this. I sure learnt a lot.
O.  O.




Re: [Wikitech-l] statistics of relevant functions/lines

2009-03-16 Thread Maggi Federico
Hello,
thanks for your quick contribution. By the way, I was browsing around  
the repository and, besides its hugeness, I couldn't really find the
classic /branches /trunk /tags structure.

Am I pointing my client to the wrong repo?


On Mar 16, 2009, at 3:53 PM, Platonides wrote:

> However, MediaWiki encapsulates them, mainly with WebRequest.php

I was looking for exactly this information. Any hint along the lines of
custom/recurring/included functions used for HTTP data manipulation is
greatly appreciated.

Thanks in advance. Best regards,

-- Federico




Re: [Wikitech-l] FlaggedRevs tied to Protection?

2009-03-16 Thread Platonides
I think that's a good idea. But does flagged revs support unprotecting?




Re: [Wikitech-l] statistics of relevant functions/lines

2009-03-16 Thread Platonides
Maggi Federico wrote:
> Hello List,
>   I know MW a little, but not well enough to avoid asking for hints here.
> 
> For research purposes, I am surveying code repositories of the top 10  
> web applications using a modified version of StatSVN.
> 
> Basically, I need to track source changes, over releases, in terms of  
> "Lines of Code (LOC) matching a certain regexp".
> 
> The functions/lines I am interested in are those somehow involved in  
> handling HTTP request/session/response parameters. Of course, I allow  
> a certain roughness in the analysis.
> 
> I am trying to build a list of such functions/lines. What would you  
> suggest to include? What would you grep for if you want to estimate  
> the functions that are, even indirectly, related to processing of HTTP  
> requests/sessions/responses?
> 
> Any input is greatly appreciated. Thanks in advance. Cheers,
> 
> -- Federico

$_POST/$_GET/$_SESSION/$_REQUEST ?
However, MediaWiki encapsulates them, mainly with WebRequest.php
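
Roughly, the encapsulation looks like this (the 'title' parameter is
just an example, and $wgRequest is assumed to be the global WebRequest
instance MediaWiki sets up):

  <?php
  // Raw superglobal access, the kind of line a naive grep would target:
  $title = isset( $_GET['title'] ) ? $_GET['title'] : '';

  // What MediaWiki code normally does instead, via the WebRequest wrapper:
  $title = $wgRequest->getVal( 'title', '' );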




Re: [Wikitech-l] HTML not Rendered correctly after Import of Wikipedia

2009-03-16 Thread O. O.
Roan Kattouw wrote:
> O. O. wrote:
>> I did not understand where I have to edit “imagemap_invalid_image”. I am
>> new here, so I am not sure which file I have to edit.
>>
> You have to edit [[MediaWiki:imagemap_invalid_image]], which is a wiki 
> page, not a file.
> 
> Roan Kattouw (Catrope)

Thanks Roan for this Tip.
For others who come this way: this page can only be edited with a wiki
admin account, i.e. WikiSysop by default.

I think I am going to make a new thread regarding the Cite Extension.

Thanks,
O. O.



Re: [Wikitech-l] [Foundation-l] Proposed revised attribution language

2009-03-16 Thread Andrew Cates
Hmm. There is an issue which has been raised before by Duncan Harris
on the en list:

>>"The way I see it the Document referred to in the GFDL cannot be an 
>>individual Wikipedia article. It has to be the whole of Wikipedia. If the 
>>Document were an individual article then Wikipedia would be in breach of its 
>>own license. Every time people copy text between articles then they would 
>>create a Modified Version under the GFDL. They mostly do not comply with GFDL 
>>section 4 under these circumstances on a number of points. So the only 
>>sensible interpretations are the whole of English Wikipedia or the whole of 
>>Wikipedia as the GFDL Document. This has the following implications for GFDL 
>>compliance: - only need to give network location of Wikipedia, not individual 
>>articles - only need to give five principal authors of Wikipedia, not of 
>>individual articles - no real section Entitled "History", so no requirement 
>>to copy that"


I think this is right: article history in practice fails the license
terms. I had a look at a couple of articles, which was itself a labour
of love. In particular, you find immediately that drafting is not
generally done in an article, except the first time around for new
stubs. For existing articles being reworked, a lot of content is
generated/worked out/negotiated on various different talk pages, often
not the main article talk page, before moving onto the actual article
page via copy and paste. There is also a fair amount of copy and paste
when sections are spun out to their own article or articles are merged
into a main article. In none of these cases does the article history
correctly attribute authorship. Main authorship is even worse: content
gets deleted by vandalism and restored so often that finding the
original main contributors is practically impossible.

I think Wikipedia is so far from compliant with the license, if we take
"one article as a document", that we have to accept that the whole
thing is the document in license terms and no history is available.

BozMo
=
On Sat, Mar 14, 2009 at 9:22 PM, Thomas Dalton  wrote:
> 2009/3/14 Magnus Manske :
>> IIRC one reason to use wiki/ and w/ instead of "direct" URLs
>> (en.wikipedia.org/Xenu) was to allow for non-article data at a later
>> time (the other reason was to set noindex/nofollow rules). Looks like
>> we will use that space after all :-)
>
> That may be one reason, but I think the main reason is to avoid
> problems with articles called things like "index.php". /wiki/ is a
> dummy directory, there's nothing actually there to conflict with, the
> root directory has real files in it that need to be accessible.
>



[Wikitech-l] statistics of relevant functions/lines

2009-03-16 Thread Maggi Federico
Hello List,
I know MW a little, but not well enough to avoid asking for hints here.

For research purposes, I am surveying code repositories of the top 10  
web applications using a modified version of StatSVN.

Basically, I need to track source changes, over releases, in terms of  
"Lines of Code (LOC) matching a certain regexp".

The functions/lines I am interested in are those somehow involved in  
handling HTTP request/session/response parameters. Of course, I allow  
a certain roughness in the analysis.

I am trying to build a list of such functions/lines. What would you  
suggest to include? What would you grep for if you want to estimate  
the functions that are, even indirectly, related to processing of HTTP  
requests/sessions/responses?

Any input is greatly appreciated. Thanks in advance. Cheers,

-- Federico



Re: [Wikitech-l] HTML not Rendered correctly after Import of Wikipedia

2009-03-16 Thread Roan Kattouw
O. O. wrote:
> I did not understand where I have to edit “imagemap_invalid_image”. I am
> new here, so I am not sure which file I have to edit.
> 
You have to edit [[MediaWiki:imagemap_invalid_image]], which is a wiki 
page, not a file.

Roan Kattouw (Catrope)


Re: [Wikitech-l] HTML not Rendered correctly after Import of Wikipedia

2009-03-16 Thread O. O.
Thanks Robert.

Robert Ullmann wrote:
> Hi,
> 
> This is immediately preceded by  tags, implemented by the
> "Cite.php" extension. Did you install that? May be your problem. Check
> that you have all the extensions that the 'pedia routinely uses.

Yes, this was a good guess. I actually could not get the latest version
of the Cite Extension (http://www.mediawiki.org/wiki/Extension:Cite) to
work with the latest version of MediaWiki, i.e. version 1.14.0. The
1.14.0 version gives me an error on the Main_Page:

Fatal error: Call to undefined method 
ParserOptions::getIsSectionPreview() in 
/var/www/wiki2/extensions/Cite/Cite_body.php on line 699

So instead I am using the 1.13.0 version of Cite Extension.

> 
> A simple trick: this is a mediawiki message that can be modified. Edit
> Mediawiki:imagemap_invalid_image and change it to   (not to
> blank, as that just gets you the default message). (Why this message
> starts with "imagemap" I don't know, but it doesn't seem to appear
> anywhere else? Try it at least.) You'll still get a blank line etc in
> some cases.

I did not understand where I have to edit “imagemap_invalid_image”. I am
new here, so I am not sure which file I have to edit.

Thanks again,
O. O.



Re: [Wikitech-l] not all tables need to be backed up

2009-03-16 Thread Roan Kattouw
Aryeh Gregor wrote:
> On Sun, Mar 15, 2009 at 3:02 PM, Gerard Meijssen wrote:
>> The one thing missing in this discussion is a risk assessment and the
>> importance given to maintaining our infrastructure availability.
> 
> What's "our"?  Jidanni is talking about backing up his own wiki, not
> any Wikimedia wiki.
> 
On top of that, restoring from dumps with these tables empty doesn't 
break anything. While the rebuild is in progress, features relying on 
them (search, whatlinkshere, category views) will show incomplete 
results, but if that means restoring is faster, I guess that's acceptable.

Roan Kattouw (Catrope)



Re: [Wikitech-l] Running deleteDefaultMessages.php on WMF wikis

2009-03-16 Thread Andrew Garrett
On Mon, Mar 16, 2009 at 11:52 PM, John Doe  wrote:
> This would break soo much, a lot of wikis have purposefully changed the
> default messages for tracking purposes and other reasons. deleting these
> messages would cause a lot of problems
>

I'm not sure you get it. deleteDefaultMessages.php only deletes
messages which have not been customised.

-- 
Andrew Garrett



Re: [Wikitech-l] Running deleteDefaultMessages.php on WMF wikis

2009-03-16 Thread John Doe
This would break so much; a lot of wikis have purposefully changed the
default messages for tracking purposes and other reasons. Deleting these
messages would cause a lot of problems.

On Sat, Mar 14, 2009 at 11:22 AM, Eugene Zelenko wrote:

> Hi!
>
> We need to clean up old MediaWiki message translations on be-x-old
> Wikipedia. deleteDefaultMessages.php looks like a perfect candidate for
> this job. However, the last changes from MediaWiki_default were made in
> 2007.
>
> Is this code still functional? If so, I think it would be a good idea
> to run it on all WMF wikis, especially those which have a long history
> of translating MediaWiki messages (before translatewiki.net).
>
> Eugene.
>