[Wikitech-l] second-class wikis

2009-02-02 Thread Marcus Buck
According to SiteMatrix we have 739 projects at the moment. There are 
three master partitions for the servers: s1 for enwiki only, s2 for 19 
other projects and s3 for all the rest (that's 719 projects).

My homewiki is one of those 719 projects. And I feel a bit neglected. 
Replication has been halted for 34 days. LuceneSearch 2.1 has been active on 
enwiki since October, and on dewiki and some other big wikis since 
December. Most other wikis still have no access to the new features. 
Even the +incategory: feature, which has been active on enwiki since April 
2008, is not active on most wikis as of February 2009.

It seems we are very low on the priority list.

Marcus Buck



Re: [Wikitech-l] Double redirects

2009-02-02 Thread Stephen Bain
On Tue, Feb 3, 2009 at 1:09 AM, Chad innocentkil...@gmail.com wrote:
 On Mon, Feb 2, 2009 at 8:43 AM, Russell Blau russb...@hotmail.com wrote:

 2)  If not, should there be a new Special: page to list such excessive
 redirect chains?

 Probably not. I'd say tweak the current one (a better name for it?)

ReReRedirects?

-- 
Stephen Bain
stephen.b...@gmail.com



Re: [Wikitech-l] Please make an optional way to set editsection links after section's title

2009-02-02 Thread Steve Summit
mizusumashi wrote:
 Please see [[w:en:User:Mizusumashi/workspace]] with Firefox.
 Don't you see the [edit] links moved near the second image?

I see two sections and two edit links, both of them moved down to
roughly the bottom edge of the first section's image.  I see this
all the time on the real wiki as well; as far as I know it's a
known, longstanding bug.  The solution on the real wiki is often
to put a {{-}} at the end of a section which has an image which
is vertically longer than the section's text tends to be.



Re: [Wikitech-l] second-class wikis

2009-02-02 Thread Gerard Meijssen
Hoi,
Can someone please explain why this is?
Thanks,
 GerardM

2009/2/1 Marcus Buck w...@marcusbuck.org

 According to SiteMatrix we have 739 projects at the moment. There are
 three master partitions for the servers: s1 for enwiki only, s2 for 19
 other projects and s3 for all the rest (that's 719 projects).

 My homewiki is one of those 719 projects. And I feel a bit neglected.
 Replication has been halted for 34 days. LuceneSearch 2.1 has been active on
 enwiki since October, and on dewiki and some other big wikis since
 December. Most other wikis still have no access to the new features.
 Even the +incategory: feature, which has been active on enwiki since April
 2008, is not active on most wikis as of February 2009.

 It seems we are very low on the priority list.

 Marcus Buck



Re: [Wikitech-l] Please make an optional way to set editsection links after section's title

2009-02-02 Thread Sylvain Brunerie
I see... Thanks.

— Sylvain Brunerie
[[w:fr:User:Delhovlyn]]


2009/2/2 mizusumashi mizusuma...@coda.ocn.ne.jp

 Hi, Sylvain.

  I don't see the problem with Firefox. Could you be more precise?

 Please see [[w:en:User:Mizusumashi/workspace]] with Firefox.
 Don't you see the [edit] links moved near the second image?

 
   mizusumashi



Re: [Wikitech-l] second-class wikis

2009-02-02 Thread Aryeh Gregor
On Mon, Feb 2, 2009 at 1:19 PM, Bence Damokos bdamo...@gmail.com wrote:
 If I read the original e-mail correctly there were two issues mentioned:
 the long replication lag on s3 -- explained by the deleted logs -- and long delays
 before any new features (search upgrades in particular) are enabled on what
 are perceived to be smaller wikis.
 I think it would be beneficial for everyone if a solution or explanation for
 the second issue was found, and we left the first issue be until the new
 servers arrive (or, if they do not come for a long time, it can be taken up
 again at a later time).

For search features, I found this thread:

http://lists.wikimedia.org/pipermail/wikitech-l/2008-October/039861.html

I'm not clear on the exact reasons.  My impression there is that the
feature could not easily be enabled for all wikis at once (due to rack
space issues?), so it was enabled one by one.  Given that, it seems
logical enough to start with the largest wikis to get the most
feedback earliest in the process.  This post refers to hardware
issues:

http://lists.wikimedia.org/pipermail/wikitech-l/2008-October/040022.html

This one says that some more RAM is being ordered:

http://lists.wikimedia.org/pipermail/wikitech-l/2009-January/040977.html

If this still isn't finished yet, obviously that's a problem, and I
don't know what the answer is.


It's unrealistic to expect identical treatment for enwiki and bxrwiki,
of course, but I don't see gross inattention toward the smaller wikis.
 I do use at least one pretty regularly, too, mediawiki.org (although
of course that's in English).  Mostly features get rolled out for all
wikis at the exact same time.  Of course, which features get developed
to begin with is a separate story -- there's clearly a bias toward
Western languages there, and especially English (look at category
sorting for an example of that).



Re: [Wikitech-l] – Fixing {val}

2009-02-02 Thread greg_l_at_wikipedia
I have a sandbox with hundreds of test numbers to exercise the  
{delimitnum} template. This template uses the same math-based  
functions as {val}. For anyone who has produced improved math  
functions, you can examine values on this sandbox that currently  
produce rounding errors and test your new math functions. The sandbox  
is at the following:

http://en.wikipedia.org/wiki/User:Greg_L/Delimitnum_sandbox

At the very bottom of the page (which loads slowly), is a concise list  
of values that all have errors.

It would still be much better if a character-counting parser function  
could be made. Then numbers like {{val|1.4500}} won't have their two  
trailing zeros truncated.
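
For illustration (editor's note, not template code): the moment "1.4500" is
handled as a number rather than a string, the written precision is gone, as a
quick PHP check shows.

<?php
// PHP (and therefore #expr) stores numbers as floats, so the trailing
// zeros of 1.4500 cannot survive a numeric round trip.
echo 1.4500, "\n";            // prints 1.45
echo (float) "1.4500", "\n";  // prints 1.45
echo "1.4500", "\n";          // only string handling keeps 1.4500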

Greg L


On Jan 31, 2009, at 8:02 PM, Robert Rohde wrote:

On Sat, Jan 31, 2009 at 5:43 PM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
 On Sat, Jan 31, 2009 at 8:33 PM, Robert Rohde raro...@gmail.com  
 wrote:
 This discussion is getting side tracked.

 The real complaint here is that

 {{#expr:(0.7 * 1000 * 1000) mod 1000}} is giving 69 when it  
 should give 70.

 This is NOT a formatting issue, but rather it is a bug in the #expr
 parser function, presumably caused by some kind of round-off error.

 $ php -r 'echo (0.7 * 1000 * 1000) % 1000 . "\n";'
 69
 $ php -r 'echo (int)(0.7 * 1000) . "\n";'
 699

 The issue is bog-standard floating-point error.  If PHP has a decent
 library for exact-precision arithmetic, we could probably use that.
 Otherwise, template programmers will have to learn how floating-point
 numbers work just like all other programmers in the universe.

In r46671 I have added an explicit test for floating point numbers
that are within 1 part in 10^10 of integers before performing
round-off sensitive conversions and comparisons.

This should eliminate these errors in many cases.

-Robert Rohde



Re: [Wikitech-l] Double redirects

2009-02-02 Thread Chad
ExcessiveRedirects?

-Chad

On Feb 2, 2009 10:31 AM, Stephen Bain stephen.b...@gmail.com wrote:

On Tue, Feb 3, 2009 at 1:09 AM, Chad innocentkil...@gmail.com wrote:  On
Mon, Feb 2, 2009 at 8:43...

 2) If not, should there be a new Special: page to list such excessive
 redirect chains?   ...
ReReRedirects?

--
Stephen Bain
stephen.b...@gmail.com



Re: [Wikitech-l] war on Cite/{{cite}}

2009-02-02 Thread Domas Mituzas

In a few minutes the Cite cache dropped Cite out of our profiling top-50  
page, and even though it accounted for 10% of cluster load today, it  
is already under 3%.
I claim 90% of the honor for this improvement; the rest goes to Andrew,  
who implemented it all ;-D

Do note, the war is not over yet; we will have to think about how to resolve  
metatemplates properly :)

BR,
-- 
Domas Mituzas -- http://dammit.lt/ -- [[user:midom]]





Re: [Wikitech-l] war on Cite/{{cite}}

2009-02-02 Thread Chad
We could always go back to no templates at all ;-)

-Chad

On Feb 2, 2009 2:58 PM, Domas Mituzas midom.li...@gmail.com wrote:


in few minutes Cite cache dropped Cite out of our profiling top50
page, and even though it accounted for 10% of cluster load today, it
is under 3% already.
I claim 90% of honor for this improvement, the rest goes to Andrew,
who implemented it all ;-D

Do note, the war is not over yet, we will have to think how to resolve
metatemplates properly :)

BR,

-- Domas Mituzas -- http://dammit.lt/ -- [[user:midom]]
...



Re: [Wikitech-l] – Fixing {val}

2009-02-02 Thread greg_l_at_wikipedia
Indeed. When Wikipedia's servers are cooking along, the first-time  
load on the sandbox is entirely tolerable. Once you've got it cached  
after that very first time, it's much better still. Once you get  
there, you will find many, many number-sequence strategies. Some of  
you mathematically brighter types might even see a pattern in all  
those errors.

On Feb 2, 2009, at 11:31 AM, Robert Rohde wrote:

On Mon, Feb 2, 2009 at 11:17 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
 On Mon, Feb 2, 2009 at 1:55 PM, greg_l_at_wikipedia
 greg_l_at_wikipe...@comcast.net wrote:
 I have a sandbox with hundreds of test numbers to exercise the
 {delimitnum} template. This template uses the same math-based
 functions as {val}. For anyone who has produced improved math
 functions, you can examine values on this sandbox that currently
 produce rounding errors and test your new math functions. The sandbox
 is at the following:

 http://en.wikipedia.org/wiki/User:Greg_L/Delimitnum_sandbox

 At the very bottom of the page (which loads slowly), is a concise  
 list
 of values that all have errors.

 The page is so slow that even attempting to view it times out for me.


It is large and slow (loaded for me after about 20 seconds), but we've
also had some recent sporadic reports of routine things like
watchlists being sluggish or failing to load.  So, in addition to his
page being huge, there may be other factors at work as well at the
moment.

-Robert Rohde



Re: [Wikitech-l] Double redirects

2009-02-02 Thread Chad
NPOV doesn't apply to MediaWiki. :)

-Chad

On Mon, Feb 2, 2009 at 3:22 PM, Nikola Smolenski smole...@eunet.yu wrote:

 On Monday 02 February 2009 19:56:16 Chad wrote:
  ExcessiveRedirects?

 That's POV. MultipleRedirects?

  On Feb 2, 2009 10:31 AM, Stephen Bain stephen.b...@gmail.com wrote:
  On Tue, Feb 3, 2009 at 1:09 AM, Chad innocentkil...@gmail.com wrote: 
 On
  Mon, Feb 2, 2009 at 8:43...
   2) If not, should there be a new Special: page to list such
 excessive
   redirect chains?   ...
 
  ReReRedirects?



Re: [Wikitech-l] war on Cite/{{cite}}

2009-02-02 Thread David Gerard
2009/2/2 Andrew Garrett and...@epstone.net:

 Domas and I worked on this on IRC for a bit just now, and the change
 has been synced to Wikimedia wikis. It generates the cache key from an
 md5 of the input to the parser and the page-id. Render hash could be
 included in this if it causes problems, but I'm not sure what stuff in
 references will depend on the render hash and it may be safe to keep
 it out of the cache key.


Heh. How much of an evil hack is this?


- d.



Re: [Wikitech-l] war on Cite/{{cite}}

2009-02-02 Thread Robert Rohde
You guys appear to have broken something.

Look at ref #13 at the bottom of [1]

Instead of saying [[Discover]] it magically says [[encephalitis]] for
the magazine name.

-Robert Rohde

[1] 
http://en.wikipedia.org/w/index.php?title=Visna_virus&diff=268100404&oldid=268099718


On Mon, Feb 2, 2009 at 11:56 AM, Domas Mituzas midom.li...@gmail.com wrote:

 in few minutes Cite cache dropped Cite out of our profiling top50
 page, and even though it accounted for 10% of cluster load today, it
 is under 3% already.
 I claim 90% of honor for this improvement, the rest goes to Andrew,
 who implemented it all ;-D

 Do note, the war is not over yet, we will have to think how to resolve
 metatemplates properly :)

 BR,
 --
 Domas Mituzas -- http://dammit.lt/ -- [[user:midom]]





Re: [Wikitech-l] – Fixing {val}

2009-02-02 Thread Greg L
Robert: With regard to your question, "Greg, how often do you worry about  
more than 10 digits?", the answer is: not too often. The highest-  
precision set of number sequences on my sandbox is nine significant  
digits; e.g. 0.298728008.

Getting floating-point math to no longer produce rounding errors (as if  
it were a cheap hand-held calculator where 5/3 x 3 = 4.998) will  
likely take away most of the need to have a caveat on WP:MOSNUM about  
using {val}. Still, it could never properly parse a number like  
1.2300 and leave the last two zeros in the result.

As for your comment, Robert: "Though, we may still want to consider  
building number formatting options into MediaWiki directly since I'm  
not sure putting all of that formatting effort into template space is  
really a good idea", I agree; MediaWiki would be the better place for  
this. There were a number of editors on WT:MOSNUM who were very  
enthusiastic about the notion of a character-doling parser function  
that could treat *anything* as a string of mindless characters and  
could respond to requests like "gimme three" – "gimme three more" –  
"and gimme the last four".

If such a character-doling parser function had a little bit of extra  
sophistication (perhaps if combined with other existing functions),  
then template authors could ask a couple of questions like these about  
the nature of the string:

"How many characters are to the left of the decimal point?"
"How many characters are to the right?"

  …and then request the following from a character-doling function:

"Give me the first three to the right of the decimal point"

As I understand it, adding character-doling functionality to  
MediaWiki would be very well received indeed.
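
To make the idea concrete, a rough plain-PHP sketch of such a character-doling
helper might look like this (the function name and behaviour are invented for
illustration; no such parser function exists yet):

<?php
// Illustrative only: split a number string around the decimal point and
// dole out the requested characters, never converting to float.
function digits( $value, $side = 'right', $count = null ) {
    $parts    = explode( '.', $value, 2 );
    $intPart  = $parts[0];
    $fracPart = isset( $parts[1] ) ? $parts[1] : '';
    $s = ( $side === 'left' ) ? $intPart : $fracPart;
    return ( $count === null ) ? strlen( $s ) : substr( $s, 0, $count );
}

echo digits( '1.4500', 'right' ), "\n";     // 4   (characters right of the point)
echo digits( '1.4500', 'right', 3 ), "\n";  // 450 (first three to the right)
echo digits( '1.4500', 'left' ), "\n";      // 1   (characters left of the point)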

Greg


On Feb 2, 2009, at 12:33 PM, Robert Rohde wrote:

On Mon, Feb 2, 2009 at 10:55 AM, greg_l_at_wikipedia
greg_l_at_wikipe...@comcast.net wrote:
 I have a sandbox with hundreds of test numbers to exercise the
 {delimitnum} template. This template uses the same math-based
 functions as {val}. For anyone who has produced improved math
 functions, you can examine values on this sandbox that currently
 produce rounding errors and test your new math functions. The sandbox
 is at the following:

 http://en.wikipedia.org/wiki/User:Greg_L/Delimitnum_sandbox

 At the very bottom of the page (which loads slowly), is a concise list
 of values that all have errors.

 It would still be much better if a character-counting parser function
 could be made. Then numbers like {{val|1.4500}} won't have their two
 trailing zeros truncated.

Based on feedback, principally from Aryeh, I modified a previous
effort to add an explicit fractional tolerance test of 1e-10 to #expr
when performing comparisons and integer conversion (e.g. floor, ceil,
mod, etc.)

That would have the effect that the comparison functions and integer
conversion would generally operate correctly up to 10 significant
figures (e.g. A == B if A/B - 1 < 1e-10).  This would eliminate many
of the problems associated with comparison functions and modular
arithmetic, but with the tradeoff that those same operations may give
unexpected results if the significand is intended to have more than 10
base-10 digits.
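
A minimal PHP sketch of that kind of tolerance test (names and exact logic are
illustrative; this is not the actual r46683 code):

<?php
// Treat a float as an integer if it is within 1 part in 10^10 of one,
// before doing round-off sensitive operations such as floor, ceil or mod.
function snapToInt( $x, $tol = 1e-10 ) {
    $nearest = round( $x );
    if ( $nearest != 0 ) {
        if ( abs( $x / $nearest - 1 ) < $tol ) {
            return $nearest;
        }
    } elseif ( abs( $x ) < $tol ) {
        return 0.0;
    }
    return $x;
}

echo snapToInt( 0.7 * 1000 ), "\n";                        // 700 instead of 699.999...
echo fmod( snapToInt( 0.7 * 1000 * 1000 ), 1000 ), "\n";   // 0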

This doesn't apply to basic math operations (+, -, *, ^, etc.) except
where they are coupled to either comparisons or integer conversions.
So if someone simply wanted to do computations, such as 47.3 *
10.23^7.8, then they will still get all ~15 digits that PHP is capable
of displaying.

I see this principally as a step to make the basic math functions more
reliable and consistent with user expectation by reducing the impact
of floating point rounding effects.  An indirect effect is that I
believe those number formatting templates would generally work as
intended for significands with no more than 10 digits.  (Greg, how
often do you worry about more than 10 digits?)  Though, we may still
want to consider building number formatting options into Mediawiki
directly since I'm not sure putting all of that formatting effort into
template space is really a good idea.

Aryeh and I seem to have a difference of opinion about whether
sacrificing precision for some operations involving more than 10
significant digits is an acceptable trade-off for quashing round-off
problems most of the rest of the time.  The code is in r46683, and I
would appreciate some additional viewpoints on this.

Thanks.

-Robert Rohde



Re: [Wikitech-l] Please make an optional way to set editsection links after section's title

2009-02-02 Thread Sylvain Brunerie
I don't see the problem with Firefox. Could you be more precise?

— Sylvain Brunerie
[[w:fr:User:Delhovlyn]]


2009/2/2 mizusumashi mizusuma...@coda.ocn.ne.jp

 Floating editsection links work well with IE7.
 But floating editsection links move to a bad position with Firefox,
 Safari, and Google Chrome.


Re: [Wikitech-l] Double redirects

2009-02-02 Thread Chad
On Mon, Feb 2, 2009 at 8:43 AM, Russell Blau russb...@hotmail.com wrote:

 It appears that since r45973, there is an enhancement to MediaWiki that
 allows the software to follow double redirects, and conceivably triple- and
 higher-level redirects, up to a configurable limit ($wgMaxRedirects).  This
 raises a few questions:

 1)  Should [[Special:DoubleRedirects]] be changed to show only those
 redirect chains that exceed $wgMaxRedirects in length, since those are the
 only ones that really need fixing under the current software?


Probably. Sounds like a pretty good idea, actually. If a wiki allows chains
of 3 redirects, there's no point in telling them that chains of 2 or 3 are
bad -- only 4+ are.



 2)  If not, should there be a new Special: page to list such excessive
 redirect chains?


Probably not. I'd say tweak the current one (a better name for it?)


 3)  How can the users (not sysadmins) of a given wiki determine what the
 value of $wgMaxRedirects is for their wiki?  In particular, what value is
 being used currently on enwiki and other WMF projects?


The defaults for every setting are available for viewing on MediaWiki.org.
For this setting, the default appears to be 1, meaning that we keep the long-
standing behavior by default. As for seeing your current settings, you
can check out the configuration files at [1].
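
For anyone running their own wiki, the limit is an ordinary LocalSettings.php
setting; a minimal example (the value 3 is purely illustrative):

<?php
// LocalSettings.php -- follow redirect chains of up to three hops.
// The MediaWiki default is 1, i.e. the long-standing single-redirect behaviour.
$wgMaxRedirects = 3;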


 4)  Shouldn't this configuration change have been announced to the user
 community of each project so that they could consider how it affects their
 internal policies on what types of redirects to allow?

 Russ


New configuration options are introduced all the time. If the default settings
don't change current behavior, they generally aren't announced widely. Keeping
track of the various changes (on MW.org, Subversion, or WP:POST on enwiki)
is your best bet for knowing when something new is available to you.

-Chad

[1] - http://noc.wikimedia.org/conf/ - CommonSettings and InitialiseSettings
are the two places to look.


Re: [Wikitech-l] – Fixing {val}

2009-02-02 Thread Gregory Maxwell
On Sat, Jan 31, 2009 at 8:33 PM, Robert Rohde raro...@gmail.com wrote:
 This discussion is getting side tracked.

 The real complaint here is that

 {{#expr:(0.7 * 1000 * 1000) mod 1000}} is giving 69 when it should give 
 70.

 This is NOT a formatting issue, but rather it is a bug in the #expr
 parser function, presumably caused by some kind of round-off error.

It's a bug in the user's understanding of floating point on computers,
combined with % being (quite naturally) an operator on integers.

0.7… does not exist in your finite-precision, base-2 computer.

I don't think it's reasonable for MediaWiki to include a full radix-n
multi-precision floating point library in order to capture the
behavior you expected for these cases, any more than it would be
reasonable to expect it to contain a full computer algebra system so
it could handle manipulations of irrationals precisely.
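
For illustration, a quick PHP check shows that 0.7 is already inexact before
any template logic runs:

<?php
// The closest double to 0.7 is slightly below 0.7, so scaling it up and
// truncating to an integer loses one unit in the last place.
printf( "%.17g\n", 0.7 );               // 0.69999999999999996
printf( "%d\n", (int) ( 0.7 * 1000 ) ); // 699, not 700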


Re: [Wikitech-l] – Fixing {val}: floating point

2009-02-02 Thread Greg L
Let me try that again since I mangled the last response:

Sure. And my expectation that 5 divided by 3 times 3 ought to equal 5  
and not 4.998 is due to my relative understanding of math and my  
relative lack of understanding of the inner workings of common  
calculators. Is there some downside to fixing rounding errors in math  
functions? If so, then we might as well throw up our hands and admit  
that trying to delimit numbers via math functions is a lost cause.

On Feb 2, 2009, at 2:32 PM, Gregory Maxwell wrote:

On Sat, Jan 31, 2009 at 8:33 PM, Robert Rohde raro...@gmail.com wrote:
 This discussion is getting side tracked.

 The real complaint here is that

 {{#expr:(0.7 * 1000 * 1000) mod 1000}} is giving 69 when it  
 should give 70.

 This is NOT a formatting issue, but rather it is a bug in the #expr
 parser function, presumably caused by some kind of round-off error.

It's a bug in the user's understanding of floating point on computers,
combined with % being (quite naturally) an operator on integers.

0.7… does not exist in your finite precision base-2 based  
computer.

I don't think it's reasonable for Mediawiki to include a full radix-n
multi-precision floating point library in order to capture the
expected your behavior for these cases, any more than it would be
reasonable to expect it to contain a full computer algebra system so
it could handle manipulations of irrationals precisely.



Re: [Wikitech-l] Double redirects

2009-02-02 Thread Chad
On Mon, Feb 2, 2009 at 9:22 AM, Russell Blau russb...@hotmail.com wrote:

 Chad innocentkil...@gmail.com wrote in message
 news:5924f50a0902020609q542047cfme84b9237eec38...@mail.gmail.com...
  3)  How can the users (not sysadmins) of a given wiki determine what the
  value of $wgMaxRedirects is for their wiki?  In particular, what value
 is
  being used currently on enwiki and other WMF projects?
 
 
  The defaults for every setting are available for viewing on Mediawiki.org
  For this setting, the default appears to be 1, meaning that we keep long-
  standing behavior for the default. As for seeing your current settings,
  you
  can check out the configuration files at [1].
 
 Hmm.  Enwiki currently seems to be following double-redirects
 notwithstanding this setting.  I am aware that the initial implementation
 of
 this fix contained some bugs that (we hope) will be fixed in the next
 software update, so probably best to wait until after that update is done
 before worrying further.

 Russ


There were some regressions, but I believe they've been fixed.

-Chad


Re: [Wikitech-l] – Fixing {val}

2009-02-02 Thread Gregory Maxwell
On Mon, Feb 2, 2009 at 5:56 PM, Robert Rohde raro...@gmail.com wrote:
 I've already written code that converts 4.98 to 5 immediately
 before performing operations that explicitly expect integers by
 applying an explicit 1 part in 10^10 tolerance.  That covers a wide
 range of cases that might be affected by round-off errors while adding
 little overhead.

I can't wait to see the complaints when some compiler upgrade results
in the code using the 32bit SSE registers rather than the 80bit FPU
and producing inexplicably different results.

The whole reason that the discussion is even being had here is because
Wikipedia is full of pedants.  Adding a number-fudging kludge to
produce a less accurate result in order to fix some other kludge, just
to satisfy one set of pedants, is only going to irritate another set of
pedants later.

Doing math in the parser is the wrong solution for this. Give the
users string manipulation functions and save the math for integer-only
calculations and sausage-making like table layout.  Finite-precision
math simply can't be guaranteed to produce reasonable results for
non-integer content-math purposes.



Re: [Wikitech-l] war on Cite/{{cite}}

2009-02-02 Thread Andrew Garrett
On Sat, Jan 31, 2009 at 5:03 AM, Domas Mituzas midom.li...@gmail.com wrote:
 [  ] - Separate cache for Cite, to avoid reparsing on minor edits,
 that don't involve citations. I have no idea how much this would win,

Domas and I worked on this on IRC for a bit just now, and the change
has been synced to Wikimedia wikis. It generates the cache key from an
md5 of the input to the parser and the page-id. Render hash could be
included in this if it causes problems, but I'm not sure what stuff in
references will depend on the render hash and it may be safe to keep
it out of the cache key.
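
Roughly, the scheme amounts to something like the sketch below (names are
illustrative; this is not the actual Cite extension code):

<?php
// Illustrative only: derive a cache key for the rendered references block
// from the page id plus an md5 of the text that would be fed to the parser.
function citeCacheKey( $pageId, $parserInput ) {
    return "cite:refs:$pageId:" . md5( $parserInput );
}

$refsInput = '<ref>Smith 2008.</ref><ref>Jones 2007.</ref>';
echo citeCacheKey( 12345, $refsInput ), "\n";
// An edit that does not touch the citations leaves $refsInput unchanged,
// so the same key is hit and the references block is not re-parsed.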

The change has been pretty handy, from what I can see. Reports I've
received indicate that render time for [[en:Rod Blagojevich corruption
charges]] dropped from 10.684 s to < 2.700 s.

-- 
Andrew Garrett



Re: [Wikitech-l] Please make an optional way to set editsection links after section's title

2009-02-02 Thread Chad
On Mon, Feb 2, 2009 at 12:13 PM, Steve Summit s...@eskimo.com wrote:

 mizusumashi wrote:
  Please see [[w:en:User:Mizusumashi/workspace]] with Firefox.
  Don't you see the [edit] links moved near the second image?

 I see two sections and two edit links, both of them moved down to
 roughly the bottom edge of the first section's image.  I see this
 all the time on the real wiki as well; as far as I know it's a
 known, longstanding bug.  The solution on the real wiki is often
 to put a {{-}} at the end of a section which has an image which
 is vertically longer than the section's text tends to be.



There's a bug open about it (1629). Bug 11270 has a proposed
solution that would fix both 1629 and 11555.

-Chad


[Wikitech-l] Double redirects

2009-02-02 Thread Russell Blau
It appears that since r45973, there is an enhancement to MediaWiki that 
allows the software to follow double redirects, and conceivably triple- and 
higher-level redirects, up to a configurable limit ($wgMaxRedirects).  This 
raises a few questions:

1)  Should [[Special:DoubleRedirects]] be changed to show only those 
redirect chains that exceed $wgMaxRedirects in length, since those are the 
only ones that really need fixing under the current software?

2)  If not, should there be a new Special: page to list such excessive 
redirect chains?

3)  How can the users (not sysadmins) of a given wiki determine what the 
value of $wgMaxRedirects is for their wiki?  In particular, what value is 
being used currently on enwiki and other WMF projects?

4)  Shouldn't this configuration change have been announced to the user 
community of each project so that they could consider how it affects their 
internal policies on what types of redirects to allow?

Russ






Re: [Wikitech-l] Double redirects

2009-02-02 Thread Thomas Dalton
2009/2/2 Nikola Smolenski smole...@eunet.yu:
 On Monday 02 February 2009 19:56:16 Chad wrote:
 ExcessiveRedirects?

 That's POV. MultipleRedirects?

Well, it's the POV of whoever set the maximum; is that a problem?



Re: [Wikitech-l] – Fixing {val}

2009-02-02 Thread Robert Rohde
On Mon, Feb 2, 2009 at 3:25 PM, Gregory Maxwell gmaxw...@gmail.com wrote:
snip
 Doing math in the parser is the wrong solution for this. Give the
 users string manipulation functions and save the math for integer only
 calculations and sausage making like table layout.  Finite precision
 math simply can't be guaranteed to produce reasonable results for
 non-integer content math purpose.

I am happy to agree with you that there would be many uses for string
functions, including places related to number manipulation.

I don't think that obviates the need for math functions to behave in
intuitive ways in as many widely used cases as possible, though.

-Robert Rohde



Re: [Wikitech-l] – Fixing {val}

2009-02-02 Thread Robert Rohde
On Mon, Feb 2, 2009 at 11:17 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
 On Mon, Feb 2, 2009 at 1:55 PM, greg_l_at_wikipedia
 greg_l_at_wikipe...@comcast.net wrote:
 I have a sandbox with hundreds of test numbers to exercise the
 {delimitnum} template. This template uses the same math-based
 functions as {val}. For anyone who has produced improved math
 functions, you can examine values on this sandbox that currently
 produce rounding errors and test your new math functions. The sandbox
 is at the following:

 http://en.wikipedia.org/wiki/User:Greg_L/Delimitnum_sandbox

 At the very bottom of the page (which loads slowly), is a concise list
 of values that all have errors.

 The page is so slow that even attempting to view it times out for me.


It is large and slow (loaded for me after about 20 seconds), but we've
also had some recent sporadic reports of routine things like
watchlists being sluggish or failing to load.  So, in addition to his
page being huge, there may be other factors at work as well at the
moment.

-Robert Rohde



Re: [Wikitech-l] war on Cite/{{cite}}

2009-02-02 Thread Tim Starling
Andrew Garrett wrote:
 On Sat, Jan 31, 2009 at 5:03 AM, Domas Mituzas midom.li...@gmail.com wrote:
 [  ] - Separate cache for Cite, to avoid reparsing on minor edits,
 that don't involve citations. I have no idea how much this would win,
 
 Domas and I worked on this on IRC for a bit just now, and the change
 has been synced to Wikimedia wikis. It generates the cache key from an
 md5 of the input to the parser and the page-id. Render hash could be
 included in this if it causes problems, but I'm not sure what stuff in
 references will depend on the render hash and it may be safe to keep
 it out of the cache key.
 
 The change has been pretty handy, from what I can see. Reports I've
 received indicate that render time for [[en:Rod Blagojevich corruption
 charges]] dropped from 10.684 s to < 2.700 s.

What would <ref>{{ {{PAGENAME}}/Citations }}</ref> return?

-- Tim Starling




Re: [Wikitech-l] Interwiki recommendations

2009-02-02 Thread Marcus Buck
Lars Aronsson wrote:
 What is the best way to organize infobox templates for geographic 
 places, the one used on the French, the Polish, or the Turkish 
 Wikipedia?  What are the most important features in use on other 
 languages of Wikipedia, that my language is still missing?

 Are these questions of a kind that you sometimes ask yourself?  
 If so, where do you go to find the answers?  Are we all just 
 copying ideas from the English Wikipedia?  Or inventing our own 
 wheels? Has anybody collected stories of how one project learned 
 something useful from another one?
As you are speaking of infoboxes and cross-wiki matters, I want to chip in 
another thought: why do we actually place infobox templates on every 
single wiki? In 2007 I created some semiautomatic bot articles about 
municipalities on my home wiki. In 2008 they held elections and elected 
new mayors, so my articles mentioning the mayors were outdated. The 
articles in the main language of that country were updated relatively 
quickly; mine are not yet. I plan to do it, but who does that for all 
articles in all language editions?

An example: Bavaria held communal elections in March 2008. Enough time 
to update infoboxes. The municipality of Adelzhausen got Lorenz Braun as 
its new mayor, replacing Thomas Goldstein. I checked all interwikis of the 
German article. Two had it right; both were created after the elections. 
Four didn't mention the mayor at all, and six still mentioned the old 
mayor. No wiki had bothered to update the information.

It would be much easier if we had a central repository for the data. We 
would place the infobox data in the central wiki. Each wiki could then fetch 
the data from the central wiki, just as images are fetched from Commons, 
and render the data into a localised infobox. That would be much more 
accurate than maintaining redundant info on potentially hundreds of wikis.
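
As a very rough sketch of the idea (the central wiki, its URL, the data-page
format and the template name are all hypothetical):

<?php
// Hypothetical sketch: each article pulls its figures from one central data
// page. URL, page layout and field names are invented for illustration.
$url = 'https://data.example.org/w/index.php?title=Adelzhausen/data&action=raw';
$raw = file_get_contents( $url );   // imagine lines like "mayor = Lorenz Braun"

$data = array();
foreach ( explode( "\n", trim( $raw ) ) as $line ) {
    if ( strpos( $line, '=' ) === false ) {
        continue;
    }
    list( $key, $value ) = array_map( 'trim', explode( '=', $line, 2 ) );
    $data[$key] = $value;
}

// The local wiki only keeps a localised rendering template:
echo "{{Infobox Gemeinde|mayor={$data['mayor']}}}\n";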

Marcus Buck

PS: And that would be interesting with regard to botopedias too. Volapük 
Wikipedia was massively criticized for creating masses of bot content. 
With a central wiki for data, creating articles for, say, all the 
~37,000 municipalities of France would essentially be reduced to 
creating a template that renders the central content into an article. 
Small Wikipedias could benefit greatly if they just had to create some 
templates to make info on hundreds of thousands of topics available to 
the speakers of their language. It would be very basic, infobox-like 
information, but it would be information.



Re: [Wikitech-l] Interwiki recommendations

2009-02-02 Thread Chad
Step 1 would be making interwiki transclusion not suck.
It's been a long-standing back-burner project of mine.

-Chad

On Feb 2, 2009 8:05 PM, Marcus Buck w...@marcusbuck.org wrote:

Lars Aronsson wrote:

 What is the best way to organize infobox templates for geographic 
places, the one used on the F...
As you are speaking of infoboxes and crosswiki, I want to chip in
another thought: why do we actually place infobox templates on every
single wiki? In 2007 I created some semiautomatic bot articles about
municipalities on my home wiki. In 2008 they had elections and elected
new mayors. So my articles mentioning the mayors were outdated. The
articles in the main language of that country were updated relatively
quickly, Mine are not yet. I plan to do, but who does that for all
articles in all language editions?

An example: Bavaria held communal elections in March 2008. Enough time
to update infoboxes. The municipality Adelzhausen got Lorenz Braun as
new mayor, replacing Thomas Goldstein. I checked all interwikis of the
German article. Two had it right. Both were created after the elections.
Four don't mention the mayor at all, and six still mentioned the old
mayor. No wiki had bothered to update the information.

It would be much easier, if we had a central repository for the data. We
would place infoboxes in the central wiki. Each wiki then could fetch
the data from the central wiki just as images are fetched from Commons
and render the data into a localised infobox. That would be much more
accurate than maintaining redundant info on potentially hundreds of wikis.

Marcus Buck

PS: And that would be interesting in regard to botopedias too. Volapük
Wikipedia was massively criticized for creating masses of bot content.
With a central wiki for data creating articles for example for all the
~37,000 municipalities of France would essentially be reduced to
creating a template that renders the central content into an article.
Little Wikipedias could greatly benefit, if they just had to create some
templates to make available info on hundreds of thousands of topics to
the speakers of their language. It would be very basic, infobox-like
information, but it would be information.



Re: [Wikitech-l] Double redirects

2009-02-02 Thread Platonides
Nikola Smolenski wrote:
 On Monday 02 February 2009 21:33:26 Chad wrote:
 NPOV doesn't apply to MediaWiki. :)
 
 But, but, then we won't be able to use MediaWiki on Wikipedia! :)

WHAT?? You, mere mortal, are threatening to stop using the great software
made by the developers?? How dare you?

ca...@larousse$ /usr/local/bin/suicidenote -u Nikola -t 1233620512

http://en.wikipedia.org/wiki/Wikipedia:Do_NOT_bite_the_developers

