Re: [Wikitech-l] Showing bytes added/removed in each edit in View history and User contributions

2010-08-03 Thread ChrisiPK
This is a policy requirement, not a technical requirement, and can surely be
adjusted.

On 03.08.2010 07:14, Liangent wrote:
 On 8/3/10, Aryeh Gregor simetrical+wikil...@gmail.com wrote:
 No, we'd just have to repurpose rev_len to mean characters instead
 of bytes, and update all the old rows.  We don't actually need the
 byte count for anything, do we?
 
 Byte count is used. For example, on the Chinese Wikipedia one of the
 criteria for "Did you know" articles is >= 3000 bytes.
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




Re: [Wikitech-l] changing main page articles from drop down. help required

2010-08-03 Thread Dmitriy Sintsov
* Noman nom...@gmail.com [Tue, 3 Aug 2010 10:04:47 +0500]:
 Hi,
 I've installed MediaWiki for a wiki project.
 We now have 4 sections on the main page, like there are on the Wikipedia
 main page.

 As it's done on Wikipedia, these 4 boxes are tables and update on a date
 criterion.

 What I want now is to add some kind of navigation bar, like a drop-down
 or paging, so that when the user selects a page from the drop-down, all
 4 boxes are updated with the relevant articles. To be clearer: how can
 the four parts of the table be updated, and how can I add a combo box
 (already filled with page numbers / article topics) so that when the
 user selects any page from the drop-down, all 4 table rows are updated?

 Can anybody help me?
 I'm new to MediaWiki and have searched a lot, but did not find anything:
 no material or examples. If I have to write my own extension, how will
 the flow work, and how will it be plugged into MediaWiki and run
 smoothly?

 Any article, example, or e-book from where I can read step by step?

The simplest (although not the prettiest) way is to use a set of 
iframes and to load their src from the select option value in an onchange 
event handler:
http://lab.artlung.com/dropdown/
(scroll down to the "Dropdown Navigation to iFrame" example). One might choose 
to pass printable=yes so as not to recursively include menus and user 
interface in the iframes.
I am not sure, though, whether iframe is stripped by the parser.
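
As a rough sketch of that iframe approach (the element ids and the /wiki script path are made-up assumptions, not part of any existing extension):

```javascript
// Hypothetical wiring for the dropdown-to-iframe idea: when the user
// picks a title from the <select>, load that page into the <iframe>,
// passing printable=yes so the skin's menus are not included.
function pageUrl(title) {
  // '/wiki/index.php' is an assumed script path; adjust for your wiki.
  return '/wiki/index.php?title=' + encodeURIComponent(title) +
         '&printable=yes';
}

// Browser-only part, guarded so the helper above stays usable elsewhere.
if (typeof document !== 'undefined') {
  var select = document.getElementById('pagePicker');   // assumed id
  var frame = document.getElementById('contentFrame');  // assumed id
  select.onchange = function () {
    frame.src = pageUrl(select.value);
  };
}
```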

The harder (yet more proper) way would be to develop an extension which 
loads the content of the selected pages into divs via API AJAX calls. I 
don't know whether such an extension already exists.
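
A minimal sketch of that AJAX approach, assuming the standard api.php action=parse endpoint (the helper name and element ids here are illustrative only, not an existing extension):

```javascript
// Build a standard MediaWiki api.php request that returns a page's
// rendered HTML as JSON (action=parse).
function buildParseUrl(apiBase, title) {
  return apiBase + '?action=parse&page=' + encodeURIComponent(title) +
         '&format=json';
}

// Browser-only part: fetch the rendered HTML and drop it into a div.
if (typeof document !== 'undefined') {
  function loadInto(divId, title) {
    fetch(buildParseUrl('/w/api.php', title))
      .then(function (r) { return r.json(); })
      .then(function (data) {
        // action=parse returns the rendered HTML under parse.text['*']
        document.getElementById(divId).innerHTML = data.parse.text['*'];
      });
  }
  // e.g. loadInto('box1', 'Some/Selected/Article');
}
```
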
Dmitriy



Re: [Wikitech-l] Wikitech-l Digest, Vol 85, Issue 9

2010-08-03 Thread Noman
Thanks Dmitriy,
I'm looking for the div solution, as an iframe will show scrollbars if the
content is large, which is not desired.
I was unable to find a step-by-step approach to developing an extension.

If you have anything / an example, I'll be waiting.

Noman

On Tue, Aug 3, 2010 at 11:58 AM, wikitech-l-requ...@lists.wikimedia.org wrote:

 Dmitriy






Re: [Wikitech-l] Showing bytes added/removed in each edit in View history and User contributions

2010-08-03 Thread Liangent
On 8/3/10, ChrisiPK chris...@gmail.com wrote:
 This is a policy requirement, not a technical requirement, and can surely be
 adjusted.

It seems 1 zh char = 3 bytes gives a reasonably proper weighting among
characters. Obviously, zh characters look more important (when counting
the amount of content) than en characters, which are usually wiki syntax,
on zh.wp...
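
For reference, the 3-bytes-per-character weighting is just UTF-8 encoding at work; a minimal JavaScript sketch (standalone, not MediaWiki code) shows the difference between byte length and character count:

```javascript
// Byte length vs. character count for UTF-8 text. Common Chinese
// characters encode to 3 bytes each, ASCII to 1 byte, which is why a
// byte threshold weights zh text about 3x as heavily as en text.
function byteLength(s) {
  // TextEncoder always encodes to UTF-8 (browsers and Node.js >= 11)
  return new TextEncoder().encode(s).length;
}

function charCount(s) {
  // Spread iterates by code point, not by UTF-16 code unit
  return [...s].length;
}

console.log(byteLength('维基百科')); // 12 — four 3-byte characters
console.log(charCount('维基百科'));  // 4
console.log(byteLength('wiki'));     // 4 — ASCII is 1 byte per character
```

A rev_len in characters would report 4 for both strings above; a rev_len in bytes reports 12 vs. 4.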




Re: [Wikitech-l] Showing bytes added/removed in each edit in View history and User contributions

2010-08-03 Thread Robert Ullmann
Ahem.

The revision size (and page size, meaning that of the last revision) in
bytes is available in the API. If you change the definition, there is
no telling what you will break. Essentially, you can't.

A character count would have to be another field.

best,
Robert

On Tue, Aug 3, 2010 at 9:53 AM, ChrisiPK chris...@gmail.com wrote:
 This is a policy requirement, not a technical requirement, and can surely be
 adjusted.

 On 03.08.2010 07:14, Liangent wrote:
 On 8/3/10, Aryeh Gregor simetrical+wikil...@gmail.com wrote:
 No, we'd just have to repurpose rev_len to mean characters instead
 of bytes, and update all the old rows.  We don't actually need the
 byte count for anything, do we?

 Byte count is used. For example, on the Chinese Wikipedia one of the
 criteria for "Did you know" articles is >= 3000 bytes.






Re: [Wikitech-l] Developing true WYSIWYG editor for MediaWiki

2010-08-03 Thread Jacopo Corbetta
On Tue, Aug 3, 2010 at 00:49, Platonides platoni...@gmail.com wrote:
 The problem that makes this really hard is that MediaWiki syntax is not
 nice. So I'm a bit skeptical about that fast quality editor. You can
 find many discussions about it in the list archives, and also in wikitext-l.
 Things like providing a ribbon are a purely aesthetic choice; they
 can't really affect the result of the editing. Maybe your backend is
 powerful enough to handle this without problems. Please, prove me wrong :)

I agree, wikitext is notoriously developer-unfriendly. A survey of
currently existing ideas and extensions is at
http://www.mediawiki.org/wiki/WYSIWYG_editor

As a shameless self-promotion, I encourage you to look at
http://www.mediawiki.org/wiki/Extension:MeanEditor for the approach we
took:
1. supporting only a limited subset of wikitext
2. supporting that subset well, leaving a clean history

The rationale here is that supporting all the quirks of wikitext adds
little value both for new users (they should not be editing complex
stuff anyway!) and for advanced users (who probably already know
wikitext). I hope this idea can be useful to you.

However, the editing mode provided by browsers is a nightmare of
incompatibilities. Basically, each browser produces a different output
given identical commands, so currently MeanEditor is not completely up
to the task. An external application might be an interesting solution.



Re: [Wikitech-l] Developing true WYSIWYG editor for MediaWiki

2010-08-03 Thread Marco Schuster
On Tue, Aug 3, 2010 at 10:53 AM, Jacopo Corbetta
jacopo.corbe...@gmail.com wrote:
 However, the editing mode provided by browsers is a nightmare of
 incompatibilities. Basically, each browser produces a different output
 given identical commands, so currently MeanEditor is not completely up
 to the task. An external application might be an interesting solution.

I don't have the link ready, but Google solved this in Google Docs by
re-implementing rendering in JavaScript... they intercept mouse
movements/clicks and keyboard events and then render the
page in JavaScript.
Given the complexity of wikitext, I fear rewriting the parser in
JavaScript is the only way to get a 100% compatible wikitext editor...

Marco

-- 
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer: Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de


Re: [Wikitech-l] Developing true WYSIWYG editor for MediaWiki

2010-08-03 Thread Павел Петроченко
Hi,

 Yes, of course we are interested in it.
 Specifically, the ideal WYSIWYG MediaWiki editor would allow easy
 WYSIWYG editing for newbies, while still allowing power users to use the
 full wiki syntax, without inserting crappy markup when used, or
 reordering everything to its liking when WYSIWYG was used to make a
 little change.

Thanks for the note, it may be an important issue.

 From the screencast, it seems your technology is based on a local
 application instead of the web. That is a little inconvenience for the
 users, but an acceptable one IMHO. You could plug your app in as an
 external editor, see: http://www.mediawiki.org/wiki/Manual:External_editors

Yep, according to my understanding this is the major problem, but
unfortunately we are rich-client developers, so going to the web is only in
our future plans. (Actually we are thinking about moving to it, but waiting
for a first customer to help with the transition.)

On the other side, being a rich-client app may add some benefits for
advanced users which are still hard to do in web apps (according to my poor
desktop-developer understanding): custom groupings, a personal inbox, local
workflow/validation rules and review (just as initial examples).

 The problem that makes this really hard is that MediaWiki syntax is not
 nice. So I'm a bit skeptical about that fast quality editor. You can
 find many discussions about it in the list archives, and also in
 wikitext-l. Things like providing a ribbon are a purely aesthetic
 choice; they can't really affect the result of the editing. Maybe your
 backend is powerful enough to handle this without problems. Please,
 prove me wrong :)

Yep - we have already met some crap in dealing with it (much more complex
than the Trac wiki one), but we still hope to overcome most of the problems
in a couple of months.

 I don't have an issue with there being a closed source Windows app that
 edits wikitext well, but then there is going to be a bit of a difficult
 transition from reading to editing and back again.

Yes, this is one of pote

 And just FYI, generally our community is more interested in free and
 cross-platform software than proprietary, single platform software.

Actually we are going to be open source and cross-platform (we are Eclipse
RCP based).

 That was very interesting. Any chance the rest of us can try it for
 ourselves?

Our MediaWiki support is at a very early stage now, and we are still not
sure how much we are going to commit to it. If there is enough interest (at
least a couple of volunteer beta testers), we will start publishing builds
somewhere.

Regards,
Pavel
OnPositive Technologies.

2010/8/3 Neil Kandalgaonkar ne...@wikimedia.org

 On 8/2/10 9:29 AM, Павел Петроченко wrote:

 Hi guys,

 At the moment we are discussing an opportunity to create a full-scale
 true WYSIWYG client for MediaWiki. We currently have a technology
 which should allow us to implement it with good quality and quite fast.
 Unfortunately we are not sure
 whether there is a real need/interest for such a client in the
 MediaWiki world, or what the actual needs of MediaWiki
 users are.


 Definitely interested.

 As for what the needs of MediaWiki users are, you can check out everything
 on http://usability.wikimedia.org/ . We are just beginning to address
 usability concerns. This study might be interesting to you:

 http://usability.wikimedia.org/wiki/Usability_and_Experience_Study



  P.S. A screencast demonstrating our experimental client for the Trac wiki:
 http://www.screencast.com/t/MDkzYzM4


 That was very interesting. Any chance the rest of us can try it for
 ourselves?

 I personally like the idea of a ribbon. I think we can assume that most
 wiki editors are always going to be novice editors, so taking up tremendous
 amounts of space by default to explain things is warranted. Experts should
 be able to drop into raw wikitext, or otherwise minimize the interface.

 I don't have an issue with there being a closed source Windows app that
 edits wikitext well, but then there is going to be a bit of a difficult
 transition from reading to editing and back again.

 And just FYI, generally our community is more interested in free and
 cross-platform software than proprietary, single platform software.

 Still it looks interesting. Please let us know more.

 --
 Neil Kandalgaonkar (|  ne...@wikimedia.org


Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Domas Mituzas
Hi!

 Couldn't you just tag every internal link with
 a separate class for the length of the target article,

Great idea, how come no one ever came up with this before? I even have a stylesheet 
ready, here it is (do note: even though it looks big as text, gzip gets it down to 10%, 
so we can support this kind of granularity even up to a megabyte :)

Domas

a { color: blue }
a.1_byte_article { color: red; }
a.2_byte_article { color: red; }
a.3_byte_article { color: red; }
a.4_byte_article { color: red; }
a.5_byte_article { color: red; }
a.6_byte_article { color: red; }
a.7_byte_article { color: red; }
a.8_byte_article { color: red; }
a.9_byte_article { color: red; }
a.10_byte_article { color: red; }
a.11_byte_article { color: red; }
a.12_byte_article { color: red; }
a.13_byte_article { color: red; }
a.14_byte_article { color: red; }
a.15_byte_article { color: red; }
a.16_byte_article { color: red; }
a.17_byte_article { color: red; }
a.18_byte_article { color: red; }
a.19_byte_article { color: red; }
a.20_byte_article { color: red; }
a.21_byte_article { color: red; }
a.22_byte_article { color: red; }
a.23_byte_article { color: red; }
a.24_byte_article { color: red; }
a.25_byte_article { color: red; }
a.26_byte_article { color: red; }
a.27_byte_article { color: red; }
a.28_byte_article { color: red; }
a.29_byte_article { color: red; }
a.30_byte_article { color: red; }
a.31_byte_article { color: red; }
a.32_byte_article { color: red; }
a.33_byte_article { color: red; }
a.34_byte_article { color: red; }
a.35_byte_article { color: red; }
a.36_byte_article { color: red; }
a.37_byte_article { color: red; }
a.38_byte_article { color: red; }
a.39_byte_article { color: red; }
a.40_byte_article { color: red; }
a.41_byte_article { color: red; }
a.42_byte_article { color: red; }
a.43_byte_article { color: red; }
a.44_byte_article { color: red; }
a.45_byte_article { color: red; }
a.46_byte_article { color: red; }
a.47_byte_article { color: red; }
a.48_byte_article { color: red; }
a.49_byte_article { color: red; }
a.50_byte_article { color: red; }
a.51_byte_article { color: red; }
a.52_byte_article { color: red; }
a.53_byte_article { color: red; }
a.54_byte_article { color: red; }
a.55_byte_article { color: red; }
a.56_byte_article { color: red; }
a.57_byte_article { color: red; }
a.58_byte_article { color: red; }
a.59_byte_article { color: red; }
a.60_byte_article { color: red; }
a.61_byte_article { color: red; }
a.62_byte_article { color: red; }
a.63_byte_article { color: red; }
a.64_byte_article { color: red; }
a.65_byte_article { color: red; }
a.66_byte_article { color: red; }
a.67_byte_article { color: red; }
a.68_byte_article { color: red; }
a.69_byte_article { color: red; }
a.70_byte_article { color: red; }
a.71_byte_article { color: red; }
a.72_byte_article { color: red; }
a.73_byte_article { color: red; }
a.74_byte_article { color: red; }
a.75_byte_article { color: red; }
a.76_byte_article { color: red; }
a.77_byte_article { color: red; }
a.78_byte_article { color: red; }
a.79_byte_article { color: red; }
a.80_byte_article { color: red; }
a.81_byte_article { color: red; }
a.82_byte_article { color: red; }
a.83_byte_article { color: red; }
a.84_byte_article { color: red; }
a.85_byte_article { color: red; }
a.86_byte_article { color: red; }
a.87_byte_article { color: red; }
a.88_byte_article { color: red; }
a.89_byte_article { color: red; }
a.90_byte_article { color: red; }
a.91_byte_article { color: red; }
a.92_byte_article { color: red; }
a.93_byte_article { color: red; }
a.94_byte_article { color: red; }
a.95_byte_article { color: red; }
a.96_byte_article { color: red; }
a.97_byte_article { color: red; }
a.98_byte_article { color: red; }
a.99_byte_article { color: red; }
a.100_byte_article { color: red; }
a.101_byte_article { color: red; }
a.102_byte_article { color: red; }
a.103_byte_article { color: red; }
a.104_byte_article { color: red; }
a.105_byte_article { color: red; }
a.106_byte_article { color: red; }
a.107_byte_article { color: red; }
a.108_byte_article { color: red; }
a.109_byte_article { color: red; }
a.110_byte_article { color: red; }
a.111_byte_article { color: red; }
a.112_byte_article { color: red; }
a.113_byte_article { color: red; }
a.114_byte_article { color: red; }
a.115_byte_article { color: red; }
a.116_byte_article { color: red; }
a.117_byte_article { color: red; }
a.118_byte_article { color: red; }
a.119_byte_article { color: red; }
a.120_byte_article { color: red; }
a.121_byte_article { color: red; }
a.122_byte_article { color: red; }
a.123_byte_article { color: red; }
a.124_byte_article { color: red; }
a.125_byte_article { color: red; }
a.126_byte_article { color: red; }
a.127_byte_article { color: red; }
a.128_byte_article { color: red; }
a.129_byte_article { color: red; }
a.130_byte_article { color: red; }
a.131_byte_article { color: red; }
a.132_byte_article { color: red; }
a.133_byte_article { color: red; }
a.134_byte_article { color: red; }
a.135_byte_article { color: red; }
a.136_byte_article 

Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Liangent
On 8/3/10, Lars Aronsson l...@aronsson.se wrote:
 Couldn't you just tag every internal link with
 a separate class for the length of the target article,
 and then use different personal CSS to set the
 threshold? The generated page would be the same
 for all users:

So if a page is changed, all pages linking to it need to be parsed
again. Will this cost even more?



Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread K. Peachey
Would something like what is shown below get it even further down?

a { color: blue }
a.1_byte_article, a.2_byte_article, a.3_byte_article,
a.4_byte_article, a.5_byte_article, a.6_byte_article,
a.7_byte_article, a.8_byte_article, a.9_byte_article,
a.10_byte_article,a.11_byte_article, a.12_byte_article,
a.13_byte_article, a.14_byte_article, a.15_byte_article,
a.16_byte_article, a.17_byte_article, a.18_byte_article,
a.19_byte_article, a.20_byte_article, a.21_byte_article,
a.22_byte_article, a.23_byte_article, a.24_byte_article,
a.25_byte_article, a.26_byte_article, a.27_byte_article,
a.28_byte_article, a.29_byte_article, a.30_byte_article,
a.31_byte_article, a.32_byte_article, a.33_byte_article,
a.34_byte_article, a.35_byte_article, a.36_byte_article,
a.37_byte_article, a.38_byte_article, a.39_byte_article,
a.40_byte_article, a.41_byte_article, a.42_byte_article,
a.43_byte_article, a.44_byte_article, a.45_byte_article,
a.46_byte_article, a.47_byte_article, a.48_byte_article,
a.49_byte_article, a.50_byte_article, a.51_byte_article,
a.52_byte_article, a.53_byte_article, a.54_byte_article,
a.55_byte_article, a.56_byte_article, a.57_byte_article,
a.58_byte_article, a.59_byte_article, a.60_byte_article,
a.61_byte_article, a.62_byte_article, a.63_byte_article,
a.64_byte_article, a.65_byte_article, a.66_byte_article,
a.67_byte_article, a.68_byte_article, a.69_byte_article,
a.70_byte_article, a.71_byte_article, a.72_byte_article,
a.73_byte_article, a.74_byte_article, a.75_byte_article,
a.76_byte_article, a.77_byte_article, a.78_byte_article,
a.79_byte_article, a.80_byte_article, a.81_byte_article,
a.82_byte_article, a.83_byte_article, a.84_byte_article,
a.85_byte_article, a.86_byte_article, a.87_byte_article,
a.88_byte_article, a.89_byte_article, a.90_byte_article,
a.91_byte_article, a.92_byte_article, a.93_byte_article,
a.94_byte_article, a.95_byte_article { color: red }



Re: [Wikitech-l] changing main page articles from drop down. help required

2010-08-03 Thread Dmitriy Sintsov
* Noman nom...@gmail.com [Tue, 3 Aug 2010 12:04:31 +0500]:
 Thanks Dmitriy,
 I'm looking for the div solution, as an iframe will show scrollbars if
 the content
 is large, which is not desired.
 I was unable to find a step-by-step approach to developing an extension.

 If you have anything / an example, I'll be waiting.

Maybe Extension:HTMLets will suit your needs. Otherwise, you have to 
study the MediaWiki developers' site:

1. Perhaps one would set up a parser XML tag hook to generate the proper 
form/select/option and four corresponding divs:
http://www.mediawiki.org/wiki/Manual:Tag_extensions

From the XML tag attributes one would generate the full HTML which is required 
to select titles and to place their content into the divs.

2. Perhaps one would use the API to retrieve the pages whose titles are taken 
from option.value via JavaScript, then place these into div.innerHTML, 
again via JavaScript:
http://www.mediawiki.org/wiki/API:Expanding_templates_and_rendering
Another possibility is to use the Title and Article classes and write your own 
AJAX handler:
http://www.mediawiki.org/wiki/Manual:Ajax
http://www.mediawiki.org/wiki/Manual:Title.php
http://www.mediawiki.org/wiki/Manual:Article.php

However, that's probably reinventing the wheel.

Sorry for not being able to provide a full example - I am not a rich guy 
and am busy with projects to feed my family. Also, I am not the fastest 
coder out there.
Dmitriy



Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Platonides
Lars Aronsson wrote:
 On 08/01/2010 10:55 PM, Aryeh Gregor wrote:
 One easy hack to reduce this problem is just to only provide a few
 options for stub threshold, as we do with thumbnail size.  Although
 this is only useful if we cache pages with nonzero stub threshold . .
 . why don't we do that?  Too much fragmentation due to the excessive
 range of options?
 
 Couldn't you just tag every internal link with
 a separate class for the length of the target article,
 and then use different personal CSS to set the
 threshold? The generated page would be the same
 for all users:
 
 <a href="My_Article" class="134_byte_article">My Article</a>

That would be workable, e.g. one class for articles smaller than 50
bytes, others for 100, 200, 250, 300, 400, 500, 600, 700, 800, 1000,
2000, 2500, 5000, 10000 - if it weren't for having to update all those
classes whenever the page changes.

It would work to add it as a separate stylesheet for stubs, though.
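
The bucketing described above can be sketched as a small helper that snaps an article's byte length to the next threshold (thresholds taken from the list in this mail; the final, garbled value is assumed here to be 10000):

```javascript
// Map an article's byte length to one CSS class per threshold bucket,
// so only a handful of classes exist and a link's markup only changes
// when the target page crosses a bucket boundary.
var THRESHOLDS = [50, 100, 200, 250, 300, 400, 500, 600, 700, 800,
                  1000, 2000, 2500, 5000, 10000];

function stubClass(byteLen) {
  for (var i = 0; i < THRESHOLDS.length; i++) {
    if (byteLen < THRESHOLDS[i]) {
      return 'stub-under-' + THRESHOLDS[i];
    }
  }
  return 'not-stub';
}

console.log(stubClass(134));   // "stub-under-200"
console.log(stubClass(9999));  // "stub-under-10000"
console.log(stubClass(20000)); // "not-stub"
```

A personal stylesheet then needs one rule per bucket (e.g. `a.stub-under-500 { color: red }`) instead of one rule per exact byte count.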




Re: [Wikitech-l] changing main page articles from drop down. help required

2010-08-03 Thread Platonides
Noman wrote:
 Hi,
 I've installed MediaWiki for a wiki project.
 We now have 4 sections on the main page, like there are on the Wikipedia main page.
 
 As it's done on Wikipedia, these 4 boxes are tables and update on a date
 criterion.
 
 What I want now is to add some kind of navigation bar, like a drop-down or
 paging, so that when the user selects a page from the drop-down, all 4 boxes
 are updated with the relevant articles.
 To be clearer: how can the four parts of the table be updated, how can I add
 a combo box (already filled with page numbers / article topics), and how can
 selecting any page from the drop-down update all 4 table rows?

One way would be to have different pages. You seem to already have lists
of which four things should appear when selecting X, so that would fit.

Another approach that may suit you is the one used on:
http://ca.wikipedia.org/wiki/Portada




Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread David Gerard
On 3 August 2010 00:17, Edward Z. Yang ezy...@mit.edu wrote:


    2. Distributors roll patches without telling upstream developers who
       would happily accept them into the mainline.


Has anyone reported the following as Debian bugs?

* Package maintainer not sending patches back upstream
* Package maintainer not visible and active in MediaWiki development
* Package maintainer not visible and active in MediaWiki community
support, leaving supporting his packages to the upstream


- d.


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread David Gerard
On 3 August 2010 16:14, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

 I'm thankful that the Debian MediaWiki package at least *works*.  Not
 that the same can be said of all their packages either (OpenSSL,
 anyone?).  Maybe if we provided .debs and RPMs, people would be less
 prone to use the distro packages.


_

IT'S A TARBALL!

A TARBALL OF SOURCE CODE!

THAT YOU INTERPRET!

_


- d.


[Wikitech-l] [Testing] Selenium

2010-08-03 Thread Benedikt Kaempgen
Hello,

In order to test SMW, I would like to try out your Selenium testing framework, 
as described here [1]. 

Two things are not clear to me:

- "As of now, you have to manually add the test file to 
maintenance/tests/RunSeleniumTests.php. This will be replaced by a command-line 
argument in the future." What exactly is one supposed to do here?

- Also, in the "Architecture" section some files are mentioned that I cannot find 
in /trunk/phase3, e.g., selenium/SimpleSeleniumTest or 
selenium/LocalSeleniumSettings.php.sample. Why is that?

Regards,

Benedikt

[1] http://www.mediawiki.org/wiki/SeleniumFramework


--
Karlsruhe Institute of Technology (KIT)
Institute of Applied Informatics and Formal Description Methods (AIFB)

Benedikt Kämpgen
Research Associate

Kaiserstraße 12
Building 11.40
76131 Karlsruhe, Germany

Phone: +49 721 608-7946
Fax: +49 721 608-6580
Email: benedikt.kaemp...@kit.edu
Web: http://www.kit.edu/

KIT - University of the State of Baden-Wuerttemberg and
National Research Center of the Helmholtz Association




[Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread Mingli Yuan
Hi, folks,

For many languages which do not use Latin characters, the URL of an
article on Wikipedia can be very long.
This is the case for the Chinese Wikipedia.

So, to help people with this problem, I created a small project to
solve it, and it runs successfully on my local machine.
Although I have not deployed it to a public server yet, I decided to make the
code public first.
You can get the code at http://github.com/mountain/shortify

It uses an API call to get the pageId from the title, and then converts the
pageId to base 36 to form the short URL. It is quite simple.
To reduce the frequency of API calls, a simple cache is used. So far, only
Chinese and English are supported.
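
The base-36 step is essentially one line in JavaScript; a minimal sketch (the function names are illustrative, not Shortify's actual API):

```javascript
// Encode a numeric page ID as a base-36 slug (digits 0-9, then a-z)
// and decode it back on lookup.
function encodeId(pageId) {
  return pageId.toString(36);
}

function decodeSlug(slug) {
  return parseInt(slug, 36);
}

console.log(encodeId(30000000));   // "hv05c" — 8 digits shrink to 5 chars
console.log(decodeSlug('hv05c'));  // 30000000
```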

If you think your language needs such a tool, please help me localize
the i18n config file at
http://github.com/mountain/shortify/blob/master/config/i18n.js

Comments are welcome. If you can help set up a server, that would be nice;
please contact me separately.

Regards,

Mingli (User:Mountain)


Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread Roan Kattouw
2010/8/3 Mingli Yuan mingli.y...@gmail.com:
 It uses API call to get pageId by the title, and then convert pageId by base
 36 to the short url. It is quite simple.
 To reduce the frequency of API call, a simple cache was used.  So far, only
 Chinese and English were supported.

Why would you reduce the page ID in length with base 36? They're,
what, 10 digits?

Roan Kattouw (Catrope)



Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread j
On 08/03/2010 05:59 PM, Mingli Yuan wrote:
 It uses API call to get pageId by the title, and then convert pageId by base
 36 to the short url. It is quite simple.
 To reduce the frequency of API call, a simple cache was used.  So far, only
 Chinese and English were supported.

you should consider base32
http://www.crockford.com/wrmg/base32.html

j



Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread സാദിക്ക് ഖാലിദ് Sadik Khalid
 If you think your language need such kind a tool, please help me localize
 the i18n config file at
 http://github.com/mountain/shortify/blob/master/config/i18n.js


The URL has read-only access


On Tue, Aug 3, 2010 at 7:09 PM, j...@v2v.cc wrote:

 On 08/03/2010 05:59 PM, Mingli Yuan wrote:
  It uses API call to get pageId by the title, and then convert pageId by
 base
  36 to the short url. It is quite simple.
  To reduce the frequency of API call, a simple cache was used.  So far,
 only
  Chinese and English were supported.

 you should consider base32
 http://www.crockford.com/wrmg/base32.html

 j





-- 
With loving regards,
Sadik Khalid

Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Niklas Laxström
On 3 August 2010 18:14, Aryeh Gregor simetrical+wikil...@gmail.com wrote:
 I'm thankful that the Debian MediaWiki package at least *works*.  Not
 that the same can be said of all their packages either (OpenSSL,
 anyone?).  Maybe if we provided .debs and RPMs, people would be less
 prone to use the distro packages.

That just creates more problems:
* bad-quality distro packages
* bad-quality packages of our own (while we know MediaWiki, we are not
experts in packaging)
* lots of confusion

I don't see any way out other than reaching out to the packagers and getting
their packages fixed. What we can do is communicate this to our
users and try to communicate more with the packagers. We already do
the first in our IRC channel (telling users we can't support distro
packages and that they should just download the tarball), but there
are lots of places where we don't do that yet.

In short: education and communication, not trying to do their job.

 -Niklas

-- 
Niklas Laxström



Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Aryeh Gregor
On Tue, Aug 3, 2010 at 12:45 PM, Niklas Laxström
niklas.laxst...@gmail.com wrote:
 I don't see any other way out but to reach to the packagers and get
 their packages fixed. What we can do is to communicate this to our
 users and try to  communicate more with the packagers.

I tried that with Fedora.  You can read about it here:

https://bugzilla.redhat.com/show_bug.cgi?id=484855
https://fedorahosted.org/fesco/ticket/225

Result: nothing.


Re: [Wikitech-l] Developing true WISIWYG editor for media wiki

2010-08-03 Thread Neil Kandalgaonkar
On 8/3/10 2:18 AM, Marco Schuster wrote:

 I don't have the link ready, but Google solved this in Google Docs by
 re-implementing this in Javascript... they intercept mouse
 movements/clicks and keyboard events and then javascript-render the
 page.

http://googledocs.blogspot.com/2010/05/whats-different-about-new-google-docs.html


-- 
Neil Kandalgaonkar (   ne...@wikimedia.org



Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Happy-melon

Oldak Quill oldakqu...@gmail.com wrote in message 
news:aanlktik8sqmaetwvg8eta+ca49i08rfbrmvicsms+...@mail.gmail.com...
 On 2 August 2010 12:13, Oldak Quill oldakqu...@gmail.com wrote:
 On 28 July 2010 20:13,  jida...@jidanni.org wrote:
 Seems to me playing the role of the average dumb user, that
 en.wikipedia.org is one of the rather slow websites of the many websites
 I browse.

 No matter what browser, it takes more seconds from the time I click on a
 link to the time when the first bytes of the HTTP response start flowing
 back to me.

 Seems facebook is more zippy.

 Maybe Mediawiki is not optimized.


 For what it's worth, Alexa.com lists the average load time of the
 websites they catalogue. I'm not sure what the metrics they use are,
 and I would guess they hit the squid cache and are in the United
 States.

 Alexa.com list the following average load times as of now:

 wikipedia.org: Fast (1.016 Seconds), 74% of sites are slower.
 facebook.com: Average (1.663 Seconds), 50% of sites are slower.


 An addendum to the above message:

 According to the Alexa.com help page Average Load Times: Speed
 Statistics (http://www.alexa.com/help/viewtopic.php?f=6t=1042):
 The Average Load Time ... [is] based on load times experienced by
 Alexa users, and measured by the Alexa Toolbar, during their regular
 web browsing.

 So although US browsers might be overrepresented in this sample (I'm
 just guessing, I have no figures to support this statement), the Alexa
 sample should include many non-US browsers, assuming that the figure
 reported by Alexa.com is reflective of its userbase.

And the average Alexa toolbar user is logged in to facebook and using it to 
see what their friends were up to last night, with masses of personalised 
content; while the average Wikipedia visitor is a reader seeing the same 
page as everyone else.  We definitely have the theoretical advantage.

--HM
 





Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Lane, Ryan
 On 3 August 2010 18:14, Aryeh Gregor 
 simetrical+wikil...@gmail.com wrote:
  I'm thankful that the Debian MediaWiki package at least 
 *works*.  Not
  that the same can be said of all their packages either (OpenSSL,
  anyone?).  Maybe if we provided .debs and RPMs, people would be less
  prone to use the distro packages.
 
 That just creates more problems:
 * bad quality distro packages
 * bad quality our own packages (while we know MediaWiki, we are not
 experts in packaging)
 * lots of confusion
 

I've packaged hundreds of RPMs. It isn't difficult, and you don't need to be
an expert. It is easy enough to package the MediaWiki software. The real
problem comes with upgrades. How does the package handle this? Do we ignore
the actual maintenance/update.php portion? Do we run it? How do we handle
extensions? Package them too? Do we make a repo for all of this? How are the
extensions handled on upgrade?

Having MediaWiki in a package really doesn't make much sense, unless we put
a lot of effort into making it work this way.

 I don't see any other way out but to reach to the packagers and get
 their packages fixed. What we can do is to communicate this to our
 users and try to  communicate more with the packagers. We already do
 the first in our IRC channel (telling users we can't support distro
 packages, and that they should just download the tarball), but there
 are lots of place where we don't do that yet.
 
 In short: education and communication, not trying to do their job.
 

I think we should be doing education, but not for the package maintainers.
We should try harder to inform our users that they shouldn't use
distro-maintained packages, and we should explain why.

Respectfully,

Ryan Lane

Re: [Wikitech-l] New table: globaltemplatelinks

2010-08-03 Thread Domas Mituzas
Hi!

 Can you please read it and give your opinion?

Great job on indexing, man, I see you cover pretty much every use case!

Domas



Re: [Wikitech-l] Showing bytes added/removed in each edit in View history and User contributions

2010-08-03 Thread Aryeh Gregor
On Tue, Aug 3, 2010 at 5:09 PM, Daniel Friesen
li...@nadir-seen-fire.com wrote:
 Yup, though we might as well remember that not everyone has mb_
 functions installed.

if ( !function_exists( 'mb_strlen' ) ) {
	/**
	 * Fallback implementation of mb_strlen, hardcoded to UTF-8.
	 * @param string $str
	 * @param string $enc optional encoding; ignored
	 * @return int
	 */
	function mb_strlen( $str, $enc = '' ) {
		$counts = count_chars( $str );
		$total = 0;

		// Count ASCII bytes
		for ( $i = 0; $i < 0x80; $i++ ) {
			$total += $counts[$i];
		}

		// Count multibyte sequence heads
		for ( $i = 0xc0; $i < 0xff; $i++ ) {
			$total += $counts[$i];
		}
		return $total;
	}
}
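For comparison, the same counting trick (tally ASCII bytes plus multibyte
sequence heads, skipping UTF-8 continuation bytes) can be sketched in Python;
the helper name here is hypothetical and not part of MediaWiki:

```python
def utf8_strlen(data: bytes) -> int:
    """Count UTF-8 characters the same way as the PHP fallback above:
    count ASCII bytes (0x00-0x7F) and multibyte sequence heads (0xC0
    and up); continuation bytes (0x80-0xBF) are skipped."""
    total = 0
    for b in data:
        if b < 0x80 or b >= 0xC0:
            total += 1
    return total

# Each CJK character below is one head byte plus two continuation bytes,
# so the byte-wise tally still matches the character count.
s = "Wikipédia 維基百科"
assert utf8_strlen(s.encode("utf-8")) == len(s)
```

On well-formed UTF-8 this agrees with the true character count; on malformed
input it only approximates, exactly like the count_chars() version.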



Re: [Wikitech-l] New table: globaltemplatelinks

2010-08-03 Thread Roan Kattouw
2010/8/3 Domas Mituzas midom.li...@gmail.com:
 Hi!

 Can you please read it and give your opinion?

 Great job on indexing, man, I see you cover pretty much every use case!

I at one point meant to tell Peter to add indexes, but that slipped
through, I guess. I'll put in some indexes tomorrow.

Roan Kattouw (Catrope)



[Wikitech-l] Posting with Gmane

2010-08-03 Thread MZMcBride
It sounds so perfect: you can post your snarky comments to the mailing list
without having to really be subscribed. Just use this handy form!

It turns out that Gmane has two rigid limits in its posting form: (1) lines
can't be longer than 80 characters and (2) you can't have too much quoted
material in your reply. These aren't warnings, mind you, they're imposed
limits that prevent the mail from being submitted.

And it now appears that it also mangles subject lines containing quotation
marks.

So much for that noise. I've properly subscribed now. Apologies for the
foul-up.

MZMcBride





Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread Ariel T. Glenn
On 04-08-2010 (Wed), at 04:17 +, MZMcBride wrote:
 Aryeh Gregor wrote:
  The same could be said of practically any user-visible change.  I
  mean, maybe if we add a new special page we'll break some script that
  was screen-scraping Special:SpecialPages.  We can either freeze
  MediaWiki and never change anything for fear that we'll break
  something, or we can evaluate each potential change on the basis of
  how likely it is to break anything.  I can't see anything breaking too
  badly if rev_len is reported in characters instead of bytes -- the
  only place it's likely to be useful is in heuristics, and by their
  nature, those won't break too badly if the numbers they're based on
  change somewhat.
 
 This is problematic logic for a few reasons. I see a change to the rev_len
 logic as being similar to a change in article count logic. The same
 arguments work in both places, specifically the step problem that will
 cause nasty jumps in graphs.[1]
 
 In some cases, as you've noted, we're talking about a change by a factor
 of three. Plenty of scripts rely on hard-coded values to determine size
 thresholds for certain behaviors. While these scripts may not have the
 best implementations, I don't think it's fair to say that they're worth
 breaking.
 
 The comparison to screen-scraping seems pretty spurious as well. The
 reason it's acceptable to break screen-scraping scripts is that there's a
 functioning API alternative that is designed for bots and scripts. One of
 the design principles is consistency. Altering a metric by up to a factor
 of three (and even worse, doing so in an unpredictable manner) breaks
 this consistency needlessly.
 
 Is it worth the cost to add 300 million+ rows to easily have character
 count? I don't know. Personally, I don't mind rev_len being in bytes; it
 makes more sense from a database and technical perspective to me.
 Admittedly, though, I deal mostly with English sites.
 

I'm all for the change, but it would have to be announced well in
advance of rollout and coordinated with other folks.  For example, I
have a check against rev_len (in bytes) when writing out XML dumps, in
order to avoid rev id and rev content out of sync errors that we have
run into multiple times in the past.  That code would need to be changed
to count characters of the text being used for prefetch instead of
bytes.

Ariel



Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread MZMcBride
Ariel T. Glenn wrote:
 I'm all for the change, but it would have to be announced well in
 advance of rollout and coordinated with other folks.  For example, I
 have a check against rev_len (in bytes) when writing out XML dumps, in
 order to avoid rev id and rev content out of sync errors that we have
 run into multiple times in the past.  That code would need to be changed
 to count characters of the text being used for prefetch instead of
 bytes.

Are character counts between programming languages generally consistent? And
is there a performance concern with counting characters vs. counting bytes?
Another post in this thread suggested that it might be up to five times
slower when counting characters. I've no idea if this is accurate, but even
a small increase could have a nasty impact on dump-processing scripts (as
opposed to the negligible impact on revision table inserts).
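The factor-of-three gap between byte and character counts mentioned earlier
in the thread is easy to illustrate; a quick Python sketch (not code from
the thread):

```python
# For CJK text the UTF-8 byte count is roughly three times the
# character count, which is the discrepancy discussed in this thread.
zh = "维基百科自由的百科全书"  # 11 Chinese characters
en = "Wikipedia"                # 9 ASCII characters

print(len(zh), len(zh.encode("utf-8")))  # 11 characters, 33 bytes
print(len(en), len(en.encode("utf-8")))  # 9 characters, 9 bytes
```

Note that counting characters of an already-decoded string is cheap (O(1)
in CPython), but counting characters in a raw byte stream requires a pass
over the data, which is the performance worry for dump processing.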

MZMcBride



