Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread MZMcBride
Ariel T. Glenn wrote:
> I"m all for the change, but it would have to be announced well
> in
advance of rollout and coordinated with other folks.  For example,
> I
have a check against rev_len (in bytes) when writing out XML dumps,
> in
order to avoid rev id and rev content out of sync errors that we
> have
run into multiple times in the past.  That code would need to be
> changed
to count characters of the text being used for prefetch
> instead of
bytes.

Are character counts between programming languages generally consistent? And
is there a performance concern with counting characters vs. counting bytes?
Another post in this thread suggested that it might be up to five times
slower when counting characters. I've no idea if this is accurate, but even
a small increase could have a nasty impact on dump-processing scripts (as
opposed to the negligible impact on revision table inserts).
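
A quick micro-benchmark makes the relative cost easy to check locally; the
sketch below is illustrative only (it assumes the mbstring extension is
loaded, the sample text is arbitrary, and timings vary by PHP build):

<?php
// Byte count vs. character count on a multibyte string. strlen() is
// O(1) in PHP (the length is stored alongside the string), while
// mb_strlen() has to scan every byte of the string.
$text = str_repeat( "Ωικι wiki 例 ", 5000 );

$start = microtime( true );
for ( $i = 0; $i < 100; $i++ ) {
    $bytes = strlen( $text );
}
$byteTime = microtime( true ) - $start;

$start = microtime( true );
for ( $i = 0; $i < 100; $i++ ) {
    $chars = mb_strlen( $text, 'UTF-8' );
}
$charTime = microtime( true ) - $start;

printf( "bytes: %d in %.4fs; chars: %d in %.4fs\n",
    $bytes, $byteTime, $chars, $charTime );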

MZMcBride




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread Ariel T. Glenn
On 04-08-2010, Wed, at 04:17 +, MZMcBride wrote:
> Aryeh Gregor wrote:
> > The same could be said of practically any user-visible change.  I
> > mean, maybe if we add a new special page we'll break some script that
> > was screen-scraping Special:SpecialPages.  We can either freeze
> > MediaWiki and never change anything for fear that we'll break
> > something, or we can evaluate each potential change on the basis of
> > how likely it is to break anything.  I can't see anything breaking too
> > badly if rev_len is reported in characters instead of bytes -- the
> > only place it's likely to be useful is in heuristics, and by their
> > nature, those won't break too badly if the numbers they're based on
> > change somewhat.
> 
> This is problematic logic for a few reasons. I see a change to the rev_len
> logic as being similar to a change in article count logic. The same
> arguments work in both places, specifically the "step problem" that will
> cause nasty jumps in graphs.[1]
> 
> In some cases, as you've noted, we're talking about a change by a factor
> of three. Plenty of scripts rely on hard-coded values to determine size
> thresholds for certain behaviors. While these scripts may not have the
> best implementations, I don't think it's fair to say that they're worth
> breaking.
> 
> The comparison to screen-scraping seems pretty spurious as well. The
> reason it's acceptable to break screen-scraping scripts is that there's a
> functioning API alternative that is designed for bots and scripts. One of
> the design principles is consistency. Altering a metric by up to a factor
> of three (and even worse, doing so in an unpredictable manner) breaks
> this consistency needlessly.
> 
> Is it worth the cost to add 300 million+ rows to easily have character
> count? I don't know. Personally, I don't mind rev_len being in bytes; it
> makes more sense from a database and technical perspective to me.
> Admittedly, though, I deal mostly with English sites.
> 

I"m all for the change, but it would have to be announced well in
advance of rollout and coordinated with other folks.  For example, I
have a check against rev_len (in bytes) when writing out XML dumps, in
order to avoid rev id and rev content out of sync errors that we have
run into multiple times in the past.  That code would need to be changed
to count characters of the text being used for prefetch instead of
bytes.

Ariel


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Posting with Gmane

2010-08-03 Thread MZMcBride
It sounds so perfect: you can post your snarky comments to the mailing list
without having to really be subscribed. Just use this handy form!

It turns out that Gmane has two rigid limits in its posting form: (1) lines
can't be longer than 80 characters and (2) you can't have "too much" quoted
material in your reply. These aren't warnings, mind you, they're imposed
limits that prevent the mail from being submitted.

And it now appears that it also mangles subject lines containing quotation
marks.

So much for that noise. I've properly subscribed now. Apologies for the
foul-up.

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread MZMcBride
Aryeh Gregor wrote:
> The same could be said of practically any user-visible change.  I
> mean, maybe if we add a new special page we'll break some script that
> was screen-scraping Special:SpecialPages.  We can either freeze
> MediaWiki and never change anything for fear that we'll break
> something, or we can evaluate each potential change on the basis of
> how likely it is to break anything.  I can't see anything breaking too
> badly if rev_len is reported in characters instead of bytes -- the
> only place it's likely to be useful is in heuristics, and by their
> nature, those won't break too badly if the numbers they're based on
> change somewhat.

This is problematic logic for a few reasons. I see a change to the rev_len logic
as being similar to a change in article count logic. The same arguments work in
both places, specifically the "step problem" that will cause nasty jumps in
graphs.[1]

In some cases, as you've noted, we're talking about a change by a factor of
three. Plenty of scripts rely on hard-coded values to determine size thresholds
for certain behaviors. While these scripts may not have the best
implementations, I don't think it's fair to say that they're worth breaking.

The comparison to screen-scraping seems pretty spurious as well. The reason it's
acceptable to break screen-scraping scripts is that there's a functioning API
alternative that is designed for bots and scripts. One of the design principles
is consistency. Altering a metric by up to a factor of three (and even worse,
doing so in an unpredictable manner) breaks this consistency needlessly.

Is it worth the cost to add 300 million+ rows to easily have character count? I
don't know. Personally, I don't mind rev_len being in bytes; it makes more sense
from a database and technical perspective to me. Admittedly, though, I deal
mostly with English sites.

MZMcBride

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=11868#c8


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread MZMcBride
Mingli Yuan wrote:
> Thanks for your response. How do admin actions mess up page IDs?
> Could you give me some examples, so I can try to find a way to avoid it?

If an admin deletes and then undeletes a page, the old page ID is not retained.

MZMcBride


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread Liangent
On 8/4/10, Mingli Yuan  wrote:
> Thanks for your response. How do admin actions mess up page IDs?
> Could you give me some examples, so I can try to find a way to avoid it?

Deletions and restorations. They're often used to deal with mixed
history (e.g. the results of a cut-and-paste move).

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Jax
The 2005 releases were the best; I wouldn't use newer ones unless I had a
good reason to do so.



On Mon, Aug 2, 2010 at 4:16 PM, Lane, Ryan
 wrote:
>> I haven't read all the documents, but have these researchers taken
>> into account backported fixes?
>>
>> My gut feeling is that the "preference" for 1.12 is simply due to its
>> inclusion in Debian stable [1]. The maintainer seems to be actively
>> backporting security fixes [2], so while I agree that these versions
>> may enjoy less community support, they should not be considered broken
>> on the basis of the version number alone.
>>
>> This, of course, unless it is certain that some vulnerabilities are
>> still present in the Debian version. If you are aware of the existence
>> of such a problem, I would recommend you contact
>> . Otherwise, the situation might not be as
>> dangerous as it seems.
>>
>> On the topic of facilitating upgrades: perhaps we should emphasize the
>> option to install and upgrade using SVN, which is probably very
>> convenient for users that are comfortable with the command line.
>> Moodle has this in the official documentation and I find it very
>> useful [3]. SVN could also be handy as the backend for a user-friendly
>> upgrade procedure, as it already deals with local modifications and
>> such.
>>
>
> As someone who has had their code patched by the debian team, I'd like to
> take the time to bitch about this.
>
> Firstly, their patches are often incorrect. Secondly, though they've patched
> my LDAP extension a number of times, I have *never* received a bug report or
> a patch from them for something they've fixed. It is extremely annoying to
> see a fix has been around that I could have used months before someone
> reports a problem to me. Beyond anything else this bothers me the most. They
> really need to be better community members in regards to this. Lastly,
> packaging and maintaining such an old version of MediaWiki does a disservice
> to us, and their users. We don't support versions of MediaWiki that old. I
> understand that Debian backports security fixes for MediaWiki, but they
> don't backport new features, and don't backport all bug fixes. Additionally,
> Debian doesn't backport security fixes for all extensions. Not all extension
> developers bother maintaining backwards compatibility, and the only possible
> way to get security fixes is to upgrade MediaWiki and the extension.
>
> Please Debian, keep your version of MediaWiki up to date at least to the
> oldest stable release, and please send your fixes upstream when you find
> unfixed bugs.
>
> Respectfully,
>
> Ryan Lane
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread K. Peachey
On Wed, Aug 4, 2010 at 11:03 AM, Rob Lanphier  wrote:
> On Tue, Aug 3, 2010 at 1:38 PM, Lane, Ryan
>  wrote:
>> I think we should be doing education, but not for the package maintainers.
>> We should try harder to inform our users that they shouldn't use
>> distro-maintained packages, and we should explain why.
>
> I'm not sure I buy this.  Why is MediaWiki so special that it can't
> exist inside of a package?  Is MediaWiki such a special piece of
> software that it's impossible to build a good package?
It can; we just want a working package, and until the providers provide
one, there will be recommendations against using the packages. Someone
who, for example, installs a package that has broken skins right from
the get-go is going to get a bad impression of MediaWiki. It won't be
until they go digging that they find out they need to manually set an
alias for their webserver, and it won't be until someone mentions
otherwise that they learn it's actually the package's fault it's broken.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Rob Lanphier
On Tue, Aug 3, 2010 at 1:38 PM, Lane, Ryan
 wrote:
> I think we should be doing education, but not for the package maintainers.
> We should try harder to inform our users that they shouldn't use
> distro-maintained packages, and we should explain why.

I'm not sure I buy this.  Why is MediaWiki so special that it can't
exist inside of a package?  Is MediaWiki such a special piece of
software that it's impossible to build a good package?

I think user education is going to be even more futile than package
maintainer education.  The allure of running a system like Debian or
Fedora is the ability to have pre-vetted software running in a
configuration designed to work as part of a system.  I'm not here to
start a debate about whether they are successful in achieving that,
but it's clearly a popular enough notion that an education effort to
counter that probably won't have much of an impact with anyone beyond
the Slackware community.

+1 for package maintainer education (as frustrating and unproductive
as it might be thus far)

Rob

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread Mingli Yuan
> Why would you reduce the page ID in length with base 36? They're, what,
> 10 digits?
> Roan Kattouw (Catrope)

Hi, Roan. Base 36 encoding/decoding is very cheap in JavaScript; just call

* parseInt(string, radix)
* number.toString(radix)

where radix ranges from 2 to 36.
Combining 0-9 and a-z gives us 36 characters.

> you should consider base32
> j

base 36 is more compact than base 32.
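
For illustration, the same round trip in PHP (a sketch only; the project
itself uses the JavaScript calls above, and 146308 is a made-up page ID):

<?php
// Shorten a page ID with base 36 and decode it again on lookup.
// base_convert() handles bases 2 through 36, the same radix range that
// JavaScript's parseInt()/toString() accept.
$pageId = 146308;                                   // hypothetical page ID
$short  = base_convert( (string)$pageId, 10, 36 );  // "34w4"
$back   = (int)base_convert( $short, 36, 10 );      // 146308 again
echo "$pageId -> $short -> $back\n";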

> URL has Read-Only access
> Sadik Khalid

Hi, Sadik,

I use github to manage the versions of my code.
You can send me the code directly via mail, and I will merge it into the codebase.
Thanks.

> Some admin actions mess up page IDs (and I do some from time to time),
> so your idea that a page ID identifies a title is buggy.
> Liangent

Hi, Liangent,

Thanks for your response. How do admin actions mess up page IDs?
Could you give me some examples, so I can try to find a way to avoid it?


On Tue, Aug 3, 2010 at 11:59 PM, Mingli Yuan  wrote:

> Hi, folks,
>
> For many languages that do not use Latin characters, the URL of an
> article on Wikipedia can be very long.
> This is the case for the Chinese Wikipedia.
>
> So in order to help people with this problem, I created a small project to
> solve it, and it runs successfully on my local machine.
> Although I have not deployed it to a public server yet, I decided to make
> the code public first.
> You can get the code at http://github.com/mountain/shortify
>
> It uses an API call to get the pageId for a title, and then converts the
> pageId to a short URL using base 36. It is quite simple.
> To reduce the frequency of API calls, a simple cache is used.  So far, only
> Chinese and English are supported.
>
> If you think your language needs such a tool, please help me localize
> the i18n config file at
> http://github.com/mountain/shortify/blob/master/config/i18n.js
>
> Comments are welcome. If you can help set up a server, that would be nice;
> please contact me separately.
>
> Regards,
>
> Mingli (User:Mountain)
>
>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread soxred93
(just remember that it's 1.5 to 5 times slower, like I said earlier.  
Whether or not that's an issue will have to be decided by higher powers)

On Aug 3, 2010, at 5:54 PM, Aryeh Gregor wrote:

> On Tue, Aug 3, 2010 at 5:09 PM, Daniel Friesen
>  wrote:
>> Yup, though we might as well remember that not everyone has mb_
>> functions installed.
>
> if ( !function_exists( 'mb_strlen' ) ) {
> /**
>  * Fallback implementation of mb_strlen, hardcoded to UTF-8.
>  * @param string $str
>  * @param string $enc optional encoding; ignored
>  * @return int
>  */
> function mb_strlen( $str, $enc="" ) {
> $counts = count_chars( $str );
> $total = 0;
>
> // Count ASCII bytes
> for( $i = 0; $i < 0x80; $i++ ) {
> $total += $counts[$i];
> }
>
> // Count multibyte sequence heads
> for( $i = 0xc0; $i < 0xff; $i++ ) {
> $total += $counts[$i];
> }
> return $total;
> }
> }
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] New table: globaltemplatelinks

2010-08-03 Thread Roan Kattouw
2010/8/4 Platonides :
> You seem to assume that when a template changes it is enough to update
> page_touched.
> You also need to purge the squids
Yes, Squid purges need to be done as well. Good catch. Fortunately,
the interwiki table can tell us exactly what each page's URL is.

> and create the needed jobs to rerender
> those pages
This is not needed: pages are automatically rerendered when first
viewed after page_touched is updated.

> (and the pages that include those, you will probably store
> the final inclusions on all superpages, so there shouldn't be issues there).
>
Because globaltemplatelinks would be transitive, like templatelinks,
that would not be an issue, correct.

> We are getting more and more global uses. We should have a global
> namespace mapping so that each global table doesn't need to copy the
> namespace names just for display.
>
Maybe, but that's not easy. Storing the namespace name sounds like an
acceptable intermediate solution.
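
For the local-wiki case, both steps map onto existing core calls; a rough
sketch only (not Peter's actual patch -- the cross-wiki case would build
the URLs from the interwiki table instead of a local Title):

<?php
// After a template edit, invalidate one page that transcludes it.
$title = Title::makeTitle( NS_MAIN, 'Some_transcluding_page' );  // hypothetical page
$title->invalidateCache();   // bumps page_touched; the page is reparsed on next view
SquidUpdate::purge( array( $title->getInternalURL() ) );  // drop the cached copy from Squid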

Roan Kattouw (Catrope)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] New table: globaltemplatelinks

2010-08-03 Thread Platonides
Pierre-Yves Guerder wrote:
> Hi everybody!
> 
> I have been working on interwiki transclusion for several weeks on my
> branch and got some very interesting results (most of the stuff is
> working now!).
> 
> What I currently need is to create a globaltemplatelinks table that will
> allow a wiki to invalidate the cache of the pages that transclude a
> distant template when that template is modified.
> 
> I have written my proposal for the structure and behavior of this
> system here: [1].
> 
> Can you please read it and give your opinion?
> 
> Thanks in advance
> 
> Best regards

You seem to assume that when a template changes it is enough to update
page_touched.
You also need to purge the squids and create the needed jobs to rerender
those pages (and the pages that include those, you will probably store
the final inclusions on all superpages, so there shouldn't be issues there).

We are getting more and more global uses. We should have a global
namespace mapping so that each global table doesn't need to copy the
namespace names just for display.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Platonides
Lane, Ryan wrote:
>> On 3 August 2010 18:14, Aryeh Gregor 
>>  wrote:
>>> I'm thankful that the Debian MediaWiki package at least 
>> *works*.  Not
>>> that the same can be said of all their packages either (OpenSSL,
>>> anyone?).  Maybe if we provided .debs and RPMs, people would be less
>>> prone to use the distro packages.
>>
>> That just creates more problems:
>> * bad quality distro packages
>> * bad quality our own packages (while we know MediaWiki, we are not
>> experts in packaging)
>> * lots of confusion
>>
> 
> I've packaged hundreds of RPMs. It isn't difficult, and you don't need to be
> an expert. It is easy enough to package the MediaWiki software. The real
> problem comes with upgrades. How does the package handle this? Do we ignore
> the actual maintenance/update.php portion? Do we run it? How do we handle
> extensions? Package them too? Do we make a repo for all of this? How are the
> extensions handled on upgrade?

I had some plans for adding the needed hooks to the installer so that it
could be run by a package manager in a more FHS-compliant way, store
LocalSettings inside /etc, automatically run upgrade.php, etc.

Then, since the new installer changed how LocalSettings is provided to
the user, I haven't thought further about how to adapt it.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] New table: globaltemplatelinks

2010-08-03 Thread Roan Kattouw
2010/8/3 Domas Mituzas :
> Hi!
>
>> Can you please read it and give your opinion?
>
> Great job on indexing, man, I see you cover pretty much every use case!
>
I at one point meant to tell Peter to add indexes, but that slipped
through, I guess. I'll put in some indexes tomorrow.

Roan Kattouw (Catrope)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread Aryeh Gregor
On Tue, Aug 3, 2010 at 5:09 PM, Daniel Friesen
 wrote:
> Yup, though we might as well remember that not everyone has mb_
> functions installed.

if ( !function_exists( 'mb_strlen' ) ) {
/**
 * Fallback implementation of mb_strlen, hardcoded to UTF-8.
 * @param string $str
 * @param string $enc optional encoding; ignored
 * @return int
 */
function mb_strlen( $str, $enc="" ) {
$counts = count_chars( $str );
$total = 0;

// Count ASCII bytes
for( $i = 0; $i < 0x80; $i++ ) {
$total += $counts[$i];
}

// Count multibyte sequence heads
for( $i = 0xc0; $i < 0xff; $i++ ) {
$total += $counts[$i];
}
return $total;
}
}
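
A quick check of what the fallback counts, assuming it is only defined
where the mbstring extension is absent (the native function returns the
same value):

<?php
// Every character contributes exactly one counted byte: an ASCII byte
// (0x00-0x7F) or a multibyte lead byte (0xC0-0xFE). UTF-8 continuation
// bytes (0x80-0xBF) fall outside both loops, so the total is the
// character count.
echo mb_strlen( "café 例" );  // 6 characters (strlen() would report 9 bytes)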

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] New table: globaltemplatelinks

2010-08-03 Thread Domas Mituzas
Hi!

> Can you please read it and give your opinion?

Great job on indexing, man, I see you cover pretty much every use case!

Domas

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread Daniel Friesen
Aryeh Gregor wrote:
> On Tue, Aug 3, 2010 at 10:59 AM, soxred93  wrote:
>   
>> Just butting in here, if I recall correctly, both the PHP-native
>> mb_strlen() and the MediaWiki fallback mb_strlen() functions are
>> considerably slower (1.5 to 5 times as slow).
>> 
>
> They only have to be run once, when the revision is saved.  It's not
> likely to be a noticeable cost.
>   
Yup, though we might as well remember that not everyone has mb_ 
functions installed.
MediaWiki is intended to be functional both with and without mb_
functions. That's another point towards storing both and falling back to 
bytes when the char field isn't populated.

-- 
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] New table: globaltemplatelinks

2010-08-03 Thread Pierre-Yves Guerder
Hi everybody!

I have been working on interwiki transclusion for several weeks on my
branch and got some very interesting results (most of the stuff is
working now!).

What I currently need is to create a globaltemplatelinks table that will
allow a wiki to invalidate the cache of the pages that transclude a
distant template when that template is modified.

I have written my proposal for the structure and behavior of this
system here: [1].

Can you please read it and give your opinion?

Thanks in advance

Best regards

--
Peter Potrowl
http://www.mediawiki.org/wiki/User:Peter17

[1] 
http://www.mediawiki.org/wiki/User:Peter17/Reasonably_efficient_interwiki_transclusion#Create_a_globaltemplatelinks_table

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Testing] Selenium

2010-08-03 Thread Markus Glaser
Hi Benedikt,

the framework was reworked several times over the last few weeks, so I am afraid the 
documentation is slightly out of date. I will update it in the next few days. As 
of now, you have to add your test classes to the autoloader, then adapt 
these settings and put them in your LocalSettings.php:

$wgEnableSelenium = true;
$wgGroupPermissions['sysop']['selenium'] = true;
$wgSeleniumTestSuites = array(
'SimpleSeleniumTestSuite',
);
// use no protocol here
$wgSeleniumTestsSeleniumHost = 'localhost';
// use of protocol is mandatory! also, selenium requests a trailing slash
$wgSeleniumTestsWikiUrl = 'http://localhost/phase3/';
// port value was stripped in the archive; 4444 is the standard Selenium RC default
$wgSeleniumServerPort = 4444;
$wgSeleniumTestsWikiUser  = 'WikiSysop';
$wgSeleniumTestsWikiPassword  = 'password';
$wgSeleniumTestsBrowsers = array(
'firefox' => '*chrome d:\\Firefox35\\firefox.exe',
'iexplorer' => '*iexploreproxy',
'opera' => '*chrome /usr/bin/opera',
);
$wgSeleniumTestsUseBrowser = 'firefox';

You can find a sample test in the maintenance/tests/selenium folder, which 
consists of a test case and a test suite. It's the test suite you have to add 
to the autoloader. For the sample test, this has already been done in the trunk.

Cheers,
Markus


-----Original Message-----
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On behalf of Benedikt Kaempgen
Sent: Tuesday, 3 August 2010 17:38
To: Wikimedia developers
Subject: [Wikitech-l] [Testing] Selenium

Hello,

In order to test SMW, I would like to try out your Selenium testing framework, 
as described here [1]. 

Two things are not that clear to me:

- "As of now, you have to manually add the test file to 
maintenance/tests/RunSeleniumTests.php. This will be replaced by a command line 
argument in the future." What exactly is one supposed to do here?

- Also, in section "Architecture" some files are mentioned, that I cannot find 
in /trunk/phase3, e.g., selenium/SimpleSeleniumTest oder 
selenium/LocalSeleniumSettings.php.sample. Why is this not the case?

Regards,

Benedikt

[1] http://www.mediawiki.org/wiki/SeleniumFramework


--
Karlsruhe Institute of Technology (KIT)
Institute of Applied Informatics and Formal Description Methods (AIFB)

Benedikt Kämpgen
Research Associate

Kaiserstraße 12
Building 11.40
76131 Karlsruhe, Germany

Phone: +49 721 608-7946
Fax: +49 721 608-6580
Email: benedikt.kaemp...@kit.edu
Web: http://www.kit.edu/

KIT - University of the State of Baden-Wuerttemberg and National Research 
Center of the Helmholtz Association


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Lane, Ryan
> On 3 August 2010 18:14, Aryeh Gregor 
>  wrote:
> > I'm thankful that the Debian MediaWiki package at least 
> *works*.  Not
> > that the same can be said of all their packages either (OpenSSL,
> > anyone?).  Maybe if we provided .debs and RPMs, people would be less
> > prone to use the distro packages.
> 
> That just creates more problems:
> * bad quality distro packages
> * bad quality our own packages (while we know MediaWiki, we are not
> experts in packaging)
> * lots of confusion
> 

I've packaged hundreds of RPMs. It isn't difficult, and you don't need to be
an expert. It is easy enough to package the MediaWiki software. The real
problem comes with upgrades. How does the package handle this? Do we ignore
the actual maintenance/update.php portion? Do we run it? How do we handle
extensions? Package them too? Do we make a repo for all of this? How are the
extensions handled on upgrade?

Having MediaWiki in a package really doesn't make much sense, unless we put
a lot of effort into making it work this way.

> I don't see any other way out but to reach to the packagers and get
> their packages fixed. What we can do is to communicate this to our
> users and try to  communicate more with the packagers. We already do
> the first in our IRC channel (telling users we can't support distro
> packages, and that they should just download the tarball), but there
> are lots of place where we don't do that yet.
> 
> In short: education and communication, not trying to do their job.
> 

I think we should be doing education, but not for the package maintainers.
We should try harder to inform our users that they shouldn't use
distro-maintained packages, and we should explain why.

Respectfully,

Ryan Lane
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] InputBox and getElementsByName

2010-08-03 Thread Strainu
Hi,
I've noticed that Extension:InputBox generates HTML with only the
"name" attribute, but no id. Is this a feature or a bug?

I'm asking because I'm trying to get an "onsubmit" hook for a form
generated by this extension. I tried doing something like:

if (document.getElementsByName('createbox').length)
  {
var i=0;
for(i=0; i < document.getElementsByName('createbox').length; i++){
  document.getElementsByName('createbox')[i].onsubmit=function() {
var 
ta=document.getElementsByName('createbox')[i].getElementsByName('name')[0];
if(!ta){
  return true;
}
//do stuff
  }
}
  }

(actually the code is a little cleaner, but this is a more concise form)

I'm getting a "document.getElementsByName('createbox')[i] is undefined
error", even if firebug reports the getElementsByName function returns
an array with 1 element. Could some JS-wizard help me out on that?

Thanks,
  Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Happy-melon

"Oldak Quill"  wrote in message 
news:aanlktik8sqmaetwvg8eta+ca49i08rfbrmvicsms+...@mail.gmail.com...
> On 2 August 2010 12:13, Oldak Quill  wrote:
>> On 28 July 2010 20:13,   wrote:
>>> Seems to me playing the role of the average dumb user, that
>>> en.wikipedia.org is one of the rather slow websites of the many websites
>>> I browse.
>>>
>>> No matter what browser, it takes more seconds from the time I click on a
>>> link to the time when the first bytes of the HTTP response start flowing
>>> back to me.
>>>
>>> Seems facebook is more zippy.
>>>
>>> Maybe Mediawiki is not "optimized".
>>
>>
>> For what it's worth, Alexa.com lists the average load time of the
>> websites they catalogue. I'm not sure what the metrics they use are,
>> and I would guess they hit the squid cache and are in the United
>> States.
>>
>> Alexa.com list the following average load times as of now:
>>
>> wikipedia.org: Fast (1.016 Seconds), 74% of sites are slower.
>> facebook.com: Average (1.663 Seconds), 50% of sites are slower.
>
>
> An addendum to the above message:
>
> According to the Alexa.com help page "Average Load Times: Speed
> Statistics" (http://www.alexa.com/help/viewtopic.php?f=6&t=1042):
> "The Average Load Time ... [is] based on load times experienced by
> Alexa users, and measured by the Alexa Toolbar, during their regular
> web browsing."
>
> So although US browsers might be overrepresented in this sample (I'm
> just guessing, I have no figures to support this statement), the Alexa
> sample should include many non-US browsers, assuming that the figure
> reported by Alexa.com is reflective of its userbase.
>
And the average Alexa toolbar user on facebook is logged in and using it to 
see what their friends were up to last night, with masses of personalised 
content; while the average Alexa toolbar user on Wikipedia is a reader seeing 
the same page as everyone else.  We definitely have the theoretical advantage.

--HM
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Wikitech-l Digest, Vol 85, Issue 11

2010-08-03 Thread Noman
Thanks a lot, Dmitriy!
I think it's enough to get me started. Will email if stuck somewhere.

Noman

On Tue, Aug 3, 2010 at 7:59 PM, wrote:

> Send Wikitech-l mailing list submissions to
>wikitech-l@lists.wikimedia.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
>https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> or, via email, send a message with subject or body 'help' to
>wikitech-l-requ...@lists.wikimedia.org
>
> You can reach the person managing the list at
>wikitech-l-ow...@lists.wikimedia.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wikitech-l digest..."
>
>
> Today's Topics:
>
>   1. Re: wikipedia is one of the slower sites on the web (Liangent)
>   2. Re: wikipedia is one of the slower sites on the web (K. Peachey)
>   3. Re: chanfing main page articles from drop down. help  required
>  (Dmitriy Sintsov)
>   4. Re: wikipedia is one of the slower sites on the web
>  (John Vandenberg)
>   5. Re: wikipedia is one of the slower sites on the web (Platonides)
>   6. Re: chanfing main page articles from drop down. help  required
>  (Platonides)
>   7. Re: Debian packages (was MediaWiki version statistics)
>  (David Gerard)
>   8. Re: Showing bytes added/removed in each edit in "View
>  history" and "User contributions" (Aryeh Gregor)
>   9. Re: Showing bytes added/removed in each edit in "View
>  history" and "User contributions" (soxred93)
>
>
> --
>
> Message: 1
> Date: Tue, 3 Aug 2010 18:54:38 +0800
> From: Liangent 
> Subject: Re: [Wikitech-l] wikipedia is one of the slower sites on the
>web
> To: Wikimedia developers 
> Message-ID:
>
> 
> >
> Content-Type: text/plain; charset=UTF-8
>
> On 8/3/10, Lars Aronsson  wrote:
> > Couldn't you just tag every internal link with
> > a separate class for the length of the target article,
> > and then use different personal CSS to set the
> > threshold? The generated page would be the same
> > for all users:
>
> So if a page is changed, all pages linking to it need to be parsed
> again. Will this cost even more?
>
>
>
> --
>
> Message: 2
> Date: Tue, 3 Aug 2010 20:55:23 +1000
> From: "K. Peachey" 
> Subject: Re: [Wikitech-l] wikipedia is one of the slower sites on the
>web
> To: Wikimedia developers 
> Message-ID:
>
> Content-Type: text/plain; charset=UTF-8
>
> Would something like what is shown below get it even further down?
>
> a { color: blue }
> a.1_byte_article, a.2_byte_article, a.3_byte_article,
> a.4_byte_article, a.5_byte_article, a.6_byte_article,
> a.7_byte_article, a.8_byte_article, a.9_byte_article,
> a.10_byte_article,a.11_byte_article, a.12_byte_article,
> a.13_byte_article, a.14_byte_article, a.15_byte_article,
> a.16_byte_article, a.17_byte_article, a.18_byte_article,
> a.19_byte_article, a.20_byte_article, a.21_byte_article,
> a.22_byte_article, a.23_byte_article, a.24_byte_article,
> a.25_byte_article, a.26_byte_article, a.27_byte_article,
> a.28_byte_article, a.29_byte_article, a.30_byte_article,
> a.31_byte_article, a.32_byte_article, a.33_byte_article,
> a.34_byte_article, a.35_byte_article, a.36_byte_article,
> a.37_byte_article, a.38_byte_article, a.39_byte_article,
> a.40_byte_article, a.41_byte_article, a.42_byte_article,
> a.43_byte_article, a.44_byte_article, a.45_byte_article,
> a.46_byte_article, a.47_byte_article, a.48_byte_article,
> a.49_byte_article, a.50_byte_article, a.51_byte_article,
> a.52_byte_article, a.53_byte_article, a.54_byte_article,
> a.55_byte_article, a.56_byte_article, a.57_byte_article,
> a.58_byte_article, a.59_byte_article, a.60_byte_article,
> a.61_byte_article, a.62_byte_article, a.63_byte_article,
> a.64_byte_article, a.65_byte_article, a.66_byte_article,
> a.67_byte_article, a.68_byte_article, a.69_byte_article,
> a.70_byte_article, a.71_byte_article, a.72_byte_article,
> a.73_byte_article, a.74_byte_article, a.75_byte_article,
> a.76_byte_article, a.77_byte_article, a.78_byte_article,
> a.79_byte_article, a.80_byte_article, a.81_byte_article,
> a.82_byte_article, a.83_byte_article, a.84_byte_article,
> a.85_byte_article, a.86_byte_article, a.87_byte_article,
> a.88_byte_article, a.89_byte_article, a.90_byte_article,
> a.91_byte_article, a.92_byte_article, a.93_byte_article,
> a.94_byte_article, a.95_byte_article { color: red }
>
>
>
> --
>
> Message: 3
> Date: Tue, 03 Aug 2010 15:09:24 +0400
> From: Dmitriy Sintsov 
> Subject: Re: [Wikitech-l] chanfing main page articles from drop down.
>helprequired
> To: Wikimedia developers 
> Message-ID:
><353001217.1280833764.142604888.40...@mcgi-wr-7.rambler.ru>
> Content-Type: text/plain; charset="us-ascii"; format="flowed"
>
> * Noman  [Tue, 3 Aug 2010 12:04:31 +0500]:
> > Thanks Dmitriy,
> > i'm looking for div solution. as iframe will give scrolli

Re: [Wikitech-l] Developing true WISIWYG editor for media wiki

2010-08-03 Thread Neil Kandalgaonkar
On 8/3/10 2:18 AM, Marco Schuster wrote:

> I don't have the link ready, but Google solved this in Google Docs by
> re-implementing this in Javascript... they intercept mouse
> movements/clicks and keyboard events and then javascript-render the
> page.

http://googledocs.blogspot.com/2010/05/whats-different-about-new-google-docs.html


-- 
Neil Kandalgaonkar (   

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Aryeh Gregor
On Tue, Aug 3, 2010 at 12:45 PM, Niklas Laxström
 wrote:
> I don't see any other way out but to reach to the packagers and get
> their packages fixed. What we can do is to communicate this to our
> users and try to  communicate more with the packagers.

I tried that with Fedora.  You can read about it here:

https://bugzilla.redhat.com/show_bug.cgi?id=484855
https://fedorahosted.org/fesco/ticket/225

Result: nothing.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Niklas Laxström
On 3 August 2010 18:14, Aryeh Gregor  wrote:
> I'm thankful that the Debian MediaWiki package at least *works*.  Not
> that the same can be said of all their packages either (OpenSSL,
> anyone?).  Maybe if we provided .debs and RPMs, people would be less
> prone to use the distro packages.

That just creates more problems:
* bad quality distro packages
* bad quality our own packages (while we know MediaWiki, we are not
experts in packaging)
* lots of confusion

I don't see any other way out but to reach to the packagers and get
their packages fixed. What we can do is to communicate this to our
users and try to  communicate more with the packagers. We already do
the first in our IRC channel (telling users we can't support distro
packages, and that they should just download the tarball), but there
are lots of place where we don't do that yet.

In short: education and communication, not trying to do their job.

 -Niklas

-- 
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread Liangent
Some admin actions mess up page IDs (and I do some from time to time),
so your idea that a page ID identifies a title is buggy.

On 8/4/10, സാദിക്ക് ഖാലിദ് Sadik Khalid  wrote:
>> If you think your language needs such a tool, please help me localize
>> the i18n config file at
>> http://github.com/mountain/shortify/blob/master/config/i18n.js
>>
>
> URL has Read-Only access
>
>
> On Tue, Aug 3, 2010 at 7:09 PM,  wrote:
>
>> On 08/03/2010 05:59 PM, Mingli Yuan wrote:
>> > It uses an API call to get the pageId for a title, and then converts the
>> > pageId to a short URL using base 36. It is quite simple.
>> > To reduce the frequency of API calls, a simple cache is used.  So far,
>> > only Chinese and English are supported.
>>
>> you should consider base32
>> http://www.crockford.com/wrmg/base32.html
>>
>> j
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
>
>
> --
> With warm regards,
> Sadik Khalid
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread സാദിക്ക് ഖാലിദ് Sadik Khalid
> If you think your language needs such a tool, please help me localize
> the i18n config file at
> http://github.com/mountain/shortify/blob/master/config/i18n.js
>

URL has Read-Only access


On Tue, Aug 3, 2010 at 7:09 PM,  wrote:

> On 08/03/2010 05:59 PM, Mingli Yuan wrote:
> > It uses an API call to get the pageId for a title, and then converts the
> > pageId to a short URL using base 36. It is quite simple.
> > To reduce the frequency of API calls, a simple cache is used.  So far,
> > only Chinese and English are supported.
>
> you should consider base32
> http://www.crockford.com/wrmg/base32.html
>
> j
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
With warm regards,
Sadik Khalid
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread j
On 08/03/2010 05:59 PM, Mingli Yuan wrote:
> It uses an API call to get the pageId for a title, and then converts the
> pageId to a short URL using base 36. It is quite simple.
> To reduce the frequency of API calls, a simple cache is used.  So far, only
> Chinese and English are supported.

you should consider base32
http://www.crockford.com/wrmg/base32.html

j

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread Roan Kattouw
2010/8/3 Mingli Yuan :
> It uses an API call to get the pageId for a title, and then converts the
> pageId to a short URL using base 36. It is quite simple.
> To reduce the frequency of API calls, a simple cache is used.  So far, only
> Chinese and English are supported.
>
>
Why would you reduce the page ID in length with base 36? They're,
what, 10 digits?

Roan Kattouw (Catrope)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread Mingli Yuan
Hi, folks,

For many languages that do not use Latin characters, the URL of an
article on Wikipedia can be very long.
This is the case for the Chinese Wikipedia.

So in order to help people with this problem, I created a small project to
solve it, and it runs successfully on my local machine.
Although I have not deployed it to a public server yet, I decided to make
the code public first.
You can get the code at http://github.com/mountain/shortify

It uses an API call to get the pageId for a title, and then converts the
pageId to a short URL using base 36. It is quite simple.
To reduce the frequency of API calls, a simple cache is used.  So far, only
Chinese and English are supported.

If you think your language needs such a tool, please help me localize
the i18n config file at
http://github.com/mountain/shortify/blob/master/config/i18n.js

Comments are welcome. If you can help set up a server, that would be nice;
please contact me separately.

Regards,

Mingli (User:Mountain)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] [Testing] Selenium

2010-08-03 Thread Benedikt Kaempgen
Hello,

In order to test SMW, I would like to try out your Selenium testing framework, 
as described here [1]. 

Two things are not that clear to me:

- "As of now, you have to manually add the test file to 
maintenance/tests/RunSeleniumTests.php. This will be replaced by a command line 
argument in the future." What exactly is one supposed to do here?

- Also, in section "Architecture" some files are mentioned, that I cannot find 
in /trunk/phase3, e.g., selenium/SimpleSeleniumTest oder 
selenium/LocalSeleniumSettings.php.sample. Why is this not the case?

Regards,

Benedikt

[1] http://www.mediawiki.org/wiki/SeleniumFramework


--
Karlsruhe Institute of Technology (KIT)
Institute of Applied Informatics and Formal Description Methods (AIFB)

Benedikt Kämpgen
Research Associate

Kaiserstraße 12
Building 11.40
76131 Karlsruhe, Germany

Phone: +49 721 608-7946
Fax: +49 721 608-6580
Email: benedikt.kaemp...@kit.edu
Web: http://www.kit.edu/

KIT - University of the State of Baden-Wuerttemberg and
National Research Center of the Helmholtz Association


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread David Gerard
On 3 August 2010 16:14, Aryeh Gregor  wrote:

> I'm thankful that the Debian MediaWiki package at least *works*.  Not
> that the same can be said of all their packages either (OpenSSL,
> anyone?).  Maybe if we provided .debs and RPMs, people would be less
> prone to use the distro packages.


>_<

IT'S A TARBALL!

A TARBALL OF SOURCE CODE!

THAT YOU INTERPRET!

>_<


- d.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread Aryeh Gregor
On Tue, Aug 3, 2010 at 10:59 AM, soxred93  wrote:
> Just butting in here, if I recall correctly, both the PHP-native
> mb_strlen() and the MediaWiki fallback mb_strlen() functions are
> considerably slower (1.5 to 5 times as slow).

They only have to be run once, when the revision is saved.  It's not
likely to be a noticeable cost.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread Aryeh Gregor
On Mon, Aug 2, 2010 at 7:06 PM, Carl (CBM)  wrote:
> I am not a Debian developer, and I agree that sending fixes upstream
> is good. But surely you're aware that the whole point of "Debian
> stable" is that it does ***not*** change to newer versions of programs
> after release, apart from security fixes?

Which means it doesn't get all security fixes either, because nobody
announces vulnerabilities or publishes patches for unsupported
MediaWiki versions.  If a bug occurred only in an old version, it
won't be announced.  Distributions that try to pretend they can
support software for years past the time the vendor stopped supporting
it are probably crazy, but then, they're no more crazy than the users
who ask for that behavior, and I don't think we're likely to change
them.

From #wikimedia-tech a couple years ago:

080511 15:35:42  mark, why Ubuntu?
080511 15:37:03  becuase that's what we use for all new servers? :)
080511 15:39:18  mark, well, yes.  What made you decide on Ubuntu?
080511 15:39:28  it's debian but with predictable release cycles

As for not upstreaming patches, probably the best bet there is for us
to give up and just watch the major distro bug trackers ourselves,
because I doubt we're going to get the distributors ever reporting
anything to us consistently.

On Mon, Aug 2, 2010 at 7:17 PM, Edward Z. Yang  wrote:
> However, upstream developers are often guilty of ignoring a distribution's
> needs, so it goes both ways.

I spoke with the Fedora maintainer of MediaWiki some time ago pretty
extensively about his hacks to MediaWiki, particularly the way he
moved all files around without understanding what he was doing and
completely broke the software.  (Reportedly to the point that styles
and scripts didn't work because he moved them out of the web root.
Really.  The Fedora wiki didn't use the Fedora MediaWiki package
because it was so broken.)  I suggested in some detail a better way to
fix things, and offered to review any patches he wanted to submit
upstream.  He never submitted any.  Oh well.

I'm thankful that the Debian MediaWiki package at least *works*.  Not
that the same can be said of all their packages either (OpenSSL,
anyone?).  Maybe if we provided .debs and RPMs, people would be less
prone to use the distro packages.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Aryeh Gregor
On Mon, Aug 2, 2010 at 8:32 PM, Lars Aronsson  wrote:
> Couldn't you just tag every internal link with
> a separate class for the length of the target article,
> and then use different personal CSS to set the
> threshold? The generated page would be the same
> for all users:
>
> My Article

Until the page changes length.  That would force all articles that
link to it to be reparsed, unless we use some way to insert the
correct page lengths into a parsed page before serving it to the user.
 In which case we don't really need to do this, we can just insert the
stub class on the correct pages using the same mechanism.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread soxred93
Just butting in here, if I recall correctly, both the PHP-native  
mb_strlen() and the MediaWiki fallback mb_strlen() functions are  
considerably slower (1.5 to 5 times as slow). Unless there's another  
way to count characters for multibyte UTF strings, this would not be  
a feasible idea.

-X!

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread Aryeh Gregor
On Tue, Aug 3, 2010 at 1:14 AM, Liangent  wrote:
> Byte count is used. For example in Chinese Wikipedia, one of the
> criteria of "Did you know" articles is ">= 3000 bytes".

I mean, is byte count used for anything where character count couldn't
be used just about as well?  Like is there some code that uses rev_len
to figure out whether an article can fit into a field limited to X
bytes, or whatever?  (That's probably unsafe anyway.)

On Tue, Aug 3, 2010 at 3:48 AM, Robert Ullmann  wrote:
> The revision size (and page size, meaning that of last revision) in
> bytes, is available in the API. If you change the definition there is
> no telling what you will break.

The same could be said of practically any user-visible change.  I
mean, maybe if we add a new special page we'll break some script that
was screen-scraping Special:SpecialPages.  We can either freeze
MediaWiki and never change anything for fear that we'll break
something, or we can evaluate each potential change on the basis of
how likely it is to break anything.  I can't see anything breaking too
badly if rev_len is reported in characters instead of bytes -- the
only place it's likely to be useful is in heuristics, and by their
nature, those won't break too badly if the numbers they're based on
change somewhat.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Debian packages (was MediaWiki version statistics)

2010-08-03 Thread David Gerard
On 3 August 2010 00:17, Edward Z. Yang  wrote:


>    2. Distributors roll patches without telling upstream developers who
>       would happily accept them into the mainline.


Has anyone reported the following as Debian bugs?

* Package maintainer not sending patches back upstream
* Package maintainer not visible and active in MediaWiki development
* Package maintainer not visible and active in MediaWiki community
support, leaving supporting his packages to the upstream


- d.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] changing main page articles from drop down. help required

2010-08-03 Thread Platonides
Noman wrote:
> Hi,
> I've installed MediaWiki for a wiki project.
> We now have 4 sections on the main page, like the ones on the Wikipedia
> main page. As on Wikipedia, these 4 boxes are tables and are updated on
> a date criterion.
> 
> What I want now is to add some kind of navigation, like a drop-down or
> paging, so that when the user selects a page from the drop-down, all 4
> sections are updated with the relevant articles. To be clearer: how can
> the four parts of the table be updated, and how can a combo box (already
> filled with page numbers / article topics) be added so that when the
> user selects any page from it, all 4 table rows are updated?

One way would be to have different pages. You seem to already have lists
of which four things should appear when selecting X, so that would fit.

Another approach that may suit you is the one used at:
http://ca.wikipedia.org/wiki/Portada


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Platonides
Lars Aronsson wrote:
> On 08/01/2010 10:55 PM, Aryeh Gregor wrote:
>> One easy hack to reduce this problem is just to only provide a few
>> options for stub threshold, as we do with thumbnail size.  Although
>> this is only useful if we cache pages with nonzero stub threshold . .
>> . why don't we do that?  Too much fragmentation due to the excessive
>> range of options?
> 
> Couldn't you just tag every internal link with
> a separate class for the length of the target article,
> and then use different personal CSS to set the
> threshold? The generated page would be the same
> for all users:
> 
> My Article

That would be workable, e.g. one class for articles smaller than 50
bytes, another for 100, 200, 250, 300, 400, 500, 600, 700, 800, 1000,
2000, 2500, 5000, 10000... if it weren't for having to update all those
classes whenever the page changes.

It would work to add it as a separate stylesheet for stubs, though.
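A sketch of the bucketing in PHP (the bucket list and class prefix are
illustrative, not anything MediaWiki defines):

<?php
// Hypothetical helper: map a target article's byte length to the
// smallest threshold bucket it fits under, so the parser can emit
// a handful of classes instead of one class per exact byte count.
function stubClassFor( $byteLength ) {
    $buckets = array( 50, 100, 200, 250, 300, 400, 500, 600, 700,
                      800, 1000, 2000, 2500, 5000, 10000 );
    foreach ( $buckets as $b ) {
        if ( $byteLength < $b ) {
            return "stub-$b";
        }
    }
    return ''; // long enough that no stub threshold applies
}

echo stubClassFor( 130 ), "\n";  // stub-200
echo stubClassFor( 9000 ), "\n"; // stub-10000

A user's personal stylesheet would then only need rules for the buckets
at or below their chosen threshold, e.g. a.stub-50, a.stub-100 { color: #763; }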


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread John Vandenberg
On Tue, Aug 3, 2010 at 8:55 PM, K. Peachey  wrote:
> Would something like what is shown below get it even further down?
>
> a { color: blue }
> a.1_byte_article, a.2_byte_article, a.3_byte_article,
> ...

Using an abbreviation like "ba" would also help.

Limiting the user pref to intervals of 10 bytes would also help.

Also, as this piece of CSS is being dynamically generated, it only
needs to include the variations that occur in the body of the
article's HTML.

Or the CSS can be generated by JS on the client side, which is what
Aryeh has been suggesting all along (I think).
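A sketch of the server-side variant of that, just to illustrate (the
abbreviated class scheme ba10, ba20, ... at 10-byte intervals is
invented for the example):

<?php
// Emit a stylesheet containing only the stub classes that actually
// occur in this page's parsed HTML, capped at the user's threshold.
function stubStylesheet( $html, $thresholdBytes ) {
    preg_match_all( '/\bba(\d+)\b/', $html, $m );
    $selectors = array();
    foreach ( array_unique( $m[1] ) as $bucket ) {
        if ( (int)$bucket <= $thresholdBytes ) {
            $selectors[] = "a.ba$bucket";
        }
    }
    return $selectors ? implode( ',', $selectors ) . '{color:red}' : '';
}

$html = '<a class="ba20" href="/wiki/A">A</a> <a class="ba500" href="/wiki/B">B</a>';
echo stubStylesheet( $html, 100 ), "\n"; // a.ba20{color:red}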

btw, I thought Domas was kidding.  I got a chuckle out of it, at least.

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] changing main page articles from drop down. help required

2010-08-03 Thread Dmitriy Sintsov
* Noman  [Tue, 3 Aug 2010 12:04:31 +0500]:
> Thanks Dmitriy,
> I'm looking for a div solution, as an iframe will give scrolling if the
> content is large, which is not required.
> I was unable to find a step-by-step approach to developing an extension.
>
> If you have anything / an example, I'll be waiting.
>
Maybe Extension:HTMLets will suit your needs. Otherwise, you will have to 
study the MediaWiki developers' site:

1. Perhaps one would set up a parser XML tag hook to generate the proper 
form/select/option markup and the four corresponding divs:
http://www.mediawiki.org/wiki/Manual:Tag_extensions

From the tag's attributes one would generate the full HTML required to 
select the titles and to place their content into the divs.

2. Perhaps one would use the API to retrieve the pages whose titles are 
taken from option.value via JavaScript, then place them into 
div.innerHTML, again via JavaScript:
http://www.mediawiki.org/wiki/API:Expanding_templates_and_rendering
Another possibility is to use the Title and Article classes and write 
your own AJAX handler:
http://www.mediawiki.org/wiki/Manual:Ajax
http://www.mediawiki.org/wiki/Manual:Title.php
http://www.mediawiki.org/wiki/Manual:Article.php

However, that's probably reinventing the wheel.

Sorry for not being able to provide a full example - I am not a rich guy 
and I'm busy with projects to feed my family. Also, I am not the fastest 
coder out there.
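That said, a bare, untested skeleton of the tag-hook route from point 1
(the tag name, the "pages" attribute and the loadSection() client-side
function are all invented for illustration):

<?php
// Hypothetical extension skeleton following Manual:Tag_extensions:
// register a <pagetabs> tag whose "pages" attribute names the pages,
// and emit the <select> plus four placeholder divs. loadSection()
// is an invented JS function that would fill the divs via the API.
$wgHooks['ParserFirstCallInit'][] = 'wfPageTabsInit';

function wfPageTabsInit( $parser ) {
    $parser->setHook( 'pagetabs', 'wfPageTabsRender' );
    return true;
}

function wfPageTabsRender( $input, array $args, $parser, $frame ) {
    $titles = isset( $args['pages'] ) ? explode( '|', $args['pages'] ) : array();
    $html = '<select onchange="loadSection(this.value)">';
    foreach ( $titles as $t ) {
        $t = htmlspecialchars( $t );
        $html .= "<option value=\"$t\">$t</option>";
    }
    $html .= '</select>';
    for ( $i = 0; $i < 4; $i++ ) {
        $html .= "<div id=\"pagetab-$i\"></div>";
    }
    return $html;
}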
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread K. Peachey
Would something like what is shown below get it even further down?

a { color: blue }
a.1_byte_article, a.2_byte_article, a.3_byte_article,
a.4_byte_article, a.5_byte_article, a.6_byte_article,
a.7_byte_article, a.8_byte_article, a.9_byte_article,
a.10_byte_article,a.11_byte_article, a.12_byte_article,
a.13_byte_article, a.14_byte_article, a.15_byte_article,
a.16_byte_article, a.17_byte_article, a.18_byte_article,
a.19_byte_article, a.20_byte_article, a.21_byte_article,
a.22_byte_article, a.23_byte_article, a.24_byte_article,
a.25_byte_article, a.26_byte_article, a.27_byte_article,
a.28_byte_article, a.29_byte_article, a.30_byte_article,
a.31_byte_article, a.32_byte_article, a.33_byte_article,
a.34_byte_article, a.35_byte_article, a.36_byte_article,
a.37_byte_article, a.38_byte_article, a.39_byte_article,
a.40_byte_article, a.41_byte_article, a.42_byte_article,
a.43_byte_article, a.44_byte_article, a.45_byte_article,
a.46_byte_article, a.47_byte_article, a.48_byte_article,
a.49_byte_article, a.50_byte_article, a.51_byte_article,
a.52_byte_article, a.53_byte_article, a.54_byte_article,
a.55_byte_article, a.56_byte_article, a.57_byte_article,
a.58_byte_article, a.59_byte_article, a.60_byte_article,
a.61_byte_article, a.62_byte_article, a.63_byte_article,
a.64_byte_article, a.65_byte_article, a.66_byte_article,
a.67_byte_article, a.68_byte_article, a.69_byte_article,
a.70_byte_article, a.71_byte_article, a.72_byte_article,
a.73_byte_article, a.74_byte_article, a.75_byte_article,
a.76_byte_article, a.77_byte_article, a.78_byte_article,
a.79_byte_article, a.80_byte_article, a.81_byte_article,
a.82_byte_article, a.83_byte_article, a.84_byte_article,
a.85_byte_article, a.86_byte_article, a.87_byte_article,
a.88_byte_article, a.89_byte_article, a.90_byte_article,
a.91_byte_article, a.92_byte_article, a.93_byte_article,
a.94_byte_article, a.95_byte_article { color: red }

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Liangent
On 8/3/10, Lars Aronsson  wrote:
> Couldn't you just tag every internal link with
> a separate class for the length of the target article,
> and then use different personal CSS to set the
> threshold? The generated page would be the same
> for all users:

So if a page is changed, all pages linking to it need to be parsed
again. Will this cost even more?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-03 Thread Domas Mituzas
Hi!

> Couldn't you just tag every internal link with
> a separate class for the length of the target article,

Great idea! How come no one ever came up with this? I even have a stylesheet 
ready; here it is (do note, even if it looks big in text, gzip gets it down to 10%, 
so we can support this kind of granularity even up to a megabyte :)

Domas

a { color: blue }
a.1_byte_article { color: red; }
a.2_byte_article { color: red; }
a.3_byte_article { color: red; }
a.4_byte_article { color: red; }
a.5_byte_article { color: red; }
a.6_byte_article { color: red; }
a.7_byte_article { color: red; }
a.8_byte_article { color: red; }
a.9_byte_article { color: red; }
a.10_byte_article { color: red; }
a.11_byte_article { color: red; }
a.12_byte_article { color: red; }
a.13_byte_article { color: red; }
a.14_byte_article { color: red; }
a.15_byte_article { color: red; }
a.16_byte_article { color: red; }
a.17_byte_article { color: red; }
a.18_byte_article { color: red; }
a.19_byte_article { color: red; }
a.20_byte_article { color: red; }
a.21_byte_article { color: red; }
a.22_byte_article { color: red; }
a.23_byte_article { color: red; }
a.24_byte_article { color: red; }
a.25_byte_article { color: red; }
a.26_byte_article { color: red; }
a.27_byte_article { color: red; }
a.28_byte_article { color: red; }
a.29_byte_article { color: red; }
a.30_byte_article { color: red; }
a.31_byte_article { color: red; }
a.32_byte_article { color: red; }
a.33_byte_article { color: red; }
a.34_byte_article { color: red; }
a.35_byte_article { color: red; }
a.36_byte_article { color: red; }
a.37_byte_article { color: red; }
a.38_byte_article { color: red; }
a.39_byte_article { color: red; }
a.40_byte_article { color: red; }
a.41_byte_article { color: red; }
a.42_byte_article { color: red; }
a.43_byte_article { color: red; }
a.44_byte_article { color: red; }
a.45_byte_article { color: red; }
a.46_byte_article { color: red; }
a.47_byte_article { color: red; }
a.48_byte_article { color: red; }
a.49_byte_article { color: red; }
a.50_byte_article { color: red; }
a.51_byte_article { color: red; }
a.52_byte_article { color: red; }
a.53_byte_article { color: red; }
a.54_byte_article { color: red; }
a.55_byte_article { color: red; }
a.56_byte_article { color: red; }
a.57_byte_article { color: red; }
a.58_byte_article { color: red; }
a.59_byte_article { color: red; }
a.60_byte_article { color: red; }
a.61_byte_article { color: red; }
a.62_byte_article { color: red; }
a.63_byte_article { color: red; }
a.64_byte_article { color: red; }
a.65_byte_article { color: red; }
a.66_byte_article { color: red; }
a.67_byte_article { color: red; }
a.68_byte_article { color: red; }
a.69_byte_article { color: red; }
a.70_byte_article { color: red; }
a.71_byte_article { color: red; }
a.72_byte_article { color: red; }
a.73_byte_article { color: red; }
a.74_byte_article { color: red; }
a.75_byte_article { color: red; }
a.76_byte_article { color: red; }
a.77_byte_article { color: red; }
a.78_byte_article { color: red; }
a.79_byte_article { color: red; }
a.80_byte_article { color: red; }
a.81_byte_article { color: red; }
a.82_byte_article { color: red; }
a.83_byte_article { color: red; }
a.84_byte_article { color: red; }
a.85_byte_article { color: red; }
a.86_byte_article { color: red; }
a.87_byte_article { color: red; }
a.88_byte_article { color: red; }
a.89_byte_article { color: red; }
a.90_byte_article { color: red; }
a.91_byte_article { color: red; }
a.92_byte_article { color: red; }
a.93_byte_article { color: red; }
a.94_byte_article { color: red; }
a.95_byte_article { color: red; }
a.96_byte_article { color: red; }
a.97_byte_article { color: red; }
a.98_byte_article { color: red; }
a.99_byte_article { color: red; }
a.100_byte_article { color: red; }
a.101_byte_article { color: red; }
a.102_byte_article { color: red; }
a.103_byte_article { color: red; }
a.104_byte_article { color: red; }
a.105_byte_article { color: red; }
a.106_byte_article { color: red; }
a.107_byte_article { color: red; }
a.108_byte_article { color: red; }
a.109_byte_article { color: red; }
a.110_byte_article { color: red; }
a.111_byte_article { color: red; }
a.112_byte_article { color: red; }
a.113_byte_article { color: red; }
a.114_byte_article { color: red; }
a.115_byte_article { color: red; }
a.116_byte_article { color: red; }
a.117_byte_article { color: red; }
a.118_byte_article { color: red; }
a.119_byte_article { color: red; }
a.120_byte_article { color: red; }
a.121_byte_article { color: red; }
a.122_byte_article { color: red; }
a.123_byte_article { color: red; }
a.124_byte_article { color: red; }
a.125_byte_article { color: red; }
a.126_byte_article { color: red; }
a.127_byte_article { color: red; }
a.128_byte_article { color: red; }
a.129_byte_article { color: red; }
a.130_byte_article { color: red; }
a.131_byte_article { color: red; }
a.132_byte_article { color: red; }
a.133_byte_article { color: red; }
a.134_byte_article { color: red; }
a.135_byte_article { color: red; }
a.136_byte_article

Re: [Wikitech-l] Developing a true WYSIWYG editor for MediaWiki

2010-08-03 Thread Павел Петроченко
Hi,

>Yes, of course we are interested in it.
>Specifically, the ideal WISIWYG MediaWiki editor would allow easy
>WISIWYG editing to newbies, while still allowing to use the full
>wikisyntax to power users, without inserting crappy markup when using
>it, or reordering everything to its liking when WISIWYG was used to do a
>little change.
Thanks for the note, it may be an important issue.

>From the screencast, it seems your technology is based on a local
>application instead of the web. That is a little inconvenience for the
>users, but an acceptable one IMHO. You could plug your app in as an
>external editor, see: http://www.mediawiki.org/wiki/Manual:External_editors

Yep, according to my understanding this is a major problem, but
unfortunately we are rich-client developers, so going web is only in our
future plans. (Actually we are thinking about moving to it, but waiting
for a first customer to help with the transition.)

On the other side, being a rich-client app may add some benefits for
advanced users which are still hard to achieve in web apps (according to
my poor desktop-developer understanding): custom groupings, a personal
inbox, local workflow/validation rules, and review (just as initial
examples).

>The problem that makes this really hard is that MediaWiki syntax is not
>nice. So I'm a bit skeptical about that fast quality editor. You can
>find in the list archives many discussions about it, and also in
wikitext-l.
>Things like providing a ribbon is a completely esthetical choice, it
>can't really help on the result of its editing. Maybe your backend is
>powerful enough to handle this without problems. Please, show me wrong :)

Yep - we have already met some cruft in dealing with it (much more
complex than the Trac wiki one), but we still hope to overcome most of
the problems in a couple of months.

> I don't have an issue with there being a closed source Windows app that
> edits wikitext well, but then there is going to be a bit of a difficult
> transition from reading to editing and back again.
Yes, this is one of the potential problems.

> And just FYI, generally our community is more interested in free and
> cross-platform software than proprietary, single platform software.
Actually, we are going to be open source and cross-platform (we are
Eclipse RCP based).

> That was very interesting. Any chance the rest of us can try it for
> ourselves?

Our MediaWiki support is at a very early stage now, and we are still not
sure how committed to it we are going to be.
If there is enough interest (at least a couple of volunteer beta
testers), we will start publishing builds somewhere.

Regards,
Pavel
OnPositive Technologies.

2010/8/3 Neil Kandalgaonkar 

> On 8/2/10 9:29 AM, Павел Петроченко wrote:
>
>> Hi guys,
>>
>> At the moment we are discussing an opportunity to create full scale
>> true WYSIWYG client for media wiki. To the moment we have a technology
>> which should allow us to implement with a good quality and quite fast.
>> Unfortunately we are not sure
>> if there is a real need/interest for having such kind of client at the
>> media wiki world, as well as what are actual needs of media wiki
>> users.
>>
>
> Definitely interested.
>
> As for what the needs of MediaWiki users are, you can check out everything
> on http://usability.wikimedia.org/ . We are just beginning to address
> usability concerns. This study might be interesting to you:
>
> http://usability.wikimedia.org/wiki/Usability_and_Experience_Study
>
>
>
>  P.S. Screen cast demonstrating our experimental client for Trac wiki
>> http://www.screencast.com/t/MDkzYzM4
>>
>
> That was very interesting. Any chance the rest of us can try it for
> ourselves?
>
> I personally like the idea of a ribbon. I think we can assume that most
> wiki editors are always going to be novice editors, so taking up tremendous
> amounts of space by default to explain things is warranted. Experts should
> be able to drop into raw wikitext, or otherwise minimize the interface.
>
> I don't have an issue with there being a closed source Windows app that
> edits wikitext well, but then there is going to be a bit of a difficult
> transition from reading to editing and back again.
>
> And just FYI, generally our community is more interested in free and
> cross-platform software than proprietary, single platform software.
>
> Still it looks interesting. Please let us know more.
>
> --
> Neil Kandalgaonkar (|  
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Developing a true WYSIWYG editor for MediaWiki

2010-08-03 Thread Marco Schuster
On Tue, Aug 3, 2010 at 10:53 AM, Jacopo Corbetta
 wrote:
> However, the "editing mode" provided by browsers is a nightmare of
> incompatibilities. Basically, each browser produces a different output
> given identical commands, so currently MeanEditor is not completely up
> to the task. An external application might be an interesting solution.

I don't have the link ready, but Google solved this in Google Docs by
re-implementing the editing surface in JavaScript... they intercept
mouse movements/clicks and keyboard events and then render the page
with JavaScript.
Given the complexity of wikitext, I fear rewriting the parser in
JavaScript is the only way to get a 100% compatible wikitext editor...

Marco

-- 
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer: Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikitech-l Digest, Vol 85, Issue 9

2010-08-03 Thread Daniel Kinzler
Noman wrote:
> Thanks Dmitriy,
> I'm looking for a div solution, as an iframe will give scrolling if the
> content is large, which is not required.
> I was unable to find a step-by-step approach to developing an extension.
> 
> If you have anything / an example, I'll be waiting.

Perhaps have a look at 

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Developing a true WYSIWYG editor for MediaWiki

2010-08-03 Thread Jacopo Corbetta
On Tue, Aug 3, 2010 at 00:49, Platonides  wrote:
> The problem that makes this really hard is that MediaWiki syntax is not
> nice. So I'm a bit skeptical about that fast quality editor. You can
> find in the list archives many discussions about it, and also in wikitext-l.
> Things like providing a ribbon is a completely esthetical choice, it
> can't really help on the result of its editing. Maybe your backend is
> powerful enough to handle this without problems. Please, show me wrong :)

I agree, wikitext is notoriously developer-unfriendly. A survey of
currently existing ideas and extensions is on
http://www.mediawiki.org/wiki/WYSIWYG_editor

As a shameless self-promotion, I encourage you to look at
http://www.mediawiki.org/wiki/Extension:MeanEditor for the approach we
took:
1. supporting only a limited subset of wikitext
2. supporting that subset well, leaving a clean history

The rationale here is that supporting all quirks of wikitext adds
little value both for new users (they should not be editing complex
stuff anyways!) and for advanced users (who probably already know
wikitext). I hope this idea can be useful to you.
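A toy version of that gate, just to illustrate the idea (the pattern
list here is invented; MeanEditor's real checks live in the extension
itself):

<?php
// Sketch: decide whether a page sticks to a "simple" wikitext subset
// before offering the visual editor; fall back to the plain textarea
// when constructs outside the subset appear.
function usesOnlySimpleWikitext( $wikitext ) {
    $complex = array(
        '/\{\{/',        // templates
        '/\{\|/',        // tables
        '/<ref[\s>]/i',  // references
        '/__[A-Z]+__/',  // behavior switches
    );
    foreach ( $complex as $pattern ) {
        if ( preg_match( $pattern, $wikitext ) ) {
            return false;
        }
    }
    return true;
}

var_dump( usesOnlySimpleWikitext( "'''Bold''' text and [[links]]" ) ); // true
var_dump( usesOnlySimpleWikitext( "{{Infobox}} text" ) );              // false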

However, the "editing mode" provided by browsers is a nightmare of
incompatibilities. Basically, each browser produces a different output
given identical commands, so currently MeanEditor is not completely up
to the task. An external application might be an interesting solution.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread Robert Ullmann
Ahem.

The revision size (and page size, meaning that of last revision) in
bytes, is available in the API. If you change the definition there is
no telling what you will break. Essentially you can't.

A character count would have to be another field.
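For reference, a minimal sketch of how a client reads that field
(assuming PHP with allow_url_fopen; error handling and a proper
User-Agent are omitted):

<?php
// Sketch: fetch rev_len (in bytes) for a page via the public API.
// rvprop=size is the existing, documented property; a character
// count would indeed have to be a new property alongside it.
$url = 'http://en.wikipedia.org/w/api.php?action=query'
     . '&prop=revisions&rvprop=size&format=json'
     . '&titles=' . urlencode( 'PHP' );
$data = json_decode( file_get_contents( $url ), true );
foreach ( $data['query']['pages'] as $page ) {
    echo $page['title'], ': ', $page['revisions'][0]['size'], " bytes\n";
}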

best,
Robert

On Tue, Aug 3, 2010 at 9:53 AM, ChrisiPK  wrote:
> This is a policy requirement, not a technical requirement, and can surely be
> adjusted.
>
> Am 03.08.2010 07:14, schrieb Liangent:
>> On 8/3/10, Aryeh Gregor  wrote:
>>> No, we'd just have to repurpose rev_len to mean "characters" instead
>>> of "bytes", and update all the old rows.  We don't actually need the
>>> byte count for anything, do we?
>>
>> Byte count is used. For example in Chinese Wikipedia, one of the
>> criteria of "Did you know" articles is ">= 3000 bytes".
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions"

2010-08-03 Thread Liangent
On 8/3/10, ChrisiPK  wrote:
> This is a policy requirement, not a technical requirement, and can surely be
> adjusted.

It seems 1 zh char = 3 bytes gives a reasonably fair weighting among
characters. Obviously, zh chars carry more of the content (when counting
the amount of content) than en chars, which on zh.wp are usually wiki
syntax...
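The factor of three is simply how UTF-8 encodes CJK; a quick check,
assuming the source file itself is UTF-8:

<?php
// CJK characters occupy 3 bytes each in UTF-8, so a byte count
// weights a Chinese character three times as heavily as ASCII.
$zh = '维基百科'; // 4 characters
$en = 'Wiki';     // 4 characters
echo strlen( $zh ), "\n";             // 12 bytes
echo strlen( $en ), "\n";             // 4 bytes
echo mb_strlen( $zh, 'UTF-8' ), "\n"; // 4 characters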

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Wikitech-l Digest, Vol 85, Issue 9

2010-08-03 Thread Noman
Thanks Dmitriy,
I'm looking for a div solution, as an iframe will give scrolling if the
content is large, which is not required.
I was unable to find a step-by-step approach to developing an extension.

If you have anything / an example, I'll be waiting.

Noman

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l