Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread MZMcBride
Ashar Voultoiz wrote:
> On 06/12/10 05:56, MZMcBride wrote:
>> I filed bug 26259, "MediaWiki bloated with test suites":
>> https://bugzilla.wikimedia.org/show_bug.cgi?id=26259
>> 
>> RobLa asked me to send an e-mail to wikitech-l about this bug. It's my view
>> that checking out MediaWiki from SVN should not include files that most
>> users do not need or want. These test suites seem to fit perfectly within
>> this category.
> 
> I have at least two objections:
> 
>   - I want to be able to commit a bug fix + associated tests in one
> revision without having to fetch the whole phase3 tree.  It is much
> easier to track, and phase3 has a lot of stuff I do not care about.

I don't commit to Wikimedia's SVN repo, so I won't comment on this.

>   - Users fetching from SVN are most probably interested in development
> and will *require* the test suites.  Regular users use the tarballs, which
> can be built without the test suite, though.

I don't think it's reasonable to say that most users downloading from SVN
are interested in development, though some certainly are. SVN is convenient
for installations and upgrades. That's certainly why I use it. Even if we
could say that most users downloading from SVN are interested in
development, I don't see how that extends to saying that test suites are
required for development. You've lost me.

The maintenance directory has maintenance tools in it. I don't think test
suites fall within the category of "maintenance." Like large artwork files,
these seem to be "development" tools that shouldn't be downloaded with every
phase3 checkout.

> Can't you just ignore the test suites?

Sure, and I do. But I don't imagine that this "tests" directory will get any
smaller with time. I don't really want to see MediaWiki get more bloated
over time with files that very few people will want or need. If that applies
to other parts of phase3 as well, those ought to be examined and considered
in separate bugs and threads (the "math" directory is a prime target, for
example).

> I do not care about squid support or utf-normalisation. Should they get
> moved out of trunk as well?

Perhaps, but that's an issue for a different bug and a different thread. :-)

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Petr Kadlec
On 6 December 2010 08:47, Ashar Voultoiz  wrote:
>  - I want to be able to commit a bug fix + associated tests in one
> revision without having to fetch the whole phase3 tree.

You don’t have to. Subversion supports (since 1.5?) sparse checkouts,
allowing you to have only selected parts of a directory checked out in
your working copy, which is a feature suitable exactly for scenarios
like this one.

-- [[cs:User:Mormegil | Petr Kadlec]]

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Niklas Laxström
On 6 December 2010 08:11, Q  wrote:
> I think better time would be spent decoupling all the languages.  Out of
> the 57 megs for an svn export, 41 is the languages directory. Distribute
> the Big $foo, where $foo is some reasonable number of major languages,
> and offer the rest as a separate download.

This suggestion seems to come up from time to time. I feel it is
unrealistic. First of all, we can't remove them from svn, since they
have to be there. We could remove them from the tarballs, but please,
last time I checked the tarball was hardly over 12 megs. Even with a
very slow modem it should take an hour at most to download that. Using
a better compression algorithm would likely shrink it as much as
removing a few languages. The minor languages don't even take as much
space as the major languages, which usually have more complete
localisation.

Drawing the line is not easy and would likely cause continuous,
unnecessary contention, put some languages in a privileged position,
and hurt MediaWiki's top-notch i18n and l10n support. Each language is
special, but you don't see that if you just look at the number of
speakers. Do we really want to hurt one of our greatest advantages?

Besides, it feels silly to talk about this while we simultaneously
talk about including some of the most common extensions in the name of
providing a feature-complete MediaWiki straight from the box--which is a
goal I agree with.

 -Niklas

-- 
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread MZMcBride
Niklas Laxström wrote:
> Besides, it feels silly to talk about this, while we simultaneously
> talk about including some of the most common extensions in the name of
> providing feature complete MediaWiki straight from the box--which is a
> goal I agree with.

I wouldn't say it's silly. If a very large percentage of MediaWiki
installations will end up installing ParserFunctions, but a very small
percentage of MediaWiki installations will end up using an obscure message
file, there's a reasonable debate there about which to include. A somewhat
similar metric is sometimes used when determining whether a feature should
go into core or into an extension.

Speaking of extensions, there's now a bug about bundling ParserFunctions
with MediaWiki: https://bugzilla.wikimedia.org/show_bug.cgi?id=26261  Tim
has asked developers who support (or oppose, I guess) this idea to comment
on that bug.

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Niklas Laxström
On 6 December 2010 12:08, Petr Kadlec  wrote:
> On 6 December 2010 08:47, Ashar Voultoiz  wrote:
>>  - I want to be able to commit a bug fix + associated tests in one
>> revision without having to fetch the whole phase3 tree.
>
> You don’t have to. Subversion supports (since 1.5?) sparse checkouts,
> allowing you to have only selected parts of a directory checked out in
> your working copy, which is a feature suitable exactly for scenarios
> like this one.

Except it is still too inconvenient to use for most cases (yeah I was
going to suggest it too):
"Subversion 1.5's implementation of shallow checkouts is good but does
not support a couple of interesting behaviors. First, you cannot
de-telescope a working copy item. Running svn update --set-depth empty
in an infinite-depth working copy will not have the effect of
discarding everything but the topmost directory—it will simply error
out. Second, there is no depth value to indicate that you wish an item
to be explicitly excluded. You have to do implicit exclusion of an
item by including everything else."

  -Niklas

-- 
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Code review and making it to 1.17

2010-12-06 Thread Roan Kattouw
2010/12/6 Tim Starling :
> Presumably you mean that we would create REL1_17 now, as a copy from
> trunk, then run make-wmf-branch in January, which would copy from
> REL1_17 to 1.17wmf1, to be deployed shortly after. Then we would tag
> for release of 1.17 beta 1 some time after that.
>
That's the correct approach indeed, thanks for correcting me or I
would've done it wrong.

Roan Kattouw (Catrope)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Markus Glaser
Hi all,

concerning the Selenium part of the tests, I suggest treating functional tests 
and regression tests differently:
* The framework and test runner should remain in maintenance/tests, since they 
provide a testing infrastructure for MW.
* We are working on a MediaWiki smoke test which makes sure the wiki is 
basically up and running. I think that kind of test makes perfect sense 
within phase3, since it might also be run by someone who sets up a wiki and 
wants to make sure everything in the installation went ok. Also, extension 
developers could use it to ensure their work didn't break any basic 
functionality.
* Regression tests for bugfixes go to some separate place, since there might be 
a large number of them. I can see Ashar's point, though. Maybe a solution could 
be to put them in a separate tests folder and ignore this one when bundling a 
tarball?
A similar system might also make sense for unit tests.

Cheers,
Markus (mglaser)

On 6 December 2010 12:08, Petr Kadlec  wrote:
> On 6 December 2010 08:47, Ashar Voultoiz  wrote:
>>  - I want to be able to commit a bug fix + associated tests in one 
>> revision without having to fetch the whole phase3 tree.
>
> You don't have to. Subversion supports (since 1.5?) sparse checkouts,
> allowing you to have only selected parts of a directory checked out in 
> your working copy, which is a feature suitable exactly for scenarios 
> like this one.

Except it is still too inconvenient to use for most cases (yeah I was going to 
suggest it too):
"Subversion 1.5's implementation of shallow checkouts is good but does not 
support a couple of interesting behaviors. First, you cannot de-telescope a 
working copy item. Running svn update --set-depth empty in an infinite-depth 
working copy will not have the effect of discarding everything but the topmost 
directory-it will simply error out. Second, there is no depth value to indicate 
that you wish an item to be explicitly excluded. You have to do implicit 
exclusion of an item by including everything else."


  -Niklas

--
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Improving the skinning system

2010-12-06 Thread Daniel Friesen
I've been chipping away at our skins system lately; there's a lot we can 
do to improve it.
Right now a lot of it doesn't work so nicely for an 
ecosystem promoting the creation of a wide variety of skins.

- footerlinks is duplicated across skins and is hardcoded (I fixed this 
in trunk, and did footericons too; however these could be improved with 
helper methods -- we still have unwanted boilerplate)
- our directory structure is odd
-- SkinName.php with skinname/ beside it means skins are not self-enclosed 
and can't simply be dropped into their own directory
-- core skins get a first-class skins directory with proper autoloading, 
stylepath and localstylepath (1.17) variables, and conventions for including 
skin resources like skins/{skinname}/path/to/file.ext; but telling any skin 
we don't include in core to instead go in extensions, use the extension 
assets path, do without localstylepath and without the 
skins/{skinname}/path/to/file.ext convention, be left without any 
autoloading at all, and carry extra boilerplate/code that native skins 
don't need, does not promote the creation of new third-party skins
--- in fact, besides the 4 skins in extensions/skins/ that were in svn, 
which don't appear to be particularly notable and haven't been touched in 
half a year, the 3rd-party skins out there don't even use the extension 
features; they all just have you place them inside skins/ the same way 
the native skins are.
- we have page end boilerplate you have to copy into a skin (we can 
probably fix this somewhat similarly to the way headelement was added)
- toolbox is hardcoded into the skin and needs to be copied (this one 
definitely should be fixed)
- many of our toolbox, portlet, and nav links need very verbose loops and 
hardcoded markup, which would be better handled by a common helper method 
(see the sketch after this list)
-- it should also be possible to differentiate between the page/talk and 
other navlinks the way vector does, without having to hardcode it or add 
a pile of extra php
- building a sidebar should require minimal special cases and hardcoded PHP 
(SEARCH, TOOLBOX, etc... special cases shouldn't be hardcoded into the skin)
- we need a method of building a search form that is flexible to style, 
but minimizes what you have to hardcode and what boilerplate you need; 
most skins should be able to just drop a search box or search bar in with 
a single method call if they don't want to do any special customization.
- the large block of common boilerplate inside the content area also 
isn't that nice for building 3rd party skins
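
To make the helper-method idea concrete, here is a rough sketch (hypothetical 
code, not something that exists in core) of what a common helper could do 
with the item arrays SkinTemplate already builds for personal_urls and 
content_actions:

<?php
# Hypothetical helper: turn one nav item array (with 'href', 'text' and an
# optional 'class', as SkinTemplate provides) into a list item.
function makeNavListItem( $id, array $item ) {
    $attrs = '';
    if ( isset( $item['class'] ) && $item['class'] !== '' ) {
        $attrs = ' class="' . htmlspecialchars( $item['class'] ) . '"';
    }
    return '<li id="' . htmlspecialchars( $id ) . '"' . $attrs . '>'
        . '<a href="' . htmlspecialchars( $item['href'] ) . '">'
        . htmlspecialchars( $item['text'] ) . '</a></li>';
}

# A skin could then render a whole portlet with a trivial loop:
#   foreach ( $this->data['content_actions'] as $id => $tab ) {
#       echo makeNavListItem( "ca-$id", $tab );
#   }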

I've been working on improving the system for the past few days. And 
I've also come up with a new idea for a skin packaging and installation 
convention.
To show it off I ported WordPress' P2 theme into a wiki skin and 
committed it to 
http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/skins/p2wiki/
The result being http://trunk.new.wiki-tools.com/index.php?useskin=p2wiki

The convention should work with currently stable versions of MediaWiki 
as well, and it is compatible both with $wgStylePath conventions and with 
adding new features in newer versions of MediaWiki to tear away the 
boilerplate necessary to package up a skin for distribution.
Currently the convention can be done by packaging up a skin like I did 
with p2wiki, using extension style techniques to add a new skin. It can 
be installed into skins/skinname/ and 
`require_once("$IP/skins/skinname/skinname.php");` can be added to 
LocalSettings.php.
That convention will work in already released versions of MediaWiki. For 
future versions we may want to consider adding autoloading code for this 
style of skins, and adding some convention based defaults.
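
As a concrete illustration, a minimal entry file for that extension-style 
convention could look roughly like this (names are made up; the actual 
p2wiki file may differ):

<?php
# skins/ExampleSkin/ExampleSkin.php -- loaded from LocalSettings.php via
#   require_once( "$IP/skins/ExampleSkin/ExampleSkin.php" );
if ( !defined( 'MEDIAWIKI' ) ) {
    die( 'Not an entry point.' );
}

# Register the skin key and autoload the Skin class from this directory,
# so the whole skin stays self-contained in skins/ExampleSkin/.
$wgValidSkinNames['exampleskin'] = 'ExampleSkin';
$wgAutoloadClasses['SkinExampleSkin'] = dirname( __FILE__ ) . '/ExampleSkin.skin.php';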
As an ideal, it would be nice if some way down the road it's possible to 
build a skin without the use of any php, even if you have to enable an 
extension to add the actual template system or whatever.


As for places for putting skins and distributing them, Lcawte proposed 
adding a /trunk/skins/ for the new convention.
There was a bit of discussion in irc where some downsides, upsides, and 
counter issues were brought up.
- Adding /trunk/skins would require some (possibly small?) changes to 
the Translation extension.
- ExtensionDistributor would also need tweaks.
-- However it was brought up that ExtensionDistributor doesn't work with 
/trunk/extensions/skins/ anyways, at least not if you're only trying to 
get one skin out.
-- I think tweaking ExtensionDistributor to support /trunk/skins would 
be quite easy (at the simplest it's probably a case of making small 
tweaks to make a SkinDistributor that uses /trunk/skins instead of 
/trunk/extensions and whatnot)

It's not part of distribution, but the new installer's ability to point 
out extensions and allow you to install them from the installer was 
pointed out. However, generally each skin doesn't have its own set of 
configuration (Vector does, but as an ideal, having a bunch of skins that 
are nothing but a separate theme for the site should not require special 
configuration of each one of them), so there isn't really much use for 
sharing configuration infrastructure. Additionally, if we do add an 
autoloader for the new style of skin, there's not really any point to 
having the installer point out skins. If they're in a spot the installer 
can find them, they'll already be autoloaded anyways.

Oh, and lastly... if anyone knows of any "really" good WordPress or 
other CMS themes or templates I might consider porting some as examples.

Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Soxred93

On Dec 6, 2010, at 5:12 AM, Niklas Laxström wrote:

> On 6 December 2010 08:11, Q  wrote:
>> I think better time would be spent decoupling all the languages.  Out
>> the 57 megs for an svn export, 41 is the languages directory. Distribute
>> the Big $foo, where $foo is some reasonable number of major languages,
>> and offer the rest as a seperate dl.

Perhaps an option would be to remove them from phase3 and move them to a 
separate directory. Then, if someone switches the wiki language to some obscure 
language, or does &uselang=dfdjdkgj, or does anything else that needs an obscure 
language, it would run a one-time download to the local filesystem. Might be 
too slow, but it's only a one-time download.
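
A very rough sketch of the one-time download idea (the helper name and URL 
are made up purely for illustration):

<?php
# Hypothetical: fetch an obscure language's message file once, the first
# time that language is actually requested. (The real code-to-filename
# mapping is a bit more involved than ucfirst().)
function wfFetchMessagesFile( $code ) {
    global $IP;
    $file = "$IP/languages/messages/Messages" . ucfirst( $code ) . ".php";
    if ( !file_exists( $file ) ) {
        # Illustrative URL only; a real implementation would pin this to
        # the installed release and verify what it downloads.
        $url  = 'http://example.org/mediawiki/1.16/Messages' . ucfirst( $code ) . '.php';
        $data = file_get_contents( $url );
        if ( $data !== false ) {
            file_put_contents( $file, $data );
        }
    }
    return file_exists( $file );
}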

Just an idea.

-X!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Huib Laurens
Hi!

MediaWiki is famous because it's supported in almost all languages; that's
something other wiki software doesn't have, so let's not try to remove that.
I think the main bundle should be MediaWiki with *all* languages.

But another option is to add a second download where people can
download MediaWiki with only English, and to make tarballs for all languages
available so people can add the languages they want. Maybe something that
can be done on a mirror site.

Huib

2010/12/6 Soxred93 

>
> On Dec 6, 2010, at 5:12 AM, Niklas Laxström wrote:
>
> > On 6 December 2010 08:11, Q  wrote:
> >> I think better time would be spent decoupling all the languages.  Out
> >> the 57 megs for an svn export, 41 is the languages directory. Distribute
> >> the Big $foo, where $foo is some reasonable number of major languages,
> >> and offer the rest as a seperate dl.
>
> Perhaps an option would be to remove them from phase3, and moving them to a
> separate directory. Then, if someone switches the wiki language to some
> obscure language, or does &uselang=dfdjdkgj, or other activity that needs an
> obscure language, it would run a one-time download to the local filesystem.
> Might be too slow, but it's only a 1 time download.
>
> Just an idea.
>
> -X!
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Regards,
Huib "Abigor" Laurens



Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Platonides
Niklas Laxström wrote:
> This suggestion seems to come up from time to time. I feel it is
> unrealistic. First of all we can't remove them from svn, since they
> have to be there. We could remove them from the tarballs, but please,
> last time I checked the tarball was hardly over 12 megs. Even with
> very slow modem it should take an hour at most to download that. Using
> better compression algorithm would likely shrink it as much as
> removing few languages. The minor languages don't even take as much
> space as the major languages, which usually have more complete
> localisation.
> 
> Drawing the line is not easy and would likely cause continuous,
> unnecessary contention, put some languages in a privileged position
> and hurt MediaWiki's top notch i18n and l10n support. Each language is
> special, but you don't see that if you just look at the number of
> speakers. Do we really want hurt one of our greatest advantages?
> 
> Besides, it feels silly to talk about this, while we simultaneously
> talk about including some of the most common extensions in the name of
> providing feature complete MediaWiki straight from the box--which is a
> goal I agree with.
> 
>  -Niklas

A few days ago the issue came up when I was talking with an end user
who was complaining about MediaWiki being too large (on the server, not
in the tarball) compared to other apps like WordPress.
I think there's a use case for providing a MediaWiki download where the
end user can check which languages they want and get a custom download.
And/or we could document how to strip some languages from MediaWiki.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Soxred93
It probably would not be too hard to make an extension to do just that. Just 
modify ExtensionDistributor. 

-X!

On Dec 6, 2010, at 10:02 AM, Platonides  wrote:

> Niklas Laxström wrote:
>> This suggestion seems to come up from time to time. I feel it is
>> unrealistic. First of all we can't remove them from svn, since they
>> have to be there. We could remove them from the tarballs, but please,
>> last time I checked the tarball was hardly over 12 megs. Even with
>> very slow modem it should take an hour at most to download that. Using
>> better compression algorithm would likely shrink it as much as
>> removing few languages. The minor languages don't even take as much
>> space as the major languages, which usually have more complete
>> localisation.
>> 
>> Drawing the line is not easy and would likely cause continuous,
>> unnecessary contention, put some languages in a privileged position
>> and hurt MediaWiki's top notch i18n and l10n support. Each language is
>> special, but you don't see that if you just look at the number of
>> speakers. Do we really want hurt one of our greatest advantages?
>> 
>> Besides, it feels silly to talk about this, while we simultaneously
>> talk about including some of the most common extensions in the name of
>> providing feature complete MediaWiki straight from the box--which is a
>> goal I agree with.
>> 
>> -Niklas
> 
> A few days ago the issue came up where I was talking with an end user
> who was complaining about MediaWiki being too large (in the server, not
> in the tarball) compared to other apps like wordpress.
> I think there's a use case for providing a mediawiki download where the
> end user can check which languages they want and provide a custom download.
> And/or document how to strip some languages from mediawiki.
> 
> 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Chad
On Mon, Dec 6, 2010 at 10:02 AM, Platonides  wrote:
> Niklas Laxström wrote:
>> This suggestion seems to come up from time to time. I feel it is
>> unrealistic. First of all we can't remove them from svn, since they
>> have to be there. We could remove them from the tarballs, but please,
>> last time I checked the tarball was hardly over 12 megs. Even with
>> very slow modem it should take an hour at most to download that. Using
>> better compression algorithm would likely shrink it as much as
>> removing few languages. The minor languages don't even take as much
>> space as the major languages, which usually have more complete
>> localisation.
>>
>> Drawing the line is not easy and would likely cause continuous,
>> unnecessary contention, put some languages in a privileged position
>> and hurt MediaWiki's top notch i18n and l10n support. Each language is
>> special, but you don't see that if you just look at the number of
>> speakers. Do we really want hurt one of our greatest advantages?
>>
>> Besides, it feels silly to talk about this, while we simultaneously
>> talk about including some of the most common extensions in the name of
>> providing feature complete MediaWiki straight from the box--which is a
>> goal I agree with.
>>
>>  -Niklas
>
> A few days ago the issue came up where I was talking with an end user
> who was complaining about MediaWiki being too large (in the server, not
> in the tarball) compared to other apps like wordpress.
> I think there's a use case for providing a mediawiki download where the
> end user can check which languages they want and provide a custom download.
> And/or document how to strip some languages from mediawiki.
>

I agree that we could provide smaller tarballs for downloading a release.

I do not agree with moving them in SVN, though.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Improving the skinning system

2010-12-06 Thread Trevor Parscal
On 12/6/10 3:18 AM, Daniel Friesen wrote:
> It's not part of distribution, but the new installer's ability to point
> out extensions and allow you to install them from the installer was
> pointed out. However generally each skin doesn't have it's own set of
> configuration (vector does, but generally as an ideal having a bunch of
> skins that are nothing but a separate theme for the site should not
> require special configuration of each one of them) so there isn't really
> much use for sharing configuration infrastructure. Additionally, if we
> do add an autoloader for the new style of skin there's not really any
> point to having the installer point out skins. If they're in a spot the
> installer can find them, they'll already be autoloaded anyways.
Or you can just add support for a visual interface for controlling skin 
settings. WordPress does this kind of thing by having a script that just 
registers things, but we could do it in a variety of ways. The point 
here is that configuring skins isn't such a bad idea: it helps solve 
problems where people want the same skin with just a little tweaking, 
which reduces forking and helps concentrate efforts on a single distribution.

- Trevor

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making usability part of the development process

2010-12-06 Thread Trevor Parscal
Have you considered using something like say XSL? If a skin was a 
combination of a set of CSS/JS/image files plus an XSL file which took 
XML data (which would be the page content and various user interface 
elements) then you could avoid using a whole new template system. PHP 5 
has an XSL 1.0 library compiled by default, and since it's implemented 
in C, I would bet it's far more performant than any PHP implementation 
of a template system aside from using PHP itself as a template system.

Then again, there's a lot of people out there that dislike XSL for all 
sorts of reasons - which to me have always carried the ring of 
prejudice. I personally think XSL is awesome, and would defend it with 
vigor.

In any case, I'm excited about the work you are doing.

- Trevor

On 12/3/10 5:49 PM, Daniel Friesen wrote:
> Perhaps I should commit what I have so far even though it's not quite
> ready for use yet.
>
> It's enough to do this so far:
> http://testwiki.new.wiki-tools.com/index.php?title=Main_Page&useskin=testskin
> ((this runs off that ported monobook template I put in the pastie))
>
> Right now I'm using smarty's eval: which I don't want to use because it
> re-compiles the template each time you run it.
> I really wanted to make use of MediaWiki's caching system because MW
> already has such a good caching system it's built in, and having to
> require extra permissions just to make smarty happy felt messed up.
>
>
> I have a bad habit of starting projects for fun and experimenting them,
> then abandoning them later. The ones I finish are usually the ones with
> company backing, the really really simple ones, or ones with some other
> sort of backing or other people working on it to. So having someone else
> working on it too would probably be motivating.
> This came after experimenting with porting Wikia's Monaco skin (got 1/3
> the way through), and after it I even started experimenting with writing
> an entire language embedded in php that could be used in MW. And there's
> my long lived vm based wiki farm experiment.
>
>
> One possibility I suppose, would be to use the vector array as the
> standard, but hardcode monobook to flatten it and relocate the tagline.
>
> Some ideas for the templates besides fixing caching... We should
> consider a method of doing personal_urls and whatnot using some blocks
> or functions that can aide in the generation of the links and make them
> simpler to build with tooltips and whatnot. Perhaps also some sort of
> block for content to automatically place the dataAfterContent. The
> sidebar block could be made better potentially.
> Besides footer_links we could also consider moving most of the toolbox
> out of monobook and whatnot. Converting it mostly into something you can
> loop through.
>
>
> Btw, the new powered by icon, using a little more rounded and fancy
> background:
> http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/skins/common/images/poweredby_mediawiki_88x31.png?revision=76120&view=markup
> Anyone else think it feels slightly out of place in monobook?
> http://en.wikipedia.org/wiki/?useskin=monobook
> I debated tweaking the poweredby, etc... buttons to instead use
> something a little more component style, where the content of the button
> is an image, but the bg is provided by the skin. Giving you the same
> style as before in monobook, but that nice new fancy style in vector.
>
> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
>
>
> On 10-12-03 11:10 AM, Trevor Parscal wrote:
>> This is a really awesome project, I'm sure we can figure out a way for
>> Vector and Monobook to be able to use the same arrays.
>>
>> Let me know if you want some help.
>>
>> - Trevor
>>
>> On 12/2/10 6:30 PM, Daniel Friesen wrote:
>>
>>> On 10-12-02 04:52 PM, Platonides wrote:
>>>
 Aryeh Gregor wrote:


> On Thu, Dec 2, 2010 at 5:17 PM, Paul Houle wrote:
>
>
>> Of all the code I've seen,  the Mediawiki code seems to be one of
>> the most difficult code bases to make simple changes in.  When I had to
>> change the template of a mediawiki once,  the easiest answer I found was
>> to put a proxy server in front of it,  drop out the original template
>> and spit the body text into a new template.  (That said,  this was a
>> system I already had on the shelf that worked wonders for all sorts of
>> commercial crapware)
>>
>>
> What do you mean by "change the template of a mediawiki"?  Do you mean
> templates in the MediaWiki sense, as in pages that can be transcluded
> into other pages?  Or do you mean the skin?  Skin HTML can usually be
> changed by just grepping a relevant class or id and editing some raw
> HTML, or a pretty simple wrapper layer.  It can't be changed without
> hacking the code, so it's certainly a lot harder than in most popular
> web apps, but I'm pretty sure you can do it more easily in almost all
> cases than by postprocessing the H

Re: [Wikitech-l] Improving the skinning system

2010-12-06 Thread Daniel Friesen
On 10-12-06 07:48 AM, Trevor Parscal wrote:
> On 12/6/10 3:18 AM, Daniel Friesen wrote:
>
>> It's not part of distribution, but the new installer's ability to point
>> out extensions and allow you to install them from the installer was
>> pointed out. However generally each skin doesn't have it's own set of
>> configuration (vector does, but generally as an ideal having a bunch of
>> skins that are nothing but a separate theme for the site should not
>> require special configuration of each one of them) so there isn't really
>> much use for sharing configuration infrastructure. Additionally, if we
>> do add an autoloader for the new style of skin there's not really any
>> point to having the installer point out skins. If they're in a spot the
>> installer can find them, they'll already be autoloaded anyways.
>>  
> Or you can just add support for a visual interface for controlling skin
> settings. Wordpress does this kind of thing by having a script that just
> registers things, but we could do it in a variety of ways. The point
> here is, configuring skins isn't such a bad idea, it helps solve
> problems where people want the same skin with just a little tweaking,
> reducing forking, which helps concentrate efforts on a single distribution.
>
> - Trevor
>
Sure, that's an even better point. If we DO have some sort of skin 
configuration we'll probably want a nice visual interface that people 
can understand, and if possible, we'll probably want to integrate some 
of it into the preferences system so that users can customize some of 
what they see too.

Anything we build for configuring an extension is going to be 
preference-less and geared towards configuring config vars and actual 
complex config settings used by extensions. Extensions and Skins have 
differing ideals for visual interfaces.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]


-- 
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Mark A. Hershberger
MZMcBride  writes:

> Even if we could say that most users downloading from SVN are
> interested in development, I don't see how that extends to saying that
> test suites are required for development. You've lost me.

If a developer breaks a test, ze[1] should fix it.  As the tests cover
more and more of the code, the tests should be considered more integral
to development since they provide a nice sanity check on any changes a
developer is likely to make.

> The maintenance directory has maintenance tools in it. I don't think test
> suites fall within the category of "maintenance." Like large artwork files,
> these seem to be "development" tools that shouldn't be downloaded with every
> phase3 checkout.

Others have already pointed out that this argument could be applied to
other areas of the phase3 tree, but let me add this.

If you measure the size of everything under phase3 and normalize it
against the size of the maintenance/tests directory[2], you come up with
the following:

1.000  maintenance/tests/
1.263  resources
1.302  skins
3.964  maintenance
14.949 includes
25.818 languages

That is, the tests directory is only 1/4 of the size of the entire
maintenance directory.  The includes directory is almost 15 times larger
than the tests directory and, yes, languages tops them all.

Since we have some developers actively working on the test suite, it
appears that you are more annoyed with the noise during “svn up” rather
than any actual “bloat”.

Mark.

[1] http://en.wikipedia.org/wiki/Gender-neutral_pronoun
[2] du -sk * maintenance/tests/ | perl -ne 'chomp;($c, $l) = \
m{(\S+)\s+(\S+)}; $w = $c/3652;print sprintf ("%0.3f $l\n", $w);' \
| sort -n

-- 
http://hexmode.com/

War begins by calling for the annihilation of the Other,
but ends ultimately in self-annihilation.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making usability part of the development process

2010-12-06 Thread Daniel Friesen
PHP -> XSL doesn't quite feel like much of an improvement in terms of 
cutting down on the verbose redundant code boilerplate required to 
insert something.
ie: <xsl:value-of select="title"/> doesn't look much better than 
<?php $this->text("title") ?>, as opposed to {$title|escape:html}.
And I'm not sure how XSL handles html vs. text, we have stuff you want 
to htmlescape and stuff you want to output raw.
Using an XSL lib also doesn't play nicely with things like calling a 
function to generate tooltips or fetch something from the i18n system.

Unfortunately I think any designer we can convince to build a MediaWiki 
skin will probably dislike using XSL as a template language more than 
using PHP as a template language.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]


On 10-12-06 07:57 AM, Trevor Parscal wrote:
> Have you considered using something like say XSL? If a skin was a
> combination of a set of CSS/JS/image files plus an XSL file which took
> XML data (which would be the page content and various user interface
> elements) then you could avoid using a whole new template system. PHP 5
> has an XSL 1.0 library compiled by default, and since it's implemented
> in C, I would bet it's far more performant than any PHP implementation
> of a template system aside from using PHP itself as a template system.
>
> Then again, there's a lot of people out there that dislike XSL for all
> sorts of reasons - which to me have always carried the ring of
> prejudice. I personally think XSL is awesome, and would defend it with
> vigor.
>
> In any case, I'm excited about the work you are doing.
>
> - Trevor
>
> On 12/3/10 5:49 PM, Daniel Friesen wrote:
>
>> Perhaps I should commit what I have so far even though it's not quite
>> ready for use yet.
>>
>> It's enough to do this so far:
>> http://testwiki.new.wiki-tools.com/index.php?title=Main_Page&useskin=testskin
>> ((this runs off that ported monobook template I put in the pastie))
>>
>> Right now I'm using smarty's eval: which I don't want to use because it
>> re-compiles the template each time you run it.
>> I really wanted to make use of MediaWiki's caching system because MW
>> already has such a good caching system it's built in, and having to
>> require extra permissions just to make smarty happy felt messed up.
>>
>>
>> I have a bad habit of starting projects for fun and experimenting them,
>> then abandoning them later. The ones I finish are usually the ones with
>> company backing, the really really simple ones, or ones with some other
>> sort of backing or other people working on it to. So having someone else
>> working on it too would probably be motivating.
>> This came after experimenting with porting Wikia's Monaco skin (got 1/3
>> the way through), and after it I even started experimenting with writing
>> an entire language embedded in php that could be used in MW. And there's
>> my long lived vm based wiki farm experiment.
>>
>>
>> One possibility I suppose, would be to use the vector array as the
>> standard, but hardcode monobook to flatten it and relocate the tagline.
>>
>> Some ideas for the templates besides fixing caching... We should
>> consider a method of doing personal_urls and whatnot using some blocks
>> or functions that can aide in the generation of the links and make them
>> simpler to build with tooltips and whatnot. Perhaps also some sort of
>> block for content to automatically place the dataAfterContent. The
>> sidebar block could be made better potentially.
>> Besides footer_links we could also consider moving most of the toolbox
>> out of monobook and whatnot. Converting it mostly into something you can
>> loop through.
>>
>>
>> Btw, the new powered by icon, using a little more rounded and fancy
>> background:
>> http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/skins/common/images/poweredby_mediawiki_88x31.png?revision=76120&view=markup
>> Anyone else think it feels slightly out of place in monobook?
>> http://en.wikipedia.org/wiki/?useskin=monobook
>> I debated tweaking the poweredby, etc... buttons to instead use
>> something a little more component style, where the content of the button
>> is an image, but the bg is provided by the skin. Giving you the same
>> style as before in monobook, but that nice new fancy style in vector.
>>
>> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
>>
>>
>> On 10-12-03 11:10 AM, Trevor Parscal wrote:
>>  
>>> This is a really awesome project, I'm sure we can figure out a way for
>>> Vector and Monobook to be able to use the same arrays.
>>>
>>> Let me know if you want some help.
>>>
>>> - Trevor
>>>
>>> On 12/2/10 6:30 PM, Daniel Friesen wrote:
>>>
>>>
 On 10-12-02 04:52 PM, Platonides wrote:

  
> Aryeh Gregor wrote:
>
>
>
>> On Thu, Dec 2, 2010 at 5:17 PM, Paul Houle  
>> wrote:
>>
>>
>>  
>>>  Of all the code I've seen,  the Mediawiki code seems to be one 
>>> of

Re: [Wikitech-l] Making usability part of the development process

2010-12-06 Thread Trevor Parscal
Fair enough, but when I've pushed for using PHP as a template language, 
even with a clever wrapper that would make injected data escaped by 
default (so unescaping was obvious and could be more easily scrutinized 
for XSS) I have been met with about equal resistance.
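
A minimal sketch of such an escape-by-default wrapper (illustrative only, 
not an existing MediaWiki class, and not necessarily the exact proposal):

<?php
# Template data wrapper: values are HTML-escaped on read by default, so any
# unescaped output is an explicit, greppable ->raw() call site.
class EscapingTemplateData {
    private $data;

    public function __construct( array $data ) {
        $this->data = $data;
    }

    # Default accessor: safe to echo straight into HTML.
    public function get( $key ) {
        return htmlspecialchars( $this->data[$key], ENT_QUOTES );
    }

    # Deliberate, visible escape hatch for pre-rendered HTML.
    public function raw( $key ) {
        return $this->data[$key];
    }
}

# Inside a PHP template:
#   <h1><?php echo $tpl->get( 'title' ); ?></h1>
#   <div id="bodyContent"><?php echo $tpl->raw( 'bodytext' ); ?></div>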

The state of things is that unless you are generating HTML 100% 
procedurally (such as by using the Xml and Html classes) then you are 
seen as doing something wrong. Adding something like smarty to the 
system seems silly when we have two completely reasonable alternatives 
at our disposal already.

XSL only allows data to be injected without escaping when the template
asks for it explicitly, via an attribute on the output element.

This is good because it's escaped by default, and any use of this 
attribute would be obvious and could thus be scrutinized for XSS.
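
A minimal sketch of what that looks like with PHP 5's XSLTProcessor (the 
stylesheet and data here are purely illustrative):

<?php
$xml = new DOMDocument();
$xml->loadXML( '<page><title>Foo &amp; bar</title>' .
    '<bodytext>&lt;p&gt;Hello&lt;/p&gt;</bodytext></page>' );

$xslDoc = new DOMDocument();
$xslDoc->loadXML( <<<XSL
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <xsl:template match="/page">
    <!-- escaped by default -->
    <h1><xsl:value-of select="title"/></h1>
    <!-- explicit, auditable raw output -->
    <div id="bodyContent">
      <xsl:value-of select="bodytext" disable-output-escaping="yes"/>
    </div>
  </xsl:template>
</xsl:stylesheet>
XSL
);

$proc = new XSLTProcessor();
$proc->importStylesheet( $xslDoc );
echo $proc->transformToXML( $xml );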

As for hooking in an i18n system, I'm not so sure skins should really be 
arbitrarily injecting messages anyways. What's the use case here? 
Generating tooltips? I'm a bit lost there.

- Trevor

On 12/6/10 8:13 AM, Daniel Friesen wrote:
> PHP ->  XSL doesn't quite feel like much of an improvement in terms of
> cutting down on the verbose redundant code boilerplate required to
> insert something.
> ie: <xsl:value-of select="title"/> doesn't look much better than <?php $this->text("title") ?>, as opposed to {$title|escape:html}.
> And I'm not sure how XSL handles html vs. text, we have stuff you want
> to htmlescape and stuff you want to output raw.
> Using an XSL lib also doesn't play nicely with things like calling a
> function to generate tooltips or fetch something from the i18n system.
>
> Unfortunately I think any designer we can convince to build a MediaWiki
> skin will probably dislike using XSL as a template language more than
> using PHP as a template language.
>
> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
>
>
> On 10-12-06 07:57 AM, Trevor Parscal wrote:
>> Have you considered using something like say XSL? If a skin was a
>> combination of a set of CSS/JS/image files plus an XSL file which took
>> XML data (which would be the page content and various user interface
>> elements) then you could avoid using a whole new template system. PHP 5
>> has an XSL 1.0 library compiled by default, and since it's implemented
>> in C, I would bet it's far more performant than any PHP implementation
>> of a template system aside from using PHP itself as a template system.
>>
>> Then again, there's a lot of people out there that dislike XSL for all
>> sorts of reasons - which to me have always carried the ring of
>> prejudice. I personally think XSL is awesome, and would defend it with
>> vigor.
>>
>> In any case, I'm excited about the work you are doing.
>>
>> - Trevor
>>
>> On 12/3/10 5:49 PM, Daniel Friesen wrote:
>>
>>> Perhaps I should commit what I have so far even though it's not quite
>>> ready for use yet.
>>>
>>> It's enough to do this so far:
>>> http://testwiki.new.wiki-tools.com/index.php?title=Main_Page&useskin=testskin
>>> ((this runs off that ported monobook template I put in the pastie))
>>>
>>> Right now I'm using smarty's eval: which I don't want to use because it
>>> re-compiles the template each time you run it.
>>> I really wanted to make use of MediaWiki's caching system because MW
>>> already has such a good caching system it's built in, and having to
>>> require extra permissions just to make smarty happy felt messed up.
>>>
>>>
>>> I have a bad habit of starting projects for fun and experimenting them,
>>> then abandoning them later. The ones I finish are usually the ones with
>>> company backing, the really really simple ones, or ones with some other
>>> sort of backing or other people working on it to. So having someone else
>>> working on it too would probably be motivating.
>>> This came after experimenting with porting Wikia's Monaco skin (got 1/3
>>> the way through), and after it I even started experimenting with writing
>>> an entire language embedded in php that could be used in MW. And there's
>>> my long lived vm based wiki farm experiment.
>>>
>>>
>>> One possibility I suppose, would be to use the vector array as the
>>> standard, but hardcode monobook to flatten it and relocate the tagline.
>>>
>>> Some ideas for the templates besides fixing caching... We should
>>> consider a method of doing personal_urls and whatnot using some blocks
>>> or functions that can aide in the generation of the links and make them
>>> simpler to build with tooltips and whatnot. Perhaps also some sort of
>>> block for content to automatically place the dataAfterContent. The
>>> sidebar block could be made better potentially.
>>> Besides footer_links we could also consider moving most of the toolbox
>>> out of monobook and whatnot. Converting it mostly into something you can
>>> loop through.
>>>
>>>
>>> Btw, the new powered by icon, using a little more rounded and fancy
>>> background:
>>> http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/skins/common/images/poweredby_mediawiki_88x31.png?revision=76120&view=markup
>>> Anyone else think it feels slight

Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Ashar Voultoiz
On 06/12/10 09:07, MZMcBride wrote:
> The maintenance directory has maintenance tools in it. I don't think test
> suites fall within the category of "maintenance." Like large artwork files,
> these seem to be "development" tools that shouldn't be downloaded with every
> phase3 checkout.

On the second part, I disagree with you, since development tools must be 
part of MediaWiki core for development purposes.  I do agree, though, that 
the tests themselves should probably live in their own directory such 
as "phase3/tests".  That is where Avar put them originally (or it was 
"phase3/t", which is prove's default path).

This way you can easily ignore the "tests" directory but keep the actual 
test framework in "maintenance/tests/".

-- 
Ashar Voultoiz


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Niklas Laxström
On 6 December 2010 17:02, Platonides  wrote:
> Niklas Laxström wrote:
> A few days ago the issue came up where I was talking with an end user
> who was complaining about MediaWiki being too large (in the server, not
> in the tarball) compared to other apps like wordpress.
> I think there's a use case for providing a mediawiki download where the
> end user can check which languages they want and provide a custom download.
> And/or document how to strip some languages from mediawiki.


How hard can it be?
find languages/messages/ -name "Messages*.php" -and -not -name
"MessagesEn.php" -delete
Or use some fancy graphical ui to do the same.

As for the tarball..
8,0M  mediawiki-1.16.0.7z
13M   mediawiki-1.16.0.tar.gz

This is very simple to implement with no big downsides (we can
continue providing both formats).

And here's the same with all message files stripped out:
2,0M  mw-lite.7z
2,8M  mw-lite.tar.gz

It might look cool, and someone can provide those if they want, but I
think the default release tarball should contain all languages.
 -Niklas


-- 
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Platonides
Niklas Laxström wrote:
> On 6 December 2010 17:02, Platonides wrote:
>> Niklas Laxström wrote:
>> A few days ago the issue came up where I was talking with an end user
>> who was complaining about MediaWiki being too large (in the server, not
>> in the tarball) compared to other apps like wordpress.
>> I think there's a use case for providing a mediawiki download where the
>> end user can check which languages they want and provide a custom download.
>> And/or document how to strip some languages from mediawiki.
> 
> 
> How hard can it be?
> find languages/messages/ -name "Messages*.php" -and -not -name
> "MessagesEn.php" -delete
> Or use some fancy graphical ui to do the same.

I don't know. Aren't the message files referenced from somewhere?


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Improving the skinning system

2010-12-06 Thread Rob Lanphier
On Mon, Dec 6, 2010 at 3:18 AM, Daniel Friesen
 wrote:
> I've been chipping away at our skins system lately, there's a lot we can
> improve to improve the skins system.
> Right now there's a lot of it that doesn't work so nicely for an
> ecosystem promoting the creation of a wide variety of skins.

Glad to see this area getting some love!

> - Adding /trunk/skins would require some (possibly small?) changes to
> the Translation extension.
> - ExtensionDistributor would also need tweaks.
> -- However it was brought up that ExtensionDistributor doesn't work with
> /trunk/extensions/skins/ anyways, at least not if you're only trying to
> get one skin out.
> -- I think tweaking ExtensionDistributor to support /trunk/skins would
> be quite easy (at the simplest it's probably a case of making small
> tweaks to make a SkinDistributor that uses /trunk/skins instead of
> /trunk/extensions and whatnot)

One thought on that subject.  The main distinction between the
extensions directory and the skins directory is that the extensions
directory is a set of zero or more extensions that are presumably
complementary, none of which is required for proper operation of
MediaWiki (ParserFunctions debate notwithstanding).  The skins
directory is a set of at least one active skin along with zero or more
mutually exclusive alternatives to the active skin.

One should implicitly be able to blow away the extensions directory,
modify the LocalSettings.php file accordingly, and have a working
MediaWiki install.  It's worth exploring if this should be an explicit
requirement or something we should drift from.

> Oh, and lastly... if anyone knows of any "really" good WordPress or
> other CMS themes or templates I might consider porting some as examples.

Drupal's Zen theme would be a great choice, I think:
http://drupal.org/project/zen

It's designed to be sort of a meta-theme that is very easy to
customize.  Here's the page with some of the contributed sub-themes:
http://drupal.org/node/340837

Rob

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread MZMcBride
Mark A. Hershberger wrote:
> MZMcBride  writes:
>> Even if we could say that most users downloading from SVN are
>> interested in development, I don't see how that extends to saying that
>> test suites are required for development. You've lost me.
> 
> If a developer breaks a test, ze[1] should fix it.  As the tests cover
> more and more of the code, the tests should be considered more integral
> to development since they provide a nice sanity check on any changes a
> developer is likely to make.

I think the idea that only people intending to do development work on
MediaWiki download from SVN is a bit insane. And as you note, these tests
are only going to grow in size over time.

>> The maintenance directory has maintenance tools in it. I don't think test
>> suites fall within the category of "maintenance." Like large artwork files,
>> these seem to be "development" tools that shouldn't be downloaded with every
>> phase3 checkout.
> 
> [...]
>  
> That is, the tests directory is only 1/4 of the size of the entire
> maintenance directory.  The includes directory is almost 15 times larger
> than the tests directory and, yes, languages tops them all.

You seem to be making an argument that other parts of phase3 need
examination and possible trimming. I agree.

> Since we have some developers actively working on the test suite, it
> appears that you are more annoyed with the noise during “svn up” rather
> than any actual “bloat”.

I wouldn't say it's noise. But checkouts take longer and there are more
files to deal with in the directory. When I see them filling up space on my
servers (admittedly not a lot of space), I think it's reasonable to ask why
everyone is downloading these tests when relatively so few people will ever
run them or need them.

I've come to accept that downloading from SVN is just going to include these
tests. As I wrote on the bug[1] earlier today, there seems to be consensus
to keep the tests directory in phase3, though perhaps move it a directory
up.

The larger question now is whether to include the tests in the released
tarballs.

MZMcBride

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=26259#c3



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Victor Vasiliev
I strongly disagree with removing tests from the /trunk/phase3
directory. They should be there, though they may be left out of the
release. Maybe introduce a "production" branch where phase3 is
mirrored without the development stuff?

--vvv

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Chad
On Mon, Dec 6, 2010 at 10:08 PM, Victor Vasiliev  wrote:
> I strongly disagree with removing tests from the /trunk/phase3
> directory. They should be there, though they may be thrown out on the
> release. Maybe introduce a "production" branch where the phase3 is
> mirrored without development stuff?
>

Too much work IMO.

I think the easiest solution is to just tweak the make-release
script to exclude the tests directory (wherever we end up
putting it).

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Dmitriy Sintsov
* Chad  [Mon, 6 Dec 2010 22:11:19 -0500]:
> On Mon, Dec 6, 2010 at 10:08 PM, Victor Vasiliev 
> wrote:
> > I strongly disagree with removing tests from the /trunk/phase3
> > directory. They should be there, though they may be thrown out on 
the
> > release. Maybe introduce a "production" branch where the phase3 is
> > mirrored without development stuff?
> >
>
> Too much work IMO.
>
> I think the easiest solution is to just tweak the make-release
> script to exclude the tests directory (wherever we end up
> putting it).
>
One can even imagine that some subset of tests might be run after 
installation completes, to verify it.
As for rare languages, text messages have a good compression ratio; the 
distribution could ship rare language files gzip-compressed and unpack one 
on demand, on the first attempt to load that particular language.
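
A rough sketch of the unpack-on-demand part (a hypothetical helper, not 
existing MediaWiki code):

<?php
# Ship rare languages as MessagesXx.php.gz and inflate one the first time
# that language is requested (the code-to-filename mapping is simplified).
function wfInflateMessagesFile( $code ) {
    global $IP;
    $php = "$IP/languages/messages/Messages" . ucfirst( $code ) . ".php";
    $gz  = $php . '.gz';
    if ( !file_exists( $php ) && file_exists( $gz ) ) {
        # One-time cost: decompress next to the other message files.
        file_put_contents( $php, file_get_contents( "compress.zlib://$gz" ) );
    }
    return file_exists( $php );
}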
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making usability part of the development process

2010-12-06 Thread Dmitriy Sintsov
* Daniel Friesen  [Mon, 06 Dec 2010 08:13:04 
-0800]:
> PHP -> XSL doesn't quite feel like much of an improvement in terms of
> cutting down on the verbose redundant code boilerplate required to
> insert something.
> ie: <xsl:value-of select="title"/> doesn't look much better than <?php $this->text("title") ?>, as opposed to {$title|escape:html}.
XQuery code looks much less bloated, much like the {...} syntax in your last example:
http://en.wikipedia.org/wiki/XQuery
However, I am not sure it is equally powerful, and it is probably not well 
supported in PHP :-(

> And I'm not sure how XSL handles html vs. text, we have stuff you want
> to htmlescape and stuff you want to output raw.
> Using an XSL lib also doesn't play nicely with things like calling a
> function to generate tooltips or fetch something from the i18n system.
>
Also, XSL wants tags to be properly closed, while HTML5 has settled on 
a "human-friendly" way of handling tag soup.

> Unfortunately I think any designer we can convince to build a MediaWiki
> skin will probably dislike using XSL as a template language more than
> using PHP as a template language.
>
>
I wonder whether XQuery is a good alternative, but it does not seem to be 
catching on :-( The same goes for XSL (XSL has existed for ages, yet it never 
gained huge popularity - perhaps because it's tiresome to type 
statements in tags).
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Dmitriy Sintsov
* Huib Laurens  [Mon, 6 Dec 2010 12:46:53 +0100]:
> Hi!
>
> MediaWiki is famous because its almost supported in all languages 
that's
> something other wiki software don't have, lets not try to remove that.
> So I
> think the main bundle should be mediawiki with *all* languages.
>
Preservation of minor languages is a good idea. Besides, people might 
disagree about which languages are minor and which are not.

> But a other option is to add a second download option where people can
> download mediawiki with only EN and make tarballs for all languages
> available so people can add the languages they want. Maybe something
> that
> can be done on a mirror site.
>
That is a widely used approach in desktop software: "localized versions" 
of the software.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Improving the skinning system

2010-12-06 Thread Dmitriy Sintsov
* Daniel Friesen  [Mon, 06 Dec 2010 03:18:44 
-0800]:
> I've been chipping away at our skins system lately, there's a lot we can
> improve to improve the skins system.
> Right now there's a lot of it that doesn't work so nicely for an
> ecosystem promoting the creation of a wide variety of skins.
>
Nice work and a good looking skin! By the way, I've always wondered why 
QuickTemplate::execute() itself could not be divided into separate 
smaller methods for the different parts of the layout. That way one could 
inherit the base template and alter only a few lines, instead of patching 
or re-implementing the whole execute(). Currently, the base QuickTemplate 
class is too limited.
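Something along these lines is what I mean (hypothetical method names, 
only to illustrate the idea; this is not how QuickTemplate works today):

<?php
// A layout template whose execute() is split into overridable parts.
class LayoutTemplate extends QuickTemplate {
    public function execute() {
        $this->renderHeader();
        $this->renderContent();
        $this->renderSidebar();
        $this->renderFooter();
    }

    protected function renderHeader()  { /* default header markup */ }
    protected function renderContent() { /* default content markup */ }
    protected function renderSidebar() { /* default sidebar markup */ }
    protected function renderFooter()  { /* default footer markup */ }
}

// A custom skin would then override only the part it wants to change.
class MySkinTemplate extends LayoutTemplate {
    protected function renderSidebar() {
        echo '<div id="my-sidebar"><!-- custom sidebar --></div>';
    }
}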
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Improving the skinning system

2010-12-06 Thread Daniel Friesen
On 10-12-06 10:46 PM, Dmitriy Sintsov wrote:
> * Daniel Friesen  [Mon, 06 Dec 2010 03:18:44
> -0800]:
>
>> I've been chipping away at our skins system lately, there's a lot we can
>> improve to improve the skins system.
>> Right now there's a lot of it that doesn't work so nicely for an
>> ecosystem promoting the creation of a wide variety of skins.
>>
>>
> Nice work and a good looking skin! By the way, I've always wondered why
> QuickTemplate::execute() itself could not be divided into separate
> smaller methods for the different parts of the layout. That way one could
> inherit the base template and alter only a few lines, instead of patching
> or re-implementing the whole execute(). Currently, the base QuickTemplate
> class is too limited.
> Dmitriy
>
Well, from what I see QuickTemplate was mostly built as an actual 
template, i.e. its purpose was the setting and use of the data variables 
stored in the template, some sort of holdover from the PHPTAL days.

With my latest commit I created a BaseTemplate extending QuickTemplate. 
It will act as the new class to extend instead of QuickTemplate, and it 
will have a bunch of helpers for the common ways we interact with the 
data that SkinTemplate sets on QuickTemplate-based templates.
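To give a rough idea of the direction (the exact helper names may differ 
from what is in the commit), a skin template could then look something 
like:

<?php
// Sketch of a skin template using BaseTemplate helpers instead of poking
// at the raw data arrays directly. Helper names are illustrative.
class ExampleTemplate extends BaseTemplate {
    public function execute() {
        echo '<ul id="example-toolbox">';
        foreach ( $this->getToolbox() as $key => $item ) {
            // makeListItem() turns a toolbox entry into a full <li>,
            // including tooltip and accesskey handling.
            echo $this->makeListItem( $key, $item );
        }
        echo '</ul>';
    }
}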

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making usability part of the development process

2010-12-06 Thread Daniel Friesen

<input type="submit" name="go" <?php echo $this->skin->tooltipAndAccesskey( 'search-go' ); ?> />


One thing our skin system "does" have is an extensive linker and a 
system for building tooltips and accesskeys for things using our i18n 
system. And calls to the message system from skins are all over the 
place: the tagline, jumpto, and basically every header that is part of a 
skin itself are generated by calls to the i18n system; they are not 
hardcoded in SkinTemplate.
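For instance, MonoBook-style templates are full of direct message calls, 
roughly like this (quoted from memory, so the exact markup may differ):

<div id="siteSub"><?php $this->msg( 'tagline' ) ?></div>

<div id="jump-to-nav">
    <?php $this->msg( 'jumpto' ) ?>
    <a href="#column-one"><?php $this->msg( 'jumptonavigation' ) ?></a>,
    <a href="#searchInput"><?php $this->msg( 'jumptosearch' ) ?></a>
</div>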

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

On 10-12-06 08:20 AM, Trevor Parscal wrote:
> Fair enough, but when I've pushed for using PHP as a template language,
> even with a clever wrapper that would make injected data escaped by
> default (so unescaping was obvious and could be more easily scrutinized
> for XSS) I have been met with about equal resistance.
>
> The state of things is that unless you are generating HTML 100%
> procedurally (such as by using the Xml and Html classes) then you are
> seen as doing something wrong. Adding something like smarty to the
> system seems silly when we have two completely reasonable alternatives
> at our disposal already.
>
> XSL allows data to be injected without escaping:
>
>   <xsl:value-of select="..." disable-output-escaping="yes" />
>
> This is good because it's escaped by default, and any use of this
> attribute would be obvious and could thus be scrutinized for XSS.
>
> As for hooking in an i18n system, I'm not so sure skins should really be
> arbitrarily injecting messages anyways. What's the use case here?
> Generating tooltips? I'm a bit lost there.
>
> - Trevor
>
> On 12/6/10 8:13 AM, Daniel Friesen wrote:
>
>> PHP ->   XSL doesn't quite feel like much of an improvement in terms of
>> cutting down on the verbose redundant code boilerplate required to
>> insert something.
>> ie: <xsl:value-of select="title" /> doesn't look much better than <?php $this->text("title") ?>, as opposed to {$title|escape:html}.
>> And I'm not sure how XSL handles html vs. text, we have stuff you want
>> to htmlescape and stuff you want to output raw.
>> Using an XSL lib also doesn't play nicely with things like calling a
>> function to generate tooltips or fetch something from the i18n system.
>>
>> Unfortunately I think any designer we can convince to build a MediaWiki
>> skin will probably dislike using XSL as a template language more than
>> using PHP as a template language.
>>
>> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
>>
>>
>> On 10-12-06 07:57 AM, Trevor Parscal wrote:
>>
>>> Have you considered using something like say XSL? If a skin was a
>>> combination of a set of CSS/JS/image files plus an XSL file which took
>>> XML data (which would be the page content and various user interface
>>> elements) then you could avoid using a whole new template system. PHP 5
>>> has an XSL 1.0 library compiled by default, and since it's implemented
>>> in C, I would bet it's far more performant than any PHP implementation
>>> of a template system aside from using PHP itself as a template system.
>>>
>>> Then again, there's a lot of people out there that dislike XSL for all
>>> sorts of reasons - which to me have always carried the ring of
>>> prejudice. I personally think XSL is awesome, and would defend it with
>>> vigor.
>>>
>>> In any case, I'm excited about the work you are doing.
>>>
>>> - Trevor
>>>
>>> On 12/3/10 5:49 PM, Daniel Friesen wrote:
>>>
>>>
 Perhaps I should commit what I have so far even though it's not quite
 ready for use yet.

 It's enough to do this so far:
 http://testwiki.new.wiki-tools.com/index.php?title=Main_Page&useskin=testskin
 ((this runs off that ported monobook template I put in the pastie))

 Right now I'm using smarty's eval: which I don't want to use because it
 re-compiles the template each time you run it.
I really wanted to make use of MediaWiki's caching system, because MW
already has such a good caching system built in, and having to require
extra permissions just to make Smarty happy felt messed up.


I have a bad habit of starting projects for fun and experimenting with
them, then abandoning them later. The ones I finish are usually the ones
with company backing, the really really simple ones, or ones with some
other sort of backing or other people working on them too. So having
someone else working on it too would probably be motivating.
This came after experimenting with porting Wikia's Monaco skin (I got 1/3
of the way through), and after that I even started experimenting with
writing an entire language embedded in PHP that could be used in MW. And
there's my long-lived VM-based wiki farm experiment.


One possibility, I suppose, would be to use the vector array as the
standard, but hardcode monobook to flatten it and relocate the tagline.

Some ideas for the templates besides fixing caching... We should
consider a method of doing personal_urls and whatnot using some blocks
or functions that can aid in the generation of the