Re: [Wikitech-l] wfRunHooks deprecation

2015-01-21 Thread Dmitriy Sintsov
They could probably turn that global class into a "facade" - the compact
front end to an IoC container that the Laravel framework uses.
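
Roughly something like this minimal sketch (the class names are invented here,
and this is not Laravel's actual implementation) - a static front end that only
forwards calls to an instance resolved from a container:

<?php
// Illustrative only: a "facade" forwards static calls to a container-managed
// instance instead of holding the state in static fields itself.
class Container {
	private static $services = array();

	public static function set( $name, $service ) {
		self::$services[$name] = $service;
	}

	public static function get( $name ) {
		return self::$services[$name];
	}
}

class HookRunner {
	private $handlers = array();

	public function register( $name, $callback ) {
		$this->handlers[$name][] = $callback;
	}

	public function run( $name, array $args = array() ) {
		$handlers = isset( $this->handlers[$name] ) ? $this->handlers[$name] : array();
		foreach ( $handlers as $callback ) {
			call_user_func_array( $callback, $args );
		}
		return true;
	}
}

class HooksFacade {
	public static function __callStatic( $method, $args ) {
		// The only remaining "global" is the service lookup itself.
		return call_user_func_array( array( Container::get( 'hooks' ), $method ), $args );
	}
}

Container::set( 'hooks', new HookRunner() );
HooksFacade::register( 'ArticleSave', function ( $title ) {
	echo "Saving $title\n";
} );
HooksFacade::run( 'ArticleSave', array( 'Main Page' ) );

The call sites stay as short as static ones, but swapping the underlying
instance (for tests, or per wiki) only touches the container.
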
Dmitriy


On Thu, Jan 22, 2015 at 12:51 AM, Brian Wolff  wrote:

> On Jan 21, 2015 1:40 PM, "Jeroen De Dauw"  wrote:
> >
> > Hey,
> >
> > Does the new syntax offer any advantage over the old one?
> > > Assuming that we want to switch to non-static function calls eventually
> > > (which I hope is the case), wouldn't it be friendlier towards extension
> > > maintainers to only deprecate once we are there, instead of forcing
> them to
> > > update twice?
> > >
> >
> > Good points and questions. While this deprecation is not as problematic
> > as simply ditching the current hook system altogether, it does indeed
> > seem like a bit of busy work.
> >
> > The Hooks class has this comment "Used to supersede $wgHooks, because
> > globals are EVIL.", which is quite amusing if you consider all fields and
> > methods are static. So it's a switch from a global var to a global field,
> > thus adding a second global to get rid of the first one. I have this
> > presentation on static code which has a screenshot of this comment and
> > class in it :)
> >
> > Cheers
> >
> > --
> > Jeroen De Dauw - http://www.bn2vs.com
> > Software craftsmanship advocate
> > Evil software architect at Wikimedia Germany
> > ~=[,,_,,]:3
>
> I'll be honest, I don't understand the point of deprecating that. As you say,
> the evil globalness is the same amount of evil regardless of the type of
> global symbol. And really I don't think global hooks cause too many
> problems.
>
> --bawolff
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki 2.0 (was: No more Architecture Committee?)

2015-01-16 Thread Dmitriy Sintsov
Why is wikitext so disliked? It's more compact to type than HTML, and it's
a templating language, which HTML is not. Otherwise something like Handlebars
(which is weaker than wikitext) would have to be used, or something like web
components and custom tags. But why remove a nice thing (wikitext) which saves
a lot of keystrokes, and break the whole codebase in the process? Strange,
especially since the more progressive Parsoid exists. It looks like the whole
of MW could be rewritten in Node.js, not Python.

On Fri, Jan 16, 2015 at 8:04 PM, James Forrester 
wrote:

> [Moving threads for on-topic-ness.]
>
> On 16 January 2015 at 07:01, Brian Wolff  wrote:
>
> > Does anyone actually have
> > anything they want that is difficult to do currently and requires a mass
> > compat break?
>
>
> ​Sure.
>
> ​Three quick examples of things on the horizon (I'm not particularly saying
> we'd actually do these for Wikimedia's use, but if you're going to ask for
> straw man arguments… :-)):
>
>- ​Get rid of wikitext on the server-side.
>   - HTML storage only. Remove MWParser from the codebase. All
>   extensions that hook into wikitext (so, almost all of them?) will
> need to
>   be re-written.
>- Real-time collaborative editing.
>   - Huge semantic change to the concept of a 'revision'; we'd probably
>   need to re-structure the table from scratch. Breaking change for
> many tools
>   in core and elsewhere.
>- Replace local PHP hooks with proper services interfaces instead.​
>- Loads of opportunities for improvements here (anti-spam tools 'as a
>   service', Wordpress style; pre-flighting saves; ), but again, pretty
> much
>   everything will need re-writing; this would likely be "progressive",
>   happening one at a time to areas where it's
> useful/wanted/needed, but it's
>   still a huge breaking change for many extensions.
>
>
>
> > Proposing to rewrite mediawiki because we can without even a
> > notion of what we would want to do differently seems silly.
> >
>
> ​Oh, absolutely. I think RobLa's point was that it's unclear who feels
> empowered to make that decision (rather than the pitch). I don't. I don't
> think RobLa does. Clearly the Architecture Committee don't.​
>
> ​J.
> --
> James D. Forrester
> Product Manager, Editing
> Wikimedia Foundation, Inc.
>
> jforres...@wikimedia.org | @jdforrester
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Working around composer? (Fatal error: Class 'Cdb\Reader' not found)

2015-01-13 Thread Dmitriy Sintsov
Doesn't a web interface for maintenance impose a security risk? Also, web scripts
are usually time-limited while shell scripts are not. To update the text status
of web scripts one has to run batches via JSON, which is cumbersome.
Laravel's artisan (a shell script) runs composer itself, so why can't
maintenance/update.php? I work with Laravel now, after years of previously
working with MediaWiki.
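
For illustration only (this is not the actual checkComposerLockUpToDate.php, and
it assumes composer.lock still carries a "hash" field that is the md5 of the
composer.json it was generated from), the kind of check update.php could run
before touching the database:

<?php
// Hypothetical pre-flight check; $ip stands for the MediaWiki install path.
$ip = dirname( __DIR__ );

if ( !is_readable( "$ip/composer.lock" ) ) {
	fwrite( STDERR, "No composer.lock found - run 'composer install' first.\n" );
	exit( 1 );
}
$json = file_get_contents( "$ip/composer.json" );
$lock = json_decode( file_get_contents( "$ip/composer.lock" ), true );

if ( !isset( $lock['hash'] ) || $lock['hash'] !== md5( $json ) ) {
	fwrite( STDERR, "composer.lock is out of date - run 'composer update' before running update.php.\n" );
	exit( 1 );
}
echo "External libraries look up to date.\n";

Shelling out to composer itself from update.php would be another option, but
printing a clear message is probably the safer default.
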
Dmitriy


On Tue, Jan 13, 2015 at 9:29 PM, Ryan Kaldari 
wrote:

> This may be a dumb question, but has anyone worked on creating a web
> interface for running update and maintenance scripts (and viewing
> associated logs)? That would probably make the whole process less painful
> and confusing for 3rd party users, especially if the interface offered some
> guidance on what each script did and when it was last run.
>
> Kaldari
>
> On Tue, Jan 13, 2015 at 9:08 AM, Bryan Davis  wrote:
>
> > On Tue, Jan 13, 2015 at 7:40 AM, Tyler Romeo 
> wrote:
> > > I know we just added some new maintenance scripts for checking things
> > with composer. I’m sure it wouldn’t be that bad having update.php check
> > first and tell the user to run “composer install” before doing
> update.php.
> >
> >
> > Kunal made the new "checkComposerLockUpToDate.php" maintenance script
> > to validate $IP/vendor against the $IP/composer.json file. An end user
> > could either add this to their typical workflow before running
> > update.php or we could try to find a reasonable way to integrate the
> > check it performs into the update script. Checking for external
> > dependencies isn't the same thing at all as updating a database schema
> > so I'd lean towards suggesting that the new script be used separately.
> >
> >
> > > On January 13, 2015 at 08:07:34, Marcin Cieslak (sa...@saper.info)
> > wrote:
> > >
> > > I am kind of late to the party but I have upgraded one of
> > > my throaway development wikis with the usual
> > > "git remote update && git merge && php maintenance/update.php" process
> > > and after the above succeeded I was nevertheless greeted by:
> > >
> > > Fatal error: Class 'Cdb\Reader' not found
> > >
> > > exception coming out of includes/cache/LocalisationCache.php on line
> 1263
> > >
> > > It seems that I just forgot to update the "vendor" directory
> > > (I am somewhat reluctant to run composer due to the allow_url_fopen=1
> > > requirement)
> > >
> > > Would it be reasonable to add some basic external-library
> > > checks to update.php to remind users to update those core
> > > components prior to accessing the wiki?
> > >
> > > Btw. I think UPGRADE doc does not (yet) mention the new process.
> >
> > I think that Kunal's thinking on this (Composer and UPGRADE) was that
> > when the 1.25 tarballs are released they will likely bundle the
> > required libraries directly and thus use of Composer will not be
> > needed by the end user. There is a sentence in the Git subsection of
> >  mentioning the
> > external library dependency:
> > > If you are upgrading to MediaWiki 1.25 or later, you will also need to
> > install some external libraries. See the documentation on that for more
> > details.
> >
> > Maybe that needs a bit more emphasis on the wiki page?
> >
> > Bryan
> > --
> > Bryan Davis  Wikimedia Foundation
> > [[m:User:BDavis_(WMF)]]  Sr Software EngineerBoise, ID USA
> > irc: bd808v:415.839.6885 x6855
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Recommended way to change a lot of absolute links at wiki

2014-08-11 Thread Dmitriy Sintsov
The replacement in the XML dump was done automatically, and I did not want to
remove the timestamp from all revisions (even the current ones) because the wiki
is quite large (about 9000 articles). So I made my own tool to touch such
revision timestamps:
https://github.com/Dmitri-Sintsov/MwDumpProcessor/commit/079cc194215632db3e8017e10630cdea2711dc62
It's not a complete solution (no real parser, no support for extra fields,
such as LiquidThreads inserts in the dump), but it is enough for ordinary NS_MAIN
pages.
It's strange that the dump importer itself does not compare the base36 sha1,
neither to warn about altered content nor to import only manually altered
revisions.
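
For reference, a minimal sketch of recomputing that base36 sha1 for an edited
revision text (it assumes the GMP extension is available; MediaWiki itself uses
its own base-conversion helper instead):

<?php
// Recompute the base-36 sha1 stored in rev_sha1 / the <sha1> dump element.
function base36Sha1( $text ) {
	$hex = sha1( $text );                            // 40 hex digits
	$b36 = gmp_strval( gmp_init( $hex, 16 ), 36 );   // convert base 16 -> 36
	return str_pad( $b36, 31, '0', STR_PAD_LEFT );   // zero-padded to 31 chars
}

// After editing a revision's <text> in the dump, its <sha1> (and, as discussed
// above, the <timestamp>) needs touching so the importer picks up the change.
$newText = '[http://newdomain.org New link] and the rest of the page...';
echo base36Sha1( $newText ), "\n";
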
Dmitriy



On Mon, Aug 11, 2014 at 5:14 AM, Benjamin Lees  wrote:

> If you simply remove the timestamp from a revision in a dump, the importer
> appears to happily insert it with the current time as the timestamp.  This
> may also cause cancer, summon Cthulhu, etc.
>
> In addition to pywikibot, there's the Replace Text extension[0], which
> ought to be able to handle what you want to do.
>
> [0] https://www.mediawiki.org/wiki/Extension:Replace_Text
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Recommended way to change a lot of absolute links at wiki

2014-08-10 Thread Dmitriy Sintsov
Hi!
I made an XML dump with the --current option, then replaced some of the external
domain links [http://somedomain.org] in it. When I import the dump back,
these pages aren't updated. I think that's because text processors /
editors do not update the sha1 / timestamp fields. Why doesn't
maintenance/importDump.php recalculate and compare the sha1 of the actual page
content? How should I touch the timestamp / sha1 XML fields in the modified
dump? Is there any ready-made solution? Or should I use pywikibot instead (which
will be longer and slower)?
Dmitriy
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Linux distros or Solaris for MediaWiki

2014-07-29 Thread Dmitriy Sintsov
At a previous job, I installed and maintained a wiki farm with multiple
sites, interwiki setup and caching, while another employee installed
MediaWiki from a repository with a single command and told our former boss that
my work was very simple. I hate anything web-related in packages because of
that.


On Tue, Jul 29, 2014 at 6:55 PM, Delirium  wrote:

> On 7/22/14, 3:09 PM, Mark A. Hershberger wrote:
>
>  Max Semenik  writes:
>>
>>  However all packages I know of (Debian flavors and not) split MW
>>> directory and put its parts into different places, trying to follow
>>> the Filesystem Hierarchy Standard. The result is [...] outright
>>> breakages because our code base generally assumes that everything lies
>>> in one place.
>>>
>> These are bugs that should be fixed.  Release management has worked
>> closely with Debian and Fedora packagers to improve their packaging
>> because, despite any on-wiki disclaimers, people will continue to use
>> "apt-get" and "yum" to install MediaWiki.
>>
>
> Having seen many "institutional" installations of MediaWiki (mostly in
> universities), imo the distro version is actually the better option to
> recommend by default for non-sophisticated users. Manually installed
> MediaWiki, unpacked from tarballs, has a bad habit of being installed once
> and *never, ever* upgraded. I just found one here running v1.14! If it had
> been the Ubuntu-package version, it's much more likely someone would have
> upgraded it in the years since then (e.g. the Apache on this box has been
> upgraded, but not the MediaWiki). The distro packaging does sometimes
> introduce some weirdness compared to the official structure, but imo it's
> the less-bad choice.
>
> -Mark
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Linux distros or Solaris for MediaWiki

2014-07-21 Thread Dmitriy Sintsov
Yes, please do not install MediaWiki via apt-get or aptitude. It will be an
old version with suboptimal settings; it is much better to install it
manually.
Speaking of Ubuntu, I think WMF runs a lot of Ubuntu / Debian
systems, so Ubuntu should be the best way of running MediaWiki.
Dmitriy



On Mon, Jul 21, 2014 at 10:55 AM, Pine W  wrote:

> David: https://www.mediawiki.org/wiki/Manual:Running_MediaWiki_on_Ubuntu
>
> That warning could be clearer about what it's referring to.
>
>
> On Sun, Jul 20, 2014 at 2:24 PM, David Gerard  wrote:
>
> > Where does it say it's unsupported? Some older page, talking about the
> > distro version? I routinely use MW from tarball on Ubuntu and it's
> > fine. https://www.mediawiki.org/wiki/Debian/Ubuntu
> >
> > I'd personally recommend you install the prerequisites then use the
> > MediaWiki tarball, but the distro 1.19 in recent versions of Ubuntu
> should
> > be
> > good (if a bit old).
> >
> > I have done MW on Solaris and it was an exercise in pain, unless you
> > use something that resolves dependencies for you. That is to say, a
> > Linux.
> >
> >
> > - d.
> >
> >
> >
> > On 20 July 2014 20:55, Pine W  wrote:
> > > I have experience with Ubuntu but MediaWiki says that Ubuntu is
> > > unsupported. Which Linux distro would people recommend, and which
> distro
> > of
> > > Linux does WMF use for MediaWiki? I am thinking about installing Debian
> > but
> > > am open to any suggestions that have a friendly UX.
> > >
> > > Solaris is an option also.
> > >
> > > Pine
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Dmitriy Sintsov

On 11.06.2014 9:01, Matthew Walker wrote:

This also has knock on impacts elsewhere. BD808 has a patch that uses
PSR-log and Monolog for logging. We're starting to move to a model where we
recognize that we shouldn't write everything and that things in core have
significantly better replacements in the wider PHP community. It doesn't
make sense to keep maintaining the vastly inferior core components when
more and more core and extensions are going to want to rely on the newer
interfaces and features.
The PSR loader is OK. The PSR "standard" of using four spaces instead of tabs 
for indentation is strange. It prevents easily adjusting the 
indentation (good editors can visually render tabs as N spaces 
according to user preference) and has other drawbacks.



The question Tim posed in the commit comes down to:
* Do we bundle third party dependencies, or
* Do we allow composer to do what it was designed to do and manage the
dependencies for us


Composer can use git to fetch dependencies; at least it did so when I 
developed with Symfony2.

https://getcomposer.org/doc/05-repositories.md
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-06 Thread Dmitriy Sintsov

On 06.06.2014 22:17, Brian Wolff wrote:

On 6/6/14, Brad Jorsch (Anomie)  wrote:

On Thu, Jun 5, 2014 at 4:38 PM, Jon Robson  wrote:


So I want to know:
* What are the blockers for doing this?
* Are there any use cases / killer features in LiquidThreads that are
not in Flow that need to be ported over?


Flow doesn't support actual threaded discussions beyond a very limited
depth,[1] meaning that a real threaded discussion is likely to turn into a
morass of comments without clear indication of what is replying to what
unless people actively work around it.[2] Since converted LQT threads are
likely to lack the quoting necessary to work around this misfeature,
they're particularly liable to wind up unreadable if they're at all complex.

Also, bug 57512 comment 31 could use a reply.[3]


  [1]: Although this is a matter of configuration rather than something
hard-coded, I doubt the configuration is going to be changed.
  [2]: I won't go into more detail here about this or about why pings (as
Flow encourages to work around the misfeature) aren't sufficient.
  [3]: https://bugzilla.wikimedia.org/show_bug.cgi?id=57512#c31
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Personally I have yet to see a discussion system that surpasses (or
really even comes close) to standard talk page ":::comment here. "
syntax. Honestly it would make me happy if we just used that in
general.
I also like wikitext more than any visual UI because it's faster to 
type. Also, that kind of comment
:::comment  could probably be incorporated into VisualEditor so that 
users would see something like LQT
while the actual content remains wikitext. VisualEditor could have 
separate scopes or modes for different
NS_* page types.


The exception being pages with large influxes of newbies, like
Project:Support_desk. In those pages LQT really does make a difference
to ensure things are well organized.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating systems & MediaWiki - is this summary right?

2014-03-19 Thread Dmitriy Sintsov

On 19.03.2014 12:10, Peter Kaminski wrote:

Hi Sumana,

I think a key concept you might want to capture is "separation of 
concerns" -- a templating engine allows separation of presentation 
logic from business logic.  Often, the two are handled by different 
people with different skills, in service of separate goals.  So having 
the templating engine specialized for presentation logic is important.


The point isn't so much that the templates look like a document, as 
much as they can be written in a simplified language that's 
specialized for outputting documents.


Also, I don't know if these are useful in this context, but I wanted 
to point to two of the cutting-edge template engines from the PHP 
frameworks world, as representatives of modern PHP template thinking:


Fabien Potencier's Twig
http://twig.sensiolabs.org/

Laravel's Blade
http://laravel.com/docs/templates#blade-templating
http://culttt.com/2013/09/02/using-blade-laravel-4/

Neither of these, though, are oriented to dual JavaScript/PHP support, 
which I think is an interesting path to consider.


And last, two Wikipedia pages that might be relevant:

https://en.wikipedia.org/wiki/Web_template_system
https://en.wikipedia.org/wiki/Comparison_of_web_template_engines

Pete

The MediaWiki Parser itself could be used for skinning as well, with some 
features disabled.
But actually, plain nested associative arrays are good enough as a template 
engine; they are used, for example, in Drupal:

https://drupal.org/node/930760

// New method of generating the render array and returning that
function mymodule_ra_page() {
	$output = array(
		'first_para' => array(
			'#type' => 'markup',
			'#markup' => 'A paragraph about some stuff...',
		),
		'second_para' => array(
			'#items' => array( 'first item', 'second item', 'third item' ),
			'#theme' => 'item_list',
		),
	);
	return $output;
}

Each key of such an associative array can be mapped to a view method, while 
its values are view parameters (properties). It is a quite flexible, fast and 
powerful approach which also allows late manipulation of the output.
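
A minimal sketch of how such an array can be walked (this is not Drupal's
actual renderer, which is far more involved), using the mymodule_ra_page()
example above:

<?php
// '#theme' selects a view callback, other '#...' keys are its parameters,
// and the remaining keys are child elements rendered recursively.
function render( array $element ) {
	$html = theme_element( $element );
	foreach ( $element as $key => $child ) {
		if ( is_array( $child ) && strpos( (string)$key, '#' ) !== 0 ) {
			$html .= render( $child );   // late manipulation could hook in here
		}
	}
	return $html;
}

function theme_element( array $element ) {
	$theme = isset( $element['#theme'] ) ? $element['#theme'] : 'markup';
	switch ( $theme ) {
		case 'item_list':
			$items = array_map( 'htmlspecialchars', $element['#items'] );
			return '<ul><li>' . implode( '</li><li>', $items ) . '</li></ul>';
		case 'markup':
		default:
			return isset( $element['#markup'] )
				? '<p>' . htmlspecialchars( $element['#markup'] ) . '</p>'
				: '';
	}
}

echo render( mymodule_ra_page() );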

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is Foxway a right way?

2014-01-13 Thread Dmitriy Sintsov

On 13.01.2014 18:17, Pavel Astakhov wrote:
> I implemented something similar before Scribunto was stable enough
> and deployed

My idea is different

I know. But the interpreting is probably too slow for heavily loaded wikis.

Do not let the Foxway extension deceive you.
It is a cheap intermediary between PHP functions and extensions 
written in PHP.

It understands and executes the commands as PHP code (no more than this).
This allows you to get the data from one extension, process it in 
pure PHP functions and transmit it to another extension.

Yes, there is an additional expense, but it is small.

When you can (albeit slowly) manage the fast and strong extensions 
and use powerful functions in an arbitrary way, you can do anything 
you like and it will be quick.

You do not pay much, but you gain the ability to easily manage this.

Imagine such pages created by Wikipedia users. They could slow down, bring 
down or exploit the wiki by calling extensions directly. And English 
Wikipedia is one of the top sites in the world.



... Lua VM also was a bit faster than both PHP and Python some year ago.

I know this, but if we rewrite MediaWiki in Lua, it will not work faster.

Maybe a bit faster, but their point is different - they run Lua in a 
controlled, almost isolated environment (users cannot hack the wiki), while 
it's much harder to achieve that with PHP without interpretation. And 
PHP interpretation is probably too slow for the high-load and huge English 
Wikipedia. They are also a non-profit, thus probably cannot buy as many 
servers as, let's say, Facebook or Google. So reducing CPU load is 
important to them.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is Foxway a right way?

2014-01-13 Thread Dmitriy Sintsov




On Mon, 13 Jan 2014 at 16:12:29, Pavel Astakhov (pastak...@yandex.ru) wrote:

> From: Dmitriy Sintsov  rambler.ru>
> Subject: Re: Is Foxway a right way? 
> <http://news.gmane.org/find-root.php?message_id=1389610306.562859.4565.53906%40mail.rambler.ru>
> Newsgroups: gmane.science.linguistics.wikipedia.technical 
> <http://news.gmane.org/gmane.science.linguistics.wikipedia.technical>

> Date: 2014-01-13 10:51:46 GMT (29 minutes ago)
> I implemented something similar before Scribunto was stable enough and 
> deployed

My idea is different

I know. But the interpreting is probably too slow for heavily loaded wikis.


> ... many people say that Lua ... better language in general ...
This is a very controversial statement.
I use the PHP interpreter because MediaWiki is written in PHP and PHP is 
more powerful.

The Lua VM is not stack-based, and its RAM usage can be easily controlled. The 
people behind Scribunto (Tim Starling and Victor Vasiliev) are better programmers 
than me; if they chose Lua, then it's really worth something. Also, Lua is used 
as a scripting language in a huge number of applications (games, scientific 
software) while PHP is not. The Lua VM was also a bit faster than both PHP and 
Python a year or so ago.


> From: Tyler Romeo  gmail.com>
> Subject: Re: Is Foxway a right way? 
> <http://news.gmane.org/find-root.php?message_id=CAE0Q5ovAtE%5f5WDgXr6rYUwkCqt%5fVNawuiMjWP9i%5fLbrXU5CAKA%40mail.gmail.com>
> Newsgroups: gmane.science.linguistics.wikipedia.technical 
> <http://news.gmane.org/gmane.science.linguistics.wikipedia.technical>

> Date: 2014-01-13 09:37:46 GMT (43 minutes ago)
> How does this compare to the PECL runkit extension? Also have you
> benchmarked it against Scribunto? Because Scribunto does kind of the same
> exact thing except just with a different programming language (and
> Scribunto uses a native interpreter rather than one written in PHP).
I do not propose to improve what is already there.
I'm sure there is nothing faster than Lua and it is the best choice.
Why is there a need for Lua? Because building an HTML page from wiki markup 
without Lua takes a long time.
Why? Because a wiki page uses a lot of function calls that work 
together very slowly.
So let it be. Can't it all just be cached? No, pages change frequently 
and the cache is not dimensionless.

I guess they have enough logged-in, non-anonymous users that not everything 
can always be cached. Also, I remember bad things could happen when the source 
of a template changed and lots of pages had to be regenerated; Scribunto probably 
reduced CPU load a lot in such cases. Do not forget they are not a small / medium 
size wiki but a really huge one.

I propose to discuss a new principle for building HTML pages from the 
wiki markup.
1. We need to separate wiki markup from the functions, just as HTML is 
separated from PHP code. In this case, only the result of these functions 
needs to be cached.

2. Let the functions work quickly. I checked, it is possible.
I'm not trying to build a page quickly, I'm trying to do this very 
efficiently. That is my idea.

Efficient use of resources gives a bigger win in speed.


Maybe you could propose your extension for Google Summer of Code, or for another 
similar experimental project. They announce such projects regularly.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is Foxway a right way?

2014-01-13 Thread Dmitriy Sintsov




On Mon, 13 Jan 2014 at 12:59:18, Pavel Astakhov (pastak...@yandex.ru) wrote:

Hi! I would like to discuss an idea.
In MediaWiki it is not very convenient to do computing using the syntax of 
the wiki. We have to use several extensions like Variables, Arrays, 
ParserFunctions and others. If there is a lot of computing, such as 
processing data received from Semantic MediaWiki, the speed of page 
construction becomes unacceptable. To resolve this issue one has to write 
yet another extension (e.g. Semantic Maps displays data from SMW on maps). 
There end up being a lot of these extensions, they don't work well with each 
other, and they are time-consuming to maintain.
I know about the existence of the Scribunto extension, but I think that this 
problem can be solved in another, more natural way. I suggest using PHP 
code in wiki pages, in the same way as it is used in HTML files. In 
this case, extensions can be unified. For example, get the data from 
DynamicPageList, process it if necessary, and transmit it to other 
extensions for display, such as Semantic Result Formats. This will give users 
more freedom for creativity.
In order to execute PHP code safely I decided to try to make a 
controlled environment. I wrote it in pure PHP, it is lightweight, and in 
future it could be included in the core. It can be viewed as the extension 
Foxway. The first version is in the master branch. It gives an idea of what 
is possible in principle, and there's even something like a 
debugger. It does not work very quickly, so I decided to try to fix that 
in a develop branch. There I created two classes, Compiler and Runtime.
The first one processes PHP source code and converts it into a set of 
instructions that the Runtime class can execute very quickly. I took 
part of the code from phpunit tests to check the performance. On my 
computer, pure PHP executes them on average in 0.0025 seconds, and the 
Runtime class in 0.05; that is 20 times slower, but there is also the 
opportunity to get even better results. I do not count the time spent in 
the Compiler class, because it only needs to be used once, when saving a 
wiki page. The data returned from this class can be serialized and 
stored in the database. Also, if all the dynamic data is handled as 
PHP code, wiki markup can be converted into HTML on save and stored 
in the database. Thus, when requesting a wiki page from the server it will 
not be necessary to build it every time (I know about the cache). Take 
the already prepared data (for Runtime and HTML) and enjoy. A cache is 
certainly still necessary, but only for pages with dynamic data, and the 
lifetime of the objects in it can be greatly reduced since performance 
will be higher.
I also have other ideas associated with the use of features that this 
implementation provides. I have already made some steps in this direction and I 
think that all of this is realistic and useful.
I'm not saying that Foxway is ready for use. It shows that this idea can 
work and can work fast enough. It needs to be rewritten to make it 
easier to maintain, and I believe that it can work even faster.
I did not invent anything new. We all use HTML + PHP. Wiki markup 
replaces difficult HTML and provides security, but what can replace the 
scripting language?
I would like to know your opinion: is it really useful or am I wasting 
my time?

Best wishes. Pavel Astakhov (pastakhov).


Hi, Pavel!
I implemented something similar before Scribunto was stable enough and deployed:
http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/QPoll/interpretation/qp_eval.php?revision=103452&view=markup

However, instead of creating an interpreter, I just used PHP's built-in tokenizer 
via token_get_all() to sanitize the code (to disallow some operators and calls), 
then eval()'ed it once the security check passed.
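
The idea, reduced to a minimal sketch (the whitelist below is illustrative,
not the one qp_eval.php actually uses):

<?php
// Tokenize user-supplied code and reject anything outside a whitelist
// before ever passing it to eval().
function isCodeSafe( $code ) {
	$allowedTokens = array(
		T_OPEN_TAG, T_WHITESPACE, T_VARIABLE, T_LNUMBER, T_DNUMBER,
		T_CONSTANT_ENCAPSED_STRING, T_IF, T_ELSE, T_RETURN,
		T_IS_EQUAL, T_IS_IDENTICAL,
	);
	$allowedChars = array( '=', '+', '-', '*', '/', '(', ')', ';', '.', '{', '}' );

	foreach ( token_get_all( "<?php " . $code ) as $token ) {
		if ( is_array( $token ) ) {
			if ( !in_array( $token[0], $allowedTokens, true ) ) {
				return false; // e.g. T_STRING blocks arbitrary function calls
			}
		} elseif ( !in_array( $token, $allowedChars, true ) ) {
			return false;
		}
	}
	return true;
}

$userCode = '$score = ($a + $b) / 2; return $score;';
if ( isCodeSafe( $userCode ) ) {
	// eval( $userCode ); // only after the whitelist check has passed
}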

I probably should convert everything to use Scribunto instead, because many 
people say that Lua has a more secure VM and is a better language in general, and 
its VM is also one of the fastest (only the JVM is faster). And Wikimedia needs 
Scribunto because they had high CPU load while executing some large templates 
on their servers.

However, it's of course a bit sad that PHP runkit is so outdated, abandoned 
and non-mainstream.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] python vs php

2013-07-28 Thread Dmitriy Sintsov


The problem is not the awkward syntax of PHP. The main problem is that PHP is a 
specialized language. For example, it does not have built-in threading; POSIX 
threading is not universally available for PHP. Python and Java have their own 
VM-backed threading everywhere. Java is a suitable platform on which to build an 
OS; PHP is not. Even a web server backend itself is rarely implemented in PHP.
http://www.artima.com/insidejvm/ed2/jvm.html
A "feature-complete" language should be suitable for building an OS.
Perhaps the Zend VM itself could be "beefed up" to put PHP in the same class as Java.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Varnish

2013-07-25 Thread Dmitriy Sintsov



On 26 July 2013 at 5:59:57, Jacobo Nájera (jac...@metahumano.org) wrote:

I am exploring Varnish Cache; do you know of some tools for testing
performance and benchmarking?
Thanks,
Jacobo


https://www.varnish-cache.org/docs/3.0/tutorial/increasing_your_hitrate.html

varnishtop -i txurl
varnishlog -c -m 'RxURL:^/foo/bar'

where /foo/bar is the script path

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Suggested file format of new incremental dumps

2013-07-01 Thread Dmitriy Sintsov

On 01.07.2013 22:56, Tyler Romeo wrote:

Petr is right on par with this one. The purpose of this version 2 for dumps
is to allow protocol-specific incremental updating of the dump, which would
be significantly more difficult in non-binary format.


Why can't the dumps just be split into daily or weekly XML files 
(optionally compressed)? That way, seeking would be performed by 
simply opening the corresponding .MM.DD.xml file.
It is so much simpler than going for binary git-like formats, which 
would take a bit less space but are more prone to bugs and impossible to 
extract and analyze/edit with text/XML processing utilities.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CHanging the name of Vector on 1.21.1

2013-06-09 Thread Dmitriy Sintsov



On 10 June 2013 at 9:15:44, (j00...@mail.com) wrote:

Hello everyone,
I want to modify the default Vector theme that comes with 1.21.1. But before I 
do that I want to rename it. I want to name it nighttime.  I created a folder 
called nighttime and copied all the vector files into it. Then I made a copy of 
Vector.php and called it Nighttime.php. I then modified the appropriate 
contents of Nighttime.php as follows...
---
class SkinNighttime extends SkinTemplate {
 protected static $bodyClasses = array( 'vector-animateLayout' );
 var $skinname = 'nighttime', $stylename = 'nighttime',
 $template = 'NighttimeTemplate', $useHeadElement = true;

...
function setupSkinUserCss( OutputPage $out ) {
 parent::setupSkinUserCss( $out );
 $out->addModuleStyles( 'skins.nighttime' );
...
class NighttimeTemplate extends BaseTemplate {
-
You can see what the site looks like after I renamed everything at 
http://beta.dropshots.com/j00100/media/75373447 It appears as if there is no 
formatting.
I did some searching on Google but everything I found dealt with older 
versions. Does anyone know how to rename Vector and have it working on 1.21?
Thanks

Also, if you change the skin name you may encounter some incompatibilities; for 
example, Extension:VisualEditor has a whitelist of "supported skin names":

class VisualEditorHooks {
	/** List of skins VisualEditor integration supports */
	protected static $supportedSkins = array( 'vector', 'apex', 'monobook' );

That's why I started to modify Vector directly, instead of copying it according 
to Daniel Friesen's tutorial.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHP 5.4 (we wish)

2013-06-09 Thread Dmitriy Sintsov




On 9 June 2013 at 21:51:13, Tyler Romeo (tylerro...@gmail.com) wrote:

To be honest, the upgrade isn't that exciting. The only real worthwhile new
feature is traits. Everything else is just random fixes in syntax. As for
the timetable for this, PHP 5.3 isn't going anywhere anytime soon, so
there's no way we can stop supporting it.
What I'm really excited about is PHP 5.5, which has generators, finally
clauses, dereferencing of container literals, and a new password hashing
API. Unfortunately it'll be literally a decade before we switch to that. :(
*-- *

It will probably happen much faster. PHP 5.4 is a kind of experimental branch; 
for example, there is no stable APC opcode cache for that version.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-14 Thread Dmitriy Sintsov

On 13.05.2013 21:26, Max Semenik wrote:

Hi, I've seen recently a lot of code like this:

$html = Html::openElement( 'div', array( 'class' => 'foo' ) )
 . Html::rawElement( 'p', array(),
 Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
 $somePotentiallyUnsafeText
 )
 )
 . Html::closeElement( 'div' );

IMO, cruft like this makes things harder to read and adds additional
performance overhead. It can be simplified to

$html = '

Placing language operators into the template itself, like

{% for user in users %}

does not separate the HTML data from the code well enough. A template language 
will always be more limited compared to processing via PHP classes.
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Dmitriy Sintsov

On 13.05.2013 21:26, Max Semenik wrote:

Hi, I've seen recently a lot of code like this:

$html = Html::openElement( 'div', array( 'class' => 'foo' ) )
 . Html::rawElement( 'p', array(),
 Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
 $somePotentiallyUnsafeText
 )
 )
 . Html::closeElement( 'div' );

IMO, cruft like this makes things harder to read and adds additional
performance overhead. It can be simplified to

$html = '
http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/QPoll/includes/qp_renderer.php?revision=103452&view=markup

For performance reasons I did not bother with attribute validation and 
inner-text quoting, performing these tasks in the caller instead.

Output generation was a simple recursive traversal of nested PHP arrays.

However, it also has methods to "dynamically" add columns and rows to 
html tables, including colspans and rowspans.


The class worked well enough, allowing content manipulation; however, 
subtree insertion and moving was not so elegant.


When I was forced to abandon MediaWiki development for financial reasons, 
I rethought the class and based the same tag arrays
on XMLDocument and XMLWriter:
on XMLDocument and XMLWriter:

https://bitbucket.org/sdvpartnership/questpc-framework/src/a5482dd1035b6393f52049cda98c9539b6f77b6c/includes/Xml/XmlTree.php?at=master
https://bitbucket.org/sdvpartnership/questpc-framework/src/a5482dd1035b6393f52049cda98c9539b6f77b6c/includes/Xml/Writer/GenericXmlWriter.php?at=master

They are slower than "echo $var;" for sure, but allow much more powerful 
tag manipulation and templating in a jQuery-like way. And the output is 
always valid, and of course XMLDocument automatically takes care of 
inner-text escaping and so on.

Here's an example of the newer version of the tag array definition:
array( '@tag' => 'div', 'class' => 'foo', array( '@tag' => 'p', array( 
'@tag' => 'span', 'id' => $id, $text ) ) );


String keys starting with '@' are special keys: '@tag' is the tag name, 
'@len' (optional) is the count of child nodes.

Other string keys are attribute key / value pairs.
Integer keys are nested nodes - either nested tags or text nodes.
Also, there are three special tag names:
'@tag' => '#text'
'@tag' => '#comment'
'@tag' => '#cdata'
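
A minimal string-based sketch of rendering that format (the real XmlTree /
GenericXmlWriter classes build a DOM tree instead, and also handle the special
tag names above):

<?php
function renderTag( $node ) {
	if ( !is_array( $node ) ) {
		return htmlspecialchars( $node );        // plain text node
	}
	$tag = isset( $node['@tag'] ) ? $node['@tag'] : 'div';
	$attrs = '';
	$inner = '';
	foreach ( $node as $key => $value ) {
		if ( is_int( $key ) ) {
			$inner .= renderTag( $value );       // child tag or text node
		} elseif ( strpos( $key, '@' ) !== 0 ) {
			$attrs .= ' ' . $key . '="' . htmlspecialchars( $value ) . '"';
		}
	}
	return "<{$tag}{$attrs}>{$inner}</{$tag}>";
}

$id = 'note-1';
$text = 'Some <unsafe> text';
echo renderTag(
	array( '@tag' => 'div', 'class' => 'foo',
		array( '@tag' => 'p', array( '@tag' => 'span', 'id' => $id, $text ) ) )
);
// <div class="foo"><p><span id="note-1">Some &lt;unsafe&gt; text</span></p></div>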

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-12 Thread Dmitriy Sintsov

On 12.05.2013 1:18, Tyler Romeo wrote:

FWIW, here is what I have so far: http://pastebin.com/hUQ92DfB

I haven't tested it yet because my PHP environment is not behaving, and the
only class I haven't implemented fully is SplHeap.


Perhaps you should send the link to the HipHop developers (or to their list, 
if there is one).


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-11 Thread Dmitriy Sintsov

On 10.05.2013 23:51, Antoine Musso wrote:

Le 10/05/13 03:10, Tim Starling a écrit :

There's a few other SPL features that we don't use at the moment and
we should avoid introducing if possible due to lack of support in HipHop:



I wish we actually used Spl :-]  They are nice classes providing all
kind of useful features: http://php.net/manual/en/book.spl.php
Yes, it is strange that Facebook does not need the SPL, especially because 
since PHP 5.3 the SPL is not optional anymore.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-11 Thread Dmitriy Sintsov

On 10.05.2013 17:58, Chad wrote:

On Fri, May 10, 2013 at 8:05 AM, Jeroen De Dauw  wrote:

Hey,

I can see why SPL might require extra work in HipHop to support. At the
same time I find it somewhat unfortunate this means one cannot use the
Standard PHP Library.


Yeah, but I think it's a workable issue. And the HH team seems very
amenable to feature requests (and patches!), so implementing parts of
the SPL are certainly possible over the long term.

As Tim points out, for ArrayObject and filter_var() it's non trivial to
implement (even Zend's implementation of the former is 2000+ LOC).

System and development software, such as OSes, compilers, language 
libraries, different kinds of VMs and so on, are really huge, and 2000+ 
lines of code is actually a *small* amount.
It's not a framework or a wiki; it's development software written in a 
low-level language.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] category intersection conversations

2013-05-09 Thread Dmitriy Sintsov

On 09.05.2013 20:28, Brad Jorsch wrote:

On Wed, May 8, 2013 at 10:47 PM, James Forrester
 wrote:

* Pages are implicitly in the parent categories of their explicit categories
* -> Pages in  are in  (its first parent) and  (its first parent's parent) and  (its second
parent) and  (its second parent's parent) and …
* -> Yes, this poses issues given the sometimes cyclic nature of
categories' hierarchies, but this is relatively trivial to code around

Category cycles are the least of it. The fact that the existing
category hierarchy isn't based on any sensible-for-inference ontology
is a bigger problem.

Let's consider what would happen to one of my favorite examples on enwiki:
* The article for Romania is in . Ok.
* And that category is in , so Romania is in that too.
Which is a little strange, but not too bad.
* And  is in  and .
Huh? Romania doesn't belong in either of those, despite that being
equivalent to your example where pages in  also end up in  via .


There is probably nothing contradictory in your Black Sea category 
relation example, because "Seas of " implies that  has 
*multiple* seas, while Romania has only *one* sea border (no offence; 
there are lots of small countries, and a large country does not always mean 
a happy life).  is a little bit more weird, but 
could be explained by the long and complex area of the Crimean peninsula. So 
the categories actually are not so wrong.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Anyone using Apache 2.4 to run MediaWiki?

2013-04-26 Thread Dmitriy Sintsov

On 26.04.2013 23:42, Max Semenik wrote:

On 26.04.2013, 19:15 Dmitriy wrote:


However, I remember wikimedia was running something old like htpd 1.3

maxsem@mw1020:~$ dpkg -l apache*
[snip]
ii  apache2.2-common  2.2.22-1ubuntu1.3 Apache HTTP 
Server common files


Surely it was not so long ago; there even was a patch for the old httpd. 
Maybe they consider new hardware powerful enough that the difference 
has become negligible.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Anyone using Apache 2.4 to run MediaWiki?

2013-04-26 Thread Dmitriy Sintsov

On 27.04.2013 5:23, Mark A. Hershberger wrote:

On 04/26/2013 02:29 PM, Maarten Dammers wrote:

I'm running MediaWiki on apache-2.4.4 on NetBSD.

Ah! Thank you.  I'm gonna guess, then, that this is a problem particular
to XAMPP and/or Windows.

Are you using rewrite rules with your installation?  Are they different
than what we have documented for Apache?


Rewrite rules are usually compatible; they support the old syntax but now also 
have more powerful conditions (which may be used but are not required).
My site was running fine right after the two changes I sent in the previous 
mail. However, I never use pre-configured environments on Windows; I install 
and configure everything manually.
On Linux I even used to compile everything from source, but in recent 
years I switched to packages.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Anyone using Apache 2.4 to run MediaWiki?

2013-04-26 Thread Dmitriy Sintsov

On 26.04.2013 5:02, Mark A. Hershberger wrote:

On 04/25/2013 08:40 PM, Brian Wolff wrote:

I would say we shouldn't document unless we have a vague idea why.

Agreed.  But I was basing this on reports from people who were using
XAMPP, not my lame attempts.  Still, I haven't tried to reproduce the
XAMPP issue, either.

For now, I suppose this is just something to make a mental note of until
the problem becomes more real.


Instead of
Order allow,deny
Allow from all

httpd 2.4 uses
Require all granted

RewriteLog /var/www/host/logs/rewrite.log
RewriteLogLevel 3
are also no longer supported;

LogLevel warn rewrite:trace8
should be used instead.

I briefly ran through the documentation and it seems that rewrite rule 
and condition processing has become a bit nginx-like (it seems they were aware 
that nginx has increased its presence).
However, I remember Wikimedia was running something old like httpd 1.3, 
and some local admins also recommended it as the fastest approach. I 
personally run nginx/php-fpm, even though it's unsupported (no IE6 
workaround). However, I do not care about IE6.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tutorial on Semantic MediaWiki in Russian

2013-04-21 Thread Dmitriy Sintsov

On 21.04.2013 13:02, Yury Katkov wrote:

On Sun, Apr 21, 2013 at 2:53 AM, Mathieu Stumpf <
psychosl...@culture-libre.org> wrote:


Le samedi 20 avril 2013 à 05:36 +0400, Yury Katkov a écrit :

Any ideas where to publish the English version? I was quite surprised

when

I found out that there is no collective blog in english internet as our
Habrahabr.

I was thinking about IBM Developerworks but maybe they need something
closer to the programming. Can I try to propose the article to
http://blog.wikimedia.org/c/technology/ ?

Would the meta wiki be innapropriate?

Too small an audience. The amount of questions to which SMW is the answer and
the amount of holy wars in mailing lists about installing Semantic MediaWiki
on Wikimedia websites is growing and ensures that everyone in MW worlds
will sooner or later discover this extension for himself. My aim is to
attract the attention to SMW from outside and show that with SMW MediaWiki
can be much more than just wiki engine with ugly syntax that is used in
Wikipedia, but a framework for interesting collaborative systems with
consistent and dynamic content.


Wikidata will also get queries sooner or later, so it seems unlikely 
that SMW will be widely deployed at the most important Wikimedia sites.
I remember that some time ago they said that the SMW code is too large to 
review. Thus there was a project to make an SMW Light; I am unsure it was 
ever completed.
Do not forget they run very high-load sites like wpen or wpde, which 
probably have more traffic than Russian top sites like mail.ru or 
yandex.ru, thus they want highly scalable extensions.
It's interesting whether they will use an external triplestore backend for 
Wikidata.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Removing the Hooks class

2013-04-03 Thread Dmitriy Sintsov



On 4 April 2013 at 10:11:44, Daniel Friesen (dan...@nadir-seen-fire.com) wrote:

On Wed, 03 Apr 2013 22:23:41 -0700, Dmitriy Sintsov     
wrote:
>
>> On 4 April 2013 at 9:16:49, Jeroen De Dauw (jeroended...@gmail.com) wrote:
>>    Hey,
>> > I see no reason to get rid of the hooks class.
>> Given you also do not understand why I think the comment is funny, I
>> recommend you read up on why writing static code is harmful. And on how
>> global state can hide in static "classes".
>> > We use static classes other places in core.
>> https://yourlogicalfallacyis.com/bandwagon
>> In almost all such cases I have seen in core this kind of use of static    
>> is
>> bad.
>> > And there's no reason to revert to hideous functions like we had    
>> before.
>> No one is suggesting that.
>> Cheers
>> --
>>
> Why the hooks should not be static? Multi-site (farm) built-in support    
> in core without $wgConf? Common page table across multiple sites?
> Dmitriy
How do you envision non-static hooks working and supporting multiple wikis?


If hooks become non-static, should they become members of 
RequestContext, maybe?
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Removing the Hooks class

2013-04-03 Thread Dmitriy Sintsov



On 4 April 2013 at 9:16:49, Jeroen De Dauw (jeroended...@gmail.com) wrote:

Hey,
> I see no reason to get rid of the hooks class.
Given you also do not understand why I think the comment is funny, I
recommend you read up on why writing static code is harmful. And on how
global state can hide in static "classes".
> We use static classes other places in core.
https://yourlogicalfallacyis.com/bandwagon
In almost all such cases I have seen in core this kind of use of static is
bad.
> And there's no reason to revert to hideous functions like we had before.
No one is suggesting that.
Cheers
--


Why should the hooks not be static? Built-in multi-site (farm) support in core 
without $wgConf? A common page table across multiple sites?
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Cannot run the maintenance script

2013-03-26 Thread Dmitriy Sintsov




On 26 March 2013 at 14:16:09, Platonides (platoni...@gmail.com) wrote:

On 25/03/13 23:19, Rahul Maliakkal wrote:
> I installed SMW extension in my local wiki yesterday and now when i visit a
> page in my local wiki i get this message "A database query syntax error has
> occurred. This may indicate a bug in the software. The last attempted
> database query was:
> 
> (SQL query hidden)
> 
> from within function "ShortUrlUtils::encodeTitle". Database returned

> error "1146:
> Table 'my_wiki.w1_shorturls' doesn't exist (127.0.0.1)""
> 
> Along with the page being displayed untidily.
> 
> So i tried to fix the problem ,as suggested by people i tried to run "php

> update.php"
> Then i got the following error message
> 
> "A copy of your installation's LocalSettings.php

> must exist and be readable in the source directory.
> Use --conf to specify it."
> 

It would be nice if maintenance scripts displayed the requested path to 
LocalSettings.php in case of such an error.


> I have my LocalSettings.php in the same place where my default index.php is
> located,earlier i had made some logo changes to my wiki and they were
> succesfully reflected in my wiki,so the localhost has access to the
> LocalSettings.php
> 
> I am working on Ubuntu and have mediawiki 1.20 installed
> 
> Please Help!!Its Urgent
> 
> Thanks In Advance

That's very odd. Perhaps you are running the script as a different user
which doesn't have read access? Is your file printed if, from the folder
where you run php update.php, you run    cat ../LocalSettings.php ?


Also, one may have the MW_INSTALL_PATH environment variable set, pointing to a 
different directory. I had such weirdness at one hosting provider sharing two 
different versions of MediaWiki in the wrong way.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC "How/whether MediaWiki could use ZendOptimizuerPlus" -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-23 Thread Dmitriy Sintsov

On 22.03.2013 18:10, Thomas Gries wrote:

just one message, just arrived:

http://php.net/archive/2013.php#id2013-03-21-1

PHP 5.5.0 beta1 available 21-Mar-2013

The PHP development team announces the release of the first beta of PHP
5.5.0. This release is the first to include the Zend OPCache. Please
help our efforts to provide a stable PHP version and test this version
carefully against several different applications, with Zend OPCache
enabled and report any bug in the bug tracking system.
THIS IS A DEVELOPMENT PREVIEW - DO NOT USE IT IN PRODUCTION!



Ubuntu 12.04 LTS still has 5.3, which is officially outdated according to 
php.net and has neither traits nor Closure::bindTo(), which makes 
closures useless as mixin properties via __invoke(). I already had an ugly 
issue when a local PHP 5.4 closure had access to the instance $this while the 
server running 5.3 produced an error. It is quite strange that closures did 
not receive the instance context in 5.3. And 5.2-style __get()-based mixins 
are slower. And because 12.04 LTS is so conservative, I still cannot 
rely on 5.4 features in my projects (surely I can add a PPA, but that is not 
always desirable).
Also, 5.4 had some Zend performance optimizations. But it seems they want 
to move to 5.5 even faster than the 5.3 to 5.4 transition.
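
A small sketch of the difference (the class and property names are invented
for the example):

<?php
class Host {
	public $name = 'host';

	public function makeGreeter() {
		// PHP 5.4+: closures created in a method are bound to $this.
		// PHP 5.3: using $this inside the closure produces the error
		// mentioned above when the closure is called.
		return function () {
			return "Hello from {$this->name}";
		};
	}
}

$host = new Host();
$greeter = $host->makeGreeter();
echo $greeter(), "\n";

// Closure::bindTo() (5.4+) is what makes closures usable as "mixin"
// properties invoked via __invoke()-style dispatch:
$mixin = function () {
	return strtoupper( $this->name );
};
$bound = $mixin->bindTo( $host, 'Host' );
echo $bound(), "\n"; // HOST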

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] FlaggedRevs at www.mediawiki.org

2013-03-18 Thread Dmitriy Sintsov


Hi!
I cannot get my own Extension:GoogleMapsFn manual page changes reviewed at 
mediawiki.org.
The changes have been pending for a few days already.
Can someone review them?
http://www.mediawiki.org/w/index.php?title=Extension:GoogleMapsFn&oldid=648005&diff=cur
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Virtual machines for testing IE

2013-03-13 Thread Dmitriy Sintsov




On 13 March 2013 at 22:42:57, Matthew Flaschen (mflasc...@wikimedia.org) wrote:

On 03/13/2013 11:06 AM, Željko Filipin wrote:
> On Wed, Mar 13, 2013 at 1:31 AM, Matthew Flaschen
> wrote:
> 
>> Unlike before, they now even offer VirtualBox and VMWare images
> 
> 
> ievms[1] makes it even simpler to download all images.

Cool, thank you.
Matt

Don't these Windows systems bundled into VM images expire every month or 
so, so that you have to re-download the huge images again and again? I used to 
download them and run IE tests for custom MW scripts / skins about 1.5 years 
ago; it was really tiresome and slow.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Purpose of #wikimedia-dev

2013-02-27 Thread Dmitriy Sintsov

On 28.02.2013 8:40, Tyler Romeo wrote:

On Wed, Feb 27, 2013 at 5:53 PM, Platonides  wrote:


If you are coding/reviewing MW code, I recommend you be available on
the IRC channel. That way we could instantly ask you "wtf are you
committing here?"



It's not for lack of wanting to go on IRC. It's technically blocked at my
job so I can't go on.
*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Many people (including myself) consider IRC a major waste of time. It's 
too easy to spend time chatting in IRC channels instead of actually 
learning / doing the job, etc. My professional growth at a younger age was 
partially hampered by chat addiction. Now I try to stay away from IRC 
when possible. It's a bit off-topic, but it might be a good warning for 
younger people.
IRC has another major disadvantage - when you live in a different 
timezone, your message will easily be missed by people in a very 
different timezone.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Better non-MySQL db support

2013-02-26 Thread Dmitriy Sintsov




On 26 February 2013 at 14:27:06, Nikola Smolenski (smole...@eunet.rs) wrote:

On 26/02/13 04:18, Matthew Flaschen wrote:
> Sure, for starters. :) Bear in mind, if we want to keep support for all
> these dbs, every change to the database schema has to (at some point)
> result in a change to separate SQL files for each DB (MySQL and SQLite
> use the same ones).    For instance, there is a separate active
> oracle/tables.sql.
I am wondering if it would make sense to give up on SQL, make universal 
table creation functions in PHP, the same way there are for other 
queries, and use that. Has anyone tried this before, is there other 
software that works like this?


http://stackoverflow.com/questions/108699/good-php-orm-library
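
For illustration, the kind of engine-agnostic table definition the quoted
question describes could be as small as this sketch (the array format and
type map are invented for the example):

<?php
function createTableSql( $name, array $columns, $engine = 'mysql' ) {
	$typeMap = array(
		'mysql'  => array(
			'id'   => 'INT UNSIGNED AUTO_INCREMENT PRIMARY KEY',
			'text' => 'TEXT',
			'int'  => 'INT',
		),
		'sqlite' => array(
			'id'   => 'INTEGER PRIMARY KEY AUTOINCREMENT',
			'text' => 'TEXT',
			'int'  => 'INTEGER',
		),
	);
	$defs = array();
	foreach ( $columns as $column => $type ) {
		$defs[] = $column . ' ' . $typeMap[$engine][$type];
	}
	return "CREATE TABLE $name (\n  " . implode( ",\n  ", $defs ) . "\n);";
}

echo createTableSql( 'example_links',
	array( 'el_id' => 'id', 'el_title' => 'text', 'el_namespace' => 'int' ),
	'sqlite'
), "\n";

An ORM or schema builder (see the link above) is the full-featured version of
the same idea.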

By the way, MySQL 5.6 is out and it supports full-text search indexing for 
InnoDB tables. They also promise better performance on large hardware.
I still cannot find a 12.04 PPA for 5.6.10, though, and do not want to go to the 
trouble of installing from source (although I installed MySQL from source some 
years ago). Why go for other database engines besides MySQL / MariaDB?
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-14 Thread Dmitriy Sintsov

On 14.02.2013 21:14, vita...@yourcmc.ru wrote:
I guess this would not directly solve any of the problems listed, but 
would

it be helpful to bring back to life
https://www.mediawiki.org/wiki/Enterprise_hub ? It was started by 
somebody
an year or two ago but seems to have been abandoned at a draft stage. 
I am

thinking if everybody adds some information about extensions/pages they
find particularly useful in the enterprise world, it will help future 
users
but also help current enterprise wikis exchange experience. Does this 
seem

worthwhile?


IMHO there are so many useful extensions that I think it can be a 
little much for that page.


For example if I edited that article I would put almost all extensions 
from our distribution there... so I'm documenting them on 
http://wiki.4intra.net/Category:Mediawiki4Intranet_extensions :-)


The question is how stable and secure they are. MediaWiki is 
high-quality software that should not be impaired by low-quality 
extensions. Also, when an extension is unmaintained, its stability and 
security become questionable as well.
Also, I remember that for major MW extensions scalability is a big problem: 
efficient SQL queries, using the APC / Memcached cache, not invalidating the 
parser cache too often. For example, my own Extension:QPoll does not 
scale well and requires some major rewrites. That applies to many 
other extensions as well.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Re: How to speed up the review in gerrit?

2013-02-13 Thread Dmitriy Sintsov

On 14.02.2013 9:14, Brian Wolff wrote:

Also, the locations of extensions should be stable. If the repos start
moving around as extensions move up and down the ladder it  will cause
confusion.


The disadvantages are really small compared to the huge advantages for both 
reviewers and extension authors. Another advantage is that the importance 
and stability of an extension will be evaluated by an experienced WMF 
developer rather than by self-placing the extension description page into 
[[Category:Stable extensions]].

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Re: How to speed up the review in gerrit?

2013-02-13 Thread Dmitriy Sintsov

On 14.02.2013 6:33, Marco Fleckinger wrote:



On 02/14/2013 03:28 AM, Chad wrote:

On Wed, Feb 13, 2013 at 9:22 PM, Marco Fleckinger
  wrote:

Having a "can review all extensions" group is easy, but allowing for
exemptions will be a pain to manage the ACLs for. For every extension
that opts out of being reviewed by this group, we'd have to adjust 
its

ACL to block the inherited permissions.



How about instead of "can review all extensions", we make it easier to
request review rights on non-WMF extensions?


Good idea, but in general there could just be 3+ different classes of
extensions? The class can be calculated by its importance, e.g. 
installed on

WMF-sites, number of other wikis using it, etc.



Having classes of extensions is difficult to maintain from an ACL
standpoint. Permissions in Gerrit are directly inherited (and there's no
multiple inheritance), so things in mediawiki/extensions/* all have the
same permissions. So having rules that apply to only some of those
repositories requires editing ACLs for each repository in each "group."


Sorry, I think you misunderstood me. I meant classes like:

* "Used by WMF"
* "non-WMF very important"
* "non-WMF important"
* "non-WMF less important"
* "non-WMF unimportant"

No multiple inheritance will be needed for this model.

That is a really great idea. If there were 
mediawiki/extensions/(wmf|non-wmf-unimportant|non-wmf-important)/* 
subdirectories introduced, such classification should encourage 
extension developers to improve their extensions so they can move up 
from "non-WMF unimportant" to "non-WMF important" and maybe higher.
I have not developed for MW in the last year, however this idea is great and I give 
my personal +100 for having an extension importance hierarchy in the 
repository. It should be easier for reviewers as well.
Maybe even corporate donation campaigns for "non-WMF very important" 
can be introduced at some later stage, which is even better for making 
more extensions useful and stable.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Dmitriy Sintsov

On 11.02.2013 20:19, Mariya Nedelcheva Miteva wrote:

Hi all,

For what I've undrstood from conversations so far the Web or HTML API is
not enough for extension developement and the variability of exposed
internal classes is often responsible for the incompatibility of extensions
with certain MediaWiki versions. Correct me if I am wrong.


A stable internal API means a MediaWiki LTS version. The other alternative is 
to update extensions to keep them compatible with internal API changes / 
new features.

There are quite a lot of outdated and abandoned extensions at mediawiki.org.
Sometimes they are abandoned because the author lost interest in 
developing for MediaWiki.
But there are other cases where customer funding was small / short, 
very limited, and thus the extension's maintainer does not have enough time 
to support it.
WMF extensions are updated, that's true, however there are many more 
extensions used at 3rd party wikis. Not of such top quality as major WMF 
extensions, though.
If there were corporate donations for non-WMF extensions, things could 
be improved.
But usually 3rd party site owners want quality for free while not being 
willing to provide funding for long-term support. That's life.

Developing free software is great, until living difficulties come.
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to support MediaWiki

2013-01-14 Thread Dmitriy Sintsov



On 14 January 2013 at 19:59:01, Mark A. Hershberger (m...@everybody.org) wrote:

On 01/14/2013 01:34 AM, Dmitriy Sintsov wrote:
> My own (already past) experience of developing MediaWiki extensions
> shows that it is much easier to keep the compatibility layer in
> extension itself rather than backporting large changes into older
> versions of core. 


This does point to another possible way to backport the changes -- some
sort of compatibility extension.    That allows the core to remain stable,
but allows people to use some newer extensions if they need the features.


I am not sure Wikimedia would like to put their effort into this. In fact, if I 
were at Wikipedia, why would I spend my time helping lazy competitors? Wikimedia 
is too major, and MediaWiki is primarily their tool which they generously offer 
to other people. It should be much simpler to argue for extension-level LTS.


> However there are major milestone versions, such as 1.17 with
> ResourceLoader and 1.21 with ContentHandler, and it's not easy to
> backport their functionality back [...]. So, the
> extensions probably has to be compatible to such "major milestone
> versions".

Agreed.    1.19 is (in retrospect) a good choice for an LTS version since
ResourceLoader has stabilized somewhat.


But it misses ContentHandler. And there were a lot of 1.17 installations as 
well. But yes, 1.17 does not support some things, like specifying that CSS should 
be loaded at the top. Maybe more.


> But many of the sites the wikistats list have are quite large and thus
> earn enough of money to their owners. I wonder why do not they update
> their sites.

"Don't fix what isn't broke" is probably part of the thinking.


My own experience shows that most site owners are quite greedy people. Maybe 
I am just bad at business, though.


> One guess I know that some admins consider newer versions
> are slower than old ones, do not know how much that is true.

Mariya Miteva [[mw:User:Mitevam]] is talking to non-WMF users of
MediaWiki and has found that some people are put off by the difficulty
of upgrading MediaWiki.

For example, many people have put in some effort to modify a skin for
their needs only to find gratuitous changes in the core made with (what
appears to be) little thought for backwards compatibility.


Skins are a horrible part of MediaWiki. They are deliberately made quite 
low-level to run fast (they probably profile performance). However, after an upgrade 
custom skinning can break or miss newly added parts. Daniel Friesen attempted 
to make skins template-based, but I do not think it was accepted into the core.


So, if a site is working for me as it is, and the only reasons for me to
upgrade are some obscure security issues that I may never face, then I
will probably find a way to work around the issues instead of upgrading.


Maybe, somewhat.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Adapting Visual Editor for 1.19 (and LTS)

2013-01-13 Thread Dmitriy Sintsov


But some changes were hard to adapt, for example at some point the definition 
of SpecialPage::userCanExecute() changed from
public function userCanExecute( $user )

to

public function userCanExecute( User $user )
with a typecheck. Then extension special pages that inherited from SpecialPage 
and overrode ::userCanExecute() threw a PHP error about an incompatible 
declaration.

It would be better to check $user instanceof User in SpecialPage code itself, 
but who cares.
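
A minimal sketch of that idea (hypothetical code, not what core actually does):

	public function userCanExecute( $user ) {
		// Runtime check instead of a signature type hint, so overriding
		// methods declared without the hint stay declaration-compatible
		if ( !( $user instanceof User ) ) {
			throw new MWException( 'userCanExecute() expects a User object' );
		}
		// ... existing permission check continues here ...
	}
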
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Adapting Visual Editor for 1.19

2013-01-13 Thread Dmitriy Sintsov



On 9 January 2013 at 4:09:21, James Forrester (jforres...@wikimedia.org) wrote:

On 4 January 2013 09:02, Mark A. Hershberger  wrote:
> Would it be possible to adapt the Visual Editor to run under 1.19?
Possible? Yes. However, it would involve back-porting a number of
changes that have been made to core since then (and will no-doubt
continue to be added as we discover new ways in which MW assumes that
"editing" and "EditPage.php" are the same thing). For some quick
examples, and https://gerrit.wikimedia.org/r/#/c/36237/ and
https://gerrit.wikimedia.org/r/#/c/37191/ (which are both in 1.21
wmf5) - though I'm sure there are more out there. I worry that you'd
end up needing to create 1.19.4-ve_support branches that would be no
easier than pushing to a 1.21 bleeding-edge branch, sadly.

My own (already past) experience of developing MediaWiki extensions shows that it is much 
easier to keep the compatibility layer in the extension itself rather than backporting large 
changes into older versions of core. For example, one of the gerrit patches mentioned above 
can be "probed" via method_exists( $this->mTitle, 'getEditNotices' ), then 
acting according to the result - pretty fast and efficient.
So, the extensions should be LTS first, not core. The very core of MediaWiki 
(at least when deployed at small sites) is still revision / article / title, 
which is possible to use in very similar ways.
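
A minimal sketch of that probing approach (hypothetical extension code, just illustrating the idea):

	// Probe the running core for the newer capability instead of requiring a backport
	if ( method_exists( $this->mTitle, 'getEditNotices' ) ) {
		// 1.21+ core: the new method is available
		$notices = $this->mTitle->getEditNotices();
	} else {
		// Older core: fall back to the extension's previous behaviour
		$notices = array();
	}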

However there are major milestone versions, such as 1.17 with ResourceLoader and 1.21 
with ContentHandler, and it's not easy to backport their functionality back (although I 
managed to load dynamically some jQuery scripts in 1.15 with Lab.js loader). So, the 
extensions probably has to be compatible to such "major milestone versions".

But many of the sites the wikistats list has are quite large and thus earn their 
owners enough money. I wonder why they do not update their sites. 
One guess I have is that some admins consider newer versions slower than old 
ones; I do not know how true that is.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] getting Request object from EditPage

2012-12-20 Thread Dmitriy Sintsov

On 18.12.2012 16:51, Yury Katkov wrote:

Of course I mean WebRequest class.
-
Yury Katkov, WikiVote



On Tue, Dec 18, 2012 at 4:48 PM, Yury Katkov  wrote:

Hi guys!

I'm writing the EditPage::showEditForm:fields and I want to get a
Request object. The use of wgRequest considered to be deprecated, so
how is it possible to get request object in my hook function?

  static public function showBacklinks($editpage, &$output){
  return true;
  }

Cheers,
Yury Katkov, WikiVote

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


$output->getContext()
or
$editpage->getArticle()->getContext()
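
To get the actual WebRequest from there, one can then call getRequest() on the 
returned context, e.g. (a small sketch):

	$request = $editpage->getArticle()->getContext()->getRequest();
	$value = $request->getVal( 'someParam' ); // 'someParam' is only an illustration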

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Creating custom skin based on Vector in MediaWiki 1.20

2012-11-17 Thread Dmitriy Sintsov




On Sat Nov 17 2012 22:09:38 GMT+0400 (Caucasus Standard Time), Krinkle 
(krinklem...@gmail.com) wrote:

On Nov 17, 2012, at 6:08 PM, Dmitriy Sintsov  wrote:

> Right. I can probably make a local stylesheet with references to the Google CDN,
> however I am not sure it will not violate IE security or something.
> So I did:
>       $out->addLink( array( "rel" => "stylesheet", "href" =>
> "http://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext" ) );
>       $out->addLink( array( "rel" => "stylesheet", "href" =>
> "http://fonts.googleapis.com/css?family=Open+Sans:400,600&subset=latin,cyrillic,latin-ext,cyrillic-ext,greek,greek-ext" ) );
> 


No IE security issues, unless your website is served from HTTPS, in which case 
Chrome, IE and possibly other browsers will block those requests (which is 
good).

The Google Font APIs support HTTPS natively:
* 
http://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext
* 
https://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext

So in that case I'd recommend loading it with a protocol-relative URL so that it 
always works (only do this for URLs that you know support both, such as Google 
Font APIs).

"href" => 
"//fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext"

More about protocol-relative: 
http://paulirish.com/2010/the-protocol-relative-url/

-- Krinkle


I know that MW has used (not widely-known) protocol-relative URLs since 1.18. 
You're right that the link has to be corrected; it will point to two different 
stylesheets, depending on the site protocol used:
http://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext
https://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Creating custom skin based on Vector in MediaWiki 1.20

2012-11-17 Thread Dmitriy Sintsov




On Sat Nov 17 2012 02:32:27 GMT+0400 (Caucasus Standard Time), Roan 
Kattouw (roan.katt...@gmail.com) wrote:

On Tue, Nov 13, 2012 at 12:33 AM, Dmitriy Sintsov  wrote:
> However, 'skin.vector' module includes both styles and scripts. And
> setupSkinUserCss() adds styles only. So 'dependencies' did not help, vector
> styles are loaded later, anyway. What can I do with that?
>
Unfortunately, addModuleStyles() and dependencies don't work well
together. You shouldn't use dependencies for CSS that is essential for
rendering the page.


I figured that out. So I re-implemented screen.css from scratch instead of 
overriding Vector's.
At least I extend VectorTemplate, so I re-implemented only 
VectorTemplate::execute(), while renderNavigation() and other helper 
methods were inherited.

But I cannot inherit from SkinVector, because SkinVector does:
$out->addModuleScripts( 'skins.vector' );
...
$out->addModuleStyles( 'skins.vector' );

Why can't SkinVector do:
$out->addModuleScripts( "skins.{$this->skinname}" );
...
$out->addModuleStyles( "skins.{$this->stylename}" );

so I do not have to copy / paste initPage() and setupSkinUserCss(), only to 
define a few short properties?
   var $skinname = 'artmuseum', $stylename = 'artmuseum',
$template = 'ArtmuseumTemplate', $useHeadElement = true;


> Also, I need to load remote google webfonts. Does ResourceLoader support
> this or I have to use old-fashionated methods of OutputPage() ?
Unfortunately RL doesn't support this directly. Although to load a
webfont, all you need is a stylesheet with an @font-face in it, right?


Right. I can probably make a local stylesheet with references to the Google CDN; 
however, I am not sure it will not violate IE security or something.
So I did:
	$out->addLink( array( "rel" => "stylesheet", "href" => "http://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext" ) );
	$out->addLink( array( "rel" => "stylesheet", "href" => "http://fonts.googleapis.com/css?family=Open+Sans:400,600&subset=latin,cyrillic,latin-ext,cyrillic-ext,greek,greek-ext" ) );

in initPage().

I want to use the Google CDN because it should be faster than my local hosting. 
Also, maybe there will be some improvements in the fonts, I do not know.
What if they change their stylesheet's
url(http://themes.googleusercontent.com/static/fonts/ptsansnarrow/v3/UyYrYy3ltEffJV9QueSi4V77J2WsOmgW1CJPQ9ZetJo.woff) format('woff');
at some time?
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Creating custom skin based on Vector in MediaWiki 1.20

2012-11-13 Thread Dmitriy Sintsov


Hi!
About a year ago, I created a Vector-based skin with a redesigned layout (very 
different positions of the sidebar and action links) in MW 1.17 from scratch, by 
copying the whole Vector subtree and modifying it, then adding my skin resources 
into Resources.php. It worked, but it was a lot of work, including core patching.

Now I work with 1.20, and there's an article written by Daniel Friesen on how to 
create Vector-derived skins without modifying Resources.php, using Vector 
classes and styles as a base. So I do not have to modify the core or copy 
the whole skin subtree:
http://blog.redwerks.org/2012/02/28/mediawiki-subskin-tutorial/

I mostly followed the instructions in the guide. However, my skin also overrides the 
skin's execute() method, because quite a lot of the layout is changed, not just 
CSS. execute() works fine.

I need to override a lot of Vector's css, located in 'skins.vector' resource 
module.

But the following code:
/**
 * @param $out OutputPage object
 */
function setupSkinUserCss( OutputPage $out ){
parent::setupSkinUserCss( $out );
$out->addModuleStyles( "skins.artmuseum" );
}

causes 'skins.vector' styles to be loaded after my 'skins.artmuseum' styles. 
So, the Vector styles are not overwritten by my skin styles.
Changing the order does not help:
function setupSkinUserCss( OutputPage $out ){
$out->addModuleStyles( "skins.artmuseum" );
parent::setupSkinUserCss( $out );
}


ResourceLoader has 'dependencies' key to make resource automatically be 
dependent on another resource:
$wgResourceModules['skins.artmuseum'] = array(
'styles' => array(
'artmuseum/screen.css' => array( 'media' => 'screen' ),
),
'remoteBasePath' => &$GLOBALS['wgStylePath'],
'localBasePath' => &$GLOBALS['wgStyleDirectory'],
'dependencies' => 'skin.vector'
);

However, 'skin.vector' module includes both styles and scripts. And 
setupSkinUserCss() adds styles only. So 'dependencies' did not help, vector 
styles are loaded later, anyway. What can I do with that?

Also, I need to load remote Google webfonts. Does ResourceLoader support this, 
or do I have to use old-fashioned methods of OutputPage()?
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-08 Thread Dmitriy Sintsov



On 9 November 2012 at 2:52:54, Daniel Friesen (dan...@nadir-seen-fire.com) wrote:

On Thu, 08 Nov 2012 14:08:53 -0800, Tim Starling     
wrote:

> All extension branches were removed during the migration to Git. Very
> few extensions have branches for MW core major version support.
> There's no longer a simple way to branch all extensions when a core
> release is updated, and nobody has volunteered to write a script.
>
> So we're back to the situation we had in MW 1.9 and earlier, where
> it's usually not possible to run any actively maintained extension
> against an MW core that's not the current trunk.
>
> Given this, I think code reviewers should insist on backwards
> compatibility with MW 1.20 for commits to the master branch of
> extensions that are commonly used outside Wikimedia, at least until
> the release management issue is solved.
>
> -- Tim Starling

I've always been in favor of the trunk/master of an extension retaining    
compatibility with the latest stable version of MediaWiki until our next    
release. (with brand new extensions designed around new features in alpha    
an exception)

However our LTS support for 1.19 is going to make this more of an issue.


I have done quite a lot of freelance MediaWiki work for small to medium MediaWiki 
installations here in Russia. Most owners want new extensions or new functionality in 
old extensions and rarely want to upgrade the core. They consider that a "core update 
is too expensive". That was especially true for 1.17, which introduced a lot of 
client-side changes.
Maybe someday most of the activity (like Wikidata, Parsoid and so on) will be 
outside of core? That way the core might become a smaller part of the total project and 
change less often.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Very low APC user cache utilization at MediaWiki 1.19.2 host

2012-11-07 Thread Dmitriy Sintsov


Hi,

At one wiki host there is PHP 5.3.3-7+squeeze14 with APC 3.1.3p1 (both are quite 
old, however these are provided by Debian and I am not a regular admin of the 
server).
The wiki host has about 10-20k visits per day and about 3500 pages. It's not 
the smallest wiki.

Here is apc.ini (already tweaked in the hope of getting better utilization, but to no 
avail):
extension=apc.so

apc.enabled=1
apc.shm_segments= 1
apc.shm_size= 256
apc.ttl= 0 ; 7200
apc.user_ttl = 0 ; 7200
apc.gc_ttl = 0 ; 7200 ; was: none
apc.num_files_hint = 0; 1024
apc.user_entries_hint = 1 ; was: none
apc.mmap_file_mask = /tmp/apc.XX
apc.enable_cli = 0 ; 1
apc.cache_by_default = 1
apc.max_file_size = 10M
apc.stat = 1; 0
apc.write_lock = 1 ; was: none
apc.localcache = 1 ; was: none
apc.localcache.size = 256 ; was: none

Part of LocalSettings.php (tried both CACHE_ACCEL constant and 'apc' string key 
without any difference):
$wgMainCacheType = CACHE_ACCEL; # 'apc';
$wgMessageCacheType = CACHE_ACCEL; # 'apc';
$wgParserCacheType = CACHE_ACCEL; # 'apc';


However, the apc.php monitoring script shows really low user cache usage, which 
fluctuates between 100K and 4M within about 20-30 seconds.

My actions:
1. I set all apc.*ttl settings to zero in the hope that entries would not be timed out 
of the cache.
2. I set apc.user_entries_hint = 1 suspecting that MediaWiki may write a 
lot of entries, so older APC user cache entries were being purged out.
3. I disabled all extensions except Cite / ParserFunctions, suspecting that 
some of them might purge the client cache or parser cache.
LocalSettings.php does NOT have $wgEnableParserCache = false;
4. I switched from the custom skin to the standard 1.19.2 Vector skin.

Nothing made any difference; user cache utilization is still very low about an 
hour after the Apache restart.

There are some client JS scripts, maybe I should try to disable these as well?

I tried to place debug calls into APCBagOStuff. MediaWiki seems to almost never 
call ->delete(); there are lots of ->set() and ->get() calls.
Parser cache entries with 'pcache:idhash' keys have a TTL of 86400 seconds, which 
should be sufficient to fill 256MB of APC user cache.
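
Roughly, the debug probing looked like this (a sketch from memory, not the exact 
patch; treat the method body as an assumption about 1.19's APCBagOStuff):

	public function set( $key, $value, $exptime = 0 ) {
		// Log every user-cache write so apc.php numbers can be correlated with keys
		wfDebugLog( 'apcbag', "set: $key (ttl=$exptime)\n" );
		apc_store( $key, serialize( $value ), $exptime );
		return true;
	}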

Why is the user cache usage so low, and why does it lose lots of entries about every 
10-30 seconds, dropping from 4MB to 100K in the apc.php monitoring script? Even if users 
were visiting only a few of the 3500 pages, there should not be such a cache drop.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Modifying img src values via extension?

2012-11-03 Thread Dmitriy Sintsov



On 3 November 2012 at 2:00:47, Platonides (platoni...@gmail.com) wrote:

On 01/11/12 06:37, Dmitriy Sintsov wrote:
> 
> Also I wish it was possible to extend (and perhaps to define) classes

> via variable values:
> 
> class $myParser extends $wgParserConf['class'] {

> ...
> }
> $wgParserConf['class'] = $myParser;
> 
> Such way, multiple extensions could nest (override) default class. Even

> monkey patching is not so much needed then.
> Should probably be possible via eval(), however eval() is a hacky way.
> Perhaps a great idea for PHP 5.5.
> Dmitriy

We have one eval in MediaWiki for doing something like that.
(tests/phpunit/includes/parser/MediaWikiParserTest.php)


Interesting, thanks for the path.


If properly argued, it _should_ be simple to get a new hook added, though.


Monkey patching or dynamic class names (if they were properly available without eval) 
would probably have better performance than adding more hooks.
Also, I wish other classes were instantiated as flexibly as the Parser, via a 
$wg* setting.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Modifying img src values via extension?

2012-10-31 Thread Dmitriy Sintsov


Also, I wish it were possible to extend (and perhaps to define) classes via 
variable values:

class $myParser extends $wgParserConf['class'] {
...
}
$wgParserConf['class'] = $myParser;

That way, multiple extensions could nest (override) the default class. Even monkey 
patching is not so much needed then.
It should probably be possible via eval(), however eval() is a hacky way.
Perhaps a great idea for PHP 5.5.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Modifying img src values via extension?

2012-10-31 Thread Dmitriy Sintsov




On 31 October 2012 at 21:34:25, Brion Vibber (br...@pobox.com) wrote:

On Wed, Oct 31, 2012 at 6:53 AM, Daniel Barrett  wrote:

> Is there any easy way (via extension) to modify the "src" attribute of
> images on wiki pages?
>
> I see hooks for modifying href values - LinkBegin and LinkEnd.    But I
> don't see something similar for images, whose URLs seem to be produced via
> File::getUrl().
>
> Purpose: I want to add a querystring parameter onto the ends of the URLs,
> turning <img src="foo.jpg"/> into <img src="foo.jpg?timestamp=xxx"/> to aid with caching.
>

There's not a great hook point for this I think, but you could probably add
one in.


Custom hooks are quite hard to get into the "master" repository unless one is a 
Wikimedia programmer. I had one custom hook in EditPage, used by my extension; however, 
it wasn't possible to get it included into core.


ThumbnailImage::toHtml() in includes/media/MediaTransformOutput.php is the
function that actually produces the  tag. You could probably stash
something there, or on the constructor for the class, to modify the URL.


I use the following trick to get the thumbnail HTML (albeit with surrounding divs):

if ( $wgParserConf['class'] !== 'Parser' ) {
	die( 'Extension:PrettyPhoto cannot override non-default Parser' );
}
$wgParserConf['class'] = 'PrettyPhotoParser';

class PrettyPhotoParser extends Parser {

	function makeImage( $title, $options, $holders = false ) {
		return PrettyPhoto::apply( parent::makeImage( $title, $options, $holders ) );
	}

} /* end of PrettyPhotoParser class */

Here the extension's own PrettyPhoto::apply() method modifies the content of the 
generated image HTML via DOMDocument / XPath.

What bothers me is how reliable the parser class override is.

However, if other class "factories" were as flexible as the Parser factory, one 
could imagine something like this:
$wgDefaultClass['ThumbnailImage'] = 'MyThumbnailImage';

class MyThumbnailImage extends ThumbnailImage {
   function toHtml( /* args */ ) {
 $html = parent::toHtml( /* args */ );
 ...
   }
}

I wish PHP had "monkey patching", so classes could be manipulated and extended 
dynamically, but unfortunately it does not. __get(), __set(), __call() are optional and 
quite slow.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SPDY?

2012-10-27 Thread Dmitriy Sintsov




On 27 October 2012 at 23:20:39, Chad (innocentkil...@gmail.com) wrote:

Not to mention varnish/squid.



Varnish author on SPDY:
http://developers.slashdot.org/story/12/07/13/1327235/varnish-author-suggests-spdy-should-be-viewed-as-a-prototype

"The author of Varnish, Poul-Henning Kamp, has written an interesting critique of SPDY 
and the other draft protocols trying to become HTTP 2.0. He suggests none of the candidates 
make the cut. Quoting: 'Overall, I find the design approach taken in SPDY deeply flawed. For 
instance identifying the standardized HTTP headers, by a 4-byte length and textual name, and 
then applying a deflate compressor to save bandwidth is totally at odds with the job of HTTP 
routers which need to quickly extract the Host: header in order to route the traffic, 
preferably without committing extensive resources to each request. ... It is still unclear for 
me if or how SPDY can be used on TCP port 80 or if it will need a WKS allocation of its own, 
which would open a ton of issues with firewalling, filtering and proxying during deployment. 
(This is one of the things which makes it hard to avoid the feeling that SPDY really wants to 
do away with all the "middle-men") With my security-analyst hat on, I see a lot of 
DoS potential in the SPDY protocol, many ways in which the client can make the server expend 
resources, and foresee a lot of complexity in implementing the server side to mitigate and 
deflect malicious traffic.'
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Best policies for WikiFarms

2012-10-16 Thread Dmitriy Sintsov



On 16 October 2012 at 11:08:52, Nathan Larson (nathanlarson3...@gmail.com) wrote:

>
> I doubt WMF uses wgCacheDirectory at all. It's default is false and WMF
> has Memcached. It doesn't even affect WikiFarms since it's not set by
> default. You have to explicitly set it yourself.
>

It's absent from InitialiseSettings.php but in CommonSettings.php it's set
as $wgCacheDirectory = '/tmp/mw-cache-' . $wmfVersionNumber;

* http://noc.wikimedia.org/conf/highlight.php?file=CommonSettings.php


Is that correct? I set a separate dir for each host:
foreach ( $wgConf->suffixes as $suffix ) {
...
	$wgConf->settings['wgCacheDirectory'][$suffix] = "{$IP}/{$suffix}/cache";
	$wgConf->settings['wgUploadPath'][$suffix] = $wgConf->get( 'wgScriptPath', $suffix ) . "/{$suffix}/images";
...
}

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Best policies for WikiFarms

2012-10-15 Thread Dmitriy Sintsov




On 15 October 2012 at 22:31:32, Mark A. Hershberger (m...@everybody.org) wrote:

Axel Thimm and Patrick Uiterwijk are working on packaging MediaWiki 1.19
for Fedora and have asked for advice on how to enable their MediaWiki
package to support WikiFarms out of the box.[0]

I really don't know how to help them, but I'm sure that someone on this
list has something that would be helpful.

If you have experience, please join in the discussion!



LocalSettingsGenerator should generate $wgConf members (and manipulate them: 
add wiki, delete wiki, upgrade wikis) instead of generating plain global $wg* 
settings.
That's probably impossible in 1.19 without developing such a feature and then 
backporting it into 1.19.

By the way, how does WMF add new configuration settings into their 
InitialiseSettings.php when a new version of MediaWiki receives them? E.g. 
$wgCacheDirectory, which was added in 1.16, and many other new settings? Just 
manually? I do that manually in my farm, although there is always a chance of 
missing a new setting that way, which may leave it in a conflicting or 
suboptimal state. Imagine $wgCacheDirectory pointing to the same dir for all 
local hosts of the farm; that's probably not a good thing.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Make Extensions Aware of the ContentHandler

2012-10-12 Thread Dmitriy Sintsov



Dmitriy Sintsov.

On 11 October 2012 at 16:13:02, Jeroen De Dauw (jeroended...@gmail.com) wrote:

Hey,


Also, it would be great, if WMF selected some old version of MediaWiki as LTS 
(Long Time Support) so extensions would be required to work with it. Currently 
that could be 1.17, first version which got ResourceLoader.



My observation is that most WMF maintained extensions are not maintained with 
much regard to backwards compatibility but with a lot of effort being put in 
replacing use of deprecated code usage quickly. They thus tend to have 
compatibility broken early after MediaWiki releases. I'd be semi-surprised to 
find a single one that is still compatible with 1.17 at this point.


What's so major that it prevents new extensions from running in 1.17? Page actions 
in separate classes? The Router? Are these major obstacles?

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Capabilities class (was Re: Make Extensions Aware of the ContentHandler)

2012-10-11 Thread Dmitriy Sintsov




On 12 October 2012 at 7:23:19, Victor Vasiliev (vasi...@gmail.com) wrote:

On 10/11/2012 08:33 PM, Tim Starling wrote:
> Then you would have to load a capability map with potentially hundreds
> of entries at registration time, despite the fact that on most
> requests, most of the hooks will never be called. It seems inefficient
> to me. At least with the current system, the number of support
> constants is small (4-5).
>
> -- Tim Starling
We can cache them. If we use APC, an individual call to 
Capabilities::has() should take about 5 to 6 μs.

Shouldn't class_exists( 'ResourceLoader' ) or class_exists( 'ContentHandler' ) 
be enough for most tasks?
One can even use Reflection to check for particular changes to the mentioned 
classes introduced in newer versions.
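
For example (a rough sketch of the probing idea; the particular method tested 
below is only an illustration, not a guaranteed marker):

	// Coarse capability check: is the new-style core class there at all?
	$hasContentHandler = class_exists( 'ContentHandler' );

	// Finer check via Reflection: does a known class already have a particular method?
	$spReflection = new ReflectionClass( 'SpecialPage' );
	$hasGetContext = $spReflection->hasMethod( 'getContext' );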

Of course, compatibility with something really old (let's say 1.14) is not 
desired.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Make Extensions Aware of the ContentHandler

2012-10-11 Thread Dmitriy Sintsov




On 11 October 2012 at 15:53:42, Jeroen De Dauw (jeroended...@gmail.com) wrote:

Hey,

> What is wrong with making extensions work right now both with and without
it?

+1. I don't see any harm done in adding such conditional checks, and they
solve the problem at hand. We probably want to add them in any case, unless
breaking compatibility with MediaWiki older then 1.21 is acceptable.


Also, it would be great if WMF selected some old version of MediaWiki as LTS 
(Long Term Support) so that extensions would be required to work with it. Currently 
that could be 1.17, the first version which got ResourceLoader.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Extending the ORMTable functionality (was: Wikidatareview status)

2012-10-06 Thread Dmitriy Sintsov




On Sat Oct 06 2012 19:44:51 GMT+0400 (Caucasus Standard Time), Jeroen 
De Dauw (jeroended...@gmail.com) wrote:

Hey,

Starting a new thread as this is really not related to the Wikidata review
status thread.

On 6 October 2012 08:51, Dmitriy Sintsov  wrote:
>
Speaking of ORM, are there the plans to support table field relations (one
> to one, one to many, many to many) and maybe to use ORM through the core
> and another extensions? This could make DB access cleaner and simplier
> however maybe with little overhead. Also index hinting is probably required
> in such case, especially for such large and actively accessed databases.
>

The existing ORMTable functionality I wrote is intended as a very simple
and lightweight layer on top of the regular database abstraction MediaWiki
has. It is not a fully fledged ORM framework and is not intended to be one.
I think it's a very bad idea to try to turn it into one. If you really want
something more like a full object relational mapper, then you'll be better
of starting from scratch after putting some serious thought into how you're
going to do it. I do suspect that most core developers (including me) will
be rather sceptical of putting such a thing into core though, as it would
seriously impact a lot of existing code.


Many frameworks allow combining "direct" SQL calls and ORM calls, so there is 
not that big a legacy compatibility problem.
Efficiency could be a problem, though. Not all ORMs are heavyweight. 
For example, Yii's is quite light, however its queries weren't always the most 
efficient. I worked with Yii recently, that's why I asked. Speaking of 
Java, that's more heavyweight, something like Doctrine, probably (which I 
haven't used yet).
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikidata review status

2012-10-05 Thread Dmitriy Sintsov




On 5 October 2012 at 23:42:30, Denny Vrandečić (denny.vrande...@wikimedia.de) wrote:

Hi all,
here's a status update of the reviews on the Wikidata related changes to core:
* ContentHandler: a lot has happened here in the last few days. We
created a faux commit that squashed the changes in order to gather
comments, and have responded to most of them, implementing the
suggested changes. Here's a link to the faux commit (note that we are
not updating the faux commit):

Here's a link to the Gitweb of the actual branch:

Here's a link to the Bugzilla bug tracking the change and discussing the state:

The further plan is to have the actual merge commit read early next
week, and then hopefully have this one land as one of the first big
features in 1.21.
* Sites management: this one is awaiting Asher's review, after Chad
gave it a thumbs up. The database impact is expected to be low, so we
hope that the review will happen soonish and the change be approved.
Link to Gerrit here: 
* Sorting in jQuery tables: this patchset has received a number of
updates, many comments, and now three +1s, and it is also verified.
Maybe someone will be brave enough to +2 it? It would be nice to see
this merged in too. Here's the link to Gerrit:

* Allow ORMTable to access foreign wikis: this allows a wiki to use
the load balancer to access another wikis tables. It has been verified
and +1'ed by two, but not yet +2'ed and merged. Some reviewing love
would be appreciated. 
All in all, I am thankful for the increasing activity in the last few
weeks, and I hope that by next week this mail will be much shorter and
clearer.


Speaking of ORM, are there plans to support table field relations (one to 
one, one to many, many to many) and maybe to use the ORM throughout the core and 
other extensions? This could make DB access cleaner and simpler, though perhaps 
with a little overhead. Also, index hinting is probably required in such a 
case, especially for such large and actively accessed databases.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Moving MediaWikiWidgets.org to Wikimedia

2012-09-05 Thread Dmitriy Sintsov




On 4 September 2012 at 21:40:20, John Du Hart (compwhi...@gmail.com) wrote:


Does MediaWikiWiki really need any more shitty/insecure addons that no
one is going to maintain? I think we have enough already.

Pick out the best of the bunch and nuke the rest.


It is not easy (at least here in Russia) to earn anything by programming 
MediaWiki extensions. I made some extensions, two for a local university and 3 or 
4 as freelance jobs. They were paid for just once. And you have to write them in 
a fixed time; that's why many parts of the code aren't cleaned / polished up. Meanwhile, 
new versions of MediaWiki introduce new functionality, and sometimes 
incompatibility with old extensions. It seems that extension-writing has to be a 
full-time paid job, however no employer here wants that. They want 
minimal, one-time expenses. I guess that's why there are many abandoned and 
low-quality extensions. Struggling with low income is not a nice thing, 
especially for people who have families.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] $( '' ) vs. $( '') in coding conventions (and methods for building DOMs with jQuery)

2012-08-28 Thread Dmitriy Sintsov



Indeed.
Moreover, aside from the performance and security, there's another important 
factor to take into account. And that is the fact that IDs can contain 
characters that have special meaning in CSS selectors (such as dots).

We've seen this in before when dealing with a MediaWiki heading (where the 
ID-version of the heading can (or could) contain dots). So whenever you have 
what is supposed to be an element ID in a variable, use document.getElementById 
(even if you don't care about performance or security).


Modern browsers like Chrome / IE9 are really fast; jQuery isn't just about performance 
but also about shorter code. Why not have something like $.id(myId) as a 
"shortcut" to document.getElementById()? Also perhaps other .get* methods 
(there were a bunch of them).
As you are one of the jQuery code maintainers, such a feature could be easily 
implemented.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] gerrit tab width

2012-08-08 Thread Dmitriy Sintsov



On 8 August 2012 at 22:43:15, bawolff (bawolff...@gmail.com) wrote:


I use 8 :P

(Seriously though, everything looks so crunched up with 4 space tab
width... Makes me claustrophobic!)


Tab 8 encourages less "structural nesting" like foreach( if ( foreach ( if () ) ) ) in one 
method, because such lines would become too long. Instead there will be a large number of smaller 
functions / methods. But there should be a middle balance between "a large method which does a lot" 
and "nested calls of lots of extremely tiny methods". Perhaps tab 4, I do not know.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Captcha for non-English speakers II

2012-07-31 Thread Dmitriy Sintsov



On 1 August 2012 at 6:26:02, MZMcBride (z...@mzmcbride.com) wrote:

Risker wrote:
> Putting on my checkuser hat for a moment - yes, please please look at
> finding a different CAPTCHA process - the cross-wiki spamming by bots that
> are able to "break" the CAPTCHA is becoming overwhelming.    This issue has
> been reported separately, and there may be a different fix, but this is a
> pretty big deal as a few hundred volunteer hours a month are going into the
> despamming effort.

Reported separately where?

CAPTCHAs were designed for "test if you're human," not "test if you're
spam." It's a wonder they've worked this long. I imagine better anti-spam
tools are needed (which may be a new extension, new AbuseFilter filters,
better user scripts, etc.).

If the situation is as dire as it sounds, it shouldn't be difficult to find
a few resources to throw at the problem. In a discussion like this, examples
of particular problematic behavior (links!) are always most helpful to
developers, I've found. "This is the bad behavior we're seeing and want to
stop. How should we do that?" :-)


How about an image-based approach?
http://en.wikipedia.org/wiki/CAPTCHA#Interaction_with_images_as_an_alternative_to_texting_.28text_typing.29
http://www.picatcha.com/captcha/
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upload files: a simple progress bar ?

2012-07-22 Thread Dmitriy Sintsov

On 22.07.2012 16:07, Chad wrote:

On Sun, Jul 22, 2012 at 5:29 AM, Thomas Gries  wrote:

Question:

Why don't we have a simple progress bar (in the core) when uploading files ?

Because there's not a "simple" way of doing it. Every solution I've seen
over the years has either been kludgy (using iframes or some such
nonsense) or requires something like flash. Supposedly this is easier
with the file upload API in HTML5 (says Google), but this isn't supported
across all browsers yet.


What would you recommend me to use ?


Your browser status bar. It's told me the progress of my uploads
for 10+ years now.
Only Chrome shows precise and smoothly updated percentage of POST upload 
data sent in status bar. Firefox requires some addon for that:

https://addons.mozilla.org/en-us/firefox/addon/uploadprogress/
Perhaps one could detect and use HTML5 API when available (there are 
growing numbers of such browsers installations every day).


Although it's really strange why browser's download windows are so much 
more advanced than their upload bars. Perhaps not many people are 
uploading stuff.


Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How to display the title with underscores?

2012-07-16 Thread Dmitriy Sintsov




On 17 July 2012 at 9:22:26, wangfeng wangfeng (wangfeng.v1.1...@gmail.com) wrote:

Hi, I have resolved the problem so that I can display "a_b_c".
What I did was modify the code in mediawiki/includes/Title.php, in which I
found something like "str_replace( '_', ' ', ... )".
But it's not complete. I cannot display "_a_b_c_"; the display is still
"a_b_c".
So I guess there is some mechanism that converts the string from "_a_b_c_"
to "a_b_c".
Can anyone tell me where the code for that conversion is?
Thanks very much.


You should place two lines into LocalSettings.php, as Platonides suggested:
$wgAllowDisplayTitle = true;
$wgRestrictDisplayTitle = false;

Also make sure you don't overwrite the values of the mentioned variables later in 
the "code flow".

It is not necessary to patch the core; the required functionality is already 
there.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making language selection sticky

2012-06-26 Thread Dmitriy Sintsov




On 26 June 2012 at 20:56:14, Denny Vrandečić (denny.vrande...@wikimedia.de) wrote:




Hi all,
we are trying to let users switch their language - whether they are
logged in or not - through a language selector. This can be either the
ULS, which is progressing impressively, or just a list of languages in
the sidebar, or anything else. After selecting it, the page is
rerendered using the uselang parameter.
Now the problem: the uselang parameter is not sticky. When I move to
another page, it is lost.
We tried to change the linker in order to add the uselang parameter
every time -- but it only works in the content, not in the sidebar and
actionlinks.

I developed such an extension back in 2009 when I started to work with MediaWiki. 
I used the 'SkinTemplateBuildNavUrlsNav_urlsAfterPermalink' hook to add a sticky 
uselang value to sidebar links. It was a long time ago; I don't know whether that 
will work with Vector.
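
Roughly, the handler looked something like this (a sketch from memory; the hook 
parameters and the $nav_urls structure are assumptions, so check them against the 
SkinTemplate code of your MediaWiki version):

	$wgHooks['SkinTemplateBuildNavUrlsNav_urlsAfterPermalink'][] = 'StickyUselang::onNavUrls'; // hypothetical class

	class StickyUselang {
		public static function onNavUrls( &$skinTemplate, &$nav_urls, &$oldid, &$revid ) {
			global $wgRequest;
			$lang = $wgRequest->getVal( 'uselang' );
			if ( $lang !== null ) {
				// Re-append uselang to every toolbox link that carries an href
				foreach ( $nav_urls as &$entry ) {
					if ( is_array( $entry ) && isset( $entry['href'] ) ) {
						$entry['href'] = wfAppendQuery( $entry['href'], 'uselang=' . urlencode( $lang ) );
					}
				}
			}
			return true;
		}
	}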


We could put the language into a cookie, as the ULS currently does,
but this means that the squid caches won't work, afaik.
We could take the output just before it is send to the browser and
regex-substitute all the links in order to add the uselang parameter
every... OK, half joking. Only half.

Another solution could be to put the language into the path, i.e. the
pretty URL /wiki/San_Franicisco does get rewritten to
/w/index.php?title=San_Francisco as of now, but change that to
/hr/San_Francisco rewritten to /w/index.php/San_Francisco?uselang=hr
(or /w/index.php/Special:UseLang/hr/San:Franciso with an Alias if this
is more pleasing)
I think this is based on an idea of Brion during the Hackathon.
So switching the user language amounts to change the URL.
Any comments on this?

It would be nice if the language code were the base directory of the URL path. Then it 
could be extracted by the Router. This is not only about persistently switching the language 
"on the fly"; it's also about making multi-lingual wikis without multiple 
virtual hosts and without interwiki links, which is a nice thing for non-English sites (e.g. 
our university site has English and Russian versions of each page and they did not want to 
have en.* and ru.* domains for that).
There were some problems with uselang, so I made a different setup: virtual hosts based 
on the URI's virtual directory, site.org/en/Page and site.org/ru/Page, much as you suggested, 
parsing the URI in LocalSettings.php, plus real interwiki hosts with a "commons" to 
share files. It wasn't put online due to an extremely low budget, however.
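
A rough sketch of that LocalSettings.php approach (the path layout and the exact 
settings touched here are illustrative assumptions, not the original code):

	# Pick the interface language from the first path segment, e.g. /en/Page or /ru/Page
	$reqPath = isset( $_SERVER['REQUEST_URI'] ) ? $_SERVER['REQUEST_URI'] : '/';
	if ( preg_match( '!^/(en|ru)(/|$)!', $reqPath, $matches ) ) {
		$wgLanguageCode = $matches[1];
		# Hypothetical per-language article path so generated links keep the prefix
		$wgArticlePath = '/' . $matches[1] . '/$1';
	}
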
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Extension Skeleton Program

2012-06-25 Thread Dmitriy Sintsov




On 25 June 2012 at 20:06:19, Derric Atzrott (datzr...@alizeepathology.com) wrote:




Would anyone be interested in a program that generates Skeletons for new
extensions?    I've noticed that when I make extensions I generally go through
the exact same process (with minor variations) each time.

The idea in my head right now is for a program that does the following:
    * Asks for the Extension name and other Credit information
    * Asks whether or not the extension is going to need to change the
database schema
    * Asks whether or not the extension is going to make use of ResourceLoader
    * Asks whether or not the extension is going to include a Special Page
    * Depending on the answers to the above it may do some of the remaining
items on this list
    * Creates a folder hierarchy with the following folders:
ExtensionName
|- includes
|- js
|- images
|- styles
|- sql
    * Create skeleton files for ExtensionName.php, ExtensionName.i18n.php,
ExtensionName.alias.php, and SpecialExtensionName.php
    * Create a skeleton file for sql\ExtensionName.sql
* Creates table, adds ID column UNIQUE PRIMARY
    * Includes basic configuration for Schema Updates in ExtensionName.php
    * Includes basic configuration for Special Page in ExtensionName.php
    * Includes basic configuration for Resource Loader in ExtensionName.php



MediaWiki extensions can be very different. Some, like many "media handlers", do not even require 
direct DB access. Some extensions might implement their own remote APIs (so there might be an api 
folder added). In 1.20 there can be new "actions", and in the future maybe new "page handlers" for 
non-wikitext content. There are almost unexplored kinds of extensions, like XML importer / exporter 
extensions; I think there is only one extension of that kind.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Revoking +2 (Re: who can merge into core/master?)

2012-06-16 Thread Dmitriy Sintsov

On 16.06.2012 8:20, Andrew Garrett wrote:

On Sat, Jun 16, 2012 at 6:53 AM, Rob Lanphier  wrote:


On Fri, Jun 15, 2012 at 8:48 AM, Sumana Harihareswara
  wrote:

If you merge into mediawiki/core.git, your change is considered safe for
inclusion in a wmf branch.  The wmf branch is just branched out of
master and then deployed. We don't review it again.  Because we're
deploying more frequently to WMF sites, the code review for merging into
MediaWiki's core.git needs to be more like deployment/shell-level
review, and so we gave merge access to people who already had deployment
access.  We have since added some more people.  The current list:
https://gerrit.wikimedia.org/r/#/admin/groups/11,members

Let me elaborate on this.  As unclear as our process is for giving
access, it's even less clear what our policy is for taking it away.
If we can settle on a policy for taking access away/suspending access,
it'll make it much easier to loosen up about giving access.

Here's the situation we want to avoid:  we give access to someone who
probably shouldn't have it.  They continually introduce deployment
blockers into the code, making us need to slow down our frequent
deployment process.  Two hour deploy windows become six hour deploy
windows as we need time to fix up breakage introduced during the
window.  Even with the group we have, there are times where things
that really shouldn't slip through do.  It's manageable now, but
adding more people is going to multiply this problem as we get back
into a situation where poorly conceived changes become core
dependencies.

We haven't had a culture of making a big deal about the case when
someone introduces a breaking change or does something that brings the
db to its knees or introduces a massive security hole or whatever.
That means that if the situation were to arise that we needed to
revoke someones access, we have to wait until it gets egregious and
awful, and even then the person is likely to be shocked that their
rights are being revoked (if we even do it then).  To be less
conservative about giving access, we also need to figure out how to be
less conservative about taking it away.  We also want to be as
reasonably objective about it.  It's always going to be somewhat
subjective, and we don't want to completely eliminate the role of
common sense.

It would also be nice if we didn't have to resort to the nuclear
option to get the point across.  One low-stakes way we can use to make
sure people are more careful is to have some sort of rotating "oops"
award.  At one former job I had, we had a Ghostbusters Stay Puft doll
named "Buster" that was handed out when someone broke the build that
they had to prominently display in their office.  At another job, it
was a pair of Shrek ears that people had to wear when they messed
something up in production.  In both cases, it was something you had
to wear until someone else came along.  Perhaps we should institute
something similar (maybe as simple as asking people to append "OOPS"
to their IRC nicks when they botch something).


In general I think we'd want to start by making sure that the person who
broke something actually found out that they broke it. I don't think we
need to get into "punishment" unless we actually start having serious
problems. Otherwise this is a solution looking for a problem.

In terms of low stakes "punishment" for breaking the build, I've heard of
an organisation where the last person who broke it is responsible for some
unpleasant task (resolving merge conflicts?). In more homogenous co-located
organisations I can see something like "has to buy a drink for the people
who cleaned up after it" working, but that doesn't really work in our
organisation.


Shrek ears?? Why do people have to be humiliated? Why not just:
1. temporarily reduce the salary
2. permanently reduce the salary
3. fire them
?
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki tarballs and the WMF

2012-06-07 Thread Dmitriy Sintsov




On 8 June 2012 at 5:33:06, Tim Starling (tstarl...@wikimedia.org)
wrote:




On 07/06/12 03:36, Mark A. Hershberger wrote:
> The Foundation has made MediaWiki available for everyone and that's a
> great thing.    But Wikimedia's funding comes from donations as a result
> of requests on Wikipedia, not from distribution of MediaWiki, so they
> are rightly focused on their production cluster.

I've long believed that MediaWiki should be considered a project of
the WMF, on the same level as the wikis we host. Perhaps if we
included donation requests on the download and installer pages then
MediaWiki might be considered worthy of some attention in its own right?

-- Tim Starling


MediaWiki is quite worthwhile in its own right for many different projects. Additional funding
could be raised by expanding commercial support for other
MediaWiki-based projects, and also by making MediaWiki more CMS-like (despite the
countless warnings that MW is not a CMS, it becomes more and more CMS-like with
every version, especially when bundled with extensions): with the Router, Actions,
the non-wikitext page views in development (WikiData), semantic and ACL
extensions, and so on.

Perhaps MediaWiki could become something like Confluence but in PHP. I don't 
know.

Speaking of tarballs, I was quite surprised that the non-recommended,
non-tarball 1.18-wmf1 has the 'mediawiki.jqueryMsg.js' ResourceLoader module, while
the newer and recommended 1.18.2 / 1.18.3 tarballs do not.

Dmitriy
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Lua API specification

2012-06-03 Thread Dmitriy Sintsov

* Victor Vasiliev  [Sat, 2 Jun 2012 22:43:56 +0200]:

Currently there is no proper way in Lua to get such basic things as
the current page name. To fix this, I propose to provide a MediaWiki
API accessible from the Lua scripts:




Any feedback appreciated.

— Victor.

Is that local PHP bindings only, or will remote MW API querying be
possible? In MediaWiki, "API" usually means "do something remotely",
although local API usage is possible (except for the very hard parts where
POST is required - but maybe that was fixed in recent versions of the
Request / Response abstractions).
I am asking because when there are two different things called "API",
this may cause some confusion.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stance on PHP namespaces?

2012-05-16 Thread Dmitriy Sintsov

Dmitriy Sintsov.

Actually, the dollar sign as a variable name prefix comes from Perl, which was a very
popular language during the PHP3 / PHP4 era. The authors of PHP probably wanted to
simplify porting from Perl to PHP.

On 17 May 2012 at 3:48:55, Terry Chay (tc...@wikimedia.org) wrote:



 On May 16, 2012, at 3:52 PM, Max Semenik wrote:

 > Frankly, the namespace syntax in PHP is so atrocious that I would
 > like to never see namespaces in our code :P


 Ahh the namespace separator!

 https://wiki.php.net/rfc/namespaceseparator

 Brings back memories.

 Hmm, Now that you mention it maybe we should have used ":P" as the namespace
 separator. ;-)

 ( I miss ":)" That would have made PHP a much happier place. It already has a
 lot of money ($), now it just needs some smileys.)

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Lua: return versus print

2012-04-13 Thread Dmitriy Sintsov

On 13.04.2012 16:12, Petr Bena wrote:

I have no knowledge of Lua, but I don't see what the problem with print
is here; the function print is supposed to print output to the output device
in most programming languages, just as in this case, so I don't
understand why we should want to use return (which is supposed to
return some data / pointer back to the function it was called from) in
this case. I mean, if we can pick whether we should use print or return as
the recommended way to print text, I would vote for print(), especially if
it has better performance than the implementation using return.


The output buffer has to be captured, while a return value may be more
complex than just text, being processed via an API or in another way. Not
all of the scripts should generate plain text. My extension will need
nested arrays (or simple objects) to process, and some other extensions
probably will too.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Dropping support for the automatic /index.php/Article

2012-03-25 Thread Dmitriy Sintsov

* Platonides  [Mon, 26 Mar 2012 00:51:59 +0200]:

On 25/03/12 11:30, Daniel Friesen wrote:
> So I'm considering dropping this automatic feature.
> Making this change will mean that MediaWiki will start outputting
> /index.php?title=... style URLs for everyone who hasn't configured
short
> urls.

I don't like dropping that.
We could try making automatically a .htaccess for short urls, but that
isn't coherent now that we stopped creating LocalSettings.php. And
providing a .htaccess in the tarball doesn't look like a good idea,
either (and note we want it *above* the mediawiki folder).
Moreover, we would need testing to check that .htaccess is indeed
working, although that could be solved with a config/.htaccess and some
install check.

.htaccess directory-specific rules might be disabled in the httpd.conf
virtual host settings. Also, .htaccess will not work for nginx (and
probably not for IIS either).

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-22 Thread Dmitriy Sintsov
* Daniel Friesen  [Thu, 22 Mar 2012 01:40:16 
-0700]:
That's the very definition of an array. An array is a list, the keys are
indexes. By definition you cannot have an index that does not exist.
PHP arrays are sparse, including numeric ones. They are actually like
hashmaps, I think, but I don't know whether they use different ways to
store numeric and string keys; I haven't studied the low-level
implementation. PHP also has "real" arrays like in C (lower level),
however these are part of the SPL:

http://php.net/manual/en/class.splfixedarray.php


If an item in the array does not exist then the items after it take on
earlier indexes and the length is shorter.
If you have a case where 0 and 2 exist but 1 is not supposed to exist
then you're not using an array, you're using a hashtable, map, etc...
For that purpose you should be using Object like a hashtable:
var h = {};
h[0] = 'a';
h[2] = 'c';

You can use for..in on such a thing. And you also shouldn't have to worry
about using hasOwnProperty on it. If anyone is stupid enough to set
something on Object.prototype a lot more than your script will break.
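
(As an aside, a minimal sketch of the for..in iteration the quoted advice describes; the keys and values here are made up:)

var h = {};
h[0] = 'a';
h[2] = 'c';
for ( var key in h ) {
    // key arrives as the string '0' or '2'; no undefined "holes" are visited
    console.log( key, h[ key ] );
}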

OK. In which cases is jQuery.each() preferable? I use it with
jQuery selectors. Or for something else as well?

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-22 Thread Dmitriy Sintsov
* Dmitriy Sintsov  [Thu, 22 Mar 2012 11:15:40 
+0400]:

In Chrome, executing the following code:
var a = [];
a[0] = 'a';
a[2] = 'c';
worked, however the debugger inspector shows 'undefined' elements between
real elements of the array. And the length is counted for 0..last element,
including the undefs. However, for..in worked for such arrays fine, but
may be inefficient - I do not think it has hashes or linked lists to
skip the undefs, it probably just skips them one by one. That worked in
IE8/IE9/FF/Chrome so I must not be the first coder who used that.

I switched to {} and $.each() two days ago, so that's already the past.


Maybe an efficient and simple way for sparse arrays would be to maintain a
non-sparse array of the sparse keys, do for..in over that array, and
then use "numeric string" object properties to get / set the actual
values? I don't know, speed is not so critical in my use case.
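
(One way to read that idea, as a rough sketch; the helper name and the variables are made up for illustration:)

var keys = [];    // dense array of the used indexes, safe to iterate with a plain for loop
var values = {};  // plain object holding the actual values under "numeric string" keys

function setItem( index, value ) {
    if ( !( index in values ) ) {
        keys.push( index );
    }
    values[ index ] = value;
}

setItem( 0, 'a' );
setItem( 2, 'c' );

for ( var i = 0; i < keys.length; i++ ) {
    console.log( keys[ i ], values[ keys[ i ] ] );
}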



> * Sparse objects don't exist in any language afaik. What you created is
>   just a regular plain object.
I was mentioning JS objects with numeric keys - I didn't know they are
silently converted to numbers. Now I removed parseInt() from my code.


Mistyped, silently converted to strings, of course.

MediaWiki with extensions makes a lot of requests; which requests load
the extension messages? I'll try to find out.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-22 Thread Dmitriy Sintsov

* Krinkle  [Wed, 21 Mar 2012 18:51:45 +0100]:


Few points:
* The easiest way to understand it is to consider "sparse arrays" to not
  exist in javascript.
  It doesn't throw an exception if you try it, but it's not supposed to be
  possible, so don't.

In Chrome, executing the following code:
var a = [];
a[0] = 'a';
a[2] = 'c';
worked, however the debugger inspector shows 'undefined' elements between
real elements of the array. And the length is counted for 0..last element,
including the undefs. However, for..in worked for such arrays fine, but
may be inefficient - I do not think it has hashes or linked lists to
skip the undefs, it probably just skips them one by one. That worked in
IE8/IE9/FF/Chrome so I must not be the first coder who used that.


I switched to {} and $.each() two days ago, so that's already the past.

* Sparse objects don't exist in any language afaik. What you created is
  just a regular plain object.
I was mentioning JS objects with numeric keys - I didn't know they are
silently converted to numbers. Now I removed parseInt() from my code.



* As I said before I think there is no point in using $.each() in this
  case. It is nice if you need a context but as you already found out,
more
  often than not it's actually a burden to have to deal with it if you
just
  want a loop inside an existing context. I would've kept the for-in
loop you
  originally had which was just fine, only change the array into an
object.

Maybe I'll switch back to for..in then, however I am scared by "basic"
prototypes, so maybe I'll check .hasOwnProperty() then, too. Still
haven't decided that.



I'd have to look further into the actual code to know what you mean here,
but there isn't really such a thing as "refactoring a JS class into a
ResourceLoader module". For one, javascript doesn't have classes, but
that's just word choice. Secondly, ResourceLoader "just" loads
Actually, creating a JS function, assigning the prototypes to it, then
instantiating it via the new operator and adding properties to its
context, including some from its own prototypes, works like classes.
Prototypes are really similar to the virtual method tables available in
other OO languages, but other languages manipulate them indirectly, at a
higher and better level than JS. Frankly, I am not much of a fan of JS,
however I have no other widespread choice for client-side web
programming.
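
(For illustration, a minimal sketch of that constructor-plus-prototype pattern; the property and method names are made up, not the extension's actual code:)

function MapController( idx ) {
    this.Idx = idx;  // own property, set per instance
}
// shared "method", found through the prototype chain, much like a virtual method table entry
MapController.prototype.render = function () {
    console.log( 'rendering map #' + this.Idx );
};

var mc = new MapController( 1 );
mc.render();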



You don't have to refactor anything. There are a few things that it
enforces (such as execution of the javascript file in a local context), but
other than that you don't have to change anything "for ResourceLoader".

There are a lot of conventions and best practices often associated with
"porting to ResourceLoader" but that's more along the lines of "now that
we're working on this, lets get rid of crappy code and make it nice", not
anything ResourceLoader actually requires.

I'm curious what kind of problems you are referring to here? The kind of
problems you didn't have when you loaded jQuery in MediaWiki 1.15

The problems are simple: when I load another module with
mw.loader.using, then call that module from the mw.loader.using callback,
there is no return to the original module. So I am forced to build an
inflexible chain, calling the main module directly from the loaded module's main
context. I got used to it, although it was a little disappointment.


Now I have another pain: mw.msg() caches messages quite aggressively, 
even though I put the following lines in LocalSettings.php:

$wgResourceLoaderDebug = true;
$wgEnableParserCache = false;
$wgCachePages = false;

I got the same old version of the message, even after multiple Ctrl+R in
Chrome, action=purge and even httpd restarts. How can I make mw.msg()
pick up changes every time I change the i18n file of my extension? Maybe the
evil provider caches some requests, though :-( I probably should check
with Wireshark. It's tiresome that fighting with the environment takes too
much of the time.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Video codecs and mobile

2012-03-20 Thread Dmitriy Sintsov

On 20.03.2012 20:29, Daniel Friesen wrote:
On Tue, 20 Mar 2012 07:03:06 -0700, Lars Aronsson  
wrote:



On 03/20/2012 02:24 AM, Brion Vibber wrote:
The prime competing format, H.264, has potential patent issues - 
like other
MPEG standards there's a patent pool and certain licensing rules. 
It's also
nearly got an exclusive choke hold on mobile - so much so that 
Mozilla is

considering ways to adopt H.264 support to avoid being left behind:

http://blog.lizardwrangler.com/2012/03/18/video-user-experience-and-our-mission/ 



Is it time for us to think about H.264 encoding on our own videos?

Right now users of millions of mobile phones and tablets have no 
access to

our audio and video content,


Which are the patents and when do they expire? Which are the
platforms that don't support Theora, and what stops them?
Maybe we should flood Wikipedia's most visited articles with
videos, so millions of users will be made aware that the makers
of their equipment (Apple iPad?) should support open formats.

Now, if we were to take this path, how do we flood Wikipedia with
videos? Live interviews in all biographies of living people?
If this turns out to be completely unrealistic, because we can't
produce videos in sufficient quantity, then maybe the time is not
yet mature for video in Wikipedia.


Anyone have a good stockpile of old Public Domain movies?
I believe there are also at least two freely licensed movies. Everyone 
is using Blender's CC-BY "Big Buck Bunny" for  demos. And I 
believe there was another film that was openly distributed using 
.torrents.
How about embedding full movies into the articles into the Wikipedia 
articles about the movies when said movie is a freely licensed modern 
movie or a Public Domain film?



Are these PD? There are quite a lot of them.
http://www.archive.org/details/movies
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-20 Thread Dmitriy Sintsov

* Krinkle  [Mon, 19 Mar 2012 14:32:51 +0100]:
Converted all of the for..in loops into $.each(). The funny thing is that
$.each() did not work correctly with sparse arrays [], walking over the
"undefs" between real elements, while for..in used to work fine
(FF, Chrome, IE8, IE9). So I had to convert the sparse arrays into sparse
objects {}, just as you recommended. This way $.each() works, however it
also creates a new context (a different "this"), so I've had to use var
myself = this more often than before.


The most painful part was passing control back to the main module after the edit
module was abruptly terminated (without any warnings in the Chrome console).
I managed to call the main module's method directly from the edit module. Not
a very nice thing, but at least it works. Who would have known that such a
potentially easy thing as refactoring one JS class into two
ResourceLoader modules could be this painful. I used to dynamically
load jQuery for a custom MediaWiki 1.15 installation via LABjs and don't
remember such problems back then.


Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-19 Thread Dmitriy Sintsov

On 19.03.2012 17:23, Krinkle wrote:



On Mon, Mar 19, 2012 at 9:35 AM, Daniel Friesen
wrote:


On Mon, 19 Mar 2012 00:40:54 -0700, Dmitriy Sintsov
wrote:
var jqgmap = [];


for ( var mapIndex in jqgmap ) {


This is VERY bad JavaScript coding practice. Please use $.each().


This is rather exaggerated. Even more when looking at that suggestion.

Arrays should, indeed, not be enumerated with a for-in loop. Arrays in JS
can only contain numeral indices, so they should simply be iterated with a
simple for-loop like this `for (i = 0; i < myArr.length; i += 1) { .. }`
(maybe cache length for slight performance gain by reducing property
lookups).


My array is numeric but sparse,
myArr = [];
myArr[0] = a;
myArr[1] = b;
So I cannot just use incremental key iteration; at least the existence of
each element should be checked.
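
(A minimal sketch of such an existence check, assuming a numerically indexed sparse array with made-up values:)

var myArr = [];
myArr[0] = 'a';
myArr[2] = 'c';

for ( var i = 0; i < myArr.length; i++ ) {
    // skip the holes; 'i in myArr' would also work and distinguishes a hole
    // from an element explicitly set to undefined
    if ( myArr[ i ] !== undefined ) {
        console.log( i, myArr[ i ] );
    }
}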




Using $.each has overhead (+1+n function invocations). When given an array
it will do a simple for-loop with a counter. It has overhead of 1+n
additional function invocations and context creations. In most cases there
is no use for it. There is one case where it is handy and that's when you
specifically need a local context for the loop (being careful not to create
later-called functions inside the loop, risking all variables being in the
post-loop state). If you don't need a local scope for your loop, then using
$.each (or the native [].forEach in later browsers) is pointless as it only
has overhead of additional function invocations and lowering the position
in the scope chain.

When iterating over objects, however, (not arrays) then $.each is no better
than a for-in loop because (contrary to what some people think) it is not a
shortcut for for-in + if-hasOwn wrapper. When an object is passed, it
literally just does a plain for-in loop invoking the callback with the
value. jQuery does not support environments where someone extends the
native Object.prototype because it is considered harmful (and therefore
MediaWiki inherently does not support that either), so a plain for-in loop
over an object (excluding array objects) is perfectly fine according to our
conventions.

See also http://stackoverflow.com/a/1198447/319266

 but so much for the good (and bad, evil) parts of javascript :D
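
(A minimal sketch of the plain cached-length loop recommended above, with a made-up array:)

var myArr = [ 'a', 'b', 'c' ];
var i, len;
for ( i = 0, len = myArr.length; i < len; i += 1 ) {
    // the length is looked up once, not on every iteration
    console.log( myArr[ i ] );
}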


I was thinking about Daniel's warnings on Array prototyping and thought
that a proper iteration would check whether the prototype has the same
member and, if so, not include it in (skip it from) the callback.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-19 Thread Dmitriy Sintsov
* Daniel Friesen  [Mon, 19 Mar 2012 01:35:21 
-0700]:

( function( $, mw ) {

} )( jQuery, mediaWiki );

I modified all three modules (main, view and edit) to the
recommended pattern.

http://pastebin.com/1kS6EyUu
http://pastebin.com/WQzBTw6W
http://pastebin.com/UqpTAvZ8

However, the execution still stops after the following call:
mw.jqgmap.loadEdit();
performed in the main module. When I comment out that call, execution of
further scripts does not terminate; however, the extra prototypes defined in
'ext.jqgmap.edit' become unavailable, thus MapController and
MarkerController instances do not work correctly in edit mode.


Still cannot dynamically load 'ext.jqgmap.edit' successfully..

Maybe I'll temporarily reunite the modules into a monolithic one, otherwise
I risk ruining the project timeline. Too bad I wasn't able to defeat
the trouble.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-19 Thread Dmitriy Sintsov
* Daniel Friesen  [Mon, 19 Mar 2012 01:35:21 
-0700]:

Autoloading classes is not possible. Even if every browser supported
getters and we could use them to dynamically load classes, this would
require synchronous http calls. Which are absolutely HORRIBLE because
they
block the entire JS thread and in addition typically even block the
browser's UI.

How does ResourceLoader figure out that the function loadEditPrototypes() was
defined in 'ext.jqgmap.edit' when I call this function from
'ext.jqgmap'? And if there was no extra HTTP call, then why does the list of
scripts in the Chrome debugger not include 'JQGmap.edit.js' among the
scripts before the mw.loader.using() and loadEditPrototypes() calls were
performed?



You shouldn't be dropping the closure, you don't want local things in
the
global scope. You also shouldn't be using the global $ and mw 

directly.
Prototypes are not local things. They are carried along with the function they
belong to. I define a few prototypes in 'ext.jqgmap.view', then I want to
add more prototypes in 'ext.jqgmap.edit'. But not always, only when the
edit mode is active. However, you are right about minor local variables.



Everything should be using a pattern like:
( function( $, mw ) {

} )( jQuery, mediaWiki );

OK, I'll try the closure call with two arguments in every module; I didn't
think about that. I'll report whether that fixes the abrupt termination of
the module "chain" execution.



> var jqgmap = [];
> for ( var mapIndex in jqgmap ) {

This is VERY bad JavaScript coding practice. Please use $.each().

I know that I can use $.each() here. I don't do that for a few reasons:
for..in is easier to debug (stepping key by key), and also because for..in
was problematic only for objects {}, not for arrays [], AFAIK. However, I
might be wrong. Maybe I'll switch to $.each(), but the code was
working with for..in perfectly well before I refactored it into multiple
modules. There was only one module; however, it was growing, so I had to
split it into view and edit submodules.



> $( '' +
> '' + mw.msg( 'jqgmap-show-code' ) +
> '' + mw.msg( 'jqgmap-show-code' ) +
> '' +
> '' +
> '' )

I should really start poking people about this one. `"" . wfMsg( 'foo' ) . ""`
is bad in PHP code, and it's just as bad inside JS.
You should be creating your html with NO + concatenation, and then using a
.find() and .text() to insert things where they belong. That, or use
multiple $() calls with .append() to create the full structure.
Likewise you shouldn't be inserting this.Idx that way into the attribute.
It should be added with something like .attr( "jqgmap_code" + this.Idx );
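
(A rough sketch of that kind of construction; the element structure and id are made up, not the extension's actual markup, and this.Idx is assumed to be available as in the original code:)

var $wrapper = $( '<div>' )
    .attr( 'id', 'jqgmap_code' + this.Idx )
    .append(
        $( '<a>' )
            .attr( 'href', '#' )
            // the message text is inserted as text, with no string concatenation
            .text( mw.msg( 'jqgmap-show-code' ) )
    );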

I might consider refactoring the jQuery DOM node creation, however the
loading of the 'ext.jqgmap.edit' module does not work now, so I cannot
continue until I figure out why. This code is not ideal, however it's
not the reason for the fault. This code was working perfectly well before
splitting into multiple modules and introducing
mw.loader.using( 'ext.jqgmap.edit', function(){...} );

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-19 Thread Dmitriy Sintsov

Hi!
I've tweaked my code a few times, trying to make it simpler: got rid of
some closure calls and left only one dynamic module load on the client side.


http://pastebin.com/UxyifLmx
http://pastebin.com/q3Tm6Ajd
http://pastebin.com/4emMDBS6

Still, it gives me headaches, because mw.loader.using( 'ext.jqgmap.edit',
function(...) ) does not actually execute the loaded module's code even
when the callback function is "fired". This means that the additional
mw.jqgmap prototypes defined in 'ext.jqgmap.edit' are not available when
the actual object instances are created in the function createMapControllers(),
which thus throws an error. When I add an explicit call to the dummy function
loadEditPrototypes() defined in 'ext.jqgmap.edit', the module is
executed, but the further script execution abruptly ends without any
error in the Chrome console.
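
(Roughly, the pattern in question looks like this; a sketch of the flow described above, not the actual extension code:)

mw.loader.using( 'ext.jqgmap.edit', function () {
    // the callback fires once ResourceLoader reports the module as ready,
    // but the extra prototypes only appear after the explicit call below
    loadEditPrototypes();
    createMapControllers();
} );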


To me it seems that ResourceLoader tries to be too smart, not actually
executing the code until just before the .using() callback. How does it figure out
that I defined the function loadEditPrototypes() in the 'ext.jqgmap.edit'
module? That kind of 'automagic' is tricky. Why not have something
like $wgAutoloadClasses but for ResourceLoader? Or even be able to
really execute a module on load? And why it stops the execution path is a
puzzle to me...


Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-18 Thread Dmitriy Sintsov

On 18.03.2012 18:26, Dmitriy Sintsov wrote:

Replaced the 'new MarkerController(...)' calls with 'new
mw.jqgmap.MarkerController(...)' calls left over from the incomplete refactoring
of the early working (non-broken) revision; however, the refactored separate view
/ edit module code still does not work, with the same JS code being
stopped without additional Chrome errors - so it is hard to guess what's
broken in my JS code.


Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Using prototypical inheritance in dynamically loaded ResourceLoader modules

2012-03-18 Thread Dmitriy Sintsov

Hi!
Can a JavaScript / ResourceLoader guru explain what's wrong with my code?

Module definitions at server side:
http://pastebin.com/8cmRbNxe

modules are loaded correctly by the following code:
http://pastebin.com/MFWk6znv
It has been checked many times during extension development; 'localBasePath'
and 'remoteExtPath' are set up correctly.


* By default, the module with the following code is loaded:
http://pastebin.com/0u96aMep

Then it loads either the 'ext.jqgmap.view' module or the 'ext.jqgmap.edit'
module, depending on data provided in the generated HTML page.


This is done for two reasons:
1. The editor code is larger, and when the page does not need the editor, it
loads only the viewer code, which is faster to load.

2. The editor modifies mw.jqgmap.MapController.prototype.createEditor
and in the future may also introduce or modify more prototypes. This
makes the code more modular.


However, I have a very big problem loading 'ext.jqgmap.view' from
'ext.jqgmap.edit':


mw.loader.using( 'ext.jqgmap.view', function() {
    console.log( mw.jqgmap );

When the callback function is executed, the Chrome console does not show
mw.jqgmap as being defined. So console.log(mw.jqgmap) returns
'undefined', and only when that happens does the actual execution of
'ext.jqgmap.view' begin. It seems that ResourceLoader loads the code
wrapped in an exception handler which tries to detect when the code has
to be loaded. That would be OK; however, when I execute the
'ext.jqgmap.view' code step by step:


if ( mw.jqgmap ) {
    return;
}

mw.jqgmap = (function () {

The first line,

if ( mw.jqgmap ) {

is executed a second time, once the mw.jqgmap value has been set with the closure
result. Then the further execution of the code just stops. No further errors appear
in the Chrome console. And mw.jqgmap is even correctly defined!


There used to be no if ( mw.jqgmap ) { return; } check originally; then
other code (Google Maps) complained about being included twice, which
may cause "unpredictable results".


How can I fix the loading chain so that it will correctly set up
mw.jqgmap.MapController and mw.jqgmap.MarkerController, so they will be
different in edit and view modes and available to use in the 'ext.jqgmap' code?


I also have the whole extension under development, however I haven't
committed it into svn yet - I don't know whether yet another map
extension is desirable there. Although there was no rule against
duplicate functionality in extensions?


ResourceLoader is a bit hard for me.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Auto-created gerrit accounts for users that had enough info in USERINFO

2012-03-17 Thread Dmitriy Sintsov

On 17.03.2012 22:04, Antoine Musso wrote:

Le 17/03/12 07:09, Dmitriy Sintsov wrote:

Is that only for core commit developers? I have svn account at
ques...@svn.wikimedia.org and valid USERINFO. However I cannot login to
the forementioned site with my svn.wikimedia.org credentials. Should I
ask to transfer my extensions into git repository first? I am inactive
in recent months due to personal problems however I still hope to have
some commits in the future.

Hello Dmitriy,

I am the one who provided a list of accounts to Ryan for automatic creation.

That list only included authors active during the last six months and
having an extension being migrated to git/Gerrit. That list of
extensions is available, for now, at:

   https://gerrit.wikimedia.org/mediawiki-extensions.txt


The reason I have filtered out some accounts is not to exclude people,
but merely because there is 0.99 chance that they will not have to use
Gerrit anytime soon.


Since over the last six months you have only been active on the
QPoll extension, your account was not included in the list I have
provided to Ryan.

Hope it clarifies the situation.

OK, I understand. I'll probably never review other people's code myself.
Still, I hope to have some of my extensions transferred to the git
repository, if that's possible with push access to master for them. There has
been an advantage in other developers suggesting improvements to my code.
But if that's too much, no problem. I just wonder about the future of the
svn repository: will commits to that old repository continue to
receive occasional observation / attention from developers?

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Auto-created gerrit accounts for users that had enough info in USERINFO

2012-03-17 Thread Dmitriy Sintsov

On 17.03.2012 14:16, Ryan Lane wrote:

My user account on MediaWiki.org is 'SVG' and my user account on
labsconsole.wikimedia.org is 'Tim Weyer'. Is it because I didn't tell
  my username?
In general, it's a great working script.


So, I was given a CSV and was asked to script linking accounts with
the info in the CSV. The script works perfectly. I can't attest to the
data ;).



I fixed the email address in my USERINFO, if that's still relevant.
https://svn.wikimedia.org/svnroot/mediawiki/USERINFO/questpc
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Auto-created gerrit accounts for users that had enough info in USERINFO

2012-03-16 Thread Dmitriy Sintsov

On 17.03.2012 2:24, Ryan Lane wrote:

I've ran a script to auto-create user accounts for users who had
enough information in their USERINFO file (user name, svn account
name, email address). If you are one of these lucky few (there were
only 28 that didn't already have accounts), then you can follow these
instructions to log in:

 https://labsconsole.wikimedia.org/wiki/Help:Access#Initial_log_in

- Ryan


Is that only for core commit developers? I have svn account at 
ques...@svn.wikimedia.org and valid USERINFO. However I cannot login to 
the forementioned site with my svn.wikimedia.org credentials. Should I 
ask to transfer my extensions into git repository first? I am inactive 
in recent months due to personal problems however I still hope to have 
some commits in the future.

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How to get the article object inside a function called by hook LinkEnd

2012-03-12 Thread Dmitriy Sintsov

* Thomas Gries  [Mon, 12 Mar 2012 07:57:05 +0100]:

Dmitriy,
the second argument ( $target ) is the link target (article object)
which is/has been processed by the linker,
not the article, on which this link was processed. This is, what I 

need.


It appears. that the whole Linker.php module does not refer to that,
because it is usually irrelevant
for forming links.


You may also hook to:
http://www.mediawiki.org/wiki/Manual:Hooks/BeforeInitialize
It has the context.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How to get the article object inside a function called by hook LinkEnd

2012-03-11 Thread Dmitriy Sintsov

* Thomas Gries  [Sun, 11 Mar 2012 22:38:10 +0100]:

Can someone help me, please:

I need the $article object (the article page where the LinkEnd hook is
called) inside
a function called by the LinkEnd hook,

 - because I want to modify some links in the hooked function,
but only on articles where the raw text contains some strings
in E:WikiArticleFeeds.



function onLinkEnd( $skin, Title $target, array $options, &$text, array &$attribs, &$ret ) {
    $article = new Article( $target, 0 );
}

Or do you need to get exactly $wgArticle, but without declaring it as a
global? Then I don't know how.


Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making the Lua/Javascript decision (Re: Performance roadmap update)

2011-12-25 Thread Dmitriy Sintsov
* Keisial  [Sun, 25 Dec 2011 23:16:46 +0100]:
> On 23.12.2011 18:21, Dmitriy Sintsov wrote:
> > I remember that PHP had some outdated and unmaintained sandboxing 
PECL
> > module, however it's unmaintained for a long time.
> > http://php.net/manual/en/runkit.sandbox.php
> > Dmitriy
>
> Dmitry Zenovich applied for maintaining it last year, although little
> seems to have been done in the official repo. He seems to have been
> working on https://github.com/zenovich/runkit/
> I worked with runkit some years ago and it wasn't hard to make it run.
> The downside is that you need a threaded php.
>
Wikimedia has experienced PHP/C developers who could probably maintain and
update such a module. Maybe even cross-translation from a subset of JS into
PHP (with caching), then executing it via runkit, could be possible. But
threading is not something that is desirable in the current environment. So
maybe it was a bad idea.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making the Lua/Javascript decision (Re: Performance roadmap update)

2011-12-23 Thread Dmitriy Sintsov
On 23.12.2011 18:30, Jay Ashworth wrote:
> This is a really critical point: if you're going to provide an 
> interpreted language to end-users from within a program that is, 
> itself, written in an interpreted language, *you cannot use the 
> underlying interpreter* to run the end-users' programs, unless that 
> interpreter has sandboxing built-in. If you try, you will almost 
> certainly be exposing yourself to critical security vulnerabilities. 
> You're almost *better* off picking a different language, so that 
> you're not tempted to try. Cheers, -- jra 
I remember that PHP had some outdated sandboxing PECL
module, however it has been unmaintained for a long time.
http://php.net/manual/en/runkit.sandbox.php
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making the Lua/Javascript decision (Re: Performance roadmap update)

2011-12-22 Thread Dmitriy Sintsov
* Victor Vasiliev  [Fri, 23 Dec 2011 09:46:46 +0400]:
> On Fri, Dec 23, 2011 at 9:26 AM, Dmitriy Sintsov 
> wrote:
> > Also it would be great if WikiScripts or Lua extension allowed to
> easily
> > bind functions / class wrappers for another MediaWiki extensions. 
The
> > extension I develop uses PHP eval for interpretation of the scripts, 
I
> > am looking for adaption of WikiScripts to use it in conjunction with
> my
> > extension.
> > Dmitriy
>
> Well, WikiScripts, if we decide to develop it seriously, will
> certainly have extension interface (it is on my large TODO list for
> that). Something tells me it would be possible to do in Lua as well.
> All we have to do now is to settle down on what way we choose.
>
Victor, what do you think about making the WikiScripts syntax more similar
to a subset of Lua or JavaScript syntax? That probably should not be
too hard? What about cross-translating to PHP and then eval()?
Lua is great; however, it's a bit strange to use two interpreters
(PHP + Lua) together. That limits hosting possibilities, and it's something
like using two similar screwdrivers for the same screw. It is a shame
PHP itself does not have a stable, good VM extension for executing
scripts.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making the Lua/Javascript decision (Re: Performance roadmap update)

2011-12-22 Thread Dmitriy Sintsov
* Owen Davis  [Thu, 22 Dec 2011 11:50:48 -0800]:
> I think that would be a nice feature to have.  Amazingly enough,
> WikiScripts is not one of the extensions that we have in our codebase at
> wikia (I think we're over 800+), but I will take a look at the
> implementation.  The  parser hook allows for parameters, but 
that's
> not the same.  Having access to normal template parameters would allow 
a
> lua template to act as a drop in replacement for an existing template,
> which is essential.
>
WikiScripts parses and executes its scripts via PHP, which has both
advantages and disadvantages. The Lua extension should run faster, however
it is not suitable for every hosting out there, because the PHP Lua module
is uncommon. I wonder whether it's possible to get around without the PHP
Lua extension, perhaps by daemonizing Lua and then exchanging data between
the apache/mod-php and lua processes via named pipes? It is really sad that
PHP itself does not have a stable "VM" allowing execution of a safe subset of
its own code in server-side scripts (it had only some beta or alpha
status module). I wonder whether it is possible to cross-translate the
subset of WikiScripts or Lua into PHP source, then have PHP eval it (however,
would that work with HipHop?).

Also it would be great if the WikiScripts or Lua extension allowed easily
binding functions / class wrappers for other MediaWiki extensions. The
extension I develop uses PHP eval for interpretation of the scripts; I
am looking at adapting WikiScripts to use it in conjunction with my
extension.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making the Lua/Javascript decision (Re: Performance roadmap update)

2011-12-21 Thread Dmitriy Sintsov
* Owen Davis  [Wed, 21 Dec 2011 19:18:46 + 
(UTC)]:
> It is littered with embedded HTML and string.format statements. Ugh.
> I'm going to look at something simple like Moustache (which has both
> JS and Lua implementations already) as a proof of concept.  Does
> anybody have a better suggestion?  I think Lua (or JS) by itself is 
not
> enough, there has to be a nice library of utility functions.  Has 
anyone
> thought about what that will really look like?
>
Also it would be great if the Lua binding could perform parser functions. I
believe that Extension:WikiScripts uses the parser frame and can use the current
template parameters, when available:
http://www.mediawiki.org/wiki/Extension:WikiScripts
Template library (tpl_) allows the script to access the parser and the 
parameters of the template that invokes the script.

tpl_arg( argname[, default] ) returns the argument which name is 
specified as an argument, or the default value if it is not set (if no 
default is specified, it returns false).
tpl_named_args() returns all the named arguments to the template.
tpl_numbered_args() returns all the numbered arguments.
tpl_is_transcluded() returns whether the code is invoked from a 
template.

Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making the Lua/Javascript decision (Re: Performance roadmap update)

2011-12-21 Thread Dmitriy Sintsov
* Owen Davis  [Wed, 21 Dec 2011 19:18:46 + 
(UTC)]:
> It is littered with embedded HTML and string.format statements. Ugh.
> I'm going to look at something simple like Moustache (which has both
> JS and Lua implementations already) as a proof of concept.  Does
> anybody have a better suggestion?  I think Lua (or JS) by itself is 
not
> enough, there has to be a nice library of utility functions.  Has 
anyone
> thought about what that will really look like?
>
My guess is that the scripts interpreter should have MediaWiki API bindings:
http://www.mediawiki.org/wiki/Api
and Wiki DOM bindings, either something like the already existing extension:
http://www.mediawiki.org/wiki/Extension:Wiki_Object_Model
or, even better, WikiDOM manipulation:
http://www.mediawiki.org/wiki/Visual_editor/WikiDom_Specification
when the PEG parser is stable enough.

Building an API wrapper around a Lua HTTP client is probably not enough,
because local API calls should be performed through FauxRequest (but still no
POST emulation?).
I don't know how PHP and Lua can bind to each other. I know a local Lua pro
working in a local university CS lab, however he does not work with
MediaWiki. There, Lua is used for macros and automation in C / C++ projects.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki is a living beast: please try to test code and commit it only when it works

2011-12-20 Thread Dmitriy Sintsov
* Brion Vibber  [Tue, 20 Dec 2011 14:46:07 
-0800]:
> MediaWiki is a mature, living breathing beast. When making lots and 
lots
> of
> little tweaks, please try to make sure they actually work correctly
> prior
> to committing.
>
> I'm becoming convinced that a huge benefit of the git migration will 
be
> in
> moving to pre-commit review. If it doesn't work, don't let it into to
> core
> yet... The very idea what we can have a big "review backlog" on things
> ALREADY COMMITTED is a large part of why we have so many regressions
> when
> we finally catch up and update.
>
> That is all; please go on about your business, citizens.
>
> 
>
Will pre-commit review work for non-Wikimedia extension commits, or
would that be impossible due to a shortage of reviewers? How will the
commit queue work for non-Wikimedia extensions?
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] LocalWiki released

2011-12-17 Thread Dmitriy Sintsov
On 17.12.2011 6:01, Philip Neustrom wrote:
>>> Page-content plugins will need editor plugins, yeah (like our "include
>>> page" plugin).  But I think that's a good thing.
>>>
>> How do these plugin work when there is no XML "injection" into page
>> content? How the final layout is being build?
> We take the HTML5 that's stored and, before rendering it, transform
> certain bits into functions.  Right now we go HTML5 ->  Django
> template, and turn certain specially-registered HTML bits into Django
> template tags (functions).  We can then cache the Django template.
> We're going to make registering a new page plugin easier (no HTML
> traversal) soon.
>
> Here's the docstring describing how we've got inside-page plugins working:
>
> Conversion of HTML into template with dynamic parts.
>
> We want to allow some dynamic content that gets inserted as the HTML is
> rendered. This is done by converting certain HTML tags into template tags.
> There are two mechanisms to do this: plugin handlers and tag handlers.
>
> Plugins are meant for inserting bits of dynamic content at specific places on
> the final rendered page marked by a placeholder element. The output can be
> whatever you like: widgets, bits of JavaScript, anything. Tag handlers, on the
> other hand, are meant for fixing up the HTML slightly and transparently, i.e.,
> fixing links and adding helpful visual styles and the like. When in doubt, use
> plugins.
>
> Plugin handlers work with HTML elements that have the class "plugin". When
> an element has the class "plugin", it will be passed to registered handlers
> based on the other classes it has.
>
> For example, the following element will be passed to the handler registered 
> for
> the "includepage" class:
>
> Include Navigation
>
> which will convert it to this:
>
> {% include_page "Navigation" %}
>
> to be rendered by the include_page template tag.
>
> Tag handlers work similarly, but they are applied by element tag instead of by
> class. They are best used for routine processing of content, such as styling.
>
> For example, to mark links to non-existant pages with a different style, this:
> My Page
> gets converted to this:
> {% link "My Page" %}My Page{% endlink %}
> and rendered as appropriate by the LinkNode class.
>
>
The {% %} syntax looks really "wiki-like". That's a kind of reverse logic
compared to MediaWiki, where parser functions are converted to HTML
output, not the opposite. What's really important is that parser functions can be
nested (and by using the proper parser frame, parser tags probably can be
nested as well - I haven't tried that yet but might try soon). Can
LocalWiki template tags be nested as well?
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] LocalWiki released

2011-12-15 Thread Dmitriy Sintsov
* Philip Neustrom  [Thu, 15 Dec 2011 14:50:32 
-0800]:
> Keep in mind that with our generic versioning framework we can version
> and diff any structured data, independent of wiki page content.  So
> there's probably no need to stuff extension invocation inside of page
> content.
>
Extension invocation inside page content allows building a very
flexible dynamic layout. One can mix the output of various extensions
(audio, video, forms) to create new functionality.

> Page-content plugins will need editor plugins, yeah (like our "include
> page" plugin).  But I think that's a good thing.
>
How do these plugins work when there is no XML "injection" into the page
content? How is the final layout being built?

> The concerns you raise are probably valid for Wikipedia.  We're
> working with new communities, for the most part.  (and our work isn't
> based on MediaWiki - not sure if I made that clear in my original
> message).
>
For the MediaWiki extension I develop, a visual editor will probably
multiply the amount of code by a factor of two. So, yes, the programmer's
life is much easier without that. However, you are right, end users will
prefer a visual editor.
Dmitriy

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

