Re: [Wikitech-l] Abandoning -1 code reviews automatically?

2014-04-13 Thread Marcin Cieslak
>> Tim Landscheidt  wrote:
>
> I'm all for abandoning changes when the author doesn't react
> and the patch doesn't apply anymore (not in a technical
> sense, but the patch's concept cannot be rebased to the cur-
> rent HEAD).  But forcing work on many just so that a metric
> can be easier calculated by one is putting the burden on the
> wrong side.

As somebody who contributes in development surges
(a week every three months or so), I think that
cleaning up statistics to make our code review
process look nicer is not the way to go.


What about instead automatically accepting a change
that has not been reviewed by anybody for two weeks
or so?


I agree that a -1 is practically a death penalty for a change.
That's not a positive development, because
even a mild -1 completely discourages anybody from posting
a positive review (I wonder how many +1 or neutral
comments were posted *after* one of the WMF reviewers
posted a -1).

Some examples from my own dashboard:

1) https://gerrit.wikimedia.org/r/#/c/99068/

Practically dead, although I completely disagree
with the -1 reviewer, as reflected in the comment afterwards.

2) https://gerrit.wikimedia.org/r/#/c/11562/

My favourite -1 here is "needs rebase".

In general, our review process somewhat discourages
incremental updating of patches (do we know
how many non-original submitters posted follow-up patchsets,
rather than just comments?).

This kind of review discourages refactoring and other
non-trivial changes. See "seems too complex" in example
#2 above.

Regarding Openstack policies: I'd say we should not follow them.

I used to be the #2 git-review contributor according to Launchpad
until recently. I gave up mainly because of my inability
to get a somewhat larger change into this relatively simple
script. For a nice example of this, please see

https://review.openstack.org/#/c/5720/

I gave up contributing to this project some time
after that; I have no time to play politics, submitting
a set of tiny changes and playing the rebase game depending
on the random order in which they might have been reviewed.

The next time I find time to improve Johnny the casual
developer's experience with gerrit, I will just rewrite
git-review from scratch. The amount of red tape
openstack-infra has built around their projects is
simply not justifiable for such a simple utility
as git-review. Time will tell if gerrit-based
projects generally fare better than others.


//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Working around composer? (Fatal error: Class 'Cdb\Reader' not found)

2015-01-13 Thread Marcin Cieslak
I am kind of late to the party, but I have upgraded one of
my throwaway development wikis with the usual
"git remote update && git merge && php maintenance/update.php" process,
and after the above succeeded I was nevertheless greeted by:

Fatal error:  Class 'Cdb\Reader' not found 

exception coming out of includes/cache/LocalisationCache.php on line 1263

It seems that I just forgot to update the "vendor" directory
requirement (I am somewhat reluctant to run composer due to
allow_url_fopen=1).

Would it be reasonable to add some basic external-library
checks to update.php to remind users to update those core
components before accessing the wiki?
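
Something as simple as this near the top of update.php would have
helped me (just a sketch; the class name is taken from my fatal
error above, everything else is made up):

  if ( !class_exists( 'Cdb\Reader' ) ) {
      print "External libraries are missing or outdated; please update\n" .
          "the 'vendor' directory (or run composer) and try again.\n";
      exit( 1 );
  }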

Btw. I think the UPGRADE doc does not (yet) mention the new process.

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Exceptions, return false/null, and other error handling possibilities.

2013-10-09 Thread Marcin Cieslak
>> Erik Bernhardson  wrote:
> Moving forward, the Flow team is considering using a php implementation
> that follows the ideas of the haskell Maybe monad(
> https://github.com/schmittjoh/php-option ).  This is, in concept, rather
> similar to the Status class the wikitext parser returns. We would like to
> use this library as a way to more explicitly handle error situations and
> reduce the occurrences of forgetting to check false/null.  This particular
> pattern is very common in Functional languages.

I don't think exceptions are evil; they are more structured gotos,
and goto can be used properly to handle errors.

The Status class has a problem similar to unchecked exceptions: you
never know which exception might come, and likewise you never know
what kind of Status you might get back from the code called.

However, exceptions can form a hierarchy, which allows the
caller to react selectively to the class of problems we have.

I was recently struggling with this piece of MediaWiki
code, which interprets the values returned by internalAttemptSave():

(from EditPage.php)

  // FIXME: once the interface for internalAttemptSave() is made nicer,
  // this should use the message in $status
  if ( $status->value == self::AS_SUCCESS_UPDATE
      || $status->value == self::AS_SUCCESS_NEW_ARTICLE
  ) {
      $this->didSave = true;
      if ( !$resultDetails['nullEdit'] ) {
          $this->setPostEditCookie();
      }
  }

  switch ( $status->value ) {
      case self::AS_HOOK_ERROR_EXPECTED:
      case self::AS_CONTENT_TOO_BIG:
      case self::AS_ARTICLE_WAS_DELETED:
      case self::AS_CONFLICT_DETECTED:
      case self::AS_SUMMARY_NEEDED:
      case self::AS_TEXTBOX_EMPTY:
      case self::AS_MAX_ARTICLE_SIZE_EXCEEDED:
      case self::AS_END:
          return true;

      case self::AS_HOOK_ERROR:
          return false;

      case self::AS_PARSE_ERROR:
          $wgOut->addWikiText( '<div class="error">' . $status->getWikiText() . '</div>' );
          return true;

      case self::AS_SUCCESS_NEW_ARTICLE:
          $query = $resultDetails['redirect'] ? 'redirect=no' : '';
          $anchor = isset( $resultDetails['sectionanchor'] ) ? $resultDetails['sectionanchor'] : '';
          $wgOut->redirect( $this->mTitle->getFullURL( $query ) . $anchor );
          return false;

This code is replicated in ApiEditPage.php, but in a somewhat different way.

I wanted to re-use this logic in the Collection extension
(which sometimes creates pages in bulk on behalf
of the user) and I really wished error reporting
were done with exceptions. At the top level they
could be handled the EditPage.php way, the API would
return exception objects instead, and other
extensions could selectively handle some values
and ignore others - for example, I would be happy
to get the standard EditPage.php behaviour
for most of the errors I am not particularly
interested in.
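
To illustrate (a sketch with made-up class and method names, not
actual MediaWiki code): if the save errors formed an exception
hierarchy, a caller could pick out just the cases it cares about
and let the rest bubble up:

  class EditError extends Exception {}
  class EditConflictError extends EditError {}   // cf. AS_CONFLICT_DETECTED
  class SummaryNeededError extends EditError {}  // cf. AS_SUMMARY_NEEDED

  try {
      $editPage->attemptSaveOrThrow();           // hypothetical API
  } catch ( EditConflictError $e ) {
      // the one case my extension wants to handle itself
      $this->resolveConflict( $e );
  } catch ( EditError $e ) {
      // everything else: fall back to the standard EditPage.php handling
      $this->showStandardErrorPage( $e );
  }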

Regarding the use of php-option:

php-option seems to be a handy way to provide substitute
objects in case of errors; I think this case
does not come up very often and is arguably better
handled by factory methods, i.e. a method
responsible for delivering an instance
of some class. A factory method can
be defined once for a particular orElse/getOrElse
situation; it is also easier to make sure
that the instances it delivers offer
the same (or a similar) interface.
And factory methods can be overridden
if some particular implementation
needs to create its fallback objects differently.

Instead of having:

  return $this->findSomeEntity()->getOrElse(new Entity());

One could have:

  interface StuffNeededFromEntity {
/* what Entity() really provides */
  }
  
  class Entity implements StuffNeededFromEntity {
  }

  class FallbackEntity implements StuffNeededFromEntity {
  }
  

  /*...*/

  /* In the code trying to findSomeEntity */

  /* @return StuffNeededFromEntity */
  protected function entityFactory() {
      $entity = $this->findSomeEntity();
      if ( null === $entity ) {
          return new FallbackEntity();
      }
      return $entity;
  }

  /* replace return $this->findSomeEntity()->getOrElse(new Entity()); */
  return $this->entityFactory();

[The use of interface/implements above is for clarity only,
 one does not actually need to use those features]

I agree with the php-option author that if ($x === null) boilerplate
should be avoided, but in my opinion the solution is
to do the logic once and abstract it properly for re-use,
not to hide it in some syntactic sugar instead.
Imagine what happens if we need some constructor
parameters for new Entity() - instead of updating
entityFactory once, one needs to go through all
those "orElse" cases again.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [IRC] "hiring" more wm-bot operators

2013-10-11 Thread Marcin Cieslak
>> Petr Bena  wrote:
> Hey all,
>
> Some of you may know our belowed robot, which is working as a slave in
> some of our dev channels. Unfortunately, freenode as well as wikimedia
> labs is a bit unstable, when it comes to network connectivity. So both
> freenode servers as well as internet connectivity of labs are
> occasionally down. This is causing some troubles to wm-bot who isn't
> really able to reconnect, given to laziness of its developers as well
> as complexity of multiple bouncers it is using.

Petr,

I was running a couple of recentchanges minibots (based on the
UDP logging -> urdrec.c -> Python IRC module) pretty reliably.

The code is here: https://bitbucket.org/plwiki/bot/src/ (the "irc" module),
but of course I am happy to help with hosting, reducing complexity,
and finally getting my hands on C# if needed.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Subclassing User?

2013-10-27 Thread Marcin Cieslak
Hi,

given that there are some extensions which perform edits/actions
automatically (not directly as a result of a user request),

I was wondering: has anyone attempted, or succeeded at, subclassing
"User"?

There are some places where the name of this class is hardcoded.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Subclassing User?

2013-10-27 Thread Marcin Cieslak
>> Brion Vibber  wrote:
> Generally I would not recommend subclassing User; while you can certainly
> create such a subclass it will have limited utility as you can't really
> control how they get created easily.
>
> Like the rest of MediaWiki, the User class is intended to be customized
> through extension hooks... What sort of behavior customization are you
> thinking of doing?

Some example:

https://gerrit.wikimedia.org/r/#/c/92252/

needs https://gerrit.wikimedia.org/r/#/c/92179/ in core,
that gives some method to override.

Surprisingly, it even works (rc_ip will be set to "" on
AbuseFilter blocks).

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Subclassing User?

2013-10-27 Thread Marcin Cieslak
>> Daniel Friesen  wrote:
> On 2013-10-27 2:45 PM, Marcin Cieslak wrote:
>> Some example:
>>
>> https://gerrit.wikimedia.org/r/#/c/92252/
>>
>> needs https://gerrit.wikimedia.org/r/#/c/92179/ in core,
>> that gives some method to override.
>>
>> Surprisinly, it even works (rc_ip will be set to "" on
>> AbuseFilter blocks).
>>
>> //Saper

> Could you explain why a whole subclass of user is needed. From what I'm
> seeing there's little need for an actual class. And a whole lot of what
> looks methods copied from core and then perhaps only slightly modified
> (ie: non-DRY).

The newFrom* methods and friends needed to be copied over because
they create instances of "User" and not of the derivative class
(to fix this, a factory method would be needed to replace
"new User" in those methods).
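
A sketch of what I mean (hypothetical and simplified, not the
current core code) - with PHP 5.3 late static binding a factory
method would honour subclasses:

  public static function newFromName( $name ) {
      // core does "$u = new User;" here, which pins the class;
      // "new static()" creates the calling subclass instead
      $u = new static();
      $u->mName = $name;
      return $u;
  }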

The only (experimental) reason for now is the little getUserIP.
This is just a proof of concept, to see if such subclassing has
any chance of working.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Subclassing User?

2013-10-28 Thread Marcin Cieslak
>> Brion Vibber  wrote:
> I think a better way to go is to add a hook point in
> RecentChange::checkIPaddress()... I don't like mixing more session-related
> stuff into User like a getUserIP method.

Brion and everyone,

thanks for your time and your insight. I will try to hold my
object-oriented horses next time :)

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Memento Extension for MediaWiki: Advice on Further Development

2013-11-01 Thread Marcin Cieslak
>> Shawn Jones  wrote:

> 1.  The Memento protocol has a resource called a TimeMap [1]
> that takes an article name and returns text formatted as
> application/link-format.  This text contains a machine-readable
> list of all of the prior revisions (mementos) of this page.  It is
> currently implemented as a SpecialPage which can be accessed like
> http://www.example.com/index.php/Special:TimeMap/Article_Name.
> Is this the best method, or is it more preferable for us to
> extend the Action class and add a new action to $wgActions
> in order to return a TimeMap from the regular page like
> http://www.example.com/index.php?title=Article_Name&action=gettimemap
> without using the SpecialPage?  Is there another preferred way of
> solving this problem?

It just occurred to me that if TimeMap were a microformat, this
information could be embedded into ?title=Article_Name&action=history
itself.

Even then, if we need an additional MIME type for that, maybe
we could vary the action=history response based on the desired MIME
type (text/html or the link format).
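
A rough sketch of the idea (hypothetical code, not an actual
MediaWiki hook) - the history action could branch on the Accept
header:

  $accept = isset( $_SERVER['HTTP_ACCEPT'] ) ? $_SERVER['HTTP_ACCEPT'] : '';
  if ( strpos( $accept, 'application/link-format' ) !== false ) {
      header( 'Content-Type: application/link-format' );
      // ... emit the machine-readable TimeMap for this title
  } else {
      // ... render the usual HTML history page
  }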

> 3.  In order to create the correct headers for use with the Memento
> protocol, we have to generate URIs.  To accomplish this, we use the
> $wgServer global variable (through a layer of abstraction); how do we
> correctly handle situations if it isn't set by the installation?  Is
> there an alternative?  Is there a better way to construct URIs?

We have wfExpandUrl (yes, there are currently some bugs wrt an empty
$wgServer: https://bugzilla.wikimedia.org/show_bug.cgi?id=54950).
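
For example (if I remember the signature correctly):

  $uri = wfExpandUrl( '/w/index.php?title=Article_Name', PROTO_CANONICAL );
  // e.g. https://www.example.com/w/index.php?title=Article_Name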

> 5.  Is there a way to get previous revisions of embedded content, like
> images?  I tried using the ImageBeforeProduceHTML hook, but found that
> setting the $time parameter didn't return a previous revision of an
> image.  Am I doing something wrong?  Is there a better way?

I'm not in a position to give you a full answer, but what I would
do is try to set up a MediaWiki with $wgInstantCommons = true
and see how I can make ForeignAPIRepo fetch older revisions
from Wikimedia via the API. Then we can have a look at other media
storage backends, including those used by the WMF installation.

> 7.  We have two styles for serving pages back to the user:
>* 302-style[2], which uses a 302 redirect to tell the user's browser 
> to go fetch the old revision of the page (e.g. 
> http://www.example.com/index.php?title=Article&oldid=12345)
>* 200-style[3], which actually modifies the page content in place so 
> that it resembles the old revision of the page
>  Which of these styles is preferable as a default?

I guess that 302 is better; it sounds like a much better idea
to me due to caching.

> 8.  Some sites don't wish to have their past Talk/Discussion pages
> accessible via Memento.  We have the ability to exclude namespaces
> (Talk, Template, Category, etc.) via configurable option.  By default
> it excludes nothing.  What namespaces should be excluded by default?

There might be interesting issues around deleted content: some
people feel very strongly about making it unavailable to others
(partly due to legal issues); some people set up wikis dedicated
to providing content deleted from Wikipedia. Are you sure history
should not be redacted at times? :-)

I am not sure why somebody would not like Talk pages being archived
like this, but I think this feature could be enabled per-namespace
like many others in MediaWiki. Archiving media and files will
certainly be different, and you will run into interesting issues
with versioning Categories and Templates. Extension:FlaggedRevs has
a method to track what kind of ancillary content has been modified
(FRInclusionManager.php and FRInclusionCache.php might be things
to look at).

And a question back to you:

How are you going to handle versioning of stuff like MediaWiki:Common.js,
MediaWiki:Common.css independently of the proper content itself?
Some changes might affect presentation of the content meaningfully,
for example see how https://en.wikipedia.org/wiki/Template:Nts works.

In case you don't know already, PediaPress developed a generator
of static documents out of wiki content (http://code.pediapress.com/,
see Extension:Collection) and they had to deal with lots of
similar issues in their renderer, mwlib. The renderer accesses
the wiki as a client and fetches all ancillary content as needed.


//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New Bugzilla users have restricted accounts

2013-11-07 Thread Marcin Cieslak
> Let's look at the github model -- there's no assignment at all.  I just
> file a bug, maybe make some comments on it to say I'm working on it, and
> some time later I submit a pull request referencing the bug and saying, "I
> fixed it".  That seems to work fine for collaboration, and offers no
> roadblocks.

GitHub issues are owned by whoever submitted them (and the project
owner). You can't, for example, convert an issue to a pull request
if you are a third party.

You can always reference an issue in a commit or a comment, though.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Operations buy in on Architecture of mwlib Replacement

2013-11-13 Thread Marcin Cieslak
>> Matthew Walker  wrote:
> [1 ]https://www.mediawiki.org/wiki/PDF_rendering/Architecture

I think requirement number one is that Jimmy the casual MediaWiki
user should be able to install his own renderer without replicating
the WMF infrastructure:

https://www.mediawiki.org/wiki/Talk:PDF_rendering/Architecture#Simple_set_up_for_casual_MediaWiki_users_35545

//Saper



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New Bugzilla users have restricted accounts

2013-11-13 Thread Marcin Cieslak
>> Andre Klapper  wrote:
> I don't know your specific usecase - maybe the shared saved search named
> ""My" CC'd Bugs" might work (or not) which you could enable on
> https://bugzilla.wikimedia.org/userprefs.cgi?tab=saved-searches (see
> http://blogs.gnome.org/aklapper/2013/07/12/bugzillatips-saved-searches/
> for general info on saved searches and sharing them with other users).

I've been using an "i-am-on-cc" filter (now shared) similar to this one
with great success to find stuff I am working on or interested in.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Pywikipediabot] Using the content of a file as input for articles

2013-12-01 Thread Marcin Cieslak
>> Mathieu Stumpf  wrote:
> Hello,
>
> I want to add esperanto words to fr.wiktionary using as input a file
> where each line have the format "word:the fine definition". So I copied
> the basic.py, and started hacking it to achieve my goal.
>
> Now, it's seems like the -file argument expect a file where each line is
> formated as "[[Article name]]". Of course I can just create a second
> input file, and read both in parallel, so I feed the genFactory with the
> further, and use the second to build the wiktionary entry. But maybe you
> could give me a hint on how can I write a generator that can feed a
> pagegenerators.GeneratorFactory() without creating a "miror file" and
> without loading the whole file in the main memory.

I think that the secret sauce for making a working generator is the
Python "yield" keyword. I will try to provide a working example later.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Pywikipediabot] Using the content of a file as input for articles

2013-12-01 Thread Marcin Cieslak
>> Mathieu Stumpf  wrote:
> Hello,
>
> I want to add esperanto words to fr.wiktionary using as input a file
> where each line have the format "word:the fine definition". So I copied
> the basic.py, and started hacking it to achieve my goal.
>
> Now, it's seems like the -file argument expect a file where each line is
> formated as "[[Article name]]". Of course I can just create a second
> input file, and read both in parallel, so I feed the genFactory with the
> further, and use the second to build the wiktionary entry. But maybe you
> could give me a hint on how can I write a generator that can feed a
> pagegenerators.GeneratorFactory() without creating a "miror file" and
> without loading the whole file in the main memory.

All "pagegenerators" return only a series of Page objects and nothing else;
they are useful to create just a list of pages to work on.

I wrote a very simple mini-bot using a different kind of generator,
one that feeds the bot with both the page name and the content.

You can download the code from Gerrit:

https://gerrit.wikimedia.org/r/98457

You should run it like this:

python onelinecontent.py -simulate -contentfile:somecontent

where "somecontent" contains:

A:Test one line
B:Second line

Hope that provides some starting point for you,

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Uninstalling hooks for tests?

2013-12-05 Thread Marcin Cieslak
I am not very happy about this, but we have come to a case
where it might be useful to explicitly uninstall some
hook(s) for our unit tests.

You might want to check out MediaWikiTestCase::uninstallHook:

https://gerrit.wikimedia.org/r/#/c/99349/
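
A minimal usage sketch (assuming the interface proposed in that
change; the hook and test names are made up):

  class SomeCoreBehaviourTest extends MediaWikiTestCase {
      public function testWithoutExtensionHook() {
          // detach the extension's handler for this test only
          $this->uninstallHook( 'ArticleSaveComplete' );
          // ... exercise core behaviour without the extension reacting
      }
  }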

I am not happy about blurring the differences between unit
and integration testing, but breaking core with extensions
and vice versa is sometimes useful.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mailing list etiquette and trolling [Bugzilla etiquette]

2013-12-22 Thread Marcin Cieslak
>> Chad  wrote:
> On Wed, Dec 11, 2013 at 3:50 PM, Isarra Yos  wrote:
>
>> On 11/12/13 23:28, Petr Bena wrote:
>>
>>> I think we need to use less rules and more common sense.
>>>
>>>  This.
>>
>>
> Rules are silly. Common sense for all :)

Yeah, and at this very moment we are getting a "Bugzilla etiquette"[1]
instead of improving a plain-text explanation for bug
submitters of how our process works.


//Saper

[1] https://www.mediawiki.org/wiki/Bug_management/Bugzilla_etiquette



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Cry for help from translatewiki.net

2012-05-21 Thread Marcin Cieslak
> Some extension commits go past Gerrit code review. This means that it
> is impossible to even get notifications on those extensions. Some of
> those extensions are in use at translatewiki.net and given the
> numerous breakages related to those extensions lately, I am seriously
> considering removing those extensions from translatewiki.net until
> this issue is solved. That is bye bye to maps showing our registered
> translators around the world.

I think that having commits to extensions unreviewed is a good thing
for their maintainers. Auto-approval looks like a workaround.

I have a question here, since I am not sure I fully understand the problem:
how are you getting notifications that i18n files have changed?
From gerrit or git? I think it should be the latter (some git hook).

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] git review version update, May 2012

2012-05-27 Thread Marcin Cieslak
>> Platonides  wrote:
> On 26/05/12 20:02, Amir E. Aharoni wrote:
>> `git review' says that "a new version of git-review is availble on PyPI".
>> 
>> The last update created some unwanted surprises, so I decided to avoid
>> updating it for now. What do our Git experts suggest?
>> 
>> Thank you,
>
> It has been telling me that for a long time.
> Yet the proposed command to update (pip install -U git-review)
> dies with exec(open("git-review", "r"))
> TypeError: exec() arg 1 must be a string, bytes or code object
>
> So well, I just ignore it :)

You can try to download it by hand from

http://pypi.python.org/pypi/git-review#downloads

it's just one Python script.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] git review version update, May 2012

2012-05-30 Thread Marcin Cieslak
>> Raimond Spekking  wrote:
> os.chmod(target_file, os.path.stat.S_IREAD | os.path.stat.S_IEXEC)
> WindowsError: [Error 2] Das System kann die angegebene Datei nicht
> finden: '.git\\hooks\\commit-msg'
>
>
> (the last sentence means: "The system cannot find the file
> '.git\\hooks\\commit-msg'")
>
> Any ideas how to fix this?

Maybe it is related to the fact that if git submodules are used,
and a relatively new git is used, and the moon is waning, there might
no longer be a ".git" subdirectory in the submodule (i.e. the extension),
but only one two levels below.

I think aaronsw fixed this upstream with this change:

https://review.openstack.org/#/c/7166/

(it got merged, and should be included in 1.17, the latest git-review release).

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Gerrit question: pushing to another branch

2012-05-30 Thread Marcin Cieslak
>> Ryan Kaldari  wrote:
> How do you create the new branch on gerrit?

In Gerrit Web UI:

Admin -> Projects -> (choose project) -> Branches -> Create branch

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Gerrit question: pushing to another branch

2012-06-05 Thread Marcin Cieslak
>> Bergi  wrote:
> Doesn't Git encourage us to create as many branches as we can, to share 
> our work and collaborate? Or should I publish my branch(es) somewhere 
> else, maybe without gerrit at all?

Sorry to say this, and many people here might disagree:

 Forget 80% of git's versatility when working with gerrit.  No "git merge"
 (other than fast-forward), no "git cherry-pick" (only -n), and branches
 are rarely useful for more than a local marker for a commit.

You are supposed to push one perfect commit for review; you might
also push several with dependencies between them, but then things get
nasty.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] feedback from a gerrit newbie: gerrit questions; improvement of the Git/Tutorial;

2012-06-05 Thread Marcin Cieslak
>> Thomas Gries  wrote:

> In the Tutorial, git review -R is suggested to be used.
>
> Hashar showed me git review -f
> (documented on
> https://labsconsole.wikimedia.org/wiki/Git-review#Full_feature_branch_wor=
> kflow_with_git-review
> )
>
> I suggest that the tutorial uses -f this (instead of -R). Please can you
> experts think what's is better suited, or if both should be covered in
> Tutorial ?

Everything is fine as long as you don't need to --amend the commit later
for some reason.

The more we get into this the more I regret we recommended using git-review
in the first place ;)

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Give create gerrit repo right to all WMF engineers

2012-06-09 Thread Marcin Cieslak
>> Lars Aronsson  wrote:
> On 2012-06-06 00:19, Diederik van Liere wrote:
>> A workflow where engineers have to bug a Gerrit admin to do something 
>> is a broken workflow:
>
> As something of an outsider/newcomer, I hear two very different
> stories. The first is the story of all the good reasons why
> Linus Torvalds created git, how it is fully decentralized and
> asynchronous, and how bad it was to work with SVN. The other
> story is gerrit, and how everything must now go through this
> bottleneck of new centralization. There's a conflict here, that
> needs to be sorted out. Does Linus Torvalds really use gerrit?

No, he does not. He uses an email workflow to manage patches.

Gerrit tries to do something a bit contrary to the original
git philosophy: it tries to manage commits (trees of files)
as patches (changes to the code), and it encourages
developers to work one perfect commit at a time instead
of on a "feature branch".

I am not saying it's a bad or impossible workflow, but
it seems to be a bitter disappointment for people coming
from a different background (say, GitHub-like pull requests).

I would say gerrit puts a cap on a typical git workflow.
Hey, it's even difficult to review and approve changes
off-line.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Update on IPv6

2012-06-10 Thread Marcin Cieslak
>> Anthony  wrote:
> On Sat, Jun 9, 2012 at 4:29 PM, Anthony  wrote:
>> On Fri, Jun 8, 2012 at 9:59 AM, Strainu  wrote:
>>> 2012/6/8 Anthony :
 No one has to break the loop.  The loop will break itself.  Either
 enough people will get sick of NAT to cause demand for IPv6, or they
 won't.
>>>
>>> That one way of seeing things, but I fear it's a bit simplistic and
>>> naive. People won't "get sick of NAT", since most of them don't know
>>> what NAT is anyway. They'll just notice that "the speed sucks" or that
>>> they can't edit Wikipedia because their public IP was blocked. But
>>> they won't know IPv6 is (part of) the solution unless someone tells
>>> them to, by events like the IPv6 day.
>>
>> Or by the ISP which provides IPv6 advertising those faster speeds or
>> decreased privacy.
>
> Here at BestISP, we assign you a unique number that you can never
> change!  We attach this unique number to all your Internet
> communications, so that every time you go back to a website, that
> website knows they're dealing with the same person.
>
> Switch to BestISP!  1% faster communications, and the increased
> ability for websites to track you!

There are numerous reasons to have fixed IPv6 addresses per
connection. For example, I have right now around 6 devices supporting
IPv6 at home and I do connect between them internally (for example, one
of them is a printer - my laptop prints on my printer no matter whether it
is at home or somewhere else, provided it has IPv6). You *DON'T* want to
renumber your whole home network every time your ISP changes your IPv6
prefix.

Just because some people got away with the stuff they do on the Internet
because their ISP changed their IPv4 address every so often does
not mean that a dynamic IPv4 address provides *any* privacy.

I could argue that the current scheme with dynamic IPv4 provides less
privacy for the user in the long term.  One of the reasons is that it is
difficult to run your own infrastructure (like a mail server or web server)
on one's own residential connection, so you have to rely on external
(called "cloud" today) providers, with the obvious privacy
consequences.

The whole point of IPv6 is to give you the choice not to use external
providers - you become part of the "cloud", not just a dumb consumer.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Some old proposed changes in Gerrit waiting merge, after a code review.

2012-06-14 Thread Marcin Cieslak
>> Sébastien Santoro  wrote:
> Hi,
>
> I saw this morning those reviewed but not merged code changes in gerrit:
>
> Parser issue for HTML definition list
> Bug 11748: Handle optionally-closed HTML tags without tidy
>   2012-04-17
>   Owner: GWicke
>   Review: +1 by saper
>   https://bugzilla.wikimedia.org/11748
>   https://gerrit.wikimedia.org/r/#/c/5174/

I have just rebased this one to the current master.

https://gerrit.wikimedia.org/r/#/c/5174/

git fetch https://gerrit.wikimedia.org/r/mediawiki/core refs/changes/74/5174/3 && git checkout FETCH_HEAD

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Planning for the future: prepare high resolution icons and images in your code

2012-06-20 Thread Marcin Cieslak
>> Brion Vibber  wrote:

> * INCLUDE THE SVG IN SOURCE CONTROL!

(...)

> We'll develop some best practices about how to switch in high-res versions
> and whether it's better to use the SVGs or double-sized PNG rasterizations.
> You don't need to use them now, just make sure the images are there when
> we're ready to use them.

One of the possible solutions (certainly not a sufficient one) is to ask
browsers to send us image/svg+xml in their HTTP Accept: line.

Relevant Mozilla bug:

https://bugzilla.mozilla.org/show_bug.cgi?id=240493

was WONTFIXed in January 2012 after 8 years of discussion (a good read!)

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] bot activity in #mediawiki on freenode

2012-06-22 Thread Marcin Cieslak
>> Brandon Harris  wrote:
>
>
>   Please move the bots out.  

I like the bots. I've taken care of some bugs or code reviews only
because I saw them on IRC.

+1 for flood protection and thanks for silencing l10n

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Today's git master

2012-06-29 Thread Marcin Cieslak
$ git reset --hard
HEAD is now at de13c31 Actually we have many contributors
$

Chad, you made my day :)

//Saper



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Barkeep code review tool

2012-06-29 Thread Marcin Cieslak
As seen on IRC:

https://github.com/ooyala/barkeep/wiki/Comparing-Barkeep-to-other-code-review-tools

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Upstream keyword in bugzilla

2012-07-02 Thread Marcin Cieslak
>> Niklas Laxström  wrote:
> On 2 July 2012 17:55, Chad  wrote:
>> On Mon, Jul 2, 2012 at 10:45 AM, Niklas Laxström
>>  wrote:
>>> I think the upstream keyword in bugzilla is useless. Can we replace it
>>> with a field which takes an url to the upstream bug report?
>>>
>>
>> When I've filed an upstream but, I usually put it in the URL
>> field. Does that not work?
>
> If you can keep the keyword and URL in sync. Quick search [1] confirms
> my concerns. There seems to be only few cases where URL is used to
> indicate where the bug happens, so conflicts on there are rare. But
> the problem remains.

I just used that keyword on https://bugzilla.wikimedia.org/show_bug.cgi?id=38114
to say "can be fixed upstream, no bug filed yet" and, in this case,
"not a local customization or configuration change".

The real bug URL should be added when one is filed.

There is another small problem with the "See also" field: it currently
accepts a pretty limited set of bugtrackers.

See the discussions under https://bugzilla.mozilla.org/show_bug.cgi?id=577847
and https://bugzilla.mozilla.org/show_bug.cgi?id=735196, for example,
for how this is handled within current Bugzilla development.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] potential network issue due to packet losses

2012-07-03 Thread Marcin Cieslak
>> Leslie Carr  wrote:
> When in a firewall filter, packets are rejected (which sends an ICMP
> rejected notice), the routing engine can receive too many of these
> requests, causing the routing engine to "choke" on its backlog of
> requests. 

Leslie, thanks for the excellent update! Was it something similar to an ICMP
storm caused by unreachables (similar to the problems caused by
subnet-directed packets in the old days), such that even ICMP rate limiting
didn't help?

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Maintaining (and thus editing) SVGs' text like wiki-text pages

2012-07-15 Thread Marcin Cieslak
>> Achim Flammenkamp  wrote:
>
> My aim is the have on wikimedia/wikipedia (/wiki-whatever sounds apropriate)
> 1) a version-control environment (as we have for artcile-, talk-, user-,
>  category-, template- ... etc (text-)namespace), because it is IMHO apropriate
>  for each (huamn-readable) textual namespace. 
> SVG has a (I guess historcial) exception, because it was new and was sorted in
> like (bitmap-) graphics (JPEG/PNG or what ever exists only on wikipedia)
> (badly classified IMHO).

I don't think the version control we offer with article revisions
is a proper one for any kind of XML, including SVG. For git fanbois:
yes, git does not solve that either.

The problem is that you have to think in terms of nodes, their attributes
and contents and not their textual form. I pretty often add/remove
spaces and newlines when editing XML by hand for clarity; that should
not be versioned as this does not change the substance.
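
For example (a sketch using plain PHP DOM, error handling omitted),
two files that differ only in insignificant whitespace could be
compared as node trees rather than as text:

  $a = new DOMDocument();
  $a->preserveWhiteSpace = false;
  $a->load( 'one.svg' );

  $b = new DOMDocument();
  $b->preserveWhiteSpace = false;
  $b->load( 'two.svg' );

  // canonical forms match if only formatting whitespace differs
  var_dump( $a->C14N() === $b->C14N() );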

I am editing SVG files by hand pretty often (using vi for simple things
and XMLmind's xxe for more complex stuff) to fix various problems
reported by users, like missing namespaces, wrong CSS, etc.

But I wouldn't really want to do that within some textarea
interface in MediaWiki. Maybe, for the purpose of educating
users, there should be some way to pretty-print the XML source
of an SVG file - but unless there is a decent XML node editor,
I don't think this is something we need right now.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] [[Template:WikimediaDownload]]

2012-07-15 Thread Marcin Cieslak
Due to various requests popping up on IRC, I sometimes visit
some random MediaWiki extensions. Today's pick was

https://www.mediawiki.org/wiki/Extension:WikiForum

The infobox says it is available on Git - I remember
the old version gave pointers to both SVN and Git.

The Git links there give a 404, as the extension still
rests in

http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/WikiForum/

Now that ExtensionDistributor works nicely with git,
shouldn't we have {{WikimediaGit}} and {{WikimediaSVN}} templates,
or let {{WikimediaDownload}} point to SVN where necessary (for older
releases or something)?

I don't fully follow what's happening there and what's the plan...
anyone?

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-18 Thread Marcin Cieslak
>> Rob Lanphier  wrote:
> It would appear from reading this page that the only alternative to
> Gerrit that has a serious following is GitHub.  Is that the case?
> Personally, it seems like Phabricator or Barkeep has the best chance
> of dislodging Gerrit, but those won't probably get serious
> consideration without a champion pushing them.

Maybe this is because of the discussion format which was framed
as "for and against" Gerrit.

Is it possible to set up a copy of Phabricator in labs? What
is needed to help with this?

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-18 Thread Marcin Cieslak
>> Daniel Friesen  wrote:
> On Wed, 18 Jul 2012 11:41:06 -0700, Ryan Lane  wrote:
>
>> On Wed, Jul 18, 2012 at 1:18 PM, Petr Bena  wrote:
>>> What about changing gerrit to our needs? It's open source, I suppose.
>>>
>>
>> That's of course the route we're going right now. The biggest hurdle
>> to doing so is that it's written in Java, and Prolog. Neither of those
>> languages are amazingly difficult, though, so I don't really see that
>> as a complete blocker.
>>
>> - Ryan
>
> The blocker for me was not the language, but the codebase. I wanted to  
> make a small tweak to Gerrit so I started looking through the code. And I  
> had absolutely no clue where to find what I was looking for. I couldn't  
> make sense of what most of what I found was even supposed to do. And  
> people have pointed out a number of issues with Gerrit like the way it  
> handles output and css which feel much more like fundamental (ie:  
> unfixable without practically rewriting) issues with Gerrit.

I got used to it. It's completely different Java if one is used
to old-skool Java programming. Components are decoupled with the
Guice framework (for "dependency injection" -
http://c2.com/cgi/wiki?DependencyInjection),
plus there is Google Web Toolkit programming, again a very
special beast.

Another component is the ORM mapper, gwtorm.

Other than that it's pretty fine, with the usual Java problem
that I need to cut through a gazillion classes and interfaces
before I get to the core of things.

For example, to fix https://bugzilla.wikimedia.org/show_bug.cgi?id=38114,
a few lines need to be added before line 296 of

https://gerrit.googlesource.com/gerrit/+/05d942c324d7a17285873a468f3605cbc190b8d5/gerrit-gwtui/src/main/java/com/google/gerrit/client/changes/ChangeTable2.java

(not sure it's a good idea but here it is)

I have attached a Jython interpreter to Gerrit to
play a bit with the code:

https://gerrit-review.googlesource.com/#/c/34670/

You can, for example, play live with the ORM mapper
and retrieve Java objects from the database (not just
rows).

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] About outreach and tech events (as suggested by Sumana!)

2012-08-05 Thread Marcin Cieslak
>> Julien Dorra  wrote:

> «Testing Wikipedia» could be a nice catchy name for a series for events in
> various cities around TDD, with experienced dev mentoring less experienced
> community members, etc. Even if the experts come and go, everybody learn,
> some test and process get done, and the community grow and learn.

Maybe we should not be afraid and not only offer buzzwords but also point
out real technical issues we are facing here.

I would say that at least some parts of the core code are hardly testable.
The situation improves as we go, but there are still lots of problems
with our almost-object-oriented coding.

On the other hand, the number of integration issues we are facing
(talking to databases, caches, etc.), plus the high level of optimization
of code suitable for a huge site, does not make TDD
enthusiasts happy; we need multiple levels of testing: unit,
integration, and user interface, with the last two very important.

And there are also infrastructure issues, like

https://bugzilla.wikimedia.org/show_bug.cgi?id=37702
"Cloned tables for unitests do not have references and constraints"

discovered when trying to write a unit test for the very core
MediaWiki functionality. (The fact that we didn't find it earlier
says something about our test coverage.)

So maybe one of the approaches would be to have a mini-bugathon
to review some (the most typical? most annoying? site-breaking?)
bugs and try to evaluate how a TDD approach could help us
improve.

We might even have a nice cultural clash as a result :)

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Let's talk about arc: Phabricator replacement for git-review and more

2012-08-09 Thread Marcin Cieslak
I just wrote a very rough and quick walkthrough of how to get that tool running:

https://www.mediawiki.org/wiki/Arcanist

My first impression is very good. The UI is very nice (it guides you
when it needs to, it just does the job if all is fine).

The user's guide is unfortunately poor.

I don't know yet how to avoid this warning:

This diff is for the 'E3Experiments' project but the working copy belongs to 
the '' project. 

I see that arc can also be installed as a git pre-receive hook,
but it needs some project configuration for that. Interesting.

Anyway, I managed to download one change:

$ git branch -vv
* arcpatch-D3 be7cfd9 Merge branch 'master' of ssh://gerrit.wikimedia.org:29418/mediawiki/extensions/E3Experiments into munaf/pef2
  master      e40fce0 [origin/master] Rename events.js -> communityClicks.js

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Let's talk about arc: On Arcanist docs

2012-08-10 Thread Marcin Cieslak
>> Evan Priestley  wrote:
> I sent out a diff to fix the error message 
> (https://secure.phabricator.com/D3231), the new one reads:
>
> This patch is for the 'phabricator' project, but the working copy does
> not have an '.arcconfig' file to identify which project it belongs to.
> Still try to apply the patch?
>
> We're trying to catch the case where you're attempting to apply a
> patch to the wrong project -- I'm guessing revision D3 was made in
> a working copy with a .gitignored ".arcconfig" file that associates
> it with "E3Experiments". If you check in the ".arcconfig" file, arc
> will be able to recognize the working copy's project and will stop
> complaining.

I realize my problems were related to E3Experiments not having
an .arcconfig.

I started suspecting E3Experiments was kind of unconfigured when
"arc git-hook-pre-receive" hinted at this:

"Usage Exception: You have installed a git pre-receive hook in a remote without
an .arcconfig."

> What could we improve about the user guide?

First, I found two entry points:

* Arcanist User Guide
  http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide.html

Frankly, I don't remember how I came to this one.

The problem here is that there is no full table of contents of the "User Guide"
(btw. "Defined: src/docs/userguide/arcanist.diviner:1" seems useless to the
casual onlooker).

I could find this:

  http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide_arc_diff.html

But not much more. While writing this email I discovered

  http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide_Configuring_a_New_Project.html

hidden in the arc_diff guide, which was like tl;dr to me - I didn't know I needed
to learn about the "arc diff" command to find out about .arcconfig and stuff.

Suggestions for improvement:

1) In the "Overview" there should be some links to basic installation and 
configuration
(the .arcconfig thing).

2) "Arcanist allows you to do things like" should explain more about the 
commands -
descriptions are too short. There are no links to explanations of particular
commands (certainly "arc diff" has one). 

Coming from gerrit, I kind of looked for the equivalent of
"git fetch ... refs/changes/CD/ABCD/1" and "git push ... refs/for/master".
From the terse descriptions there I could sense that "arc diff" does
something to push changes for review and "arc patch" fetches a change
from the repo (although "arc export" sounds nice, too).
Unfortunately, "arc download/upload" do something different :)

* Arcanist Something - Table of Contents
  http://www.phabricator.com/docs/arcanist/

The good thing is that the Phabricator installation has links to this document
at https://phabricator.wmflabs.org/apps/. This is a big plus.

This "Arcanist Something" guide is confusing because it is 95%
developer API documentation. I hoped to find info about .arcconfig
in "ArcanistConfiguration" or "ArcanistSettings", but both were
disappointing.

Now I see I should have gone into ArcanistOverview, but somehow I missed that.
It links to Arcanist_User_Guide_Configuring_a_New_Project,
which I missed so badly yesterday.

1) Probably ArcanistOverview should be *the* front page for the documentation
and the User Guide, with a full Table of Contents of all docs. Maybe the
TOC should be on all "User Guide" pages.

API pages should be clearly marked. Use a different skin if possible
(red instead of blue :) or clearly mark links to API and User Guide
articles differently (consistent title names? we can't rely on colors or
icons). Javadoc output might be ugly, but at least I know immediately
"uh oh, I ended up in the developer documentation".


There is some problem with
http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide_Customizing_Lint,_Unit_Tests_and_Workflows.html
- it sounds like the gentle introduction to the whole API stuff.
I'm not sure yet how it fits the casual user.

And then there is

http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide_Repository_Hooks.html

which deals only with SVN. "arc git-hook-pre-receive" sounds promising,
but I have no idea where to find out more about it.

Unfortunately, the Phabricator docs use "workflow" as a slang description
of some piece of code, so I could not find out "what a typical
workflow with arc looks like" or "how installing a git hook
changes my workflow".


In general: the docs seem to be aimed either at the advanced person
looking to write "workflows" or "classes" for linting and the like,
or at the user of an already pre-configured repository.
I would review this again from the viewpoint of the "lone wolf"
developer who has some repository (maybe her own) and tries to
set up this thing. I didn't look at the rest of the Phabricator
docs yet, but I'd be happy to find guides like
"How do I switch to Phabricator with a github/sourceforge/whatever project".


//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's talk about arc: Phabricator replacement for git-review and more

2012-08-10 Thread Marcin Cieslak
>> Daniel Friesen  wrote:
>
> Why is arc written in PHP? That seems to be a horrible software decision  
> to me. Core extensions not enabled by default can be hard to install on  
> some OS. And imho the packaging setup is not as good. Frankly I gave up  
> trying to get mcrypt installed on either version of php installed on my  
> Mac.

It could be improved to check for curl and bcmath (the ones I found
out are needed) on startup, not in the middle of some other command
after others have succeeded (unless of course the extension is needed
only for some specific operation not applicable to the general public).
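
Something like this early in the startup path would do (just a
sketch):

  foreach ( array( 'curl', 'bcmath', 'json' ) as $required ) {
      if ( !extension_loaded( $required ) ) {
          fwrite( STDERR, "arc requires the PHP '{$required}' extension.\n" );
          exit( 1 );
      }
  }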

This one I find interesting:


> arc looks as if it works completely with patches on it's own rather than  
> doing anything with git.

> I can't see how phabricator can have commit workflow support any better  
> than gerrit when it appears to take the repo completely out of the  
> question.

Erik also wrote this earlier:

> As I understood it, the big gotchas for Phabricator adoption are that
> Phabricator doesn't manage repositories - it knows how to poll a Git
> repo, but it doesn't have per-repo access controls or even more than a
> shallow awareness of what a repository is; it literally shells out to
> git to perform its operations, e.g. poll for changes - and would still
> need some work to efficiently deal with hundreds of repositories,
> long-lived remote branches, and some of the other fun characteristics
> of Wikimedia's repos. Full repo management is on the roadmap, without
> an exact date, and Evan is very open to making tweaks and changes
> as needed, especially if it serves a potential flagship user like
> Wikimedia.

After Gerrit, I think it might actually be a GoodThing(tm)
to detach the code review tool from managing the repository.

Git at its core is a tool to quickly snapshot directories.
Blobs are its first-class objects, not patches or diffs
(this is, I think, pretty innovative compared to traditional
version control systems).

I think there is a reason why Linus settled on an email
patch workflow, which is even included in the git user
interface.

Keeping patches and commits separate starts making
sense to me - otherwise one ends up in rebase hell.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Let's talk about arc: Phabricator replacement for git-review and more

2012-08-10 Thread Marcin Cieslak
>> Evan Priestley  wrote:
>
> On Aug 10, 2012, at 12:52 AM, Marcin Cieslak wrote:
>
>> It could be improved to check for curl and bcmath (the ones I found
>> out are needed) on startup, not during some other command
>> after other succeded (unless of course the extension is needed
>> only for some specific operation not applicable to general public).
>
> We should be checking for curl on startup (and a few other things
> -- namely JSON and a reasonable PHP version). Was this not your
> experience? 

I already had php-curl installed, and I only noticed I needed
it because of the need to configure the list of acceptable CAs; so I *knew*
php-curl was needed.

PHP without bcmath failed on me badly with a PHP fatal error.

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Bugzilla workflow: keywords

2012-08-27 Thread Marcin Cieslak
Hello,

Recently I noticed that keywords in bugzilla get
updated more and more often, mostly with keywords
like "patch", "patch-need-review", etc.

I am wondering what to do in the following situations
(like https://bugzilla.wikimedia.org/show_bug.cgi?id=39635
for example):

- user A posts a patch
- the bug gets "patch", "patch-need-review"
- user B posts a different patch and says
  he does not like A's patch
- user B submits the change to gerrit

When "need-review" should be removed? What are replacements
if any? What if I believe that core ideas behind the
patch are wrong? What if I just think the implementation
should be improved? What it it's more or less okay?
I see only "patch-reviewed" in the keywords - which can be
both negative and positive.

Before I open a whole can of worms by asking how those
keywords relate to the Gerrit workflow we have,
maybe the current bugmeisters could explain
how they use them and how we can help?

//Saper








___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Can we kill DBO_TRX? It seems evil!

2012-09-27 Thread Marcin Cieslak
>> Daniel Kinzler  wrote:
> So, my current proposal is a more expressive high-level api for transaction
> control consisting of start/finish/flush (and perhaps abort) on top of the low
> level interface consisting of begin/commit/rollback. Documentation needs to be
> very clear on how they behave and how they relate to each other.

I built an application (in Zope) where correctness was more important
than locks, and it was running on PostgreSQL, so we never had this problem.
Zope collects all transactions from different sources (multiple database
connections, for example) and handles them transparently (like automatic
rollback on an error/exception in the code). In the MediaWiki context that
would be equivalent to keeping transactions controlled at the WebRequest
level. I know too little about InnoDB transactions to comment, as I
understand MySQL is very different.

> For the short term, I suggest to suppress warnings about nested transactions
> under some conditions, see my previous response to aaron.

In includes/db/DatabasePostgres.php there is a PostgresTransactionState
monitor, which is very nice for debugging all problems with implicit/explicit
transactions. It can easily be abstracted (to Database.php or somewhere
else), and maybe there are functions to monitor the InnoDB transaction
status as well.

From the PostgreSQL side I see one problem with nesting - we are already
using savepoints to emulate MySQL's "INSERT IGNORE" and friends.
It might be difficult to abuse that feature for something more than this.
There is a class "SavepointPostgres" which is used for that.
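
For illustration, the emulation boils down to something like this
(a simplified sketch, not the exact DatabasePostgres.php code):

  $dbw->query( "SAVEPOINT mw" );
  try {
      $dbw->query( $insertSql );                  // may violate a constraint
      $dbw->query( "RELEASE SAVEPOINT mw" );      // keep the row
  } catch ( DBQueryError $e ) {
      $dbw->query( "ROLLBACK TO SAVEPOINT mw" );  // "ignore" the failed insert
  }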

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Learning Git/Gerrit? - 3 Oct 2012 17:30 UTC

2012-09-27 Thread Marcin Cieslak
>> Sumana Harihareswara  wrote:
> Git, Gerrit, and You! A Tutorial
>
> Where:IRC/SIP/SSH
>
> We want all our developers to feel comfortable with Git, git-review, and
> Gerrit. So saper is leading a hands-on online training:
> https://www.mediawiki.org/wiki/Project:WikiProject_Extensions/MediaWiki_Workshops/Git_/_Gerrit_Tutorial
> . Check [[Git/Workshop]] for testing access to the conference & lab setup.
>
> Saper will be available for 3 hours, and there'll be a break in the
> middle. Absolute beginners with Git might want to stay for the whole
> three hours; people with some experience won't need as long.
>
save the date: 3 October 2012, 17:30 UTC:

http://timeanddate.com/worldclock/fixedtime.html?msg=Git%2BGerrit+with+saper+live&iso=20121003T1730&ah=3

In case you want to participate and haven't signed up yet, please
sign up here:

http://www.doodle.com/zhn7buksgrg8e8rx

Thank you!

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Can we kill DBO_TRX? It seems evil!

2012-09-28 Thread Marcin Cieslak
>> Brad Jorsch  wrote:
> On Thu, Sep 27, 2012 at 08:40:13PM +0000, Marcin Cieslak wrote:
>> 
>> From the PostgreSQL side I see one problem with nesting - we are already
>> using savepoints to emulate MySQL's "INSERT IGNORE" and friends.\
>> It might be difficult to abuse that feature for something more than this.
>> There is a class "SavepointPostgres" which is used for that.
>
> As long as the savepoints are properly nested and multiple levels of
> nesting don't try to reuse the same name, things should be fine. And
> since this use is just "SAVEPOINT", "INSERT", "RELEASE SAVEPOINT",
> there's no opportunity for things to not be properly nested, and
> avoiding name collision would not be hard.

All is fine as long as something like SqlBagOStuff or the
LocalisationCache or something else working in parallel does not do
something to your so-called "transaction"
(https://bugzilla.wikimedia.org/show_bug.cgi?id=35357 or
https://bugzilla.wikimedia.org/show_bug.cgi?id=27283).

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Learning Git/Gerrit? - 3 Oct 2012 17:30 UTC

2012-10-03 Thread Marcin Cieslak
Hello, 

Our scheduled Git+Gerrit session starts in ca. 40 minutes from now.

Everything will happen via SIP audioconference and SSH connection.

Please make sure your SIP and SSH clients work!

More information on the setup:

 https://www.mediawiki.org/wiki/Git/Workshop

I am already available on SIP as well as on IRC
(#git-gerrit on Freenode) if you would like
to test your setup.

See you soon!

Marcin Cieślak
(saper)


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github replication

2012-10-04 Thread Marcin Cieslak
>> Chad  wrote:
> Yeah, that sounds sane. Anyone who wants to volunteer to keep an eye
> on Github and make sure patches get into Gerrit, let me know and I'll add
> you to the group on Github.

+1

I'm github.com/saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Learning Git/Gerrit? - 3 Oct 2012 17:30 UTC

2012-10-04 Thread Marcin Cieslak
>> Marcin Cieslak  wrote:
> More information on the setup:
>
>  https://www.mediawiki.org/wiki/Git/Workshop
>
> I am already available on SIP as well as on IRC
> (#git-gerrit on Freenode) if you would like
> to test your setup.

Thank you, everyone, for joining - it was fun,
although it took a bit longer than expected.

We had 8 people on the teleconference; most
of them managed to join the hands-on part
of the tutorial.

Please feel free to send me (or Sumanah)
your feedback.

I think we should restructure this into
a series of smaller trainings, separating
git basic/advanced and gerrit basic/advanced
stuff.

//Marcin



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Welcome Željko Filipin, QA Engineer

2012-10-04 Thread Marcin Cieslak
>> Zeljko Filipin  wrote:
> On Tue, Oct 2, 2012 at 9:21 PM, Antoine Musso  wrote:
>> I am in CET timezone myself and
>> working on continuous integration.  Ping "hashar" on freenode :-]
>
> Will do. I will probably need help with Jenkins.

Welcome :) Now I know why you joined us yesterday on the Git+Gerrit
session!

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] master broken on PostgreSQL - ContentHandler?

2012-10-10 Thread Marcin Cieslak
Hello,

I updated one of my wikis today from f2138b1 to 
9299bab032a85c1a421436da04a595b79f2b9d6c (git master as I write this) and after 
running update.php
I get this:

A database error has occurred. Did you forget to run maintenance/update.php 
after upgrading? See: 
https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
Query: SELECT page_id,page_len,page_is_redirect,page_latest,page_content_model 
FROM "page" WHERE page_namespace = '0' AND page_title = 'Test11' LIMIT 1 
Function: LinkCache::addLinkObj
Error: 42703 ERROR: column "page_content_model" does not exist
LINE 1: ...*/ page_id,page_len,page_is_redirect,page_latest,page_conte...
^

Backtrace:

#0 /usr/home/saper/public_html/pg/w/includes/db/DatabasePostgres.php(477): 
DatabaseBase->reportQueryError('ERROR: column ...', '42703', 'SELECT 
page_id...', 'LinkCache::addL...', false)
#1 /usr/home/saper/public_html/pg/w/includes/db/Database.php(942): 
DatabasePostgres->reportQueryError('ERROR: column ...', '42703', 'SELECT 
page_id...', 'LinkCache::addL...', false)
#2 /usr/home/saper/public_html/pg/w/includes/db/Database.php(1367): 
DatabaseBase->query('SELECT page_id...', 'LinkCache::addL...')
#3 /usr/home/saper/public_html/pg/w/includes/db/Database.php(1458): 
DatabaseBase->select('page', Array, Array, 'LinkCache::addL...', Array, Array)
#4 /usr/home/saper/public_html/pg/w/includes/cache/LinkCache.php(222): 
DatabaseBase->selectRow('page', Array, Array, 'LinkCache::addL...', Array)
#5 /usr/home/saper/public_html/pg/w/includes/Title.php(2895): 
LinkCache->addLinkObj(Object(Title))
#6 /usr/home/saper/public_html/pg/w/includes/Title.php(4320): 
Title->getArticleID()
#7 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(416): Title->exists()
#8 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(465): 
WikiPage->exists()
#9 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(204): 
WikiPage->getContentModel()
#10 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(190): 
WikiPage->getContentHandler()
#11 /usr/home/saper/public_html/pg/w/includes/Action.php(92): 
WikiPage->getActionOverrides()
#12 /usr/home/saper/public_html/pg/w/includes/Action.php(139): 
Action::factory('view', Object(WikiPage))
#13 /usr/home/saper/public_html/pg/w/includes/Wiki.php(144): 
Action::getActionName(Object(RequestContext))
#14 /usr/home/saper/public_html/pg/w/includes/Wiki.php(528): 
MediaWiki->getAction()
#15 /usr/home/saper/public_html/pg/w/includes/Wiki.php(447): MediaWiki->main()
#16 /usr/home/saper/public_html/pg/w/index.php(59): MediaWiki->run()
#17 {main}

Looks like LinkCache.

Can this be quickly fixed or do we need to revert this?

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] master broken on PostgreSQL - ContentHandler?

2012-10-10 Thread Marcin Cieslak
>> Platonides  wrote:
> On 10/10/12 09:02, Marcin Cieslak wrote:
>> Hello,
>> 
>> I updated one of my wikis today from f2138b1 to 
>> 9299bab032a85c1a421436da04a595b79f2b9d6c (git master as I write this) and 
>> after running update.php
>> I get this:
>> 
>> A database error has occurred. Did you forget to run maintenance/update.php 
>> after upgrading? See: 
>> https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
>> Query: SELECT 
>> page_id,page_len,page_is_redirect,page_latest,page_content_model FROM "page" 
>> WHERE page_namespace = '0' AND page_title = 'Test11' LIMIT 1 
>> Function: LinkCache::addLinkObj
>> Error: 42703 ERROR: column "page_content_model" does not exist
>> LINE 1: ...*/ page_id,page_len,page_is_redirect,page_latest,page_conte...
>> ^
> (...)
>> Look like LinkCache.
>> 
>> Can this be quickly fixed or do we need to revert this?
>> 
>> //Saper
>
> Seems the files of maintenance/archives/patch-*content* could need to be
> copied to maintenance/postgres/archives
>
> It works for me on mysql, but it is inconsistent in that the db is still
> used a varbinary but the description uses an integer.

Yes, the updater needs to be fixed (it is normally enough
to update "includes/installer/DatabaseUpdater.php" for smaller changes).

I remember mentioning it in code review somewhere; finding the "right"
data type was an issue to be addressed by the developers...

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] master broken on PostgreSQL - ContentHandler?

2012-10-10 Thread Marcin Cieslak
>> Daniel Kinzler  wrote:
> I'll try to look into this today, but I need to find help from someone
> knowledgable about postrtges (and especially about the postgres
> updater. it's... different).

Sure, feel free to ask. I will be travelling starting tomorrow,
but today we can try to fix it. To add simple fields you don't
usually need to create a patch file; a simple
array( 'addPgField', ... ) entry should do.
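
For instance, something along these lines in the Postgres updater
(a sketch only - the exact file, update list and column type here are
my guesses, not the final fix):

  // e.g. in the Postgres update list, alongside the other entries:
  array( 'addPgField', 'page', 'page_content_model', 'TEXT' ),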

Thanks!

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] master broken on PostgreSQL - ContentHandler?

2012-10-10 Thread Marcin Cieslak
>> Marcin Cieslak  wrote:
>>> Daniel Kinzler  wrote:
>> I'll try to look into this today, but I need to find help from someone
>> knowledgable about postrtges (and especially about the postgres
>> updater. it's... different).

Thank you everyone - the working fix is now in Gerrit 
https://gerrit.wikimedia.org/r/#/c/27413/

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] IPv6 routing problem?

2012-10-15 Thread Marcin Cieslak
>> Faidon Liambotis  wrote:
> Hi,
>
> Thanks for forwarding the report. I've chatted with the user via IRC on
> Sunday and subsequently via e-mail, so we're on it. For what it's worth,
> the underlying issue is still there, although restoring European traffic
> via the esams (Amsterdam) cluster has significantly reduced the impact.

Could 
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/64555 be 
the same problem?

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Research on newcomer experience - do we want to take part?

2012-11-14 Thread Marcin Cieslak

Hello,

Kevin Carillo[1] from the University of Wellington is going to research
"Newcomer experience and contributor behavior in FOSS communities"[2].
So far Debian, GNOME, Gentoo, KDE, Mozilla, Ubuntu, NetBSD and OpenSUSE
will be taken into account, FreeBSD recently joined[3], and
there is still some possibility for other large FOSS projects to join.

I think it could fit nicely into our recent efforts directed
at newcomer experience after the Git migration. And MediaWiki is
a bit different from the above projects.

Are we interested in including MediaWiki in that research?

As Kevin explains in his post, he tried to avoid spamming mailing
lists while looking for interested projects, so I am doing this for him :-)

//Saper

[1] http://kevincarillo.org/about/, http://twitter.com/kevouze
[2] http://kevincarillo.org/survey-invitation/
[3] http://kevincarillo.org/2012/11/15/welcome-freebsd/


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How do MS SQL users install MediaWiki?

2013-03-07 Thread Marcin Cieslak
>> Mark A. Hershberger  wrote:
> On 03/04/2013 01:34 AM, Chad wrote:
>> However, we do
>> have people who want/use MSSQL, so I think taking the effort to
>> keep it working is worthwhile--if someone's willing to commit.
>
> Since Danny Bauch has been using MSSQL and modifying MW for his needs,
> I'll work with him to get the necessary changes committed.
>
> Danny, if you could commit your changes into Gerrit, I'd be happy to
> test them.

I'll be happy to come back to my PostgreSQL work, and I'd be happy to
talk to other RDBMS people to coordinate some stuff (like getting
unit tests to work or getting some abstractions right - transactions,
schema management, etc.).

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Identifying pages that are slow to render

2013-03-08 Thread Marcin Cieslak
>> Antoine Musso  wrote:
> Le 06/03/13 22:05, Robert Rohde a écrit :
>> On enwiki we've already made Lua conversions with most of the string
>> templates, several formatting templates (e.g. {{rnd}}, {{precision}}),
>> {{coord}}, and a number of others.  And there is work underway on a
>> number of the more complex overhauls (e.g. {{cite}}, {{convert}}).
>> However, it would be nice to identify problematic templates that may
>> be less obvious.
>
> You can get in touch with Brad Jorsch and Tim Starling. They most
> probably have a list of templates that should quickly converted to LUA
> modules.
>
> If we got {{cite}} out, that will be already a nice improvement :-]

Not really, given https://bugzilla.wikimedia.org/show_bug.cgi?id=45861

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Commonist now supporting the new login token

2010-04-11 Thread Marcin Cieslak
Hello,

I have hacked up a patched version[1] of the popular Commonist tool[2]
so that it can now log in to Commons.

[1] http://saper.info/files/commonist-0.3.43-patched.jar
[2] http://commons.wikimedia.org/wiki/Commons:Tools/Commonist

-- 
  << Marcin Cieslak // sa...@saper.info >>


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Wikimania 2010: Call for Participation is there!

2010-04-28 Thread Marcin Cieslak
Wikimania is an annual global event devoted to Wikimedia projects 
around the globe (including Wikipedia, Wikibooks, Wikinews, 
Wiktionary, Wikispecies, Wikimedia Commons, and MediaWiki). The 
conference is a community gathering, giving the editors, users 
and developers of Wikimedia projects an opportunity to meet each 
other, exchange ideas, report on research and projects, and 
collaborate on the future of the projects. The conference is open 
to the public, and is a chance for educators, researchers, 
programmers and free culture activists who are interested in the 
Wikimedia projects to learn more and share ideas about the 
Wikimedia projects.

This year's conference will be held JULY 9-11, 2010 in Gdansk, 
Poland at Polish Baltic Philharmonic. For more information, please 
visit the official Wikimania 2010 site:

http://wikimania2010.wikimedia.org/

Wikimania 2010 will be a mix of submitted talks, open space 
meetings, birds of a feather groups, and lightning talks. 
Submissions will be discussed and selected in an informal process
on the wiki. If your submission is not added to the schedule, you 
will still have many opportunities to bring topics forward 
on-site.

IMPORTANT DATES

* Deadline for submitting workshop, tutorial, panel and
   presentation proposals: May 20
* Notification of acceptance: May 25 (workshops), May 31
   (panels, tutorials, presentations)
* All proposals and presentations will be welcome in the
   Open Space track of the conference, whether or not they
   are accepted in this initial process.

PROGRAM COMMITTEE

Submissions will be reviewed informally by a team of volunteers.

TRACKS

This year Wikimania will offer three tracks for submissions for 
members of wiki communities and interested observers to share 
their own experiences and thoughts and to present new ideas:

People and Community

The People and Community track provides a unique forum for 
discussing topics related to people using/building wikis. 
Relevant topics include, but are not restricted to, the 
following:

* Wiki Community: Conflict resolution and community dynamics;
   reputation and identity;
* Wiki Outreach: Promotion of wikis and Wikimedia projects among
   the general public;
* North meets south, east meets west: How can people of a
   different cultural background create an encyclopedia according
   to common rules? The same subject in the eyes of different cultures.
* Special: Wikipedia in Central/Eastern Europe: this theme will
   provide a forum to present and discuss the latest progress of
   Wikis in the central/eastern European community.

Knowledge and Collaboration

The Knowledge and Collaboration track aims to promote research 
and find exciting ideas related to knowledge...

* Wiki Content: New ways to improve content quality, credibility;
   legal issues and copyrights (is free knowledge free?); use of
   the content in education, journalism, research;
* Semantic Wikis: The use of semantic web technologies, linked
   data; semantic annotation and metadata (in particular manual
   vs. automated approaches).

Infrastructure Track

The Infrastructure track at Wikimania will provide a forum where 
both researchers and practitioners can share new approaches, 
applications, and explore how to make Wiki access ever more 
ubiquitous:

* MediaWiki development: issues related to MediaWiki development
   and extensions;
* Moving beyond MediaWiki: what other Wiki-like platforms exist;
   what tools and features do we need for collaboration on
   different types of knowledge?
* Mobile Wikis: The Web is moving off the desktop and onto mobile
   phones - how do we use wikis on mobile devices?; wiki-based
   Augmented Reality (AR) applications, location-based services
* User Interface Design: Usability and user experience;
   accessibility, adaptive interfaces and personalization; novel
   UI designs.

WIKISYM 2010

Please note that Wikimania 2010 is co-located with WikiSym, The 
International Symposium on Wikis and Open Collaboration. More 
information about WikiSym can be found on the conference website:

http://www.wikisym.org/

SUBMIT A PROPOSAL

To submit a proposal for a presentation, workshop, panel or 
tutorial, please visit:

http://bit.ly/Submit2010

Thank you for helping make Wikimania 2010 a successful event. :-)
See you in Gdansk, July 9-11!



-- 
Marcin Cieslak
Wikimania 2010 Gdansk

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Balancing MediaWiki Core/Extensions

2010-09-24 Thread Marcin Cieslak
>> Roan Kattouw  wrote:
> 2010/9/22 Trevor Parscal :
>> Modular feature development being unique to extensions points out a
>> significant flaw in the design of MediaWiki core. There's no reason we
>> can't convert existing core features to discreet components, much like
>> how extensions are written, while leaving them in stock MediaWiki. This
>> sort of design would also make light work of adding extensions to core.
>>
> Making MediaWiki more modular won't magically make it possible (or
> even desirable) to write any old feature as an extension. Some things
> will still have to go in core and some things we'll simply /want/ to
> put in core because making them extensions would be ridiculous for
> some reason.

I'd rather have MediaWiki built on classes and object instances
with clear responsibilities, Inversion of Control and the possibility
to test each _object_ separately without causing interference
with other components. The discussion of what is core vs. extension is,
to me, secondary. Maybe at some point "hooks" as we know them will
not be needed; we could use the interfaces
provided by the programming language instead of hooks,
possibly (yes, I'm dreaming) tied together by something like
PicoContainer.

Even so, those interfaces *will* change, and it would be
good if developers could refactor the code in core
and extensions at the same time.

I don't know why we duplicate so many functions even
within core (API vs. standard special pages, for example).
But that's probably an issue to be solved in phase4 :)

So before asking how much to add into core, maybe we should
first clean up a bit, and then possibly add. And sometimes
adding something (like proper multi-wiki configuration
management, $wgConf++) may clean up and simplify some things
inside core.

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Pywikipedia-l] Humans vs bots: 0:1

2011-02-02 Thread Marcin Cieslak
[Followup-To: header set to gmane.comp.python.pywikipediabot.general.]
>> John  wrote:
> Yeah, all you need to do is remove the incorrect links from all affected
> articles.
>
> You sorta did that with -localright. however that just fixed the correct
> article but still left some articles pointing to the wrong article. you need
> to fix every article, as long as one page has the wrong link it will be
> propagated back

On the 22nd somebody re-added those links manually because the article was reconstructed.

But earlier, after this revert:

https://secure.wikimedia.org/wikipedia/pl/w/index.php?title=Aktywizm&diff=25027755&oldid=25014331

I tried to use pywiki interwiki.py to remove 'pl' by making sure [[en:Activism]]
had no pl: link (enforced by {{nobots}}), [[pl:Aktywizm]] had no interwiki links
(enforced by {{nobots}}), and running interwiki.py -localright pointing to
en:Activism to remove it on all other wikis. But this got reverted again
within a few minutes by the bots that were running.


Sure, after this:

https://secure.wikimedia.org/wikipedia/pl/w/index.php?title=Aktywizm&diff=25078310&oldid=25028009

you can't reproduce this anymore, but this is pretty strange.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GSOC project "Improve our Android application"

2012-03-05 Thread Marcin Cieslak
>> Yeshow Lao  wrote:
> Hello, everybody. I'm a GSOC student from China. With some development
> experience on Android, I would love to work on this project "Improve our
> Android application -- integrate with SuggestBot to suggest a mobile task
> to a user."[1.]
>
> Could somebody tell me some information, please?

Hello,

You will find lots of information about the project in the recent
announcement:

http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/59033

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Git, Gerrit and the coming migration

2012-03-10 Thread Marcin Cieslak
I am a seasoned developer. I only do it in my spare time and only
when I am particularly annoyed about something not working in MediaWiki
(mostly as feedback from the plwiki community or, recently, from checkusers).

You can see my pitiful track record here:

https://www.mediawiki.org/wiki/Special:Code/MediaWiki/author/saper 

Most of those things are urgent one-liners that had to be pushed
to deployment very quickly, and I never had any problem with getting
them through the process. But I'm a small guy. So please do not
treat my remarks below as something that would make life more
difficult for people pushing hundreds of lines per week to MediaWiki.

What I don't like (as has happened to me with git recently) is having to
learn the tool once again from scratch just because I haven't used
it for the last 4 weeks or so.

>> Antoine Musso  wrote:
> Diederik van Liere wrote:
>>  We are making three fundamental changes in one-shot:
>
> They are not that much of change. It is like if you changed from using a
> paper map and an old car to a nice SUV with a GPS. It is still a lot of
> metal on 4 wheels with one purpose: move some fresh meat from A to B.
>
> The model is the same. Only the tool changes.
> (you can quote me on this when we finally take the decision to migrate
> to JavaScript or Python)
>
>
>> 1) Migrating from a centralized source control system to a
>> decentralized system (SVN -> Git)
>
> Decentralization itself is just a buzz word for the twitter guys. In the
> end, it does not change that much since most people have a reference
> repository.  I guess most developers will use the WMF repository as a
> reference, or at the very least, all patches will eventually end up in
> the WMF repository.
>
> We could imagine having the WMF feature team to use their own repository
> then submit a nice giant patch once in a while.

I am using mercurial, git, and starting to learn fossil. There is one
change which is partially related to the tools, partially to the distributed
nature of git.

The fundamental change is something else to me: you lose the feeling
of linearity. I like hg because it still tries to give me cosy, nice
local version numbers (great when switching from SVN; you can even have
your old SVN commit numbers stay after migration). But when
I look at the gerrit interface (not gerrit's fault) I have
no idea what was done before, what was done after, what the history is.
What happened to me on my very first day trying gerrit:

I got an email asking me to merge a change (since somebody had pushed something
conflicting in between), so I duly issued some magic git commands
and got it pushed. However, when I came back to gerrit
I ended up with an "empty" commit:

https://gerrit.wikimedia.org/r/#change,2916

I thought I did merge, though! What happened? Looks like an empty commit
got filed...

I tried poking around in gerrit to find out what happened, but I
had no clue; only running "git log" locally revealed that
my change had indeed been merged by somebody else in the background.

I had a feeling I had 4 or more revisions flying around ("commits")
and I could not relate them to each other. Only "git log"
locally helped me get out of trouble.

Looking at this screen:

https://gerrit.wikimedia.org/r/#dashboard,103

Those two commits are related, but it's totally non-obvious that
something follows up on something else. I have clicked on them
and yes, I can find that 87f491132487313144e531354578ea2fbd3b42b4
is common to both of them. Oh, cool!

In comparison to this, the current Special:Code follow-up revision
system is easy, readable and very useful.

Oh, and by the way, those I3577f029 and Ifb002160 are identifiers
totally unrelated to commits (I need to learn more about
Change-Id vs. a commit... I promise I will - I already got burned
by the missing commit-msg hook in the repo). And there is only a
date, not a timestamp, to get some sense of linearity again.

I am really afraid I will be lost when my dashboard has
many more patches and merges.

Looking at this or any more complex git development tree makes me
cry for linear revision numbers. At least I can find out what
came before and what came after - sure, it comes at the cost of potentially
more difficult merging and branching, but let's be serious:
how many "edit conflicts" do we have in the tree?

I don't think that our development timeline is more complex than this:

http://fossil-scm.org/index.html/timeline?n=200

and this is so much more readable (and yes, I know gitk).

I read in this thread that there could be a tree-like priority system
to sort out more impactful changes from more specific ones.
Building such a tree can be very challenging and as far as I understand
we don't have a tool yet. We end up with a bunch of loose commits,
somehow connected to each other, not linearized. 

And from experience, trivial and small patches get through
to deployment very fast. It's larger things that have to wait
longer...

I presume this is less of a problem with the current use in 
operations where change

Re: [Wikitech-l] Git, Gerrit and the coming migration

2012-03-12 Thread Marcin Cieslak
>> Platonides  wrote:
>> The fundamental change is something else to me: you lose feeling
>> of linearity. I like hg because it still tries to give me a cosy nice
>> local version numbers (great to switch from SVN, you can even have
>> your old SVN commit numbers to stay after migration). But when
>> I look at the gerrit interface (not gerrit's fault) I have
>> no idea what was done before, what was done after, what's the history.
>
> Yes, git is so poewerful, that gets fragile in itself. I end up with
> several clones and no idea about where they differ.
> Isn't there a way to compare them?

Yeah, I have the same problem very often. Many times I got confused
enough to save diffs and "rm -rf" the working directory.

Git is a mess to integrate with - it's almost impossible to have any kind of API.
And you can't have "hg rollback" - my favorite feature of Mercurial
(no need for commit --amend or anything).

>> Looking at this again:
>> 
>> https://gerrit.wikimedia.org/r/#change,2916
>> 
>> I have 28f176ca9ca3767bfa9f0ec219f5fa0c299c5761 and
>> 87f491132487313144e531354578ea2fbd3b42b4 here (those are commits,
>> fine) and Ifb002160485030496c7d3f2abc4991484b533648 
>> 
>> Additionally there is this c64fd4488d2ea24e120acb15db413377494dd3b3 
>> ("Patch Set 1") referring me to (gitweb) which is calls it "commit".
>> Ah, and there is 1101a1b3fe7f4d1c29321157fc1ef9b9f3fb6ff0 as well.
>> 
>> Ouch and there is this "refs/changes/16/2916/1" <-- the good think
>> I can actually click on it in gitweb!
>
> ? That's not a link.

Go to the gitweb screen

https://gerrit.wikimedia.org/r/gitweb?p=test/mediawiki/extensions/examples.git;a=log;h=87f491132487313144e531354578ea2fbd3b42b4

and you will see little pink patches (tags) like

https://gerrit.wikimedia.org/r/gitweb?p=test/mediawiki/extensions/examples.git;a=log;h=refs/changes/14/2914/1

those refs/changes/14/2914/1 are gerrit pointers to commits.
(http://book.git-scm.com/7_git_references.html calls them "Git references")

I see also some people want to get rid of them at times:
http://www.mailinglistarchive.com/html/repo-disc...@googlegroups.com/2010-05/msg00014.html

Unfortunately, I don't see them in my local repository. Why?
How can you clone them?


>> All this makes "MFT r111795, r111881, r111920, r112573, r112995, r113169"
>> looks pale in comparison. And I can actually click a link in 
>> [[Special:Code]],
>> and go back and forth on followups, neat!
>
> I proposed several times to bump git change numbers, so new ones  don't
> "conflict" with svn ones. Ie. the number alone would allow you to point
> to gerrit or CodeReview, keeping a bit of consistency between models.
> Even the urls kind of match https://gerrit.wikimedia.org/r/123 for r123
>
> Who cares about that? That r stands there for being a review system.
> I was told that zero effort was going to be made for that (they were
> unsure about the consequences of bumping the auto_increment, although
> there's little I can do about that) and to just talk about
> "change 1234" not "r1234".
>
> It's obvious that c3000 comes after r115000, isn't it? ;)

It looks like Gerrit's predecessor, Rietveld,
works with SVN (yes!) and has a much nicer interface:

http://codereview.appspot.com/5727045/#ps1

it uses "Issues" as the basis for development (I think this is
gerrit's "change"). Maybe we should use that instead
(yes, I know, Google App Engine and stuff...)

>> I really think that tighter integration with bugtracker (so
>> bug attachments end up in vcs review queue and commit comments
>> can be seen as quasi-bugs) would be much more beneficial to
>> users. I will try to see how it would have worked with systems
>> like fossil for example and report back.
>
> Indeed, that'd be an important feature.

And I think now you have enlightened me.

Maybe our workflow should be completely 'change'-based
and not 'commit'-based.

Our [[Git/Workflow]] page does not say much about changesets.

Actually working on a changeset mini-branch
(git fetch origin refs/changes/16/2916/1 && git checkout FETCH_HEAD)
and then 

git add ... && git commit -m "..." && git push origin HEAD:refs/changes/2916

would be a nice workflow; we probably wouldn't (or shouldn't)
do rebasing then since theoretically many people can work
on a changeset at the same time. 

(I can't check how it works since my "git push" fails again with
[remote rejected] HEAD -> refs/for/master (prohibited by Gerrit))

If "refs/changes/2916" also worked for fetch (now you need
"refs/changes/xx/2916/n") it could be very interesting.

If only I could somehow fetch all the "refs/changes/*" refs to my
local git repository - does anybody know how to do this?
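
For what it's worth, a catch-all refspec seems to do the trick - a
sketch, assuming the Gerrit remote is called "origin" (beware, this can
fetch a lot of data):

$ git fetch origin '+refs/changes/*:refs/remotes/origin/changes/*'
$ git log origin/changes/16/2916/1    # patchsets become remote-tracking refs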


It also seems impossible to do code reviews offline, since
we need to 

ssh -p 29418 gerrit.wikimedia.org host gerrit review  

It seems that we can use revision numbers sometimes:

ssh -p 29418 gerrit.wikimedia.org gerrit query 2714 

but to work on review witch patchsets we need to be more specific:

ssh gerrit.wikimedia.or

[Wikitech-l] Revisions in git outside of query or 'ssh ... gerrit query'

2012-03-12 Thread Marcin Cieslak
I have the following stuff in the git log
(from
ssh://sa...@gerrit.wikimedia.org:29418/test/mediawiki/extensions/examples):

commit 28f176ca9ca3767bfa9f0ec219f5fa0c299c5761
Merge: e2513b6 5c50ad6
Author: Demon 
Date:   Fri Mar 2 13:15:54 2012 +

Merge "Slight doc change as Git test"

commit 5c50ad62ebdee790735a29338aaec7bd9ad2c366
Author: jarry1250 
Date:   Thu Mar 1 21:03:19 2012 +

Slight doc change as Git test

Change-Id: I2b646e25ece2a0e16f960628e256828a2e020324

commit e2513b6edecf491e5dde1baa999627f9012386eb
Author: cmcmahon 
Date:   Wed Feb 29 14:03:41 2012 -0700

set up new laptop

Change-Id: I6d99a2c2dcc4a78ac2f8359c6aadbec52dc58dd6



It seems that ^demon is allowed to commit without
a "Change-Id". Fine.

I'd like to fetch the patches related to jarry1250 commit:

$ ssh -p 29418 sa...@gerrit.wikimedia.org gerrit query 
5c50ad62ebdee790735a29338aaec7bd9ad2c366 --patch-sets
change I2b646e25ece2a0e16f960628e256828a2e020324
  project: test/mediawiki/extensions/examples
  branch: master
  id: I2b646e25ece2a0e16f960628e256828a2e020324
  number: 2913
  subject: Slight doc change as Git test
  owner:
    name: Jarry1250
    email: jarry1...@gmail.com
  url: https://gerrit.wikimedia.org/r/2913
  lastUpdated: 2012-03-02 13:15:54 UTC
  sortKey: 001b6f1b0b61
  open: false
  status: MERGED
  patchSets:
    number: 1
    revision: 5c50ad62ebdee790735a29338aaec7bd9ad2c366
    ref: refs/changes/13/2913/1
    uploader:
      name: Jarry1250
      email: jarry1...@gmail.com


type: stats
rowCount: 1
runTimeMilliseconds: 511

so I can now

$ git fetch 
ssh://sa...@gerrit.wikimedia.org:29418/test/mediawiki/extensions/examples 
refs/changes/13/2913/1
From ssh://gerrit.wikimedia.org:29418/test/mediawiki/extensions/examples
 * branch            refs/changes/13/2913/1 -> FETCH_HEAD

$ git checkout -b r2913 FETCH_HEAD
Switched to a new branch 'r2913'

Now let's try the same with ^demon's commit:

$ ssh -p 29418 sa...@gerrit.wikimedia.org gerrit query 
28f176ca9ca3767bfa9f0ec219f5fa0c299c5761 --patch-sets
type: stats
rowCount: 0
runTimeMilliseconds: 429

It's just not there...

Same with another one:

$ ssh -p 29418 sa...@gerrit.wikimedia.org gerrit query 
17f028f7d0b73b777cc2a06d99b3538d2c95db09  --patch-sets
type: stats
rowCount: 0
runTimeMilliseconds: 429

Even if I listen all abandoned changes in the project:

$ ssh -p 29418 sa...@gerrit.wikimedia.org gerrit query status:abandoned 
project:test/mediawiki/extensions/examples --patch-sets > r1

They are not there:

$ grep 28f176ca9ca3767bfa9f0ec219f5fa0c299c5761 r1
$ grep 17f028f7d0b73b777cc2a06d99b3538d2c95db09 r1

Seems like they are totally outside of gerrit. Pretty lame for a "gated trunk"!

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] History entry without content change?

2012-03-12 Thread Marcin Cieslak
>> Robert Cummings  wrote:
> On 12-03-12 05:25 AM, Max Semenik wrote:
>> On 12.03.2012, 12:44 Robert wrote:
>>
>>> Hello all,
>>
>>> I was wondering if there's a clean way within the Wiki codebase to
>>> generate a history entry for an article without actually modifying the
>>> content. Essentially, from within an extension, I'd like to treat the
>>> history as a logfile for specific events related to the article that
>>> don't actually modify the article.
> >
>>
>> See Revision::newNullRevision()
>
>
> Perfect, thank you!

or maybe you can do something like

$logEntry = new ManualLogEntry( 'something', 'somethingmaybeese' );
$logEntry->setPerformer( $user );
$logEntry->setTarget( $this->mTitle );
$logEntry->setComment( $reason );
$logid = $logEntry->insert();
$logEntry->publish( $logid );

(from WikiPage.php line 2990)

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] We need to use git-review - need help packaging for Win/Mac (Gerrit)

2012-03-14 Thread Marcin Cieslak
>> Sumana Harihareswara  wrote:
> I wrote:
>>  So if you have tried and failed to install and use
>> git-review, please speak up ASAP so we can make our git-review
>> instructions and workflow more robust.
>
> I'm not hearing anyone saying that this is still failing for them.  And,
> as Roan pointed out, people can use

There is still some problem with pushing to 

/test/mediawiki/extensions/examples

with git-review:

http://tools.wikimedia.pl/~saper/fail/gerrit-fail-01

or without:

http://tools.wikimedia.pl/~saper/fail/gerrit-fail-02

Seems to be something caused by the configuration:

https://labsconsole.wikimedia.org/w/index.php?title=Gerrit_bugs_that_matter&action=historysubmit&diff=2638&oldid=2611

https://labsconsole.wikimedia.org/wiki/File:Gerrit_branch_permissions_wrong.png

notice refs/for/refs/*  - but changing this apparently does not help.

Fortunately, /test/mediawiki/core works fine for me.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Integrating code common to several extensions into core?

2012-03-15 Thread Marcin Cieslak
>> Max Semenik  wrote:
> What is our policy/best practices on code needed by several
> extensions? Does it make sense to integrate such code into core?
>
> For example, my current situation: there is a class in MobileFrontend
> that performs reformatting of HTML: remove some tags depending on
> their ID/class and so on. MF uses it to make page content more
> suitable for mobile devices. At the same time, it can be used for
> other transformations such as getting plain-text extracts of articles.
> Consequentially, producing such extracts is currently part of
> MobileFrontend although such functionality should belong to a separate
> extension (and why not core?). So if I want to use this functionality
> in several extensions, they should either depend on one of them or some
> meta-extension, both of these would be inconvenient.

If done *cleverly* - why not? Do you mean something like preventing
adding something by OutputPage::addHTML()/addElement() in the first place?

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Test suite for dumping MediaWikis using xmldumps-backup

2012-03-17 Thread Marcin Cieslak
>> Christian Aistleitner  wrote:
>
> Hello everyone,
>
> in the past weeks, I put together "xmldumps-test"---a test suite for
> Ariel's xmldumps-backup software. xmldumps-test tries to assure that
> the MySQL database, MediaWiki, and xmldumps-backup play nicely
> together.
>
> xmldumps-test injects data into the database (so do not use it on a
> live database), starts xmldumps-backup, and compares the generated XML
> dumps against pre-verified data.
> Using xmldumps-test I hope to catch problems caused by modifications
> to MediaWiki or xmldumps-backup /before/ they hit Wikimedia's
> production servers dumping enwiki, ...
>
> The code is up for review at
>   https://gerrit.wikimedia.org/r/p/operations/dumps/test.git
>.
> README serves as general point of entry to the documentation.
> README.installation shows you how to set up xmldumps-test.

One question:

in the https://gerrit.wikimedia.org/r/p/operations/dumps/test.git
repository there are two branches, "master" and "ariel",
and the README says we should use "ariel".

"master", however, seems to also be attached to a gerrit project

I was able to check it out using 

ssh://sa...@gerrit.wikimedia.org/operations/dumps/test.git

port 29418

Which shall we use? It seems that I can propose patches
using gerrit only to "master" while "ariel" seems
to be a bit more active.


Second thing - I was recently fixing a few nuts and bolts
for seamless PostgreSQL support, so I'd love to have
that for PostgreSQL too. Once I sort out the outstanding
installer/updater issues I am willing to help, of course.

We already ran a PostgreSQL testsuite on jenkins and
I think we should check dumps too. 

//Saper



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mark Hershberger departing Wikimedia Foundation in May

2012-03-17 Thread Marcin Cieslak
>> Petr Bena  wrote:
> Hi all
> I thought that his role must have been one of most boring like the walking
> through a lot of various reports and trying to make some lazy developers to
> fix them ;-) but i was surprised that when I started to work with him thas
> the work we did was almost most entertaining experience I had on wikimedia
> ever. Seriously I always felt like he is underappreciated for the hard work
> he did. Thanks to Mark I got in contact with lot of awesome people inluding
> himself and found that being a bugmeister is not a boring work but
> important part of development process. Thanks for your hard work! I hope ur
> not leaving the project definitely and stay at least as a volunteer who can
> help us even if not so actively

I fully agree. Mark turned this uninteresting job into something actually
motivating. No bug was too stupid to take care of and research.
He also did some testing, which was very helpful. I have to admit Mark's
work motivated me to do a bit more with MediaWiki!

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Test suite for dumping MediaWikis using xmldumps-backup

2012-03-17 Thread Marcin Cieslak
>> Christian Aistleitner  wrote:
>
> Hi Saper,
>
> On Sat, Mar 17, 2012 at 01:59:33PM +0000, Marcin Cieslak wrote:
>> > [ Announcing xmldumps-test ]
>> >
>> > The code is up for review at
>> >   https://gerrit.wikimedia.org/r/p/operations/dumps/test.git
>>
>> [ Confusion of URLs and branches ]
>
> yes, the current situation with the "ariel" branch is suboptimal.

I don't mind that, actually. I had confused the names and repositories.

>> Which shall we use?
>
> The current xmldumps-test is stored in the "master" branch of
>   https://gerrit.wikimedia.org/r/p/operations/dumps/test.git
>
> The current xmldumps-backup is stored in the "ariel" branch of
>   https://gerrit.wikimedia.org/r/p/operations/dumps.git

OK, so those are two separate pieces of software; indeed,
checking out the "ariel" branch of dumps.git informs me
that there are no shared commits.

My confusion came from the fact that I started reading
README.installation (of -test), which starts with a description
of -backup.

Thank you for explaining this!

//Saper



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki core deployments starting in April, and how that might work

2012-03-24 Thread Marcin Cieslak
>> Rob Lanphier  wrote:
> On Fri, Mar 23, 2012 at 10:13 AM, Chad  wrote:
>> On Fri, Mar 23, 2012 at 1:11 PM, Rob Lanphier  wrote:
>>> On Fri, Mar 23, 2012 at 10:03 AM, Chad  wrote:
 I disagree here. The wmf branches should always be linear, and you
 should never merge to it without accepting that it may get scapped.
>>>
>>> Actually, you should never merge to master without accepting that it
>>> may get scapped.  Why have two layers here?
>>>
>>
>> Well true, but I still don't see an instance where you'd merge
>> to master but *not* be willing to have it deployed.
>
> Chad and I discussed this, and I think we figure out where we were
> talking past each other.  I'll let him explain in more detail when he
> gets a chance, but the short answer is we'll probably be doing
> something roughly like I described as the second plan in my original
> email, with the addition of liberal use of tagging.  We could even
> *try* to get away with tagging instead of branching, and then only
> branch when necessary.

How will "git committers" (i.e. mere mortals) be able to propose
changes to be deployed to wmf production (a.k.a. the "1.19wmf1" tag
in [[Special:Code]])?

If we plan to use gerrit for that (submitting commits for review
in the "wmf" branches by means of refs/for/wmf-something),
I would ask for one thing - can we go full end-to-end and test
the workflow with actual commands?
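
For example, I would expect the submission itself to look roughly like
this (a sketch only - the remote and branch names are my guesses, not a
confirmed convention):

  $ git push origin HEAD:refs/for/wmf/1.19wmf1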

I have noticed potential issues with gerrit+git combination vs
pure git workflow:

* It seems I can't have a longer-running branch with my
  changes on some topic - those commits cannot be easily
  submitted to gerrit (since my branch will likely
  contain merges from master and we don't want them
  to be resubmitted via the dependency tree)
* It seems necessary to change "Change-Id" everywhere
  for "git cherry-pick" to work without -n - it seems
  that cherry-pick does not invoke our commit-msg hook
  when committing. Neither does "git merge".

Those things limit the possibilities for moving commits
between branches (official or local ones), so I think
it would be good to review the process in the test tree
using some actual commands (and with some test users
involved who don't have the privilege to cross the gated
trunk).

More on the second case in separate thread.

//Saper



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Caveat: git merge and git cherry-pick don't add "Change-Id" by default

2012-03-24 Thread Marcin Cieslak
When playing with moving (cherry-picking) commits from master to one
of the release branches (REL1_19) I noticed that "git cherry-pick"
and "git merge" do not invoke the "commit-msg" hook and therefore don't
add a Change-Id to the commits.

This is normally not a problem for people who are allowed to push changes
without a "Change-Id" to the repository (i.e. trunk gatekeepers),
but it may cause problems for committers at large.

1) you can't push result of such merge or cherry-pick for review

2) you can't directly cherry-pick a commit imported from SVN
  into git master (those have no Change-Ids) to one of
  the branches.

3) merge commits done by gatekeepers are not visible to gerrit
 (you can't find them, for example, when searching by ID).
 Therefore we lose the ability to comment on them if necessary.

Here is an example how it failed on me today:

https://www.mediawiki.org/w/index.php?title=Git/Workflow&diff=prev&oldid=515259

Sometimes a "git commit -c " or "git commit --amend"
are able to fix the issue, because "git commit" DOES invoke the hook.

This will be especially important for developers wanting to propose
their changes into release branches or deployment branches.

This may also bite you if you are using some private branches:
all commits there have Change-Ids, but merges will not, and
you may have a hard time pushing it all together for review.

It is also interesting to see how Change-IDs represent
multiple review items (I52150208654fa14e02b6d80fb2cff4108089ef6c 
is https://gerrit.wikimedia.org/r/3713 and 
https://gerrit.wikimedia.org/r/3714).

I think the right workaround right now is to make sure
all commits (even merges or things going directly into git master
for some reason) have their gerrit "Change-ID". 

With cherry pick it can be done by doing it in two stages:
  git cherry-pick -n 
  git commit -c 
(but you might want to improve the commit message).

Something similar can probably be done for merges
(there is a --no-commit option to git merge).
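
A sketch of what that could look like (the second step relies on "git
commit" invoking the commit-msg hook, as noted above):

  git merge --no-commit somebranch
  git commit    # runs the commit-msg hook, which adds the Change-Id

This of course assumes the Gerrit commit-msg hook is installed in
.git/hooks; if it is not, the standard Gerrit way to fetch it is
something like:

  scp -p -P 29418 yourlogin@gerrit.wikimedia.org:hooks/commit-msg .git/hooks/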

There are probably some other git commands causing automatic
commit that may have this problem. Fast forwards are
fine, as they don't produce commits.

I did some of the testing today:

https://gerrit.wikimedia.org/r/#q,project:test/mediawiki/core+owner:saper%2540saper.info,n,z

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Marcin Cieslak
>> Tim Starling  wrote:
>
> It doesn't work, I'm afraid. Because of the implicit rebase on push,
> usually subsequent changesets have a different parent. 

How does the "implicit rebase on push" work? Do you mean git-review?

I don't know whether "git push  HEAD:for/master/" rebases anything.
I still prefer plain "git push" (where I can also
say "HEAD:refs/changes/XXX") to git-review magic.
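
For reference, the plain-git forms I mean look like this (a sketch,
assuming the remote is called "origin"):

$ git push origin HEAD:refs/for/master      # submit a new change for review
$ git push origin HEAD:refs/changes/3841    # upload a new patchset to change 3841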

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Changeset differ (was Re: Committing followups: please no --amend)

2012-03-28 Thread Marcin Cieslak
>> Tim Starling  wrote:
> I wrote:
>> It doesn't work, I'm afraid. Because of the implicit rebase on push,
>> usually subsequent changesets have a different parent. So when you
>> diff between the two branches, you get all of the intervening commits
>> which were merged to the master.
>
> I was hoping that someone was going to say "you're wrong, making those
> diffs is easy, here's how." But I take it by the silence that I'm not
> wrong, and it really is hard.


I just tried to push a second commit to
https://gerrit.wikimedia.org/r/#change,3841
patchset three.

If you don't start from "scratch", i.e. base your commit on the parent:

8824515e571eadd4a63b09e1331f35309315603f

(now I have

$ git log HEAD ^HEAD^^^
commit e67af5bbd843db3062cc0082254b69aae3d1241b
Author: saper 
Date:   Wed Mar 28 22:06:17 2012 +0200

An example how a foreign key should be added to the table

Change-Id: I0da5b25f4b4499facac6c410fa7ab74250935288

commit 96692fb23c00cb726144290b108623896cf24834
Author: Marc A. Pelletier 
Date:   Tue Mar 27 22:44:32 2012 -0400

(bug 5445) remove autoblocks when user is unblocked

(...comment truncated...)

Change-Id: I4aa820ae9bbd962a12d0b48b6c638a1b6ff4efc9

This is the current HEAD:

commit 8824515e571eadd4a63b09e1331f35309315603f
Author: Santhosh Thottingal 
Date:   Wed Mar 28 11:25:45 2012 +0530


Trying to commit e67af5bbd843db3062cc0082254b69aae3d1241b
makes gerrit say:

 ! [remote rejected] HEAD -> refs/changes/3841 (squash commits first)

It does not matter if I use the same change ID or not. It knows
exactly where it should go but it still refuses it.

I have managed to work around this by creating a branch, doing
lots of commits there, merging it, and pushing the merge to gerrit.

But then it uploads lots of unrelated changesets:

https://gerrit.wikimedia.org/r/#change,3706
https://gerrit.wikimedia.org/r/#change,3707
https://gerrit.wikimedia.org/r/#change,3708 (but this was outside of the branch)
https://gerrit.wikimedia.org/r/#change,3709

The commit tree looked like:

  private branch:      3706 --- 3707
                      /             \
  62562768cf8f2696 ------ 3708 ------ 3709 (merge)


As you can see, although there were so many changesets,
they all have dependencies set properly.

Is this a better way? I don't know...

I wonder why in this case gerrit does not complain
with its usual "(squash commits first)".

Having shared private branches with other people
would certainly help in working together on issues.

I tried to submit an improvement to
https://gerrit.wikimedia.org/r/#change,3841 and
it seems I can't do this any other way than by
rebasing my changes onto the parent of the changeset
(*not* master). I am not sure how to make a branch
out of it (maybe I should merge it with the parent
commit?)


//Saper




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Changeset differ (was Re: Committing followups: please no --amend)

2012-03-29 Thread Marcin Cieslak
>> Tim Starling  wrote:
> I wrote:
>
> Also, I'm concluding based on Roan's objections that I'm going to have
> a hard time convincing people to stop amending their commits. So I
> wrote this script that provides changeset diffs for reviewers:
>
> http://tstarling.com/gerrit-differ.php
>
> It fetches both commits from a local mirror into a temporary branch,
> rebases the branches to a common ancestor, and then diffs them.

There is an interesting failure mode in the whole plan :)
As you noticed on IRC, comparing patchsets 4 and 5 makes no sense.
Patchset 5 actually does not change anything except the commit
message relative to patchset 4, but that's not the issue.

Side note:
 I have a small tool (https://github.com/saper/gerrit-fetch-all)
 to fetch all changesets from gerrit and branch them as
 "change/3841/4". Then you can diff your changesets locally,
 in the git repository.

Here's why: patchsets 1 through 4 are based off revision

$ git log --pretty=oneline change/3841/4 ^change/3841/4^^
fcc05dee9b93e080b13fc4b0d5b83a1c75d34362 (bug 5445) remove autoblocks when user is unblocked
8824515e571eadd4a63b09e1331f35309315603f Fix Bug 30681 - Wrong escaping for inexistent messages.

While patchset 5 has been rebased to a newer changeset
from master:

$ git log --pretty=oneline change/3841/5 ^change/3841/5^^
4f2ff743ff1bc93d922ab9b5b3135786df5c7b69 (bug 5445) remove autoblocks when user is unblocked
571e63cd2c2bac9a033e1816f5ad8b6a14b4f42b Merge "Use local context to get messages"
95c35e52113b9a98accc1e9b0e9fffc15b1661a8 Use local context to get messages

$ git branch -vv |grep change/3841/
  change/3841/1  89daac5 Remove autoblocks when original block goes (Bug #5445)
  change/3841/2  b9090b3 (bug 5445) remove autoblocks when user is unblocked
  change/3841/3  96692fb (bug 5445) remove autoblocks when user is unblocked
  change/3841/4  fcc05de (bug 5445) remove autoblocks when user is unblocked
* change/3841/5  4f2ff74 (bug 5445) remove autoblocks when user is unblocked

So here's how patchsets 4 and 5 differ according to git:

$ git log --pretty=oneline change/3841/4...change/3841/5
* 4f2ff743ff1bc93d922ab9b5b3135786df5c7b69 (bug 5445) remove autoblocks when user is unblocked
*   571e63cd2c2bac9a033e1816f5ad8b6a14b4f42b Merge "Use local context to get messages"
|\  
| * 95c35e52113b9a98accc1e9b0e9fffc15b1661a8 Use local context to get messages
* |   681a170f290ca0a7b0d771155ddc59f091a5576d Merge "Add phpunit testcases for Bug 30681"
|\ \  
| * | b91ffd7b09b445224cdef27a3a40bc9ded1fb8c7 Add phpunit testcases for Bug 30681
|  /  
* | dde3821ac130486a24a7f7a97eaf0eb6d67e55d2 (bug 35541) ns gender aliases for Croatian (hr)
|/  
* cc2f70df0d106f84877591113d3973214bcfd36a gitignore mwsql script history file
* fcc05dee9b93e080b13fc4b0d5b83a1c75d34362 (bug 5445) remove autoblocks when user is unblocked

The common revision for both changesets is the following:

$ git merge-base change/3841/4 change/3841/5   
8824515e571eadd4a63b09e1331f35309315603f

(this is the parent of change/3841/1..4)

So it is clear that a diff between them will include all the changes merged to
master in between.

My git-fu is limited, so I don't know how to compare such revisions.
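
One thing that might work is to compare the patches rather than the
trees - an "interdiff" of sorts. A sketch (bash process substitution,
reusing the change/ branches created above):

$ diff -u <(git diff change/3841/4^ change/3841/4) \
          <(git diff change/3841/5^ change/3841/5)

This shows how the two patchsets differ as patches, at the cost of some
noise wherever the surrounding context lines changed.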

I think we generally run into a git architectural assumption
- that git is meant to store trees of files, and "diffs"
or "changes" are not objects in the git world at all.

So I think that rebasing the patchset onto the current master is a bad idea
for review. However, skipping the rebase increases the likelihood of a merge
conflict once the change is approved. Maybe we should have a workflow like this:

patchset 1 - proposed change
patchset 1 - review, negative
patchset 2 - updated change
patchset 2 - review, negative
patchset 3 - updated change
patchset 3 - review, positive - change approved
patchset 4 - patchset 3 rebased to the master branch
patchset 4 - merged, closed (if ok)

Would that work in practice?

//Saper





___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] correct way to import SQL dumps into MySQL database in terms of character encoding

2012-04-01 Thread Marcin Cieslak
>> Piotr Jagielski  wrote:
> Hello,
>
> set my data source URL to the following in my Java code:
> jdbc:mysql://localhost/plwiki?useUnicode=true&characterEncoding=UTF-8

Please note you have "plwiki" here and you imported into "wiki".
Assuming your .my.cnf is not making things difficult, I ran a small
Jython script to test:

$ jython
Jython 2.5.2 (Release_2_5_2:7206, Mar 2 2011, 23:12:06) 
[OpenJDK 64-Bit Server VM (Sun Microsystems Inc.)] on java1.6.0
Type "help", "copyright", "credits" or "license" for more information.
>>> from com.ziclix.python.sql import zxJDBC
>>> d, u, p, v = "jdbc:mysql://localhost/wiki", "root", None, 
>>> "org.gjt.mm.mysql.Driver"
>>> db = zxJDBC.connect(d, u, p, v, CHARSET="utf8")
>>> c=db.cursor()
>>> c.execute("select cl_from, cl_to from categorylinks where cl_from=61 limit 
>>> 10")
>>> c.fetchone()
(61, array('b', [65, 110, 100, 111, 114, 97]))
>>> (a,b) = c.fetchone()
>>> print b
array('b', [67, 122, -59, -126, 111, 110, 107, 111, 119, 105, 101, 95, 79, 114, 
103, 97, 110, 105, 122, 97, 99, 106, 105, 95, 78, 97, 114, 111, 100, -61, -77, 
119, 95, 90, 106, 101, 100, 110, 111, 99, 122, 111, 110, 121, 99, 104])
>>> for x in b:
... try:
... print chr(x),
... except ValueError:
... print "%02x" % x,
... 
C z -3b -7e o n k o w i e _ O r g a n i z a c j i _ N a r o d -3d -4d w _ Z j e 
d n o c z o n y c h

array('b", [ ... ]) in Jython means that SQL driver returns an array of bytes.

It seems to me that array of bytes contains raw UTF-8, so you need to decode it 
into
proper Unicode that Java uses in strings. 

I think this behaviour is described in

http://bugs.mysql.com/bug.php?id=25528

Probably you need to play with getBytes() on a result object
to get what you want.
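
For example (just a sketch, continuing the Jython session above),
java.lang.String can do the UTF-8 decoding for you:

>>> from java.lang import String
>>> print String(b, "UTF-8")
Członkowie_Organizacji_Narodów_Zjednoczonych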

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] GSOC proposal: Native application uploader

2012-04-04 Thread Marcin Cieslak
Header contents ["Followup-To:" 
gmane.science.linguistics.wikipedia.technical.]
>> Platonides  wrote:
> Hello all,
> I'm presenting a GSOC proposal for a native desktop
> application designed for mass uploading files on
> upload campaigns.

> Opinions? Improvements? Sexy names? Mentors?
>
> All of them are welcome!
>
> 1-
> http://lists.wikimedia.org/pipermail/wikilovesmonuments/2012-March/002538.html

Odder already mentions Commonist in his email, so let me expand on this
(as I have been fixing older Java versions to work with more modern MediaWikis):

You have the last Java version in our SVN:

http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/commonist-java/

as well as the next generation Commonist in Scala:

http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/commonist/

The Scala version solved some architectural issues with the Java version.

I would definitely recommend building on Commonist; I actually like the
tool very much (I was still using the old Java version until recently).  It
has a simple UI that meets *most* of the requirements.

Actually providing some sensible defaults (or an even simpler UI) should be
enough for WLM people. Commonist is actually quite customizable
(a lot can be done using property files and templates).

The only thing which I really don't like in Commonist is that the actual
upload phase is done together with metadata editing.  Metadata wasn't
saved (at least in the older versions I have used) together with the
images (or anywhere else - images can be on a R/O medium),
so you would lose it if the tool was closed.

So probably there should be three phases:

(1) metadata management/editing (that includes some defaults for WLM
folk) 

(2) actual upload/sync (Commonist has the ability to re-upload).  

(3) obtaining upload results and letting users decide what to do with
problems (force re-upload etc.)

Some users with very limited upstream bandwidth reported quite good
results with Commonist when needing to upload lots of images and having
to leave the computer working overnight to actually transfer them.

And there is one feature that a huge majority of people actually liked
- Commonist can be launched from a webpage as a Java Web Start
application, so - from the user's perspective - you don't really need
to "install" it on your computer. I've even talked to some who didn't
really realize it was a separate application; it just magically worked
for them out of the browser. Huge advantage. 

So from my POV - +1 for taking Commonist to the next level, even
if this means learning Scala. 

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Save to userspace

2012-04-11 Thread Marcin Cieslak
>> Petr Bena  wrote:
> It isn't stable, maybe someone should take over the work on it... If
> it was finished it would be nice to have feature, if it was ever
> deployed of course.

Can you describe (maybe on a talk page, or maybe better in Bugzilla)
what's wrong with this extension (why it isn't working) and also
how you would like to have it changed? It's much easier to start
working from an obsolete codebase than from scratch. 

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] selenium browser testing proposal and prototype

2012-04-11 Thread Marcin Cieslak
>> Markus Glaser  wrote:
> Some time ago some people from the test framework team started working on a 
> Selenium Framework for MediaWiki [1], in PHP and with Selenium 1.0. One of 
> the reasons the project discontinued was that there was no clear case of when 
> Selenium would be useful as opposed to unit tests, esp. using QUnit and 
> TestSwarm for UI testing. I still see some use cases, though:

> * This also might be useful when filing bugs (make them reproducible)

One example I would like to have scripted is the following:

- have a sysop account to watch the non-existing page name
- create that page with some content
- have a sysop to delete this page

Very good testing case for DB transaction related problems.

I assume doing such tests should be possible in the new framework?
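
A minimal sketch of such a scenario with the Python WebDriver bindings
(wiki URL and page title are made up; logging in as a sysop and token
handling are glossed over; element IDs are from memory):

from selenium import webdriver
from selenium.webdriver.common.by import By

BASE = "http://localhost/wiki/index.php"               # assumed local test wiki

driver = webdriver.Firefox()
driver.get(BASE + "?title=Sandbox_test&action=watch")  # 1. watch the page
driver.get(BASE + "?title=Sandbox_test&action=edit")   # 2. create it
driver.find_element(By.ID, "wpTextbox1").send_keys("some content")
driver.find_element(By.ID, "wpSave").click()
driver.get(BASE + "?title=Sandbox_test&action=delete") # 3. delete it again
driver.find_element(By.ID, "wpConfirmB").click()
driver.quit()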

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] @since

2012-04-11 Thread Marcin Cieslak
Hello,

I have a problem similar to the one of RELEASE-NOTES.

After great pains (broken merges, unknown dependencies, etc.)
I have pushed f74ed02ef744138a8d2a87322f81023ddc143a5f where
I have marked some methods "@since 1.19" since I really hope
to have this backported to 1.19 and maybe even 1.18.

How should we handle the @since stuff? What if it never
gets merged to the release branch? If not, which one?

Should REL1_18 say @since 1.18.3 and REL1_19 @since 1.19,
or should all of them consistently mention the lowest (although
1.18.4 may be younger than, say, 1.19)?

Or, once something "@since 1.20" gets merged into REL1_19,
should I modify @since to 1.19 in master and update 
master's RELEASE-NOTES-1.19 as well?

The answer I got on IRC is that it is usually up to the
developer to plan in advance which release a change should go into;
but I can understand that my changes
won't be allowed into, say, REL1_18 for one reason or another
(they are not security fixes, for example).

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Lua: return versus print

2012-04-15 Thread Marcin Cieslak
>> Trevor Parscal  wrote:
> +1 to all the points for using return values.

Zope has a nice solution here:

   print "Asdsds" 

actually prints to the internal magic variable "printed",
which has to be returned later with

   return printed

if it's going to end up as the function result.

Not sure if this is possible in Lua.
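
In plain Python the same trick could be sketched as follows (not Zope's
actual implementation, just the idea):

import io
from contextlib import redirect_stdout

def render():
    buf = io.StringIO()
    with redirect_stdout(buf):
        print("Asdsds")        # captured in the buffer, not the terminal
    printed = buf.getvalue()
    return printed             # Zope-style: "printed" becomes the result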

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] git-review wanting to submit lot of changes

2012-04-15 Thread Marcin Cieslak
>> Daniel Friesen  wrote:
> What about the users who clone from their own GitHub fork as origin and  
> push side-project branches there before merging and pushing finished  
> projects to gerrit?

A proper fix currently in the works is not to need a .gitreview file at all
if one of the remotes ("origin", "gerrit" or whatever) is pointing
to a valid Gerrit instance. Actually we don't need a new remote at all,
but that's the behaviour git-review insisted on.
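
For example, a remote like this one (illustrative URL; substitute your
own username) should then be picked up automatically:

$ git remote -v
origin  ssh://<user>@gerrit.wikimedia.org:29418/mediawiki/core.git (fetch)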

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Reminder about online Git/Gerrit tutorial - May 8 (today) @ 19:00 UTC

2012-05-08 Thread Marcin Cieslak
>> Gregory Varnum  wrote:
> Reminder to folks about the tutorial that saper is doing for Git / Gerrit 
> later today @ 19:00 UTC.
>
> Below is more info, but here is a link with the highlights:
> https://www.mediawiki.org/wiki/Project:WikiProject_Extensions/MediaWiki_Workshops#Upcoming:_Git_.2F_Gerrit_Tutorial
>
> Remember to RSVP!
> http://doodle.com/qnrgibpqxyamqhzb
>
> I hope to see folks there!
>
> This is at least the third Git related training and more are coming.  In 
> other words - have no fear - help is on the way.  :)

I have posted some details regarding SIP/IRC/SSH setup:

  https://www.mediawiki.org/wiki/Git/Workshop

Please test your SIP connection beforehand!

See you there...

//Marcin



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The most efficient approach to patchset dependencies

2012-05-08 Thread Marcin Cieslak
>> Beau  wrote:
> I know gerrit can use dependencies, so I can make a chain of dependant
> changes: c1 <- c2 <- c3 <- c4. However if c2 and c3 got a positive
> review and c1 needs some rework, c2 and c3 need to be reviewed again
> after I submit c1. Sometimes another, unrelated change may be merged, so
> the whole chain needs to be rebased against master.

Refactoring with code review is hard... If you refactor code,
the whole merging magic usually does more harm than good.

What I would suggest:

* Try to keep refactoring changes separate from functional changes
* You can try to separate independent changes and have a common parent
  for them

Ideally, the commit tree could look like:

  P --
  | \ \
  |  \ \
  r1  r2   c1
  |\   \
  | \   \
  c2 c3 c4

r1 - refactoring change 1
r2 - independent refactoring change 2
c1 - independent functional change
c2,c3 - functional changes after refactoring 1
c4 - functional change after refactoring 2 

You can achieve this by going back to the parent every time
you are finished with some other change:

 git checkout <parent>

or 

 git reset --soft <parent>  (if you already have changes pending)

Make sure to have separate Change-Ids for c2 and c3, otherwise
they might end up as two patchsets of the same change.
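
For example, a sketch of that dance with made-up branch names:

$ git checkout -b refactor-1 origin/master   # work on r1 and commit it here
$ git checkout -b change-c2                  # c2 branches off r1; commit c2
$ git checkout refactor-1                    # back to the parent of c2
$ git checkout -b change-c3                  # c3 becomes a sibling of c2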

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Reminder about online Git/Gerrit tutorial - May 8 (today) @ 19:00 UTC

2012-05-08 Thread Marcin Cieslak
>> Beau  wrote:
> On 05/08/12 17:23, Marcin Cieslak wrote:
>> See you there...
>
> Saper, thanks for your time! Some mysterious git commands actually make
> sense for me now :-).

There were 11 people on the call who managed to overcome issues with
SIP networking. Big thanks to alfa, hotel, india, oscar, sierra, tango,
whiskey, xray and zulu for staying for so long and your patience :)

It took almost two hours - but I hope you liked the somewhat experimental
format of the exercise. Please feel free to send your feedback to me
or Sumanah - it was the first time in this format and we need to work
to make it better.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] a procedure for setting up a prototype Wikimedia wiki

2011-03-13 Thread Marcin Cieslak
>> Amir E. Aharoni  wrote:
> Is there some kind of a procedure for setting up a prototype Wikimedia
> wiki for testing? For example, which articles, templates and special
> pages should be copied there from the corresponding live wiki?
>
> In the Arabic prototype ( http://prototype.wikimedia.org/release-ar/ )
> there are very few articles and all of them have Arabic titles. It
> makes it impossible to test Bug 26665. I created an article called
> ABCDE to test it, but to save time and ensure better testing in the
> first place, creating articles in various scripts and directionalities
> must be a part of the standard procedure for creating a prototype.

What about importing a sane number of articles using [[Special:Import]]
or a bot (like pywikipedia)?
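
For example, a rough sketch with today's Pywikibot (the "prototype"
family entry and the page titles are made up):

import pywikibot

source = pywikibot.Site("ar", "wikipedia")
target = pywikibot.Site("ar", "prototype")   # hypothetical family entry

for title in ("ABCDE", "مثال"):              # sample titles in various scripts
    text = pywikibot.Page(source, title).text
    page = pywikibot.Page(target, title)
    page.text = text
    page.save(summary="Import sample content for directionality testing")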

> And since i'm mentioning it,
> http://prototype.wikimedia.org/deployment-ar/ doesn't seem to work at
> all.

Some templates ended up in a loop; I have blanked the page so you can find
out which template it was.

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] New plans for the skin system

2011-03-13 Thread Marcin Cieslak
>> Daniel Friesen  wrote:
> http://www.mediawiki.org/wiki/User:Dantman/Skinning_system/New_skin_layout
> I have some plans for a new way of laying out skins in a new skin system.
>
> The key changes being; The creation of new region block handling in the 
> system (default skins typically just including a standard body region 
> and messages region) to replace newtalk, sitenotice, the jsmessage div, 
> bodytext, catlinks, and dataAfterContent (used by flaggedrevs, smw, and 
> other extensions). And changes to what types of navigation we support 
> and how dominant the location of the toolbox is.
>
> This new layout and system will likely be done using a planned xml/html 
> based template syntax.
> http://www.mediawiki.org/wiki/User:Dantman/Skinning_system/Monobook_template
> http://www.mediawiki.org/wiki/User:Dantman/Skinning_system#xml.2Fhtml_template_syntax

Did you see Zope Page Templates

http://www.zope.org/Documentation/Books/ZopeBook/2_6Edition/ZPT.stx

There is even a PHP implementation out there (TAL).

It's a consistent and mature system (I still prefer
tagsoup-like templating, but that has obvious disadvantages).

> http://www.mediawiki.org/wiki/User:Dantman/Skinning_system/Skin_examination
> I would also like to eventually eliminate our three skins using the 
> legacy system (Nostalgia, Standard/Classic, Cologne Blue). They don't 
> "properly" use our SkinTemplate system and require another 930+ lines of 
> code dedicated to their support, full of code duplication which 
> repeatedly gets in the way of new features because we insist that any 
> change we make to the ui should be backported to also work with legacy code.

That depends on whether you want to change people or the system. I happened
to survive the whole Monobook era on either Standard or Simple and finally
settled down on Nostalgia due to its unique feature of having a dropdown
list of special pages (I use [[Special:Specialpages]] a lot). Then I
switched to Vector with some reluctance, but finally accepted it.

If you plan to have a new, pretty and universal templating system
for the skins, please do try to accommodate those as a challenge.

It is also worth looking at some non-MediaWiki-supplied skins,
like one of my almost-favourites, Beagle 
(http://beagle-project.org/skins/beagle-skin.tar.gz; that one
builds on QuickTemplate/Monobook, I think), but you
will find many others. It could be interesting to see those
that depart from the typical MediaWiki look (you know,
that thing that makes you type "Special:RecentChanges" in the
URL bar :)

Not to mention the ever-recurring subject of the mobile site :)

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] GitHub Mirror of the svn repository

2011-03-16 Thread Marcin Cieslak
>> Yuvi Panda  wrote:
> I noticed that there's a github mirror of the svn repository at
> https://github.com/mediawiki, but it is rather out of date. Any idea
> if/when it could be made up-to-date again?

Can't help you with github, but I keep an almost-live mirror of the MediaWiki 
SVN here if you can use Mercurial:

http://bitbucket.org/mediawiki/test/

I recommend using ssh for checkout.

Good thing: revision numbers match svn revision numbers.

//Marcin




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Update collapsed!

2011-03-17 Thread Marcin Cieslak
>> Uwe Baumbach  wrote:
>
> Warning: mysql_query(): supplied argument is not a valid MySQL-Link
> resource in /data/wiki/wiki-test/includes/db/DatabaseMysql.php on line 23

Seems like you don't have a valid connection to your test database
(I hope it's a copy of the production one :)

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] using RC4 cypher for secure.wikimedia.org

2011-03-21 Thread Marcin Cieslak
>> Daniel Kinzler  wrote:
> hi all!
>
> In the hope i'm not clubbing a diseased donkey, i'd like to share an idea i 
> ran
> across: we could use the RC4-128 cypher for secure.wikimedia.org, instead of
> AES256. RC4 is reportedly a lot faster (3 to 4 times the throughput). Since 
> CPU
> capacity for encryption has been mentioned as one of the problems with making
> secure.wikimedia.org reliable, I thought it might help.

% openssl s_client -connect secure.wikimedia.org:443 -CApath 
/usr/local/share/certs/  -cipher RC4-SHA

(...)

New, TLSv1/SSLv3, Cipher is RC4-SHA
Server public key is 1024 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
SSL-Session:
Protocol  : TLSv1
Cipher: RC4-SHA
Session-ID: 23D5071FD392363CB3215134418219F083866583174FC2809795C4AB373F4EDC
(...)

seems to work for me.

As for giving up the benefits brought by the Diffie-Hellman ephemeral
key exchange: anyone may turn that off in their browser. In the SeaMonkey
I have at hand, that means going into "about:config" and setting a whole 
bunch of modes in "security.ssl3" to "false".

Not sure if this should be done server-side, though. Unless there is a dire need
and we can't support more sessions than we already do.

//Marcin
  






___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] testing of localization

2011-03-22 Thread Marcin Cieslak
>> Markus Glaser  wrote:
>
> Hello,
>
>> The right way to do it is to have a test suite: A document that walks
>> the translator through all the use cases of the feature so that all
>> possible messages permutations and combinations appear.
>
> Possibly, Selenium tests could be helpful. A Selenium test basically
> acts as a remote control for a browser. So a test could walk through
> most of the cases that produces localized messages. Since a Selenium
> test is client side, though, it cannot (re)produce all the errors
> on a server. So, for example, a missing connection to a database
> could not be triggered. Another thing we might have to consider is
> how to set breakpoints to pause the Selenium test in order to allow
> a translator to check the messages.

That would be wonderful. (I don't think PHPUnit is the way to go,
since translators would only get checks whether a message looks
like it *should*, without the context).

When translating a complex extension like FlaggedRevs or AbuseFilter,
you have to develop quite a large dictionary of concepts that should
be somehow consistent. This attempt at consistency sometimes leads to
unclear messages, as seen in real life. 

So the possibility to run a pre-flight test of the translation
(or even to watch a demo of the original in action) is something
Selenium could definitely help with.  In many cases, translators
do not have permission to experience some interfaces in the live
environment (CheckUser, AbuseFilter, etc.). Could Selenium
solve this problem as well? Is it just mocking up the interface,
or do I need an instance behind it that is somehow set up somewhere?

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Toolserver source control (was Re: Converting to Git?)

2011-03-22 Thread Marcin Cieslak
>> Mark Wonsil  wrote:
>
> I haven't used git yet but after reading the excellent article that
> Rob Lanphier posted (http://hginit.com/00.html), I think I will. That
> article also explains why there wouldn't have to be as many updates to
> SVN as is done today.

The article is about Mercurial, not git :)

Although I assume most of this thread is about using a DVCS in general,
and not only about git.  Mercurial has a somewhat more concise and 
consistent set of commands. 
//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] testing of localization

2011-03-23 Thread Marcin Cieslak
>> Markus Glaser  wrote:
> Hello,
>
>> What would be nice for a first take would be if, from translatewiki, 
>> one could click a link that would take one to a list of screens that 
>> show the message in question, with the text highlighted even.  
>> Even if the only thing we had to work from was a display of the English 
>> language texts in context, that would still be a big step up from where 
>> we are now.
>
> I think Selenium provides a screenshot feature. So if we can
> get Selenium to dump screens and name them according to a sensible
> scheme, that could produce the material you ask for. This could
> then be done for the English language as a kind of template screen
> set and also for the target language, at certain points, e.g. before
> a new minor release comes out.

That's worth looking at. Actually we could have complete walkthroughs
of many different workflows in MediaWiki, useful also for training
or documentation purposes.  I wonder how many developers,
after spending many hours reworking code, have the time and energy
to update guides like this:

https://secure.wikimedia.org/wikipedia/meta/wiki/Help:CentralNotice

What I am doing now is deploying a small test wiki on my box
or on the Polish toolserver, enabling the specific extension there
and clicking through it. But how many translators can do this
on their own? Or, an even better question: how can I share those
one-time setups with other translators, since this is mostly 
a one-off effort? Or is test.wikimedia.org enough for everybody?

Of course, the problem is in preparing 
those "walkthroughs" - how to cover the many variants
of possible configurations (many extensions show/hide
elements of the UI depending on various $wgVariables) as well
as how to prepare a reasonable set of test data to be imported
(an empty CheckUser or AbuseFilter interface is pretty
useless).  But having an automated screenshot machine gun
would improve things a lot. Gee, I wish I had attended
one of the talks that Markus gave last year about Selenium.
//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Converting to Git?

2011-03-23 Thread Marcin Cieslak
>> Roan Kattouw  wrote:
> 2011/3/23 Tim Starling :

> I disagree, however, that the backlog of trunk reviews is not
> concerning. It means that we still haven't come up with a good process
> for reliable and quick (as in time between commit and review) code
> review. I have briefly told the list about my ideas for this in the
> past, maybe I should revive that thread some time. I also believe
> that, once we have a process where every commit is reviewed within a
> reasonable timespan (ideally at most a week), getting deployment
> closer to trunk and getting it to stay there will actually be fairly
> easy to do.

One of the ways to improve this is to assign mentors to new committers.
Mentors don't necessarily have to be related to the particular area
of the committer's work. The question is how many current developers
would have time to accommodate "newbies", but maybe we can work
towards this idea?

//Marcin (saper)



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Improving code review: Mentors/maintainers? (was: Re: Converting to Git?)

2011-03-23 Thread Marcin Cieslak
>> Roan Kattouw  wrote:
> 2011/3/23 Marcin Cieslak :
>> One of the ways to improve this is to assign mentors to new committers.
>> Mentors don't have to be necessarily related to the particular area
>> of committers work. The question is how many current developers
>> would have time to accommodate "newbies", but maybe we can work
>> towards this idea?
>>
> I was personally going for something more like assigning reviewers to
> MW components, paths, time slots, or some combination thereof. But if
> we're gonna have that discussion for real, let's have it in its on
> thread :)

Yea, right, sorry for the hijack; changed the subject for now. I am not sure
that any formal assignment will work, especially taking volunteers into account.
I prefer to do "svn log" and see who made the last few meaningful commits.
In some areas it will be easy - like in the API, where I am most likely to see
"catrope" or "reedy" - but there are some forgotten swamps (like
some extensions) where only raymond and friends push localization updates.

Obviously, I consider it a good practice to talk to someone who
touched the code earlier (as I did with CentralNotice and didn't with the
API). Maybe we lack some forum for this pre-commit exchange
(I think some open forum is better than a private exchange
by mail or even on an IRC channel). But sometimes things are too trivial
to bother *the* mailing list (and then, wikimedia-tech seems to be the
more likely place, since it became more developer-oriented than 
mediawiki-l). 

Just to give a not-so-hypothetical example, since I don't like discussing 
in vain, what about this:

 Is it okay to fix https://bugzilla.wikimedia.org/show_bug.cgi?id=16260
 by adding a new [[Message:qsidebar]] that is the same as [[Message:Sidebar]]
 but only accepts the EDIT, THISPAGE, CONTEXT, MYPAGES, SPECIALPAGES, TOOLBOX boxes?

I see that hartman and dantman did some work there recently, and ashley
did one cleanup about a year ago.

//Marcin
 


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On HTML5

2011-03-30 Thread Marcin Cieslak
>> Aryeh Gregor  wrote:
>
>> The fact that you don't see the difference in the two screenshots is not
>> reassuring. In one, the search icon is clearly misplaced as it is
>> overlapping the border of the search input, in the other it is not. This is
>> caused solely by the DOCTYPE difference.
>
> Now I see it, yes.  They're cropped very differently, so I didn't spot
> the difference when flipping back and forth.  I was looking at the
> boxes' edges, and didn't notice the difference in the position of the
> magnifying glass.  Anyway, this is exactly the sort of minor bug where
> it's not worth it to worry too much about whether it breaks for a
> while -- certainly not to the extent of having to budget for the
> change, nor to the extent of reverting a change that's been in place
> for so long.  To the extent of reviewing before deployment, sure,
> maybe.  Doesn't bother me if we deploy without reviewing for this kind
> of thing, but I can see why you'd prefer to be more careful.

Sorry for bringing this up - I was the one who cropped those screenshots
that way. Those screenshots are best viewed within the context of:

http://secure.wikimedia.org/wikipedia/mediawiki/wiki/Special:Code/MediaWiki/82155#c15527

By the way, I am not happy with the "reverted" version either; the search
icon is a bit too high and the box is overlapping the line. 
In general, this CSS is probably too fragile and too prone to
font/platform/resolution/zoom factor/whatever problems.

I really hope that getting proper CSS for HTML5 into *core* is just a matter
of checking a few workarounds in Vector and maybe Monobook, and that's it.

However, for successful Wikimedia deployment, there is a different problem:

For years already I have been fighting a happy-go-lucky approach to styling 
in MediaWiki:Common.css/Monobook.css as well as in individual templates
(my battleground is Polish Wikipedia). There are dozens of quick fixes
and workarounds in there, also for various browser issues. The problem is
indeed that they sometimes haven't been properly communicated back upstream.

Actually I was wondering - and we need to discuss this with
plwiki's tech community - to literally throw out all of 
MediaWiki:Common.js/Common.css
code and "see what breaks" once Wikimedia moves to HTML5. 
Maybe this could be a sort of heads-up message given out to the community
once we are ready to switch over to HTML5?

We have a huge backlog in converting our JavaScript already
(ResourceLoader for one, and removing all the library code we should
get rid of now that we have jQuery). Only the most rudimentary
fixes to the sitewide js have been done so far, and maybe some gadgets.

While I deeply believe we can quickly sort out all the issues with
the MediaWiki codebase, the real problem is with the global 
css/js cruft that major Wikipedias accumulated over the years.
I am afraid that's something that WMF developers cannot really handle,
since they didn't put it in there in the first place - but
maybe this is a problem with just a few major wikis only?

Maybe there are some things that would help, like
running copies of major wikis on prototype (including the content!)
or something like that. Any suggestions?

//Marcin






___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Git on Windows

2011-03-30 Thread Marcin Cieslak
> Also, since the bash shell is there, I think we can legitimately use 
> some custom scripts to help with dealing with piles of repos at once. 
> I'd probably pick python, ;) though theoretically you could write bash 
> scripts and use them on Windows with the Git Extensions setup.

Many people believe [1] it helps to have a set of scripts to properly
implement the workflow of their project. Given how git is constructed
(it is indeed a bunch of scripts in different programming languages),
this might seem natural in this environment.  Frankly, I am not
sure I have grasped the "porcelain" (the user-friendly frontend) commands
enough to tackle the "plumbing" (the internals).

Aside from people learning how to use the VCS properly (I feel I'm
re-learning git every time I use it, but that's my limitation), you
need to make sure everyone's on the same page and has the same set
of scripts (Python, you say) installed. I think this can be really
difficult.
//Marcin

[1] http://jeetworks.com/node/58
[2] http://whygitisbetterthanx.com/#easy-to-learn


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Focus on sister projects

2011-04-02 Thread Marcin Cieslak
>> MZMcBride  wrote:
> Ryan Kaldari wrote:
>> Yeah, the local CSS/JS cruft is definitely a problem. I've tried doing
>> clean-up on a few wikis, but I usually just get chewed out by the local
>> admins for not discussing every change in detail (which obviously
>> doesn't scale for fixing 200+ wikis). I would love to hear ideas for how
>> to address this problem.
>
> This caught my eye as Wikimedia has far more than 200 wikis. There seems to
> be a shift happening within the Wikimedia Foundation. The sister projects
> have routinely been ignored in the past, but things seem to be going further
> lately

The good thing about forgotten/abandoned/unloved/etc. projects is that
they probably don't have lots of cruft accumulated in the global CSS/JS
files (as that requires a quite lively, tech-savvy community to maintain).

So those sites will probably not require any changes and will survive
the HTML5 migration without any problems. 

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Focus on sister projects

2011-04-02 Thread Marcin Cieslak
>> Billinghurst  wrote:
> Ryan,
>
> While admins will always be protective of their patch, especially if 
> something breaks 
> universally, none of us wishes to impede progress and we want to know how we 
> can help. 
> -> Make us do our homework
> -> Give us time to marshal resources, and 
> -> Have expectations that we should be organised to help.
> If we cannot do that, then it is somewhat upon our heads if you have to do 
> what you have 
> to do.

Frankly, from my experience as a sysop at plwiki (not a sister project,
though, but I did some of the css/js work for the pl* sisters), I'd rather
recommend explaining a bit (like "say who you are"), but going ahead with
the changes. Maybe a single page on Meta will do. The problem will be with
non-English projects, as some people may not read English at all - like
some of the quite MediaWiki-savvy admins in the projects for smaller
languages of the former USSR. 

Giving time and having much discussion serves little point, since from my 
experience those volunteers who spent lots of time building those scripts
now have little time to re-write or review them (and a lot of the stuff
simply needs to be deleted).  Some may react badly, like just reverting
changes because "it works".  Well, it works in a way, and very often some
strange effects appear (mainly because of the changed loading order of
js/css).

I am all for opening up extensions for sister projects - [[Extension:Proofread]]
improved the situation at many Wikisources. I would envision having
more sister-project-related extensions in the SVN, even if they 
are CSS- or JS-only (or mostly). This gives core developers a chance
to better understand the impact they make and gives them the
opportunity to fix problems themselves in sweeping cross-repository
commits with proper commit logs. 

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] MediaWiki core tests failing? why?

2015-10-24 Thread Marcin Cieslak
Hello,

I have tried to run our current MediaWiki tests
using PHP 5.4 on a Gentoo machine against +/- git master:

+ git -C . log --oneline -1
5c63cce Merge "ApiQueryAllRevisions: Actually use 'start' and 'end'"
+ git -C vendor log --oneline -1
5efd7d7 Update OOjs UI to v0.12.12
+ git -C skins/Vector log --oneline -1
9f5f333 Localisation updates from https://translatewiki.net.


I have posted my php.ini to https://phabricator.wikimedia.org/P2226.
I am running PHPUnit 4.3.1; the 3.x series crashes
at some point.

I am getting a few interesting failures:

- floating point format problems
- various XMP XML metadata issues

and a few others.

This looks to me like an environment problem
(some library being too old, etc.).

Any hints as to where those come from?
If those are indeed env problems, I'd like to
try to add some checks to the installers.

~saper


env LC_MESSAGES=C LANG=C LC_TIME=C  php5.4 -c ${HOME}/php54.ini 
tests/phpunit/phpunit.php --configuration tests/phpunit/suite.xml 
--exclude-group Broken,Stub,Dump,ParserFuzz --log-junit 
"${HOME}/tests/log/postgres-log.xml"
Script started on Sun Oct 25 00:47:01 2015
#!/usr/bin/env php
Using PHPUnit from /usr/share/php/phpunit/phpunit.phar
PHPUnit 4.3.1 by Sebastian Bergmann.

Configuration read from 
/usr/home/saper/public_html/ybabel/tests/phpunit/suite.xml

(...)

Time: 12.45 minutes, Memory: 1301.75Mb

There were 35 failures:

1) ApiFormatPhpTest::testGeneralEncoding with data set #7 (array(1.0E+42), 
'a:1:{i:0;d:1.0E+42;}', array(1))
Failed asserting that two strings are identical.
--- Expected
+++ Actual
@@ @@
-a:1:{i:0;d:1.0E+42;}
+a:1:{i:0;d:144885712678075916785549312;}

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/api/format/ApiFormatTestBase.php:61
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

2) ApiFormatPhpTest::testGeneralEncoding with data set #30 (array(1.0E+42), 
'a:1:{i:0;d:1.0E+42;}', array(2))
Failed asserting that two strings are identical.
--- Expected
+++ Actual
@@ @@
-a:1:{i:0;d:1.0E+42;}
+a:1:{i:0;d:144885712678075916785549312;}

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/api/format/ApiFormatTestBase.php:61
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

3) JavaScriptContentTest::testUpdateRedirect with data set #1 ('/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=MediaWiki:MonoBook.js\\u0026action=raw\\u0026ctype=text/javascript");',
 '/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=TestUpdateRedirect_target\\u0026action=raw\\u0026ctype=text/javascript");')
Failed asserting that two strings are equal.
--- Expected
+++ Actual
@@ @@
-'/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=TestUpdateRedirect_target\u0026action=raw\u0026ctype=text/javascript");'
+'/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=MediaWiki:MonoBook.js\u0026action=raw\u0026ctype=text/javascript");'

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/content/JavaScriptContentTest.php:268
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

4) JavaScriptContentTest::testGetRedirectTarget with data set #0 
('MediaWiki:MonoBook.js', '/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=MediaWiki:MonoBook.js\\u0026action=raw\\u0026ctype=text/javascript");')
Failed asserting that null matches expected 'MediaWiki:MonoBook.js'.

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/content/JavaScriptContentTest.php:324
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

5) JavaScriptContentTest::testGetRedirectTarget with data set #1 
('User:FooBar/common.js', '/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=User:FooBar/common.js\\u0026action=raw\\u0026ctype=text/javascript");')
Failed asserting that null matches expected 'User:FooBar/common.js'.

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/content/JavaScriptContentTest.php:324
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

6) JavaScriptContentTest::testGetRedirectTarget with data set #2 
('Gadget:FooBaz.js', '/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=Gadget:FooBaz.js\\u0026action=raw\\u0026ctype=text/javascript");')
Failed asserting that null matches expected 'Gadget:FooBaz.js'.

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/content/JavaScriptContentTest.php:324
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

7) BitmapMetadataHandlerTest::testMultilingualCascade
'right(iptc)' does not match expected type "array".

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/media/BitmapMetadataHandlerTest.php:43
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

8) BitmapMetadataHandlerTest::testPNGXMP
Failed asserting that two arrays are equal.
--- Expected
+++ Actual
@@ @@
 Array (
 'frameCount' => 0
 'loopCount' => 1
-'duration'
