Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-05 Thread Brian Wolff
On 2013-03-05 9:17 PM, Antoine Musso hashar+...@free.fr wrote:

 On 05/03/13 14:28, MZMcBride wrote:
  A number of former and current contributors (notably Lee Daniel Crocker)
  have released their creative works and inventions into the public
domain:
  https://en.wikipedia.org/wiki/User:Lee_Daniel_Crocker.

 Does that include his work on the OCaml tool that generates the math
 rendering? I am wondering if the rendering result would end up being PD
 too.

 --
 Antoine hashar Musso



The OCaml tool does security verification, from what I understand; the
actual rendering is done by TeX (I think). Also, I didn't think the license
of a tool extended to its output. I can make non-GPL images in GIMP, etc.

-bawolff

Re: [Wikitech-l] Linking from Wikipedia articles to local library resources

2013-03-05 Thread Brian Wolff
On 2013-03-05 9:20 PM, Sumana Harihareswara suma...@wikimedia.org wrote:

 See http://lists.wikimedia.org/pipermail/glam/2013-March/000361.html 
 http://everybodyslibraries.com/2013/03/04/from-wikipedia-to-our-libraries/
 : how do we get people from Wikipedia articles to the related offerings
 of our local libraries?

 http://en.wikipedia.org/wiki/Template:Library_resources_box creates a
 sidebar box with links to resources about (or by) the topic of a
 Wikipedia article in a reader’s library, or in another library a reader
 might want to consult.

 And more!

 As with most things related to Wikipedia, this service is experimental,
 and subject to change (and, hopefully, improvement) over time.  I’d love
 to hear thoughts and suggestions from users and maintainers of Wikipedia
 and libraries.

 John, since you said you're new to template-building, you might enjoy
 learning about what the new Lua templating system gives you:
 https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual

 --
 Sumana Harihareswara
 Engineering Community Manager
 Wikimedia Foundation


Sounds like the use case of the Special:BookSources page...

(off-topic rant)

The problem with libraries' electronic resources (or at least my
libraries') is not that people are too Google-addicted to consider them.
The problem is that they are a usability nightmare. In one case I recall I
was not able to download more than 10 pages at a time or navigate
effectively because the interface was a horrid mess. People go where they
can get what they need in the easiest fashion. Libraries are not even close
to providing that for electronic resources. On the other hand, I do love my
dead-tree books, and libraries are still king there.

-bawolff

Re: [Wikitech-l] Problem with CentralAuth in MobileFrontend

2013-03-05 Thread Brian Wolff
From what I *understand*, you don't have an account on the local wiki until
you visit it. Could it be that whatever API methods the app uses are not
triggering this auto-account-creation process properly, the way a normal
web-interface edit would?

-bawolff
On 2013-03-05 11:17 PM, Jon Robson jdlrob...@gmail.com wrote:

 So an update. I'm pretty sure I've worked this out. CentralAuth will only
 work if the user has previously visited the wiki project the login attempt
 is made for. Many browsers these days refuse cookies for sites the user has
 not visited. I'm still investigating, but I'm pretty sure an image pointing
 to a URL counts as a previous visit.
 On 28 Feb 2013 13:07, Juliusz Gonera jgon...@wikimedia.org wrote:

  On 02/27/2013 05:13 PM, Paul Selitskas wrote:
 
  Do you use the same protocol in Wikipedia and other projects? When I
  first log in via HTTPS and then somehow get to HTTP, I need to log in.
 
 
  We use the same protocol. We enforce HTTPS after login, and later use
  protocol agnostic URLs.
 
  --
  Juliusz
 

Re: [Wikitech-l] Some Sort of Notice for Breaking Changes

2013-03-09 Thread Brian Wolff
On 3/8/13, Tyler Romeo tylerro...@gmail.com wrote:
 Is there any way that extension developers can get some sort of notice for
 breaking changes, e.g., https://gerrit.wikimedia.org/r/50138? Luckily my
 extension's JobQueue implementation hasn't been merged yet, but if it had I
 would have no idea that it had been broken by the core.
 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

What about little things? I happened to stumble across
https://gerrit.wikimedia.org/r/#/c/26828/ today, which seems like a
rather unnecessary back-compat break.

--bawolff


Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-03-09 Thread Brian Wolff
On 3/8/13, Waldir Pimenta wal...@email.com wrote:
 On Wed, Feb 27, 2013 at 9:13 PM, Daniel Friesen
 dan...@nadir-seen-fire.comwrote:


 index.php, api.php, etc... provide entrypoints into the configured wiki.

 mw-config/ installs and upgrades the wiki. With much of itself
 disconnected from core code that requires a configured wiki. And after
 installation it can even be eliminated completely without issue.


 I think this clarifies the issue for me. Correct me if I'm wrong, but
 basically the entry points are for continued, repeated use, for indeed
 *accessing* wiki resources (hence I suggest normalizing the name of these
 scripts to "access points" everywhere in the docs, because "entry" is a
 little more generic), while mw-config/index.php is a one-off script that
 has no use once the wiki installation is done. I'll update the docs on
 mw.org accordingly, to make this clear.


 I wouldn't even include mw-config in entrypoint modifications that would
 be applied to other entrypoint code.


 You mean like this one https://gerrit.wikimedia.org/r/#/c/49208/? I can
 understand, in the sense that it gives people the wrong idea regarding its
 relationship with the other access points, but if the documentation is
 clear, I see no reason not to have mw-config/index.php benefit from changes
 when the touched code is the part common to all *entry* points (in the
 strict meaning of files that can be used to enter the wiki from a web
 browser).

 That said, and considering what Platonides mentioned:

 It was originally named "config". It came from the link that sent you
 there: "You need to configure your wiki first". Then someone had
 problems with another program that was installed sitewide on their host
 appropriating the /config/ folder, so it was renamed to mw-config.


 ...I would suggest renaming the mw-config directory to something that
 more clearly identifies its purpose. I'm thinking "first-run" or something
 to that effect. I'll submit a patchset proposing this.

 --Waldir

The installer is also used to do database updates when upgrading
MediaWiki, so it's not just a run-once-and-only-once thing.

--bawolff


Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-09 Thread Brian Wolff
On 2013-03-08 2:20 PM, Bartosz Dziewoński matma@gmail.com wrote:

 On Fri, 08 Mar 2013 17:07:18 +0100, Antoine Musso hashar+...@free.fr
wrote:

 I guess the whole idea of using GitHub is for public relations and to
 attract new people. Then, if a developer is not willing to learn
 Gerrit, their code is probably not worth the effort of us integrating
 GitHub/Gerrit. That will just add some more poor quality code to your
 review queues.


 This, a hundred times. I manage a few (small) open-source projects on
 GitHub, and most of the patches I get are not even up to my standards (and
 those are significantly lower than WMF's).

 Submitting a patch to Gerrit, and even fixing it after code review, is not
 that hard. (Of course, more complicated operations like rebasing do suck,
 but you hopefully won't be doing that with your first patch.)

 --
 Matma Rex



Making it easier to contribute is always going to cause more lower-quality
content to be submitted, since the unmotivated aren't weeded out. But there
are plenty of good people who would also get weeded out. I think this
debate has a lot in common with the perennial debates on Wikipedia about
further restricting anons and non-autoconfirmed users.

-bawolff

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-09 Thread Brian Wolff
On 2013-03-09 5:04 PM, Jon Robson jdlrob...@gmail.com wrote:

  So, why am I not trying to learn Gerrit or try to submit patches?
 Because it's not worth my time. The interface is so far outside of what
 I'm used to, and it's just so touchy. By comparison, GitHub has a solid,
 no-frills Mac app that handles all of the important stuff. And, even when
 I committed to GitHub by command line, there was no way I could "Merge
 branch 'master' of ssh://gerrit.wikimedia.org:29418/mediawiki/core" by
 mistyping a rebase (https://gerrit.wikimedia.org/r/#/c/37684/).

 Thank you for sharing this view. This was my fear and it is useful to
 get this view.

 To me, I would be happy having more contributions regardless of
 quality. A contribution in itself is wonderful, as it shows an interest
 in the work that is being done and a will to help with that work. We
 should be striving to mentor any developer who contributes poor
 quality code, not see this as a negative thing. To me, this is what is
 so beautiful about open source development - we get the opportunity to
 create awesome things and create awesome developers.


In theory you are right - more folks = more awesomeness. In practice this
involves a lot of effort, effort that people often are not willing to put
in. Just look at our rather poor history with Bugzilla patches (although
things have improved).

Notwithstanding that, I still think we should reduce as many barriers as
possible. Even if the ideal-world mentoring is not there, at least more
openness makes it more likely someone will figure stuff out themselves.

-bawolff

Re: [Wikitech-l] Live recent changes feed

2013-03-10 Thread Brian Wolff
On 2013-03-10 1:20 AM, Victor Vasiliev vasi...@gmail.com wrote:


 Hi everybody,

 For a long time it has been acknowledged that our current way of serving
 the recent changes feed to users (IRC with formatting using funny control
 codes) is one of the worst-suited for this purpose. It made life
 miserable both for users who had to parse it (since nobody is actually
 reading it from IRC) and for developers who had to fit that thing into
 the IRC line length limit. Time passed, and many ways were suggested to
 fix this (including https://meta.wikimedia.org/wiki/Recentchanges_via_XMPP
 and
 https://www.mediawiki.org/wiki/Requests_for_comment/Structured_data_push_notification_support_for_recent_changes),
 but nobody actually went ahead and made it work.

 After the recent discussion on this list, when I realized that this has
 been in discussion for as long as four years, I went "WTF" and decided to
 Just Go Ahead and Fix It. As a result, I made a patch to MediaWiki which
 allows it to output the recent changes feed in JSON:
 https://gerrit.wikimedia.org/r/#/c/52922/

 Also, I wrote a daemon which captures this feed and serves it through
 WebSockets, and a simple text-oriented protocol which serves the same JSON
 without the WebSocket wrapping (for poor souls writing in languages
 without proper WebSocket support):
 https://github.com/wikimedia/mediawiki-rcsub

 This daemon is written in Python using Twisted and Autobahn and it takes
 ~200 lines of code (initial version took ~80).

 As a bonus, this involves no XML streaming in any form (unlike XMPP or
 PubSubHubbub), so the unicorns are happy and unharmed, and minds of
 programmers implementing this will remain unfried.

 I hope that now getting recent changes via a reasonable format is a matter
 of code review and deployment, and we will finally get something
 reasonable to work with (with access from web browsers!).

 -- Victor.


Good work. It's wonderful to see people just go and fix things that need
fixing instead of the usual bikeshedding that often takes place.

-bawolff

P.S. I too used to read the IRC RC feed (back when I was an active editor
on English Wikinews). It can be useful on smaller projects.

Re: [Wikitech-l] GSoC 2013 guidelines??

2013-03-10 Thread Brian Wolff
On 2013-03-10 2:55 PM, Nischay Nahata nischay...@gmail.com wrote:

 On Sun, Mar 10, 2013 at 5:27 PM, Tejas Nikumbh tejasniku...@gmail.com
wrote:

  I created a bugzilla account but the process of going through the code
and
  fixing bugs seems cryptic. Any resources you can provide which can aid
me
  in understanding the process? A video which shows the process or some
  documentation perhaps?
 
 
 Have you already gone through the process of how to be a hacker?
 http://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker

 Not a hacker who breaks things of course :)



Hey, if you figure out how to break things, that's useful too, provided you
don't actually break things but tell us how they could be broken.

-bawolff

Re: [Wikitech-l] Live recent changes feed

2013-03-10 Thread Brian Wolff

 If "whatever the JSON encoder we use does" means that one day the
 daemon starts sending UTF-8 encoded characters, it is quite possible
 that existing clients will break because of previously unnoticed
 encoding bugs. So I would like to see some formal documentation of the
 protocol.

The JSON standard is pretty clear that any character can be escaped using a
\u UTF-16 code point, or you can just have things be UTF-8. If clients
break because they can't handle that, that is the client's fault. It's not
a hard requirement.
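
To illustrate (a minimal PHP sketch, not taken from the daemon itself; both
outputs are equally valid JSON for the same string):

    // Escaped by default; raw UTF-8 with a flag (PHP >= 5.4).
    echo json_encode( "naïve" );                         // "na\u00efve"
    echo json_encode( "naïve", JSON_UNESCAPED_UNICODE ); // "naïve"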

I see no reason why we couldn't change later if need be. Furthermore, I see
no reason why we would care which way we went on that issue. The raw JSON
isn't meant for human eyes.

-bawolff

Re: [Wikitech-l] Live recent changes feed

2013-03-10 Thread Brian Wolff
On 2013-03-11 12:26 AM, Tyler Romeo tylerro...@gmail.com wrote:

 On Sun, Mar 10, 2013 at 10:38 PM, Victor Vasiliev vasi...@gmail.com
wrote:

   Finally, other than WebSocket and the socket interface, the one
   other subscription method we should have is some sort of HTTP hook
   call, i.e., it sends an HTTP request to the subscriber. This allows
   event-driven clients without having a socket constantly open.
 
  I am not sure what exactly you mean by that.


 When a message is sent, it is delivered by the daemon submitting an HTTP
 POST request to a registered client URI. This is a commonly used scheme
 for push notification delivery, such as when using Amazon's notification
 service.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

Wait, so it just sends HTTP POST requests to some address until explicitly
told to stop? That sounds like an incredibly bad idea (if I understand it
correctly):

* If you forget to unsubscribe, we send you POST requests until the end of
eternity.
* DoS vector: register the URL of someone you don't like. Register 100
variants from the same domain. Push the English Wikipedia's RC feed there.

In any case, I don't see the need to have every form of push API imaginable
implemented. Especially not initially, but even in general.

-bawolff

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Brian Wolff
On 2013-03-11 1:11 AM, Tyler Romeo tylerro...@gmail.com wrote:

 * DoS vector: register the URL of someone you don't like. Register 100
 variants from the same domain. Push the English Wikipedia's RC feed there.


 Or you roll out some EC2 instances and open 100 sockets. (And before
 you say rate-limit based on IP address, the same can be done for the HTTP
 idea.)


I mean you could use such a service to DoS somebody else. If you can open
sockets, then it's your own server.

Sure, you could add some mechanism to prove you own the domain where you
want the RC updates to be sent, but things can get rather complex.

--bawolff

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Brian Wolff
On 2013-03-11 3:46 PM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Hey,

 Sure, you could add some mechanism to prove you own the domain where you
 want the RC updates to be sent, but things can get rather complex.
 

 Google uses, or at least used to use, the following to do exactly that:

 On request, provide an auth file to the user which includes some unique
 identifier. Require this file to be made available via the domain in
 question. Have the user point to the location where it is made available
 and check if it is actually there. If so, the domain is authenticated.

 That seems rather simple to create.

 Cheers

 --
 Jeroen De Dauw
 http://www.bn2vs.com
 Don't panic. Don't be evil.
 --

I think that proves my point - what you describe is not what Google does.
Google tells the user the path for the file (I believe the usual place is
the root of the domain). The user does not pick the path. Otherwise I could
prove I own Wikipedia (assuming MIME types weren't checked) by using
action=raw.
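
For what it's worth, a minimal sketch of root-anchored verification (the
path scheme is illustrative, not Google's actual one; MWCryptRand and
file_get_contents() are real):

    // The verifier, not the user, picks the path, anchored at the root.
    $token = MWCryptRand::generateHex( 32 );
    // ...hand $token to the domain owner, then later:
    $body = file_get_contents( "http://$domain/rc-verify-$token.txt" );
    $verified = ( $body !== false && trim( $body ) === $token );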

Things that finicky to make secure should be avoided, imo.

-bawolff

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Brian Wolff
On 2013-03-11 4:32 PM, Tyler Romeo tylerro...@gmail.com wrote:

 Honestly, the solution could be as simple as requiring that the HTTP
 response have a certain header or something.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

OK, I withdraw my security-related objections :). Some sort of header-based
checking to make sure the POSTs are wanted sounds sane (provided that, at
the very start, a GET request is used to verify this; POST requests to
arbitrary unverified URLs can be dangerous).
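
Something like this minimal sketch, perhaps (get_headers() is standard PHP;
the X-Accept-RC-Push header name is made up for illustration):

    // Probe the callback with a GET before ever POSTing to it; only
    // subscribe the URL if it explicitly opts in via a response header.
    $headers = get_headers( $callbackUrl, 1 );
    $wanted = isset( $headers['X-Accept-RC-Push'] ) &&
        $headers['X-Accept-RC-Push'] === 'yes';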

-bawolff

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Brian Wolff
To ask the obvious question: is the key you have scp configured to use the
same as the one in your Gerrit prefs?

-bawolff

On 2013-03-11 8:05 PM, Paul Selitskas p.selits...@gmail.com wrote:

 Git review, of course. The log is here: http://pastebin.com/iC4N1am0


 On Tue, Mar 12, 2013 at 1:46 AM, Matthew Flaschen
 mflasc...@wikimedia.orgwrote:

  On 03/11/2013 08:38 AM, Paul Selitskas wrote:
   Can you add Belarusian projects as well?
  
   'bewiki' => 'uca-be',
   'bewikisource' => 'uca-be',
   'be_x_oldwiki' => 'uca-be',
  
   I was denied while sending a patch for review.
 
  Please file a bug if you haven't already.
 
  How did you attempt to do a patch?  The recommended way is now Gerrit
  (https://www.mediawiki.org/wiki/Gerrit/Getting_started).
 
  Matt Flaschen
 
 



 --
 З павагай,
 Павел Селіцкас/Pavel Selitskas
 Wizardist @ Wikimedia projects

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Brian Wolff
No worries. The obvious things are often the hardest to spot :-)

-bawolff
On 2013-03-11 8:24 PM, Paul Selitskas p.selits...@gmail.com wrote:

 Yes, thanks for that question. I didn't check but this is obviously it.
 Sorry for spamming :)


 On Tue, Mar 12, 2013 at 2:15 AM, Brian Wolff bawo...@gmail.com wrote:

  To ask the obvious question, is the key you have scp configured to use
the
  same as the one in your gerrit prefs?
 
  -bawolff
 
  On 2013-03-11 8:05 PM, Paul Selitskas p.selits...@gmail.com wrote:
  
   Git review, of course. The log is here: http://pastebin.com/iC4N1am0
  
  
   On Tue, Mar 12, 2013 at 1:46 AM, Matthew Flaschen
   mflasc...@wikimedia.orgwrote:
  
On 03/11/2013 08:38 AM, Paul Selitskas wrote:
 Can you add Belarusian projects as well?

 'bewiki' => 'uca-be',
 'bewikisource' => 'uca-be',
 'be_x_oldwiki' => 'uca-be',

 I was denied while sending a patch for review.
   
Please file a bug if you haven't already.
   
How did you attempt to do a patch?  The recommended way is now
Gerrit
(https://www.mediawiki.org/wiki/Gerrit/Getting_started).
   
Matt Flaschen
   
   
  
  
  
   --
   З павагай,
   Павел Селіцкас/Pavel Selitskas
   Wizardist @ Wikimedia projects
 



 --
 З павагай,
 Павел Селіцкас/Pavel Selitskas
 Wizardist @ Wikimedia projects

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-03-12 Thread Brian Wolff
On 2013-03-12 6:31 AM, Daniel Friesen dan...@nadir-seen-fire.com wrote:

 On Sat, 09 Mar 2013 08:15:05 -0800, Platonides platoni...@gmail.com
wrote:

 On 09/03/13 15:47, Waldir Pimenta wrote:

 So mw-config can't be deleted after all? Or do you mean the installer at
 includes/installer?
 If you mean the former, then how about "run-installer" instead of my
 previous proposal of "first-run"?
 Either of these would be clearer than "mw-config", imo.

 --Waldir


 You can delete it, but then you can't use it to upgrade the wiki (or
 rather, you would need to copy it again from the new tree).


 mw-config only updates the database to the currently installed version of
 MW. So it's fine to delete mw-config, because you won't need that
 mw-config anymore. When you upgrade, you'll need the mw-config that comes
 with the code for the new version of MediaWiki you're installing.


 --
 ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



There are cases where you still want mw-config even without upgrading. For
example, changing $wgCategoryCollation or installing an extension with
schema changes.

-bawolff

Re: [Wikitech-l] Detect running from maintenance script in a parser function extension

2013-03-12 Thread Brian Wolff
On 2013-03-12 3:19 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Tue, Mar 12, 2013 at 1:47 PM, Toni Hermoso Pulido toni...@cau.cat
wrote:

  Hello,
 
  I'm checking whether I can detect that a process is run from a
  maintenance script in a parser function extension.
 
  What would be the best / most recommendable way to detect it?
 
  Thanks!
 

 $wgCommandLineMode should be able to tell you, although I think checking
if
 the RUN_MAINTENANCE_IF_MAIN constant is set is probably a better method.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

The more interesting question: why do you need to know?

Making wikitext vary between maintenance scripts and normal requests may
cause a bit of breakage, given the job queue etc.
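
For what it's worth, a minimal sketch of the detection itself (both names
are real: maintenance scripts define the RUN_MAINTENANCE_IF_MAIN constant
in their preamble, and $wgCommandLineMode is set for CLI entry points; the
helper function name is made up):

    function wfIsMaintenanceScript() {
        global $wgCommandLineMode;
        return defined( 'RUN_MAINTENANCE_IF_MAIN' ) ||
            !empty( $wgCommandLineMode );
    }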

-bawolff

Re: [Wikitech-l] Detect running from maintenance script in a parser function extension

2013-03-12 Thread Brian Wolff
On 3/12/13, Toni Hermoso Pulido toni...@cau.cat wrote:
 On 12/03/13 21:08, Brian Wolff wrote:
 On 2013-03-12 3:19 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Tue, Mar 12, 2013 at 1:47 PM, Toni Hermoso Pulido toni...@cau.cat
 wrote:

 Hello,

 I'm checking whether I can detect that a process is run from a
 maintenance script in a parser function extension.

 Which would be the best way / more recommendable to detect it?

 Thanks!


 $wgCommandLineMode should be able to tell you, although I think checking
 if
 the RUN_MAINTENANCE_IF_MAIN constant is set is probably a better method.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

 More interesting question - why do you need to know.

 Making wikitext vary between maintenance script and normal may cause a bit
 of breakage given jobQueue etc.

 Hello,

 maybe it's a bit weird and a little unorthodox…
 In any case, it's for batch processing (with WikiPage::doEdit) of some
 wiki pages that have a UserFunctions parser function
 (http://www.mediawiki.org/wiki/Extension:UserFunctions) in their wikitext,
 so that such a parser function is ignored when building the page.

 Cheers,
 --
 Toni Hermoso Pulido
 http://www.cau.cat


OK, that's probably safe, since that extension disables caching.
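
For context, a sketch of how a parser-function extension of this era does
that (Parser::disableCache() is a real method; the function name is
illustrative):

    function wfMyParserFunction( $parser /* , ...args */ ) {
        // Mark the page as uncacheable so output can vary per request.
        $parser->disableCache();
        return 'dynamic output';
    }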

--bawolff


Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Brian Wolff
On 3/12/13, Lars Aronsson l...@aronsson.se wrote:
 In Wiktionary, it's very convenient that some words
 have sound illustrations, e.g.
 http://en.wiktionary.org/wiki/go%C3%BBter

 These audio bites are simple 2-3 second OGG files, e.g.
 http://commons.wikimedia.org/wiki/File:Fr-go%C3%BBter.ogg

 but they are limited in number. It would be very
 easy to record more of them, but before you get
 started it takes some time to learn the details,
 and then you need to upload to Commons and specify
 a license, and provide a description, ... It's not
 very likely that the person who does all that is
 also a good voice in each desired language.

 Here's a better plan:

 Provide a tool on the toolserver, or any other
 server, having a simple link syntax that specifies
 the language code and the text, e.g.
 http://toolserver.org/mytool.php?lang=fr&text=gouter

 The tool uses a cookie, that remembers that this
 user has agreed to submit contributions using cc0.
 At the first visit, this question is asked as a
 click-through license.

 The user is now prompted with the text (from the URL)
 and recording starts when pressing a button. The
 user says the word, and presses the button again.
 The tool saves the OGG sound, uploads it to Commons
 with the filename fr-gouter-XYZ789.ogg and
 the cc0 declaration and all metadata, placing it
 in a category of recorded but unverified words.

 Another user can record the same word, and it will
 be given another random letter-digit code.

 As a separate part of the tool, other volunteers are
 asked to verify or rate (1 to 5 stars) the recordings
 available in a given language. The rating is stored
 as categories on commons.

 Now, a separate procedure (manual or a bot job) can
 pick words that need new or improved recordings,
 and list them (with links to the tool) on a normal
 wiki page.

 I know HTML supports uploading of a file, but I don't
 know how to solve the recording of sound directly to
 a web service. Perhaps this could be a Skype application?
 I have no idea. Please just be creative. It should be
 solvable, because this is 2013 and not 2003.


 --
Lars Aronsson (l...@aronsson.se)
Aronsson Datateknik - http://aronsson.se




It was solvable with a Java applet (or Flash, but that's usually
considered evil) back in 2003. However, it still requires someone to
actually do it.

With modern web browsers, you can do it with HTML5/WebRTC [1].

Someone could probably make an extension that integrates with
MediaWiki, so all a user has to do is go to Special:RecordAudio and they
could record/upload from there. Perhaps that would make a good GSoC
project (not sure if the scope is big enough, but one could probably add
stuff like making a slick UI to make it big enough).

[1] http://www.html5rocks.com/en/tutorials/getusermedia/intro/


Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Brian Wolff
On 3/12/13, Tyler Romeo tylerro...@gmail.com wrote:
 On Tue, Mar 12, 2013 at 9:29 PM, Brian Wolff bawo...@gmail.com wrote:

 It was solvable with a java applet (or flash, but that's usually
 considered evil) back in 2003. However it still requires someone to
 actually do it.


 For security purposes, I'm really hoping we don't plan on using a Java
 applet. :P

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

Why? There's nothing inherently insecure about Java applets. We
already use them to play Ogg files on lame browsers that don't support
HTML5.

--bawolff


Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Brian Wolff
On 3/12/13, Tyler Romeo tylerro...@gmail.com wrote:
 On Mar 12, 2013 10:08 PM, Brian Wolff bawo...@gmail.com wrote:

 On 3/12/13, Tyler Romeo tylerro...@gmail.com wrote:
  On Tue, Mar 12, 2013 at 9:29 PM, Brian Wolff bawo...@gmail.com wrote:
 
  It was solvable with a java applet (or flash, but that's usually
  considered evil) back in 2003. However it still requires someone to
  actually do it.
 
 
  For security purposes, I'm really hoping we don't plan on using a Java
  applet. :P
 
  *--*
  *Tyler Romeo*
  Stevens Institute of Technology, Class of 2015
  Major in Computer Science
  www.whizkidztech.com | tylerro...@gmail.com

 Why? There's nothing inherently insecure about java applets. We
 already use them to play ogg files on lame browsers that don't support
 html5.

 --bawolff


 Can you say that for sure? With the number of exploits in Java over the
 past few months, everybody I know has already disabled their browser plugin.

 --Tyler Romeo

Those types of people will probably have an HTML5-capable web browser :P

Let me rephrase my previous statement: using Java as a fallback
doesn't introduce any new issues that wouldn't already be there if we
didn't use Java as a fallback (since we'd only fall back to Java if
the user already had it installed). Furthermore, I imagine (or at least
hope) that Oracle fixes the security vulnerabilities of their plugin
as they are discovered.

-bawolff


Re: [Wikitech-l] A new feedback extension - review urgently needed

2013-03-13 Thread Brian Wolff
On 2013-03-12 8:38 PM, Lukas Benedix bene...@zedat.fu-berlin.de wrote:


 Do you have any advice what I can do?


Don't take this the wrong way, but you should perhaps start considering a
plan B. Community-contributed extensions often take months before getting
deployed. While that's not always the case, it is the likely case.

You should also probably work on getting approval from the Wikidata
community. It's unlikely the extension will be deployed unless there is
agreement at Wikidata that the extension is wanted. (From what I've seen,
you left a comment on the village pump to which no one responded. That is
not usually sufficient; usually you have to get a bunch of people to
actively support you.)

Best of luck,
--bawolff

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Brian Wolff
On 2013-03-14 11:20 PM, Tyler Romeo tylerro...@gmail.com wrote:

 Can we please be real here? The reason more contributors come in through
 GitHub than through Gerrit is because they *already have a GitHub
account*.
 My browser is always logged into GitHub, and it's at the point where I can
 casually just fork a project and begin working on it, whereas with Gerrit
 you need to request an account.

 Like I said before, if you know how to use Git, you know how to use Gerrit
 (and the contra-positive is true as well). The primary thing holding
people
 back is that it's confusing and not user friendly enough to make an
account
 and get working. Imagine if people could sign into Gerrit using their
 Google accounts like Phabricator allows. I can guarantee participation
 would skyrocket.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

I can state from personal experience that creating an account was not the
hard part (particularly because I had my account created for me ;) and
there definitely was a hard part in learning Gerrit. I have no idea how
easy or hard it is to do things on GitHub, as I don't have an account
there, but at the very least they probably have more usability engineers
than Gerrit has.

SVN had a much harder account-creation procedure, but I personally felt
the learning curve was much lower (or maybe I was just more familiar with
the ideas involved).

Anyhow, the point of this ramble: Gerrit is difficult for newbies (or at
least it was when I was one; many others have said similar things). While
we certainly want to keep Gerrit, it's important to recognize this and
mitigate the difficulties where it is reasonable to do so.

-bawolff

Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-20 Thread Brian Wolff
Some of these probably aren't really that suitable:

 After this filtering, we seem to be left with:


 * An easy way to share wiki content on social media services

I don't think this is a good idea unless the political climate on Wikipedia
has changed (we want the GSoCer to stay; flame wars are probably not
conducive to that).

 * Write an extension to support XML Sitemaps without using command line

Meh, maybe. Rather low impact: very few people fall into the use case, and
it's debatable how much a sitemap would even help with SEO. It's also not
the easiest thing to make MediaWiki work on a schedule, especially if the
task needs to be accomplished a little at a time.

 * Extension:OEmbedProvider

I would be concerned that the student is not likely to see their work
used (deployed) after a single summer of work unless it was mentored by
someone who has the ability to deploy the extension themselves. That's
obviously not a requirement, and someone could do a lot of good work
in the direction of making it useful, but it's something to keep in mind.

 * Allow smoother and easier Wikimedia Commons pictures discovery

This is more of a "come up with cool ideas" task (which is needed) rather
than an implementation project. Is that OK for a GSoC project?

-bawolff

Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Brian Wolff
On 2013-03-21 3:08 PM, Tyler Romeo tylerro...@gmail.com wrote:

 Well, as part of the community and a volunteer, I can safely say that I
 don't think I (or anybody else) needs notification before bug fixes. :P

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

That depends on the bug. Some fixes do cause disruption. To pick a random
clear-cut example from a while ago: consider the addition of the token to
the login API action. It was very important that it got fixed, but it did
cause disruption.

-bawolff

Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread Brian Wolff
On 2013-03-22 9:37 AM, Guillaume Paumier gpaum...@wikimedia.org wrote:

 Hi,

 On Thu, Mar 21, 2013 at 12:43 AM, Quim Gil q...@wikimedia.org wrote:
 
  Many of the ideas listed there are too generic ("Write an extension"),
  improvements of existing features ("Improve Extension:CSS")

 This may sound naive, but why are improvements of existing features
 discarded? My thinking was that, if the student didn't have to start
 from scratch, they would have more time to polish their work and make
 it fit with our strict standards, hence making it more likely for
 their work to be merged and deployed.

 (Of course, the existing code needs to be good enough not to require a
 complete rewrite, but that could be decided on a case-by-case basis.)

 --
 Guillaume Paumier


I think improvements to existing features are fine, but they should be
existing features that are used by (or have a high potential of being used
by) the WMF. If it's a feature not used by Wikimedia, it should have an
extremely high impact on third parties to compensate.

-bawolff

Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Brian Wolff
On 2013-03-22 9:20 AM, Tyler Romeo tylerro...@gmail.com wrote:

 On Fri, Mar 22, 2013 at 7:39 AM, Thomas Gries m...@tgries.de wrote:

  I have to change CACHE_ACCEL to CACHE_NONE in my LocalSettings.php,
  and will still enjoy opcode caching by ZendOptimizerPlus,
  but have no memory cache - currently.
 
 
  Is this correct ?
  Can the setup be improved, and how ?
 

 Yes, this is correct. It can be improved by setting up a memcached server
 (it's quick and easy, and in small wikis can even be run on the same
server
 as the web server, though not recommended for larger setups) and then
using
 that as your cache. As an alternative, you can also use CACHE_DB, which
 will use the database for caching, although that doesn't really help much
 since a cache miss usually means a DB query anyway.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

Some people have claimed that CACHE_DB might even slow things down compared
to CACHE_NONE when used as the main cache type (CACHE_DB is still better
than CACHE_NONE for slow caches like the parser cache). Anyhow, you should
do profiling-type things when messing with caching settings (or any
performance settings) to see what is effective and what is not.
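
To make that concrete, a LocalSettings.php sketch of one usual compromise,
assuming no memcached or accelerator cache is available ($wgMainCacheType
and $wgParserCacheType are real settings; the choice itself is the judgment
call discussed above):

    $wgMainCacheType   = CACHE_NONE; // cheap lookups: DB round-trips not worth it
    $wgParserCacheType = CACHE_DB;   // parses are slow enough that the DB helps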

-bawolff

Re: [Wikitech-l] Gerrit/Jenkins Verification

2013-03-22 Thread Brian Wolff
On 2013-03-22 9:24 AM, Tyler Romeo tylerro...@gmail.com wrote:

 I've noticed that sometimes Jenkins +1s changes and other times it +2s
them
 (for Verified, that is). Is there any specific pattern to this? It's not a
 problem or anything; I'm just curious. I feel like it was explained back
 when the system was changed to +1 and +2 but I forget and can't seem to
 find anything in the archives.
 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

I think +1 is a merge and lint check, whereas +2 means tests that actually
execute the code, which only happens if you're on the trusted list of
users. Jenkins posts which tests it's running in a Gerrit comment.
-bawolff

Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Brian Wolff
On 2013-03-22 10:45 AM, Tyler Romeo tylerro...@gmail.com wrote:

 On Fri, Mar 22, 2013 at 9:38 AM, Brian Wolff bawo...@gmail.com wrote:

  Some people have claimed that CACHE_DB might even slow things down
compared
  to CACHE_NONE when used as main cache type (cache db is still better
than
  cache none for slow caches like the parser cache). Anyhow you should do
  profiling type things when messing with caching settings (or any
  performance settings) to see what is effective and what is not.
 
  -bawolff
 

 Wouldn't be surprised. ;) The only problem is that with CACHE_NONE, many
 things (specifically, throttling mechanisms) won't work since the cache
 isn't persistent across requests.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

That would be a MediaWiki bug, though. Does throttling actually work with
CACHE_DB now? I remember it used to only work with the memcached backend.
Anyway, if that's been fixed, throttling should be changed to use
CACHE_ANYTHING so it actually works in all configs.
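
A sketch of what I mean (wfGetCache(), CACHE_ANYTHING, wfMemcKey() and the
BagOStuff add()/incr() methods are real; the throttle key and window are
illustrative):

    // CACHE_ANYTHING falls back to the DB when no better cache exists,
    // so the throttle state survives across requests in every config.
    $cache = wfGetCache( CACHE_ANYTHING );
    $key = wfMemcKey( 'throttle', 'login', $ip );
    if ( !$cache->add( $key, 1, 300 ) ) { // add() fails if the key exists
        $cache->incr( $key );
    }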

-bawolff

Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-22 Thread Brian Wolff
On 2013-03-22 5:22 PM, MZMcBride z...@mzmcbride.com wrote:

 Juliusz Gonera wrote:
 We've been having a hard time making photo uploads work in
 MobileFrontend because of CentralAuth's third party cookies problem (we
 upload them from Wikipedia web site to Commons API). Apart from the
 newest Firefox [1,2], mobile Safari also doesn't accept third party
 cookies unless the domain has been visited and it already has at least
 one cookie set.
 
 Even though we have probably found a solution for now, it's a very shaky
 and inelegant workaround which might stop working at any time (if some
 detail of the default browser cookie policy changes again) [3].
 
 I came up with another idea of how this could be solved. The problem we
 have right now is that Commons is on a completely different domain than
 Wikipedia, so they can't share the login token cookie. However, we could
 set up alternative domains for Commons, such as commons.wikipedia.org,
 and then the cookie could be shared.
 
 The only issue I see with this solution is that we would have to
 prevent messing up SEO (having multiple URLs pointing to the same
 resource). This, however, could be avoided by redirecting every
 non-API request to the main domain (commons.wikimedia.org) and only
 allowing API requests on alternative domains (which is what we use for
 photo uploads on mobile).
 
 This obviously doesn't solve the broader problem of CentralAuth's common
 login being broken, but at least would allow easy communication between
 Commons and other projects. In my opinion this is the biggest problem
 right now. Users can probably live without being automatically logged in
 to other projects, but photo uploads on mobile are just broken when we
 can't use Commons API.
 
 Please let me know what you think. Are there any other possible
 drawbacks of this solution that I missed?
 
 [1] http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/
 [2]
 
https://developer.mozilla.org/en-US/docs/Site_Compatibility_for_Firefox_22
 [3] https://gerrit.wikimedia.org/r/#/c/54813/

 Hi Juliusz,

 Please draft an RFC at https://www.mediawiki.org/wiki/RFC. :-)

 commons.wikipedia.org already redirects to commons.wikimedia.org (for
 historical reasons, maybe), so that has to be considered. I think what
 you're proposing is also kind of confusing and I'm wondering if there
 aren't better ways to approach the problem.

 A good RFC will lay out the underlying components in a Background
 section, the problem you're attempting to solve in a Problem section,
 and then offer possible solutions in a Proposals section. Variants on
 this also usually work.

 MZMcBride




Imo this sounds like a hacky solution. It also doesn't work for wikis that
are not Commons.

That said, I don't have a better solution atm.

-bawolff

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Brian Wolff
 been a way to reliably measure performance...


The measuring part is important. As it stands, I have no way of measuring
code in action (sure, I can set up profiling locally, and actually have,
but it's not the same [otoh I barely ever look at the local profiling I did
set up...]). People throw around words like "graphite", but unless I'm
mistaken, us non-staff folks do not have access to whatever that may be.

-bawolff

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Brian Wolff
On 2013-03-22 6:46 PM, Matthew Walker mwal...@wikimedia.org wrote:

 
  People throw around words like "graphite", but unless I'm mistaken, us
  non-staff folks do not have access to whatever that may be.

 Graphite refers to the cluster performance logger available at:
 http://graphite.wikimedia.org/

 Anyone with a labs account can view it -- which, as a committer, you do
 (it's the same as your Gerrit login).

I've tried. My labs login doesn't work.

More generally, since labs accounts are free to make, what is the point of
password-protecting it?


 otoh i barely ever look at the local profiling i did set up...

 This problem still exists with graphite; you have to look at it for it to
 do any good :)

That's lame ;)

-bawolff

 ~Matt Walker
 Wikimedia Foundation
 Fundraising Technology Team


 On Fri, Mar 22, 2013 at 2:17 PM, Brian Wolff bawo...@gmail.com wrote:

   been a way to reliably measure performance...
  
 
  The measure part is important. As it stands I have no way of measuring
code
  in action (sure i can set up profiling locally, and actually have but
its
  not the same [otoh i barely ever look at the local profiling i did set
  up...). People throw around words like graphite, but unless im mistaken
us
  non staff folks do not have access to whatever that may be.
 
  -bawolff

Re: [Wikitech-l] OPW intern looking for feedback!

2013-03-27 Thread Brian Wolff
Hi,

Your extension looks quite good. I did an extensive review of your code
over at https://www.mediawiki.org/wiki/User:Bawolff/review/Git2Pages .
While that page may look very long, most of the issues listed there are
minor nitpicks.


 - GitRepository will do a sparse checkout on the information, that is,
 it will clone the repository but only keep the specified file (this
 was implemented to save space)

I don't think that works the way you think it does. The entire repo is
still there (in fact, the entire history of the repo is there).

When I was first testing your extension, I tried to load a snippet
from mediawiki/core.git . The process of cloning the repo basically
made my web server unresponsive (eventually I got bored and restarted
Apache; I'm not sure why wfShellExec's limits didn't kick in before that
point). I think you need to do a shallow clone (e.g. git clone --depth 1),
since you only need one revision of the file.

 - The repositories will be cloned into a folder that is an md5 hash of
 the url + branch, to make sure that the program isn't cloning a ton of
 copies of the same repository

Martijn wrote:
Why hash it, and not just keep the url + branch encoded to some charset
that is a valid path, saving rare yet hairy collisions?

I actually like the hashing. The chance of a collision happening seems
so unlikely that it's not worth worrying about (if it is a concern, one
could use SHA-256 encoded as base36 to make collisions even less
likely). It also prevents the somewhat unlikely case (but more likely than
a collision, imo) of a url+branch combination that is over 255 characters.

One thing I would recommend is prefixing the directory names with
something like mw-git2page-<hash>, so that if a system admin sees all
these entries, s/he would know where they are coming from.

Note: using directory names that can be pre-determined in a public
/tmp directory is a bit dangerous on a shared server. Another user
could make the directory first, put something malicious in it (for example
an evil post-merge hook), and then have your script use the malicious
data. One way around that could be to add the $wgSecretKey (and some
salt) to the variables that generate the hash that becomes the
directory name.
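
A sketch of that suggestion (wfTempDir(), $wgSecretKey and sha1() are real;
the prefix and salt string are illustrative):

    global $wgSecretKey;
    // Unpredictable without the wiki's secret, so it can't be squatted.
    $dir = wfTempDir() . '/mw-git2page-' .
        sha1( $wgSecretKey . '|git2pages|' . $url . '|' . $branch );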


 This is my baseline program. It works (for me at least). I have a few
 ideas of what to work on next, but I would really like to know if I'm
 going in the right direction. Is this something you would use? How
 does my code look; is the implementation up to the MediaWiki coding
 standard? You can find the progression of the code on
 gerrit[3].

Your code actually looks quite good. Better than much of the code that
has come out of mentorship-type projects in the past (imho). I put
quite extensive comments on the wiki page I linked above.


 Here are some ideas of what I might want to implement while still on
 the internship:
 - Instead of a <pre> tag, encase it in a <syntaxhighlight lang=...> tag if
 it's code; maybe add a flag for the user to supply the language

That would be cool.

 - Keep a database of all the repositories that a wiki has (though not
 sure how to handle deletions)

One possibility with that (although quite a bit of work) would be to
have a post-commit hook on the git server that caused the wiki page to
be re-rendered whenever a snippet it includes changes.

As for deletion: check out how the LinksUpdate stuff works, and how
extensions like GlobalUsage that tie into it work. In particular, see the
various hooks in that file:
https://www.mediawiki.org/wiki/Category:MediaWiki_hooks_included_in_LinksUpdate.php
If you meant page deletion (instead of snippet deletion), see the
ArticleDeleteComplete hook.
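
For the page-deletion case, the wiring would look something like this
(ArticleDeleteComplete is a real hook with roughly this signature in the
MediaWiki of this era; the class and the bookkeeping are illustrative):

    // In the extension setup file:
    $wgHooks['ArticleDeleteComplete'][] =
        'Git2PagesHooks::onArticleDeleteComplete';

    class Git2PagesHooks {
        public static function onArticleDeleteComplete(
            &$article, &$user, $reason, $id
        ) {
            // Drop this page's snippet records from the tracking table here.
            return true; // let other handlers run
        }
    }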


 Here are some problems I might face:
 - If I update the working tree each time a file from the same
 repository is added, then the line numbers may not match the old file
 - Should I be periodically updating the repositories or perhaps keep
 multiple snapshots of the same repository

I think it's expected that if somebody puts {{#snippet:...}} on a page,
and then someone commits an update to the file being transcluded by
{{#snippet}}, the {{#snippet:...}} would eventually update (i.e., on
the next page parse).

Having an option to include a snippet from a specific commit, e.g.
{{#snippet:filename=foo|repository=bar|revision=bf75623112354}}, would
be cool (or could the branch option be used to do this already?)

Also, having an option to start at some line based on some pattern
("start at the line that has the word foo on it") might be cool.

 - Cloning an entire repository and keeping only one file does not seem
 ideal, but I've yet to find a better solution, the more repositories
 being used concurrently the bigger an issue this might be

Yes, that's a problem, especially with a big repository. At the very
least it should be a shallow clone. Note that when you clone the repo, you
keep the entire repo, even if only some files are checked out.

 - I'm also worried about security implications of my program. Security
 isn't my area of expertise, and I would definitely appreciate some
 

Re: [Wikitech-l] MediaWiki 1.21.0rc1 -- Help test 1.21 and update the announcement

2013-04-01 Thread Brian Wolff

  Does your version of wiki support pictures in PDF and/or EPS format?

 MediaWiki supports uploading PDF and EPS files with the right
 configuration.  I haven't tried uploading EPS files, but providing
 thumbnails of these and the PDFs should work.


I'm doubtful thumbnails of EPS would just work. PDF thumbs need an extension
(PdfHandler) in order to thumbnail correctly. Uploading both of these
formats should be fine on vanilla MediaWiki with a small config change to
add them to the allowed file types list.
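
Roughly speaking, the LocalSettings.php change is just this (a sketch;
PdfHandler has its own setup requirements beyond the commented line):

    // Allow uploads of these types on a vanilla install.
    $wgFileExtensions[] = 'pdf';
    $wgFileExtensions[] = 'eps';
    // PDF thumbnails additionally need the PdfHandler extension:
    // require_once "$IP/extensions/PdfHandler/PdfHandler.php";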

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit actively discourages discussion

2013-04-01 Thread Brian Wolff
On 2013-04-02 12:32 AM, Jeremy Baron jer...@tuxmachine.com wrote:

 On Tue, Apr 2, 2013 at 2:51 AM, Matthew Walker mwal...@wikimedia.org
wrote:
  You are definitely not the only one who finds these issues annoying. I
too
  worry about the same.

 +1

 Another big one is (1 comment) which gives no information about the
 comment or where it was made and no easy way to jump straight to it.
 (a link or AJAX or whatever).
 OTOH, the email notifs about inline code comments are often useful.

 -Jeremy

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

+1 to both emails.

It can be very frustrating trying to follow a conversation on gerrit after
the fact.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Sort on category

2013-04-10 Thread Brian Wolff
On 2013-04-10 10:25 PM, Small M smallma...@yahoo.com wrote:

 Hello,


 Are there any plans to have a tool that would allow to dynamically sort
content based on category? Single category views, especially for large
categories, aren't particularly helpful.


 Something similar to what Microsoft's pivot demo had?

 An HTML5 example at:
 http://pivot.lobsterpot.com.au/pass2012.htm

 -Small
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

No, there are no current plans for doing this. Interesting idea, though. If
I understand you correctly, you want a category page where things are
grouped by what other categories a page is a member of.

I don't think such a thing can be done in an efficient manner for big
categories, given the way category membership is currently stored.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Sort on category

2013-04-11 Thread Brian Wolff
It certainly can be done (and done without the need for Silverlight) on
smallish categories (to pull numbers out of a hat, things with fewer than
1000 pages would probably be fine).

I doubt the Silverlight control would work with a category that has a
million entries in it (and yes, there are categories that big).

-bawolff

On 2013-04-11 11:56 AM, Small M smallma...@yahoo.com wrote:

 One of the demo collections originally shown in Pivot was Wikipedia
articles sorted by category. Though the collections are no longer up, you
can see videos at:

 http://www.youtube.com/watch?v=BZuFUZpEZ-A?t=2m30s
 http://www.youtube.com/watch?v=vgxCvdoXpwM

 It was done before (albeit using a dedicated viewer which is now a
silverlight control). So it can be done.
 -Small

 
 From: Brian Wolff bawo...@gmail.com
 To: Small M smallma...@yahoo.com; wikitech-l 
wikitech-l@lists.wikimedia.org
 Sent: Wednesday, April 10, 2013 6:42 PM
 Subject: Re: [Wikitech-l] Sort on category


 On 2013-04-10 10:25 PM, Small M smallma...@yahoo.com wrote:
 
  Hello,
 
 
  Are there any plans to have a tool that would allow to dynamically sort
content based on category? Single category views, especially for large
categories, aren't particularly helpful.
 
 
  Something similar to what Microsoft's pivot demo had?
 
  An HTML5 example at:
  http://pivot.lobsterpot.com.au/pass2012.htm
 
  -Small
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 No, there are no current plans for doing this. Interesting idea though.
If I understand you correctly you want a category page where things are
grouped by what other categories a page is a member of.
 I don't think such a thing can be done in an efficient manner for big
categories given the way category membership is currently stored.
 -bawolff


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Clicktracking being phased out

2013-04-12 Thread Brian Wolff
On 4/12/13, Andre Klapper aklap...@wikimedia.org wrote:
 On Thu, 2013-04-11 at 14:08 -0700, Ori Livneh wrote:
 Extension was dropped today.

 Lovely!

 Should http://www.mediawiki.org/wiki/Extension:ClickTracking get a
  <blink>Do not use, go for EventLogging instead</blink> box?

 AFAIK, ClickTracking depends on UserDailyContribs.
 http://www.mediawiki.org/wiki/Extension:UserDailyContribs states that
 UserDailyContribs is still deployed on WMF servers.
 Is phasing out UserDailyContribs also planned? It's still listed on
 https://noc.wikimedia.org/conf/InitialiseSettings.php.txt

 Are the remaining two open ClickTracking bug reports WONTFIX? Shall the
 Bugzilla component ClickTracking be closed for new bug entry?

 Thanks,
 andre
 --
 Andre Klapper | Wikimedia Bugwrangler
 http://blogs.gnome.org/aklapper/


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I took the liberty of adding a note to the extension page to say use
the other extension instead.

For the record, if anyone ever uses <blink> (or the equivalent code
that actually gets through Sanitizer.php) on MW.org, a unicorn kills a
kitten :P

Think of the kittens, don't blink ;)

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Project Idea for GSoC 2013 - Bayesian Spam Filter

2013-04-12 Thread Brian Wolff
On 2013-04-12 7:33 PM, Platonides platoni...@gmail.com wrote:

 On 09/04/13 18:20, Quim Gil wrote:
  Hi Anubhav,
 
  I have done a first reality check with Chris Steipp, who oversees the
  area of security and also spam prevention. Your idea is interesting and
  it seems to be feasible. This is a very good first step!
 
  It would require adding a hook to MediaWiki core, but this could be a
  small, acceptable change.
 I agree. Adding a hook is no problem.


Well, a hook is obviously no problem; I'm not sure why a new one would be
needed. Surely if the abuse filter has all the hooks it needs, so would
this.

Qgil wrote:
It might have a performance penalty in a site like English Wikipedia with
plenty of concurrent edits, but for starters it could be potentially useful
to the 99% of MediaWiki instances that have a significantly smaller number
of daily edits and especially a very small number of editors and tools able
/ happy to deal with spam.

Hmm. I was playing with NLP-ish automated new-page patrol recently. One
thing that crossed my mind was that if it becomes too expensive, one could
run the classifier in the job queue (and hence on a dedicated server or
servers) and tag changes shortly after the fact.
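
Very roughly (everything here is hypothetical - the job name, the classifier
class, the threshold and the tag are all made up for illustration):

    class SpamClassifyJob extends Job {
        public function __construct( $title, $params ) {
            parent::__construct( 'spamClassify', $title, $params );
        }

        public function run() {
            // Score the revision on a job runner rather than during the
            // web request, then tag it after the fact if it looks spammy.
            $score = SpamClassifier::score( $this->params['rev_id'] );
            if ( $score > 0.9 ) {
                ChangeTags::addTags( 'possible-spam', null, $this->params['rev_id'] );
            }
            return true;
        }
    }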

Last of all, I would suggest you also read up on other people who have taken
machine-learning approaches to vandalism detection. In particular,
User:ClueBot_NG - http://en.wikipedia.org/wiki/User:Cluebot_NG . There is
also a list of academic papers on the subject at
http://en.wikipedia.org/w/index.php?title=User:Emijrp/Anti-vandalism_bot_census
(that said, an extension like you are proposing does not have to be as good
as the rather complex state of the art in order to be useful; any effective
system would probably be quite useful).

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Article on API Characteristics

2013-04-17 Thread Brian Wolff
My understanding is it's not really possible to do this in PHP in a way
that would actually be of use to anyone. See
https://bugzilla.wikimedia.org/show_bug.cgi?id=26631#c1
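
For what it's worth, clients can also just suppress the header themselves;
with PHP's cURL bindings that looks roughly like this (a minimal sketch):

    $ch = curl_init( 'http://en.wikipedia.org/w/api.php' );
    curl_setopt( $ch, CURLOPT_POST, true );
    curl_setopt( $ch, CURLOPT_POSTFIELDS, array( 'action' => 'query', 'format' => 'json' ) );
    // An empty Expect header stops cURL from sending
    // "Expect: 100-continue" on large POST bodies.
    curl_setopt( $ch, CURLOPT_HTTPHEADER, array( 'Expect:' ) );
    $result = curl_exec( $ch );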

-bawolff

On 4/17/13, Petr Onderka gsv...@gmail.com wrote:
 Regarding #7 in that list (Expect: 100-Continue), I think it would be nice
 if Wikimedia wikis did this.

 I know that at least in .Net, if I send a POST request to
 http://en.wikipedia.org/w/api.php,
 the Expect: 100-Continue header will be set, which results in an 417
 Expectation failed error.

 .Net has a switch to turn that header off, and with that the request will
 work fine.
 But I think it would be nice if Wikimedia wikis supported this.

 I think this is an issue with something in Wikimedia's configuration
 (Squid? or maybe something like that) and not MediaWiki itself, because it
 works fine for my local MediaWiki installation even with Expect:
 100-Continue set.

 Petr Onderka
 [[en:User:Svick]]


 On Wed, Apr 17, 2013 at 5:50 AM, Tyler Romeo tylerro...@gmail.com wrote:

 Found this interesting articles on designing an API for what it's worth.
 Thought some people my find it interesting.

 http://mathieu.fenniak.net/the-api-checklist/
 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Article on API Characteristics

2013-04-18 Thread Brian Wolff
On 2013-04-18 1:13 AM, Petr Kadlec petr.kad...@gmail.com wrote:

 On 17 April 2013 22:33, Brian Wolff bawo...@gmail.com wrote:

  My understanding is its not really possible to do this in php in a way
  that would actually be of use to anyone. See
  https://bugzilla.wikimedia.org/show_bug.cgi?id=26631#c1
 

 Still, supporting this in a way “that wouldn’t be of use”, i.e. send the
 100 status immediately instead of 417 would probably make it a tiny bit
 easier for clients. However, this is not a bug/problem/feature-request for
 MediaWiki, but for Squid. It seems Apache+PHP would handle this correctly,
 but Squid rejects such requests. There is a configuration variable doing
 exactly what Svick is proposing 
 http://www.squid-cache.org/Doc/config/ignore_expect_100/, but I agree
 turning it on would not be a good idea. And FYI: Squid 3.2 seems to
support
 100-continue somehow, but not sure how much. 
 http://wiki.squid-cache.org/Features/HTTP11

 -- [[cs:User:Mormegil | Petr Kadlec]]
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I disagree. If we supported it, people would expect it to actually work, and
add the extra complexity of supporting 100-continue to their bots. This
would be bad, since it would essentially be a no-op and just slow things
down.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] No cascade protection on Mediawiki namespace

2013-04-20 Thread Brian Wolff
MediaWiki-namespace pages that expand templates generally aren't the scary
ones from an anon-editing perspective. I don't think this is really
necessary, and it can mostly be dealt with socially when needed.

-bawolff

P.S. $5 says someone will probably come up with an exception within 10
seconds of me making such a statement.

On 2013-04-20 2:08 PM, Techman224 techman...@techman224.ca wrote:

 Some people are not aware of this, and they use templates to style
Mediawiki messages. If the administrator forgets to protect the template,
it opens up a backdoor for other users to edit the message.

 Techman224

 On 2013-04-20, at 4:25 AM, Petr Kadlec petr.kad...@gmail.com wrote:

  On 20 April 2013 07:08, Techman224 techman...@techman224.ca wrote:
 
  Right now the MediaWiki namespace is protected from editing. However,
if
  you add a template to a Mediawiki message and the template is
unprotected,
  any user could edit the message by editing the template, creating a
  backdoor.
 
 
  Ummm... Don't do that, then? Sometimes you _want_ to include pieces of
text
  editable by more than just sysops (say, by autoconfirmed users), so you
use
  a template from a MediaWiki message. (Cf. e.g.
  https://cs.wikipedia.org/wiki/MediaWiki:Recentchangestext and
  https://cs.wikipedia.org/wiki/%C5%A0ablona:Ozn%C3%A1men%C3%ADRC.) Or,
you
  do not want to do that, then why are you using an unprotected template?
  Either transclude another page in MediaWiki namespace as a template
  ({{MediaWiki:Something}}), or make sure you protect the used template
(with
  possible cascade).
 
  -- [[cs:User:Mormegil | Petr Kadlec]]
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 'Please - Help ' (GSOC)

2013-04-23 Thread Brian Wolff
On 2013-04-24 1:34 AM, hungers.to.nurt...@gmail.com wrote:




 I am wishing to propose a project that needs Wikipedia and I discussed
about it a little on IRC.

 I wish to know that Wikimedia  will not consider any Wikipedia related
project as idea’s page says or if somehow I am able to demonstrate that I
will be able to complete it in summer they can consider it then.

 Sent from Windows Mail
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

It's kind of hard to give a full response, as your mail did not include
details of what you are proposing (and I'm too lazy to go digging through
the IRC logs).

In general though: convincing Wikipedians that something is a good idea is
like herding cats; it can be difficult. Thus projects that would be useful
to a wide variety of wikis are preferred. If, after all is said and done,
Wikipedia ends up not liking the results, then someone else may use it.
Something usable by many wikis may also solve a more generic problem and be
more generally useful. The last thing we want is for someone to do a
project and have their target audience respond with "why would anyone want
such a thing?"

There have been successful projects from previous years that targeted a
specific project - for example, one year someone made a bot to import legal
judgements into Wikisource. If you are doing such a project, I think the key
point is to demonstrate beyond a shadow of a doubt that the community in
question actually wants your project. This will probably be hard to do
unless you are already a member of the community in question.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File Licensing Guidelines

2013-04-24 Thread Brian Wolff
On 2013-04-24 6:46 PM, Matthew Walker mwal...@wikimedia.org wrote:

 At the risk of starting another huge bikeshed like [1] I feel like we need
 some good guidance on just how in the heck we are required to license
 extensions/images/source code files. With the help of Marktraceur we now
 have

http://www.mediawiki.org/wiki/Manual:Coding_conventions#Source_File_Headers
 which
 is somewhat specific to PHP but could be generalized to JS, CSS, and SQL.

 [1] http://lists.wikimedia.org/pipermail/wikitech-l/2013-March/067217.html

 But I have some additional questions... breaking this up into bits; my
 current thought matrix is that:

 == Extensions ==
 * Must have a LICENSE file in the root with the full text of the license
 for the extension, and appended any additional licenses for
 libraries/resources they've pulled in
 ** How do we specify what license goes to what included component?

By saying in the license file that component x is under license y.


 == PHP Files ==
 * For generic files, include a statement like

http://www.mediawiki.org/wiki/Manual:Coding_conventions#Source_File_Headers
 * If it's the Extension.php file $wgExtensionCredits array should have the
 following items
 ** author
 ** version
 ** url
 ** license?
 ** If we include additional libraries, so we add another entry to the
 wgExtensionCredits array?

Adding license info to extension credits is an interesting idea. I have no
idea how we would display it, though.
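
Maybe something along these lines (purely hypothetical - core has no
'license-name' key today, and the extension details are invented):

    $wgExtensionCredits['other'][] = array(
        'name' => 'FooBar',
        'author' => 'Jane Doe',
        'version' => '0.1.0',
        'url' => 'https://www.mediawiki.org/wiki/Extension:FooBar',
        // Hypothetical key; nothing in core reads this yet.
        'license-name' => 'GPL-2.0+',
    );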

 == JS/CSS Files ==
 This gets a bit confusing because apparently we're supposed to have a
 license in every bit of content pushed to the user; did we ever settle
that
 huge thread in any meaninful way? E.g. how to push minimized but licensed
 files?

Says who? I do not believe this is a requirement. It perhaps would be nice,
if done sanely, but not a requirement.

 == Image Files ==
 Really shouldn't be licensed under GPLv2; but right now they implicitly
 are. Is there a way to explicitly identify image/binary content as being
CC
 licensed? Do we just add a line to the license file about this?

Yes, they should be licensed under GPLv2, as they are part of the software.
It would be nice if they were dual-licensed under something CC-BY-SA-like
as well. A line in the license file sounds fine.

I think you are slightly overthinking things. We just need to adequately
communicate the license to potential reusers. It doesn't much matter how,
provided people are informed.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Support for multiple content languages in MW core

2013-04-25 Thread Brian Wolff
On 2013-04-25 7:04 AM, Erik Moeller e...@wikimedia.org wrote:

 On Tue, Apr 23, 2013 at 9:08 PM, Brian Wolff bawo...@gmail.com wrote:

 Hi Brian,

  We already have the page lang support.

 What do you mean by that? AFAICT there's no existing designated place
 in the schema for associating a content language with a specific page.

 Thanks,
 Erik

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

There's nothing in the schema. But we do (since 1.19) have a notion of
page language separate from content language. See the PageContentLanguage
hook and related code. So all that needs to be done is add something to the
schema to actually store a value.
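
For illustration, a handler looks roughly like this (PageLangStore is a
made-up stand-in for wherever the per-page value would be stored):

    $wgHooks['PageContentLanguage'][] = function ( $title, &$pageLang, $userLang ) {
        $stored = PageLangStore::get( $title ); // hypothetical lookup
        if ( $stored ) {
            // Override the default (the wiki's content language).
            $pageLang = $stored;
        }
        return true;
    };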

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Support for multiple content languages in MW core

2013-04-25 Thread Brian Wolff
On 2013-04-25 9:12 AM, Amir E. Aharoni amir.ahar...@mail.huji.ac.il
wrote:

 2013/4/25 Brian Wolff bawo...@gmail.com:
  On 2013-04-25 7:04 AM, Erik Moeller e...@wikimedia.org wrote:
 
  On Tue, Apr 23, 2013 at 9:08 PM, Brian Wolff bawo...@gmail.com wrote:
 
  Hi Brian,
 
   We already have the page lang support.
 
  What do you mean by that? AFAICT there's no existing designated place
  in the schema for associating a content language with a specific page.
 
  There's nothing in the schema. But we do (since 1.19) have a notion of
  page language separate from content language. See the
PageContentLanguage
  hook and related code. So all that needs to be done is add something to
the
  schema to actually store a value.

 That, and a way for the user to specify that language. Either through
 a magic word or through a language selector on the editing page
 (either ULS or a dropdown). Of course, the wiki language should be the
 default. Is there anything else to it?

 That's a as far as a page language goes; It is also useful to specify
 the language of chunks of a page. Currently it's done with the HTML
 lang attribute, either raw or through templates. It should probably be
 done using the VisualEditor, and I already wrote a general spec for it
 a while ago:

https://www.mediawiki.org/wiki/VisualEditor/Internationalization_requirements

 --
 Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
 http://aharoni.wordpress.com
 ‪“We're living in pieces,
 I want to live in peace.” – T. Moore‬

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I'm personally opposed to using a magic word for this. Having a magic word
that changes how magic words encountered prior to it are interpreted (not
to mention changing how it itself is interpreted) seems to be just asking
for trouble.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Anyone using Apache 2.4 to run MediaWiki?

2013-04-25 Thread Brian Wolff
On 2013-04-25 2:01 PM, Mark A. Hershberger m...@everybody.org wrote:

 On 04/25/2013 11:42 AM, Tyler Romeo wrote:
  How exactly did it not succeed? That's a pretty serious bug.

 I did say I didn't try very hard.  I'm not sure php even works with
 apache 2.4 which is labeled experimental on Debian.

 So five minutes was about enough to apt-get the apache 2.4 package and
 dependencies and go to the URL that it was already set up on and see
 that it didn't work.

 If you'd like me to test some more, I can.  But I was hoping someone
 else had some experience before I had to dig in further.

 Mark.

 --
 http://hexmode.com/

 Imagination does not breed insanity. Exactly what does breed insanity
 is reason. Poets do not go mad; but chess-players do.
 -- G.K. Chesterson

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I would say we shouldn't document it unless we have a vague idea why. From
your description it could be that Debian doesn't have PHP enabled by default
in its Apache package, or something like that, which would not be a MW
issue.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] $wgUseMathJax

2013-04-27 Thread Brian Wolff
On 2013-04-27 4:34 PM, Moritz Schubotz phy...@physikerwelt.de wrote:

 Hi,



 I'd like to improve the variables of the math extension. I think
 $wgUseMathJax is misleading.

 I'd like to change $wgUseMathJax to $wgAllowMathJax and set it to true
 by default. In the same style I'd like to introduce $wgAllowLaTeXML
 and $wgDebugMath.



 What are your suggestions?



 Best regards

 Physikerwelt

 --
 Mit freundlichen Grüßen
 Moritz Schubotz

   Telefon (Büro):  +49 30 314 22784
   Telefon (Privat):+49 30 488 27330
   E-Mail: schub...@itp.physik.tu-berlin.de
   Web: http://www.physikerwelt.de
   Skype: Schubi87
   ICQ: 200302764
   Msn: mor...@schubotz.de

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Changing config variable names tends to cause lots of confusion. I would
recommend against it unless you have a really good reason to.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tag extensions in Block mode

2013-04-28 Thread Brian Wolff
On 2013-04-28 6:01 AM, Moritz Schubotz phy...@physikerwelt.de wrote:

 Hi,

 how can I figure out in a tag extension callback e.g.

 function wfSampleRender( $input, array $args, Parser $parser, PPFrame
$frame )

 if the parser is inside a block mode or not?
 With block mode I mean something like
 :<mytag></mytag>
 or
 #<mytag></mytag>
 or

 <mytag></mytag>

 but not
 blindtext <mytag></mytag> more text

 If there is an extension that uses ways to dermine the block a link
 would help a lot.

 Best regards
 physikerwelt

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

What are you trying to do that requires that knowledge? Off the top of my
head (so I could be wrong), block-level-ness is determined entirely after
parser tags are replaced.

If you are worried about doBlockLevels messing up the output of your
extension, you can return an array with a certain structure to tell the
parser to protect your output from it. (Which is, for example, how
<nowiki> is implemented internally, I believe.)
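
From memory, that looks something like this (a sketch only; details may be
off):

    function wfSampleRender( $input, array $args, Parser $parser, PPFrame $frame ) {
        $output = '...'; // whatever your tag renders to
        // 'markerType' => 'nowiki' tells the parser to use a strip
        // marker that doBlockLevels leaves alone.
        return array( $output, 'markerType' => 'nowiki' );
    }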

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GSOC2013 Project idea - WiPhys: Box2D Interactive Physics Engine Plugin

2013-04-28 Thread Brian Wolff
On 2013-04-28 5:11 PM, Quim Gil q...@wikimedia.org wrote:

 On 04/28/2013 10:17 AM, Moriel Schottlender wrote:

 Hi everyone,

 I posted an idea earlier this weekend to the list and received feedback
--
 and I really appreciate it! It made me realize that the idea I proposed
was
 a little vague and elaborate and sounded too complex. So I re-drafted it
 and simplified it a lot, and I would really appreciate your opinions
again.
 I think it can be doable for a GSoC project.

 The idea briefly: My idea is to develop a plugin for MediaWiki that
enables
 the easy embedding of interactive physics demos in wiki articles, using a
 Javascript physics engine like Box2web (which is based on Box2D)

 You can see the new edited proposal here:
 http://www.mediawiki.org/wiki/User:Mooeypoo/GSOC_2013_Project


 Also, since this idea does not appear in the main idea list, I am hoping
 there may be a mentor available :)


 Good! There is not much time left. Please file a Bugzilla report and
submit your proposal to the GSoC site:

 http://www.google-melange.com/gsoc/homepage/google/gsoc2013

 You are also encouraged to submit the proposal to Outreach Program for
Women. That requires also a first contribution related with your project
proposal. Any suggestions? I'm no an expert in this area... Maybe something
related with Commons / media types?

 Anyway, the idea is that you just continue with the assumption that a
mentor will appear. Anybody interested?

 Thank you and good luck!

 --
 Quim Gil
 Technical Contributor Coordinator @ Wikimedia Foundation
 http://www.mediawiki.org/wiki/User:Qgil


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

That sounds interesting. I may be willing to (co-)mentor if the category
redirect thing falls through.

From your user page I see you are still waiting to hear back about something
else. I would recommend officially submitting your proposal now anyway, and
just make sure to let us know whether you are actually available before we
have to select the proposals.

As for a related contribution: since this is a "make a new extension"
project, there's not much that's directly related. I would recommend maybe a
ResourceLoader commit, as this project involves lots of JS.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upcoming change to section edit links

2013-04-29 Thread Brian Wolff
On 2013-04-29 8:01 PM, Steven Walling swall...@wikimedia.org wrote:

 Hi all,

 To resolve bug 41729, patch 49364 was recently merged. This means that
 there's going to be a small design change for section edit links, and
 probably more relevant for this list, the way the HTML for them is
 generated will change. This is likely to break any custom gadgets,
 scripts or styles, etc. You can already see the change on
 MediaWiki.org and testwiki, if you're curious.

 There's pretty comprehensive documentation about this change at:
 https://meta.wikimedia.org/wiki/Change_to_section_edit_links

 Thanks,

 --
 Steven Walling
 https://wikimediafoundation.org/

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

For future reference, if you are going to write a mail about something
changing, you should actually (succinctly) mention what's changing. In this
case: section edit links are being moved to sit right beside the headline
instead of on the right of the page, and the class name is changing, which
may break user scripts.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tag extensions in Block mode

2013-05-01 Thread Brian Wolff
Hmm. I don't think that is really possible. You may have to simply do <math
mode="block"> and <math mode="inline"> instead.
-bawolff
On 2013-05-01 4:15 AM, Moritz Schubotz phy...@physikerwelt.de wrote:

 Hi bawolff,

 I'm trying the following. Looking for example at
 http://math-test.instance-proxy.wmflabs.org/wiki/Summation

 The equations denoted with a newline and ":<math>", e.g.
 :<math>\sum_a^b</math>
 should be rendered in displaystyle, whereas inline equations, e.g.
 "Let <math>\sum_i 2^{-i}</math>", should be rendered normally.

 Best
 Moritz

 On Sun, Apr 28, 2013 at 8:32 PM, Brian Wolff bawo...@gmail.com wrote:
  On 2013-04-28 6:01 AM, Moritz Schubotz phy...@physikerwelt.de wrote:
 
  Hi,
 
  how can I figure out in a tag extension callback e.g.
 
  function wfSampleRender( $input, array $args, Parser $parser, PPFrame
  $frame )
 
  if the parser is inside a block mode or not?
  With block mode I mean something like
  :mytag/mytag
  or
  #mytag/mytag
  or
 
  mytag/mytag
 
  but not
  blindtext mytag/mytag more text
 
  If there is an extension that uses ways to dermine the block a link
  would help a lot.
 
  Best regards
  physikerwelt
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
  What are you trying to do that requires that knowledge?  Off the top of
my
  head (so could be wrong) block level-ness is determined entirely after
  parser tags are replaced.
 
  If you are woried about doBlockLevels messing up the output of your
  extension, you can return an array with certain structure to tell the
  parser to protect your output from doBlockLevels messing with it.
(Which is
  for example how nowiki is implemented internally I believe)
 
  -bawolff
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l



 --
 Mit freundlichen Grüßen
 Moritz Schubotz

   Telefon (Büro):  +49 30 314 22784
   Telefon (Privat):+49 30 488 27330
   E-Mail: schub...@itp.physik.tu-berlin.de
   Web: http://www.physikerwelt.de
   Skype: Schubi87
   ICQ: 200302764
   Msn: mor...@schubotz.de

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML Validation Shell Script

2013-05-01 Thread Brian Wolff
$wgValidateAllHtml ?

-bawolff

On 2013-05-01 3:36 PM, Rob Moen rm...@wikimedia.org wrote:

 Like a boss!  Thanks Jon.

 On Wed, May 1, 2013 at 11:22 AM, Jon Robson jdlrob...@gmail.com wrote:

  W3CValidationTest





 --
 Rob Moen
 Wikimedia Foundation
 rm...@wikimedia.org
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Global watchlist and watchlist wishlist

2013-05-06 Thread Brian Wolff
On 2013-05-06 1:31 PM, Quim Gil q...@wikimedia.org wrote:

 On 05/06/2013 05:41 AM, Guillaume Paumier wrote:

 A few people have started to organize the various bug reports about
 watchlists, but there is still much to do before we have a clear 
 prioritized vision of what the watchlist feature should become.


 fwiw I had already suggested a Bug Day focusing on the Watchlist feature.
If this is considered useful Andre could schedule it whenever appropriate.


 Therefore, if a few developers could declare their interest in
 tackling the watchlist issue in the foreseeable future, it would help
 arouse interest and enthusiasm from users, and motivate them to
 organize user research in order to design a better watchlist feature.

 I don't think we need a formal pledge or commitment; a simple
 declaration of interest would imho be enough to get started. The
 specifics can be ironed out later.


 Sounds like an entry to
http://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects#Raw_projectsmight
help as soon as there is a broad idea of what needs to be done.

What happened to last year's GSoC project in this area? Are there people
working on getting it merged?

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Global watchlist and watchlist wishlist

2013-05-06 Thread Brian Wolff
+1

Such a rule would seem rather misguided...

-bawolff
On 2013-05-06 6:58 PM, Sumana Harihareswara suma...@wikimedia.org wrote:

 On 05/06/2013 05:26 PM, Aaron Pramana wrote:
  Brian Wolff bawolff at gmail.com writes:
 
 
  On 2013-05-06 1:31 PM, Quim Gil qgil at wikimedia.org wrote:
 
  On 05/06/2013 05:41 AM, Guillaume Paumier wrote:
 
  A few people have started to organize the various bug reports about
  watchlists, but there is still much to do before we have a clear 
  prioritized vision of what the watchlist feature should become.
 
 
  fwiw I had already suggested a Bug Day focusing on the Watchlist
  feature.
  If this is considered useful Andre could schedule it whenever
 appropriate.
 
 
  Therefore, if a few developers could declare their interest in
  tackling the watchlist issue in the foreseeable future, it would help
  arouse interest and enthusiasm from users, and motivate them to
  organize user research in order to design a better watchlist feature.
 
  I don't think we need a formal pledge or commitment; a simple
  declaration of interest would imho be enough to get started. The
  specifics can be ironed out later.
 
 
  Sounds like an entry to
 
 
 http://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects#Raw_proj
  ectsmight
  help as soon as there is a broad idea of what needs to be done.
 
  What happened to last years gsoc project in this area? Are there people
  working on getting it merged?
 
  -bawolff
  ___
  Wikitech-l mailing list
  Wikitech-l at lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 
  Hi Folks,
 
  Last year, I worked on making improvements to the watchlist feature as my
  project for GSoC. The changes I made are still available as an unmerged
  change in Gerrit. [1] I'm glad to see that the project is starting to
 gain
  momentum. A major limitation of my efforts last year was that, under GSoC
  rules, I was the only developer who was allowed to code for the project.

 Aaron, can you point me to that rule? I am nearly certain that you are
 wrong.

  Now
  that other developers are looking to work on the project, I think that a
  project of this size has a much better chance of succeeding. Watchlist
  improvements as a project is extremely susceptible to feature creep, and
  unless there is a well-defined plan from the start, it risks getting
 shelved
  entirely. A blog of my progress last summer is available for ideas. [2]
 
  I'd recommend fixing feature request bugs as a large coordinated
 overhaul of
  the watchlist, rather than a piecemeal approach that may result in
  duplicative code or work that gets scrapped soon after it is finished.
 
  I may be unavailable intermittently for the next 6 weeks, but I should be
  able to help out from the middle of June until early September.
 
  -Aaron
 
  [1] https://gerrit.wikimedia.org/r/#/c/16419/
  [2] http://mw-watchlist.tumblr.com/
 
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l



 --
 Sumana Harihareswara
 Engineering Community Manager
 Wikimedia Foundation

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Coding style: Language construct spacing

2013-05-08 Thread Brian Wolff
On 2013-05-08 9:26 PM, Krinkle krinklem...@gmail.com wrote:

 Hi,

 Since there appears to have been a little bit of trivia around fixing
 these phpcs warnings, I'll open a thread instead.

 Both in javascript and PHP there are various keywords that can be used
 as if they are functions. In my opinion this is a misuse of the
 language and only causes confusion.

 I'm referring to code like this (both javascript and php):

 delete( mw.legacy );

 new( mw.Title );

 typeof( mw );

 echo( $foo . $bar );

 print( $foo . $bar );

 return( $foo . $bar );

 … and, wait for it..

 require_once( $foo . $bar );

 I think most experienced javascript developers know by now that using
 delete or new like it is a function is just silly and looks like
 you don't know what you're doing.

 To give a bit of background, here's why these work at all (they aren't
 implemented both keywords and functions, just keywords). Though I'm
 sure the implementation details differ between PHP and javascript, the
 end result is the same: Keywords are given expressions which are then
 evaluated and the result is used as value. Since expressions can be
 wrapped in parenthesis for readability (or logic grouping), and since
 whitespace is insignificant to the interpreter, it is possible to do
 `return(test)`, which really is just `return (test)` and
 eventually `return test`.

 I'm obviously biased, but I think the same goes for require_once
 (and include, require etc.). Right now this is causing quite a few
 warnings in our php-checkstyle report.

 I didn't disable that rule because it appears (using our code base as
 status quo) that we do want this. There's 0 warnings I could find in
 our code base that violate this, except for when the keyword is
 include|require(_once)?

 The check style sniffer does not (and imho should not) have a separate
 rule per keyword. Either you use constructs like this, or you don't.

 But let's not have some weird exception just because someone didn't
 understand it[1] and we all copied it and want to keep it for no
 rational reason.

 Because that would mean we have to either hack the sniffer to exclude
 this somehow, or we need to disable the rule, thus not catching the
 ones we do use.

 See pending change in gerrit that does a quick pass of (most of) these
 in mediawiki/core:

 https://gerrit.wikimedia.org/r/62753


 -- Krinkle

 [1] Or whatever the reason is the author originally wrote it like
 this. Perhaps PHP was different back then, or perhaps there was a
 different coding style.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] category intersection conversations

2013-05-08 Thread Brian Wolff
On 2013-05-08 11:48 PM, James Forrester jforres...@wikimedia.org wrote:

 On 8 May 2013 18:26, Sumana Harihareswara suma...@wikimedia.org wrote:

  Recently a lot of people have been talking about what's possible and
  what's necessary regarding MediaWiki, CatScan-like tools, and real
  category intersection; this mail has some pointers.
 
  The long-term solution is a sparkly query for, e.g., people with aspects
  novelist + Singaporean, and it would be great if Wikidata could be the
  data-source.  Generally people don't really want to search using
  hierarchical categories; they want tags and they want AND. But
  MediaWiki's current power users do use hierarchical labels, so any
  change would have to deal with current users' expectations.  Also my
  head hurts just thinking of the but my intuitively obvious ontology is
  better than yours arguments.
 

 To put a nice clear stake in the ground, a magic-world-of-loveliness
 sparkly proposal for 2015* might be:

Just to clarify: you mean sparkles in the way that a unicorn sparkles as
it's hopping over a rainbow, not sparkle as in SPARQL (semantic triple-store
based)?


 * Categories are implemented in Wikidata
 * - They're in whatever language the user wants (so fr:Chat and en:Cat
and
 nl:kat and zh-han-t:貓 …)

Issue (which can probably be dealt with somehow, or may be rare enough not
to care about): conflicts - what if the name of one category in French is
the same as a different category in Spanish? This may be a non-issue if done
using Wikidata numeric IDs.

 * - They're properly queryable

Various groups have various definitions of this.

 * - They're shared between wikis (pooled expertise)

Between Wikipedias, or all Wikimedia wikis? Category structure has varied
meaning between projects. Category:North_America has different types of
pages on enwikinews compared to enwikipedia.

 * Pages are implicitly in the parent categories of their explicit
categories
 * - Pages in Politicians from the Netherlands are in People from the
 Netherlands by profession (its first parent) and People from the
 Netherlands (its first parent's parent) and Politicians (its second
 parent) and People (its second parent's parent) and …
 * - Yes, this poses issues given the sometimes cyclic nature of
 categories' hierarchies, but this is relatively trivial to code around

In the current structure, it doesn't make sense for Bob to be in "list of
people by profession". It makes less sense the further you traverse the
category graph. OTOH, better querying capabilities may turn the category
system into more of a flat namespace, making that less of an issue.


 * Readers can search, querying across categories regardless of whether
 they're implicit or explicit
 * - A search for the intersection of People from the Netherlands with
 Politicians will effectively return results for Politicians from the
 Netherlands (and the user doesn't need to know or care that this is an
 extant or non-extant category)

We would need some system to turn fake cats into real queries. I suppose
users could make redirects. The alternative of magic NLP sounds difficult.
 * - Searches might be more than just intersections, e.g. Painters from
 the United Kingdom AND Living people NOT Members of the Royal
Academy
 or whatever.
 * - Such queries might be cached (and, indeed, the intersections that
 people search for might be used to suggest new categorisation schemata
that
 wikis had previously not considered - e.g. British politicians  People
 with pet cats  People who died in hot-ballooning accidents)

Dealing with cache invalidation (unless it is quite coarse-grained) may be
difficult.

 * Editors can tag articles with leaf or branch categories, potentially
 over-lapping and the system will rationalise the categories on save to the
 minimally-spanning subset (or whatever is most useful for users, the
 database, and/or both)

That's quite an interesting idea, and one I haven't heard before in
previous discussions of this.

One concern I'd have is how to figure out which categories to list at the
bottom of the page (all that could fit, or only the base categories - and
how to determine what that is).

 * - Editors don't need to know the hierarchy of categories *a priori*
when
 adding pages to them (yay, less difficulty)
 * - Power editors don't need to type in loads of different categories if
 they have a very specific one in mind (yay, still flexible)
 * - Categories shown to readers aren't necessarily the categories saved
in
 the database, at editorial judgement (otherwise, would a page not be in
 just a single category, namely the intersection of all its tagged
 categories?)

 ​Apart from the time and resources needed to make this happen and
 operational, does this sound like something we'd want to do? It feels like
 this, or something like it, would serve our editors and readers the best
 from their perspective, if not our sysadmins. :-)

 [Snip]
 ​

  I think the best place to pursue this topic is probably in
  

Re: [Wikitech-l] category intersection conversations

2013-05-09 Thread Brian Wolff
On 2013-05-09 3:21 PM, Luke Welling WMF lwell...@wikimedia.org wrote:

 Without deliberately making it an even longer term plan, as I think it is
a
 great idea, another long goal solution to the same problem would be (as
 Flow gets Wikipedians into the idea of tagging) that categories get
largely
 replaced by tags.  That way they lose much of their absoluteness and
 therefore some of their controversy.

 Categories are hard for Wikipedia because compromise is not possible.
  Consensus can be reached on a subtly different compromise version of the
 wording of a sentence or paragraph, but there is no compromise on
 categories.  A category either exists or does not. A page either goes in
or
 does not.

 With tags, a biography could relatively uncontroversially  be tagged as
 Novelist, Woman, Best Selling, American, Blonde Haired, Enjoys Spicy
Food
 even if nearly everybody agrees that half the tags while true are entirely
 unimportant and not relevant to the subject's area of notability.  Whether
 some tags like race and appearance should exist at all may still generate
 debate, but if they are only ever available modifiers and not hard
 categories their offense would be softened.

 For some subjects, entirely uncontroversial tags could be extracted from
 Wikidata.

 It would be content shakeup and therefore perhaps politically difficult,
 but it would take a lot of the technical challenge out of joins, even
 permitting joins (automatically or manually) with tags translated into
 equivalent versions in other languages.

 All possible combinations of tag derived categories would then exist,
and
 it would just be a matter of debate as to whether there is a justification
 to add a link from a page to Biography+Novelist+Enjoys Spicy Food or if
 that is a meaningless category.  If reverted, the one person interested in
 that exact category could still always visit it, it's just that other
users
 would not be directed to it unless they probe talk page debates.

 Luke Welling



Nobody has ever been able to explain to me the technical difference between
a tag and a category (other than being able to query intersections, which is
wanted for categories anyhow).

Just change MediaWiki:Pagecategories to "Tags" and change some social
conventions - boom, you have tags.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] category intersection conversations

2013-05-09 Thread Brian Wolff
On 2013-05-09 4:13 PM, James Forrester jforres...@wikimedia.org wrote:

 On 9 May 2013 12:07, Brian Wolff bawo...@gmail.com wrote:

  Nobody has ever been able to explain to me the technical difference
between
  tag and category. (Other than being able to query intersections, which
is
  wanted for cats anyhow)
 

 The theory is that tags are non-hierarchical, casually-applied and
 well-supported in software (from intersections to more). You can see how
 people feel what we have is somewhat different from this world vision. :-)

 J.
 --
 James D. Forrester
 Product Manager, VisualEditor
 Wikimedia Foundation, Inc.

 jforres...@wikimedia.org | @jdforrester
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Categories can be flat if people want them to be. Categories can be casual
if people want them to be (I guess the red link discourages that, but it's
trivial to change).

People seem to want lots of things. When it comes to the tag camp, other
than non-crappy category intersection, we seem to already have the things
people want, which makes me wonder why people are asking for them.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [GSoC] tools for micromanagement

2013-05-10 Thread Brian Wolff
On 2013-05-10 3:52 AM, Yury Katkov katkov.ju...@gmail.com wrote:

 Hi everyone!

 What tools do you use for a small tasks in Google Summer of Code? I mean
 the tasks like prepare the working environment, learn gerrit, write a
 blogpost, etc.? I think that Bugzilla is too heavy for this purpose.

 Also can we use microblogging for reporting the current progress (in
 addition to posts in a blog one in 2 weeks )? I tried that once and it was
 very fun and efficient.

 Cheers,
 -
 Yury Katkov, WikiVote
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I think traditionally we just told people to do things without using any
tool.

As for microblogging - by all means, whatever works. There used to be
quite a few MW devs who actively used identi.ca.

-bawolff

P.S. I'm not Quim, so my answers are unofficial and just my opinions, but I
doubt we would have hard-and-fast rules on this sort of thing.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [GSoC] tools for micromanagement

2013-05-10 Thread Brian Wolff
On 2013-05-10 4:13 PM, Amir E. Aharoni amir.ahar...@mail.huji.ac.il
wrote:

 2013/5/10 Yury Katkov katkov.ju...@gmail.com:
  Hi everyone!
 
  What tools do you use for a small tasks in Google Summer of Code? I mean
  the tasks like prepare the working environment, learn gerrit,
write a
  blogpost, etc.? I think that Bugzilla is too heavy for this purpose.

 Short answer: Google Docs, regular status meetings and a little bit of
 discipline should be enough.

 Long answer:

 I was a mentor in several projects. The most successful of them was in
 the last few months, and I mentored two students there. It's mostly
 done, and they are fixing the last bugs. They are actively studying,
 and together they had only about 10 hours a week.

 How did it work? Very simply: We had weekly meetings. Each meeting
 began with the students doing a demo of what they achieved. Then we
 had a little discussion about where should the project go next and
 wrote a list of tasks for the next week in a shared Google doc. We
 were just adding more and more tasks to the end. We tried to stick to
 it and check that the tasks were completed in the beginning of the
 next meeting, and marking completed tasks as done. And so on.

 This way of management was inspired by the Agile management
 methodology, though it doesn't follow it precisely. The Agile
 principles that we tried to follow were:
 1. As much as possible, letting the developers (the students)
 participate in the planning their own work and deciding what needs to
 be done.
 2. Breaking the work into small and clearly defined tasks. This
 includes all work-related tasks: both actual coding, as well as stuff
 around it, such as signing documents, opening accounts, learning
 Objective-C, uploading to AppStore etc.
 3. Making a long-term plan, but being ready to change it along the way.

 Trello was mentioned in one of the emails here. I didn't try it, but
 it may be good; It sounds like the kind of thing that was meant for
 this kind of task management. But honestly, if a Google doc works for
 you, don't work too hard to find something more complicated.

 If the student you are mentoring has more than 5 hours a week, you'll
 probably want to do the meetings more frequently than once a week.

 That's about it.

 --
 Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
 http://aharoni.wordpress.com
 ‪“We're living in pieces,
 I want to live in peace.” – T. Moore‬

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

One nice thing about wiki pages over Google Docs is that interested third
parties can see what is going on (which I would consider a good thing).
-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is it possible to set revision tags from the API?

2013-05-11 Thread Brian Wolff
On 2013-05-11 5:06 PM, Jamie Thingelstad ja...@thingelstad.com wrote:

 I've looked into this and my conclusion is that it isn't possible, but I
would really like to do this so I figured I would ask.

 I have a bot making edits via the API and I would like to use revision
tags with the bots edits. Is this possible from the API?

 I see where I can retrieve them in revisions, and get them as properties
for a number of things. But I can't seem to see anyway that I can associate
a tag with a given revision or edit.
 Jamie Thingelstad
 ja...@thingelstad.com
 mobile: 612-810-3699
 find me on AIM Twitter Facebook LinkedIn

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

No, that is not possible. Tags can currently only be set by extensions.
(Arguably, having some interface for users to manipulate tags might be a
good thing.)
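
For completeness, from extension code it's roughly this (a sketch; the tag
itself would also need to be registered, via the ListDefinedTags hook if I
recall correctly, and $revId is a placeholder):

    // Apply an existing tag to a specific revision.
    ChangeTags::addTags( 'my-bot-edit', null, $revId );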

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Brian Wolff
On 2013-05-13 5:16 AM, Moriel Schottlender mor...@gmail.com wrote:

 On Mon, May 13, 2013 at 4:06 AM, Markus Glaser gla...@hallowelt.biz
wrote:

  I like that idea very much. In the use case I have in mind, though, I do
  have actual releases. Do you think it's possible for your extension to
also
  consider tags? I am thinking of something like a tagging convention,
e.g.
  RELEASE v1.20. ExtensionStatus could then parse the tag and send it
  back to the user.
 

 Hi Markus,

 Absolutely, I wish it was already a convention. I created the 'read the
 remote git' to go around that problem. I could, however, set the system to
 first check a release tag and then fall back to testing dates/commits like
 it does now if the release tag is unavailable.

 The problem with tags, though, is that we will need to have some common
 location that keeps the newest release on record so the extension can then
 compare the local tag against a remote update. I believe this is what's
 done in systems like Wordpress and Drupal, but their extension database
 system is completely different, too, and I don't think it fits MW at all.
 For that matter, their extensions are more 'individual-based' rather than
 collaborative to the community, that won't work here.

 It will also require extension updaters/developers to update those tags.
 I think it's fairly easy to add a local vs remote tag comparison, the
 question is how it can be implemented in terms of convention so all
 extensions end up following it. Is it realistic?


 --
 No trees were harmed in the creation of this post.
 But billions of electrons, photons, and electromagnetic waves were
terribly
 inconvenienced during its transmission!
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

We actually do auto-create tags (REL1_XX) whenever we do a release; they
correspond to the version of the extension included in the installer (if it
is included) and are used by Special:ExtensionDistributor. How much people
update the tags varies by extension, with many not really being updated,
but some are.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bugzilla Weekly Report

2013-05-13 Thread Brian Wolff
On 2013-05-13 7:21 AM, Željko Filipin zfili...@wikimedia.org wrote:

 On Mon, May 13, 2013 at 5:00 AM, reporter repor...@kaulen.wikimedia.org
wrote:

  General/Unknown 12
  Site requests   12
  General/Unknown 11
 

 General/Unknown is listed twice.

 Željko
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

With different results each time ;)

Presumably they represent general/unknown components from 2 different
products.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Job queue upgrade

2013-05-14 Thread Brian Wolff
You can clear your job queue (run all jobs in it) by running
maintenance/runJobs.php and disabling user write access temporarily to
prevent new jobs from being created.

Whether or not that is reasonable advice is another question. OTOH, most
jobs are rather unimportant if a bunch get lost (a couple of emails might
not be sent; some caches might have to wait for the next edit to be
cleared).
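
Concretely, something like this (a sketch; the read-only message is just an
example):

    // In LocalSettings.php, temporarily, to stop new edits/jobs:
    $wgReadOnly = 'Upgrading; back in a few minutes.';
    // ...then drain the queue from the command line:
    //   php maintenance/runJobs.php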

-bawolff
On 2013-05-14 10:14 AM, Mark A. Hershberger m...@everybody.org wrote:

 Andre reminded me that there is a problem with upgrading the job queues
 that have any items in them. See https://bugzilla.wikimedia.org/46934.

 I would like to include a warning in the installation documentation to
 clear your job queue before the upgrade, but I don't know that much
 about the job queue in MW or even if this is possible.

 Are there any recommendations for what to tell users?

 --
 http://hexmode.com/

 Imagination does not breed insanity. Exactly what does breed insanity
 is reason. Poets do not go mad; but chess-players do.
  -- G.K. Chesterton


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Job queue upgrade

2013-05-14 Thread Brian Wolff
Purges don't update the links table (unless you do it from the API and add
an extra argument). They only clear the parser cache + Varnish cache + file
cache of the immediate page. (So they don't trigger HTMLCacheUpdate jobs
for redirects either, as far as I know.)
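
(For reference, the extra argument is forcelinkupdate - e.g. something
like api.php?action=purge&titles=Foo&forcelinkupdate=1 - which does
schedule a links table update.)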

-bawolff
On 2013-05-14 2:21 PM, Chad innocentkil...@gmail.com wrote:

 Wouldn't a purge also create refreshlinks jobs?

 -Chad
 On May 14, 2013 1:19 PM, Brian Wolff bawo...@gmail.com wrote:

  You can clear your job queue (run all jobs in it) by running
  maintenance/runJobs.php and disabling user write access temporarily to
  prevent new jobs from being created.
 
  Whether or not that is reasonable advice is another question. Otoh most
  jobs are rather unimportant if a bunch get lost (couple emails might
not be
  sent. Some caches might have to wait for next edit to be cleared).
 
  -bawolff
  On 2013-05-14 10:14 AM, Mark A. Hershberger m...@everybody.org wrote:
 
   Andre reminded me that there is a problem with upgrading the job
queues
   that have any items in them In https://bugzilla.wikimedia.org/46934.
  
   I would like to include a warning in the installation documentation to
   clear your job queue before the upgrade, but I don't know that much
   about the job queue in MW or even if this is possible.
  
   Are there any recommendations for what to tell users?
  
   --
   http://hexmode.com/
  
   Imagination does not breed insanity. Exactly what does breed insanity
   is reason. Poets do not go mad; but chess-players do.
   -- G.K. Chesterson
  
  
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] State of MediaWiki's render action (parameter to index.php)

2013-05-19 Thread Brian Wolff
On 2013-05-19 10:09 PM, K. Peachey p858sn...@gmail.com wrote:

 On Mon, May 20, 2013 at 4:34 AM, Matthew Flaschen
 mflasc...@wikimedia.org wrote:
  We should give people a heads up, unless/until it's reconsidered and we
  decide to undeprecate it (maintain support indefinitely).
 
  Matt Flaschen

 Um, how was it discussed and considered to deprecate it in the first place?

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

It seems a little odd to deprecate something due to load, when
realistically we probably represent 95% of the load due to that feature.
(Number pulled out of a hat. I have no idea what the real numbers are, but
I've never heard of anyone other than MediaWiki using that feature, and we
use it quite a lot.) I'm also not a fan of telling people not to do
something and then doing it ourselves.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bugzilla: Separate bug report status when patch is in Gerrit?

2013-06-05 Thread Brian Wolff
On 6/5/13, Andre Klapper aklap...@wikimedia.org wrote:
 Hi everybody,

 in December I mentioned the idea of having a PATCH_AVAILABLE or
 PATCH_TO_REVIEW status in Bugzilla [1] and that we should re-evaluate
 the idea once we have automatic notifications from Gerrit into Bugzilla
 in place [2]. This is now the case [3].

 From the Amsterdam Hackathon I know that some developers would like to
 filter on bug reports that have or don't have a patch in Gerrit, and
 easier finding of bug reports with a corresponding patch  lack of
 recent changes might provide another entry point for new developers
 (pick up the existing patch and finish it).

 Hence I propose
   * to remove the manually set and error-prone Bugzilla keyword
 patch-in-gerrit: Every bug on its way to get RESOLVED FIXED
 has to pass this stage anyway so a status feels more
 appropriate, and
   * to make the Gerrit Notification Bot automatically change the
 bug report status to PATCH_AVAILABLE/PATCH_TO_REVIEW in
 Bugzilla when a patch for that bug report has been committed
 (not: merged) to Gerrit.

 Comments?

 andre

 [1]
 http://lists.wikimedia.org/pipermail/wikitech-l/2012-December/065046.html
 [2]
 http://lists.wikimedia.org/pipermail/wikitech-l/2012-December/065226.html
 [3] https://bugzilla.wikimedia.org/show_bug.cgi?id=17322

 PS: Making the Gerrit notification bot automatically close bug reports
 in Bugzilla after merging a patch in Gerrit, or differentiating in
 Bugzilla between RESOLVED FIXED (fix merged) and RELEASED (fix
 deployed on the Wikimedia wikisites) are also interesting topics to
 discuss at some point, but not in this thread. One step at a time.
 --
 Andre Klapper | Wikimedia Bugwrangler
 http://blogs.gnome.org/aklapper/


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Please Please Please :)

I've always wanted that to be a status, since it is a distinct stage
in a bug's life cycle (distinct from ASSIGNED, in my opinion, although
others disagree: to me, ASSIGNED means someone is currently working on it,
while patch-in-Gerrit means it has already been worked on and is awaiting
review by somebody else).

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Doxygen tags - a proposal

2013-06-08 Thread Brian Wolff
On 2013-06-08 5:29 AM, S Page sp...@wikimedia.org wrote:

 Perhaps Ori is pointing out that doxygen (and jsduck) require needless
 verbiage. The tools aren't smart enough to infer obvious information from
 source on their own (or maybe they are but you're not sure and you see
 other comments using these symbols so you copy and paste), so you wind up
 repeating information in doc generation syntax.

 And to what end?  I view doxygen as an external test that we're being
 consistent in comments (quick, is it @param String $mystr  or @param
$myStr
 {string}  ?) but I never actually refer to the generated output. Does
 anyone? Until someone builds a browser bridge that automatically scrolls
 the right HTML into view as you move around in your editor and
 automatically regenerates the HTML as you type, I don't see my habits
 changing.

 If web search engines could understand the generated documentation and
 ranked it higher in search results it would be more useful and used more.

Thank you for that email Ori. It was beautiful.

To answer S's question - I mostly look at the code, but I do use the HTML
docs occasionally. I used them quite extensively when I was a newbie first
learning about MediaWiki. I also regularly link to them when people on IRC
ask things like "how do I do X?".

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] migrating hooks doc to doxygen?

2013-06-08 Thread Brian Wolff
Frankly I think we should try automating stuff towards our wiki rather
than using it as a way to take stuff out. Find ways to integrate this data
automatically into parts of the wiki. Bots if you ABSOLUTELY need to. But
preferably instead extensions and Lua stuff. Things that provide the data
in ways they can be incorporated into wiki pages. Keep wiki pages up to
date. Show full UIs, etc... on special pages and dedicated namespaces.
And ideally, be integrated right into the search.

I agree 100%. It would be cool if we had a bot auto-update part of
those page (While still allowing users to add info and tips). Maybe
even some sort of parser function to retrieve documentation...

Both Manual:Hooks/foo and all the $wgFoo pages can definitely benefit
from some automation. (Even cooler, would be if we could have
something like Special:Documentation/Linker::link (Before anyone
balks, :: is allowed in special page subpage name), which retrieved
the info from doxygen for that page, as an alternative way to view the
docs).

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: [WikimediaMobile] Number crunching: Upload errors on mobile

2013-06-08 Thread Brian Wolff

 The server problems section is worth a look though - although a small
 percentage The modification you tried to make was aborted by an
 extension hook 61. These errors are occurring on the following wiki
 projects:
 * sv.m.wikipedia.org
 * de.m.wikipedia.org
 * test.m.wikipedia.org
 * en.m.wikipedia.org
 * ar.m.wikipedia.org
 * es.m.wikipedia.org
 * ja.m.wikipedia.org
 * he.m.wikipedia.org
 * fr.m.wikipedia.org
 * nl.m.wikipedia.org
 Any ideas what may be causing that error?

I suspect that is caused by the UploadBlacklist extension, which
blacklists about 23 files by their SHA hash. According to the config
file, there's a log at udp://$wmfUdp2logDest/upload-blacklist, so
you can probably check if that guess is right.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CHanging the name of Vector on 1.21.1

2013-06-10 Thread Brian Wolff
On 2013-06-10 3:02 AM, Dmitriy Sintsov ques...@rambler.ru wrote:


  On 10 June 2013 at 9:15:44, user (j00...@mail.com) wrote:


 Hello everyone,
 I want to modify the default Vector theme that comes with 1.21.1. But
before I do that I want to rename it. I want to name it nighttime.  I
created a folder called nighttime and copied all the vector files into it.
Then I made a copy of Vector.php and called it Nighttime.php. I then
modified the appropriate contents of Nighttime.php as follows...
 ---
 class SkinNighttime extends SkinTemplate {
  protected static $bodyClasses = array( 'vector-animateLayout' );
  var $skinname = 'nighttime', $stylename = 'nighttime',
  $template = 'NighttimeTemplate', $useHeadElement = true;

 ...
  function setupSkinUserCss( OutputPage $out ) {
   parent::setupSkinUserCss( $out );
   $out->addModuleStyles( 'skins.nighttime' );
 ...
 class NighttimeTemplate extends BaseTemplate {
 -
 You can see what the site looks after I renamed everything at
http://beta.dropshots.com/j00100/media/75373447 It appears as if there is
no formatting.
 I did some searching on Google but everything I found dealt with older
versions. Does anyone know how to rename Vector and have it working on 1.21?
 Thanks

 Also, if you change skin name you may encounter some incompatibilities,
for example Extension:VisualEditor has a whitelist for supported skin
names:

 class VisualEditorHooks {
 /** List of skins VisualEditor integration supports */
 protected static $supportedSkins = array( 'vector', 'apex',
'monobook' );


Visual editor should really use a hook for that



The reason you got no formatting is probably due to renaming the line:
$out->addModuleStyles( 'skins.nighttime' );

without creating a corresponding skins.nighttime module. This should all
be covered in Daniel's link.
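
Roughly something like this in the skin's setup file (the file paths
here are made up for illustration):

  $wgResourceModules['skins.nighttime'] = array(
      'styles' => array(
          'nighttime/screen.css' => array( 'media' => 'screen' ),
      ),
      'remoteBasePath' => &$GLOBALS['wgStylePath'],
      'localBasePath' => &$GLOBALS['wgStyleDirectory'],
  );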

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Deprecating use of the style attribute (part 1)

2013-06-11 Thread Brian Wolff
On 6/11/13, Jon Robson jdlrob...@gmail.com wrote:
 Many of you on the mailing list should be aware of the troubles that
 the style attribute brings to mobile [1,2] and the amount of hacks [3]
 that we have to introduce to work around them.

 I still truly believe the only way we can resolve this is a long term
 rethink of how we approach custom styling on wiki. I have also heard
 from Chris Steipp that there are security implications with allowing
 inline styles which such a move would address.

 I have submitted a patch [4] (mostly to share ideas and prompt
 discussion - before you pounce on it be aware I have -2ed it to allow
 discussion on whether there is a better way to do this - for instance
 it might be worthy of a new namespace, it might need more protection
 etc.. ).

 All the patch does is allow Template:Foo to have an associated
 stylesheet Template:Foo.css which is included in pages that use it.

 So if the San Francisco article uses templates Foo, Bar and Baz, a
 style tag will be constructed from the content of Template:Foo.css,
 Template:Bar.css and Template:Bar.css and inserted into the page. When
 the templates change the entire page San Francisco is changed and thus
 the new styling is applied.

 This would reduce the need for css hacks in mobile and keep power in
 editors hands.

 On the assumption that this patch makes it into core in some form that
 in future the mobile site can strip any style attributes from content
 and use the template css files instead and thus benefit from the
 ability to use media queries. This could be a long tedious process but
 I think it needs to be done.

 Thanks in advance for your discussion and thoughts around this long
 standing issue!
 ~Jon

 [1]
 https://www.mediawiki.org/wiki/Requests_for_comment/Deprecating_inline_styles
 [2] https://bugzilla.wikimedia.org/show_bug.cgi?id=35704
 [3]
 https://github.com/wikimedia/mediawiki-extensions-MobileFrontend/blob/master/stylesheets/common/mf-hacks.css
 [4] https://gerrit.wikimedia.org/r/68123

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I like the idea of this (for reasons that have nothing to do with
mobile). It would be nice to have css associated with the content
defined. I'm pretty sure wikis want things like this. Wikinews for
example loads MediaWiki:Common.css/{{FULLPAGENAME}} basically on every
page.

To be honest though, I'm unclear how this would fix things for mobile.
Wouldn't folks just put their problematic inline styles into a
stylesheet, and have them be just as problematic in the stylesheet?
(Not overly following mobile development)

 I still truly believe the only way we can resolve this is a long term
 rethink of how we approach custom styling on wiki. I have also heard
 from Chris Steipp that there are security implications with allowing
 inline styles which such a move would address.

I'm curious what those might be (although I expect I won't find
out...). I know your patch is a proof of concept, but in its current
form, it introduces various security issues that didn't exist before
(arbitrary CSS allowed => XSS).

---

Now, to bikeshed (I assume you're expecting this, sending it to
wikitech-l and all).

Personally, I would like the template namespace to not be special.
Hence I would like this all to work for other namespaces. So if you
create a fake template in your user space, you could do the css thing
too. This suggests a CSS namespace where you can create pages like
CSS:Template:Foo.

Alternatively we could have a <css> parser tag where you put
<css>foo { border: red }</css>, and it gets applied to the current
page and everything that is transcluding it (possibly with an option
similar to <noinclude>, where the syntax-highlighted version of the
tag's contents is only shown on the original page and not on
transcluded pages). The downside is a less clear content/style
separation, but it's still quite a clear separation in my opinion.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FLAC support in Mediawiki/Commons

2013-06-12 Thread Brian Wolff
On 6/12/13, Emmanuel Engelhart kel...@kiwix.org wrote:
 Hi

 I'm not an audio expert but AFAIK, FLAC is probably the best solution to
 store lossless audio streams. Similar to TIFF for pictures.

 The difference (with TIFF) is that we don't have FLAC support for now in
 Mediawiki... and it seems there is no way at all to upload audio streams
 in a lossless format for now.

 I have found two related features requests:
 * https://bugzilla.wikimedia.org/show_bug.cgi?id=20252
 * https://bugzilla.wikimedia.org/show_bug.cgi?id=39867

 But both don't have any recent comments and don't seem to be actively
 followed. My question is: does someone have a project to fix that point
 in the near future?

 Regards
 Emmanuel
 --
 Kiwix - Wikipedia Offline  more
 * Web: http://www.kiwix.org
 * Twitter: https://twitter.com/KiwixOffline
 * more: http://www.kiwix.org/wiki/Communication

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Hi,

We do have FLAC support provided it is in an ogg container. For example:
https://commons.wikimedia.org/wiki/File:Muriel-Nguyen-Xuan-Brahms-rhapsody-opus79-1.flac.oga

Which gets transcoded to a vorbis file, and then played in browser.
But the source file is indeed using the FLAC codec.

It would of course be nice to be able to upload flac files in their
native container format too.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Deprecating use of the style attribute (part 1)

2013-06-13 Thread Brian Wolff
On 6/13/13, Jon Robson jdlrob...@gmail.com wrote:
 Firstly thank you so so much for all this constructive discussion.

 I worry that I'm convoluting this discussion with my wish to deprecate
 the style attribute. It's a thing I would like to see but it is not
 the most important discussion to have on the short term in which the
 goal is to style things better on mobile.

 As a result since this does not seem to be helping I have decided to
 create a new but related request for comment:
 https://www.mediawiki.org/w/index.php?title=Requests_for_comment/Allow_styling_via_style_attributes_in_template

 I realise there are various templates that will not work or be helped
 by this move (for instance the Colorbox template Brad asks about), but
 I think it is too early to worry about these templates. I would like
 to see the style attribute completely unused but ultimately this
 decision in future would be one made by the community/security needs.
 I see this as step 1 in a long but much needed journey.

 I also realise there are security risks. I am aware this opens up the
 potential for vandalism but I'd hope that these would not happen often
 due to edits being restricted to admins.

 Thanks again for your constructive comments so far and I really hope
 we can get some consensus and push this work forward. Please view the
 talk page to see the more actionable next steps. I look forward to us
 moving this forward...!

 On Wed, Jun 12, 2013 at 10:01 AM, Gabriel Wicke gwi...@wikimedia.org
 wrote:
 On 06/11/2013 05:39 PM, Jon Robson wrote:
 [1]
 https://www.mediawiki.org/wiki/Requests_for_comment/Deprecating_inline_styles

 I left some comments at the bottom of the RFC.

 Gabriel



 --
 Jon Robson
 http://jonrobson.me.uk
 @rakugojon

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Hi,

 As a result since this does not seem to be helping I have decided to
 create a new but related request for comment:
 https://www.mediawiki.org/w/index.php?title=Requests_for_comment/Allow_styling_via_style_attributes_in_template

That page doesn't exist.

 I also realise there are security risks. I am aware this opens up the
 potential for vandalism but I'd hope that these would not happen often
 due to edits being restricted to admins.

That's an important point you forgot to mention in your original
proposal. I assumed you wanted this to be editable by everyone :)

Personally I'd rather have it be safe, and usable by everyone. (Also,
I'd like a pony, if it's not too much trouble.)

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bikeshed 5: The Painter Strikes Back

2013-06-15 Thread Brian Wolff
On 6/16/13, Tyler Romeo tylerro...@gmail.com wrote:
 So I was wondering how people would feel about adding a coding convention
 for the use of is_null() in PHP code.

 It's 10 times slower than doing === null, which is a bit trivial in
 context, but nonetheless a fact, and it's also a bit easier to read,
 especially when doing the inverse (i.e., doing !is_null( ... ) versus !==
 null). Also, there's no functional difference between the two.

 Any objections other than maintaining the status quo?

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2016
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

"Easier to read" is debatable. !is_null( $foo ) reads directly like an
English sentence: "not is null". OK, maybe an English sentence with bad
grammar, but I hardly find it unclear.
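
For what it's worth, the two forms behave identically:

  $foo = null;
  var_dump( is_null( $foo ) ); // bool(true)
  var_dump( $foo === null );   // bool(true)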

As for performance: "10 times slower" out of context doesn't mean much.
How much slower is 10x? If we changed all 681 instances to the other one,
are we talking about a difference of 1 microsecond in absolute time, or is
10x an actually significant saving? For that matter, is the benchmark
being used actually reliable?

I feel such trivialities should be left to the discretion of the committer.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Graphical note entry for wiki music [Extension:Score]?

2013-06-16 Thread Brian Wolff
On 6/16/13, Jan Nieuwenhuizen jann...@gnu.org wrote:
 Hi,

 I am working on a GUI for LilyPond and am looking for your feedback on
 the desirability or usefulness --if any-- for MediaWiki.

 My desire is to have LilyPond accessible to everyone by presenting a
 basic GUI initially so as to provide a more gentle introduction than
 text to most people, simultaneously offering an easy way to learn the
 LilyPond language and handhold them while they make the switch.

 Although I started three years ago it is still mostly a play thing --I
 do this just for fun and it has seen a few rewrites-- so it will be a
 long way before it matches any of the power that LilyPond has, or even
 that other GUIs offer right now.

 I would enjoy having more focus and actual users and so early this
 spring I took up the idea to create a basic web frontend alongside the
 main Gnome/Gtk+ GUI.

 Then, by the usual blend of sheer coincidence and providence, the
 score extension landed.  That has kept me wondering if what I have
 now could [almost] be useful for MediaWiki or WikiPedia...or what
 would need to be done to make it so.

 Any feedback is much appreciated!  Please have a look at

http://lilypond.org/schikkers-list

 or go straight to the LilyPond Schikkers Demo

http://lilypond.org/schikkers/

 Thankyou
 Greetings, Jan

 PS: as an aside, I read some complaints about Lily's SVG output on this
 list; would you please send a bug report on that, or anything else
 that you find amiss to bug-lilyp...@gnu.org?

 --
 Jan Nieuwenhuizen jann...@gnu.org | GNU LilyPond http://lilypond.org
 Freelance IT http://JoyofSource.com | Avatar®  http://AvatarAcademy.nl

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Hi,
Thanks for reaching out to us.

I just tried the demo, and it looks pretty awesome. I definitely think
integrating this into Score somehow would be very desirable.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Automating Main Page with Lua

2013-06-17 Thread Brian Wolff
On 2013-06-17 4:20 PM, Paul Selitskas p.selits...@gmail.com wrote:

 Turning back to the automating thing and the Main Page.

 I've got tired updating the Other Wikipedias section (congratulations to
 the Swedish Wikipedia!), so I wrote some code to automate the job.

 There is a bot that updates different statistics per wiki. I decided to
 parse the data page and push it through a mediawiki message to avoid
 hard-coded pieces of text inside.

 Here we have two expensive parts: getContent() for a template with
 necessary data, and retrieving a message for the view. Is it OK to have
 expensive calls on the Main Page?

 The module is placed here: http://goo.gl/3V5St

 --
 With respect,
 Pavel Selitskas
 Wizardist @ Wikimedia projects
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Just personal opinion, not an official answer] - the main page is cached
like any other page. The expensive-function limit is more a deterrent
against someone putting 1000 such calls on a page. A single getContent()
should not be an issue, even on a widely viewed page.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Search documentation

2013-06-17 Thread Brian Wolff
Just as a note, MediaWiki's default (aka crappy) search is very
different from the Lucene stuff used by Wikimedia. Lucene search is
rather difficult to set up, so most third-party wikis do not use it.
--bawolff


On 6/17/13, Nikolas Everett never...@wikimedia.org wrote:
 I'm not sure about http://www.mediawiki.org/wiki/Help:Searching but
 https://en.wikipedia.org/wiki/Help:Searching has lots of things we're going
 to have to add to our list.  My guess is
 http://www.mediawiki.org/wiki/Help:Searching is simply out of date.

 Nik


 On Mon, Jun 17, 2013 at 4:33 PM, Chris McMahon
 cmcma...@wikimedia.orgwrote:

 On Mon, Jun 17, 2013 at 1:28 PM, S Page sp...@wikimedia.org wrote:

  
  * enwiki says Hello dolly in quotes gives different results, mw
 directly
  contradicts this. Even on my local wiki, quotes make a difference.
 
  * enwiki disagrees with itself what a dash in front of a word does.
 

 I did some research a few weeks ago on the current state of Search and
 there are a number of discrepancies between the documentation and actual
 behavior.  Some of them have BZ tickets, like
 https://bugzilla.wikimedia.org/show_bug.cgi?id=44238
 -Chris
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Search documentation

2013-06-18 Thread Brian Wolff
My comments are based mostly on second-hand knowledge. People who have
tried have often run into problems and asked for help on #mediawiki, with
nobody knowing anything that could help them. Probably a large part of
the issue is requiring people to install a separate (non-PHP) program
that's not all that well documented.

--bawolff

On 6/17/13, Nikolas Everett never...@wikimedia.org wrote:
 One of our goals while building this has been to make something reasonably
 easy to install by folks outside of WMF.  I've added some notes about this
 to the page.  I'd certainly love to hear ways that'd make it simpler to use.

 Nik


 On Mon, Jun 17, 2013 at 8:23 PM, Brian Wolff bawo...@gmail.com wrote:

 Just as a note, MediaWiki default (aka crappy) search is very
 different from the lucene stuff used by Wikimedia. Lucene search is
 rather difficult to set up, so most third party wikis do not use it.

 --bawolff


 On 6/17/13, Nikolas Everett never...@wikimedia.org wrote:
  I'm not sure about http://www.mediawiki.org/wiki/Help:Searching but
  https://en.wikipedia.org/wiki/Help:Searching has lots of things we're
 going
  to have to add to our list.  My guess is
  http://www.mediawiki.org/wiki/Help:Searching is simply out of date.
 
  Nik
 
 
  On Mon, Jun 17, 2013 at 4:33 PM, Chris McMahon
  cmcma...@wikimedia.orgwrote:
 
  On Mon, Jun 17, 2013 at 1:28 PM, S Page sp...@wikimedia.org wrote:
 
   
   * enwiki says Hello dolly in quotes gives different results, mw
  directly
   contradicts this. Even on my local wiki, quotes make a difference.
  
   * enwiki disagrees with itself what a dash in front of a word does.
  
 
  I did some research a few weeks ago on the current state of Search and
  there are a number of discrepancies between the documentation and
  actual
  behavior.  Some of them have BZ tickets, like
  https://bugzilla.wikimedia.org/show_bug.cgi?id=44238
  -Chris
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki errors on Wikimedia wikis

2013-06-19 Thread Brian Wolff
On 6/19/13, Ori Livneh o...@wikimedia.org wrote:
 I added a link to http://tinyurl.com/n3twd8k to the channel topic of
 #wikimedia-tech  #wikimedia-operations. It points to a live graph of
 MediaWiki's error rate over the last 24 hours . I hope to automate
 monitoring of this data sometime soon, but in the meantime let's keep an
 eye on it collectively, especially right after deployments.

 ---
 Ori Livneh
 o...@wikimedia.org
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Cool.

Is there any *public* list of which exceptions/errors they are? Seeing
how many isn't all that helpful unless we know which ones. (Yeah, yeah,
I know there are concerns about data leakage with backtraces, but just
the exception names without backtraces should be safe?)

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Performance issues with mediawiki 1.21

2013-06-20 Thread Brian Wolff
On 6/20/13, Johannes Weberhofer jweberho...@weberhofer.at wrote:
 Dear all!

 After upgrading a server to a complete new system
 (apache/php/mysql-mariadb) I have massive performance-problems when a large
 page is to be rendered.

 I have added a profiling log; I do not think, it is related to the database
 upgrade, as the database have a very low CPU usage while rendering, while
 Apache's CPU usage is very high for around 16 seconds. This happens
 especially with large pages. I have already tried to remove all extensions
 which did not make any difference.

 Do you have any ideas where to start?

 Best regards,
 Johannes


 Start request GET /wiki/IS_(Projekttagebuch)
 HTTP HEADERS:
 HOST: test.com
 USER-AGENT: Mozilla/5.0 (X11; Linux x86_64; rv:21.0) Gecko/20100101
 Firefox/21.0
 ACCEPT: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
 ACCEPT-LANGUAGE: de-at,de;q=0.8,en-us;q=0.5,en;q=0.3
 ACCEPT-ENCODING: gzip, deflate
 REFERER:
 https://test.com/w/index.php?title=IS_(Projekttagebuch)action=editsection=515
 COOKIE: mediawikiUserName=Web;
 mediawiki_session=qd955vfopmq2ri2limkteimp8rc8pu9v; mediawikiUserID=3;
 mediawikiPostEditRevision72824=1
 DNT: 1
 CONNECTION: keep-alive
 CACHES: EmptyBagOStuff[main] SqlBagOStuff[message] SqlBagOStuff[parser]
 [cookie] session_set_cookie_params: 0, /, , 1, 1
 Class LanguageDe not found; skipped loading
 LocalisationCache: using store LCStore_DB
 Connected to database 0 at localhost
 Fully initialised
 Title::getRestrictionTypes: applicable restrictions to [[IS
 (Projekttagebuch)]] are {edit,move}
 [ContentHandler] Created handler for wikitext: WikitextContentHandler
 User: cache miss for user 3
 User: loading options for user 3 from database.
 User: logged in from session
 User: loading options for user 3 from override cache.
 Connected to database 0 at localhost
 MessageCache::load: Loading de... got from global cache
 Unstubbing $wgParser on call of $wgParser::firstCallInit from
 MessageCache::getParser
 Parser: using preprocessor: Preprocessor_DOM
 Unstubbing $wgLang on call of $wgLang::_unstub from
 ParserOptions::__construct
 OutputPage::checkLastModified: client did not send If-Modified-Since header
 Article::tryFileCache(): not cacheable
 Article::view using parser cache: yes
 Parser cache options found.
 ParserOutput cache found.
 Article::view: showing parser cache contents
 Title::getRestrictionTypes: applicable restrictions to [[IS
 (Projekttagebuch)]] are {edit,move}
 Title::getRestrictionTypes: applicable restrictions to [[IS
 (Projekttagebuch)]] are {edit,move}
 Use of wfMsg was deprecated in MediaWiki 1.21. [Called from
 PdfBookHooks::onSkinTemplateNavigation in
 /usr/share/mediawiki/extensions/PdfBook/PdfBook.hooks.php at line 173]
 Use of wfMsgReal was deprecated in MediaWiki 1.21. [Called from wfMsg in
 /usr/share/mediawiki/includes/GlobalFunctions.php at line 1444]
 Use of wfMsgGetKey was deprecated in MediaWiki 1.21. [Called from wfMsgReal
 in /usr/share/mediawiki/includes/GlobalFunctions.php at line 1542]
 Title::getRestrictionTypes: applicable restrictions to [[IS
 (Projekttagebuch)]] are {edit,move}
 Class PEAR_Error not found; skipped loading
 OutputPage::sendCacheControl: private caching; Thu, 20 Jun 2013 14:20:08 GMT
 **
 DatabaseBase::query: Writes done: UPDATE  `page` SET page_counter =
 page_counter + 1 WHERE page_id = '4292'
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
 LoadBalancer::reuseConnection: this connection was not opened as a foreign
 connection
  LoadBalancer::reuseConnection: this connection was not opened as a foreign
  connection
  Profiling data
  Profile section ended by close(): -total
  20130620142023  16.176  /wiki/IS_(Projekttagebuch)

  Profiling data
  Name                       Calls  Total      Each       %         Mem
  -total                     1      16176.342  16176.342  100.000%  36200744  (16176.342 - 16176.342) [0]
  MediaWiki::main            1      16039.443  16039.443  99.154%   20009099  (16039.443 - 16039.443) [3614]
  OutputPage::output         1      15748.575  15748.575  97.356%   5828987   (15748.575 - 15748.575) [740]
  MediaWiki::performRequest  1      270.768    270.768    1.674%    11315010  (270.768 - 270.768)
Re: [Wikitech-l] Removal of Bugzilla admins

2013-06-22 Thread Brian Wolff
On 2013-06-22 6:49 PM, Thehelpfulone thehelpfulonew...@gmail.com wrote:

 On 22 June 2013 22:33, Alex Monk kren...@gmail.com wrote:

  I've just found out that WMF's Bugmeister Andre Klapper removed nearly
  everyone's Bugzilla adminship (and people with root access on the
servers
  now have access to a file which contains login details for an 'emergency
  admin' account). So I have some questions:
 

 This wasn't a sudden removal - Andre discussed it with ops and emailed
 *every* admin first, so it's far less dramatic than you may think. He's
 also been working on
 https://wikimediafoundation.org/wiki/Bugzilla_administrator_rights_policy,
 which I believe has approval from the relevant people (I'm can't think who
 that is off the top of my head).


Be that as it may, it still would have been nice for this to be publicly
discussed (or at least publicly announced), especially given the current
political controversies surrounding rights removals from WMF services.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikimediaMobile] Videos on mobile

2013-06-24 Thread Brian Wolff
In regards to unresolved codec issues, that is more a political/legal
issue than a technical one.

 http://www.mediawiki.org/wiki/Extension:OggHandler

We haven't been using OggHandler for quite some time now (Since
November 2012). We now use TimedMediaHandler extension.


-bawolff


On 6/24/13, Arthur Richards aricha...@wikimedia.org wrote:
 +wikitech-l

 This sounds like something that might be good for the newly forming
 multimedia team to work on. As Yuvi pointed out (off the wikitech-l list):

 There are also unresolved codec issues with playing videos (no H264
 support), so even if we do enable video playback it'll not be
 available everywhere, and even in places where it is it is going to be
 a biggish battery sink (no hardware decoding support). Would want to
 consider that before enabling it fully.

 I envision the mobile team helping to support this, but folks who focus on
 multimedia-related stuff would probably be the best candidates for digging
 into this and figuring out how we can best move forward.


 On Thu, Jun 20, 2013 at 6:16 PM, Jon Robson jdlrob...@gmail.com wrote:

 I would like to see videos working on mobile

 Yet there seems to be two issues here.
 1)  Cleaning up MobileFrontend code
 We are stripping the ogg_player by parsing and cleaning up the HTML
 We could probably do this in css/javascript instead.

 2) Making videos work on mobile where supported
 If we were to explore enabling videos on mobile we would have to look
 at the ogg player javascript code associated with it and get it
 working on mobile or create our own code that knows how to read it.
 http://www.mediawiki.org/wiki/Extension:OggHandler

 I'm also concerned about the size of the javascript module and since
 it is not needed by every page I would argue that it should only be
 loaded when the video is clicked (the existing extension may do this
 or not I'm not clear):

 https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FOggHandler/895f74e63fa9cadeba1df63604b0aaeae10f803c/OggPlayer.js


 On Thu, Jun 20, 2013 at 5:35 PM, Max Semenik maxsem.w...@gmail.com
 wrote:
  Hi, when mobile WP was in its childhood, it was decided that we're not
  ready to display videos on our pages, so they were stripped. And
  stripped very crudely, by removing just #ogg_player_1 and
   #ogg_player_2 so that only the first two videos on a page were removed.
   What are your opinions - should we continue doing this?
 
 
 
  --
  Best regards,
Max Semenik ([[User:MaxSem]])
 
 
  ___
  Mobile-l mailing list
  mobil...@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/mobile-l



 --
 Jon Robson
 http://jonrobson.me.uk
 @rakugojon

 ___
 Mobile-l mailing list
 mobil...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mobile-l




 --
 Arthur Richards
 Software Engineer, Mobile
 [[User:Awjrichards]]
 IRC: awjr
 +1-415-839-6885 x6687
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


-- 
--
- Brian
Caution: The mass of this product contains the energy equivalent of 85
million tons of TNT per net ounce of weight.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bot upload of files over 100MB

2013-06-25 Thread Brian Wolff
When browsing through Commons, I happened to stumble upon
https://commons.wikimedia.org/wiki/User:Smallman12q/PyCJWiki which
appears to be a Python bot that uses chunked uploading, and thus works
with files up to 500 MB. Perhaps it would be helpful to you.
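
(Roughly, chunked uploading means making repeated api.php?action=upload
requests with stash=1, filesize, offset and a chunk of the file, each
returning a filekey to pass to the next request, followed by a final
action=upload with the filekey to publish the file - see
https://www.mediawiki.org/wiki/API:Upload for the details.)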

--bawolff

On 6/25/13, Tilman Bayer tba...@wikimedia.org wrote:
 With the https://commons.wikimedia.org/wiki/Commons:Chunked_uploads
 enabled, users can currently upload files up to 500MB, when using a
 supported browser (e.g. a current Firefox or Chrome).

 I think Daniel's question was about how to make this method work for a
 bot too, i.e. via the API rather than in a browser.

 On Tue, Jun 25, 2013 at 2:52 PM, OQ overlo...@gmail.com wrote:
 I would assume given the previous reply that regardless of upload method,
 chunked or otherwise, there is still the hard limit of how big the
 resultant file can be.


 On Tue, Jun 25, 2013 at 5:49 PM, Jeremy Baron jer...@tuxmachine.com
 wrote:

 On Jun 25, 2013 5:44 PM, Daniel Mietchen
 daniel.mietc...@googlemail.com
 
 wrote:
  my bot[1] occasionally stumbles upon files that are above 100MB and
  thus does not upload them[2]. What do I have to do to get it set up
  for handling these files too?

 this looks like the relevant section:

 https://www.mediawiki.org/wiki/API:Upload#Chunked_uploading

 I don't know the current settings; you might need to enable chunked
 uploads
 in the MediaWiki prefs for the user you're uploading as.

 -Jeremy
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



 --
 Tilman Bayer
 Senior Operations Analyst (Movement Communications)
 Wikimedia Foundation
 IRC (Freenode): HaeB

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bot upload of files over 100MB

2013-06-25 Thread Brian Wolff
On 6/25/13, Greg Grossmeier g...@wikimedia.org wrote:
 quote name=Techman224 date=2013-06-25 time=16:47:39 -0500
 Right now the wiki is setup to only allow up to 100 MB files. The only way
 I see to upload a file greater than that is to file a bugzilla request and
 a system administrator with shell access can manually upload it using a
 script.

 I thought it was set to 500mb?
 https://git.wikimedia.org/blob/operations%2Fmediawiki-config/40f5cf38a00edce951a2eb14ae6385aa1eac24d0/wmf-config%2FInitialiseSettings.php#L10185

 Greg

 --
 | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E |
 | identi.ca: @gregA18D 1138 8E47 FAC8 1C7D |

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Well, as the comment says:
// Only affects URL uploads; web uploads are enforced by PHP.

That particular variable limits chunked uploads and upload-by-URL
(which is only enabled for Flickr). Direct normal (non-chunked)
uploads are limited by the smallest of the MW config variable,
upload_max_filesize, and post_max_size, and hence are limited to 100 MB.

Note: Special:Upload is always non-chunked. UploadWizard uses chunked
uploads based on a preference, and you can use either method if you do
things yourself using the API.
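
So, roughly (the value here is only illustrative):

  // LocalSettings.php - the MediaWiki-side cap, e.g. for chunked uploads
  $wgMaxUploadSize = 500 * 1024 * 1024; // 500 MB

while direct (non-chunked) web uploads are additionally capped by
upload_max_filesize and post_max_size in php.ini.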

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Moving on from Doxygen?

2013-07-05 Thread Brian Wolff
On 7/5/13, Yuvi Panda yuvipa...@gmail.com wrote:
 On Sat, Jul 6, 2013 at 6:36 AM, Matthew Walker mwal...@wikimedia.org
 wrote:
 1. Not be tortoise slow
 Pretty sure this only matters because we do continuous integration -- we
 probably don't need to do this for every commit...? Maybe once a day?

 In any case -- who says PHPDoc is any faster.

 Slow to use, not slow to generate. On my firefox it constantly gets
 stopped with a 'script on this page is taking too long to run'


Interesting. For me it's speedy (or at least acceptably fast), and I'm
on Firefox 3.5.

 2. Have usable search
 The demo at least doesn't even offer search functionality...

 But does this even matter? I would argue in favour of a independent search
 solution along the lines of Ohloh [1] so that we can integrate our JSDuck
 documentation.

 Haven't checked out Ohloh's, but something as simple as 'I want to see
 documentation for WikiPage::factory' should be achievable by typing
 'WikiPage::factory' into the docs. I'm setting up a phpdoc instance on
 my local system, to see how it goes.

I generally go directly to the class I want, so not really something
that would bother me (Actually I didn't even know we had a search
box).

I also primarily use grep locally to search for things... I guess
newbies, who are the documentation's primary use case, are probably
less likely to do that.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Problems Updating from 1.18 to 1.21.1

2013-07-09 Thread Brian Wolff
On 7/9/13, Derric Atzrott datzr...@alizeepathology.com wrote:
 Good morning all,

 I have a question about a problem that cropped up during my update from 1.18
 to
 1.21.1.

 With one exception everything went smoothly during my update, but now all
 of my images appear to be without thumbnails and are inadvertently using
 file: protocol links.

 The generated source for embedded images looks like this:
 <p>[<a rel="nofollow" class="external text"
 href="File:ReportedTime.jpg%7C451px%7CReported">time per activity on
 Project</a>]</p>

 Generated from:
 [[File:ReportedTime.jpg|451px|Reported time per activity on Project]]

 Any idea what I may have done wrong? Is there a new setting that I may have
 missed?  Has anyone ever seen this sort of issue before?

 Thank you,
 Derric Atzrott
 Computer Specialist
 Alizee Pathology



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

What happened is that URL protocols became case-insensitive. So if you
had added file: as a URL protocol, previously it would trigger only on
file:foo, not File:foo. The solution is to change your entry in
$wgUrlProtocols so that it is 'file://' instead of 'file:'.
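
In other words, something like this in LocalSettings.php:

  // was: $wgUrlProtocols[] = 'file:';
  $wgUrlProtocols[] = 'file://';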

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to get the number of pages in a category

2013-07-09 Thread Brian Wolff
On 7/9/13, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:
 On Tue, Jul 9, 2013 at 10:46 AM, Daniel Mietchen
 daniel.mietc...@googlemail.com wrote:
 Hello together,

 in the framework of a GLAM project, we are looking for ways to
 (1) identify the number of pages in a given category - including via
 subcategories - on a given wiki

 You can get the list of subcategories of a category with
 list=categorymemberscmtype=subcat. You'd have to make calls to this
 for each individual (sub)category you're interested in, and be sure to
 detect cycles properly.

 You can get the number of pages in a category with prop=categoryinfo.
 You can batch this by specifying up to 50 titles per query (500 if
 your account has the apihighlimits userright).

 If you're going to be doing a lot of this, it might be better to
 perform queries directly against the database, either by downloading
 the database dumps or using Tool Labs.

 (2) get the pageview stats for all these pages, including on aggregate

 The raw pageview stat data may also be available on Tool Labs. I see
 some data in /shared/viewstats/, but it doesn't seem to be up to date.


 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

It should be noted that the category table's cat_pages entries are
sometimes inaccurate (especially for larger categories), and are
closer to an order-of-magnitude estimate. If you're going to be
looking at page views of all entries in the category, you could just
count how many pages there are directly.
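
For example, something along these lines (Category:Foo is a
placeholder):

  api.php?action=query&list=categorymembers&cmtitle=Category:Foo&cmlimit=max

and count the returned entries yourself.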

Page view stats are available at
http://dumps.wikimedia.org/other/pagecounts-raw/

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Moving a page to a category page.

2013-07-11 Thread Brian Wolff
On 7/11/13, Beebe, Mary J bee...@battelle.org wrote:
 We would like to make pages into category pages.  If we try to move a page
 to a category namespace it will not let us.  Is there a configuration
 variable  to do that?

 Thanks,
 Mary


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

No. Moving things into the category namespace (or out of it) is
restricted, as category members aren't moved with the page move. There
is no configuration variable at present to control this.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs

2013-07-22 Thread Brian Wolff
On 7/22/13, James Forrester jforres...@wikimedia.org wrote:
 On 22 July 2013 11:45, Tyler Romeo tylerro...@gmail.com wrote:

 Putting all of the issues aside, I'd like to know what the reason is for
 hiding the preference. Let's assume for a second that VE does not hinder
 users at all, that its JS footprint is nonexistent, and that the interface
 changes aren't that bothersome (which, to an extent, is true). Even with
 all that, what reason is there to purposely deprive users of the choice to
 completely hide VE if they're sure they have no intention of using it?


 Adding a preference to disable VisualEditor in normal user preferences
 (rather than making it as easy as possible for gadgets to disable if people
 so chose) would be a lie.

 It would imply that this is a preference that Wikimedia thinks is
 appropriate. This would be a lie. For a similar example, see the removal of
 the disable JavaScript option from Firefox 23.

 It would imply that this is a preference that Wikimedia will support.
 This would be a lie. We have always intended for VisualEditor to be a
 wiki-level preference, and for this user-level preference to disappear once
 the need for an opt-in (i.e., the beta roll-out to production wikis) is
 over.

 It would imply that Wikimedia thinks preference bloat is an appropriate way
 forward for users. This would be a lie. Each added preference adds to the
 complexity of our interface, increasing even further the choice paralysis
 and laughable usability of our existing preference system.

 It would imply that Wikimedia thinks preference bloat is an appropriate way
 forward for expenditure of donor funds. This would be a lie. Each added
 preference adds to the complexity of our software - so increasing the cost
 and slowness of development and testing, and the difficulty of user support.

 It would imply that Wikimedia can get rid of under-used preferences. This
 would be a lie. We do not have a successful track record of getting rid of
 preferences, even when used by a handful of our users, even when set away
 from default mostly by inactive accounts; accepting this form of product
 debt now on the spurious claim that we'll pay it off later is untrue.

 It would imply that getting rid of preference later rather than now would
 in any way reduce the outcry. This would be a lie.  The very few times we
 have done this, the arguments from those campaigning for retention are
 generally emotive and not based on the above points - that it's just a
 little preference, not harming anyone, that Wikimedia has enough money
 for just this one item, or that the preference is the only thing keeping
 the user from leaving - an argument that almost always is visibly proven
 untrue after the preference is removed.

 Creating such a preference is a lie, and a lie I cannot endorse.

 J.
 --
 James D. Forrester
 Product Manager, VisualEditor
 Wikimedia Foundation, Inc.

 jforres...@wikimedia.org | @jdforrester
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Really? Given the number of inane preferences in Special:Preferences
(I'm looking at you, preference to disable sending 304 status codes),
this is where we're going to draw the line?

A preference for this seems fairly reasonable in my opinion,
especially given that VisualEditor is not at a fully feature-complete
state yet (for example, it's not enabled in the project namespace, as
far as I understand).

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs

2013-07-22 Thread Brian Wolff
On 7/22/13, Ryan Lane rlan...@gmail.com wrote:
 On Mon, Jul 22, 2013 at 7:17 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Mon, Jul 22, 2013 at 9:35 PM, James Forrester
 jforres...@wikimedia.orgwrote:
 
   It would imply that this is a preference that Wikimedia thinks is
  appropriate. This would be a lie. For a similar example, see the removal
 of
  the disable JavaScript option from Firefox 23.
 

 You still haven't explained why this preference is inappropriate.


 This is slightly off topic, but removing that preference from firefox is a
 great idea. It's only used properly by power users, who would be able to do
 the same in about:config, or via noscript, or will add an extension to do
 it. That preference is almost always incorrect set by users who don't know
 what they are doing and it leads to a broken browser experience.

 Maybe there's a comparison to be made, but there's not really a simple way
 to disable VE in MediaWiki other than by having a preference.

 Assuming a proper implementation of edit/edit source I'm not sure what the
 big deal is, but I'm not a hardcore editor so I'm likely just not seeing it.

 - Ryan
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Off-topic, but I think a good comparison could be made to the (former)
external-editor preference. Anybody who actually used the external
editor feature did not use the preference. Many people accidentally
selected the preference and totally screwed everything up.

</utterly offtopic aside>

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs

2013-07-25 Thread Brian Wolff
 I made the call about a year ago, and mentioned it in several of the
 dozens of mailing list and on-wiki posts made about the development of
 VisualEditor since then. Clearly my communication about it wasn't read, or
 wasn't understood, by the people who subsequently complained, but I
 wouldn't describe it as being done silently.

Users respond to things that happen to them, especially at the
point in time when they are negatively affected by something. It's
unrealistic to expect users to respond to comments about a piece of
software before they have to deal with it.


This thread has served its purpose: to surface various arguments about
whether the preference to disable VisualEditor should be hidden or not.

It's debatable if this thread was ever really about that.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New Senior Software Engineer: Bryan Davis

2013-07-29 Thread Brian Wolff
On 7/29/13, Rob Lanphier ro...@wikimedia.org wrote:
 Hi everyone,

 I'm pleased to welcome Bryan Davis to Wikimedia Foundation's Platform
 Engineering team.  Bryan joins us from Keynetics in Boise, Idaho,
 where he was the senior programmer and architect on a team responsible
 for new product development, building the Kount fraud control system.

 Bryan joins Platform Engineering as a Senior Software Engineer working
 generally on backend software issues.  His first job will be in
 improving the robustness of media  infrastructure, such as improving
 large file uploads, but he'll probably also get mixed up in the usual
 Platform-y kinds of stuff that other developers in Platform
 Engineering frequently get involved in.

 Bryan will be working remotely from Boise.

 Rob

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Yay, another Bryan! Welcome aboard.

--Brian

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] no TLS 1.1 and 1.2 support

2013-07-29 Thread Brian Wolff
I agree. I'm just saying there are so many other security issues to choose
from that this one is almost irrelevant.

Specifically? Are there bugs filed for them? I'm personally not aware
of any serious security issues currently affecting us (Obviously I
don't have access to security bugs, but I don't have the impression
there are serious issues affecting us).

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] gwtoolset : architecture design help

2013-07-31 Thread Brian Wolff

 Metadata Set Repo
 -
 one of the goals of the project is to store Metadata Sets, such as XML
 under some type of version control. those Metadata Sets need to be
 accessible so that the extension can grab the content from them and process
 it. processing involves iterating over the entire Metadata Set and creating
 Jobs for the Job Queue which will upload each individual media file and
 metadata into a media file page using a Mediawiki template format, such as
 Artwork.

 some initial requirements
 • File sizes
   • can range from a few kilobytes to several megabytes.
   • max file-size is 100mb.

 • XML Schema - not required.
 • XML DTD - not required.

 • When metadata is in XML format, each record must consist of a single
 parent element with many child elements
   • XML attribute lang= is the only one currently used and without user
 interaction

 • There is no need to display the Metadata sets in the wiki.
 • There is no need to edit the Metadata sets in the wiki.

 we initially developed the extension to store the files in the File:
 namespace, but we were told by the Foundation that we should use
 ContentHandler instead. unfortunately there is an issue with storing
 content > 1mb in the db so we need to find another solution.

 1. any suggestions?


What I would suggest is a hybrid approach. The metadata file gets
uploaded, and is stored using the FileBackend class. (There are a
couple of extensions that store files without them being a file page.
For example, the Score extension stores the rendered files on the
server, but they're not attached to any file page.) Once the XML file
is on the server, use ContentHandler to make a new content type that
stores a reference to the file [instead of the original file]
(probably in the form of a MediaWiki virtual file URL).
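
A very rough sketch of what I mean, assuming the 2013-era core classes
(the backend name, container path, and content model id here are all
made up for illustration):

    // 1) Put the raw XML into the file backend -- no File: page involved.
    $backend = FileBackendGroup::singleton()->get( 'local-backend' );
    $dst = 'mwstore://local-backend/gwtoolset-metadata/' . $setName . '.xml';
    $backend->prepare( array( 'dir' => dirname( $dst ) ) );
    $status = $backend->quickCreate( array( 'dst' => $dst, 'content' => $xml ) );

    // 2) The wiki page stores only the short reference via a custom
    //    content model, so the >1mb revision problem never comes up.
    class MetadataSetRefContent extends TextContent {
        public function __construct( $virtualUrl ) {
            parent::__construct( $virtualUrl, 'gwtoolset-metadata-ref' );
        }
    }

If each new upload goes to a fresh path, the page history of the
reference doubles as the version control for the metadata set.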


--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How's the SSL thing going?

2013-07-31 Thread Brian Wolff
Which kind of ignores the issue that encrypting with SSL doesn't do a
lot against traffic analysis, when it's publicly known how big the
pages you're downloading are, and how many images/other assets they
have on them. The NSA certainly has the resources to do this if it
wants to.


If you can do this sort of thing:
http://blog.ioactive.com/2012/02/ssl-traffic-analysis-on-google-maps.html
against Google Maps, I imagine it would be much simpler to do
something like that for Wikipedia. (Our data has more variation in it,
and the data is all publicly available.)
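
To make the fingerprinting idea concrete, a toy illustration (the
numbers here are made up; real attacks also use asset counts and
request ordering, as in the paper above):

    // If page and asset sizes are public knowledge, an eavesdropper only
    // needs the approximate encrypted transfer length to guess the page.
    $knownSizes = array( 'Foo' => 48231, 'Bar' => 73890 ); // bytes, from dumps
    $observed = 48410; // ciphertext length seen on the wire
    foreach ( $knownSizes as $title => $bytes ) {
        if ( abs( $observed - $bytes ) < 600 ) { // slack for TLS framing
            echo "Candidate page: $title\n";
        }
    }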

--bawolff

On 7/31/13, Tyler Romeo tylerro...@gmail.com wrote:
 Good question.

 There are two steps to this:
 1) Move all logins to TLS
 2) Move all logged in users to TLS

 The former was dependent on a bug with E:CentralAuth that was causing
 $wgSecureLogin to malfunction. I am not sure whether this bug was ever
 fixed (I remember seeing Chris submit a patch for it, but I think it was
 abandoned).

 Also, the discussion on https://bugzilla.wikimedia.org/show_bug.cgi?id=52283
 is
 probably a blocker for enabled $wgSecureLogin (which would be a
 pre-requisite for either of the two above steps).


 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2016
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com


 On Wed, Jul 31, 2013 at 2:36 PM, David Gerard dger...@gmail.com wrote:

 Jimmy just tweeted this:

 https://twitter.com/jimmy_wales/status/362626509648834560

 I think that's the first time I've seen him say fuck in a public
 communication ...

 Anyway, I expect people will ask us how the move to all-SSL is
 progressing. So, how is it going?

 (I've been telling people it's slowly moving along, we totally want
 this, it's just technical resources. But more details would be most
 useful!)


 - d.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RandomInCategory Patch Merged!

2013-08-01 Thread Brian Wolff
On 2013-08-01 6:21 PM, Antoine Musso hashar+...@free.fr wrote:

 Le 01/08/13 21:04, Yuvi Panda a écrit :
  Bug 25931[1], Implement efficient way to select random page from
  specified category on Wikimedia wikis has just been marked as
  resolved, thanks to this patch[2] from Bawolff that Brion merged.
 
  Yay! :)
 
  [1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=25931
  [2]: https://gerrit.wikimedia.org/r/#/c/71997/

 It is always a pleasure to find old bugs being fixed. Thank you!


 For those wondering about randomly selecting an article, I invite you to
 have a look at a 2007 post by Brion on his blog:

   https://brionv.com/log/2007/11/22/random-tests/

 You will learn why we have (had?) some articles showing up more often
 when using RandomPage :-]


 --
 Antoine hashar Musso


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

It should be noted that the method used in the patch in question is
significantly less uniformly random than Special:Random.
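
For the curious, the classic Special:Random technique that Brion's
post dissects looks roughly like this (a minimal sketch, not the
production code):

    // Every page row stores a precomputed page_random in [0, 1).
    // Pick a fresh random threshold and take the first page at or above
    // it; a page sitting after a big gap in page_random values gets
    // chosen disproportionately often, hence the skew Brion measured.
    $dbr = wfGetDB( DB_SLAVE );
    $row = $dbr->selectRow( // selectRow() implies LIMIT 1
        'page',
        array( 'page_id', 'page_title' ),
        array( 'page_random >= ' . $dbr->addQuotes( wfRandom() ) ),
        __METHOD__,
        array( 'ORDER BY' => 'page_random' )
    );

The category version can't lean on an indexed random column, so it
roughly picks a random timestamp offset into the category instead,
which is where the extra non-uniformity comes from.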

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GMail sending lots of WIkimedia mail to spam again

2013-08-05 Thread Brian Wolff
https://bugzilla.wikimedia.org/show_bug.cgi?id=52556 sounds related...

--bawolff

On 8/5/13, Emilio J. Rodríguez-Posada emi...@gmail.com wrote:
 What is the explanation for this? My spam folder is full of emails from
 wiki mailing lists too.

 Perhaps many users don't know how to unsubscribe, mark the messages as
 spam instead, and Google's filter has learned from that?


 2013/8/5 Mathieu Stumpf psychosl...@culture-libre.org

 Le lundi 05 août 2013 à 23:01 +0530, Yuvi Panda a écrit :
  All emails to labs-l always end up in spam for me (I've a special rule
  that picks them out of spam, and GMail still warns me).
 
  /end-data-point
 
 
 Bad mail provider, change mail provider. ;)

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


-- 
--
- Brian
Caution: The mass of this product contains the energy equivalent of 85
million tons of TNT per net ounce of weight.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
