[Wikitech-l] Re: Join the Mobile Apps Team Meeting!

2023-09-21 Thread rupert THURNER
Fyi there is some discussion over using zoom nowadays:
https://techcrunch.com/2023/08/16/open-source-developers-urged-to-ditch-zoom-over-user-data-controversy/

Rupert


Amal Ramadan wrote on Thu, 21 Sept 2023 at 10:54:

> Hello!
>
> We're excited to invite you to our Mobile Apps Team online meeting. This
> is a great opportunity to learn about the latest developments in
> Wikipedia's mobile apps and engage with the team.
>
> Date: 27th October
> Time: 5 p.m. UTC
> Meeting Link:  https://wikimedia.zoom.us/j/83695206107
>
> Our host, Jazmin Tanner, Product Manager of the Apps Team, and our
> software engineers will be there to provide updates, answer your questions,
> and hear your suggestions.
>
> Agenda:
>
> Mobile app updates
> Q&A session
> Share your thoughts
> Contribute by posting your questions and insights about Wikipedia’s mobile
> apps on the Wikimedia Apps/Office Hours page on mediawiki.org.
> The deadline for input is 24th October at 12:00 UTC.
>
> We can provide Arabic and French interpretation if we get at least 7
> sign-ups for each language by 3rd October on the same page.
> Also, for a one-day reminder before the meeting, add your username.
>
> Please help us spread the word to developers interested in the Android and
> iOS Wikipedia mobile apps, and Commons. We value your contribution to the
> Wikimedia community and look forward to your active participation in this
> meeting!
>
> Respectfully,
> *Amal Ramadan* (She/Her)
> Sr. Community Relations Specialist
> Wikimedia Foundation 
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Goto for microoptimisation

2021-08-01 Thread rupert THURNER
On Sat, Jul 31, 2021 at 6:10 AM Tim Starling  wrote:
>
> For performance sensitive tight loops, such as parsing and HTML construction, 
> to get the best performance it's necessary to think about what PHP is doing 
> on an opcode by opcode basis.
...
> I am proposing
>
> if ( $x == 1 ) {
>     action1();
>     goto not_2; // avoid unnecessary comparison $x == 2
> } else {
>     action_not_1();
> }
> if ( $x == 2 ) {
>     action2();
> } else {
>     not_2:
>     action_not_2();
> }
...
> I am requesting that goto be considered acceptable for micro-optimisation.

ha, what a question. the single goto and its target are only 5 lines
apart; even a php-incompetent person like me can understand it.

you triggered me into reading more about it, though. the commit
message states it takes 30% fewer instructions:
  Measuring instruction count per iteration with perf stat, averaged over
  10M iterations, PS1. Test case:
  Html::openElement('a', [ 'class' => [ 'foo', 'bar' ] ] )

  * Baseline: 11160.7265433
  * in_array(): 10390.3837233
  * dropDefaults() changes: 9674.1248824
  * expandAttributes() misc: 9248.1947500
  * implode/explode and space check: 8318.9800417
  * Sanitizer inline: 8021.7371794

does this mean these changes bring a 30% speed improvement? that is
incredible! how often is this code path called to render one article?
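for anyone wanting to sanity-check the arithmetic, the per-iteration instruction counts quoted above give the relative saving directly. this is a reader's back-of-the-envelope check (numbers copied from the commit message), not part of the original benchmark:

```python
# Instructions per iteration, from the commit message quoted above.
baseline = 11160.7265433   # before the optimisation series
optimized = 8021.7371794   # after the final "Sanitizer inline" step

reduction = 1 - optimized / baseline
print(f"{reduction:.1%} fewer instructions per iteration")
# prints: 28.1% fewer instructions per iteration
```

so "30% less instructions" rounds up from roughly 28%, and note that fewer retired instructions does not always translate one-to-one into wall-clock speedup.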

now i understand why legoktm is prepared to rewrite mediawiki in rust
(https://www.mediawiki.org/wiki/Template:User_Rust), and why proposals
exist to extend php with rust (https://github.com/rethinkphp/php-rs ,
https://docs.rs/solder/0.1.6/solder/ ). i was tempted to use legoktm's
template on my user page, until i finally saw that php is amazingly fast
with regular expressions, since it includes the pcre c library:
https://benchmarksgame-team.pages.debian.net/benchmarksgame/performance/regexredux.html

rupert


[Wikitech-l] which programs convert? keeping metadata when transforming media

2018-11-05 Thread rupert THURNER
hi,

photos and films are converted quite often in the wikimedia space. metadata
is not so often preserved. which programs would one need to adjust to
preserve the metadata?

i know of ffmpeg for videos. for photos, is it convert from gimp? anything
else? are there other aspects one would need to pay attention to?
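one hedged sketch of what "adjusting a program" can mean on the video side: ffmpeg has a `-map_metadata` option that copies global metadata from an input file. the function and file names below are illustrative placeholders, and the command is only built, not executed:

```python
# Build (not run) an ffmpeg command that carries the source file's
# global metadata over into the converted output. "-map_metadata 0"
# copies metadata from input number 0; file names are placeholders.
def vp9_with_metadata(src: str, dst: str) -> list[str]:
    return [
        "ffmpeg", "-i", src,
        "-map_metadata", "0",   # preserve container-level metadata
        "-c:v", "libvpx-vp9",
        dst,
    ]

cmd = vp9_with_metadata("input.mp4", "output.webm")
print(" ".join(cmd))
# prints: ffmpeg -i input.mp4 -map_metadata 0 -c:v libvpx-vp9 output.webm
```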

rupert

Re: [Wikitech-l] How to make oauth authentication with wikipedia?

2018-08-05 Thread rupert THURNER
On Sun, Jul 29, 2018 at 4:30 PM, Bryan Davis  wrote:

> On Sun, Jul 29, 2018 at 12:37 AM rupert THURNER
>  wrote:
> >
> > if one takes an example, like https://tools.wmflabs.org/video2commons/, is
> > this implemented as it should be? is there any difference from "any"
> > application, or applications on the tools server? i am looking at the code
> > here currently:
> > https://github.com/toolforge/video2commons/blob/master/video2commons/frontend/app.py
> > the "dologin" method.
>
> Yes, there is a major difference between a web application like the
> video2commons tool and a device native application like an Android
> app. That difference is that in a web application secret data can be
> kept on the web server side that is not visible to the end user. This
> allows the OAuth application secret to be used in signing requests to
> the Wikimedia servers without exposing that secret to anyone who is
> looking at the source code of the web application. This separation is
> not possible when the application is running on end-user controlled
> devices as a phone or desktop application does.
>
>
interesting, i never thought about it. i found an entry on stackexchange [0]
confirming what you said. additionally, it states that oauth is not for
authentication: oauth's purpose is to access a user's resources at some
resource provider, while openid connect should be used to authenticate.
does openid connect work with wikipedia, and is it the best option currently?

[0]
https://security.stackexchange.com/questions/133065/why-is-it-a-bad-idea-to-use-plain-oauth2-for-authentication

[1] https://connect2id.com/learn/openid-connect
[2] https://www.mediawiki.org/wiki/Extension:OpenID_Connect
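a minimal illustration of Bryan's point: in OAuth 1.0a (the flow MediaWiki's OAuth extension uses), the consumer secret is part of the HMAC-SHA1 signing key for every request, which is exactly why it cannot ship inside a native app. the sketch below follows the RFC 5849 shape with made-up parameter values; it is illustrative, not the extension's actual code:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def oauth1_signature(method: str, url: str, params: dict,
                     consumer_secret: str, token_secret: str = "") -> str:
    """Illustrative OAuth 1.0a HMAC-SHA1 signature (RFC 5849 style)."""
    # 1. Normalized parameter string: sorted, percent-encoded pairs.
    norm = "&".join(f"{quote(k, safe='')}={quote(str(v), safe='')}"
                    for k, v in sorted(params.items()))
    # 2. Signature base string: METHOD & encoded-URL & encoded-params.
    base = "&".join([method.upper(), quote(url, safe=""), quote(norm, safe="")])
    # 3. Signing key: consumer secret + "&" + token secret. The consumer
    #    secret must stay on the server, which is Bryan's whole point.
    key = f"{quote(consumer_secret, safe='')}&{quote(token_secret, safe='')}"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

sig = oauth1_signature("GET", "https://meta.wikimedia.org/w/index.php",
                       {"title": "Special:OAuth/initiate"},
                       consumer_secret="not-a-real-secret")
print(sig)  # a 28-character base64 signature
```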

rupert

[Wikitech-l] phone cannot play mobile site video, but standard site video

2018-08-03 Thread rupert THURNER
why is it that the excellent player is available on the standard wikipedia
view, and not on mobile view? is this just my configuration? for example:
* https://en.wikipedia.org/wiki/Polar_orbit
* https://en.m.wikipedia.org/wiki/Polar_orbit
the effect is that my mobile can play the standard view video, but not the
mobile view video.

rupert

Re: [Wikitech-l] How to make oauth authentication with wikipedia?

2018-07-29 Thread rupert THURNER
On Sun, Jul 29, 2018 at 8:16 AM, Sam Wilson  wrote:

> On 29/07/18 00:35, Gergo Tisza wrote:
>
>> On Sat, Jul 28, 2018 at 10:37 AM Sam Wilson  wrote:
>>
>> I've been wondering about how this sort of thing would work as well, in
>>> the context of a Piwigo plugin that I've been working on for easily
>>> uploading photos to Commons (or any MediaWiki site). It seems the
>>> easiest way to do it is to get users to register their own personal
>>> (owner-only) OAuth consumer (which I think never requires approval?) and
>>> then have them enter the consumer token in the app.
>>>
>>
>>
>> I'd say that's the hardest (although it could be made more user-friendly
>> with some effort).
>> See https://phabricator.wikimedia.org/T179519#3727899 for other options.
>>
>
> Good point: I guess I meant "easiest" as in "closest to getting oauth
> working"! :) But yeah, hardly simple from the user's point of view.
>
> I was thinking that bot passwords are generally discouraged as part of a
> normal user workflow. I'm probably thinking too strongly though, and it's
> fine to direct people to Special:BotPasswords. Although, by default it does
> say "If you don't know why you might want to do this, you should probably
> not do it." which might be discouraging to some people. Still, easy enough
> to spell out what's going on before sending people there.
>

if one takes an example, like https://tools.wmflabs.org/video2commons/, is
this implemented as it should be? is there any difference from "any"
application, or applications on the tools server? i am looking at the code
here currently:
https://github.com/toolforge/video2commons/blob/master/video2commons/frontend/app.py
the "dologin" method.

rupert

Re: [Wikitech-l] [Multimedia] Video output changing to WebM VP9/Opus soon

2018-07-26 Thread rupert THURNER
i tried VP9 by uploading a video with a spinning wheel, made with a
samsung galaxy s5 (mp4, original size 99.5 MB). i removed the sound
beforehand though:
https://commons.wikimedia.org/wiki/File:180522-alpaca-spinnen-gotthard-passh%C3%B6he-silent.webm

using:
https://tools.wmflabs.org/video2commons/

which used the following (i removed the paths ...)
ffmpeg -y -i dl.unknown_video -threads 0 -skip_threshold 0 -bufsize
6000k -rc_init_occupancy 4000 -qmin 19 -qmax 19 -vcodec libvpx-vp9
-tile-columns 4 -f webm -ss 0 -an -pass 1 -passlogfile
dl.unknown_video.an.vp9.webm.log /dev/null

ffmpeg -y -i dl.unknown_video -threads 0 -skip_threshold 0 -bufsize
6000k -rc_init_occupancy 4000 -qmin 19 -qmax 19 -vcodec libvpx-vp9
-tile-columns 4 -f webm -ss 0 -an -pass 2 -passlogfile
dl.unknown_video.an.vp9.webm.log dl.unknown_video.an.vp9.webm

the size increased by 30 MB; the quality on a smaller screen is the
same. i did not verify on a high-resolution screen whether a difference
can be noticed.

rupert


On Fri, Jul 27, 2018 at 2:47 AM, Brion Vibber  wrote:
> Oh and one more thing!
>
> For the VP9 configuration I'll be enabling 1440p and 2160p ("4K")
> resolutions, which people can manually bump up to when watching videos with
> a suitable 4K source on a high-res screen. They use higher data rates, but
> only a small fraction of input files are 4K so should not significantly
> increase disk space projections for now.
>
> These can take a long time to compress, so if we find it's problematic we'll
> turn them back off until the jobs can be split into tiny chunks (future work
> planned!), but it works in my testing and shouldn't clog the servers now
> that we have more available.
>
> (Note that the ogv.js player shim for Safari will not handle greater-than-HD
> resolutions fast enough for playback, even on a fast Mac or iPad; for best
> results for 4K playback use Firefox, Chrome, or a Chromium-based browser.)
>
> -- brion
>
> On Thu, Jul 26, 2018 at 5:39 PM Brion Vibber  wrote:
>>
>> Ok, after some delay for re-tweaking the encoding settings for higher
>> quality when needed, and pulling in some other improvements to the config
>> system, all related updates to TimedMediaHandler have been merged. :D
>>
>> If all goes well with the general deployments in the next few days, expect
>> the beginning of VP9 rollout starting next week.
>>
>> Changes since the earlier announcement:
>> * the new row-multithreading will be available, which allows higher
>> threading usage at all resolutions; encoding times will be more like 1.5-2x
>> slower instead of 3-4x slower.
>> * switch to constrained quality with a larger max bitrate: many files will
>> become significantly smaller in their VP9 versions, but some will actually
>> increase in exchange for a huge increase in quality -- this is mostly 60fps
>> high-rate files, and those with lots of motion and detail that didn't
>> compress well at the default low data rates.
>>
>> -- brion
>>
>> On Fri, Jun 29, 2018 at 9:46 AM Brion Vibber 
>> wrote:
>>>
>>> Awesome sauce. Thanks Moritz!
>>>
>>> -- brion
>>>
>>> On Fri, Jun 29, 2018 at 7:39 AM Moritz Muehlenhoff
>>>  wrote:

 Hi all,

 On Thu, Jun 28, 2018 at 01:54:18PM -0700, Brion Vibber wrote:
 > Current state on this:
 >
 > * still hoping to deploy the libvpx+ffmpeg backport first so we start
 > with
 > best performance; Moritz made a start on libvpx but we still have to
 > resolve ffmpeg (possibly by patching 3.2 instead of updating all the
 > way to
 > 3.4)

 I've completed this today. We now have a separate repository component
 for stretch-wikimedia (named component/vp9) which includes ffmpeg 3.2.10
 (thus allowing us to follow the ffmpeg security updates released in
 Debian
 with a local rebuild) with backported row-mt support and linked against
 libvpx 1.7.0.

 I tested re-encoding

 https://commons.wikimedia.org/wiki/File:Wall_of_Death_-_Pitts_Todeswand_2017_-_Jagath_Perera.webm
 (which is a nice fast-paced test file) from VP8 to VP9, which results in
 a size reduction from 48M to 31M.

 When using eight CPU cores on one of our video scaler servers, enabling
 row-mt
 gives a significant performance boost; encoding time went down from 5:31
 mins
 to 3:36 mins.

 All the details can be found at
 https://phabricator.wikimedia.org/T190333#4324995

 Cheers,
 Moritz

>
>
> ___
> Multimedia mailing list
> multime...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/multimedia
>


Re: [Wikitech-l] Introduction

2017-03-20 Thread rupert THURNER
On Mar 19, 2017 2:17 PM, "Brad Jorsch (Anomie)" <bjor...@wikimedia.org>
wrote:

On Sun, Mar 19, 2017 at 7:23 AM, rupert THURNER <rupert.thur...@gmail.com>
wrote:

> where is this beginners page hidden,


Is https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker what
you're looking for?


> and why are google, bing, and duckduckgo not able to find it?
>

I don't know. When I try that search on Google the above page shows up as
the 9th result, even in a private browsing session with a different browser
from a different IP.


Really? What keywords are you using?

Rupert

Re: [Wikitech-l] Introduction

2017-03-19 Thread rupert THURNER
ha, how embarrassing! i used google to search "wikimedia projects
programming beginner" and google found the beginners guide for python
programmers, which is not what i wanted :)
https://wiki.python.org/moin/BeginnersGuide/Programmers . then i went
to https://www.wikipedia.org/ and i was lost as well. donation, shop,
help, and, and, and. no programming.

where is this beginners page hidden, and why are google, bing, and
duckduckgo not able to find it?

best,
rupert

On Sun, Mar 19, 2017 at 11:31 AM, Egbe Eugene  wrote:
> Thanks for the warm welcome into the community *All*
>
> I wish to express my desire to participate in this year's summer of code as
> i know it is the most rapid way ( apart from the hackathons) to hack the
> Foundation projects. For someone as new to the foundation as i am, are
> there any projects which could be suggested for me to quickly get about and
> straight to work.
>
> Thanks very much
> Egbe
>
> On Wed, Mar 8, 2017 at 8:55 AM, Jan Dittrich 
> wrote:
>
>> Hi Egbe,
>>
>> Great to have you on board!
>>
>> Don’t hesitate to poke people if you have trouble finding or understanding
>> documentation, we may be already used to its quirks :-)


Re: [Wikitech-l] Making Wikipedia speak to you

2016-12-20 Thread rupert THURNER
i wanted to endorse it because i like the idea - but it links to a
commercial site, ispeech.org. this confuses me a little. nothing is
mentioned about open source software along the lines of:
http://www.findbestopensource.com/tagged/text-to-speech

rupert

On Mon, Dec 19, 2016 at 6:58 PM, André Costa  wrote:

> Thanks for the comments.
>
> The page is indeed lacking in details related to the implementation. Some
> of it may be found in the pilot study but most isn't.
>
> I've made a first answer at
> https://www.mediawiki.org/wiki/Extension_talk:Wikispeech.
> I also copied your (Bawolff) questions there,
> hope you don't mind. I would appreciate it if the follow-up was kept there
> to make it accessible to people off-list.
>
> Thanks,
> André
>
> On 18 December 2016 at 20:49, Brian Wolff  wrote:
>
> > On Friday, December 16, 2016, André Costa 
> wrote:
> > > *Cross posting on purpose, no excuses made.*
> > >
> > > Hi,
> > >
> > > At Wikimedia Sverige we have been working on an extension called
> > > Wikispeech. It will be a text-to-speech solution which aim to make the
> > > information on Wikipedia more accessible for people that have limited
> > > abilities to read.
> > >
> > > This is Wikimedia Sverige's first MediaWiki development project from
> > > scratch and it has been suggested to us that we should ask for
> > endorsements
> > > - as this will make the need clear if/when the extension needs support.
> > So,
> > > if you think that this sound like something important, please let
> > everybody
> > > know it! https://www.mediawiki.org/wiki/Wikispeech#Endorsement
> > >
> > > Please spread the word. Best,
> > > André Costa
> > > André Costa | Senior Developer, Wikimedia Sverige |
> > andre.co...@wikimedia.se
> > > | +46 (0)733-964574
> > >
> > > Stöd fri kunskap, bli medlem i Wikimedia Sverige.
> > > Läs mer på blimedlem.wikimedia.se
> >
> > I agree with nemo, a more detailed implementation plan would be very
> > welcome.
> >
> > From what I gather from your existing implementation, your current plan
> is:
> > * Using a ParserAfterParse hook, do some complex regexes/DomDocument
> > manipulation to create "utterance" annotations of clean html for the tts
> > server.
> > * insert this utterance html at end of page html
> > * javascript posts this to a (currently) python api, that returns a json
> > response that contains a url for the current utterance (not sure how long
> > an utterance is, but im assuming its about a paragraph)
> > * javascript plays the file.
> >
> > Is this your general plan, or is the existing code more a proof of
> concept?
> > Im not sure im a fan of adding extra markup in this fashion if its only
> > going to be used by a fraction of our users.
> >
> > --
> > bawolff
> >

Re: [Wikitech-l] Communication about proposed and planned UI changes

2016-12-15 Thread rupert THURNER
On Fri, Dec 16, 2016 at 6:50 AM, Tyler Romeo  wrote:

> On Thu, Dec 15, 2016 at 6:51 PM, Pine W  wrote:
>
> > My proposal would be that proposed UI changes which affect large
> > proportions of the user base should be announced 3 months in advance.
> > This would provide plenty of opportunity for discussion,
> > synchronization, and testing of proposed changes.
> >
>
> A much finer definition is needed here. "Proposed UI changes which affect
> large proportions of the user base", to my mind, includes basically any UI
> change that affects any public user page, e.g., public Special Pages,
> article pages, etc. It does not include any specification about the
> significance of the change. (I understand this is intentional based on your
> replies in the previous thread.)
>

uh, in this discussion my heart is torn; i am fully supportive of
both opinions here. i am wondering whether the same could not be reached
with a different, less discussion-intensive style, more solved by
technology. with user interfaces i'd love, as a consumer, to have
something like linux does: continuous features included all the time,
without too many discussions, in a rolling version (comparable to the
usual new kernel coming out), plus a version which stays stable
(comparable to long-term support). personally i would opt for the brand
new beta version, not wanting to wait for somebody to finish discussing
a feature i can barely imagine. an easy switch back and forth gives,
imo, a better indication of whether it's going in the right direction.

rupert

Re: [Wikitech-l] Gerrit screen size

2016-09-27 Thread rupert THURNER
Isn't Gerrit more for developers, who as a consequence run multiple
browsers anyway and therefore do not care so much that it does not support IE?

On Sep 26, 2016 20:14, "Paladox"  wrote:

> The new skin called PolyGerrit fixes all the issues described here. It
> is moving along greatly, but it doesn't support all browsers yet, namely
> Internet Explorer, because PolyGerrit uses ES6 and Internet Explorer does
> not support ES6. They are going to work on Internet Explorer support in
> Q2 of next year; hopefully it will make it into Gerrit 2.14 or 2.15, and
> hopefully we will still be using Gerrit then and will update to it, and
> also hopefully PolyGerrit will have added all the missing features.
> PolyGerrit is very responsive, as I tried it on my mobile (iPhone) and it
> worked.
>
>
>
> On Monday, 26 September 2016, 18:39, Rob Lanphier 
> wrote:
>
>
>  On Sun, Sep 25, 2016 at 5:41 AM, Tim Starling 
> wrote:
>
> > On 25/09/16 21:09, Bináris wrote:
> > > I try to familiarize myself with Gerrit which is not a good example for
> > > user-friendly interface.
> > > I noticed a letter B in the upper right corner of the screen, and I
> > > suspected it could be a portion of my login name. So I looked at it in
> > HTML
> > > source, and it was. I pushed my mouse on it and I got another half
> window
> > > as attached.
> > >
> > > So did somebody perhaps wire the size of a 25" monitor into page
> > rendering?
> > > My computer is a Samsung notebook.
> >
> > In T38471 I complained that the old version was too wide at 1163px
> > (for my dashboard on a random day). Now the new version is 1520px. I'm
> > not sure if the Gerrit folks are serious or are trolling us. Perhaps
> > it is a tactic to encourage UI code contributions?
> >
>
> My suspicion is that the Gerrit folks (in particular, Shawn Pierce) aren't
> so much trolling us as saying "stop relying on the UI of Gerrit!  That's
> not the point!"  The last time I was paying close attention to this, Gerrit
> upstream seems to be particularly focused on building code review features
> suitable for:
> 1.  Incorporation into git upstream
> 2.  Integrated into development UIs like Eclipse
>
> The strategy seems to be "Gerrit is a reference implementation of a code
> review UI for Git".  I haven't paid close enough attention to either Gerrit
> upstream or Git upstream to know if the Gerrit core contributors have made
> progress in getting code review functionality added to Git core (or if
> they've given up, or if I completely misunderstood their strategy).  I'm
> guessing that Eclipse has pretty good Gerrit support, but I rarely play
> with Eclipse, so that's just a guess based on the Eclipse Foundation's
> involvement with Gerrit.
>
> As bd808 noted, Gerrit upstream seems to be working on yet another UI,
> which would make sense if their goal is to create a variety of compatible
> implementations of advanced Git functionality.
>
> Rob
>
>

[Wikitech-l] Voting reusable

2016-05-31 Thread rupert THURNER
From time to time, mails like the one below pass by where I wish that
voting were generalized on a technical level, so that any organization
participating in the wikiverse would be able to conduct votes and reuse
voting rights. Would this be something of broader value?

Best
Rupert
-- Forwarded message --
From: "Sandister Tei" 
Date: May 31, 2016 16:33
Subject: Re: [Wikimedia-GH] Help us choose volunteers to serve you
To: "Sadat" 
Cc: "Planning Wikimedia Ghana Chapter" 

And if they vote with another account?

Regards,
Sandister Tei

www.sandistertei.com

Via mobile
On 31 May 2016 2:28 p.m., "Mohammed S. Abdulai"  wrote:

> You can just restrict multiple entries, I believe you're familiar with
> that process.
>
> Cheers
>
> -Masssly
>
> Sent from my Samsung Galaxy smartphone.
>  Original message 
> From: Sandister Tei 
> Date: 05/31/2016 10:38 (GMT+00:00)
> To: Sadat 
> Cc: Planning Wikimedia Ghana Chapter 
> Subject: RE: [Wikimedia-GH] Help us choose volunteers to serve you
>
> You can suggest another means to check double voting.
>
> Regards,
> Sandister Tei
>
> www.sandistertei.com
>
> Via mobile
> On 31 May 2016 10:07 a.m., "Mohammed S. Abdulai" 
> wrote:
>
>> For best practices we shouldn't REQUIRE prospective respondents to enter
>> their names.
>>
>> I would like to see that relaxed.
>>
>> Thanks
>>
>> -Masssly
>>
>>
>> Sent from my Samsung Galaxy smartphone.
>>  Original message 
>> From: Sandister Tei 
>> Date: 05/31/2016 08:00 (GMT+00:00)
>> To: Planning Wikimedia Ghana Chapter 
>> Subject: [Wikimedia-GH] Help us choose volunteers to serve you
>>
>> Hello all.
>>
>> Help us choose volunteers to serve you.
>> Please take this survey. It's
>> just two questions.
>>
>>
>> *Regards, *
>> *Sandister Tei *
>>
>> *www.sandistertei.com *
>>
>
___
Wikimedia-GH mailing list
wikimedia...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikimedia-gh

Re: [Wikitech-l] update: wikipedia.org portal

2016-05-22 Thread rupert THURNER
MZMcBride wrote:
> rupert THURNER wrote:
>>quim, i would not be angry if you would show a little bit more empathy
>>towards a client, a volunteer. if mzmcbride is right and there is a
>>well established procedure to change this page which was not followed,
>>the person not following might read the "expected behaviour" page.
>
> Project portals such as www.wikipedia.org were managed on Meta-Wiki:
> <https://meta.wikimedia.org/wiki/Project_portals>. I believe most of the
> portals continue to be managed on Meta-Wiki, with the exception of
> www.wikipedia.org; background: <https://phabricator.wikimedia.org/T110070>.

ah, interesting. moving this to git sounds ok to me from a technical
viewpoint. i read
https://www.mediawiki.org/wiki/Wikipedia.org_Portal/Migration_to_gerrit.
despite that i am not clear how the current portal maintainers would
then activate a proposal e.g. from the discovery team?

rupert


Re: [Wikitech-l] update: wikipedia.org portal

2016-05-21 Thread rupert THURNER
On Sat, May 21, 2016 at 1:07 AM, MZMcBride  wrote:

>> It's frustrating and annoying that your happy team hijacked this portal...
> (snip)
> Hostility and anger are not welcomed in this mailing list, neither in the
> rest of Wikimedia spaces. Any problems can be reported and explained in a
> respectful and even friendly way. The topics you are exposing in this
> thread are no exception.
>
> Please read https://www.mediawiki.org/wiki/Expected_behavior for more
> thoughts about keeping our communities respectful, welcoming, and friendly.

quim, i would not be angry if you showed a little bit more empathy
towards a client, a volunteer. if mzmcbride is right and there is a
well-established procedure to change this page which was not followed,
the person not following it might read the "expected behaviour" page.
putting the blame on the person harassed/frustrated (mzmcbride), not
on the harasser (whoever changed the page), seems not a nice move in
that case.

as for the description of meta, i think it is quite misleading. meta is
administrative, about other sites, not "the community site". also not so
ideal, i find, that this page would then be english-only without
language links.

best,
rupert


Re: [Wikitech-l] tags are a usability nightmare for editing on mediawiki.org

2016-04-04 Thread rupert THURNER
On Sun, Apr 3, 2016 at 9:48 PM, Niklas Laxström
 wrote:
> 2016-04-03 11:29 GMT+03:00 Jon Robson :
>> The Translate tag has always seemed like a hack that I've never quite
>> understood.
>
> I am happy to direct to our documentation [1] anyone who asks, or
> explain if the documentation is not sufficient.
>
> The word hack can have both positive and negative meanings and it is
> unclear what do you mean with it here.

lol - negative :) but i did not make that connection to translatewiki.
how many approaches to "translate" exist now? one for translating
articles, a translate extension, something in wikidata? that makes 3?
rupert


Re: [Wikitech-l] tags are a usability nightmare for editing on mediawiki.org

2016-04-03 Thread rupert THURNER
On Sun, Apr 3, 2016 at 4:13 PM, Adam Wight  wrote:
> On Apr 3, 2016 1:30 AM, "Jon Robson"  wrote:
>> The Translate tag has always seemed like a hack that I've never quite
> understood.
>
> +1. Couldn't we use Parsoid data tags to identify paragraphs? It seems like
> that would lend itself to an incremental migration.
>

i tried it once, at
https://meta.wikimedia.org/wiki/Grants:IdeaLab/Inspire/de. if you look
at the page, it's a mix of english and german, yet it shows 96% translated.
if you look at 
https://meta.wikimedia.org/w/index.php?title=Special:Translate=page-Grants%3AIdeaLab%2FInspire=page==de
it seems complete. and if you want to edit the original page, i find
the user experience nightmarish. the reusability factor for translating
articles from de.wikipedia to en.wikipedia or the other way round is
zero. if this mess goes away i'd be happy :)

rupert


[Wikitech-l] productivity of mediawiki developers

2016-04-02 Thread rupert THURNER
hi,

are there statistics about mediawiki developer productivity? i just
stumbled over a couple of pages and i am quite impressed, i must say:
* gabriel, https://github.com/gwicke, 2'300 commits last year
* jeroen, https://github.com/JeroenDeDauw, 3'700 commits a year
* ori, https://github.com/atdt, 1'700
* james, https://github.com/jdforrester, 1'200
* yuri, https://github.com/nyurik, 900
* matt: https://github.com/mattflaschen, 400 commits
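for scale, a rough commits-per-day reading of those numbers (assuming a 365-day year and taking the counts above at face value):

```python
# Commit counts per year, as listed above (GitHub handles as keys).
commits_per_year = {
    "gwicke": 2300, "JeroenDeDauw": 3700, "atdt": 1700,
    "jdforrester": 1200, "nyurik": 900, "mattflaschen": 400,
}
per_day = {who: n / 365 for who, n in commits_per_year.items()}
for who, rate in sorted(per_day.items(), key=lambda kv: -kv[1]):
    print(f"{who}: ~{rate:.1f} commits/day")
```

jeroen's 3'700 commits work out to roughly 10 commits per day, every day of the year.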

rupert


Re: [Wikitech-l] tags are a usability nightmare for editing on mediawiki.org

2016-04-01 Thread rupert THURNER
On Apr 1, 2016 18:32, "Brion Vibber"  wrote:
>
> Lots of pages on mediawiki.org are pretty much uneditable because they're
> strewn with translate tags spanning multiple paragraphs that make
> VisualEditor
> completely unusable and the source editor very difficult to use.
>
> I'm trying to update documentation, and I'm seriously thinking of regexing
> out all the translate markup so I can actually edit the wiki pages.
>
> This is probably not a good thing.
>
>
> https://phabricator.wikimedia.org/T131516

This is not only on mediawiki.org but also on meta. I'd not be unhappy if
this were replaced by nothing.
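Brion's "regex it out" idea can be sketched in a few lines; this is a rough illustration only (the patterns are my own and don't cover all of the Translate extension's markup, e.g. tvar variables):

```python
import re

def strip_translate_markup(wikitext: str) -> str:
    """Remove <translate> wrappers and <!--T:n--> unit markers.

    Illustrative sketch of the "regex it out" approach; real Translate
    markup has more cases (e.g. tvar variables) not handled here.
    """
    # Drop the paired <translate>...</translate> wrapper tags themselves,
    # keeping the content between them.
    text = re.sub(r"</?translate>", "", wikitext)
    # Drop translation unit markers like <!--T:7-->, including a trailing
    # newline so no blank lines are left behind.
    text = re.sub(r"<!--T:\d+-->\n?", "", text)
    return text

page = "<translate><!--T:1-->\nHello world.\n\n<!--T:2-->\nSecond paragraph.</translate>"
print(strip_translate_markup(page))
```

This strips the markers for local editing; writing the result back would of course lose the association with existing translation units, which is part of why the markup is painful in the first place.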

Rupert

Re: [Wikitech-l] [Wikimedia-l] Update from the Wikimedia Performance Team

2015-12-08 Thread rupert THURNER
I appreciate a ton what you guys achieve, many thanks!!

Rupert
On Dec 7, 2015 23:28, "Gilles Dubuc"  wrote:

> Hello,
>
> This is the monthly report from the Wikimedia Performance team.
>
> ## Our progress ##
>
> * Availability. We've done a major overhaul of the ObjectCache interfaces.
> Many
> factory methods were deprecated or removed, reducing it to just four simple
> entry points. New docs at
>
> https://doc.wikimedia.org/mediawiki-core/master/php/classObjectCache.html#details
>
> We've written a new IExpiringStore interface for convenient TTL constants,
> e.g. $cache::TTL_WEEK. See
>
> https://doc.wikimedia.org/mediawiki-core/master/php/interfaceIExpiringStore.html
>
> We've migrated most use of wfGetMainCache() to WANObjectCache. Work
> continued on the librarization of BagOStuff, Memcached, and other object
> cache classes.
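The getWithSetCallback/TTL-constant pattern described above can be illustrated with a minimal cache-aside sketch. This is a hypothetical Python analogue with invented names, not the actual MediaWiki PHP API:

```python
import time

# TTL constants in the spirit of IExpiringStore (values in seconds).
TTL_MINUTE = 60
TTL_HOUR = 3600
TTL_WEEK = 7 * 24 * 3600

class SimpleExpiringCache:
    """Tiny cache-aside store; illustrative only, not the MediaWiki API."""

    def __init__(self):
        self._data = {}  # key -> (expiry timestamp, value)

    def get_with_set_callback(self, key, ttl, callback):
        """Return a fresh cached value, or regenerate it via callback."""
        entry = self._data.get(key)
        now = time.time()
        if entry is not None and entry[0] > now:
            return entry[1]          # fresh hit
        value = callback()           # miss or expired: regenerate
        self._data[key] = (now + ttl, value)
        return value

cache = SimpleExpiringCache()
v = cache.get_with_set_callback("page:42", TTL_HOUR, lambda: "expensive result")
```

The appeal of the single entry point is that callers never write separate get/set logic, so stampede protection and TTL policy can live in one place.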
>
> * Performance testing infrastructure. We've created dedicated dashboards
> for portals:
>
> https://grafana.wikimedia.org/dashboard/db/webpagetest-portals
>
> And for mobile:
>
> https://grafana.wikimedia.org/dashboard/db/mobile-webpagetest
>
> We now test one page using real 3G connections (from San Francisco and
> Bangalore) and test other pages using the following physical devices:
> iPhone 6, iPad mini 2 and Moto G.
>
> * Media stack. We've extended Thumbor with 12 small plugins to meet our
> needs and match our existing thumbnailing feature set. This includes
> support for all the file formats in use on Commons. The Thumbor Vagrant
> stack is now very close to having all the moving parts needed in
> production, with basic Vagrant roles for Varnish and Swift having been
> written to that end. Our objective is to finish the work on VM by the
> holidays and have it ready to be showcased and discussed collectively at
> the developer summit in a breakout session.
>
> * ResourceLoader. We've written a new mw.requestIdleCallback API for
> scheduling deferred tasks. We've removed usage of the msg_resource_links
> DB table. We now use message config from the module registry directly.
> We've migrated the MessageBlobStore msg_resource DB table to an object
> cache (to be deployed in January 2016):
> https://phabricator.wikimedia.org/T113092
>
> ## How are we doing? ##
>
> Client-side performance has remained stable over the past month. Save
> Timing
> has also remained stable, around the 1s median mark.
>
> The job queue's health improved greatly after adding a new server to the
> pool, with the job queue size dropping drastically and the 99th percentile
> job processing time going from one day to one hour:
>
> * https://grafana.wikimedia.org/dashboard/db/job-queue-health
>
> There was a small scare about a sudden increase of the SpeedIndex value
> across the board:
>
> https://grafana.wikimedia.org/dashboard/db/webpagetest
>
> But it was entirely explained by the fundraising banner, which doesn't
> appear immediately on pageload. SpeedIndex measures the time it takes for
> the above-the-fold area to "settle" visually. The banner appears late and
> pushes the content down, which delays the time when visual changes stop
> happening for the above-the-fold area.
>
> Until next time,
> Aaron, Gilles, Peter, Timo, and Ori.
> ___
> Wikimedia-l mailing list, guidelines at:
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines
> wikimedi...@lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
> 

Re: [Wikitech-l] WMF Survey: Please help us understand third-party use of Wikipedia's content

2015-12-04 Thread rupert THURNER
On Dec 3, 2015 20:56, "Bryan Davis"  wrote:
>
> On Mon, Nov 16, 2015 at 1:29 PM, Sylvia Ventura 
wrote:
> > Dear Wikipedia data/API user,
> >
> > The WMF’s Engineering, Product and Partnerships teams are conducting a
> > short survey to help us understand how organizations are pulling and
using
> > data from our projects. This information will inform future features and
> > improvements to our data tools and APIs.
> >
> > We would appreciate a few minutes of your time. The link to the Survey
> > below will take you to a Google Form - there is no need to sign up to
fill
> > out the survey. The survey should take no more than 10 minutes.
> >
> >
https://docs.google.com/forms/d/1yUrHzyLABN419RCDbzepjoRWCbaWYV4wbtbKPa95C4o/viewform?usp=send_form
> >
> > Thank you for your input and feedback!
>
> I heard from Sylvia this week that we got a grand total of ONE
> response from the mailing list postings of this survey. If any of you
> saw this request the first time around and thought you might have
> useful input but just didn't get around to filling out the survey I
> would encourage you to take 10 minutes and do so now.
>
> The questions are very high level and open ended but primarily are
> seeking to get an overview of how dumps, the action api, irc edit
> notifications, rcstream and other tools and services that the
> Wikimedia Foundation or others provide to allow off-wiki access to the
> Wikimedia community created and curated content are used. One of the
> hoped for outcomes of this and related surveys is guidance for the
> Wikimedia Foundation on areas that deserve increased focus in the
> future.
>

Bryan, did you consider looking into the bug tracker for open issues and
suggestions?

I'd find an ongoing focus on interfacing much more valuable and agile than
doing a one-off survey. That means clearly publishing the interfaces and
how to handle their imperfections.

Rupert

Re: [Wikitech-l] Superprotect is gone

2015-11-05 Thread rupert THURNER
Excellent,  many thanks Quim!!
On Nov 5, 2015 18:35, "Quim Gil"  wrote:

> Superprotect [1] was introduced by the Wikimedia Foundation to resolve a
> product development disagreement. We have not used it for resolving a
> dispute since. Consequently, today we are removing Superprotect from
> Wikimedia servers.
>
> Without Superprotect, a symbolic point of tension is resolved. However, we
> still have the underlying problem of disagreement and consequent delays at
> the product deployment phase. We need to become better software partners,
> work together towards better products, and ship better features faster. The
> collaboration between the WMF and the communities depends on mutual trust
> and constructive criticism. We need to improve Wikimedia mechanisms to
> build consensus, include more voices, and resolve disputes.
>
> There is a first draft of an updated Product Development Process [2] that
> will guide the work of the WMF Engineering and Product teams.[3] It
> stresses the need for community feedback throughout the process, but
> particularly in the early phases of development. More feedback earlier on
> will allow us to incorporate community-driven improvements and address
> potential controversy while plans and software are most flexible.
>
> We welcome the feedback of technical and non-technical contributors. Check
> the Q&A for details.[4]
>
> [1] https://meta.wikimedia.org/wiki/Superprotect
> [2] https://www.mediawiki.org/wiki/WMF_Product_Development_Process
> [3] https://www.mediawiki.org/wiki/Wikimedia_Engineering
> [4]
>
> https://www.mediawiki.org/wiki/WMF_Product_Development_Process/2015-11-05#Q.26A
>
> --
> Quim Gil
> Engineering Community Manager @ Wikimedia Foundation
> http://www.mediawiki.org/wiki/User:Qgil

Re: [Wikitech-l] svn/git backends (was: Re: Project Idea " Extension: Offline MediaWiki ")

2015-09-26 Thread rupert THURNER
On Fri, Sep 25, 2015 at 5:58 PM, C. Scott Ananian 
wrote:

> On Fri, Sep 25, 2015 at 11:52 AM, Purodha Blissenbach
>  wrote:
> > There has been a project in the past that converted a MediaWiki code base
> > from SQL to use svn or git as message store. I do not remember which. It
> > worked afaicr but was discontinued as not being used irl, and pretty
> slow,
> > too.
>
> If you could dig up more details on that project I'd be very
> interested!  Those who don't learn from history, etc.  I'd love to
> read through the code and see what was done.  It would be relevant to
> https://phabricator.wikimedia.org/T113004
>  --scott
>

gollum for github: https://github.com/gollum/gollum, and jingo:
https://github.com/claudioc/jingo are examples of wikis using git as
document store.
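Both of those wikis lean on git's content-addressed object store: every page revision becomes a blob named by its hash, so identical content deduplicates for free. The addressing scheme itself is simple enough to sketch; this reimplements git's blob hashing for illustration, not gollum's or jingo's code:

```python
import hashlib

def git_blob_id(content: bytes) -> str:
    """Content address of a file the way git computes it: SHA-1 over
    the header b"blob <size>\\0" followed by the raw bytes."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# A wiki page revision stored as a blob gets a stable, deduplicating id:
print(git_blob_id(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```

Merging concurrent edits then falls out of git's normal three-way merge machinery, which is exactly the property that makes it tempting as a wiki backend.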

rupert

Re: [Wikitech-l] Code of Conduct: Intro, Principles, and Unacceptable behavior sections

2015-09-06 Thread rupert THURNER
On Sun, Sep 6, 2015 at 9:19 AM, Brian Wolff <bawo...@gmail.com> wrote:

> On 9/5/15, rupert THURNER <rupert.thur...@gmail.com> wrote:
> > On Fri, Sep 4, 2015 at 10:37 PM, Matthew Flaschen <
> mflasc...@wikimedia.org>
> > wrote:
> >
> >> There is consensus at
> >>
> >>
> https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Draft#Next_steps
> >> that the best way to finalize the CoC draft is to focus on a few
> >> sections at once (while still allowing people to comment on other
> >>
> > lol, consensus among whom, to what? i am against it (i'd love to send the
> > reasons in another mail though), do i count, and it is still consensus?
>
> Consensus, of the people participating in the talk page of the draft,
> in the typical Wikimedia definition of the word (Most arguments have
> petered out, and a super-majority of those participating seem to
> have settled on some agreement).
>
> > i
> > would prefer if you would be so kind to define one measurable criteria
> for
> > the question "do we need a code of conduct", no matter if entry or
> success
> > criteria. e.g
> >
> > * 50 volunteers from different part of the world saying that we need it
> > * 20% of committers want it
> > * after one year 20% more volunteer commits are done
> >
> > other criteria, like "people attending conferences" or "mails written",
> > would be a bad idea, as the goal is to have more contributions, not more
> > conference tourists or mailing list tourists. what do you think, matt,
> > or quim?
>
> I feel like this is mixing up the question of whether we "need" a code
> of conduct, with whether we will get a code of conduct.
>
>
the mails sent here over the last week made me think more thoroughly about
what the actual problem is, and reconsider my position. i added comments to
the phabricator ticket at: https://phabricator.wikimedia.org/T90908#1612033.

to summarize the phabricator comments briefly: i experienced the wikimedia
technical community as arrogant and ignorant, paradoxically even though the
persons in the community are not arrogant or ignorant. mails get no
answer, tickets get closed immediately or reshuffled, patches sit in gerrit
for years. in contrast, the most successful open source community, linux /
git, tolerates things we would not tolerate (e.g.
https://youtu.be/MShbP3OpASA?t=2895, "f*ck nvidia"), and i experienced that
community as extremely welcoming and helpful.

after rethinking, i am now convinced that our community can become more
welcoming with two measures. first, the mindset needs to change to be
welcoming. if constant talking about the approach on the mailing list is
not enough, a 10-line code of conduct might help, containing a "WMF persons
assure on every contact that the client walks away happy" instead of a
"WMF punishes misbehavior". contrary to all the punishment suggestions
above, it would be a positive policy. those involved in raising children
have already seen how much more effective a praising and lauding approach
is - and i find it works just as well with adults. praising also works
internationally, with no cultural barriers.

second, i think that our technical products should have a skilled
programmer as product owner who likes tinkering with the product. but -
this ideally goes into a separate mail thread.

best,
rupert

Re: [Wikitech-l] Code of Conduct: Intro, Principles, and Unacceptable behavior sections

2015-09-05 Thread rupert THURNER
On Fri, Sep 4, 2015 at 10:37 PM, Matthew Flaschen 
wrote:

> There is consensus at
>
> https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Draft#Next_steps
> that the best way to finalize the CoC draft is to focus on a few
> sections at once (while still allowing people to comment on other
> ones).  This allows progress without requiring people to monitor all
> sections at once and lets us separate the questions of “what are our
> goals here?” and “how should this work?”.  After these sections are
> finalized, I recommend minimizing or avoiding later substantive
> changes to them.
>
> The first sections being finalized are the intro (text before the
> Principles section), Principles, and Unacceptable behavior.  These
> have been discussed on the talk page for the last two weeks, and
> appear to have stabilized.
>
> However, there may still be points that need to be refined. Please
> participate in building consensus on final versions of these sections:
>
> *
> https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Draft
>
> *
> https://www.mediawiki.org/wiki/Code_of_conduct_for_technical_spaces/Draft
>
> If you are not comfortable contributing to this discussion under your
> name or a pseudonym, you can email your feedback or suggestions to
> conduct-discuss...@wikimedia.org .  Quim Gil, Frances Hocutt, and
> Kalliope Tsouroupidou will be monitoring this address and will
> anonymously bring the points raised into the discussion at your
> request.
>
>
lol, consensus among whom, to what? i am against it (i'd love to send the
reasons in another mail though) - do i count, and is it still consensus?
probably not, because i did maybe two unimportant commits for kiwix. i
would prefer if you would be so kind as to define one measurable criterion
for the question "do we need a code of conduct", no matter if entry or
success criteria, e.g.

* 50 volunteers from different parts of the world saying that we need it
* 20% of committers want it
* after one year, 20% more volunteer commits are done

other criteria, like "people attending conferences" or "mails written",
would be a bad idea, as the goal is to have more contributions, not more
conference tourists or mailing list tourists. what do you think, matt, or
quim?

best,
rupert

Re: [Wikitech-l] [Engineering] Code of conduct

2015-08-21 Thread rupert THURNER
On Sun, Aug 16, 2015 at 6:24 PM, Oliver Keyes oke...@wikimedia.org wrote:

 On 16 August 2015 at 04:06, rupert THURNER rupert.thur...@gmail.com
 wrote:
  that is an impressive list, amir. WMF has its terms of use:
  https://wikimediafoundation.org/wiki/Terms_of_Use (TOU). admittedly, an
  illegible monster compared to the simple statements below, like the
  contributor covenant. i honestly do not think that an open movement
  like the wikimedia movement should invent any new terms, licenses, or
  codes, but influence existing ones. by putting your stuff on the
  mediawiki.org site, you and all contributors are bound to the TOU. and
  we already see that the many rules contradict each other in little
  areas; they cannot be updated fast enough without an army of persons.
  the terms of use e.g. suggest using CC-BY-SA 3.0, which led to a
  collection of lawsuits in germany, while CC-BY-SA 4.0 would have
  prevented at least some of them, see here:
  https://lists.wikimedia.org/pipermail/wikimedia-l/2015-July/078685.html
  .

 I don't understand how the terms of use or copyright license relate in
 any way to codes of conduct.

 If you mean we should be looking for good examples of existing
 enforcement mechanisms or language, I absolutely agree, and that is
 part of what the Code of Conduct is trying to do.

i mean that we duplicate text in hundreds of slightly differing rules,
guidelines, policies, terms, and codes, in different languages. this
inflation of texts is very special to the wikimedia movement. my personal
expectation would be that movement-paid persons have as their main task to
reduce the complexity for volunteers, readers, writers, photographers,
coders, etc., and as a second task to support innovative techniques. we
should not forget it takes time to write stuff, and it takes exponentially
more time to read it. if we make a wikimedia policy, it has the potential
to be read by 1 billion people. reading and writing policies can be
considered waste because it is not the mission of wikipedia, not the
mission of the WMF :) coming back to the example of the terms of use, they
state: "*Civility* – You support a civil environment and do not harass
other users." paragraph 4 vastly elaborates on it - a 90% duplicate of the
code of conduct. brion, civility _is_ enforced already today by the terms
of use; nothing new is necessary.

how does this relate to copyright licenses? directly, not really, but i
tried to hint that i would expect a technical solution from a technical
person. as an example of where our written rules go wrong i cited the
thread about licenses and reuse in commons, in two aspects. ONE, updating a
lot of policies is a sisyphean task, and the WMF fails at it already today.
the terms of use still include the old CC license; using the new one would
prevent lawsuits in germany. TWO, you oliver, matt, quim and other
technicians would have the responsibility to come up with technical
solutions to exactly this community problem, not paper. can we add metadata
to images:
https://lists.wikimedia.org/pipermail/wikimedia-l/2015-July/078782.html.
the problem would be solved by a technical implementation and maybe by
adapting the license - which, in my biased opinion, has a huge impact and
solves the problem at the source for 120 million german-speaking persons,
and probably in many other countries as well.

best,
rupert

Re: [Wikitech-l] Code of conduct

2015-08-16 Thread rupert THURNER
On Fri, Aug 14, 2015 at 2:28 AM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
 On 08/13/2015 06:09 PM, David Gerard wrote:

 On 13 August 2015 at 22:30, rupert THURNER rupert.thur...@gmail.com
 wrote:

 Oliver,  I must be a little blind but I do not see examples of unfriendly
 behaviour in this thread.



 I linked to http://kovalc.in/2015/08/12/harassers.html - perhaps that
 doesn't count as unfriendly behaviour, or perhaps isn't in this
 thread. It was four messages before your post in GMail, which i see
 you are using; it's not clear to me how you missed it, but evidently
 you did.


 I think he meant unfriendly comments in the thread itself, not the thread
 linking to unfriendly behavior elsewhere.


hehe, matt, i see my english is not precise enough. i precisely meant
examples you want to address with an additional code of conduct. i
could find david's mail now - i must admit i deleted it earlier because
it was only a link, without describing the behaviour in the mail. now i
opened the link, read it, used google to try to find out what happened,
and am still not able to make out a relationship to code contributions
in the wikimedia space, nor to the wikimedia movement in general.
would you please add examples and let us analyze whether the existing
terms of use do not suffice to address this:
https://wikimediafoundation.org/wiki/Terms_of_Use .

best,
rupert


Re: [Wikitech-l] [Engineering] Code of conduct

2015-08-16 Thread rupert THURNER
that is an impressive list, amir. WMF has its terms of use:
https://wikimediafoundation.org/wiki/Terms_of_Use (TOU). admittedly, an
illegible monster compared to the simple statements below, like the
contributor covenant. i honestly do not think that an open movement
like the wikimedia movement should invent any new terms, licenses, or
codes, but influence existing ones. by putting your stuff on the
mediawiki.org site, you and all contributors are bound to the TOU. and
we already see that the many rules contradict each other in little
areas; they cannot be updated fast enough without an army of persons.
the terms of use e.g. suggest using CC-BY-SA 3.0, which led to a
collection of lawsuits in germany, while CC-BY-SA 4.0 would have
prevented at least some of them, see here:
https://lists.wikimedia.org/pipermail/wikimedia-l/2015-July/078685.html
.

rupert

On Sat, Aug 15, 2015 at 10:57 AM, Amir Ladsgroup ladsgr...@gmail.com wrote:
 I was trying to adapt such a policy for technical spaces for two years. It
 is a serious issue and it happens a lot; if it didn't happen to you, that
 doesn't mean it's not happening or isn't worth being addressed. I'm
 working to adapt a CoC for pywikibot if this one fails [1].
 If you think it needs more work to be a feasible policy, I think so too -
 let's discuss on the talk page. But if you think we don't need such a
 policy, you are entitled to your opinion, and that doesn't mean you are
 right. I feel we have this long discussion because some people from the
 WMF are working on the CoC and there is the spirit of "since WMF did the
 superprotect, it hates the community" between us.

 I just want to point out the many CoCs that big tech communities have and
 remind us of the importance of the issue.
 * contributor-covenant: Adapted by AngularJS, Eclipse and more [2]
 * Open Code of Conduct: Adapted by Github, Yahoo, Facebook, Twitter. [3]
 * Djanog CoC [4]
 * Python CoC [5]
 * Ubuntu CoC [6]
 * Geek feminism [7]
 * And much more [8]

 [1]: https://www.mediawiki.org/wiki/Manual:Pywikibot/Code_of_conduct_RFC
 [2]: http://contributor-covenant.org/
 [3]: https://github.com/todogroup/opencodeofconduct
 [4]: https://www.djangoproject.com/conduct/reporting/
 [5]: https://www.python.org/community/diversity/
 [6]: http://www.ubuntu.com/about/about-ubuntu/conduct
 [7]: http://geekfeminism.org/about/code-of-conduct/
 [8]: http://geekfeminism.wikia.com/wiki/Code_of_conduct_evaluations

 Best

 On Sat, Aug 15, 2015 at 2:53 AM David Gerard dger...@gmail.com wrote:

 On 14 August 2015 at 22:45, Matthew Flaschen mflasc...@wikimedia.org
 wrote:
  On 08/12/2015 06:41 PM, David Gerard wrote:
  On 12 August 2015 at 23:00, Matthew Flaschen mflasc...@wikimedia.org
  wrote:

  Enforcement is still to-be-determined.

  This does need to be sorted out ahead of time.

  See my proposal at
 
 https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Draft#First_line_of_enforcement
  .  There are some details to be refined, but I like having a single
 initial
  point of contact.



 +1 - this looks a good start. Having *something* that can deal with
 the cases you can hardly believe and yet not fall apart at the
 social-mechanism gaming that wikicranks are so good at is an excellent
 start.




Re: [Wikitech-l] Code of conduct

2015-08-13 Thread rupert THURNER
On Aug 13, 2015 10:16 PM, Oliver Keyes oke...@wikimedia.org wrote:

 On 13 August 2015 at 16:10, Antoine Musso hashar+...@free.fr wrote:
  Le 07/08/2015 02:17, Matthew Flaschen a écrit :
  We're in the process of developing a code of conduct for technical
  spaces.  This will be binding, and apply to all Wikimedia-related
  technical spaces (including but not limited to MediaWiki.org,
  Phabricator, Gerrit, technical IRC channels, and Etherpad).
 
  Please participate at
 
https://www.mediawiki.org/wiki/Code_of_conduct_for_technical_spaces/Draft .
   Suggestions are welcome here or at
 
https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Draft
  .
 
  Hello Matt,
 
  It seems the code of conduct is fairly similar to the friendly space
  policy. Though the later was meant for conferences, it can probably be
  amended to be applied to cyberspace interactions.
 
  https://wikimediafoundation.org/wiki/Friendly_space_policy
 
  Do we have any examples of unfriendly behaviour that occurred recently?
 

 The thread you are replying to contains both examples of unfriendly
 behaviour in a technical context and discussion over the direct
 applicability of the friendly spaces policy; reviewing it may be a
 good idea.

Oliver,  I must be a little blind but I do not see examples of unfriendly
behaviour in this thread.

In general, Matt, I do experience that the wikimedia movement is
criticized for having too many rules and policies. Adding another one does
not help. At the end of the day your target group is code contributors, not
policy readers. If somebody does not behave and does not contribute, the
person is easily shut up. If somebody contributes a lot, some diplomacy is
required. What you do here is, imho, an example of an organization busy
with itself. I won't be angry if you stop this thread and delete the wiki
page. Let me add, I really appreciate and find very valuable all the other
technical contributions and discussions. And Matt, of course I appreciate
that you know what you are talking about, being a software and Wikipedia
content contributor.

Best,
Rupert

Re: [Wikitech-l] [Wmfall] VisualEditor on Wikipedia now faster with RESTBase

2015-03-25 Thread rupert THURNER
Fantastic, many thanks to all involved!!
On Mar 20, 2015 1:15 AM, Gabriel Wicke gwi...@wikimedia.org wrote:

 On Thu, Mar 19, 2015 at 4:50 PM, Jared Zimmerman 
 jared.zimmer...@wikimedia.org wrote:

  https://en.wikipedia.org/wiki/Barack_Obama?veaction=edit just loaded in
 2
  seconds.
 


 Much of this is also owed to *a lot* of optimization work in VisualEditor
 over the last months. Plenty of ingenuity and hard work by the entire
 VisualEditor team and Ori went into making this possible.

 Cheers!

 Gabriel

Re: [Wikitech-l] Wikimedia REST content API is now available in beta

2015-03-15 Thread rupert THURNER
Hi Gabriel,  I am so glad to read about this excellent achievement! Is the
wikitext the original wikitext wich could be changed and saved back? And
the difference is a real difference which would allow kind of patrol
applications?

Rupert
On Mar 10, 2015 11:23 PM, Gabriel Wicke gwi...@wikimedia.org wrote:

 Hello all,

 I am happy to announce the beta release of the Wikimedia REST Content API
 at

 https://rest.wikimedia.org/

 Each domain has its own API documentation, which is auto-generated from
 Swagger API specs. For example, here is the link for the English Wikipedia:

 https://rest.wikimedia.org/en.wikipedia.org/v1/?doc

 At present, this API provides convenient and low-latency access to article
 HTML, page metadata and content conversions between HTML and wikitext.
 After extensive testing we are confident that these endpoints are ready for
 production use, but have marked them as 'unstable' until we have also
 validated this with production users. You can start writing applications
 that depend on it now, if you aren't afraid of possible minor changes
 before transitioning to 'stable' status. For the definition of the terms
 ‘stable’ and ‘unstable’ see https://www.mediawiki.org/wiki/API_versioning
 .
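For example, fetching article HTML is a single GET against a per-domain path. A small sketch of building such a URL; the /page/html/{title} route is assumed from the announced API layout, and the per-domain ?doc sandbox is the authoritative reference:

```python
from urllib.parse import quote

BASE = "https://rest.wikimedia.org"

def page_html_url(domain: str, title: str) -> str:
    """Build a page-HTML endpoint URL. The /page/html/{title} path is an
    assumption based on the announcement; check the per-domain ?doc
    sandbox for the authoritative routes."""
    # Percent-encode the title fully, including spaces and slashes.
    return "%s/%s/v1/page/html/%s" % (BASE, domain, quote(title, safe=""))

print(page_html_url("en.wikipedia.org", "Barack Obama"))
# https://rest.wikimedia.org/en.wikipedia.org/v1/page/html/Barack%20Obama
```

The same pattern extends to the other announced endpoints (metadata, HTML-to-wikitext conversion) by swapping the path segment after /v1/.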

 While general and not specific to VisualEditor, the selection of endpoints
 reflects this release's focus on speeding up VisualEditor. By storing
 private Parsoid round-trip information separately, we were able to reduce
 the HTML size by about 40%. This in turn reduces network transfer and
 processing times, which will make loading and saving with VisualEditor
 faster. We are also switching from a cache to actual storage, which will
 eliminate slow VisualEditor loads caused by cache misses. Other users of
 Parsoid HTML like Flow, HTML dumps, the OCG PDF renderer or Content
 translation will benefit similarly.

 But, we are not done yet. In the medium term, we plan to further reduce the
 HTML size by separating out all read-write metadata. This should allow us
 to use Parsoid HTML with its semantic markup
 https://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec directly for
 both views and editing without increasing the HTML size over the current
 output. Combined with performance work in VisualEditor, this has the
 potential to make switching to visual editing instantaneous and free of any
 scrolling.

 We are also investigating a sub-page-level edit API for micro-contributions
 and very fast VisualEditor saves. HTML saves don't necessarily have to wait
 for the page to re-render from wikitext, which means that we can
 potentially make them faster than wikitext saves. For this to work we'll
 need to minimize network transfer and processing time on both client and
 server.

 More generally, this API is intended to be the beginning of a multi-purpose
 content API. Its implementation (RESTBase
 http://www.mediawiki.org/wiki/RESTBase) is driven by a declarative
 Swagger API specification, which helps to make it straightforward to extend
 the API with new entry points. The same API spec is also used to
 auto-generate the aforementioned sandbox environment, complete with handy
 try it buttons. So, please give it a try and let us know what you think!

 This API is currently unmetered; we recommend that users not perform more
 than 200 requests per second and may implement limitations if necessary.
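A client respecting that guideline could throttle itself with a token bucket; an illustrative sketch (the class and its parameters are my own, not anything Wikimedia provides):

```python
import time

class TokenBucket:
    """Client-side throttle for the suggested 200 req/s guideline.
    Illustrative sketch; the API and numbers are invented for this example."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def try_acquire(self) -> bool:
        """Spend one token if available; refill based on elapsed time."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Allow sustained 200 req/s with small bursts; callers back off on False.
bucket = TokenBucket(rate=200, capacity=10)
```

Calling try_acquire before each request keeps a well-behaved client under the recommended rate without any server-side coordination.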

 I also want to use this opportunity to thank all contributors who made this
 possible:

 - Marko Obrovac, Eric Evans, James Douglas and Hardik Juneja on the
 Services team worked hard to build RESTBase, and to make it as extensible
 and clean as it is now.

 - Filippo Giunchedi, Alex Kosiaris, Andrew Otto, Faidon Liambotis, Rob
 Halsell and Mark Bergsma helped to procure and set up the Cassandra storage
 cluster backing this API.

 - The Parsoid team with Subbu Sastry, Arlo Breault, C. Scott Ananian and
 Marc Ordinas i Llopis is solving the extremely difficult task of converting
 between wikitext and HTML, and built a new API that lets us retrieve and
 pass in metadata separately.

 - On the MediaWiki core team, Brad Jorsch quickly created a minimal
 authorization API that will let us support private wikis, and Aaron Schulz,
 Alex Monk and Ori Livneh built and extended the VirtualRestService that
 lets VisualEditor and MediaWiki in general easily access external services.

 We welcome your feedback here:
 https://www.mediawiki.org/wiki/Talk:RESTBase
 - and in Phabricator
 
 https://phabricator.wikimedia.org/maniphest/task/create/?projects=RESTBasetitle=Feedback
 :
 .

 Sincerely --

 Gabriel Wicke

 Principal Software Engineer, Wikimedia Foundation

Re: [Wikitech-l] Idea for new desktop / mobile kiwix like application

2015-01-25 Thread rupert THURNER
Petr, do you think it would be an option to use git version control as a
storage format instead of OpenZIM? That would make it easier to edit pages
and merge changes back.

Rupert
On Jan 23, 2015 11:59 AM, Petr Bena benap...@gmail.com wrote:

 Hi,

 I know most of you hate reinventing a wheel so I first send it here,
 before I launch that project :)

 Some of you probably know Kiwix (kiwix.org), which is an offline Wikipedia
 reader. I think the idea of this reader is cool; most of you have probably
 sometimes wanted to access Wikipedia while being offline somewhere, but
 couldn't. Kiwix can help with this, however it has one big problem, and the
 solution for it is so complex that it would basically need a rewrite of the
 whole thing.

 That problem is that you need to download a pretty huge file (40+ GB) in
 order to use it for the English Wikipedia, for example. And if you want to
 update just the few wiki pages you are interested in to the latest
 revision, you again need to download that huge file.

 That sucks, especially with GPRS internet and similar connectivity, and it
 also sucks because mobile phones don't even have space for that much data.
 My idea is to create an app similar to Kiwix that would use an SQLite DB
 and, using the Wikipedia API, (slowly, Apache-friendly) download the
 contents of any MediaWiki installation based on user selection, so that
 you could download just one page for offline reading, or one category, or
 1000 categories, or precompiled sets of pages created by users (books).
 You could easily update these to the latest version using the API anytime,
 and you could get media files for these pages, etc., etc. (You could
 probably even edit the pages offline and then upload the changes when you
 are online, but that is just an extra feature.)
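As a rough illustration of the storage side of this idea, here is a minimal Python sketch using the standard-library sqlite3 module. The schema and function names are hypothetical, and actually fetching content and revision IDs from the MediaWiki API is left out; only the "store locally, re-download when stale" logic is shown.

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS page (
    title TEXT PRIMARY KEY,  -- page title, unique per wiki
    revid INTEGER NOT NULL,  -- revision currently stored offline
    html  TEXT NOT NULL      -- rendered content for the reader
);
"""

def store_page(db, title, revid, html):
    # Insert a new page, or replace it when updating to a newer revision.
    db.execute("INSERT OR REPLACE INTO page (title, revid, html) VALUES (?, ?, ?)",
               (title, revid, html))

def needs_update(db, title, latest_revid):
    # Compare the locally stored revision against the latest one reported
    # by the API to decide which pages need re-downloading.
    row = db.execute("SELECT revid FROM page WHERE title = ?", (title,)).fetchone()
    return row is None or row[0] < latest_revid
```

The point of the revision column is that an update pass only re-downloads pages whose stored revision lags behind, instead of re-fetching a monolithic dump.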

 I think this approach would work much better, and it's sad that Kiwix
 doesn't already support it. At some point, if it works, I think this new
 code could be merged back into Kiwix; I am going to use C++ in the end,
 which Kiwix uses as well.

 What do you think about it; is it worth working on? Is there actually a
 community of offline Wikipedia readers that would appreciate it?

 Thanks


Re: [Wikitech-l] Idea for new desktop / mobile kiwix like application

2015-01-25 Thread rupert THURNER
The storage format is very efficient and there is a C library for it:
https://libgit2.github.com
It should not be necessary to create complex versioning around it.
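The efficiency claim comes from git's content-addressed object store: a blob is identified by the SHA-1 of a short header plus the raw content, and loose objects are zlib-compressed on disk, so identical page revisions deduplicate for free. A small Python sketch of git's documented blob format (for illustration only; a real app would use libgit2 rather than reimplement this):

```python
import hashlib
import zlib

def git_blob_id(content: bytes) -> str:
    # Git object IDs are the SHA-1 of "blob <size>\0" plus the raw bytes;
    # two identical revisions therefore map to one stored object.
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

def pack(content: bytes) -> bytes:
    # On disk, loose objects are the zlib-compressed header + content.
    header = b"blob %d\x00" % len(content)
    return zlib.compress(header + content)
```

The well-known ID of the empty blob (e69de29...) falls straight out of this construction, which is a handy sanity check for the format.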

Do you plan to store HTML or wikitext?

Rupert
On Jan 25, 2015 6:37 PM, Petr Bena benap...@gmail.com wrote:

 I don't really know; it is technically possible, but probably not
 suitable. I don't want to create an offline wiki, just a reader of a
 wiki, so no complex versioning is required for that.

 On Sun, Jan 25, 2015 at 6:00 PM, rupert THURNER
 rupert.thur...@gmail.com wrote:
  Petr, do you think it would be an option to   use git version control as
 a
  storage format instead of openzim? Which would facilitate edit and merge
  back changes?
 
  Rupert

Re: [Wikitech-l] Which old RFCs to discuss on next week's RFC chat

2014-12-09 Thread rupert THURNER
I'm not sure who is allowed to comment here, so I hope that mentioning a +1
for the ones which I understand sufficiently and like is okay.

On Dec 4, 2014 10:41 PM, Daniel Kinzler dan...@brightbyte.de wrote:

 Hi all!
...
 * https://www.mediawiki.org/wiki/Requests_for_comment/Itemise_protection

 This argues that we should support multiple protections to apply to a
page at
 once, e.g. indefinite semi-protection and at the same time a short-term
full
 protection.

 I'd personally like to discuss this as part of a larger refactoring that
would
 implement protection based on our permission system. Basically, applying
 protection would mean overriding which group has which permissions on a
given page.
This makes a lot of sense, IMO.

 *
https://www.mediawiki.org/wiki/Requests_for_comment/Regex-based_blacklist

 A proposal to overhaul SpamBlacklist (from 2008). I'd personally be more
 interested in integrating this with (a rewrite of) AbuseFilter. We could
have
 multiple lists, accessible from AbuseFilter rules.
IMO, not obsolete...


 There are also some RFCs that relate to organizational issues rather than
 MediaWiki features and architecture as such:

 *
https://www.mediawiki.org/wiki/Requests_for_comment/Release_notes_automation

 Automatically compose RELEASE-NOTES based on special lines in the git
commit
 message. I like the idea!
Me too :-)

Rupert

[Wikitech-l] Fwd: [African Wikimedians] Afripédia Douala

2014-12-03 Thread rupert THURNER
FYI, some feedback from Douala, Cameroon, concerning uploading photos to Commons.


-- Forwarded message --
From: Kasper Souren kasper.sou...@gmail.com
Date: Wed, Dec 3, 2014 at 12:01 AM
Subject: Re: [African Wikimedians] Afripédia Douala
To: Mailing list for African Wikimedians
african-wikimedi...@lists.wikimedia.org



On Tuesday, December 2, 2014, Florence Devouard anth...@anthere.org wrote:

 I wanted to outline that both Kumusha Takes Wiki and Wiki Loves Africa are 
 being conducted in English and French

Great!


 https://commons.wikimedia.org/wiki/Commons:Wiki_Loves_Africa


I see the contest is over now, will there be another one coming up?




 I was quite disappointed by the limited participation of Cameroon in the
 photo contest. Given the effort already put into that country to train
 editors and to promote the project, I expected more input.


While trying to upload some pictures to
https://commons.wikimedia.org/wiki/Category:Afripedia_Douala I'm
starting to understand at least one part of the issue. Internet
connections are really bad. Very high ping times, both at the French
institute as well as in the hotel I'm staying in now, which I can't
easily consider cheap (at least in terms of pricing).

Is there a robust way to upload pictures to Commons over bad internet
connections?

Facebook and G+ Android apps have done a fairly good job at uploading
pictures automatically, but now I still need to first download them to
my laptop and then upload them through the Upload Wizard, which is
failing me. Using the Commons Android app is not a good alternative
from the hotel connection because it wants me to enter a user/pass
combination too often on my phone.
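MediaWiki's API does support chunked uploads (action=upload against the upload stash, with an offset), which addresses exactly this situation: over a flaky link, only a failed chunk has to be resent, not the whole file. Here is a minimal, library-free Python sketch of the client-side chunking and retry logic; the `send` callback is a hypothetical stand-in for the actual API request.

```python
def iter_chunks(data, chunk_size):
    # Yield (offset, chunk) pairs covering the whole payload.
    for offset in range(0, len(data), chunk_size):
        yield offset, data[offset:offset + chunk_size]

def upload_in_chunks(data, chunk_size, send, max_attempts=5):
    """Send each chunk, retrying it individually; `send(offset, chunk)`
    stands in for one API call (e.g. action=upload with stash and offset)."""
    for offset, chunk in iter_chunks(data, chunk_size):
        for attempt in range(max_attempts):
            try:
                send(offset, chunk)
                break
            except IOError:
                if attempt == max_attempts - 1:
                    raise  # give up on this chunk after repeated failures
```

On a high-latency connection the win is that each network failure costs at most one chunk's worth of retransmission.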

Cheers,
Kasper


___
African-Wikimedians mailing list
african-wikimedi...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/african-wikimedians


Re: [Wikitech-l] Simplest UI for creating editing Phabricator tasks

2014-08-28 Thread rupert THURNER
Am 27.08.2014 17:02 schrieb Quim Gil q...@wikimedia.org:

 On Wednesday, August 27, 2014, rupert THURNER rupert.thur...@gmail.com
 wrote:

   Is there a way to simply order tasks so one can retrieve a list, where
   the top of the list gets fixed first?
 

 There are workboards for projects, which are a bit more complex/flexible
 than this. See for instance the workboard of the Wikimedia Phabricator Day
 1 project:

 http://fab.wmflabs.org/project/board/31/

 Is this what you are looking for? You can also create lists of tasks
 through search queries i.e. open tasks assigned to qgil sorted by
priority:
 http://fab.wmflabs.org/maniphest/query/j.RJXIy2lbry

Yes, exactly. Atlassian's Jira, for example, uses an additional number field
to allow reordering tasks via drag and drop. The task list can then be
displayed in this order in other views and searches.

 You can try more searches at
 http://fab.wmflabs.org/maniphest/query/advanced/
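A common way to implement such an order field without rewriting every row on each drag-and-drop is to keep the keys sparse and assign a dropped task the midpoint of its neighbours. A small Python sketch of that idea; all names are illustrative, not Jira's or Phabricator's actual schema.

```python
def rank_between(prev, nxt):
    """Order key for a task dropped between two neighbours.
    Sparse keys mean a move updates only the moved task's row."""
    if prev is None:
        prev = 0.0
    if nxt is None:
        nxt = prev + 1024.0   # append at the end with a wide gap
    return (prev + nxt) / 2.0

def renumber(ranks, gap=1024.0):
    # When repeated midpoints make keys too dense, renumber the list once,
    # restoring wide gaps between consecutive tasks.
    return [gap * (i + 1) for i in range(len(ranks))]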

Re: [Wikitech-l] Simplest UI for creating editing Phabricator tasks

2014-08-27 Thread rupert THURNER
Is there a way to simply order tasks so one can retrieve a list, where the
top of the list gets fixed first?

rupert
 Am 27.08.2014 09:18 schrieb Quim Gil q...@wikimedia.org:

 Hi,

 Users of the future Wikimedia Phabricator will have a very simple and
 straightforward interface to create and edit tasks. Bugzilla's UI was one
 of the top complaints from new/casual users, and we can fix this in
 Phabricator easily. Currently we are testing this setup in fab.wmflabs.org
 ,
 where you can see an example:

 http://fab.wmflabs.org/maniphest/task/create/ (imagine that the Points
 field is not there).

 Advanced users needing to prioritize tasks, assign them to someone, and
 resolve them can join the Triagers team in order to get the additional
 permissions:

 http://fab.wmflabs.org/project/view/74/

 We are still fine tuning this process at http://fab.wmflabs.org/T64 --
 your
 feedback is welcome.


 --
 Quim Gil
 Engineering Community Manager @ Wikimedia Foundation
 http://www.mediawiki.org/wiki/User:Qgil

Re: [Wikitech-l] Future platforms, devices, and consumers

2014-08-16 Thread rupert THURNER
IMO this is an interesting question, but too far-reaching. Simpler to
understand and keep in mind are minimum display size, resolution, and the
viewer and editing applications. With that, it automatically restricts
itself to a couple of operating systems, browsers and native apps.

rupert
Am 15.08.2014 23:21 schrieb Pine W wiki.p...@gmail.com:

 Following up on Lila's Wikimania keynote: what platforms and devices should
 we have in mind when making decisions today or in the near future about
 Wikimedia content creation and delivery?

 *Digital eyewear?

 *Smart watches?

 *3D displays?

 *Large format displays?

 *Health monitoring devices?

 *Smart homes and buildings?

 *Computer-led education systems in classrooms and remote learning?

 *Driverless or semi-driverless cars?

 *GPS-enabled devices of all sizes?

 *Artificially intelligent consumers of Wikimedia content?

 I would be interested in hearing others' thoughts.

 Thanks,

 Pine

Re: [Wikitech-l] Feature request.

2014-08-06 Thread rupert THURNER
On Tue, Aug 5, 2014 at 10:38 PM, Quim Gil q...@wikimedia.org wrote:
 On Mon, Aug 4, 2014 at 9:50 PM, Pine W wiki.p...@gmail.com wrote:

 I am asking Quim to provide us an update.


 Me? :) I'm just an editor who, like many of you, has suffered this problem
 occasionally.

 On Mon, Aug 4, 2014 at 10:02 AM, rupert THURNER rupert.thur...@gmail.com
 wrote:

  That would be a hilarious feature! Which is, btw, available in some other
  open-source and proprietary wikis.


 TWiki is an open source wiki and also has (had?) a concept of blocking a
 page while someone else is editing. This feature might sound less than
 ideal in the context of, say, Wikipedia when a new Pope is being nominated,
 but I can see how many editors and MediaWiki admins have missed such a
 feature at some point.

 If I understood correctly, VisualEditor already represents an improvement
 vs Wikitext because the chances of triggering conflicting edits are
 smaller, because of the way the actual content is modified and updated in
 every edit.

I'd have strong doubts here, from a technical standpoint :)

 Rupert, in any case you see that the trend is going in the direction of
 being more efficient handling concurrent edits. Blocking pages while
 another editor supposedly is working on them might work in e.g. corporate
 wikis where most of the time the Edit link is clicked for a reason, but it
 could be potentially counterproductive in sites like Wikipedia.


I can only 100% agree, Quim, and I am glad you helped clarify the feature
request in this direction. The suggestion is to notify or show, not to
block: if a user presses edit, the page shows the other people who pressed
edit in, say, the last 15 minutes and did not save yet. It sounds simple to
implement, but would be a big benefit, especially with many users. An
additional goodie could be to show it in the edit window of the other users
as well, so they can quickly save. If a user ignores the notification and
is quicker in saving, we have the current situation. Additionally, these
"in work" templates would be made superfluous in many cases.
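A minimal sketch of the notify-don't-block bookkeeping suggested above, in Python. Everything here (the class and method names, the 15-minute window, the in-memory dict) is illustrative, not MediaWiki code; a real implementation would live server-side with persistent or cached state.

```python
import time

class EditIntents:
    """Tracks who pressed 'edit' recently so the edit page can say
    'user X opened this page for editing and has not saved yet'."""
    def __init__(self, window=15 * 60, clock=time.monotonic):
        self.window = window          # 15 minutes, as suggested above
        self.clock = clock            # injectable for testing
        self.intents = {}             # (page, user) -> timestamp

    def start_editing(self, page, user):
        self.intents[(page, user)] = self.clock()

    def saved(self, page, user):
        # A successful save clears the intent; stale intents simply expire.
        self.intents.pop((page, user), None)

    def concurrent_editors(self, page, user):
        # Everyone else who pressed edit on this page within the window.
        cutoff = self.clock() - self.window
        return sorted(u for (p, u), t in self.intents.items()
                      if p == page and u != user and t >= cutoff)
```

Because intents merely expire rather than lock anything, a user who ignores the notification and saves first just recreates today's behaviour, exactly as described.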

rupert


Re: [Wikitech-l] Feature request.

2014-08-04 Thread rupert THURNER
That would be a hilarious feature! Which is, btw, available in some other
open-source and proprietary wikis.

rupert
Am 04.08.2014 11:47 schrieb Nkansah Rexford nkansahrexf...@gmail.com:

 Hi all,

 I'm Rexford, and I'm posting here for the first time. Please let me know
 if this isn't the right place to suggest features. I was encouraged to
 request features here. Please correct me if I'm wrong.

 It's about edit conflicts on Wikipedia and other projects. It happens when
 I get into an article to edit, but before I can save, someone else goes
 into the article, edits and saves. It happens to me and many others out
 there.

 Sometimes many minutes' worth of changes can be lost.

 The feature request is this: When a person starts editing an article, and
 another person tries to edit that same article, he or she gets a message on
 screen that the article is already engaged. This suggestion is similar to
 how WordPress informs the second person who tries to edit a page whiles
 someone else is already editing.

 It's likely one wouldn't like to edit a page when he or she knows someone
 is in it editing. I think it's much better that way than allowing multiple
 edits on the page but letting only one person's edit go in per save.

 Thanks.

 rexford | google.com/+Nkansahrexford | sent from smartphone

[Wikitech-l] flow allows to mail change?

2014-07-15 Thread rupert THURNER
hi

Does Flow allow mailing the added text, and does it allow replying via mail?

rupert

Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread rupert THURNER
The GLAMwiki Toolset might be an option as well.
http://m.mediawiki.org/wiki/Extension:GWToolset/Technical_Design

rupert
 Am 07.07.2014 17:03 schrieb Yury Katkov katkov.ju...@gmail.com:

 It's a nice one, thanks! I will need to add just a little bit to it to suit
 my needs!

 -
 Yury Katkov


 On Mon, Jul 7, 2014 at 4:42 PM, Florian Schmidt 
 florian.schmidt.wel...@t-online.de wrote:

  Hello!
 
  I think that is the right one for you:
  https://m.mediawiki.org/wiki/Manual:ImportImages.php
 
  Simply upload the images to the server, for example via FTP, and run the
  script as explained on the manual page.
 
  Kind regards
  Florian
 
  Gesendet mit meinem HTC
 
  - Reply message -
  Von: Yury Katkov katkov.ju...@gmail.com
  An: Wikimedia developers wikitech-l@lists.wikimedia.org
  Betreff: [Wikitech-l] tool for quickly uploading 10000 images with
  description
  Datum: Mo., Juli 7, 2014 16:36
 
  Hi everyone!
 
  Does anyone know about a tool that can help to upload a lot of files and
  create the page for every file with a given description? I'd say that it
  should be a maintenance script, since for some reason the API upload works
  pretty slowly. I saw the UploadLocal extension, but it's too manual and it
  doesn't work well when the number of files to upload is very large.
 
  Cheers,
  -
  Yury Katkov

[Wikitech-l] Fwd: [Wikimedia-GH] Wiki Loves Earth Begins!

2014-05-01 Thread rupert THURNER
hi,

Is there a possibility to get a banner on enwp for Ghana for Wiki Loves
Earth, as this is this year's main contest there?
https://commons.wikimedia.org/wiki/Commons:Wiki_Loves_Earth_2014_in_Ghana

rupert


-- Forwarded message --
From: Enock Seth Nyamador kwadzo...@gmail.com
Date: Thu, May 1, 2014 at 11:37 AM
Subject: Re: [Wikimedia-GH] Wiki Loves Earth Begins!
To: Planning Wikimedia Ghana Chapter wikimedia...@lists.wikimedia.org


Here is our poster:


Regards,

Enock
enwp.org/User:Enock4seth


On Thu, May 1, 2014 at 1:47 AM, Enock Seth Nyamador kwadzo...@gmail.com wrote:

 Hi All,

 Wiki Loves Earth has started, you can now upload your photos, here.

 FYI, anyone reading Wikipedia and Wikimedia Commons from Ghana (specifically
 Ghanaian IPs), whether logged in or not, will see the image below:


 Regards,

 Enock
 enwp.org/User:Enock4seth



___
Wikimedia-GH mailing list
wikimedia...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikimedia-gh


Re: [Wikitech-l] Forget mailing lists and on-wiki discussions; Twitter's the place!

2014-04-07 Thread rupert THURNER
Am 07.04.2014 01:20 schrieb Steven Walling steven.wall...@gmail.com:

 On Sun, Apr 6, 2014 at 4:05 PM, Tomasz W. Kozlowski
 tom...@twkozlowski.netwrote:

 1. I am deeply uncomfortable with the fact that you are choosing un-free
  fonts over free ones.
  2. I am deeply uncomfortable with the fact that you decided not to
respect
  the consensus /not/ to choose non-free fonts -- such as Arial and
Helvetica
  --
  over free fonts; a discussion which I only read, but which, as far as I
  remember, saw participation from yourself, Quim, Greg, and some other
  people.
 

 We've tried the alternative and it's untenable according to the feedback
 we're getting. I wish it wasn't. I'd rather put free fonts first in the
 stack, if they actually work for users. Twice now we've tried putting
 different freely-licensed fonts first. Both times, Windows users who had
 them have told us they either merely disliked them or they have caused
 unacceptably poor rendering, particularly for those without font
smoothing.
 There simply is no widely-available font that meets all our needs while
 also being freely-licensed. The compromise is either to deliver a
 freely-licensed webfont to all users (which we're not going to do right
 now, though it's the ideal IMO) or to specify the best fonts users already
 have on their system free or not, which accomplish the consistency and
 legibility we're looking for. This is just the reality. Whether or not the
 CSS/LESS declares them explicitly or not, non-free fonts are what most
 users have already and want to use, because they actually work. This is
 true whether we set a more specific stack than sans-serif or not.


  As for your suggestion that I'm only looking to make a fuss, here's some
  basic facts for you to ponder.
 
  A. /I/ pointed it out to Greg and to you on IRC that deploying
Typography
  Refresh to all wikis on the same day (March 28) was a bad idea, and
that it
  would be better to roll it out with MediaWiki 1.23wmf21, as it would
give
  time to inform the community (as well as to push some last-minute
fixes).
 

 Delaying release to anticipate bugs that have not yet been reported by
 anyone makes no sense. At the time of release there were only four bugs
 open related to VectorBeta as an extension, none of which could have told
 us about the issue. How could last-minute fixes be pushed for a bug no one
 had actually reported?

Steven, given there was a font stack which worked fine for years, I am a
little puzzled how you can break it and then, listening to people on
Twitter telling you it is broken now, argue that using non-free fonts is
the solution, instead of reverting the change or fixing it properly. I
mean, Windows XP is not new, and turning off ClearType is common. I do it
myself, because the Microsoft standard fonts look dizzy at small font
sizes on Windows 7.

Rupert

Re: [Wikitech-l] MediaWiki, Cookies and EU Privacy Policy 95/46/EG

2014-03-11 Thread rupert THURNER
Am 10.03.2014 17:01 schrieb Manuel Schneider 
manuel.schnei...@wikimedia.ch:

 Am 10.03.2014 16:54, schrieb Chris Steipp:
  1) catch the click on the Login link to show a banner first to ask
for
  the users consent, on acceptance forward the user to the login page
 
  2) modify the login process to set the cookie after the actual login
and
  put an additional text on the login page like by logging in I accept
  the usage of cookies by this website

  The cookie on the login page is for the anti-csrf (and captcha if
needed)
  validation, so getting rid of it would be problematic from a technical
  perspective (or would require a second click on the login page).

 Thanks Chris for this comment.

 So that leaves us with option 1) - a javascript banner. I think that
 shouldn't be too hard to implement.

 A div which hovers over the Wiki page, the text, two buttons [accept]
 / [leave]. Accept points to Special:Userlogin, leave just closes the
banner.
 A javascript that shows this div onclick() on the Login link, if no
 cookie has already been set by the Wiki.

 Maybe even a LocalSettings.php variable $wgApproveCookies = true; that
 is true by default and allows admins of internal company wikis etc. to
 disable that banner.

 As an option we could even add another setting $wgApproveCookiesAlways,
 which makes the same div to show up as soon as a user enters the wiki.
 That way we can support admins that have further extensions installed in
 their wiki which add cookies right away - like Google Analytics.


Is there any technical argument against this proposal?

Rupert

Re: [Wikitech-l] Multimedia team architecture update

2014-03-07 Thread rupert THURNER
+1
If this helps to get it on the list
Am 07.03.2014 12:28 schrieb Gerard Meijssen gerard.meijs...@gmail.com:

 Hoi,

 When Commons gets the Wikidata treatment, almost everything that has to do
 with metadata will gain Wikidata statements on the Wikidata item that
 reflects a media file (sound, photo, movie, no matter which). When items
 refer to Creators or Institutions, they will refer to Wikidata proper.

 This will replace much if not most of how information is stored about media
 files.

 Pretty much almost everything will be impacted and that is what I expect to
 be reflected in considerations now because the alternative is that much of
 it will need to be revisited on a massive scale.
 Thanks,
   GerardM


 On 7 March 2014 12:19, Andre Klapper aklap...@wikimedia.org wrote:

  On Fri, 2014-03-07 at 11:50 +0100, Gerard Meijssen wrote:
   On the Wikidata roadmap it says that Commons will be targeted for
  inclusion
   in the second half of 2014. This will have a big impact on Commons.
   Consequently it will have a big impact on the things that you are
   discussing. Chances are that much of what you come up with now will be
   obsolete in a few months time or even worse make the development of the
   inclusion of Wikidata into Commons even harder.
  
   I find it odd that Wikidata is not mentioned at all in this overview.
 
  Please elaborate where / in which specific areas you would have expected
  to see Wikidata being mentioned in Gergo's overview, as I cannot
  interpret the consequently in your statement yet.
 
  andre
  --
  Andre Klapper | Wikimedia Bugwrangler
  http://blogs.gnome.org/aklapper/
 
 

Re: [Wikitech-l] [Design] Should MediaWiki CSS prefer non-free fonts?

2014-02-17 Thread rupert THURNER
On Tue, Feb 18, 2014 at 4:19 AM, Steven Walling swall...@wikimedia.orgwrote:

 On Mon, Feb 17, 2014 at 4:58 PM, Erik Moeller e...@wikimedia.org wrote:

  So, what would be the downside of listing a font like Arimo for
  sans-serif and Libertine for serif first in the stack? While not
  affecting the reader experience for a significant number of users, it
  would still be a symbolic expression of a preference for freely
  licensed fonts, and a conscious choice of a beautiful font for readers
  that have installed it.
 

 We basically tried the equivalent of this (placing relatively free fonts
 unknown on most platforms first) which Kaldari talked about previously.
 Ultimately that kind of declaration is useless for the vast majority of
 users and we got very specific negative feedback about it on the Talk page.
 These fonts are ignored by most systems when placed first or when placed
 later in the stack. Systems match the first font they recognize, so using
 something they don't recognize or putting it later is a largely just
 feel-good measure.

 The whole Arimo/Arial conundrum is largely a matter of the fact that
 Windows users simply do not have a Helvetica-like font available on most
 versions which is better than Arial, warts and all. Again, the best
 solution is to deliver a webfont, which most people with good design sense
 are doing these days, and we can't yet.


Would you be so kind as to invest a little of your precious time and make
this story easy to read and digest for the many people on this list? You
might add verifiable links for what you say, and explain in a manner that
somebody normally technically gifted can follow: why we cannot deliver
webfonts, why we need the change at all (i.e. who the target is), why free
fonts like Ubuntu are not good enough, what you tried, and the feedback on
the talk pages. I am already ashamed that I am asking this now for the
second or third time ... and I still try to do it in a nice and welcoming
way, not shouting or swearing ...

rupert.

Re: [Wikitech-l] [Design] Should MediaWiki CSS prefer non-free fonts?

2014-02-16 Thread rupert THURNER
hi Steven, Ryan,

Thank you so much for jumping in here. Could you please elaborate a little,
in a more structured way, on:

1. Why is a change needed?
2. What are the problems with webfonts?
3. Why is Ubuntu (or any other free font) not good enough?
4. Why is there no budget to solve it properly, if so many are concerned?
5. What are your design goals?
6. Who are the designers?

References to some free fonts:
* https://en.wikipedia.org/wiki/Ubuntu_(typeface)
* https://en.wikipedia.org/wiki/Palatino (urw palladio l and descendants)
* https://en.wikipedia.org/wiki/OpenDyslexic

best regards,

rupert



On Sun, Feb 16, 2014 at 11:07 AM, Federico Leva (Nemo)
nemow...@gmail.comwrote:

 Ryan Kaldari, 16/02/2014 06:54:

  Now that I've blamed everyone except for myself, I would like to suggest
 that we stop pointing fingers and get down to brass tacks.


 Brad's email was a bit caustic but IMHO it wasn't pointing fingers, unlike
 yours (though you helpfully pointed fingers towards everyone). ;-)



 My question for both the designers and the free font advocates is: Are
 there any free fonts that are...
 1. widely installed (at least on Linux systems)
 2. easily readable and not distractingly ugly
 3. would not be mapped to by the existing stack anyway (i.e. are not
 simply clones or substitutes for popular commercial fonts)


 I'm sorry but this question to the free font advocates does not make
 sense and I refuse to accept it, for two reasons:
 1) is not a given or an immutable law of physics, it's the designers' job
 to assess: if you really care for a specific font you serve it; if you
 don't want to serve fonts, then design must adapt to availability and not
 the opposite;
 2) is again the designers' job, I have no idea how one assesses easily
 readable* and I'd like us to banish personal opinions including adjectives
 like strange or ugly from any and all design decision;** moreover, if
 feedback had ever been desired on font choices, we would have a document
 explaining what this mythical style desired by the designers actually is,
 other than the superlunar ideal no human MediaWiki commentator can sense
 and comment.

 So again, I'm waiting for documentation. Whoever refrains from publishing
 documentation, research, design documents etc. as soon as they are produced
 prevents iterations and feedback from happening and hence takes full
 personal responsibility for the outcome of the process, begging to be
 personally blamed.

 Nemo

 (*) In my very biased and personal experience as a reader of Latin-alphabet
 languages, "readable" equals serif, so that I can tell "I" from "l"
 etc., and DejaVu Serif is the most beautiful font ever because it covers so
 many characters.
 (**) I'm really hearing them too often. They are suppressors of
 discussion/rational discourse and polarise discussions unnecessarily. Cf. 
 https://en.wikipedia.org/wiki/WP:IDONTLIKEIT.


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The Zürich Hackathon and you

2014-02-07 Thread rupert THURNER
Now in cc
Am 07.02.2014 10:26 schrieb Željko Filipin zfili...@wikimedia.org:

 On Fri, Feb 7, 2014 at 1:17 AM, Quim Gil q...@wikimedia.org wrote:

  you can ask any questions about the event to Manuel, CCed.


 It could be just my mail client (gmail) but I do not see anybody in cc.

 Željko
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's improve our password policy

2014-01-25 Thread rupert THURNER
hi steven,

thanks for this proposal. what i run into consistently, and have for years,
is not being logged in when i want to be. i'd really appreciate it if this
were shown clearly on all wikis; i can never remember which ones indicate it
and which ones don't. mediawiki.org indicates it, btw ... and i was trying to
comment there while not logged in :)

for the password policy: displaying a strength indicator is great. anything
more? i would say just leave it to the user.
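
a strength indicator of the kind the RFC proposes could be as simple as this sketch (the scoring rules here are illustrative assumptions, not the RFC's actual proposal):

```python
import string

def password_strength(password: str) -> str:
    """Rough heuristic: score one point per character class present,
    plus up to three points for length."""
    classes = [
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in string.punctuation for c in password),
    ]
    score = sum(classes) + min(len(password) // 4, 3)
    if score <= 2:
        return "weak"
    if score <= 4:
        return "medium"
    return "strong"

print(password_strength("wiki"))            # prints: weak
print(password_strength("Wiki.2014.pass"))  # prints: strong
```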

rupert.



On Fri, Jan 24, 2014 at 8:50 PM, Steven Walling steven.wall...@gmail.com wrote:

 Hi everyone,

 For some time now we've had two Requests for Comment floating around
 related to passwords, neither of them making much progress.

 One is the older password strength RFC which proposed creating a module
 to tell users about the strength of their passwords. The second, Password
 requirements, had some discussion but wasn't reaching consensus and
 implementation.

 After proposing it about a month ago, I've merged these two RFCs and
 refactored them into
 https://www.mediawiki.org/wiki/Requests_for_comment/Passwords, partially
 based on feedback from Chris Steipp.

 Please comment. I've tried to sharpen the proposals down into one thing we
 can do _right now_ which will do the most good for the most users. However,
 there are several other viable ideas which merit discussion and example
 implementations.

 Thanks!

 Steven
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikidata-l] Italian Wikipedia complements search results with Wikidata based functionality

2013-12-02 Thread rupert THURNER
Hi gerard, that sounds really exciting! Is it necessary to change a setting
to see this behaviour? If not, I'd appreciate it if you could give an example
of where one can best see it.

Rupert
Am 02.12.2013 18:50 schrieb Gerard Meijssen gerard.meijs...@gmail.com:

 Hoi,

 The Italian Wikipedia is the first project where people who use search
 will find results added from Wikidata. As you may know, Wikidata has
 more items with a label in a given language than a Wikipedia has articles. With
 the Wikidata based functionality people will gain several functionalities
 that are new to them

- a link to Commons categories for a subject
- a link to Wikipedia articles in other languages
- a link to the Wikidata item
- visualisation care of the Reasonator

 When there are multiple items found in the search request, disambiguation
 will be provided based on the statements available on the items. Obviously
 as more labels are available in a language for statements, the experience
 will improve.

 This is a really exciting new development and I want to thank Magnus and
 Nemo for making it possible. I hope and expect that many Wikipedias will
 follow the example of the Italian Wikipedia. Particularly the smaller
 Wikipedias have much to gain from this new functionality.
 Thanks,
   GerardM

 ___
 Wikidata-l mailing list
 wikidat...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community

2013-11-10 Thread rupert THURNER
On Wed, Nov 6, 2013 at 2:57 AM, Erik Moeller e...@wikimedia.org wrote:
...
 in March 2011 and June 2011, Brion Vibber, Mark Bergsma and Tim
 Starling were announced as Lead Software Architect, Lead Operations
 Architect and Lead Platform Architect of the Wikimedia Foundation,
 respectively.

 At WMF, this has increasingly raised the question how the architecture
 of Wikimedia’s technical infrastructure can be evolved at this new,
 larger scale, and how we can bring more voices into that conversation.
 I've shared this note with the architects ahead of time and taken some
 initial feedback into account.

 So how should this role evolve going forward? Some possible paths (you
 know I like to present options ;-):
...

 Option D: We come up with some kind of open process for
 designating/confirming folks as architects, according to some
 well-defined criteria (including minimum participation in the RFC
 process, well-defined domain expertise in certain areas, a track
 record of constructive engagement, etc.).

besides being technically very capable, mark, brion, and tim are
really nice people on a personal level. they stay out of political
discussions, are not arrogant, always concentrate on helping things
evolve technically, and do not shy away from dirty work. as imo people
tend to attract their likes, i would also see an option where each of
them is allowed to choose the person who helps in their respective
domain.

but, if you think it's better to take option D, and mark, brion, and
tim think this helps, i am all for it.

rupert.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 2013 Datacenter RFP - open for submissions

2013-10-18 Thread rupert THURNER
Hi, would you be open to a data center outside the US in the future, and if
not, why not?

Rupert
Am 18.10.2013 22:05 schrieb Ken Snider ksni...@wikimedia.org:

 The Wikimedia Foundation's Technical Operations team is seeking proposals
 on the provisioning of a new data-centre facility.

 After working through the specifics internally, we now have a public RFP
 posted[1] and ready for proposals. We invite any organization meeting the
 requirements outlined to submit a proposal for review.

 Most of the relevant details are in the document itself, but feel free to
 reach out to myself or the list should anyone have any questions.

 Please, feel free to forward this link far and wide - have colleagues,
 contacts or friends in the data-centre sector? Then please, forward it on!
 :)

 Thanks!

 --Ken.

 [1] https://wikimediafoundation.org/wiki/RFP/2013_Datacenter


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GitHub-based triangular workflow

2013-09-15 Thread rupert THURNER
Am 15.09.2013 17:13 schrieb Merlijn van Deen valhall...@arctus.nl:
...
 As you may be aware, the git-review based developer experience on Windows
 is less than perfect - especially compared to the old TortoiseSVN based
 workflow.

Merlijn, as i'm not contributing code here but am very interested in
suitable git workflows, i am a little shy to ask: what is the main
disadvantage compared to contributing on linux, and what's the main
disadvantage compared to subversion?

Rupert
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikimediaMobile] [Wiki Loves Monuments] WLM app: create account?

2013-08-28 Thread rupert THURNER
hi,

yuvi said he is not able to add account creation to the wlm mobile app
because the mw api is not usable. there is a bug, filed in march and now
approaching 6 months of age:
https://bugzilla.wikimedia.org/show_bug.cgi?id=46072
with priority high, which means according to andre klapper:
https://www.mediawiki.org/wiki/Bugzilla/Fields#Priority

Not the next task, but should be fixed soon. Depending on team's 
manpower this can take between one and six months.

who needs to do what to get this fixed?
(sorry for crossposting to wikitech, as i understood this is not a
mobile problem ...)

rupert.

On Sun, Sep 2, 2012 at 2:57 AM, Tomasz Finc tf...@wikimedia.org wrote:
 Sadly not for this contest. The API to create accounts never reached
 enough maturity while this app was in development.

 Background info here : http://www.mediawiki.org/wiki/User:Akshay.agarwal

 That's why we don't require it to save images for later upload. I agree
 that this would be great to have in the future.

 --tomasz


 On Sat, Sep 1, 2012 at 1:41 PM, Cristian Consonni
 kikkocrist...@gmail.com wrote:
 2012/9/1 rupert THURNER rupert.thur...@gmail.com:
 hi philip,

 would it be possible to add an account creation screen to the wlm mobile 
 app?

 I posted the same request here:
 http://www.mediawiki.org/wiki/Wiki_Loves_Monuments_mobile_application/Feedback#Registration

 a couple of days ago.

 Cristian

 ___
 Mobile-l mailing list
 mobil...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mobile-l

 ___
 Mobile-l mailing list
 mobil...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mobile-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikimediaMobile] Wikimedia Commons mobile photo uploader app updated on iOS and Android

2013-08-25 Thread rupert THURNER
hi brion,

thank you so much for that! where is the source code? i tried to
search for commons on https://git.wikimedia.org/. i wanted to check
whether there really is no account creation on the login screen, or
whether it is just my phone that does not display one, and which URL
the application connects to (https, i hoped). i was also quite puzzled
that wikipedia zero might not include traffic coming from apps.

rupert.


On Tue, Aug 20, 2013 at 10:57 PM, Brion Vibber bvib...@wikimedia.org wrote:
 We have just released Commons for iOS (version 1.0.8) and Android
 (1.0beta11), with *major* UI and performance improvements on iOS and minor
 bug fixes on Android.

 This is our first release in a couple months on iOS -- we hoped to have one
 out before Wikimania but were delayed due to problems with Apple's online
 developer tools being offline for a while. There are huge UI improvements
 and lots of bug fixes, thanks to our new mobile developer Monte Hurd. Thanks
 Monte!

 We've been releasing smaller updates on Android in the meantime, so the
 updates have been more incremental there.

 Both versions now include a quick 3-screen acceptable-use tutorial on first
 login. iOS includes featured photos as examples as well; this will come to
 Android in a future version.

 Downloads and release notes:
 * Apple App Store:
 https://itunes.apple.com/us/app/wikimedia-commons/id630901780
 * Google Play:
 https://play.google.com/store/apps/details?id=org.wikimedia.commons
 * Android direct download:
 http://download.wikimedia.org/android/wikimedia-commons-1.0beta11.apk

 -- brion vibber (brion @ pobox.com / bvibber @ wikimedia.org)

 ___
 Mobile-l mailing list
 mobil...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mobile-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikimediaMobile] Wikimedia Commons mobile photo uploader app updated on iOS and Android

2013-08-25 Thread rupert THURNER
On Sun, Aug 25, 2013 at 7:46 PM, Yuvi Panda yuvipa...@gmail.com wrote:
 Hey rupert!

 On Sun, Aug 25, 2013 at 10:21 PM, rupert THURNER
 rupert.thur...@gmail.com wrote:
 hi brion,

 thank you so much for that! where is the source code? i tried to
 search for commons on https://git.wikimedia.org/. i wanted to look

 Android: https://git.wikimedia.org/summary/apps%2Fandroid%2Fcommons.git
 iOS: github.com/wikimedia/Commons-iOS

 if there is really no account creation at the login screen or it is
 just my phone which does not display one, and which URL the aplication

 Mediawiki doesn't have API support for creating accounts, and hence
 the apps don't have create account support yet.

created https://bugzilla.wikimedia.org/show_bug.cgi?id=53328; maybe
you could detail a little more what this api should look like?

rupert.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikimedia's anti-surveillance plans: site hardening

2013-08-17 Thread rupert THURNER
On Sat, Aug 17, 2013 at 12:47 PM, Faidon Liambotis fai...@wikimedia.org wrote:
 On Fri, Aug 16, 2013 at 08:04:24PM -0400, Zack Weinberg wrote:

 Hi, I'm a grad student at CMU studying network security in general and
 censorship / surveillance resistance in particular. I also used to work for
 Mozilla, some of you may remember me in that capacity. My friend Sumana
 Harihareswara asked me to comment on Wikimedia's plans for hardening the
 encyclopedia against state surveillance.
 snip


 First of all, thanks for your input. It's much appreciated. As I'm sure
 Sumanah has already mentioned, all of our infrastructure is being developed
 in the open using free software and we'd be also very happy to accept
 contributions in code/infrastructure-as-code as well.

hi faidon, i do not think you personally and WMF are particularly
helpful in accepting contributions, because you:
* do not communicate the problems openly
* do not report upstream publicly
* do not ask for help, and even if it gets offered you just ignore it
with quite some arrogance

let me give you an example as well. git.wikimedia.org broke, and you,
faidon, did _absolutely nothing_ to give good feedback to upstream to
improve the gitblit software. you and your colleagues did, though,
adjust robots.txt to reduce the traffic arriving at git.wikimedia.org,
which, in my opinion, pays only half of the rent. see
* our bug: https://bugzilla.wikimedia.org/show_bug.cgi?id=51769,
which includes details on how to take a stack trace
* upstream bug:
https://code.google.com/p/gitblit/issues/detail?id=294, no stack trace
reported

 That being said, literally everything in your mail has been already
 considered and discussed multiple times :), plus a few others you didn't
 mention (GCM ciphers, OCSP stapling, SNI & split certificates, short-lived
 certificates, ECDSA certificates).  A few have been discussed on wikitech,
 others are under internal discussion & investigation by some of us with
 findings to be posted here too when we have something concrete.

 I don't mean this to sound rude, but I think you may be oversimplifying the
 situation quite a bit.

 Is dedicating (finite) engineering time to write the necessary code for
 e.g. gdnsd to support DNSSEC, just to be able to support DANE for
 which there's exactly ZERO browser support, while at the same time
 breaking a significant chunk of users, a sensible thing to do?

i don't mean this to sound rude either, but you give me the impression
that you handle the https and dns case similarly to the gitblit case:
you tried some approaches, and give me the impression that you think
only inside your wmf box. i'd really appreciate some love towards other
projects here, and getting things fixed at the source as well, in the
mid term (i.e. months, one or two years).

rupert

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] trace gitblit, was: Re: Wikimedia's anti-surveillance plans: site hardening

2013-08-17 Thread rupert THURNER
On Sat, Aug 17, 2013 at 8:40 PM, Ken Snider ksni...@wikimedia.org wrote:

 On Aug 17, 2013, at 1:33 PM, rupert THURNER rupert.thur...@gmail.com wrote:

 hi faidon, i do not think you personally and WMF are particularly
 helpful in accepting contributions. because you:
 * do not communicate openly the problems
 * do not report upstream publically
 * do not ask for help, and even if it gets offered you just ignore it
 with quite some arrogance

 Rupert, please don't call out or attack specific people. We're all on the 
 same team, and I can
...
let me change the title, as this is not site hardening any more.

 Further, Ops in general, and Faidon in particular, routinely report issues 
 upstream. Our recent bug reports or patches to Varnish and Ceph are two 
 examples that easily come to mind. Faidon was (rightly) attempting to restore 
 service first ...

yes ken, you are right, let's stick to the issues at hand:
(1) when will you finally decide to invest the 10 minutes and
properly trace the gitblit application? you have the commands in the
ticket:
https://bugzilla.wikimedia.org/show_bug.cgi?id=51769

(2) when will you adjust your operating guidelines so that it is clear
to faidon, ariel, and others that 10 minutes of tracing an application
to get a holistic view is mandatory _before_ restoring a service, if it
goes down this often and for days every time? 10 extra minutes will not
be noticed when the service is gone for more than a day.

(3) how will you handle offers of help from the community in the future?
in the gitblit case, i offered to help trace the problem while
the service was down. max semenik has now reported that gitblit should
set rel=nofollow on its links.
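
for illustration, rel=nofollow on repository links would look roughly like this (hypothetical markup, not gitblit's actual output):

```html
<!-- rel="nofollow" hints to well-behaved crawlers not to follow the
     link, keeping them away from expensive per-revision pages -->
<a href="/commit/abc123" rel="nofollow">abc123</a>
```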

best regards, rupert.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] trace gitblit, was: Re: Wikimedia's anti-surveillance plans: site hardening

2013-08-17 Thread rupert THURNER
On Sat, Aug 17, 2013 at 10:48 PM, bawolff bawolff...@gmail.com wrote:
 yes ken, you are right, lets stick to the issues at hand:
 (1) by when you will finally decide to invest the 10 minutes and
 properly trace the gitblit application? you have the commands in the
 ticket:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=51769

 (2) by when you will adjust your operating guideline, so it is clear
 to faidon, ariel and others that 10 minutes tracing of an application
 and getting a holistic view is mandatory _before_ restoring the
 service, if it goes down for so often, and for days every time. the 10
 minutes more can not be noticed if it is gone for more than a day.


 What information are you hoping to get from a trace that isn't currently 
 known?
if a web application dies or stops responding, this can be (1) caused
by too many requests for the hardware it runs on, which can be
influenced from outside the app by robots.txt, caching, etc., and
inside the app by links, e.g. using nofollow. but it can also be (2)
caused by the application itself: a java application uses more or fewer
operating system resources depending on how it is written. one might
find this out just by reading the code, but having a trace helps a lot
here. a trace may reveal locking problems in multi-threaded code,
string operations causing OS calls for every character, excessive
object creation and garbage collection, and hundreds of other things.
it is not necessary to wait until it stalls again to get a trace; many
things can be seen during normal operation as well.

so i hope to get (2). (1) was handled ok in my opinion.

rupert

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] git.wikimedia.org dead?

2013-08-12 Thread rupert THURNER
On Mon, Aug 12, 2013 at 6:27 AM, rupert THURNER
rupert.thur...@gmail.com wrote:
 faidon, can you attach the trace to the bugzilla ticket, please?

 On Mon, Aug 12, 2013 at 3:58 AM, Faidon Liambotis fai...@wikimedia.org 
 wrote:
 Hi,


 On Sun, Aug 11, 2013 at 12:51:15PM +0200, rupert THURNER wrote:

 As chad points out, its being served now


 it's plural (robots.txt)


 many thanks for getting it up quickly last time! unfortunately
 https://git.wikimedia.org is unresponsive again.


 Thanks for the report! I just restarted it again. Root cause was the same,
 unfortunately it's not just zip files that kill it; googlebot asking for
 every file/revision is more than enough.

 Until we have a better solution (and monitoring!) in place, I changed
 robots.txt to Disallow /. This means no search indexing for now,
 unfortunately.
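
(the stopgap described above amounts to a robots.txt that blocks all crawling, i.e. roughly:

```text
User-agent: *
Disallow: /
```

which asks every compliant crawler to stay away from the whole site.)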

it's dead again. would somebody be so kind as to trace this? as stated in
https://bugzilla.wikimedia.org/show_bug.cgi?id=51769 one might do,
_before_ restarting it:

* jps -l to find the process id
* strace to see if it excessively calls into the operating system
* jstack p, or kill -QUIT p, to print the stack traces
* jmap -heap p to find memory usage
* jmap -histo:live p | head to find excessively used classes
* if you have a ui, you might try jconsole or http://visualvm.java.net as well

as i already asked a couple of mails earlier, i'd volunteer to do it as well.
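
wrapped up, the sequence above might look like this small script, shown here in dry-run form (it only prints the commands, so it needs nothing beyond a POSIX shell; drop the echo prefixes and supply a real pid to run it against a live gitblit, assuming a HotSpot JDK with jps/jstack/jmap on the PATH):

```shell
#!/bin/sh
# Dry-run sketch of the diagnostic sequence: print each command that
# would be run against the given JVM process id.
trace_jvm() {
    pid="$1"
    echo "strace -c -p $pid"             # count OS calls (Linux)
    echo "jstack $pid"                   # thread dump: locks, busy threads
    echo "jmap -heap $pid"               # heap usage summary
    echo "jmap -histo:live $pid | head"  # most-instantiated live classes
}
trace_jvm 12345
```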

rupert

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] git.wikimedia.org dead?

2013-08-12 Thread rupert THURNER
On Tue, Aug 13, 2013 at 12:12 AM, Chad innocentkil...@gmail.com wrote:
 On Mon, Aug 12, 2013 at 3:11 PM, rupert THURNER 
 rupert.thur...@gmail.comwrote:

 On Mon, Aug 12, 2013 at 6:27 AM, rupert THURNER
 rupert.thur...@gmail.com wrote:
  faidon, can you attach the trace to the bugzilla ticket, please?
 
  On Mon, Aug 12, 2013 at 3:58 AM, Faidon Liambotis fai...@wikimedia.org
 wrote:
  Hi,
 
 
  On Sun, Aug 11, 2013 at 12:51:15PM +0200, rupert THURNER wrote:
 
  As chad points out, its being served now
 
 
  it's plural (robots.txt)
 
 
  many thanks for getting it up quickly last time! unfortunately
  https://git.wikimedia.org is unresponsive again.
 
 
  Thanks for the report! I just restarted it again. Root cause was the
 same,
  unfortunately it's not just zip files that kill it; googlebot asking for
  every file/revision is more than enough.
 
  Until we have a better solution (and monitoring!) in place, I changed
  robots.txt to Disallow /. This means no search indexing for now,
  unfortunately.

 its dead again. would be somebody so kind to trace this? as stated in
 https://bugzilla.wikimedia.org/show_bug.cgi?id=51769 one might do,
 _before_ restarting it:

 * jps -l to find out the process id
 * strace to see if it excessively calls into the operating system
 * jstack
 * kill -QUIT p to print the stacktrace
 * jmap -heap p to find memory usage
 * jmap -histo:live p | head to find excessively used classes
 * if you have ui, you might try jconsole or http://visualvm.java.net as
 well

 as i already asked a couple of mails earlier, i d volunteer to do it as
 well.


 None of this trace info would be useful. We know what's killing it. The
 fix for disallowing all indexing wasn't puppetized, so puppet reverted it.

 https://gerrit.wikimedia.org/r/#/c/78919/

chad, could you please take the traces anyway, so we can have a look at
them? an internet-facing app should _not_ die like this ...

rupert.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] git.wikimedia.org dead?

2013-08-11 Thread rupert THURNER
On Sun, Aug 11, 2013 at 4:27 AM, Jeremy Baron jer...@tuxmachine.com wrote:
 On Sun, Aug 11, 2013 at 2:25 AM, K. Peachey p858sn...@gmail.com wrote:
 On Sun, Aug 11, 2013 at 10:44 AM, Leslie Carr lc...@wikimedia.org wrote:
 looks like the robots.txt isn't being served - so googlebot is
 grabbing things from the zip files
 client denied by server configuration: /var/www/robots.txt

 sadly too jetlagged to keep looking at this :(

 make sure you look at robot.txt and not Robot.txt,

 As chad points out, its being served now

 it's plural (robots.txt)

many thanks for getting it up quickly last time! unfortunately
https://git.wikimedia.org is unresponsive again.

rupert

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] git.wikimedia.org dead?

2013-08-11 Thread rupert THURNER
faidon, can you attach the trace to the bugzilla ticket, please?

On Mon, Aug 12, 2013 at 3:58 AM, Faidon Liambotis fai...@wikimedia.org wrote:
 Hi,


 On Sun, Aug 11, 2013 at 12:51:15PM +0200, rupert THURNER wrote:

 As chad points out, its being served now


 it's plural (robots.txt)


 many thanks for getting it up quickly last time! unfortunately
 https://git.wikimedia.org is unresponsive again.


 Thanks for the report! I just restarted it again. Root cause was the same,
 unfortunately it's not just zip files that kill it; googlebot asking for
 every file/revision is more than enough.

 Until we have a better solution (and monitoring!) in place, I changed
 robots.txt to Disallow /. This means no search indexing for now,
 unfortunately.

 Faidon


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] git.wikimedia.org dead?

2013-08-10 Thread rupert THURNER
hi,

https://git.wikimedia.org/ seems to be dead.

rupert.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] git.wikimedia.org dead?

2013-08-10 Thread rupert THURNER
On Sat, Aug 10, 2013 at 4:49 PM, MZMcBride z...@mzmcbride.com wrote:
 rupert THURNER wrote:
https://git.wikimedia.org/ seems to be dead.

 Yup. It keeps happening: https://bugzilla.wikimedia.org/51769.

would it be possible to help debug this?

rupert.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VE: why editing a paragraph opens the whole page?

2013-08-03 Thread rupert THURNER
On Sat, Aug 3, 2013 at 3:56 AM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
 On 08/02/2013 07:43 AM, rupert THURNER wrote:
 hi,

 in visual editor, would it be possible to edit only a paragraph, when one
 clicks the edit link on a paragraph? if not, why not? currently an a decent
 laptop, clicking the edit link on whatever page or section takes at least
 4 seconds. this is unexpectedly slow.

 Section editing is bug
 https://bugzilla.wikimedia.org/show_bug.cgi?id=48429 .  My understanding
 is that it's on their road map, but down the road.

i tried it at https://en.wikipedia.org/wiki/Jos%C3%A9_Mourinho, and it
takes 6 clicks + 5 pgdn + 75 seconds compared to 3 clicks + 1 pgdn + 14
secs.
* 1 click and 15 secs to edit
* 1 click to make a note go away (see attachment at the bug)
* 1 click to edit the summary
* 1 click and 20 secs to review changes
* 1 click to return to the save form
* 1 click and 40 secs to save
* 5 pgdn to go to the section just edited

section edit with the text editor takes:
* 1 click and 2 secs to open
* 1 click and 2 secs to preview
* 1 pg-down to find the save button (which is imo unnecessary; it should
be at the top as well)
* 1 click and 10 secs to save

rupert.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VE: why editing a paragraph opens the whole page?

2013-08-03 Thread rupert THURNER
I tried the example from the bug report on the VE talk page; it did not
render properly. It is only hypothetical, isn't it?

Am 03.08.2013 17:31 schrieb Tyler Romeo tylerro...@gmail.com:

 I think we can agree that VE has some performance considerations, but if
 you take a look at the bug report, it's explained why it would be so
 incredibly difficult to implement section editing.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2016
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com


 On Sat, Aug 3, 2013 at 4:12 AM, rupert THURNER rupert.thur...@gmail.com
wrote:

  On Sat, Aug 3, 2013 at 3:56 AM, Matthew Flaschen
  mflasc...@wikimedia.org wrote:
   On 08/02/2013 07:43 AM, rupert THURNER wrote:
   hi,
  
   in visual editor, would it be possible to edit only a paragraph, when
  one
   clicks the edit link on a paragraph? if not, why not? currently on a
  decent
   laptop, clicking the edit link on whatever page or section takes at
  least
   4 seconds. this is unexpectedly slow.
  
   Section editing is bug
   https://bugzilla.wikimedia.org/show_bug.cgi?id=48429 .  My
understanding
   is that it's on their road map, but down the road.
 
  i tried it at https://en.wikipedia.org/wiki/Jos%C3%A9_Mourinho, and it
  is 6 clicks + 5 pgdn + 75 seconds compared to 3 clicks + 1 pgdn + 14
  secs.
  * 1 click and 15 secs to edit
  * 1 click to make go a away a note (see attachment at the bug)
  * 1 click to edit summary
  * 1 click and 20 secs to review changes
  * 1 click to return to save form
  * 1 click and 40 secs to save
  * 5* pg down to go to the section just edited
 
  section edit with the text editor takes:
  * 1 click and 2 secs to open
  * 1 click and 2 secs to preview
  * 1 pg-down to find the save button (which is imo unnecessary, should
  be on top as well)
  * 1 click and 10 secs to save
 
  rupert.
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] VE: why editing a paragraph opens the whole page?

2013-08-02 Thread rupert THURNER
hi,

in visual editor, would it be possible to edit only a paragraph, when one
clicks the edit link on a paragraph? if not, why not? currently on a decent
laptop, clicking the edit link on whatever page or section takes at least
4 seconds. this is unexpectedly slow.

i tried to create a bug to discuss splitting VE up to edit only parts of a
page, see below. andre klapper suggested that this ideally be broken up into
requirements that are easier to implement, and that i should post this to
wikitext-l. as i did not see any discussion about VE going on there, i tried
here. please forward this to the appropriate channel if i did not get it
right again.

rupert


-- Forwarded message --
From: bugzilla-dae...@wikimedia.org
Date: Fri, Aug 2, 2013 at 11:03 AM
Subject: [Bug 52380] split up VE into components, clickable via links where
it is applicable
To: rupert.thur...@gmail.com


Andre Klapper aklap...@wikimedia.org changed bug 52380
(https://bugzilla.wikimedia.org/show_bug.cgi?id=52380):

What        Removed     Added
Status      REOPENED    RESOLVED
Resolution  ---         WONTFIX

Comment #5 on bug 52380 (https://bugzilla.wikimedia.org/show_bug.cgi?id=52380)
from Andre Klapper (aklap...@wikimedia.org):

Please discuss such huge design decision suggestions first with developers on a
mailing list, like http://lists.wikimedia.org/pipermail/wikitext-l/ , to break
them down into manageable subtasks. Even if this was a valid request, it's
pretty unhandable to define when this good be fixed.
I mark your proposed solution again as WONTFIX, as bug reports should be about
problems instead. Please leave it like that as the solution proposed here is
not planned to be implemented by developers like that.

[Wikitech-l] kraken available?

2013-06-02 Thread rupert THURNER
hi,

magnus mentioned on the cultural partners list that there should be a
non-overloaded alternative (kraken) to stats.grok.se to provide tracking
for baglama and other view trackers.

when will this be available?

rupert.


Re: [Wikitech-l] Support for multiple content languages in MW core

2013-04-26 Thread rupert THURNER
many thanks for this proposal, erik! what i would love to see considered in
this context as well is native language. to give an example:

i speak english and german. therefore i like to read content in the original
version, as long as it is available in one of these languages. e.g. WMCH's
bylaws exist in 5 languages, and the authoritative version is german, as the
association is registered in zürich. most other texts on our wiki are
written in english. so i'd love to get those pages in english, and the
bylaws in german.

rupert
On 24.04.2013 at 05:30, Erik Moeller e...@wikimedia.org wrote:

 Hi folks,

 I'd like to start a broader conversation about language support in MW
 core, and the potential need to re-think some pretty fundamental
 design decisions in MediaWiki if we want to move past the point of
 diminishing returns in some language-related improvements.

 In a nutshell, is it time to make MW aware of multiple content
 languages in a single wiki? If so, how would we go about it?

 Hypothesis: Because support for multiple languages existing in a
 single wiki is mostly handled through JS hacks, templates, and manual
 markup added to the content (such as divs indicating language
 direction), we are providing an opaque, confusing and often
 inconsistent user experience in our multilingual wikis, which is a
 major impediment for growth of non-English content in those wikis, and
 participation by contributors who are not English speakers.

 Categories have long been called out as one of the biggest factors,
 and they certainly are (since Commons categories are largely in
 English, they are by definition excluding folks who don't speak the
 language), but I'd like to focus on the non-category parts of the
 problem for the purposes of this conversation.

 Support for the hypothesis (please correct misconceptions or errors):

 1) There's no consistent method by which multiple language editions of
 the same page are surfaced for selection by the user. Different wikis
 use different templates (often multiple variants and layouts in a
 single wiki), different positioning, different rules, etc., leading to
 inconsistent user experience. Consistency is offered by language
 headers generated by the Translate extension, but these are used for
 managing translations, while multilingual content existing in the same
 wiki may often not take the form of 1:1 translations.

 Moreover, language headers have to be manually updated/maintained,
 consider the user-friendliness of something like the +/- link in the
 language header on a page like
 https://commons.wikimedia.org/wiki/Commons:Kooperationen
 which leads to:

 https://commons.wikimedia.org/w/index.php?title=Template:Lang-Partnerships&action=edit

 Chances are that a lot of people who'd have the ability to provide a
 version (not necessarily a translation) of the page in a given
 language will give up even on the process of doing so correctly.

 2) There's no consistent method by which page name conflicts (which
 may often occur in similar languages) are resolved, and users have to
 manually disambiguate.

 3) There are basic UX issues in the language selection tools offered
 today. For example, after changing the language on Commons to German,
 I will see the page I'm on (say English) with a German user interface,
 even if there's an actual German content version of the page
 available. This is because these language selection tools have no
 awareness of the existence of content in relevant languages.

 4) In order to ensure that content is rendered correctly irrespective
 of the UI language set, we require content authors to manually add
 divs around RTL content, even if that's all the page contains.

 5) It's impossible to restrict searches to a specific language. It's
 impossible to restrict recent changes and similar tools to a specific
 language.

 I'll stop there - I'm sure you can think of other issues with the
 current approach. For third party users, the effort of replicating
 something like the semi-acceptable Commons or Meta user experience is
 pretty significant, as well, due to the large number of templates and
 local hacks employed.

 This is a very tricky set of architectural issues to solve well, and
 it would be easy to make the user experience worse by solving it
 poorly. Still, as we grow our bench strength to take on hard problems,
 I want to raise the temperature of this problem a bit again,
 especially from the standpoint of future platform engineering
 improvements.

 Would it make sense to add a language property to pages, so it can be
 used to solve a lot of the above issues, and provide appropriate and
 consistent user experience built on them? (Keeping in mind that some
 pages would be multilingual and would need to be identified as such.)
 If so, this seems like a major architectural undertaking that should
 only be taken on as a partnership between domain experts (site and
 platform architecture, language engineering, Visual Editor/Parsoid,
 etc.).

 

Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-13 Thread rupert THURNER
On Thu, Mar 14, 2013 at 12:07 AM, Christian Aistleitner
christ...@quelltextlich.at wrote:
 Hi,

 On Tue, Mar 12, 2013 at 08:28:25PM -0700, Rob Lanphier wrote:
 The Bugzilla-based solution has some of the advantages of the
 MediaWiki-based solution.  We may be able to implement it more quickly
 than something native to Gerrit because we're already working on
 Bugzilla integration, and we get features like queries for free, as
 well as the minor convenience of not having to have a new database
 table or two to manage.

 The problem at this point is that the gerrit-plugin interface is
 rather new, and that shows at various ends:
 * It's not possible to add GUI elements from a plugin.
 * Plugins cannot add their own database tables through gerrit.
 [...]

 So whatever we could possibly get into upstream gerrit, we should
 really try to get into upstream and not put into the plugin.

 But thinking about whether gerrit or bugzilla would be the correct
 place to store those tags ...
 It seems to me that the tags are not really tied to issues or changes,
 but rather to commits...
 Wouldn't git notes be a good match [1]?

+1
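
for reference, attaching a tag to a commit via git notes could look roughly
like this (a minimal sketch: the note namespace "review-tags" and the note
text are made-up examples, not an agreed convention):

```shell
# sketch: attach a tag-like note to a commit and read it back.
# the namespace "review-tags" and tag text below are assumptions.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m "initial commit"
# notes live outside the commit objects, so they can be added after review
git notes --ref=review-tags add -m "tag: needs-backport" HEAD
git notes --ref=review-tags show HEAD
```

notes are stored under refs/notes/ and can be pushed and fetched separately
from the commits themselves, which is what makes them attractive for
after-the-fact tagging.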

rupert


Re: [Wikitech-l] Page view stats we can believe in

2013-02-13 Thread rupert THURNER
according to http://stats.grok.se/da.d/latest90/mandag, the page mandag has
been viewed 127 times in the last 3 months, and ranks 927th. the raw
pagecount files are here:
http://dumps.wikimedia.org/other/pagecounts-raw/

i then took an arbitrary file and looked into it, at midnight (UTC, i
guess), feb 1st. as all projects are in this file, let's grep for danish
wiktionary, da.d, at the beginning of the line:

grep '^da\.d\s' pagecounts-20130201-00  | wc
    569    2276   19572

this means 569 pages were accessed at least once in this hour. so let's
sort by the third column, which holds the page accesses. the largest
counts are at the bottom, so let's take the last 20 lines:

grep '^da\.d\s' pagecounts-20130201-00  | sort -k3n,3 | tail -20
da.d pony 2 30008
da.d skak 2 44151
da.d Speciel:Eksporter/engelsk 2 7818
da.d Speciel:Eksporter/hyle 2 4630
da.d Speciel:Eksporter/krog 2 4632
da.d Speciel:Eksporter/skaml%C3%A6ber 2 4632
da.d Forside 3 96050
da.d horse 3 54974
da.d interessant 3 9339
da.d Speciel:Eksporter/arrang%C3%B8rer 3 6948
da.d Speciel:Eksporter/b%C3%B8ger 3 6948
da.d Speciel:Eksporter/forg%C3%A6ves 3 6946
da.d Speciel:Eksporter/hensigtsm%C3%A6ssig 3 6946
da.d Speciel:Eksporter/hvad 3 9900
da.d Speciel:Eksporter/indvendig 3 6948
da.d Speciel:Eksporter/k%C3%A6le 3 6948
da.d Speciel:Eksporter/monogame 3 6944
da.d Speciel:Eksporter/revet 3 6946
da.d Speciel:Eksporter/topstykke 3 6944
da.d springer 3 45292

this means that e.g. springer was supposedly accessed 3 times in that
hour. the article does not exist, but there is a red link to it from
http://da.wiktionary.org/wiki/Wiktionary:Top_1_(Dansk).
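
for anyone reproducing this: each line in the raw files is project, page
title, hourly view count, and bytes transferred. a self-contained sketch
(the sample lines below are illustrative, not real data):

```shell
# sketch: sum hourly views for the danish wiktionary ("da.d") from a
# pagecounts-style file; sample data is made up for illustration
cat > sample-pagecounts <<'EOF'
da.d Forside 3 96050
da.d springer 3 45292
en Main_Page 1000 12345678
EOF
awk '$1 == "da.d" { total += $3; pages++ }
     END { printf "%d pages, %d views\n", pages, total }' sample-pagecounts
# prints: 2 pages, 6 views
```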

rupert.

On Wed, Feb 13, 2013 at 10:18 PM, Lars Aronsson l...@aronsson.se wrote:
 I stumbled on the Danish Wiktionary, of all projects.
 Danish is the 68th biggest language of Wiktionary, and
 has a little more than 8,000 articles in total.
 Most of these articles are very short and provide no
 value to a reader. There is no reason to link to them,
 and so very unlikely that the next user should stumble
 upon them unless they are me.

 Yet, wikistats tries to make me believe that this tiny
 project has 400,000 or 500,000 page views each month,
 and has had so for a long time,
 http://stats.wikimedia.org/wiktionary/EN/TablesPageViewsMonthly.htm

 (I'm not talking about January 2012, which seems to have
 been an error, and reports 2-3 times that many views.)

 My guess is that da.wiktionary has 4,000 page views per
 month, not 400,000. It's more likely that 400,000 is
 some background noise, an offset number that should be
 subtracted from the number of page views for any project.

 If you look at the log files for just one day, you should
 see my IP address (85.228.something) and 3-4 other users
 who have been editing lately, and not many more people,
 but perhaps a bunch of interwiki bots.

 We need an explanation to these vastly inflated page view
 statistics.


 --
   Lars Aronsson (l...@aronsson.se)
   Aronsson Datateknik - http://aronsson.se





Re: [Wikitech-l] Fwd: IRC office hours with the Editor Engagement Experiments team

2013-02-05 Thread rupert THURNER
On Mon, Feb 4, 2013 at 10:54 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
 On 02/04/2013 03:58 PM, Steven Walling wrote:
 Sorry for cross-posting, but for MediaWiki developers, this is a good
 opportunity to ask any questions you might have about the newly-released
 Extension:GuidedTour, and how to leverage it to build any tours yourself.

 Specifically, any extension with a UI can include a tour with full
 internationalization support.

 I'd be glad to help you get started.

do you have a link to a demo?

rupert


Re: [Wikitech-l] Fwd: RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-03 Thread rupert THURNER
is the reason to use two fields instead of one that it is easier to
implement, or that it is more performant?

On Sun, Feb 3, 2013 at 1:08 AM, David Schoonover d...@wikimedia.org wrote:
 Huh! News to me as well. I definitely agree with that decision. Thanks, Ori!

 I've already written the Varnish code for setting X-MF-Mode so it can be
 captured by varnishncsa. Is there agreement to switch to Mobile-Mode, or at
 least, MF-Mode?

 Looking especially to hear from Arthur and Matt.

 --
 David Schoonover
 d...@wikimedia.org


 On Sat, Feb 2, 2013 at 2:16 PM, Diederik van Liere
 dvanli...@wikimedia.orgwrote:

 Thanks Ori, I was not aware of this
 D

 Sent from my iPhone

 On 2013-02-02, at 16:55, Ori Livneh o...@wikimedia.org wrote:

 
 
  On Saturday, February 2, 2013 at 1:36 PM, Platonides wrote:
 
  I don't like it's cryptic nature.
 
  Someone looking at the headers sent to his browser would be very
  confused about what's the point of «X-MF-Mode: b».
 
  Instead something like this would be much more descriptive:
  X-Mobile-Mode: stable
  X-Mobile-Request: secondary
 
  But that also means sending more bytes through the wire :S
  Well, you can (and should) drop the 'X-' :-)
 
  See http://tools.ietf.org/html/rfc6648: Deprecating the X- Prefix and
 Similar Constructs in Application Protocols
 
 
  --
  Ori Livneh
 
 
 
 




[Wikitech-l] wikiscan / similar for english wikipedia

2013-01-30 Thread rupert THURNER
hi,

is there any possibility to have a list of users with contributions similar to:
http://wikiscan.org/?menu=userstats&userlist=Cat%C3%A9gorie%3AUtilisateur+participant+au+projet+Afrip%C3%A9dia

for the english wikipedia?

kr, rupert


Re: [Wikitech-l] Proposal: MediaWiki Groups

2012-11-29 Thread rupert THURNER
hi quim,

you managed to confuse me :) i thought it would be a great idea to
finally implement groups and access control lists in mediawiki as
first-class citizens, like e.g. moinmoin has. there, one enters an ACL
line at the top of the wiki page; see here for details:
http://moinmo.in/HelpOnAccessControlLists
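
for illustration, such an ACL line at the top of a moinmoin page looks
roughly like this (group names here are examples, not defaults):

```
#acl AdminGroup:read,write,delete,revert,admin EditorGroup:read,write All:read
```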

rupert.

On Fri, Nov 30, 2012 at 1:50 AM, Quim Gil q...@wikimedia.org wrote:
 Hi, here you have a first draft about MediaWiki Groups, and implicitly
 MediaWiki reps:

 http://www.mediawiki.org/wiki/User:Qgil/MediaWiki_groups

 MediaWiki groups organize open source community activities within the scope
 of specific topics and geographical areas. They extend the capacity of the
 Wikimedia Foundation in events, training, promotion and other technical
 activities benefiting Wikipedia, the Wikimedia movement and the MediaWiki
 software.

 Imagine MediaWiki Germany Group, MediaWiki Lua Group...

 These groups may become a significant source of growth and wider diversity
 of our community.

 Please bring your ideas to the discussion page - or here. Thank you!

 --
 Quim Gil
 Technical Contributor Coordinator
 Wikimedia Foundation

