Re: [Wikitech-l] IRC Office Hour on Project Management Tools Review: Friday 28th, 17:00UTC

2014-03-28 Thread Guillaume Paumier
A quick reminder: This will take place in 30 minutes in
#wikimedia-office . If all goes well, we'll even have Phabricator's
lead developer around to help answer questions :)

On Mon, Mar 24, 2014 at 6:53 PM, Andre Klapper aklap...@wikimedia.org wrote:
 Hi,

 Guillaume and I will be hosting an IRC office hour
 on March 28, 2014 (Friday) at 17:00 UTC / 10:00 PDT
 in #wikimedia-office on irc.freenode.net.

 We will quickly present the progress and status of the ongoing Project
 management tools review [1] and after that we are happy to answer your
 questions!

 See you at the IRC office hour!

 Thanks,
 andre

 [1] https://www.mediawiki.org/wiki/Project_management_tools/Review

-- 
Guillaume Paumier
Technical Communications Manager — Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GSoC FOSS OPW selection process

2014-03-28 Thread Quim Gil
Hi, an update.

On Friday, March 21, 2014, Quim Gil q...@wikimedia.org wrote:


 Everybody is invited to join the selection process.

 https://www.mediawiki.org/wiki/Mentorship_programs/Selection_process


Mentors have started evaluating candidates yes/maybe/no in the GSoC and OPW
private sites. I have started checking the yes/no evaluations against the
criteria of our selection process, agreeing or disagreeing, and usually
leaving feedback on the proposal's talk page. Basically, I'm playing devil's
advocate, trying to push the announced goal of quality over quantity.

Next Monday, one week before our deadline to request a number of slots for
GSoC and OPW, we will start focusing on the maybes, resolving them in one
direction or another.

We still have three relevant proposals from active candidates that are
lacking confirmed mentors and a proper evaluation:

UniversalLanguageSelector fonts for Chinese wikis, by Aaron Xiao
https://www.mediawiki.org/wiki/User:Xiaoxiangquan/UniversalLanguageSelector_Fonts_for_Chinese_wikis
Liangent and David Chan are interested in co-mentoring; they need help from
a developer familiar with ULS.

Automatic cross-language screenshots for user documentation, by Vikas S
Yaligar
https://www.mediawiki.org/wiki/User:Vikassy/GSoC14
There have been some informal conversations, but nothing is confirmed. At
least one developer familiar with our Continuous Integration / QA testing
tools is needed.

Frontend for Vector skin CSS customizations, by Ioannis Protonotarios
https://www.mediawiki.org/wiki/User:Protnet/Frontend_for_Vector_skin_CSS_customizations
No mentors confirmed (or mentioned at all?). At least one mentor familiar
with MediaWiki extension development is required.

Volunteers are still welcome, whether mentoring or helping to evaluate
proposals and candidates. Ten days to go; there is still time to make
informed decisions.

https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014

https://www.mediawiki.org/wiki/FOSS_Outreach_Program_for_Women/Round_8


By April 7 we need to decide how many slots we want to request in each
 program. In GSoC we request a number of slots from Google, which we might or
 might not get. In OPW we fund some slots, we might request more, and then
 we might get them or not.

 Questions? Just ask.



-- 
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] [GSoC] Chemical Markup support for Wikimedia Commons

2014-03-28 Thread Rainer Rillke
Chemical Markup support for Wikimedia Commons has been a long-requested
feature[1][2], and now I think it's time to act. Please consider my GSoC
proposal[3]. I will focus on molecules and reactions, both supported by MDL
molfiles, through a media file handler and a JavaScript front end.

Spectra and analytical data, chemical crystallography, and materials are
somewhat more complicated, so I think we should start simple. If there is
sufficient time to play with CML after implementing MOL file support, nothing
prevents us from doing so. But past experience has shown that even simple
projects have grown into unexpected dimensions, resulting in software that is
still not ready for use today.

Curious about your opinion.

Kind regards
Rainer Rillke



[1] http://lists.wikimedia.org/pipermail/wikitech-l/2004-June/010715.html
[2] http://lists.wikimedia.org/pipermail/wikitech-l/2013-April/068573.html
[3]
https://www.mediawiki.org/wiki/User:Rillke/Chemical_Markup_support_for_Wikimedia_Commons

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Roadmap and deployment highlights - week of March 31st

2014-03-28 Thread Greg Grossmeier
Hello and welcome to the latest edition of the WMF Engineering Roadmap
and Deployment update.

The full log of planned deployments next week can be found at:
https://wikitech.wikimedia.org/wiki/Deployments#Week_of_March_31st

Notable items...

== Monday ==

* The new logging tool in use at WMF, Logstash, will have its backend
  search cluster (powered by Elasticsearch) upgraded to a newer version
  (1.0.1).
** No user-facing change or disruption is associated with this.


== Tuesday ==

* MediaWiki deploy window, currently following the 1.23 schedule
** group1 to 1.23wmf20: All non-Wikipedia sites (Wiktionary, Wikisource,
   Wikinews, Wikibooks, Wikiquote, Wikiversity, and a few other sites)
** https://www.mediawiki.org/wiki/MediaWiki_1.23/wmf20
** Schedule: 
https://www.mediawiki.org/wiki/MediaWiki_1.23/Roadmap#Schedule_for_the_deployments
** NOTE: This includes the continuing rollout of Typography Refresh:
   https://www.mediawiki.org/wiki/Typography_refresh


== Wednesday ==

* Cirrus Search will graduate from Beta Feature to enabled for all
  users on all non-Wikipedia wikis (e.g. Commons)
** https://www.mediawiki.org/wiki/Search


== Thursday ==

* MediaWiki deploy window, currently following the 1.23 schedule
** group2 to 1.23wmf20 (all Wikipedias)
** NOTE: This includes the continuing rollout of Typography Refresh:
   https://www.mediawiki.org/wiki/Typography_refresh
** group0 to 1.23wmf21 (test/test2/testwikidata/mediawiki)
** https://www.mediawiki.org/wiki/MediaWiki_1.23/wmf21



Thanks, and as always, questions welcome,

Greg


-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg                A18D 1138 8E47 FAC8 1C7D |


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikitech-ambassadors] Roadmap and deployment highlights - week of March 31st

2014-03-28 Thread Nikolas Everett
On Fri, Mar 28, 2014 at 4:57 PM, Greg Grossmeier g...@wikimedia.org wrote:

 == Wednesday ==

 * Cirrus Search will be graduated from Beta Feature to enabled for all
   users on all non-wikipedia wikis (eg Commons, etc)
 ** https://www.mediawiki.org/wiki/Search


I'd prefer to do Commons on its own some other time because it is much
higher traffic.  Also, Cirrus isn't even a BetaFeature on a few
non-Wikipedias, and it wouldn't be fair (or even work) to just switch them on
too.  So: all non-Wikipedias that aren't Commons, Meta, or Incubator.

As always, I'm open to hearing about any show-stopper issues you find while
trying Cirrus as a BetaFeature - we won't deploy it to a wiki where it'll
make things worse.

Sorry for the late notice,

Nik Everett/manybubbles
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Caching with Varnish

2014-03-28 Thread Shawn Jones

Hi,

As part of our extension testing, we've set up Varnish in accordance 
with http://www.mediawiki.org/wiki/Manual:Varnish_caching


One of the things we've noticed is that our oldid URIs are cached, 
whereas Wikipedia doesn't seem to cache those pages.


Is there a reason why Wikipedia doesn't do this?  Is there some 
threshold that Wikipedia uses for caching?


Thanks in advance,

Shawn M. Jones
Graduate Research Assistant
Department of Computer Science
Old Dominion University

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Caching with Varnish

2014-03-28 Thread Brian Wolff
On 3/28/14, Shawn Jones sj...@cs.odu.edu wrote:
 Hi,

 As part of our extension testing, we've set up varnish in accordance
 with http://www.mediawiki.org/wiki/Manual:Varnish_caching

 One of the things we've noticed is that our oldid URIs are cached,
 whereas Wikipedia doesn't seem to cache those pages.

 Is there a reason why Wikipedia doesn't do this?  Is there some
 threshold that Wikipedia uses for caching?

 Thanks in advance,

 Shawn M. Jones
 Graduate Research Assistant
 Department of Computer Science
 Old Dominion University

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I think your caching is set up incorrectly. MediaWiki does not (AFAIK)
send an s-maxage caching header for requests with an oldid in them. See
the $output->setSquidMaxage( $wgSquidMaxage ); line in
MediaWiki::performAction (line 425 of includes/Wiki.php). The caching
headers are only sent if the URL is in the list of URLs that can be
purged (basically normal page views and history page views). If other
pages are being cached, it probably means all pages are being cached
for you, which is not a good thing and will cause problems, since
there are some pages that really should not be cached.

In the case of oldid URLs, it may make sense for us to send caching
headers, since those revisions do not change (excluding, of course,
oldids that don't exist).
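The rule described here can be sketched roughly as follows. This is a
hypothetical Python sketch of the decision only, not MediaWiki's actual code;
the function and constant names are invented for illustration:

```python
# Hypothetical sketch (not MediaWiki's implementation) of the rule above:
# s-maxage is only emitted for URLs the wiki knows how to purge (plain page
# views and history views); everything else stays uncacheable.

PURGEABLE_ACTIONS = {"view", "history"}  # invented constant for illustration

def cache_control(action="view", oldid=None, squid_maxage=2678400):
    """Pick the Cache-Control header value for a page request."""
    if action in PURGEABLE_ACTIONS and oldid is None:
        # Purgeable URL: let the Varnish/Squid layer cache it.
        return f"s-maxage={squid_maxage}, must-revalidate, max-age=0"
    # oldid views, edits, special pages, ...: keep them out of the CDN cache.
    return "private, must-revalidate, max-age=0"
```

With a setup like this, an oldid request never gets an s-maxage header, which
matches the behavior Shawn observed on Wikipedia.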

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikimedia-l] Quarterly reviews of high priority WMF initiatives

2014-03-28 Thread Tilman Bayer
Minutes and slides from Wednesday's quarterly review of the
Foundation's VisualEditor team are now available at
https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_reviews/VisualEditor/March_2014

(A separate but related quarterly review meeting of the Parsoid team
took place today, those minutes should be up on Monday.)

On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller e...@wikimedia.org wrote:
 Hi folks,

 to increase accountability and create more opportunities for course
 corrections and resourcing adjustments as necessary, Sue's asked me
 and Howie Fung to set up a quarterly project evaluation process,
 starting with our highest priority initiatives. These are, according
 to Sue's narrowing focus recommendations which were approved by the
 Board [1]:

 - Visual Editor
 - Mobile (mobile contributions + Wikipedia Zero)
 - Editor Engagement (also known as the E2 and E3 teams)
 - Funds Dissemination Committee and expanded grant-making capacity

 I'm proposing the following initial schedule:

 January:
 - Editor Engagement Experiments

 February:
 - Visual Editor
 - Mobile (Contribs + Zero)

 March:
 - Editor Engagement Features (Echo, Flow projects)
 - Funds Dissemination Committee

 We'll try doing this on the same day or adjacent to the monthly
 metrics meetings [2], since the team(s) will give a presentation on
 their recent progress, which will help set some context that would
 otherwise need to be covered in the quarterly review itself. This will
 also create open opportunities for feedback and questions.

 My goal is to do this in a manner where even though the quarterly
 review meetings themselves are internal, the outcomes are captured as
 meeting minutes and shared publicly, which is why I'm starting this
 discussion on a public list as well. I've created a wiki page here
 which we can use to discuss the concept further:

 https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_reviews

 The internal review will, at minimum, include:

 Sue Gardner
 myself
 Howie Fung
 Team members and relevant director(s)
 Designated minute-taker

 So for example, for Visual Editor, the review team would be the Visual
 Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.

 I imagine the structure of the review roughly as follows, with a
 duration of about 2 1/2 hours divided into 25-30 minute blocks:

 - Brief team intro and recap of team's activities through the quarter,
 compared with goals
 - Drill into goals and targets: Did we achieve what we said we would?
 - Review of challenges, blockers and successes
 - Discussion of proposed changes (e.g. resourcing, targets) and other
 action items
 - Buffer time, debriefing

 Once again, the primary purpose of these reviews is to create improved
 structures for internal accountability, escalation points in cases
 where serious changes are necessary, and transparency to the world.

 In addition to these priority initiatives, my recommendation would be
 to conduct quarterly reviews for any activity that requires more than
 a set amount of resources (people/dollars). These additional reviews
 may however be conducted in a more lightweight manner and internally
 to the departments. We're slowly getting into that habit in
 engineering.

 As we pilot this process, the format of the high priority reviews can
 help inform and support reviews across the organization.

 Feedback and questions are appreciated.

 All best,
 Erik

 [1] https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
 [2] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
 --
 Erik Möller
 VP of Engineering and Product Development, Wikimedia Foundation

 Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

 ___
 Wikimedia-l mailing list
 wikimedi...@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l



-- 
Tilman Bayer
Senior Operations Analyst (Movement Communications)
Wikimedia Foundation
IRC (Freenode): HaeB

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Caching with Varnish

2014-03-28 Thread Liangent
No, they were deliberately made uncacheable: otherwise, if some sensitive
data got oversighted, it could remain accessible from the cached copies,
with no way to make it go away.
On Mar 29, 2014 11:40 AM, Brian Wolff bawo...@gmail.com wrote:

 On 3/28/14, Shawn Jones sj...@cs.odu.edu wrote:
  Hi,
 
  As part of our extension testing, we've set up varnish in accordance
  with http://www.mediawiki.org/wiki/Manual:Varnish_caching
 
  One of the things we've noticed is that our oldid URIs are cached,
  whereas Wikipedia doesn't seem to cache those pages.
 
  Is there a reason why Wikipedia doesn't do this?  Is there some
  threshold that Wikipedia uses for caching?
 
  Thanks in advance,
 
  Shawn M. Jones
  Graduate Research Assistant
  Department of Computer Science
  Old Dominion University
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 I think your caching is set up incorrectly. MediaWiki does not (afaik)
 send an smaxage caching header for requests with an oldid in them. See
 the $output->setSquidMaxage( $wgSquidMaxage ); line in
 MediaWiki::performAction (line 425 of includes/Wiki.php). The caching
 headers are only sent if the url is in the list of urls that can be
 purged (basically normal page views and history page views). If other
 pages are being cached, it probably means all pages are being cached
 for you, which is not a good thing and will cause problems, since
 there are some pages that really should not be cached.

 In the case of oldid urls, it may make sense for us to send caching
 headers with oldid urls, since they do not change (excluding of course
 oldid's that don't exist)

 --bawolff

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Some notes on captchas and account creation

2014-03-28 Thread Smolenski Nikola
Yesterday I held lectures about Wikipedia at the School of Electrical and
Computer Engineering of Applied Studies in Belgrade
[http://www.viser.edu.rs/?lang=EN] and observed the students trying to create
new user accounts and save an article. Since all the students used the same
IP, they all triggered captchas, so here are my general remarks. Note that
these students should be more computer literate than most people their age
(early 20s).

Some students found the captchas difficult to read - for example, some could
not distinguish 'a' from 'x' or 'r' from 'y' - so perhaps captchas should not
be made even more difficult. Captcha localization could help here.

The warning that the article is not saved is not clearly visible, and some
students managed to lose their articles without realizing they had not been
saved. The warning could be made more visible, but I believe the best way to
solve this would be to save the article via AJAX and to display the captcha
on the article editing page itself.

Account creation is botched in several ways, and these are really beginners'
errors; I don't understand how they could happen in software as mature as
MediaWiki. Examples:

- If you enter a mismatched password, the error message only appears after
you click 'Create your account'. The mismatch could be checked in JavaScript
on the page itself, right after the passwords are entered, the same way a
duplicate username is checked.

- If you enter the passwords right but the captcha wrong, you have to
re-enter the passwords. Basically, whatever mistake you make, you have to
re-enter the passwords. I see no reason for this; the password fields could
be pre-filled with the passwords already entered. Also, as with article
saving above, it would be even better if the captcha were verified via AJAX
on the page itself.

- The opposite: if you enter a password wrong but the captcha right, you have
to re-enter the passwords (good) AND re-enter the captcha (bad). Some quick
typists oscillated several times between getting one or the other wrong. I
see no need to re-enter the captcha once you have entered it correctly - you
have already authenticated as a human.
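The flow suggested in these points could be sketched as follows. This is a
hypothetical Python sketch of the validation logic only; the function and
field names are invented, and this is not MediaWiki's form code:

```python
# Hypothetical sketch of the suggested account-creation flow: validate each
# field independently, keep the values the user already got right, and skip
# the captcha once it has been solved. All names are invented illustrations.

def validate_signup(form, captcha_solved=False):
    """Return (errors, refill): fields to flag, and values to re-display."""
    errors = []
    if form["password"] != form["confirm"]:
        errors.append("password-mismatch")
    # Only demand a captcha that has not already been solved this session.
    if not captcha_solved and not form.get("captcha_correct"):
        errors.append("captcha-wrong")
    # Pre-fill every field the user typed so no correct input is thrown away.
    refill = {"password": form["password"], "confirm": form["confirm"]}
    return errors, refill
```

Under this scheme a wrong captcha never forces re-typing the passwords, and a
wrong password never forces re-solving the captcha.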

Also, if I may make a wish: a special page where I could enter an IP address
and free it of all the spam filters and protections would be nice, and all
the wikieducators would thank you :)



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
