Re: [Wikitech-l] Live broadcast from WP engineering meetup happening now

2012-10-18 Thread Erik Moeller
Video on Commons:
https://commons.wikimedia.org/wiki/File:Wikipedia_Engineering_Meetup-2012-10-18.ogv


-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Live broadcast from WP engineering meetup happening now

2012-10-18 Thread Erik Moeller
A bit more background:

* The decision to broadcast this meeting via Hangout was made at the
last minute, so sorry for the short notice. Accordingly, we've also not
set up an IRC backchannel for this one (maybe next time, if it makes
sense - these are more presentation-focused).

* This is the second time we've done this, and we've not yet made a
final decision about frequency.

* The motivation for organizing these meetups in SF is to be visible
to the local technology community, looking for opportunities to
connect with potential volunteers and hires.

The presentations this time around:
- i18n Project Milkshake - jQuery.i18n - Alolita Sharma
- Parsoid - Mark "Traceur" Holmquist & Gabriel Wicke
- Account creation UX - S Page
- Page Curation - Ryan Kaldari
- Admin tools - Chris Steipp
- Vagrant - Ori Livneh

Erik
-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Live broadcast from WP engineering meetup happening now

2012-10-18 Thread Erik Moeller
WMF has started doing these outreach-focused meetups regularly:
http://www.meetup.com/Wikipedia-Engineering-Meetup/

Tech presentations by WMF. Live stream happening right now, recording later:
https://www.youtube.com/watch?v=aBmKYWJJJ94

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Wikidata bug (or semi-bug ;)

2012-10-18 Thread Sumana Harihareswara
On 10/18/2012 04:48 PM, Amir Ladsgroup wrote:
> Hello, I'm working on running PWB on Wikidata and I want to do some
> edits via the API, so I did this:
> http://wikidata-test-repo.wikimedia.de/w/api.php?action=wbsetitem&id=392&data={%22labels%22:{%22ar%22:{%22language%22:%22ar%22,%22value%22:%22Bar%22}}}&format=jsonfm
> 
> but gives me this:
> http://wikidata-test-repo.wikimedia.de/w/index.php?title=Q392&diff=105675&oldid=33972
> 
> As you can see, the label changed, but the change is not shown in the
> "List of pages linked to this item (79 entries)" section. I think it's
> because they are not the same, though it would be better if they were!
> 
> And besides, does anybody know how I can get the page ID? I mean, I
> give "Tantalum" and get back "392" or "Q392".
> 
> Best wishes

Amir, thanks for the bug report! It would probably be best if you
continued this conversation on the wikidata-l list
https://lists.wikimedia.org/mailman/listinfo/wikidata-l (cc'd) or in
#wikimedia-wikidata on Freenode IRC.
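
For reference, a minimal sketch (plain PHP rather than PWB, so just an
illustration) of issuing the same wbsetitem request with the JSON "data"
parameter properly URL-encoded. The endpoint and item id are taken from
the example above; note the test repo apparently accepts GET edits, as in
Amir's URL, while a production wiki would require a POST with an edit token:

<?php
$endpoint = 'http://wikidata-test-repo.wikimedia.de/w/api.php';
$params = array(
    'action' => 'wbsetitem',
    'id'     => '392',
    'data'   => json_encode( array(
        'labels' => array(
            'ar' => array( 'language' => 'ar', 'value' => 'Bar' ),
        ),
    ) ),
    'format' => 'json',
);
// http_build_query() percent-encodes the braces and quotes in "data".
$response = file_get_contents( $endpoint . '?' . http_build_query( $params ) );
var_dump( json_decode( $response, true ) );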

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Should JS/CSS pages be parsed?

2012-10-18 Thread MZMcBride
Krinkle wrote:
> On Oct 18, 2012, at 5:04 AM, Daniel Kinzler  wrote:
>> When designing the ContentHandler, I asked around about whether JS and CSS
>> pages
>> should be parsed as wikitext, so categories etc would work. The gist of the
>> responses I got was "naw, lets get rid of that". So I did (though PST is
>> still
>> applied - Tim asked for that at the Berlin Hackathon).
>> 
>> Sure enough, people are complaining now, see
>> . Also note that an
>> older
> request for disabling parsing of script pages was closed as WONTFIX:
>> .
>> 
>> I'm inclined to (at least optionally) enable the parsing of script pages, but
>> I'd like to get some feedback first.
> 
> Yeah, as more elaborately put on the bug[1], it was disabled in ContentHandler
> without dedicated discussion because it was thought of as a minor oddity that
> should be removed as a bug.
> 
> We know now that (though it might have been a bug originally) it is a major
> feature that, unless replaced, must not be removed.
> 
> [1] https://bugzilla.wikimedia.org/41155

Well, the current approach is hackish. The links are kind of stored, but not
rendered, so you still end up with dead-end pages and a completely
surprising result to most users.

I think the last thing we need is yet another parser. There is already
distinct parsing for weird parts of the MediaWiki UI (such as edit summaries
and log comments). I think any further specialized parsers should be shot
on sight.

More thoughts here:  ("Limit scope of
title-based syntax highlighting").

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Should JS/CSS pages be parsed?

2012-10-18 Thread Tim Starling
On 18/10/12 20:04, Daniel Kinzler wrote:
> Hi!
> 
> When designing the ContentHandler, I asked around about whether JS and CSS 
> pages
> should be parsed as wikitext, so categories etc would work. The gist of the
> responses I got was "naw, lets get rid of that". So I did (though PST is still
> applied - Tim asked for that at the Berlin Hackathon).
> 
> Sure enough, people are complaining now, see
> . Also note that an 
> older
> request for disabling parsing of script pages was closed as WONTFIX:
> .

Yes, categories on JS/CSS pages should continue to work, per my
comments on bug 32858 where this was discussed in detail.

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Let's talk about Solr

2012-10-18 Thread Max Semenik
On 19.10.2012, 1:56 Denny wrote:

> That is great to hear. Thanks for tying us together, Asher.

> For Wikidata, we have not uploaded our Solr extension yet (mostly
> because we are waiting for the repository to be set up), but we will
> upload it soon once it is there. I would be especially interested
> in sharing schema and config snippets for the many languages we
> support, but as far as I can tell this is not so much a requirement
> for Max, and maybe not even for TranslationMemory, not sure.

Yes, GeoData doesn't use text search at all.

> We also selected Solarium to connect to Solr. It is a bit worrisome
> that it is basically a one-person project, if I see this correctly, but
> the library seems small enough not to pose the risk of becoming too
> much of a maintenance burden, I'd say -- especially compared to the
> alternatives.

> Any preferences for Solr 3 vs. 4? It seems that 4 is the smarter choice,
> but we are having trouble getting 4 to run on Labs. Solr 3 works pretty
> much out of the box, though.

At the moment, the only option for WMF is our custom-built 3.6.0, and
when thinking about Solr 4 please remember that it's only 4.*0* :)

Another point: I hope everyone is happy with Jetty as a servlet
container?

> Also, we should probably at some point consider how the different
> extensions and their dependencies should be handled. I'd prefer not to
> ship three different versions of Solarium with three extensions :)

I've created a repo request at 
https://www.mediawiki.org/wiki/Git/Conversion/Extensions_queue#List

-- 
Best regards,
  Max Semenik ([[User:MaxSem]])


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Let's talk about Solr

2012-10-18 Thread Denny Vrandečić
That is great to hear. Thanks for tying us together, Asher.

For Wikidata, we have not uploaded our Solr extension yet (mostly
because we are waiting for the repository to be set up), but we will
upload it soon once it is there. I would be especially interested
in sharing schema and config snippets for the many languages we
support, but as far as I can tell this is not so much a requirement
for Max, and maybe not even for TranslationMemory, not sure.

We also selected Solarium to connect to Solr. It is a bit worrisome
that it is basically a one-person project, if I see this correctly, but
the library seems small enough not to pose the risk of becoming too
much of a maintenance burden, I'd say -- especially compared to the
alternatives.

Any preferences for Solr 3 vs. 4? It seems that 4 is the smarter choice,
but we are having trouble getting 4 to run on Labs. Solr 3 works pretty
much out of the box, though.

Also, we should probably at some point consider how the different
extensions and their dependencies should be handled. I'd prefer not to
ship three different versions of Solarium with three extensions :)

Cheers,
Denny

P.S.: Yuri, regarding GESIS' Solr implementation, they have done some
great work on using Solr as a store for the structured data in SMW.
Funny thing is, they actually do not use Solr for the search itself!
This work is somewhat relevant for Wikidata phase 3, but unfortunately
quite irrelevant for TranslationMemory or GeoData.


2012/10/18 Asher Feldman :
> Hi all,
>
> I'm excited to see that Max has made a lot of great progress in adding Solr
> support to the GeoData extension so that we don't have to use mysql for
> spatial search - https://gerrit.wikimedia.org/r/#/c/27610/
>
> GeoData makes use of the Solarium php client, which is currently included as
> a part of the extension.  GeoData will be our second use of Solr, after
> TranslationMemory extension which is already deployed -
> https://www.mediawiki.org/wiki/Help:Extension:Translate/Translation_memories
> and the Wikidata team is working on using Solr in their extensions as well.
>
> TranslationMemory also uses Solarium, a copy of which is also bundled with
> and loaded from the extension.  For a loading and config example -
> https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=wmf-config/CommonSettings.php;h=1e7a0e24dcbea106042826474607ec065d328472;hb=HEAD#l2407
>
> I think Solr is the right direction for us to go in.  Current efforts can
> pave the way for a complete refresh of WMF's article full text search as
> well as how our developers approach information retrieval.  We just need to
> make sure that these efforts are unified, with commonality around the client
> api, configuration, indexing (preferably with updates asynchronously pushed
> to Solr in near real-time), and schema definition.  This is important from
> an operational aspect as well, where it would be ideal to have a single
> distributed and redundant cluster.
>
> It would be great to see the i18n, mobile tech, wikidata, and any other
> interested parties collaborate and agree on a path forward, with a quick
> sprint around common code that all can use.
>
> -Asher



-- 
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] October 25 open tech chat

2012-10-18 Thread Erik Moeller
Thanks to all who attended the open tech chat today!

If you want to continue this format, please sign up / suggest topics
for next week here:
https://www.mediawiki.org/wiki/Meetings/2012-10-25

Cheers,
Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Let's talk about Solr

2012-10-18 Thread Alolita Sharma
Faidon, FYI - the i18n eng team considered ElasticSearch but did not
do a deep evaluation of it before selecting Solr.

-Alolita

On Thu, Oct 18, 2012 at 1:35 PM, Faidon Liambotis  wrote:
> On Thu, Oct 18, 2012 at 11:22:05AM -0700, Asher Feldman wrote:
>> I think Solr is the right direction for us to go in.  Current efforts can
>> pave the way for a complete refresh of WMF's article full text search as
>> well as how our developers approach information retrieval.  We just need to
>> make sure that these efforts are unified, with commonality around the
>> client api, configuration, indexing (preferably with updates asynchronously
>> pushed to Solr in near real-time), and schema definition.  This is
>> important from an operational aspect as well, where it would be ideal to
>> have a single distributed and redundant cluster.
>
> I'm curious, has anyone evaluated ElasticSearch and whether it'd be more
> or less suitable for us than Solr? If so, I'd be very interested in the
> comparison results for our use cases.
>
> Regards,
> Faidon
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 

Alolita Sharma
Director of Engineering
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] First open tech meeting with video broadcast - 10/18

2012-10-18 Thread Erik Moeller
Thanks all for joining :)

Video recording is here:
https://www.youtube.com/watch?v=q9L4TMRrAzw&feature=player_detailpage#t=585s

IRC log is here (a bit odd if you're not watching the video):
https://www.mediawiki.org/wiki/Meetings/2012-10-18/IRC_log

We discussed:

* git-flow:
https://github.com/nvie/gitflow
http://nvie.com/posts/a-successful-git-branching-model/

* what development environment folks are using:
https://www.jetbrains.com/phpstorm/ - PhpStorm received some strong shoutouts

* Vagrant and pre-built dev environments -
https://github.com/mozilla/kuma/ example

Feedback on the setup welcome - hope to do many more of these :)

Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Wikidata review status

2012-10-18 Thread Lydia Pintscher
On Thu, Oct 18, 2012 at 10:24 PM, Strainu  wrote:
> Hi Denny,
>
> What is the preferred feedback channel? I left some comments on the
> talk page [1].

The talk page is fine. Thanks for your feedback.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's talk about Solr

2012-10-18 Thread Faidon Liambotis
On Thu, Oct 18, 2012 at 11:22:05AM -0700, Asher Feldman wrote:
> I think Solr is the right direction for us to go in.  Current efforts can
> pave the way for a complete refresh of WMF's article full text search as
> well as how our developers approach information retrieval.  We just need to
> make sure that these efforts are unified, with commonality around the
> client api, configuration, indexing (preferably with updates asynchronously
> pushed to Solr in near real-time), and schema definition.  This is
> important from an operational aspect as well, where it would be ideal to
> have a single distributed and redundant cluster.

I'm curious, has anyone evaluated ElasticSearch and whether it'd be more
or less suitable for us than Solr? If so, I'd be very interested in the
comparison results for our use cases.

Regards,
Faidon

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Wikidata review status

2012-10-18 Thread Strainu
2012/10/18 Denny Vrandečić :
> Second, we have created a document describing how data from Wikidata
> is being synched (or percolated, or propagated, or moved, or whatever
> the word shall be) to the Wikipedias. This is the core technical heart
> of Wikidata's inner magic, and we really would benefit from peer
> review and scrutiny on this topic. Comment. Suggest ideas. Help us
> out. We do not have the one perfect solution here, and it can make
> quite an impact.
>
> 
>

Hi Denny,

What is the preferred feedback channel? I left some comments on the
talk page [1].

Regards,
  Strainu

[1] https://meta.wikimedia.org/wiki/Talk:Wikidata/Notes/Change_propagation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Message system based HTML snippets

2012-10-18 Thread Daniel Werner
Right now we are about to implement some kind of basic template engine
to share HTML on the server side as well as on the client side in
JavaScript. Basically this will be a bunch of HTML snippets which we
will put into a ResourceLoader module and send to the client. The
snippets will need some kind of placeholder which can then be replaced
with different content. The replacement would be done by some basic
parser which would have to be implemented in PHP as well as in JS.

One thought was to simply use the MW message system for this. The
templates would of course get their own store, but the message parser
could perhaps be reused. $1 etc. could be used as placeholders and
even nice-to-haves such as PLURAL or {{int:}} would work out of the
box.
From a first look, it could be as easy as overriding
Message::fetchMessage in a subclass.
Of course, the JavaScript side would have to be taken care of as well;
mw.Message doesn't seem like it would be a problem.
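
To make the idea concrete, a minimal sketch on the PHP side (TemplateStore
is hypothetical, and the exact property/method details of the current
Message class may differ, so treat this as a rough illustration):

<?php
// Sketch only: fetch the raw HTML snippet from our own (hypothetical)
// TemplateStore instead of the i18n message cache.
class HtmlSnippetMessage extends Message {
    protected function fetchMessage() {
        if ( $this->message === null ) {
            $this->message = TemplateStore::get( $this->key );
        }
        return $this->message;
    }
}

// $1, $2, ... behave like ordinary message parameters, and PLURAL,
// {{int:}} etc. would come along for free via parse().
$html = ( new HtmlSnippetMessage( 'item-table-row' ) )
    ->params( 'Q392', 'Tantalum' )
    ->parse();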

Any thoughts on this or does anyone know about some similar
implementation in any extensions?

Cheers,
Daniel

-- 
Daniel Werner
Software Engineer

Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. (030) 219 158 26-0

http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] First open tech meeting with video broadcast - 10/18

2012-10-18 Thread Erik Moeller
We're about to get started, join #wikimedia-dev-meetings on
irc.freenode.net for details.


-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Let's talk about Solr

2012-10-18 Thread Yury Katkov
Hi guys,

as far as I know, there is also a project at the GESIS institute that
couples Semantic MediaWiki and Solr to get cool faceted search. Simon
Bachenberg (in Cc) will present this project soon at a conference.

http://semantic-mediawiki.org/wiki/SMWCon_Fall_2012/SolrStore
-
Yury Katkov



On Thu, Oct 18, 2012 at 10:22 PM, Asher Feldman  wrote:
> Hi all,
>
> I'm excited to see that Max has made a lot of great progress in adding Solr
> support to the GeoData extension so that we don't have to use mysql for
> spatial search - https://gerrit.wikimedia.org/r/#/c/27610/
>
> GeoData makes use of the Solarium php client, which is currently included
> as a part of the extension.  GeoData will be our second use of Solr, after
> the TranslationMemory extension which is already deployed -
> https://www.mediawiki.org/wiki/Help:Extension:Translate/Translation_memories
> - and the Wikidata team is working on using Solr in their extensions as
> well.
>
> TranslationMemory also uses Solarium, a copy of which is also bundled with
> and loaded from the extension.  For a loading and config example -
> https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=wmf-config/CommonSettings.php;h=1e7a0e24dcbea106042826474607ec065d328472;hb=HEAD#l2407
>
> I think Solr is the right direction for us to go in.  Current efforts can
> pave the way for a complete refresh of WMF's article full text search as
> well as how our developers approach information retrieval.  We just need to
> make sure that these efforts are unified, with commonality around the
> client api, configuration, indexing (preferably with updates asynchronously
> pushed to Solr in near real-time), and schema definition.  This is
> important from an operational aspect as well, where it would be ideal to
> have a single distributed and redundant cluster.
>
> It would be great to see the i18n, mobile tech, wikidata, and any other
> interested parties collaborate and agree on a path forward, with a quick
> sprint around common code that all can use.
>
> -Asher
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Should JS/CSS pages be parsed?

2012-10-18 Thread Strainu
2012/10/18 Platonides :
> Yes, it should be put back.
> Unless maybe there was a way to get out to wikitext from JS.

Perhaps it would make sense to only parse comments? It might slightly
degrade performance, though, and would also require a small amount of
adaptation from users.
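
A rough sketch of what I mean (my own naive regexes, so just an
illustration): extract only the comments from a JS page and run those
through the wikitext parser, so [[Category:...]] links placed in
comments would still register.

<?php
// Collect /* ... */ block comments and // line comments from a JS
// source string. Naive: it doesn't skip string literals that happen
// to contain comment markers (or URLs containing "//").
function extractJsComments( $source ) {
    $comments = array();
    if ( preg_match_all( '!/\*.*?\*/!s', $source, $m ) ) {
        $comments = array_merge( $comments, $m[0] );
    }
    if ( preg_match_all( '!//[^\n]*!', $source, $m ) ) {
        $comments = array_merge( $comments, $m[0] );
    }
    return implode( "\n", $comments );
}

// Only this extracted text would then be fed to the parser for
// category/link registration; the script itself stays unparsed.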

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] ResourceLoader support coming soon for mobile

2012-10-18 Thread Brion Vibber
On Wed, Oct 17, 2012 at 3:46 PM, Brion Vibber  wrote:

> I've made a modest initial stab at MobileFrontend support for using
> ResourceLoader directly, using a 'target' filtering technique that we
> discussed with Trevor, Roan, and Timo. This is another step in integrating
> MobileFrontend/SkinMobile into the core MediaWiki ecosystem.
>
> Once we're happy with this and merge it, this'll let both core code and
> extensions add appropriate JS and CSS by whitelisting their modules for
> mobile -- or including a separate mobile module if necessary -- without
> having to special-case JS and CSS loading into MobileFrontend.
>
> Core changes: https://gerrit.wikimedia.org/r/#/c/28433/
> MobileFrontend: https://gerrit.wikimedia.org/r/#/c/28434/
>

These have now been merged... yay!

Don't start relying on them until both the core and MobileFrontend sides
have been deployed, just to be sure, but start planning to. :)
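
As a planning aid, module registration would look roughly like this --
the 'targets' key follows the core patch above, but treat the exact
syntax as tentative until it's deployed (the module name and file paths
here are made up for the example):

<?php
// Sketch: whitelist an extension module for both desktop and mobile.
$wgResourceModules['ext.myExtension.foo'] = array(
    'scripts' => array( 'ext.myExtension.foo.js' ),
    'styles' => array( 'ext.myExtension.foo.css' ),
    'localBasePath' => __DIR__,
    'remoteExtPath' => 'MyExtension',
    // Without this, modules would default to desktop only.
    'targets' => array( 'desktop', 'mobile' ),
);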

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Let's talk about Solr

2012-10-18 Thread Alolita Sharma
Asher - great suggestion!

TranslationMemory also uses Solarium, a copy of which is also bundled
with and loaded from the extension.  For a loading and config example
- 
https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=wmf-config/CommonSettings.php;h=1e7a0e24dcbea106042826474607ec065d328472;hb=HEAD#l2407

Niklas has been pretty satisfied with Solr's performance for TM. We
are very interested in collaborating and working with you to make Solr
more pervasive on our production infrastructure.

Cheers,
Alolita

On Thu, Oct 18, 2012 at 11:46 AM, Max Semenik  wrote:
> Whee!
>
> On 18.10.2012, 22:22 Asher wrote:
>
>> Hi all,
>
>> I'm excited to see that Max has made a lot of great progress in
>> adding Solr support to the GeoData extension so that we don't have
>> to use mysql for spatial search -
>> https://gerrit.wikimedia.org/r/#/c/27610/
>
>> GeoData makes use of the Solarium php client, which is currently
>> included as a part of the extension.  GeoData will be our second use
>> of Solr, after the TranslationMemory extension which is already
>> deployed -
>> https://www.mediawiki.org/wiki/Help:Extension:Translate/Translation_memories
>> and the Wikidata team is working on using Solr in their extensions as well.
>
> A little comment on my choice of client library: I initially tried to use
> http://php.net/solr but quickly discovered that it lacks many features,
> e.g. core support.
>
>> TranslationMemory also uses Solarium, a copy of which is also
>> bundled with and loaded from the extension.  For a loading and
>> config example -
>> https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=wmf-config/CommonSettings.php;h=1e7a0e24dcbea106042826474607ec065d328472;hb=HEAD#l2407
>
>> I think Solr is the right direction for us to go in.  Current
>> efforts can pave the way for a complete refresh of WMF's article
>> full text search as well as how our developers approach information
>> retrieval.
>
> We still need a Java developer to port our custom Lucene code to
> Solr in order to use Solr for wiki search.
>
>>  We just need to make sure that these efforts are
>> unified, with commonality around the client api, configuration,
>> indexing (preferably with updates asynchronously pushed to Solr in
>> near real-time), and schema definition.  This is important from an
>> operational aspect as well, where it would be ideal to have a single
>> distributed and redundant cluster.
>
> I've already discussed with Niklas the possibility of moving Solarium
> to a shared extension to keep things centralised. Guess we just need a
> repo set up to move forward.
>
>> It would be great to see the i18n, mobile tech, wikidata, and any
>> other interested parties collaborate and agree on a path forward,
>> with a quick sprint around common code that all can use.
>
> +100
>
> --
> Best regards,
>   Max Semenik ([[User:MaxSem]])
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 

Alolita Sharma
Director of Engineering
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Let's talk about Solr

2012-10-18 Thread Max Semenik
Whee!

On 18.10.2012, 22:22 Asher wrote:

> Hi all,

> I'm excited to see that Max has made a lot of great progress in
> adding Solr support to the GeoData extension so that we don't have
> to use mysql for spatial search -
> https://gerrit.wikimedia.org/r/#/c/27610/ 

> GeoData makes use of the Solarium php client, which is currently
> included as a part of the extension.  GeoData will be our second use
> of Solr, after the TranslationMemory extension which is already
> deployed -
> https://www.mediawiki.org/wiki/Help:Extension:Translate/Translation_memories
> and the Wikidata team is working on using Solr in their extensions as well.

A little comment on my choice of client library: I initially tried to use
http://php.net/solr but quickly discovered that it lacks many features,
e.g. core support.

> TranslationMemory also uses Solarium, a copy of which is also
> bundled with and loaded from the extension.  For a loading and
> config example -
> https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=wmf-config/CommonSettings.php;h=1e7a0e24dcbea106042826474607ec065d328472;hb=HEAD#l2407

> I think Solr is the right direction for us to go in.  Current
> efforts can pave the way for a complete refresh of WMF's article
> full text search as well as how our developers approach information
> retrieval.

We still need a Java developer to port our custom Lucene code to
Solr in order to use Solr for wiki search.

>  We just need to make sure that these efforts are
> unified, with commonality around the client api, configuration,
> indexing (preferably with updates asynchronously pushed to Solr in
> near real-time), and schema definition.  This is important from an
> operational aspect as well, where it would be ideal to have a single
> distributed and redundant cluster.

I've already discussed with Niklas the possibility of moving Solarium
to a shared extension to keep things centralised. Guess we just need a
repo set up to move forward.

> It would be great to see the i18n, mobile tech, wikidata, and any
> other interested parties collaborate and agree on a path forward,
> with a quick sprint around common code that all can use. 

+100

-- 
Best regards,
  Max Semenik ([[User:MaxSem]])


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mediawiki + Vagrant

2012-10-18 Thread Tomasz Finc
Chip is trying to use Hangout/YouTube to make it happen.

--tomasz


On Thu, Oct 18, 2012 at 11:31 AM, Patrick Reilly  wrote:
> That's awesome! Will it be recorded tonight?
>
> — Patrick
>
> On Thu, Oct 18, 2012 at 2:28 PM, Ori Livneh  wrote:
>
>> Woot! Thanks guys. I'll use my five-minute slot at the meet up tonight to
>> demo its use.
>>
>> --
>> Ori Livneh
>> o...@wikimedia.org
>>
>>
>> On Thursday, October 18, 2012 at 10:05 AM, Patrick Reilly wrote:
>>
>> > Done and done... 
>> >
>> > The Staff team has been granted the following permissions: push & pull.
>> >
>> > — Patrick
>> >
>> > On Thu, Oct 18, 2012 at 12:52 PM, Tomasz Finc (mailto:tf...@wikimedia.org) wrote:
>> > >
>> > > Since we're getting serious about it let's move it to the Wikimedia
>> > > repo
>> > > On Oct 17, 2012 11:39 PM, "Erik Moeller" (mailto:e...@wikimedia.org) wrote:
>> > >
>> > > > On Wed, Oct 17, 2012 at 3:25 AM, Ori Livneh (mailto:o...@wikimedia.org) wrote:
>> > > >
>> > > > > Ok, I made it work, I think
>> > > > >
>> > > > > git clone https://github.com/atdt/wmf-vagrant.git
>> > > > > cd ./wmf-vagrant
>> > > > > git submodule update --init
>> > > > > vagrant up
>> > > >
>> > > >
>> > > >
>> > > > And indeed, it works like magic. This is an awesome beginning, Ori -
>> > > > thanks so much for pulling this off. I really think this is a
>> > > > potentially great path to getting pre-built and optimized dev
>> > > > environments into people's hands.
>> > > >
>> > > > As for a permanent home, this isn't really "operations" and probably
>> > > > lots of folks should have merge rights on it, so perhaps a
>> > > > mediawiki/vagrant repo with a broad permission set would make sense?
>> > > >
>> > > > Erik
>> > > >
>> > > > --
>> > > > Erik Möller
>> > > > VP of Engineering and Product Development, Wikimedia Foundation
>> > > >
>> > > > Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
>> > > >
>> > > > ___
>> > > > Wikitech-l mailing list
>> > > > Wikitech-l@lists.wikimedia.org (mailto:
>> Wikitech-l@lists.wikimedia.org)
>> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>> > >
>> > >
>> > > ___
>> > > Wikitech-l mailing list
>> > > Wikitech-l@lists.wikimedia.org (mailto:Wikitech-l@lists.wikimedia.org)
>> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>> >
>> >
>> > ___
>> > Wikitech-l mailing list
>> > Wikitech-l@lists.wikimedia.org (mailto:Wikitech-l@lists.wikimedia.org)
>> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>>
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mediawiki + Vagrant

2012-10-18 Thread Patrick Reilly
That's awesome! Will it be recorded tonight?

— Patrick

On Thu, Oct 18, 2012 at 2:28 PM, Ori Livneh  wrote:

> Woot! Thanks guys. I'll use my five-minute slot at the meet up tonight to
> demo its use.
>
> --
> Ori Livneh
> o...@wikimedia.org
>
>
> On Thursday, October 18, 2012 at 10:05 AM, Patrick Reilly wrote:
>
> > Done and done... 
> >
> > The Staff team has been granted the following permissions: push & pull.
> >
> > — Patrick
> >
> > On Thu, Oct 18, 2012 at 12:52 PM, Tomasz Finc (mailto:tf...@wikimedia.org) wrote:
> > >
> > > Since we're getting serious about it let's move it to the Wikimedia
> > > repo
> > > On Oct 17, 2012 11:39 PM, "Erik Moeller" (mailto:e...@wikimedia.org) wrote:
> > >
> > > > On Wed, Oct 17, 2012 at 3:25 AM, Ori Livneh (mailto:o...@wikimedia.org) wrote:
> > > >
> > > > > Ok, I made it work, I think
> > > > >
> > > > > git clone https://github.com/atdt/wmf-vagrant.git
> > > > > cd ./wmf-vagrant
> > > > > git submodule update --init
> > > > > vagrant up
> > > >
> > > >
> > > >
> > > > And indeed, it works like magic. This is an awesome beginning, Ori -
> > > > thanks so much for pulling this off. I really think this is a
> > > > potentially great path to getting pre-built and optimized dev
> > > > environments into people's hands.
> > > >
> > > > As for a permanent home, this isn't really "operations" and probably
> > > > lots of folks should have merge rights on it, so perhaps a
> > > > mediawiki/vagrant repo with a broad permission set would make sense?
> > > >
> > > > Erik
> > > >
> > > > --
> > > > Erik Möller
> > > > VP of Engineering and Product Development, Wikimedia Foundation
> > > >
> > > > Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
> > > >
> > > > ___
> > > > Wikitech-l mailing list
> > > > Wikitech-l@lists.wikimedia.org (mailto:
> Wikitech-l@lists.wikimedia.org)
> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> > >
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org (mailto:Wikitech-l@lists.wikimedia.org)
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org (mailto:Wikitech-l@lists.wikimedia.org)
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mediawiki + Vagrant

2012-10-18 Thread Ori Livneh
Woot! Thanks guys. I'll use my five-minute slot at the meet up tonight to demo 
its use.  

--
Ori Livneh
o...@wikimedia.org


On Thursday, October 18, 2012 at 10:05 AM, Patrick Reilly wrote:

> Done and done... 
>  
> The Staff team has been granted the following permissions: push & pull.
>  
> — Patrick
>  
> On Thu, Oct 18, 2012 at 12:52 PM, Tomasz Finc (mailto:tf...@wikimedia.org) wrote:
> >
> > Since we're getting serious about it let's move it to the Wikimedia repo
> > On Oct 17, 2012 11:39 PM, "Erik Moeller" (mailto:e...@wikimedia.org) wrote:
> >
> > > On Wed, Oct 17, 2012 at 3:25 AM, Ori Livneh (mailto:o...@wikimedia.org) wrote:
> > >  
> > > > Ok, I made it work, I think
> > > >  
> > > > git clone https://github.com/atdt/wmf-vagrant.git
> > > > cd ./wmf-vagrant
> > > > git submodule update --init
> > > > vagrant up
> > >  
> > >  
> > >  
> > > And indeed, it works like magic. This is an awesome beginning, Ori -
> > > thanks so much for pulling this off. I really think this is a
> > > potentially great path to getting pre-built and optimized dev
> > > environments into people's hands.
> > >  
> > > As for a permanent home, this isn't really "operations" and probably
> > > lots of folks should have merge rights on it, so perhaps a
> > > mediawiki/vagrant repo with a broad permission set would make sense?
> > >  
> > > Erik
> > >  
> > > --
> > > Erik Möller
> > > VP of Engineering and Product Development, Wikimedia Foundation
> > >  
> > > Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
> > >  
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org (mailto:Wikitech-l@lists.wikimedia.org)
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >  
> >  
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org (mailto:Wikitech-l@lists.wikimedia.org)
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>  
>  
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org (mailto:Wikitech-l@lists.wikimedia.org)
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Let's talk about Solr

2012-10-18 Thread Asher Feldman
Hi all,

I'm excited to see that Max has made a lot of great progress in adding Solr
support to the GeoData extension so that we don't have to use mysql for
spatial search - https://gerrit.wikimedia.org/r/#/c/27610/

GeoData makes use of the Solarium php client, which is currently included
as a part of the extension.  GeoData will be our second use of Solr, after
the TranslationMemory extension which is already deployed -
https://www.mediawiki.org/wiki/Help:Extension:Translate/Translation_memories
- and the Wikidata team is working on using Solr in their extensions as
well.

TranslationMemory also uses Solarium, a copy of which is also bundled with
and loaded from the extension.  For a loading and config example -
https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=wmf-config/CommonSettings.php;h=1e7a0e24dcbea106042826474607ec065d328472;hb=HEAD#l2407
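
For anyone who hasn't used it, the Solarium query pattern is compact; a
minimal sketch (Solarium 2.x style, assuming a local core - adjust the
host, port, and field names to taste, as they're placeholders here):

<?php
require 'Solarium/Autoloader.php';
Solarium_Autoloader::register();

// Point the client at a (hypothetical) local Solr core.
$client = new Solarium_Client( array(
    'adapteroptions' => array(
        'host' => '127.0.0.1',
        'port' => 8983,
        'path' => '/solr/',
    ),
) );

$query = $client->createSelect();
$query->setQuery( 'text:tantalum' ); // example field:term query

$result = $client->select( $query );
echo 'Found ', $result->getNumFound(), " documents\n";
foreach ( $result as $document ) {
    echo $document->id, "\n";
}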

I think Solr is the right direction for us to go in.  Current efforts can
pave the way for a complete refresh of WMF's article full text search as
well as how our developers approach information retrieval.  We just need to
make sure that these efforts are unified, with commonality around the
client api, configuration, indexing (preferably with updates asynchronously
pushed to Solr in near real-time), and schema definition.  This is
important from an operational aspect as well, where it would be ideal to
have a single distributed and redundant cluster.

It would be great to see the i18n, mobile tech, wikidata, and any other
interested parties collaborate and agree on a path forward, with a quick
sprint around common code that all can use.

-Asher
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Should JS/CSS pages be parsed?

2012-10-18 Thread Platonides
Yes, it should be put back.
Unless maybe there was a way to get out to wikitext from JS.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Wikidata review status

2012-10-18 Thread Denny Vrandečić
Dear all,

let me say it like this:

WOH!!!

All of our patchsets have been merged into core. You are awesome!
Thank you so much!

We won't leave you without new work, though. Three points:

First, the merge of the ContentHandler branch is, now that it is being
deployed, revealing issues in several places. If you discover
something, please file it in Bugzilla under the ContentHandler
component. The list of currently open bugs is here, and if you can
help us, this would be great.



Second, we have created a document describing how data from Wikidata
is being synched (or percolated, or propagated, or moved, or whatever
the word shall be) to the Wikipedias. This is the core technical heart
of Wikidata's inner magic, and we really would benefit from peer
review and scrutiny on this topic. Comment. Suggest ideas. Help us
out. We do not have the one perfect solution here, and it can make
quite an impact.



Third, there is an oldish bug in MW core, filed in May, which we have
a bad feeling about. We expect that it will bite us more often on
Wikidata. The trouble is, there is no fix, and actually, we do not
know what is going on there, and several others have also tried to
crack this nut. If you want a challenge, squash this beast!



Enjoy all,
Denny




-- 
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mediawiki + Vagrant

2012-10-18 Thread Patrick Reilly
Done and done... 

The Staff team has been granted the following permissions: push & pull.

— Patrick

On Thu, Oct 18, 2012 at 12:52 PM, Tomasz Finc  wrote:
>
> Since we're getting serious about it let's move it to the Wikimedia repo
> On Oct 17, 2012 11:39 PM, "Erik Moeller"  wrote:
>
> > On Wed, Oct 17, 2012 at 3:25 AM, Ori Livneh  wrote:
> >
> > > Ok, I made it work, I think
> > >
> > > git clone https://github.com/atdt/wmf-vagrant.git
> > > cd ./wmf-vagrant
> > > git submodule update --init
> > > vagrant up
> >
> > And indeed, it works like magic. This is an awesome beginning, Ori -
> > thanks so much for pulling this off. I really think this is a
> > potentially great path to getting pre-built and optimized dev
> > environments into people's hands.
> >
> > As for a permanent home, this isn't really "operations" and probably
> > lots of folks should have merge rights on it, so perhaps a
> > mediawiki/vagrant repo with a broad permission set would make sense?
> >
> > Erik
> >
> > --
> > Erik Möller
> > VP of Engineering and Product Development, Wikimedia Foundation
> >
> > Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mediawiki + Vagrant

2012-10-18 Thread Tomasz Finc
Since we're getting serious about it let's move it to the Wikimedia repo
On Oct 17, 2012 11:39 PM, "Erik Moeller"  wrote:

> On Wed, Oct 17, 2012 at 3:25 AM, Ori Livneh  wrote:
>
> > Ok, I made it work, I think
> >
> > git clone https://github.com/atdt/wmf-vagrant.git
> > cd ./wmf-vagrant
> > git submodule update --init
> > vagrant up
>
> And indeed, it works like magic. This is an awesome beginning, Ori -
> thanks so much for pulling this off. I really think this is a
> potentially great path to getting pre-built and optimized dev
> environments into people's hands.
>
> As for a permanent home, this isn't really "operations" and probably
> lots of folks should have merge rights on it, so perhaps a
> mediawiki/vagrant repo with a broad permission set would make sense?
>
> Erik
>
> --
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
>
> Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Seeking feedback on new "Organizations" feature on Ohloh

2012-10-18 Thread Tomasz Finc
These are only on Gerrit till we can test replication and have pull-request
support.
On Oct 18, 2012 9:12 AM, "Quim Gil"  wrote:

> On 10/17/2012 11:49 AM, Tomasz Finc wrote:
>
>> And here are the links with relevant stats
>>
>> https://www.ohloh.net/p/WikipediaMobile
>>
>> https://www.ohloh.net/p/WLMMobile
>>
>
> These projects point to https://github.com/wikimedia/*
>
> For sanity purposes, would it make sense to agree that Wikimedia projects
> in Ohloh should point to their canonical locations at Gerrit (unless they
> are only in GitHub, of course)?
>
> --
> Quim
>
>
>> --tomasz
>>
>>
>> On Wed, Oct 17, 2012 at 10:30 AM, Tomasz Finc 
>> wrote:
>>
>>> I like this as it generally raises the visibility of our projects.
>>> Having a top-level *and* up-to-date view these days is next to
>>> impossible unless you follow multiple mailing lists and constantly
>>> read the monthly engineering reports.
>>>
>>> As a test I added our WLM and Wikipedia Cordova apps to Ohloh to see
>>> what they would look like. It would be far simpler if we could be
>>> added as an organization so that these get added automatically.
>>>
>>> --tomasz
>>>
>>>
>>> On Tue, Oct 16, 2012 at 2:42 PM, Quim Gil  wrote:
>>>
 Hi, what about having Wikimedia featured as an organization in Ohloh?

 The proposal is interesting considering the current state of things:
 MediaWiki seems to be stalled with the SVN to Git migration, and it is
 close to impossible to find out what other projects come from this
 community.

 This would help our quest on community metrics, so here goes my humble
 +1. I also volunteer with some work, basically following the steps of
 https://github.com/wikimedia and pinging Sumana / here for anything
 else.



  Original Message 
 Subject:Seeking feedback on new "Organizations" feature on Ohloh
 Date:   Tue, 16 Oct 2012 18:22:54 +
 From:   Rich Sands 
 To: metrics-wg@theopensourceway.org



 Hi all,

 As I mentioned here recently, we're rolling out a new feature on Ohloh
 next week I thought might be interesting to this list. We're adding
 rollups of projects into "Organizations", so project contributors and
 others could see for example all the projects in the FooBar Foundation,
 which ones are most active, the most active contributors, year over year
 summary statistics, etc. I know many of you or folks in your
 organizations have rolled your own metrics for watching your
 foundation's projects and their activity. This feature isn't intended to
 replace any of that, but rather to provide a view into how organizations
 are contributing to and influencing FOSS. We've spoken with a number of
 you who've expressed a need for this, and hope this new feature can be a
 valuable resource.

 We're rolling this out as a Beta feature and would love to get your
 feedback. There are two aspects we're looking at: which organizations
 steward which projects, and also which organizations contribute to which
 projects through developers affiliated with those organizations, whether
 they're for-profit, non-profit, governmental, or educational. Initially
 we're concentrating on the first aspect - projects in organizations. In
 a near-term iteration we'll add in the contribution bit, but for now
 we're not showing stats on which projects an organization contributes
 to.

 If you would like a preview of this feature before it is released, let
 me know and I'll send you a URL and a name/password combo so you can
 check it out. And if you'd like your organization to be one of the
 featured ones when we open this up, I'd be thrilled to add you. All I
 need is an organization name, a short description, a logo (if
 available), a homepage URL for the organization, and a list of projects
 to include. If your projects aren't already on Ohloh, I can help you add
 them as well.

 We're keen to make this useful, and we're rolling it out in a fairly raw
 state, so that the FOSS community can help it evolve. Also, please don't
 post about or publicize this new feature before it comes out next week.
 Looking forward to hearing from you!

 --  rms

 Rich Sands
 Director of Developer Communities
 Black Duck Software, Inc.
 rsa...@blackducksoftware.com 
 
 Cell: +1 617-283-0027
 www.ohloh.net 




 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

>>>
>> _

Re: [Wikitech-l] Fwd: Seeking feedback on new "Organizations" feature on Ohloh

2012-10-18 Thread Quim Gil

On 10/17/2012 11:49 AM, Tomasz Finc wrote:

And here are the links with relevant stats

https://www.ohloh.net/p/WikipediaMobile

https://www.ohloh.net/p/WLMMobile


These projects point to https://github.com/wikimedia/*

For sanity purposes, would it make sense to agree that Wikimedia 
projects in Ohloh should point to their canonical locations at Gerrit 
(unless they are only in GitHub, of course)?


--
Quim



--tomasz


On Wed, Oct 17, 2012 at 10:30 AM, Tomasz Finc  wrote:

I like this as it generally raises the visibility of our projects.
Having a top-level *and* up-to-date view these days is next to
impossible unless you follow multiple mailing lists and constantly
read the monthly engineering reports.

As a test I added our WLM and Wikipedia Cordova apps to Ohloh to see
what they would look like. It would be far simpler if we could be
added as an organization so that these get added automatically.

--tomasz


On Tue, Oct 16, 2012 at 2:42 PM, Quim Gil  wrote:

Hi, what about having Wikimedia featured as an organization in Ohloh?

The proposal is interesting considering the current state of things:
MediaWiki seems to be stalled with the SVN to Git migration, and it is close
to impossible to find out what other projects come from this community.

This would help our quest on community metrics, so here goes my humble +1. I
also volunteer with some work, basically following the steps of
https://github.com/wikimedia and pinging Sumana / here for anything else.



 Original Message 
Subject:Seeking feedback on new "Organizations" feature on Ohloh
Date:   Tue, 16 Oct 2012 18:22:54 +
From:   Rich Sands 
To: metrics...@theopensourceway.org 



Hi all,

As I mentioned here recently, we're rolling out a new feature on Ohloh
next week I thought might be interesting to this list. We're adding
rollups of projects into "Organizations", so project contributors and
others could see for example all the projects in the FooBar Foundation,
which ones are most active, the most active contributors, year over year
summary statistics, etc. I know many of you or folks in your
organizations have rolled your own metrics for watching your
foundation's projects and their activity. This feature isn't intended to
replace any of that, but rather to provide a view into how organizations
are contributing to and influencing FOSS. We've spoken with a number of
you who've expressed a need for this, and hope this new feature can be a
valuable resource.

We're rolling this out as a Beta feature and would love to get your
feedback. There are two aspects we're looking at: which organizations
steward which projects, and also which organizations contribute to which
projects through developers affiliated with those organizations, whether
they're for-profit, non-profit, governmental, or educational. Initially
we're concentrating on the first aspect - projects in organizations. In
a near-term iteration we'll add in the contribution bit, but for now
we're not showing stats on which projects an organization contributes to.

If you would like a preview of this feature before it is released, let
me know and I'll send you a URL and a name/password combo so you can
check it out. And if you'd like your organization to be one of the
featured ones when we open this up, I'd be thrilled to add you. All I
need is an organization name, a short description, a logo (if
available), a homepage URL for the organization, and a list of projects
to include. If your projects aren't already on Ohloh, I can help you add
them as well.

We're keen to make this useful, and we're rolling it out in a fairly raw
state, so that the FOSS community can help it evolve. Also, please don't
post about or publicize this new feature before it comes out next week.
Looking forward to hearing from you!

--  rms

Rich Sands
Director of Developer Communities
Black Duck Software, Inc.
rsa...@blackducksoftware.com 
Cell: +1 617-283-0027
www.ohloh.net 




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A bot to create articles about species

2012-10-18 Thread John Erling Blad
For those interested in this type of text synthesis: it can be done by
using finite-state automata and transducers (FSTs). The simplest way
to make them is by cross-compiling into Lua from some other known
form.
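
A toy illustration in PHP of the transducer idea (my own sketch, not
related to any existing cross-compiler): a transition table that maps
grammatical-number tags to the right copula while copying everything
else through.

<?php
// One state; tags <sg>/<pl> are rewritten, other symbols pass through.
$fst = array(
    'start' => array(
        '<sg>' => array( 'out' => 'is',  'next' => 'start' ),
        '<pl>' => array( 'out' => 'are', 'next' => 'start' ),
    ),
);

function transduce( array $fst, array $input ) {
    $state = 'start';
    $output = array();
    foreach ( $input as $symbol ) {
        if ( isset( $fst[$state][$symbol] ) ) {
            $output[] = $fst[$state][$symbol]['out'];
            $state = $fst[$state][$symbol]['next'];
        } else {
            $output[] = $symbol;
        }
    }
    return implode( ' ', $output );
}

echo transduce( $fst, array( 'Tantalum', '<sg>', 'a', 'chemical', 'element' ) );
// Prints: Tantalum is a chemical element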
John

On Thu, Oct 18, 2012 at 2:10 PM, Denny Vrandečić
 wrote:
> For now, we have no plans for Wikidata to create articles. This would,
> in my opinion, meddle too much with the autonomy of the Wikipedia
> language projects.
>
> What will be possible is to facilitate the creation of such bots, as
> some data that might be used for the article might be taken from and
> maintained in Wikidata, and the creation of templates that use data
> from Wikidata.
>
> Wikidata currently has no plans for creating text using natural
> language generation techniques. We would love for someone else to do
> this kind of awesome on top of Wikidata.
>
> I hope this helps,
> Denny
>
>
>
>
> 2012/10/18 Nikola Smolenski :
>> On 18/10/12 09:25, Steven Walling wrote:
>>>
>>> On Wed, Oct 17, 2012 at 11:46 PM, Nikola Smolenski
>>> wrote:

 The need for such bots should cease after Wikidata is fully deployed. I
 suggest to interested programmers that they should direct their effort
 there.
>>>
>>>
>>> Why is that the case?
>>>
>>> I didn't understand the scope of Wikidata to include actual creation
>>> of articles that don't exist. Only to provide data about topics
>>> across projects. Sure, that might be extremely helpful to someone with a
>>> bot to populate species articles, but I'm skeptical that Wikidata would
>>> or should be creating millions of articles about such things. If you
>>> consider something even slightly more controversial than species, such as
>>> schools, many projects would not welcome a third party mass-creating pages
>>> about a topic that is described in Wikidata.
>>
>>
>> Wikidata won't need to create articles. Rather, if you are trying to see a
>> page without an article, Wikipedia will check if an item with an appropriate
>> name exists in Wikidata and generate the article on the fly if Wikipedia has
>> a local article template for this type of article.
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
> --
> Project director Wikidata
> Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
> Tel. +49-30-219 158 26-0 | http://wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] A bot to create articles about species

2012-10-18 Thread Denny Vrandečić
For now, we have no plans for Wikidata to create articles. This would,
in my opinion, meddle too much with the autonomy of the Wikipedia
language projects.

What will be possible is to facilitate the creation of such bots, as
some data that might be used for the article might be taken from and
maintained in Wikidata, and the creation of templates that use data
from Wikidata.

Wikidata currently has no plans for creating text using natural
language generation techniques. We would love for someone else to do
this kind of awesome on top of Wikidata.

I hope this helps,
Denny




2012/10/18 Nikola Smolenski :
> On 18/10/12 09:25, Steven Walling wrote:
>>
>> On Wed, Oct 17, 2012 at 11:46 PM, Nikola Smolenski
>> wrote:
>>>
>>> The need for such bots should cease after Wikidata is fully deployed. I
>>> suggest to interested programmers that they should direct their effort
>>> there.
>>
>>
>> Why is that the case?
>>
>> I didn't understand the scope of Wikidata to include actual creation
>> of articles that don't exist. Only to provide data about topics
>> across projects. Sure, that might be extremely helpful to someone with a
>> bot to populate species articles, but I'm skeptical that Wikidata would
>> or should be creating millions of articles about such things. If you
>> consider something even slightly more controversial than species, such as
>> schools, many projects would not welcome a third party mass-creating pages
>> about a topic that is described in Wikidata.
>
>
> Wikidata won't need to create articles. Rather, if you are trying to see a
> page without an article, Wikipedia will check if an item with an
> appropriate name exists in Wikidata and generate the article on the fly
> if Wikipedia has a local article template for this type of article.
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Should JS/CSS pages be parsed?

2012-10-18 Thread Krinkle
On Oct 18, 2012, at 5:04 AM, Daniel Kinzler  wrote:

> Hi!
> 
> When designing the ContentHandler, I asked around about whether JS and CSS 
> pages
> should be parsed as wikitext, so categories etc would work. The gist of the
> responses I got was "naw, lets get rid of that". So I did (though PST is still
> applied - Tim asked for that at the Berlin Hackathon).
> 
> Sure enough, people are complaining now, see
> . Also note that an 
> older
> request for disablingt parsing of script pages was closed as WONTFIX:
> .
> 
> I'm inclined to (at least optionally) enable the parsing of script pages, but
> I'd like to get some feedback first.
> 
> -- daniel


Yeah, as more elaborately put on the bug[1], it was disabled in ContentHandler 
without dedicated discussion because it was thought of as a minor oddity that 
should be removed as a bug.

We know now that (though it might have been a bug originally) it is a major 
feature that, unless replaced, must not be removed.

-- Krinkle

[1] https://bugzilla.wikimedia.org/41155


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Commit summaries

2012-10-18 Thread Chad
On Thu, Oct 18, 2012 at 5:08 AM, Tim Starling  wrote:
> On 18/10/12 19:08, Antoine Musso wrote:
>> During our first weeks using git, we have been asking people to write
>> nice summary lines since they are used in Gerrit email notifications and
>> in git log.  I wrote a basic guideline (which has been improved since)
>> that people can be pointed at:
>>
>>   https://www.mediawiki.org/wiki/Git/Commit_message_guidelines
>
> That's interesting. I usually don't put the bug number in the first
> line, because there's not enough room for it. There's barely enough
> room to fit in the most simplified summary of a change in 62
> characters, and the bug number would take up 12, leaving you only 50.
>

Indeed. And actually, I've kind of gotten out of the habit of
doing this as well. The "standard practice" is actually for
people to do it in the footer of the message, along with things
like the Change-Id and Signed-Off-By. For example:

'''
Fixing some broken feature

This was broken because..

Bug: 12345
Change-Id: 
'''

I rather prefer this format, to be honest. And actually, Gerrit
has a feature (which we're not making use of, but easily could
if people are interested) where you can track those bug footer
notes. We could easily add regexes for Bugzilla and RT (Gerrit
calls them "tracking ids") if that's something people would use.
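
For the curious, that would live in gerrit.config. A sketch only -- the
section name and regex below are illustrative, not a tested configuration:

'''
# gerrit.config -- treat "Bug:" footers as Bugzilla tracking ids
[trackingid "bugzilla"]
  footer = Bug:
  match = \\d{1,6}
  system = Bugzilla
'''

With something like that in place, Gerrit records the matched numbers as
tracking ids for each change, so they can be queried later.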

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Should JS/CSS pages be parsed?

2012-10-18 Thread Tyler Romeo
It seems like people have some pretty good reasons for parsing JS/CSS pages
(categorization, backlinks, speedy deletion templates, etc.), so unless
there is some significant disadvantage to MW for enabling parsing, I'm
going to have to agree with the bug filer.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com



On Thu, Oct 18, 2012 at 5:04 AM, Daniel Kinzler wrote:

> Hi!
>
> When designing the ContentHandler, I asked around about whether JS and CSS
> pages
> should be parsed as wikitext, so categories etc would work. The gist of the
> responses I got was "naw, lets get rid of that". So I did (though PST is
> still
> applied - Tim asked for that at the Berlin Hackathon).
>
> Sure enough, people are complaining now, see
> . Also note that an
> older
> request for disabling parsing of script pages was closed as WONTFIX:
> .
>
> I'm inclined to (at least optionally) enable the parsing of script pages,
> but
> I'd like to get some feedback first.
>
> -- daniel
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Commit summaries

2012-10-18 Thread Tyler Romeo
Keep in mind this is not a problem with Git, but a problem with Gerrit. Git
will work perfectly fine with summary lines over 62 characters (try git
shortlog).
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com



On Thu, Oct 18, 2012 at 5:08 AM, Tim Starling wrote:

> On 18/10/12 19:08, Antoine Musso wrote:
> > During our first weeks using git, we have been asking people to write
> > nice summary lines since they are used in Gerrit email notifications and
> > in git log.  I wrote a basic guideline (which has been improved since)
> > that people can be pointed at:
> >
> >   https://www.mediawiki.org/wiki/Git/Commit_message_guidelines
>
> That's interesting. I usually don't put the bug number in the first
> line, because there's not enough room for it. There's barely enough
> room to fit in the most simplified summary of a change in 62
> characters, and the bug number would take up 12, leaving you only 50.
>
> Consider if you wanted to say what function it is you changed. Here's
> a histogram of lengths of "ClassName::methodName" strings from
> $wgAutoloadClasses on my test wiki:
>
> length count
> 
> 9  1
> 10 1
> 11 4
> 12 7
> 13 3
> 14 2
> 15 6
> 16 13
> 17 15
> 18 16
> 19 29
> 20 32
> 21 55
> 22 62
> 23 106
> 24 149
> 25 152
> 26 177
> 27 225
> 28 258
> 29 297
> 30 301
> 31 338
> 32 355
> 33 328
> 34 290
> 35 282
> 36 240
> 37 204
> 38 220
> 39 172
> 40 173
> 41 141
> 42 126
> 43 80
> 44 88
> 45 69
> 46 63
> 47 53
> 48 27
> 49 31
> 50 26
> 51 23
> 52 9
> 53 10
> 54 10
> 55 4
> 56 5
> 57 5
> 58 8
> 59 1
> 60 4
> 61 4
> 62 3
> 63 1
> 64 1
> 65 2
> 66 1
> 67 0
> 68 1
> 69 1
> 70 0
> 71 0
> 72 1
> 73 1
>
> Really, 62 characters is ridiculously short, and the existence of that
> limit is a flaw in git, but at least you can write a typical method
> name preceded by the word "fixed". With a limit of 50, you often can't.
>
> -- Tim Starling
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A bot to create articles about species

2012-10-18 Thread John Erling Blad
Getting working inflection rules for even a single language is a major
task, and doing so for several hundred languages would be an
overwhelming one. I can't see how this could be implemented as part of
the Wikidata project within a reasonable time frame.

There are a few shortcuts that can be taken, and it is possible to make
some generalized tools. For an open source alternative, take a look at
Apertium (http://en.wikipedia.org/wiki/Apertium). Usually only the
generation/disambiguation phase is necessary, which makes the task
somewhat simpler, but it is still a major undertaking.

Note that some of the basic tools already exist; we only need to
interface them to MediaWiki. But the tools need definition files to
work (that is, inflection rules for the Northern Sami language, for
example, or Norwegian bokmål and nynorsk, or Swedish), and it is those
definitions that are the major task.

John

On Thu, Oct 18, 2012 at 11:14 AM, Nikola Smolenski  wrote:
> On 18/10/12 11:06, John Erling Blad wrote:
>>
>> well-formed text automatically. One of the more common problems is
>> names that use different inflection rules due to context and how they
>> are written. Such inflection rules are not part of the Wikidata
>> project, and supporting them will probably be a major undertaking in itself.
>
>
> Why do you think that inflection rules will not be a part of Wikidata? They
> would be hugely needed on Wiktionary, and there is no reason Wikidata could
> not contain them.
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A bot to create articles about species

2012-10-18 Thread Nikola Smolenski

On 18/10/12 11:06, John Erling Blad wrote:
> well-formed text automatically. One of the more common problems is
> names that use different inflection rules due to context and how they
> are written. Such inflection rules are not part of the Wikidata
> project, and supporting them will probably be a major undertaking in itself.


Why do you think that inflection rules will not be a part of Wikidata?
They would be hugely needed on Wiktionary, and there is no reason
Wikidata could not contain them.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Commit summaries

2012-10-18 Thread Tim Starling
On 18/10/12 19:08, Antoine Musso wrote:
> During our first weeks using git, we have been asking people to write
> nice summary lines since they are used in Gerrit email notifications and
> in git log.  I wrote a basic guideline (which has been improved since)
> that people can be pointed at:
> 
>   https://www.mediawiki.org/wiki/Git/Commit_message_guidelines

That's interesting. I usually don't put the bug number in the first
line, because there's not enough room for it. There's barely enough
room to fit in the most simplified summary of a change in 62
characters, and the bug number would take up 12, leaving you only 50.

Consider if you wanted to say what function it is you changed. Here's
a histogram of lengths of "ClassName::methodName" strings from
$wgAutoloadClasses on my test wiki:

length count

9  1
10 1
11 4
12 7
13 3
14 2
15 6
16 13
17 15
18 16
19 29
20 32
21 55
22 62
23 106
24 149
25 152
26 177
27 225
28 258
29 297
30 301
31 338
32 355
33 328
34 290
35 282
36 240
37 204
38 220
39 172
40 173
41 141
42 126
43 80
44 88
45 69
46 63
47 53
48 27
49 31
50 26
51 23
52 9
53 10
54 10
55 4
56 5
57 5
58 8
59 1
60 4
61 4
62 3
63 1
64 1
65 2
66 1
67 0
68 1
69 1
70 0
71 0
72 1
73 1

Really, 62 characters is ridiculously short, and the existence of that
limit is a flaw in git, but at least you can write a typical method
name preceded by the word "fixed". With a limit of 50, you often can't.
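
(For anyone who wants to produce the same kind of table on their own wiki,
here is a rough sketch of one way to do it -- a guess at the method, not
the actual script. It assumes the file sits in the MediaWiki root so that
commandLine.inc can bootstrap the wiki and populate $wgAutoloadClasses.)

'''
<?php
// histogram.php -- hypothetical sketch; run as: php histogram.php
// commandLine.inc loads LocalSettings.php and the extensions, which
// populate $wgAutoloadClasses with class => file mappings.
require_once __DIR__ . '/maintenance/commandLine.inc';

$histogram = array();
foreach ( array_keys( $wgAutoloadClasses ) as $class ) {
    if ( !class_exists( $class ) ) {
        continue; // skip anything the autoloader cannot resolve
    }
    // From global scope, get_class_methods() returns only public
    // methods, which is close enough for a rough length distribution.
    foreach ( get_class_methods( $class ) as $method ) {
        $len = strlen( "$class::$method" );
        if ( !isset( $histogram[$len] ) ) {
            $histogram[$len] = 0;
        }
        $histogram[$len]++;
    }
}
ksort( $histogram );
echo "length count\n";
foreach ( $histogram as $len => $count ) {
    echo "$len $count\n";
}
'''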

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A bot to create articles about species

2012-10-18 Thread John Erling Blad
On Thu, Oct 18, 2012 at 10:08 AM, Nikola Smolenski  wrote:
> On 18/10/12 09:25, Steven Walling wrote:
>>
>> On Wed, Oct 17, 2012 at 11:46 PM, Nikola Smolenski
>> wrote:
>>>
>>> The need for such bots should cease after Wikidata is fully deployed. I
>>> suggest that interested programmers direct their effort there.
>>
>>
>> Why is that the case?

The necessary data to create those articles will be available in
Wikidata, and possibly a lot more than we currently have in our
templates. That could make it possible to create really awesome
articles, if it were not for one thing: it is extremely hard to create
well-formed text automatically. One of the more common problems is
names that use different inflection rules due to context and how they
are written. Such inflection rules are not part of the Wikidata
project, and supporting them will probably be a major undertaking in itself.

Note that some languages do not need such inflection rules, and then
it is fairly simple to create articles from templates. In other cases
it might be good enough to simply say "Pygochelidon cyanoleuca is a
bird" and add an automatic template.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Should JS/CSS pages be parsed?

2012-10-18 Thread Daniel Kinzler
Hi!

When designing the ContentHandler, I asked around about whether JS and CSS pages
should be parsed as wikitext, so categories etc would work. The gist of the
responses I got was "naw, lets get rid of that". So I did (though PST is still
applied - Tim asked for that at the Berlin Hackathon).

Sure enough, people are complaining now, see
. Also note that an older
request for disabling parsing of script pages was closed as WONTFIX:
.

I'm inclined to (at least optionally) enable the parsing of script pages, but
I'd like to get some feedback first.
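
To make "optionally" concrete: the switch could be a setting that lists
which text content models still get run through the wikitext parser -- a
sketch only, the setting name here is hypothetical:

'''
// Hypothetical setting (name illustrative): pages with these content
// models would also be parsed as wikitext, so categories, backlinks
// and the like keep working on them.
$wgTextModelsToParse = array(
    CONTENT_MODEL_WIKITEXT,
    CONTENT_MODEL_CSS,
    CONTENT_MODEL_JAVASCRIPT,
);
'''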

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Commit summaries

2012-10-18 Thread Ori Livneh
On Thursday, October 18, 2012 at 1:08 AM, Antoine Musso wrote:
> During our first weeks using git, we have been asking people to write
> nice summary lines since they are used in Gerrit email notifications and
> in git log. I wrote a basic guideline (which has been improved since)
> that people can be pointed at:
> 
> https://www.mediawiki.org/wiki/Git/Commit_message_guidelines
This is very useful! I wonder if we could inject the URL into the comment part 
of the commit template, using something like the commit.template configuration 
variable[1]. Maybe it's something git review -s could set up. 

  [1]: http://git-scm.com/book/en/Customizing-Git-Git-Configuration
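
Something along these lines could do it -- a sketch only; the file name is
arbitrary and this is not what git review -s does today:

'''
# Write a template whose comment lines point at the guideline, then tell
# git to preload it into the commit message editor.
cat > ~/.gitmessage <<'EOF'

# Summary line: short, specific, imperative.
# Blank line, then the details.
# Guidelines: https://www.mediawiki.org/wiki/Git/Commit_message_guidelines
EOF
git config --global commit.template ~/.gitmessage
'''

Since git strips lines starting with "#" when the commit is saved, the
reminder shows up in the editor but never in the history.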

--
Ori Livneh
o...@wikimedia.org



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] IRC office hours with the Language Engineering team 2012-09-17 16:30 UTC

2012-10-18 Thread Srikanth Lakshmanan (WMF)
On Wed, Oct 17, 2012 at 11:47 PM, Srikanth Lakshmanan (WMF) <
slakshma...@wikimedia.org> wrote:

>
> The log[1] is available. The next Language Engineering office hour will be
> on 14th November. Thank you
>

Apologies for the spam; the next office hours will be on 21st November, the
3rd Wednesday as usual. Thanks.

-- 
Srikanth L
Wikimedia Language Engineering Team
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A bot to create articles about species

2012-10-18 Thread Nikola Smolenski

On 18/10/12 09:25, Steven Walling wrote:
> On Wed, Oct 17, 2012 at 11:46 PM, Nikola Smolenski
> wrote:
>> The need for such bots should cease after Wikidata is fully deployed. I
>> suggest that interested programmers direct their effort there.
>
> Why is that the case?
>
> I didn't understand the scope of Wikidata to include actual creation
> of articles that don't exist. Only to provide data about topics
> across projects. Sure, that might be extremely helpful to someone with a
> bot to populate species articles, but I'm skeptical that Wikidata would
> or should be creating millions of articles about such things. If you
> consider something even slightly more controversial than species, such as
> schools, many projects would not welcome a third party mass-creating pages
> about a topic that is described in Wikidata.


Wikidata won't need to create articles. Rather, if you are trying to see
a page without an article, Wikipedia will check if an item with an
appropriate name exists in Wikidata and generate the article on the fly
if Wikipedia has a local article template for this type of article.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Commit summaries

2012-10-18 Thread Antoine Musso
On 16/10/12 17:41, Harry Burt wrote:
> Hey all,
> 
> Unfortunately the combination of using commit summaries in release notes
> and the mass-merger of Wikidata code means that we now have about several
> hundred somewhat cryptic commit summaries in the release notes [1].
> [1] https://www.mediawiki.org/w/index.php?title=MediaWiki_1.21/wmf2

Hello,

That is unfortunate. We might want to rebuild the 1.21/wmf2 release notes
by excluding the Wikidata branch changes and manually crafting a section
dedicated to that merge.

> IMHO it might be a good idea for everyone to get into the habit of using
> clear commit summaries regardless of whether or not they're directly
> committing to core/master, if not overly time-consuming. This would make it
> a lot easier for people like me who wade through the automated release notes

During our first weeks using git, we have been asking people to write
nice summary lines since they are used in Gerrit email notifications and
in git log.  I wrote a basic guideline (which has been improved since)
that people can be pointed at:

  https://www.mediawiki.org/wiki/Git/Commit_message_guidelines

cheers,

-- 
Antoine "hashar" Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A bot to create articles about species

2012-10-18 Thread Steven Walling
On Wed, Oct 17, 2012 at 11:46 PM, Nikola Smolenski 
wrote:
> The need for such bots should cease after Wikidata is fully deployed. I
> suggest that interested programmers direct their effort there.

Why is that the case?

I didn't understand the scope of Wikidata to include actual creation
of articles that don't exist. Only to provide data about topics
across projects. Sure, that might be extremely helpful to someone with a
bot to populate species articles, but I'm skeptical that Wikidata would
or should be creating millions of articles about such things. If you
consider something even slightly more controversial than species, such as
schools, many projects would not welcome a third party mass-creating pages
about a topic that is described in Wikidata.

Steven
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Fwd: Seeking feedback on new "Organizations" feature on Ohloh

2012-10-18 Thread Quim Gil

On 10/17/2012 11:35 PM, Siebrand Mazeland (WMF) wrote:

On 18 Oct 2012, at 08:23, Quim Gil wrote the following:


Brion, Siebrand and I are admins of the MediaWiki project on
ohloh.net.


Good to know. :) Do you want to contact Rich Sands from Ohloh? Do
you want me to contact him, CCing you...?


Yes and yes.


Done, CCing Brion Vibber, Antoine Musso and Siebrand Mazeland. Will post
significant updates here.




Do you want to be project admin? Looks like you may be the better
person to take care of this for now... I think you can hit a button
somewhere to spam the other admins with a request for admin.


Button found. Request sent. Thank you!

--
Quim

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l