Re: [Wikitech-l] Collaboration between staff and volunteers: a two-way street

2010-10-17 Thread MZMcBride
Aryeh Gregor wrote: > On Sat, Oct 16, 2010 at 4:34 PM, MZMcBride wrote: >> Having a plan is great and it sounds like a completely reasonable plan, but >> currently only Tim is able to do general code updates and he's not really >> around, from what I understand. > > I don't think anyone is treati

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Platonides
Dmitriy Sintsov wrote: >> The linked blog post laments the lag between the removal of vandalism on >> Wikipedia and its removal in Google's indices and cached data. > I am completely disconnected from Wikipedia - I do use MediaWiki for > small projects. However, wasn't there FlaggedRevs depl

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Max Semenik
On 17.10.2010, 22:42 Neil wrote: > Google would rather not have any vandalism in their index, but that's > not the point. They care about the reindexing schedule. If we create > sitemaps that also note the recent velocity of changes, the vandal's > edits in a sense work against themselves. Ever

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Neil Kandalgaonkar
Google would rather not have any vandalism in their index, but that's not the point. They care about the reindexing schedule. If we create sitemaps that also note the recent velocity of changes, the vandal's edits in a sense work against themselves. Every new change brings new scrutiny. If you
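
A rough sketch of the idea described above, in Python: map each page's recent edit velocity onto the sitemap protocol's <lastmod>, <changefreq> and <priority> fields so that heavily edited pages invite faster recrawling. The thresholds, the page list and the mapping are made up for illustration; this is not MediaWiki's actual sitemap generator.

    # Illustrative sketch only: translate "how fast is this page changing"
    # into sitemap hints so a crawler revisits hot pages sooner.
    import datetime
    import xml.etree.ElementTree as ET

    def change_freq(edits_last_day):
        """Rough, made-up mapping from edit velocity to sitemap values."""
        if edits_last_day >= 24:
            return "hourly", "1.0"
        if edits_last_day >= 1:
            return "daily", "0.8"
        return "weekly", "0.5"

    def build_sitemap(pages):
        """pages: iterable of (url, last_touched_datetime, edits_last_day)."""
        urlset = ET.Element(
            "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for loc, touched, edits in pages:
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = loc
            ET.SubElement(url, "lastmod").text = touched.strftime(
                "%Y-%m-%dT%H:%M:%SZ")
            freq, prio = change_freq(edits)
            ET.SubElement(url, "changefreq").text = freq
            ET.SubElement(url, "priority").text = prio
        return ET.tostring(urlset, encoding="unicode")

    if __name__ == "__main__":
        demo = [("https://en.wikipedia.org/wiki/Example",
                 datetime.datetime(2010, 10, 17, 12, 0), 30)]
        print(build_sitemap(demo))
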

Re: [Wikitech-l] Collaboration between staff and volunteers: a two-way street

2010-10-17 Thread Aryeh Gregor
On Fri, Oct 15, 2010 at 5:58 PM, Erik Moeller wrote: > I think that we agree more than we disagree here. Obviously a huge > code review and deployment backlog is bad for everyone. Volunteers > should feel that their contributions are wanted, supported, and > appreciated (so should staff). And we a

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Dmitriy Sintsov
* Neil Kandalgaonkar [Sat, 16 Oct 2010 22:20:50 -0700]: > On 10/16/10 8:40 PM, Fred Bauder wrote: > > http://mastersofmedia.hum.uva.nl/2010/10/16/wikipedia-we-have-a-google-refresh-problem/ > The linked blog post laments the lag between the removal of vandalism on > Wikipedia and its re

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Q
On 10/17/2010 7:54 AM, Gerard Meijssen wrote: > Hoi, > If you understand the issue, you would know who decides what qualifies as > vandalism. It is exactly the same people who already decide what vandalism > is. So basically you want anybody who visits the website to be able to tell google to fast

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Robert Stojnic
On 17/10/10 13:34, Q wrote: > On 10/17/2010 5:40 AM, Robert Stojnic wrote: >> I am sure google already taps into recent changes in wikipedia, but it >> might be worth contacting them officially to see if edits marked as >> vandalism can be

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Gerard Meijssen
Hoi, If you understand the issue, you would know who decides what qualifies as vandalism. It is exactly the same people who already decide what vandalism is. Thanks, GerardM On 17 October 2010 14:34, Q wrote: > On 10/17/2010 5:40 AM,

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Q
On 10/17/2010 5:40 AM, Robert Stojnic wrote: > I am sure google already taps into recent changes in wikipedia, but it > might be worth contacting them officially to see if edits marked as > vandalism can be treated with larger priority in their in
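
For illustration only, a sketch of what "tapping into recent changes" could look like from the crawler side: poll the MediaWiki API's list=recentchanges and flag edits whose summaries look like reverts, so those pages could be recrawled first. The revert heuristic and the idea of feeding this to an indexer are assumptions, not an existing Google or MediaWiki feature.

    # Hypothetical sketch: pull recent changes and pick out edits whose
    # summaries look like vandalism reverts, i.e. pages a search engine
    # would want to refresh first.
    import json
    import urllib.parse
    import urllib.request

    API = "https://en.wikipedia.org/w/api.php"

    def recent_changes(limit=50):
        params = urllib.parse.urlencode({
            "action": "query",
            "list": "recentchanges",
            "rcprop": "title|comment|timestamp",
            "rclimit": limit,
            "format": "json",
        })
        req = urllib.request.Request(
            f"{API}?{params}",
            headers={"User-Agent": "rc-revert-sketch/0.1 (illustrative)"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["query"]["recentchanges"]

    def looks_like_revert(comment):
        # Crude heuristic on the edit summary; real tools (rollback, undo)
        # leave more reliable markers.
        c = comment.lower()
        return c.startswith("reverted") or "rvv" in c or "undid revision" in c

    if __name__ == "__main__":
        urgent = {rc["title"] for rc in recent_changes()
                  if looks_like_revert(rc.get("comment", ""))}
        for title in sorted(urgent):
            print("recrawl soon:", title)
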

Re: [Wikitech-l] Wikipedia, we have a Google refresh problem!

2010-10-17 Thread Robert Stojnic
As far as I know, sitemaps are used primarily to inform the search engine of the pages on a website directly, rather than waiting for the search engine to figure them out from links from external sites. I vaguely remember we used to generate sitemaps, but then stopped because google more-or-le
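
To make the mechanism concrete: MediaWiki ships maintenance/generateSitemap.php to (re)build the sitemap files, and a search engine is then pointed at them either via a Sitemap: line in robots.txt or by hitting the engine's ping endpoint. Below is a minimal, hypothetical notification sketch; the sitemap URL and the ping endpoint are assumptions to check against current documentation, not a description of what Wikimedia actually runs.

    # Sketch of the notification half: tell a search engine the sitemap
    # index has changed so it re-reads it. Endpoint and URL are examples.
    import urllib.parse
    import urllib.request

    def ping_search_engine(ping_base, sitemap_url):
        """Request the engine's ping endpoint for the given sitemap URL."""
        url = ping_base + "?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
        with urllib.request.urlopen(url) as resp:
            return resp.status

    if __name__ == "__main__":
        # Google's sitemap ping endpoint as documented around this time;
        # treat the exact URL and the sitemap path as placeholders.
        status = ping_search_engine(
            "https://www.google.com/ping",
            "https://en.wikipedia.org/sitemap-index.xml",
        )
        print("ping returned HTTP", status)
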