Re: [Wikitech-l] Wikimedians are rightfully wary

2012-08-24 Thread Strainu
2012/8/24 Ryan Lane :
>> Your idea is a great one, except... I was going to say "you can't see
>> the forest for the trees", but actually it's the other way around. I
>> think you're too focused on the big picture (communicating with the
>> community) to see that smaller steps can help a great deal.
>>
>
> I haven't seen any small-step solution that improves the situation,
> though. Without two-way communication, it's the WMF telling
> people "here's what we're going to do" without any way for them to
> give us proper feedback. We can't possibly host discussions with all
> of our communities, and it's really unfair to only select the biggest
> ones.

That's exactly what I'm trying to point out to you: the WMF telling
people "here's what we're going to do" *on their home wiki* IS a huge
improvement. Specifically, on ro.wp, instead of 4-5 people seeing
these messages, 50+ people would see the messages on the Village Pump.
That's a ten-fold increase in coverage with very little effort.

>
>> Sure, it's great to have lots of people involved in the discussion
>> leading to a big change, but it's not bad at all to have some people
>> involved in the decision making, but _everybody_ in the loop about the
>> decision taken. Think of it as law-making: some people gather, discuss
>> and take a decision, which is then made public for all interested
>> parties before it comes into force.
>>
>
> I really feel that the blog is the best place for announcements like
> this.

How many people read the blog? How many people combined read the
village pumps of the 10 biggest Wikipedias?

>  There's a number of decent ways to notify the community of
> changes. The blog is likely the easiest route for that.

No, it isn't. The blog simply does not have enough reach and very
likely never will, no matter what you do to make it popular. I could
find tens of other reasons why it's not the best method, but I'll
stick to just one: blog posts are at least 2-3 times longer than
messages on village pumps, which means 2-3 times more time to
translate.

I think the author of the original article said it best: "Agreement
aside, we're seeing a disconnect right now between what the Foundation
is spending resources on and the issues faced by the community." If we
can't agree on the problem, we will have a very hard time finding
solutions.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Yury Katkov
Hi everyone!

I have found myself in the following situation several times: I
created a wiki for some event or small project, everything worked
fine, and after the event or project was done, nobody looked at the
wiki for several months and nothing happened on it. After several
months somebody needs the wiki once again and realizes that the wiki
database now has 3 GB of text spam. Suppose that there is no backup
or rollback option in the wiki hosting. So here is the question: how to

1) remove all the spam
2) delete all the spam accounts
3) reduce the database size from 3 GB back to the original size

Cheers,
Yury Katkov, WikiVote



Re: [Wikitech-l] Nested database transactions

2012-08-24 Thread Daniel Kinzler
On 23.08.2012 22:49, Brion Vibber wrote:
> Well, the main reason is probably that MySQL doesn't support nested
> transactions... trying to simulate them with a counter sounds fragile, as a
> single rollback would roll back the entire transaction "tree", not just
> the last nested one you started (or else do nothing if you just decrement
> the counter, also possibly dangerous if you expected the rollback to work).

To me it seems correct and safe to assume that a ROLLBACK will cause all open
transactions to fail. I don't see any problem with handling things this way. Am
I missing something? Was there any *concrete* problem that caused this feature
to be removed?

On 23.08.2012 23:21, Brion Vibber wrote:
> On Thu, Aug 23, 2012 at 2:02 PM, Evan Priestley 
> wrote:
> 
>> We solve this in Phabricator by using BEGIN (depth 0) or SAVEPOINT (depth
>> 1+) when incrementing the counter, ROLLBACK TO SAVEPOINT (depth 1+) or
>> ROLLBACK (depth 0) when decrementing it after a failure, and nothing (depth
>> 1) or COMMIT (depth 0) when decrementing it after a success. Our experience
>> with transaction stacks has generally been good (no real surprises, doesn't
>> feel magical, significantly reduces the complexity of transactional code),
>> although we don't support anything but MySQL.
>>
> 
> Oooh, nice! Hadn't come across SAVEPOINT before.
> 
> http://dev.mysql.com/doc/refman/5.0/en/savepoint.html

Hm... that's a 404 for me. For some reason, this page is missing from
the 5.0 manual, even though it exists in 4.1 and 5.1:

http://dev.mysql.com/doc/refman/5.1/en/savepoint.html

Anyway, this seems like a neat solution if it is handled automatically by
begin(), rollback() and commit(), so the calling code doesn't have to be aware
of the current transaction level.
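Evan's depth-counter scheme is concrete enough to sketch. Here is a toy illustration in Python against SQLite (which also supports SAVEPOINT); the class and savepoint names are my own, not MediaWiki's or Phabricator's:

```python
import sqlite3

class TransactionManager:
    """Depth-counted transactions: BEGIN at depth 0, SAVEPOINT deeper,
    mirroring the scheme described above."""

    def __init__(self, conn):
        self.conn = conn
        self.depth = 0

    def begin(self):
        if self.depth == 0:
            self.conn.execute("BEGIN")
        else:
            self.conn.execute(f"SAVEPOINT sp{self.depth}")
        self.depth += 1

    def commit(self):
        self.depth -= 1
        if self.depth == 0:
            self.conn.execute("COMMIT")
        else:
            self.conn.execute(f"RELEASE SAVEPOINT sp{self.depth}")

    def rollback(self):
        self.depth -= 1
        if self.depth == 0:
            self.conn.execute("ROLLBACK")
        else:
            self.conn.execute(f"ROLLBACK TO SAVEPOINT sp{self.depth}")

# isolation_level=None disables sqlite3's implicit transactions, so the
# BEGIN/COMMIT issued above are the only ones in play.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE t (x)")

tm = TransactionManager(conn)
tm.begin()                            # depth 0 -> BEGIN
conn.execute("INSERT INTO t VALUES (1)")
tm.begin()                            # depth 1 -> SAVEPOINT sp1
conn.execute("INSERT INTO t VALUES (2)")
tm.rollback()                         # back to depth 1 -> ROLLBACK TO sp1
tm.commit()                           # back to depth 0 -> COMMIT
rows = [r[0] for r in conn.execute("SELECT x FROM t")]
print(rows)  # the outer insert survives; the inner one is gone: [1]
```

The inner rollback only undoes the inner unit of work; a rollback issued at depth 0 would still discard everything, which is the failure mode Brion describes.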

I'm tempted to implement this. Any objections?

-- daniel


-- 
Daniel Kinzler, Softwarearchitekt

Wikimedia Deutschland e.V. | Eisenacher Straße 2 | 10777 Berlin
http://wikimedia.de  | Tel. (030) 219 158 260

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt
für Körperschaften I Berlin, Steuernummer 27/681/51985.


Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread John
Do you have a list of legitimate known good accounts?

On Fri, Aug 24, 2012 at 3:27 AM, Yury Katkov  wrote:
> [...]



Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Neil Harris

On 24/08/12 06:37, Yury Katkov wrote:

> Hi everyone!
> I can see that there are many topics now in this thread, and I will
> probably just add one more topic to this chaos.
> I should remind everyone about the ghetto minority in the MediaWiki
> community: independent wiki owners and businesses that use MediaWiki
> in their solutions.  From this point of view I'm interested in the
> following:
>
> - Is Lua meant to replace ParserFunctions, Variables, Arrays, etc.?
> What are the plans for supporting these extensions?
>
> -
> Yury Katkov




Perhaps ParserFunctions etc. could eventually be supported via a syntax 
extension of the new Lua mechanism, eliminating the need for separate 
extensions?


-- N.




Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Derric Atzrott
>> [...]
>
>Do you have a list of legitimate known good accounts?

I'm actually really interested in this too. I just deleted the databases
for two copies of Mediawiki that I ran for similar reasons...

Thank you,
Derric Atzrott




Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Mark A. Hershberger
On 08/22/2012 05:50 PM, David Gerard wrote:
> Something complicated on en:wp that would not be meaningless on
> mediawiki.org could be copied there for hacking.

I recall that one of Robla's standard articles from enwiki for
demonstrating long rendering time was
.  I just did a purge on it
and it took 34s to render.

I haven't yet looked at how the article is written or the templates
used, but perhaps that would be a good place to start looking.

-- 
http://hexmode.com/

Human evil is not a problem.  It is a mystery.  It cannot be solved.
  -- When Atheism Becomes a Religion, Chris Hedges



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Tei
On 24 August 2012 09:27, Yury Katkov  wrote:
> [...]

No backups, no way to roll back to a date? That's bad.
You could start a wiki from scratch and manually copy over whatever
was good from the old one. Maybe share this task with a few selected
volunteers.
Start the new one without anonymous edits, an attractive theme and a
big campaign to attract people: "Not like the old wiki! This one is
actually good and maintained!"
Maybe the lack of maintenance contributed to the decay. I wonder if a
wiki without enough contributors is worth keeping, like a garden
without anyone to cut the grass.



-- 
--
ℱin del ℳensaje.


Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Derric Atzrott
>> [...]
>
>No backups, no way to roll back to a date? That's bad.
>You could start a wiki from scratch and manually copy over whatever
>was good from the old one. Maybe share this task with a few selected
>volunteers.
>Start the new one without anonymous edits, an attractive theme and a
>big campaign to attract people: "Not like the old wiki! This one is
>actually good and maintained!"
>Maybe the lack of maintenance contributed to the decay. I wonder if a
>wiki without enough contributors is worth keeping, like a garden
>without anyone to cut the grass.

Certainly.  If for no other reason than the historical value.

We still keep all the Wikimania wikis around.

Thank you,
Derric Atzrott




Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread John
Given enough facts, it would be rather easy for me to write a script
that nukes said spam. I did something similar on
http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
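John's script isn't shown, but the triage step such a script needs is easy to sketch. A rough illustration in Python; the spam patterns, the link threshold, and the known-good-accounts whitelist (echoing John's earlier question) are all illustrative assumptions, not a vetted spam model:

```python
import re

# Illustrative heuristics only -- the patterns and threshold below are
# assumptions for the sketch, not a vetted spam classifier.
LINK_RE = re.compile(r"https?://", re.IGNORECASE)
KEYWORD_RE = re.compile(r"\b(viagra|casino|payday)\b", re.IGNORECASE)

def looks_like_spam(text, link_threshold=5):
    """Flag text that is dominated by external links or matches
    known spam keywords."""
    if len(LINK_RE.findall(text)) >= link_threshold:
        return True
    return bool(KEYWORD_RE.search(text))

def partition(pages, good_users):
    """Split {title: (last_editor, text)} into keep/delete lists,
    trusting pages last touched by a known-good account."""
    keep, delete = [], []
    for title, (user, text) in pages.items():
        if user in good_users or not looks_like_spam(text):
            keep.append(title)
        else:
            delete.append(title)
    return keep, delete

pages = {
    "Main Page": ("Alice", "Welcome to our event wiki."),
    "Cheap pills": ("xx123", "viagra http://a.example http://b.example"),
}
keep, delete = partition(pages, good_users={"Alice"})
print(keep, delete)  # ['Main Page'] ['Cheap pills']
```

On a real MediaWiki install, the delete list would then be fed to something like the maintenance/deleteBatch.php script and the freed space reclaimed at the database level; those steps depend on the hosting setup, which the original poster says he doesn't control.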



Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Mark A. Hershberger
On 08/23/2012 12:30 AM, Mike Dupont wrote:
> So please tell me, what are the options to fix this? Is there going to be a
> common code repo and maybe an easy way to sync in a git filesystem of
> template code into the wiki?

Gadgets (v2) makes an attempt to fix the "no central repository"
problem, so maybe that would be a place to start looking.

But MZ is right.  The duplication and inadvertent code forks that result
from not having a way to easily re-use templates and Gadgets are a real
problem that we should start to address.



-- 
http://hexmode.com/

Human evil is not a problem.  It is a mystery.  It cannot be solved.
  -- When Atheism Becomes a Religion, Chris Hedges



Re: [Wikitech-l] Meta: Improper Line Breaks

2012-08-24 Thread Strainu
2012/8/17 Derric Atzrott :
> I get the feeling sometimes that either mailman or Outlook completely
> ignores
>
> where I am putting my line breaks and put them wherever it pleases.

Do you introduce line breaks manually anywhere other than between
paragraphs? If so, the solution is simple: don't do that. Outlook will
automatically wrap lines at a certain number of characters,
configurable from File->Options->Mail->Message Format. If you really
want to control your message, you can set that to a very high value
(150 or so).
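To make the wrap-column behaviour concrete, here is a small illustration (the sample text and widths are arbitrary) of a client re-flowing a paragraph at its configured column, regardless of where the author put breaks:

```python
import textwrap

# One logical paragraph, as the author typed it (no hard line breaks).
msg = ("I get the feeling sometimes that either mailman or Outlook "
       "completely ignores where I am putting my line breaks.")

at_74 = textwrap.fill(msg, width=74)    # a typical wrap column
at_150 = textwrap.fill(msg, width=150)  # a "very high value": no wrap needed

print(at_74)   # re-flowed onto two lines, none longer than 74 characters
print(at_150)  # short enough to stay on a single line
```

Hand-wrapping at 80 and then letting the client re-wrap at a narrower column is exactly how the ragged lines in the quoted message come about.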

Strainu



Re: [Wikitech-l] Meta: Improper Line Breaks

2012-08-24 Thread Derric Atzrott
>> I get the feeling sometimes that either mailman or Outlook completely 
>> ignores
>>
>> where I am putting my line breaks and put them wherever it pleases.
>
>Do you introduce line-breaks manually except when passing to a new
>paragraph? If so, the solution is simple: don't do that. Outlook will
>automatically cut the line to a certain number of characters,
>configurable from File->Options->Mail->Message Format. If you really want
>to control your message, you can set that to a very high value (like
>150 or something).
>
>Strainu


Yes.  I've been manually introducing my line breaks.  I was doing it at 
80 characters; now I am doing it at 74.  I always have a text editor
open, so it isn't too much work for me to just type my message in there
and use the vertical line I have placed at 80 or 74 characters, depending
on what I am doing.

The other reason I manually format my messages though is that I really
hate all of the junk Outlook puts into a message when you reply.  I
much prefer the clean kind of look that these messages have.

I will definitely be changing that option.  Then I can have my text
editor set at 80 all the time.

Thank you,
Derric Atzrott




Re: [Wikitech-l] Wikimedians are rightfully wary

2012-08-24 Thread Alex Brollo
I suppose that a way could be a warning, in the central sitenotice or
another similar space, optionally shown via a gadget or a preference
(default=disabled) on any page of any wiki. This warning should be brief,
informative and focused on possible unexpected results of software changes.

"Normal" users should not see anything; advanced users (sysops and layman
programmers) will surely appreciate it a lot. I remember terrible headaches
trying to fix unexpected, intriguing local bugs in our rich set of local
JavaScript tools on it.source.

Alex Brollo



2012/8/24 Strainu 

> [...]


Re: [Wikitech-l] Meta: Improper Line Breaks

2012-08-24 Thread Strainu
2012/8/24 Derric Atzrott :
> The other reason I manually format my messages though is that I really
> hate all of the junk Outlook puts into a message when you reply.  I
> much prefer the clean kind of look that these messages have.

I'm not sure what you mean by "all of the junk", but you might want to
carefully read all the options from the "Mail" tab. It took me about
an hour, but right now my outlook emails are identical with the
default Gmail configuration.

Strainu



Re: [Wikitech-l] Meta: Improper Line Breaks

2012-08-24 Thread Derric Atzrott
This junk.

Definitely going to peruse through those options though.

Thank you,
Derric Atzrott


-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Strainu
Sent: 24 August 2012 08:47
To: Wikimedia developers
Subject: Re: [Wikitech-l] Meta: Improper Line Breaks

2012/8/24 Derric Atzrott :
> The other reason I manually format my messages though is that I really
> hate all of the junk Outlook puts into a message when you reply.  I
> much prefer the clean kind of look that these messages have.

I'm not sure what you mean by "all of the junk", but you might want to
carefully read all the options from the "Mail" tab. It took me about
an hour, but right now my outlook emails are identical with the
default Gmail configuration.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l




Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Helder .
I think that is
https://bugzilla.wikimedia.org/show_bug.cgi?id=39610
(Scribunto should support global module invocations)

On Fri, Aug 24, 2012 at 9:18 AM, Mark A. Hershberger  wrote:
> On 08/23/2012 12:30 AM, Mike Dupont wrote:
>> So please tell me, what are the options to fix this? Is there going to be a
>> common code repo and maybe an easy way to sync in a git filesystem of
>> template code into the wiki?
>
> Gadgets (v2) makes an attempt to fix the "no central repository"
> problem, so maybe that would be a place to start looking.
>
> But MZ is right.  The duplication and inadvertent code forks that result
> from not having a way to easily re-use templates and Gadgets is a real
> problem that we should start to address.



Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Tyler Romeo
Maybe it's just my opinion, but I believe that even with Lua, the existing
templating system is not entirely obsolete. I'm sure ParserFunctions still
has a legitimate purpose. For example, what about a template that's just a
simple if statement (if this, else that)? There's no real need to make
a Lua module for something that basic.

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com



On Fri, Aug 24, 2012 at 8:56 AM, Helder .  wrote:

> I think that is
> https://bugzilla.wikimedia.org/show_bug.cgi?id=39610
> (Scribunto should support global module invocations)
>
> On Fri, Aug 24, 2012 at 9:18 AM, Mark A. Hershberger 
> wrote:
> > On 08/23/2012 12:30 AM, Mike Dupont wrote:
> >> So please tell me, what are the options to fix this? Is there going to
> be a
> >> common code repo and maybe an easy way to sync in a git filesystem of
> >> template code into the wiki?
> >
> > Gadgets (v2) makes an attempt to fix the "no central repository"
> > problem, so maybe that would be a place to start looking.
> >
> > But MZ is right.  The duplication and inadvertent code forks that result
> > from not having a way to easily re-use templates and Gadgets is a real
> > problem that we should start to address.


Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Derric Atzrott
>Maybe it's just my opinion, but I believe that even with Lua, the existing
>templating system is not entirely obsolete. I'm sure ParserFunctions still
>has a legitimate purpose. For example, what about a template that's just a
>simple if statement (if this, else that)? There's no real need to make
>a Lua module for something that basic.

I'm 100% in agreement on this.  I think that Lua will probably be used, and
perhaps quite heavily, but only in the more advanced templates.  Simple
templates will probably continue to use the current standard template
creation processes.  Honestly, I can't see why they wouldn't.  It's a whole
lot simpler for most people than learning Lua, and most people don't need
Lua, just some people.

Thank you,
Derric Atzrott




Re: [Wikitech-l] GSoC wrapup report: Simple Uploader for Beginners and Experts

2012-08-24 Thread Sumana Harihareswara
On 08/22/2012 05:43 PM, Platonides wrote:
> My GSoC project was developing a program for uploading
> Wiki Loves Monuments photos to Wikimedia Commons [1]
> 
> Timing seemed great, chaining the end of my previous task
> with the beginning of GSoC and the end of GSoC with WLM 2012.
> Maybe it would need a little adjustment at the beginning, but
> since this was not too demanding a GSoC project, there would
> be plenty of time for completion. Or so I thought.
> Work in the Real World got delayed, pushing back the Google
> Summer of Code, which by its own nature had lower priority.
> I also made another misjudgment when considering how much
> I would get done with a "foreign" computer.
> I ended up sprinting through the last days to get a deliverable
> by the deadline, something that seemed impossible at times,
> while I was regretting signing up.
> 
> I should also talk about my relationship with my mentor,
> but it was pretty much non-existent. With little to no
> progress to report, I wasn't motivated to contact her, and she
> was too busy with other duties to contact me. So we failed
> from both sides. :( I'd love to work with her in the future,
> but under different conditions. In the given circumstances,
> I feel it was an error for her to accept me as a student.
> The mentor slot should have been assigned to someone else.
> Had a student new to MediaWiki, with a project needing
> more support, received this level of mentoring, I doubt
> he could have finished or produced a mergeable result.
> OTOH, if someone was going to receive this, I guess I wasn't
> that bad a choice. :/
> 
> Despite the wrecked planning, results weren't too bad.
> I completed a prototype [2], which implements the core
> functionality, asking the user all the needed information and
> performing the uploads. It misses one dialog, and has the big
> drawback of not being internationalised: the application is
> hardcoded in the mixture of languages I used (for distributing
> as mockups) when developing.
> I had envisioned a translation method which involved making
> a new translatewiki backend, but it was clear pretty soon that
> translations would need to stay out of the 'release'.
> Another point I had planned that was not fulfilled was
> preference handling and load/save of sets of monuments. All
> preferences are hardcoded right now. Those are easy to add,
> though.
> Currently, the program has a functionality similar to the Upload
> Wizard, in that you need to be online to work with it. However,
> I feel it is superior in the usecase of uploading a full folder,
> both in selecting and preparation (plus uploading in the background
> while preparing new files). I plan to slowly be adding -outside
> of GSoC- some of those missing features and dealing with the
> (previsible) feedback.
> 
> Although short of time due to the above mentioned issues,
> it was refreshing to do some C++ coding. I had only worked with
> dynamic languages recently, or with lower-level C. C++ provided
> enough class magic to make for comfortable coding, with enough
> pointers to crash your program (and a framework complex
> enough to discard valgrind usage) :)
> I was disappointed with wxWidgets HTTP support, though. There
> were a couple of problems I had to overcome by myself and had to
> patch a third one on the library.
> 
> I have uploaded at [2] the source, windows and linux binaries.
> Everyone is welcome to play with them and test the application.
> Following the lead of the mobile team with beta uploading,
> those binaries upload to http://test.wikipedia.org/ instead
> of Wikimedia Commons. It doesn't have many of the commons
> templates the program expects (you can preview in Wikimedia
> Commons if you wish), but it doesn't disrupt the project.
> 
> Best regards
> 
> 1- http://thread.gmane.org/gmane.org.wikimedia.wikilovesmonuments/2641
> 2- http://toolserver.org/~platonides/sube/

Thanks for the wrapup, Platonides.

I decided to mentor this project because several engineers sounded
enthusiastic about the idea and because Platonides had such a strong
reputation as a past contributor.  I tried to find a mentor with more
domain knowledge, but no one accepted.  I decided that all Platonides
needed was project management, and that I could provide that.  I was
wrong.  I now know that I can't mentor a GSoC student, especially not
while I'm also the organizational administrator, and that if we can't
find an enthusiastic, technically knowledgeable mentor for a student,
then we simply shouldn't accept them at all.

(In my defense, I did ping and email Platonides many times during the
summer, but sometimes we both lagged, a lot, in our responses.  I
already knew it was best practice to have a weekly check-in call, and
it's my fault that we didn't set this up first thing, in May.)

On a more positive note, I'm glad the prototype is out, and I hope other
folks will try out SUBE and that it, along with
https://commons.wikimedia.org/wiki/Commons:Up! and the new WLM Andr

Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Derric Atzrott
> How about a little email thread for us to say nice things about each
> other?

Sumana for being such a kind person and deciding that it would be a good idea
for us to remember that we need to show each other some love sometimes in order
to stay sane.

MZMcBride for sparking some discussions that really needed to be discussed.

All of the people who have helped me troubleshoot that annoying email issue
I was having before.

Everyone on this list for giving me something to do that breaks up my work
day with something of interest to me personally.

Thank you,
Derric Atzrott




Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Andrew Otto
Many many many thanks Rob H, Peter Y, Leslie C, Ben H, Ryan L, Faidon, Daniel 
Z, Mark B, Chris J, and everyone else on the ops team that has put up with my 
IRC poking and prodding thus far.  You guys are a huge help to the analytics 
team.  Thanks for guiding me through and teaching me the systems, and for 
feedback for my puppet stuff. :)


On Aug 23, 2012, at 9:29 PM, Roan Kattouw  wrote:

> On Thu, Aug 23, 2012 at 2:30 AM, Niklas Laxström
>  wrote:
>> * Sumana for this idea.
> +1
> 
> Also:
> * Inez for writing code I intended to write, exactly the way I
> intended to write it, while I was busy with something else yesterday
> * Timo (Krinkle) for announcing he's back from vacation via the gerrit-wm bot
> * The rest of the VE team for being generally awesome
> * Aaron, Ariel, Ben and Faidon (and anyone else that's working on this
> that I'm forgetting) for their relentless work on the Swift migration
> this week
> 
> Roan
> 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Neil Harris

On 24/08/12 14:07, Derric Atzrott wrote:

Maybe it's just my opinion, but I believe that even with Lua, the existing
templating system is not entirely obsolete. I'm sure ParserFunctions still
has a legitimate purpose. For example, what about a template that's just a
simple if statement (if this, else that). There's not really a need to make
a Lua module for something that basic.

I'm 100% in agreement on this.  I think that Lua will probably be used, and
perhaps quite heavily, but only in the more advanced templates.  Simple
templates will probably continue to use the current standard template
creation processes.  Honestly, I can't see why they wouldn't.  It's a whole
lot simpler for most people than learning Lua, and most people don't need
Lua, just some people.

Thank you,
Derric Atzrott



That's right. I see the hierarchy as being something like this:

* simple templates written in the existing template markup, exactly as 
done at present


* complex templates written in Lua, maintained on-wiki, perhaps in a 
central repositry, but again using normal wiki processes for editing, 
protection, etc.


* common library routines written in Lua for addressing common utility 
functions used in many complex templates, maintained somewhere like git, 
with much stricter check-in rules, and viewed as being part of the core 
software


Possibly ParserFunctions will end up being syntactic sugar for invoking 
primitives written in Lua, as part of that set of library routines.
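To make the split concrete: a template body like {{#if: {{{1|}}} | this | that }} needs no Lua at all, while a Scribunto equivalent would look roughly like the sketch below (the module name and function name are hypothetical, purely for illustration):

```lua
-- Hypothetical Module:Choose -- a Scribunto equivalent of the one-line
-- ParserFunctions template {{#if: {{{1|}}} | this | that }}.
local p = {}

function p.main(frame)
    -- frame.args holds the parameters passed via {{#invoke:Choose|main|...}}
    local arg = frame.args[1]
    if arg and arg ~= '' then
        return 'this'
    end
    return 'that'
end

return p
```

It would be invoked as {{#invoke:Choose|main|x}}; for logic this trivial, the wikitext version is clearly simpler, which is exactly Derric's point.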


-- N.




Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Andrew Otto
Oh and thanks to Jeremy B too!  He's been super helpful at directing my 
questions to the proper know-it-all.


On Aug 24, 2012, at 9:58 AM, Andrew Otto  wrote:

> Many many many thanks Rob H, Peter Y, Leslie C, Ben H, Ryan L, Faidon, Daniel 
> Z, Mark B, Chris J, and everyone else on the ops team that has put up with my 
> IRC poking and prodding thus far.  You guys are a huge help to the analytics 
> team.  Thanks for guiding me through and teaching me the systems, and for 
> feedback for my puppet stuff. :)
> 
> 
> On Aug 23, 2012, at 9:29 PM, Roan Kattouw  wrote:
> 
>> On Thu, Aug 23, 2012 at 2:30 AM, Niklas Laxström
>>  wrote:
>>> * Sumana for this idea.
>> +1
>> 
>> Also:
>> * Inez for writing code I intended to write, exactly the way I
>> intended to write it, while I was busy with something else yesterday
>> * Timo (Krinkle) for announcing he's back from vacation via the gerrit-wm bot
>> * The rest of the VE team for being generally awesome
>> * Aaron, Ariel, Ben and Faidon (and anyone else that's working on this
>> that I'm forgetting) for their relentless work on the Swift migration
>> this week
>> 
>> Roan
>> 
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> 




Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Derric Atzrott
>Oh and thanks to Jeremy B too!  He's been super helpful at directing my
>questions to the proper know-it-all.

You know, that is something I don't think many of us think about on this list
terribly often.  We have quite a nice "brain-trust" going here.  Between all of
us, our knowledge is really quite extensive.

It's nice being able to enjoy conversation with very intelligent people on
intelligent topics.  I think that is something everyone on this list can
probably appreciate, right?

Thank you,
Derric Atzrott




Re: [Wikitech-l] GSoC wrapup report: Simple Uploader for Beginners and Experts

2012-08-24 Thread Platonides
On 24/08/12 15:15, Sumana Harihareswara wrote:
> Thanks for the wrapup, Platonides.
> 
> I decided to mentor this project because several engineers sounded
> enthusiastic about the idea and because Platonides had such a strong
> reputation as a past contributor.  I tried to find a mentor with more
> domain knowledge, but no one accepted.  I decided that all Platonides
> needed was project management, and that I could provide that.  I was
> wrong.  

I'd like to add that we discussed Sumana's suitability at the
beginning, and we agreed that I'd seek support from other community
members when I ran into roadblocks during the project.


> I now know that I can't mentor a GSoC student, especially not
> while I'm also the organizational administrator, and that if we can't
> find an enthusiastic, technically knowledgable mentor for a student then
> we simply shouldn't accept them at all.

You're being too harsh with yourself. I'm sure you *can* be a good GSoC
mentor. But we haven't cloned a dozen Sumanas... yet :)


> (In my defense, I did ping and email Platonides many times during the
> summer, but sometimes we both lagged, a lot, in our responses.  I
> already knew it was best practice to have a weekly check-in call, and
> it's my fault that we didn't set this up first thing, in May.)

I'm sorry if someone understood from my mail that I was not contacted at
all by my mentor. Albeit weak, there *was* some communication.
The timing of your vacations was also unfortunate regarding GSoC.


> On a more positive note, I'm glad the prototype is out, and I hope other
> folks will try out SUBE and that it, along with
> https://commons.wikimedia.org/wiki/Commons:Up! and the new WLM Android
> app, will aid this and future mass-uploading to Commons.  I also hope
> the localisation team can give Platonides pointers on what he needs to
> do to make this truly internationalized.

Another similar new project is https://github.com/yarl/vicuna; we seem
to have a proliferation of uploaders recently.




Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Yury Katkov
Hi everyone! I agree with everyone in this thread, but the main
problem is that even if I create a bot or use extensions that remove
pages, the actual database records won't be deleted. If I understand
correctly, the MediaWiki philosophy tells us that we cannot just drop
a page or an account from the database - deletion only means that
those nasty spam pages get hidden.

Consequently, after the deletions the size of my database won't shrink
back to the original 100 MB; it remains around 3 GB, which is a
problem for hosting.

The proposed solution of exporting all the pages to a brand new wiki
solves this problem. Are there any other solutions that don't involve
dropping my old spammed database?
-
Yury Katkov



On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
> Given enough facts it would be rather easy for me to write a script
> that nukes said spam I did something similar on
> http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread John
What can be done after mass deleting is to purge the archive database
table, which should reduce the database size significantly. If you
take a look at the example where I cleaned up an existing site, I
reduced the database size by about 90%.
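A hedged sketch of what purging the archive table can look like in raw SQL (this assumes default table names with no $wgDBprefix; take a full backup first, since it permanently destroys the undelete history):

```sql
-- After mass-deleting the spam pages on-wiki, the deleted revisions
-- live in the archive table.  Emptying it discards the ability to
-- undelete those pages, which is the point here.
TRUNCATE TABLE archive;

-- Reclaim the freed space on disk.
OPTIMIZE TABLE archive;
OPTIMIZE TABLE text;
```

The revision text itself sits in the text table; MediaWiki ships a maintenance script (maintenance/purgeOldText.php) that can remove the orphaned text rows afterwards.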

On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov  wrote:
> Hi everyone! I agree with everyone in this thread, but the main
> problem is that even if I create a bot of use extensions that removes
> pages, the actual database records won't be deleted. If I understand
> correctly, the MediaWiki philosophy tells us that we cannot just drop
> the page or an account from the database - all the deletions means
> only that we will hide those nasty spam pages.
>
> Consequently after the deletions the size of my database won't shrink
> to original 100 Mb, it remains around 3Gb which is a problem for
> hosting.
>
> The proposed solution of exporting all the pages to a brand new wiki
> solves this problem. Are there any other solutions where the dropping
> of my old spammed database does not involved?
> -
> Yury Katkov
>
>
>
> On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
>> Given enough facts it would be rather easy for me to write a script
>> that nukes said spam I did something similar on
>> http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Yury Katkov
I have great respect for the creators of the Article Feedback
extension: we have used it in our wikis and it really rocks. I am also
very thankful to the whole semantic extensions community - the guys
patiently answer my questions every single day and the work they do is
just fantastic.
-
Yury Katkov



On Fri, Aug 24, 2012 at 6:11 PM, Derric Atzrott
 wrote:
>>Oh and thanks to Jeremy B too!  He's been super helpful at directing my
>>questions to the proper know-it-all.
>
> You know, that is something I don't think many of us think about on this list
> terribly often.  We have quite a nice "brain-trust" going here.  Between all 
> of
> us, our knowledge is really quite extensive.
>
> Its nice being able to enjoy conversation with very intelligent people on
> intelligent topics.  I think that is something everyone on this list can
> probably appreciate right?
>
> Thank you,
> Derric Atzrott
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Tyler Romeo
Technically speaking, pages and accounts can be permanently deleted.
(There is an extension for it, I believe.) However, since MediaWiki
does not use foreign keys, you have to be careful not to break things
in the process.

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com



On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov wrote:

> Hi everyone! I agree with everyone in this thread, but the main
> problem is that even if I create a bot of use extensions that removes
> pages, the actual database records won't be deleted. If I understand
> correctly, the MediaWiki philosophy tells us that we cannot just drop
> the page or an account from the database - all the deletions means
> only that we will hide those nasty spam pages.
>
> Consequently after the deletions the size of my database won't shrink
> to original 100 Mb, it remains around 3Gb which is a problem for
> hosting.
>
> The proposed solution of exporting all the pages to a brand new wiki
> solves this problem. Are there any other solutions where the dropping
> of my old spammed database does not involved?
> -
> Yury Katkov
>
>
>
> On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
> > Given enough facts it would be rather easy for me to write a script
> > that nukes said spam I did something similar on
> > http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>


Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Yury Katkov
http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database -
here is the manual on how to purge the archive table! Thanks John,
that's a perfect solution!
-
Yury Katkov



On Fri, Aug 24, 2012 at 7:51 PM, John  wrote:
> What can be done after mass deleting is to purge the archive database
> table which should reduce the database size significantly. If you take
> a look at the the example where I cleaned up an existing site I
> reduced the database size by about 90%
>
> On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov  wrote:
>> Hi everyone! I agree with everyone in this thread, but the main
>> problem is that even if I create a bot of use extensions that removes
>> pages, the actual database records won't be deleted. If I understand
>> correctly, the MediaWiki philosophy tells us that we cannot just drop
>> the page or an account from the database - all the deletions means
>> only that we will hide those nasty spam pages.
>>
>> Consequently after the deletions the size of my database won't shrink
>> to original 100 Mb, it remains around 3Gb which is a problem for
>> hosting.
>>
>> The proposed solution of exporting all the pages to a brand new wiki
>> solves this problem. Are there any other solutions where the dropping
>> of my old spammed database does not involved?
>> -
>> Yury Katkov
>>
>>
>>
>> On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
>>> Given enough facts it would be rather easy for me to write a script
>>> that nukes said spam I did something similar on
>>> http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread John
Like I said, if you want I can whip up a script to nuke the spam; just
drop me an email off-list.

On Fri, Aug 24, 2012 at 11:54 AM, Yury Katkov  wrote:
> http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database and
> here is the manual on how to purge the archive database! Thanks John,
> that's a perfect solution!
> -
> Yury Katkov
>
>
>
> On Fri, Aug 24, 2012 at 7:51 PM, John  wrote:
>> What can be done after mass deleting is to purge the archive database
>> table which should reduce the database size significantly. If you take
>> a look at the the example where I cleaned up an existing site I
>> reduced the database size by about 90%
>>
>> On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov  wrote:
>>> Hi everyone! I agree with everyone in this thread, but the main
>>> problem is that even if I create a bot of use extensions that removes
>>> pages, the actual database records won't be deleted. If I understand
>>> correctly, the MediaWiki philosophy tells us that we cannot just drop
>>> the page or an account from the database - all the deletions means
>>> only that we will hide those nasty spam pages.
>>>
>>> Consequently after the deletions the size of my database won't shrink
>>> to original 100 Mb, it remains around 3Gb which is a problem for
>>> hosting.
>>>
>>> The proposed solution of exporting all the pages to a brand new wiki
>>> solves this problem. Are there any other solutions where the dropping
>>> of my old spammed database does not involved?
>>> -
>>> Yury Katkov
>>>
>>>
>>>
>>> On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
 Given enough facts it would be rather easy for me to write a script
 that nukes said spam I did something similar on
 http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Sumana Harihareswara
Speaking of scripts, it would be cool if someone would polish this set
of anti-spam scripts a little bit and see if it's worth advertising more:

 https://www.noisebridge.net/wiki/Secretaribot
 https://github.com/dannyob/secretaribot

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

On 08/24/2012 11:55 AM, John wrote:
> Like I said if you want I can whip up a script to nuke the spam, just
> drop me an email off list
> 
> On Fri, Aug 24, 2012 at 11:54 AM, Yury Katkov  wrote:
>> http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database and
>> here is the manual on how to purge the archive database! Thanks John,
>> that's a perfect solution!
>> -
>> Yury Katkov
>>
>>
>>
>> On Fri, Aug 24, 2012 at 7:51 PM, John  wrote:
>>> What can be done after mass deleting is to purge the archive database
>>> table which should reduce the database size significantly. If you take
>>> a look at the the example where I cleaned up an existing site I
>>> reduced the database size by about 90%
>>>
>>> On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov  
>>> wrote:
 Hi everyone! I agree with everyone in this thread, but the main
 problem is that even if I create a bot of use extensions that removes
 pages, the actual database records won't be deleted. If I understand
 correctly, the MediaWiki philosophy tells us that we cannot just drop
 the page or an account from the database - all the deletions means
 only that we will hide those nasty spam pages.

 Consequently after the deletions the size of my database won't shrink
 to original 100 Mb, it remains around 3Gb which is a problem for
 hosting.

 The proposed solution of exporting all the pages to a brand new wiki
 solves this problem. Are there any other solutions where the dropping
 of my old spammed database does not involved?
 -
 Yury Katkov



 On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
> Given enough facts it would be rather easy for me to write a script
> that nukes said spam I did something similar on
> http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l





Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Yury Katkov
Hi John, thanks! Take your time! If you already have such a script and
can share it, please do! But if not, I think it will be a good
exercise in pywikipediabot or extension development for me.
-
Yury Katkov



On Fri, Aug 24, 2012 at 7:55 PM, John  wrote:
> Like I said if you want I can whip up a script to nuke the spam, just
> drop me an email off list
>
> On Fri, Aug 24, 2012 at 11:54 AM, Yury Katkov  wrote:
>> http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database and
>> here is the manual on how to purge the archive database! Thanks John,
>> that's a perfect solution!
>> -
>> Yury Katkov
>>
>>
>>
>> On Fri, Aug 24, 2012 at 7:51 PM, John  wrote:
>>> What can be done after mass deleting is to purge the archive database
>>> table which should reduce the database size significantly. If you take
>>> a look at the the example where I cleaned up an existing site I
>>> reduced the database size by about 90%
>>>
>>> On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov  
>>> wrote:
 Hi everyone! I agree with everyone in this thread, but the main
 problem is that even if I create a bot of use extensions that removes
 pages, the actual database records won't be deleted. If I understand
 correctly, the MediaWiki philosophy tells us that we cannot just drop
 the page or an account from the database - all the deletions means
 only that we will hide those nasty spam pages.

 Consequently after the deletions the size of my database won't shrink
 to original 100 Mb, it remains around 3Gb which is a problem for
 hosting.

 The proposed solution of exporting all the pages to a brand new wiki
 solves this problem. Are there any other solutions where the dropping
 of my old spammed database does not involved?
 -
 Yury Katkov



 On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
> Given enough facts it would be rather easy for me to write a script
> that nukes said spam I did something similar on
> http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread John
It's rather easy to write in pywiki; I just need some information from
you about your wiki (e.g. are all edits after X date bad, do we only
have Y valid users and what are their names, etc.). Details like that
let me tailor the script to your needs.
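The selection logic John describes ("all edits after X date are bad, only users Y are valid") is easy to express on its own; here is a minimal, self-contained illustration of just that filter (the titles, usernames, and cutoff date below are made up, not taken from Yury's wiki, and actually deleting the pages would still go through pywikibot or the API):

```python
from datetime import datetime

# Hypothetical edit records: (page title, UTC timestamp, username).
edits = [
    ("Main Page",    datetime(2011, 3, 1), "Admin"),
    ("Cheap pills",  datetime(2012, 6, 5), "Spammer1"),
    ("Help:Editing", datetime(2012, 7, 1), "Admin"),
]

VALID_USERS = {"Admin"}            # the Y known-good accounts
CUTOFF = datetime(2012, 1, 1)      # the X date after which spam started

def pages_to_nuke(edits, cutoff, valid_users):
    """Titles edited after the cutoff by anyone outside the trusted set."""
    return sorted({title for title, ts, user in edits
                   if ts > cutoff and user not in valid_users})

print(pages_to_nuke(edits, CUTOFF, VALID_USERS))  # → ['Cheap pills']
```

"Help:Editing" survives because a trusted user made the edit, even though it is after the cutoff; only untrusted post-cutoff pages are flagged.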

On Fri, Aug 24, 2012 at 12:03 PM, Yury Katkov  wrote:
> Hi John, thanks! Take your time! If you already have such a script,
> and can share it - please do! But if not - I think it will be a good
> exercise in pywikipediabot or extension development for me.
> -
> Yury Katkov
>
>
>
> On Fri, Aug 24, 2012 at 7:55 PM, John  wrote:
>> Like I said if you want I can whip up a script to nuke the spam, just
>> drop me an email off list
>>
>> On Fri, Aug 24, 2012 at 11:54 AM, Yury Katkov  wrote:
>>> http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database and
>>> here is the manual on how to purge the archive database! Thanks John,
>>> that's a perfect solution!
>>> -
>>> Yury Katkov
>>>
>>>
>>>
>>> On Fri, Aug 24, 2012 at 7:51 PM, John  wrote:
 What can be done after mass deleting is to purge the archive database
 table which should reduce the database size significantly. If you take
 a look at the the example where I cleaned up an existing site I
 reduced the database size by about 90%

 On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov  
 wrote:
> Hi everyone! I agree with everyone in this thread, but the main
> problem is that even if I create a bot of use extensions that removes
> pages, the actual database records won't be deleted. If I understand
> correctly, the MediaWiki philosophy tells us that we cannot just drop
> the page or an account from the database - all the deletions means
> only that we will hide those nasty spam pages.
>
> Consequently after the deletions the size of my database won't shrink
> to original 100 Mb, it remains around 3Gb which is a problem for
> hosting.
>
> The proposed solution of exporting all the pages to a brand new wiki
> solves this problem. Are there any other solutions where the dropping
> of my old spammed database does not involved?
> -
> Yury Katkov
>
>
>
> On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
>> Given enough facts it would be rather easy for me to write a script
>> that nukes said spam I did something similar on
>> http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Amir E. Aharoni
> How about a little email thread for us to say nice things about each
> other?  Rules: be kind, thank someone, and say why you're thanking them.

The order is random.

* To Victor Vasiliev (vvv) for his ongoing commitment to making the
templates not suck.

* To SPQRobin for his ongoing commitment to making MediaWiki usable to
users who speak all languages, no matter how small or exotic.

* To Krinkle, for being endlessly and tirelessly helpful,
professional, eloquent and just plain nice.

* To Yuvi and the rest of the mobile team for solving so many bugs so quickly.

* Finally, to Katie, Lydia, Denny, Silke, Jens, Daniel and the rest of
the Wikidata crew, for doing their Magick and for being Very Seriously
Committed to making Wikidata no less than a Great Success.

I could go on for very long, but I'll stop here.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Yury Katkov
I think we have a date after which there was only spam.
-
Yury Katkov



On Fri, Aug 24, 2012 at 8:07 PM, John  wrote:
> Its rather easy to write in pywiki I just need some information from
> you about your wiki. (IE are all edits after X date bad, we only have
> Y valid users and here are their names) exc stuff like that allows me
> to tailor the script to your needs.
>
> On Fri, Aug 24, 2012 at 12:03 PM, Yury Katkov  wrote:
>> Hi John, thanks! Take your time! If you already have such a script,
>> and can share it - please do! But if not - I think it will be a good
>> exercise in pywikipediabot or extension development for me.
>> -
>> Yury Katkov
>>
>>
>>
>> On Fri, Aug 24, 2012 at 7:55 PM, John  wrote:
>>> Like I said if you want I can whip up a script to nuke the spam, just
>>> drop me an email off list
>>>
>>> On Fri, Aug 24, 2012 at 11:54 AM, Yury Katkov  
>>> wrote:
 http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database and
 here is the manual on how to purge the archive database! Thanks John,
 that's a perfect solution!
 -
 Yury Katkov



 On Fri, Aug 24, 2012 at 7:51 PM, John  wrote:
> What can be done after mass deleting is to purge the archive database
> table which should reduce the database size significantly. If you take
> a look at the the example where I cleaned up an existing site I
> reduced the database size by about 90%
>
> On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov  
> wrote:
>> Hi everyone! I agree with everyone in this thread, but the main
>> problem is that even if I create a bot of use extensions that removes
>> pages, the actual database records won't be deleted. If I understand
>> correctly, the MediaWiki philosophy tells us that we cannot just drop
>> the page or an account from the database - all the deletions means
>> only that we will hide those nasty spam pages.
>>
>> Consequently after the deletions the size of my database won't shrink
>> to original 100 Mb, it remains around 3Gb which is a problem for
>> hosting.
>>
>> The proposed solution of exporting all the pages to a brand new wiki
>> solves this problem. Are there any other solutions where the dropping
>> of my old spammed database does not involved?
>> -
>> Yury Katkov
>>
>>
>>
>> On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
>>> Given enough facts it would be rather easy for me to write a script
>>> that nukes said spam I did something similar on
>>> http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread John
Can I get a link to your site? I would love to take a look and write
you that script. (I always love a challenge.)

On Fri, Aug 24, 2012 at 12:10 PM, Yury Katkov  wrote:
> I think that we have the date after which there was only spam.
> -
> Yury Katkov
>
>
>
> On Fri, Aug 24, 2012 at 8:07 PM, John  wrote:
>> Its rather easy to write in pywiki I just need some information from
>> you about your wiki. (IE are all edits after X date bad, we only have
>> Y valid users and here are their names) exc stuff like that allows me
>> to tailor the script to your needs.
>>
>> On Fri, Aug 24, 2012 at 12:03 PM, Yury Katkov  wrote:
>>> Hi John, thanks! Take your time! If you already have such a script,
>>> and can share it - please do! But if not - I think it will be a good
>>> exercise in pywikipediabot or extension development for me.
>>> -
>>> Yury Katkov
>>>
>>>
>>>
>>> On Fri, Aug 24, 2012 at 7:55 PM, John  wrote:
 Like I said if you want I can whip up a script to nuke the spam, just
 drop me an email off list

 On Fri, Aug 24, 2012 at 11:54 AM, Yury Katkov  
 wrote:
> http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database and
> here is the manual on how to purge the archive database! Thanks John,
> that's a perfect solution!
> -
> Yury Katkov
>
>
>
> On Fri, Aug 24, 2012 at 7:51 PM, John  wrote:
>> What can be done after mass deleting is to purge the archive database
>> table which should reduce the database size significantly. If you take
>> a look at the the example where I cleaned up an existing site I
>> reduced the database size by about 90%
>>
>> On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov  
>> wrote:
>>> Hi everyone! I agree with everyone in this thread, but the main
>>> problem is that even if I create a bot of use extensions that removes
>>> pages, the actual database records won't be deleted. If I understand
>>> correctly, the MediaWiki philosophy tells us that we cannot just drop
>>> the page or an account from the database - all the deletions means
>>> only that we will hide those nasty spam pages.
>>>
>>> Consequently after the deletions the size of my database won't shrink
>>> to original 100 Mb, it remains around 3Gb which is a problem for
>>> hosting.
>>>
>>> The proposed solution of exporting all the pages to a brand new wiki
>>> solves this problem. Are there any other solutions where the dropping
>>> of my old spammed database does not involved?
>>> -
>>> Yury Katkov
>>>
>>>
>>>
>>> On Fri, Aug 24, 2012 at 4:13 PM, John  wrote:
 Given enough facts it would be rather easy for me to write a script
 that nukes said spam I did something similar on
 http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Daniel Friesen
Be aware that by default InnoDB uses a file called ibdata1 to do all of
its data storage.

When you remove data from the database, InnoDB does not shrink ibdata1 down.
So even if you reduce your 3GB database down to under 1GB, you merely have
room for 2GB of content to be added before ibdata1 grows again; the actual
size on disk that your database takes up will likely remain at 3GB.


So if you really want to reduce on-disk size, exporting and re-importing at
least your raw database at some point becomes necessary, since InnoDB will
never give you that disk space back.
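For readers in the same boat, the dump-and-rebuild cycle might look roughly like the commands built below. This is a sketch, not tested advice: the flags are from memory, and actually reclaiming ibdata1 additionally requires stopping mysqld and removing ibdata1/ib_logfile* after dumping *all* databases (or running with innodb_file_per_table from the start). Back everything up first.

```python
def rebuild_commands(db, user, dump_file="dump.sql"):
    """Shell commands for a dump-and-reimport cycle (illustrative only)."""
    return [
        # 1. Export the (already spam-cleaned) database to SQL.
        "mysqldump -u {u} -p {db} > {f}".format(u=user, db=db, f=dump_file),
        # 2. Drop and recreate it; with innodb_file_per_table this frees
        #    the per-table files, otherwise ibdata1 keeps its size.
        "mysql -u {u} -p -e 'DROP DATABASE {db}; CREATE DATABASE {db};'".format(u=user, db=db),
        # 3. Re-import into the fresh, compact database.
        "mysql -u {u} -p {db} < {f}".format(u=user, db=db, f=dump_file),
    ]

for cmd in rebuild_commands("wikidb", "wikiuser"):
    print(cmd)
```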


On Fri, 24 Aug 2012 08:54:26 -0700, Yury Katkov   
wrote:



http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database and
here is the manual on how to purge the archive database! Thanks John,
that's a perfect solution!
-
Yury Katkov



On Fri, Aug 24, 2012 at 7:51 PM, John  wrote:

What can be done after mass deleting is to purge the archive database
table which should reduce the database size significantly. If you take
a look at the the example where I cleaned up an existing site I
reduced the database size by about 90%

On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov   
wrote:

Hi everyone! I agree with everyone in this thread, but the main
problem is that even if I create a bot of use extensions that removes
pages, the actual database records won't be deleted. If I understand
correctly, the MediaWiki philosophy tells us that we cannot just drop
the page or an account from the database - all the deletions means
only that we will hide those nasty spam pages.

Consequently after the deletions the size of my database won't shrink
to original 100 Mb, it remains around 3Gb which is a problem for
hosting.

The proposed solution of exporting all the pages to a brand new wiki
solves this problem. Are there any other solutions where the dropping
of my old spammed database does not involved?
-
Yury Katkov



On Fri, Aug 24, 2012 at 4:13 PM, John   
wrote:

Given enough facts it would be rather easy for me to write a script
that nukes said spam I did something similar on
http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand




--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Derric Atzrott
>Its rather easy to write in pywiki I just need some information from
>you about your wiki. (IE are all edits after X date bad, we only have
>Y valid users and here are their names) exc stuff like that allows me
>to tailor the script to your needs.
>
>Can I get a link to your site? I would love to take a look and write
>you that script, (I always love a challenge)

If you give your script configuration variables or something along those
lines for these different things, then you could release it and many people
could be helped by it.

If you do decide to release it, I would cross-post to the mailing list for
MediaWiki administrators as well.  I'm sure someone on there could use it.

Thank you,
Derric Atzrott




Re: [Wikitech-l] Nested database transactions

2012-08-24 Thread Daniel Kinzler
On 24.08.2012 03:14, Aaron Schulz wrote:
> SAVEPOINTs are useful if we really need to support people rollback
> transactions *and* we need nested transaction support. I think they could be
> made to work, but I'm not sold on their necessity for any use cases we have.

So, how would you solve the use case I described? What I need to do is to
perform some checks before calling WikiPage::doEdit, and make sure the result of
the check is still valid when the actual save occurs.

I can't see a clean way to do this without supporting nested transactions in
*some* way.

-- daniel




Re: [Wikitech-l] Test error

2012-08-24 Thread Jeroen De Dauw
Hey,

I just ran into this issue again by doing a self join somewhere. In the
meanwhile some people pointed out that this 1137 error is caused by MySQL
not supporting more than one reference to a temporary table in a single
query. And we're using temporary tables for running our PHPUnit tests. Is
it possible to change our test setup to use non-temporary tables?

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--


Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread John
I've got a script, but would like to test it before I make it public. If
someone has a site with spam and would let me test it, it would be
appreciated.

On Fri, Aug 24, 2012 at 12:20 PM, Derric Atzrott
 wrote:
>>Its rather easy to write in pywiki I just need some information from
>>you about your wiki. (IE are all edits after X date bad, we only have
>>Y valid users and here are their names) exc stuff like that allows me
>>to tailor the script to your needs.
>>
>>Can I get a link to your site? I would love to take a look and write
>>you that script, (I always love a challenge)
>
> If you make your script have some sort of configuration variables or something
> along those lines for these different things, then you could release it and
> many people could be helped by it.
>
> If you do decide to release it.  I would cross post to the mailing list for
> Mediawiki administrators as well.  I'm sure someone on there could use it.
>
> Thank you,
> Derric Atzrott
>
>


Re: [Wikitech-l] Nested database transactions

2012-08-24 Thread Tyler Romeo
> So, how would you solve the use case I described? What I need to do is to
> perform some checks before calling WikiPage::doEdit, and make sure the
result of
> the check is still valid when the actual save occurs.

SAVEPOINTs are basically nested transactions. Can you describe the use case
in more detail?

--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com



On Fri, Aug 24, 2012 at 12:36 PM, Daniel Kinzler wrote:

> On 24.08.2012 03:14, Aaron Schulz wrote:
> > SAVEPOINTs are useful if we really need to support people rollback
> > transactions *and* we need nested transaction support. I think they
> could be
> > made to work, but I'm not sold on their necessity for any use cases we
> have.
>
> So, how would you solve the use case I described? What I need to do is to
> perform some checks before calling WikiPage::doEdit, and make sure the
> result of
> the check is still valid when the actual save occurs.
>
> I can't see a clean way to do this without supporting nested transactions
> in
> *some* way.
>
> -- daniel
>
>


[Wikitech-l] Wikidata status

2012-08-24 Thread Daniel Kinzler
Hi all

Here's a quick update on what has been happening with Wikidata.


After I was off duty last week, I have picked up work on the Wikidata branch
again (aka the ContentHandler stuff). Discussion is happening mostly on 
Bugzilla:

https://bugzilla.wikimedia.org/show_bug.cgi?id=38622

I have tried to address several of Tim's concerns, and will hopefully wrap up
the loose ends early next week. Here's the Gerrit log:

https://gerrit.wikimedia.org/r/#/q/project:mediawiki/core+branch:Wikidata,n,z

While I'm still working on this, I'm very much welcoming any comments, reviews,
criticism, whatever. Please look at the code and let me know what you think!


Other than that, there's the ongoing discussion about a new system for managing
interwiki/interlanguage link targets. The RFC is here:

https://www.mediawiki.org/wiki/Requests_for_comment/New_sites_system

Here, too, we are working on it, but more comments are still encouraged.


Thanks
Daniel



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Tyler Romeo
I do! http://wiki.sittv.com has been building up spam for a number of
months (or longer).

--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com



On Fri, Aug 24, 2012 at 12:52 PM, John  wrote:

> Ive got a script but would like to test it before I make it public. If
> someone has a site with spam and would let me test it, it would be
> appreciated
>
> On Fri, Aug 24, 2012 at 12:20 PM, Derric Atzrott
>  wrote:
> >>Its rather easy to write in pywiki I just need some information from
> >>you about your wiki. (IE are all edits after X date bad, we only have
> >>Y valid users and here are their names) exc stuff like that allows me
> >>to tailor the script to your needs.
> >>
> >>Can I get a link to your site? I would love to take a look and write
> >>you that script, (I always love a challenge)
> >
> > If you make your script have some sort of configuration variables or
> something
> > along those lines for these different things, then you could release it
> and
> > many people could be helped by it.
> >
> > If you do decide to release it.  I would cross post to the mailing list
> for
> > Mediawiki administrators as well.  I'm sure someone on there could use
> it.
> >
> > Thank you,
> > Derric Atzrott
> >
> >


[Wikitech-l] Code ideas thread

2012-08-24 Thread Daniel Friesen
Meta discussions about community, appreciation threads, GSoC wrap-ups,
deployment threads, and orthogonal questions.
Lately wikitech-l seems almost devoid of one of the most important
categories of discussion I like to see here.


Discussions on adding new features to MediaWiki!

So, just like Sumana's "Appreciation thread" how about a little thread  
dedicated to listing out things we'd like to see in MediaWiki or perhaps  
would like to write ourselves.
Not really big things like VisualEditor, Wikidata, and Lua, which have teams
of people within WMF working on them, but rather those other important
things a lot of us may want but that always end up pushed to the side and
forgotten.


For me...
Before I list the small stuff, here are 3 big projects right now that I wish I
could work on but won't possibly have the time for unless I find someone
willing to pay me enough to drop a normal job and dedicate my programming
time to writing things for MediaWiki:
- Gareth: It's not exactly a MediaWiki feature. But with the Gerrit  
annoyances and talk about other review systems I've had a really good idea  
how to do a review system right this time around. It would be nice to  
spend a pile of time turning it into a system that we could actually use  
for our code review.
- OAuth: Well, not actually OAuth. After getting a full understanding of
this topic, implementation of actual OAuth (1&2) looks like a dark
dead-end. Rather than OAuth I'd like to write a new auth standard that
learns from all the good things and the mistakes made in both versions of
OAuth and takes note of all the things we really need. And then implement
it into MediaWiki and write a series of server and client libraries/SDKs
so it's also easier to pick up than either version of OAuth.
- Machine-Learning based Anti-spam: Wikipedia has bots like ClueBot NG  
dealing with spam. It would be nice to have machine-learning based  
anti-spam built into a MediaWiki extension with a nice intuitive user  
interface usable outside of WMF so all wikis can have great anti-spam.



Now some old and forgotten code topics:
- 404 routing: I'd like us to get to the point where we can set  
ErrorDocument 404 /w/index.php and MediaWiki will automatically start  
doing short urls, outputting 404 pages for you, and acting as an implicit  
thumbnail handler.
- Title rewrite: An ancient topic... updating our handling of the page
table and titles in general so that the case, whitespace, and all the
stuff in a title that just gets normalized away is correctly remembered.
So that [[iPod]], even though it's the same as [[IPod]], will always
display as "iPod" even in lists outside of the page itself, such as
Special:Allpages.
- Password reset tokens: It's unbelievable but we are STILL using  
temporary passwords instead of reset tokens. Naturally this is less usable  
and also lowers the security of our password reset system.
- An abstract revision system. The way we shove configuration into i18n,
i18n into articles, and scripts and stylesheets into articles (and
extensions go and do the same), all just to get proper revisioning of
things, is horrible. Not to mention the extensions that don't, and instead
rely on our logging system, which makes it harder to revert things. With
all this in mind, I'd like to see an abstract system that lets extensions
have their own revision system outside of page content for whatever they
need to do.


https://www.mediawiki.org/wiki/User:Dantman/Code_Ideas
https://www.mediawiki.org/wiki/User:Dantman/Abstract_Revision_System
https://www.mediawiki.org/wiki/User:Dantman/Code_Ideas/PageLayouts
https://www.mediawiki.org/wiki/User:Dantman/Anti-spam_system
https://www.mediawiki.org/wiki/Requests_for_comment/Entrypoint_Routing_and_404_handling
https://www.mediawiki.org/wiki/User:Dantman/CodeReviewSystem and  
http://gareth-review.com/


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Daniel Zahn
in random order:

* To James Alexander for fixing a lot of Planet URLs.

* To Ryan and Faidon for their replies on the mailing list archive issues

* To Ben for advice on all the benefits package paperwork i was not
familiar with

* To Mark H. for still replying to my Bugzilla questions even though
he is busy with non-wmf things



Re: [Wikitech-l] Nested database transactions

2012-08-24 Thread Daniel Kinzler
On 24.08.2012 18:55, Tyler Romeo wrote:
>> So, how would you solve the use case I described? What I need to do is to
>> perform some checks before calling WikiPage::doEdit, and make sure the
> result of
>> the check is still valid when the actual save occurs.
> 
> SAVEPOINTs are basically nested transactions. 

Yes. I'd like to use them.

MediaWiki's current behaviour is that calling begin() when a transaction is
open silently commits the old transaction and starts a new one.

This SUCKS.

> Can you describe the use case
> in more detail?

So, in wikidata, we have global constraints, e.g. the requirement that only one
data item can have a sitelink to a given wikipedia page (there's a 1:1
relationship between wikipedia pages and data items). Before saving the item
page, this constraint needs to be checked, just before calling
WikiPage::doEdit(). And we also want to check for edit conflicts (comparing
the base revision - note that we are not using EditPage).

Anyway, we need to do some checks before we call WikiPage::doEdit, and make
sure the database doesn't change before the actual save is done. So our checks
should be in the same transaction as the actual save.

But WikiPage::doEdit already opens a transaction. So we cannot open a
surrounding transaction bracket - because nested transactions are not
supported.

This could be solved by the "counting" or the "savepoint" solution, the latter
being somewhat nicer. But we need at least *one* of them, as far as I can
tell.

The current situation is that code always has to know whether it is safe to call
some function X from inside a transaction, and conversely, any function needs to
decide on whether it expects to be called from within an existing transaction,
or if it should open its own.

These things can often not really be known in advance. This has caused trouble
in the past (caused by transactions being committed prematurely, because another
transaction started). I'm sure it will cause more pain in the future.

So I'm proposing to implement support for nested transactions, either by just
counting (and, on rollback, rolling back all open transactions), or by using
savepoints.

-- daniel
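The counting-plus-savepoints scheme Daniel proposes can be sketched in a few lines. This is illustrative only: it uses Python and SQLite (which also supports SAVEPOINT) rather than MediaWiki's PHP database layer, and all names are mine, not a proposed implementation.

```python
import sqlite3

class NestedTx:
    """Counting wrapper: the outermost begin() starts a real transaction,
    inner begin()s create savepoints, so an inner rollback no longer
    destroys (or a nested begin() no longer silently commits) the outer one."""
    def __init__(self, conn):
        self.conn, self.level = conn, 0

    def begin(self):
        if self.level == 0:
            self.conn.execute("BEGIN")
        else:
            self.conn.execute("SAVEPOINT sp%d" % self.level)
        self.level += 1

    def commit(self):
        self.level -= 1
        if self.level == 0:
            self.conn.execute("COMMIT")
        else:
            self.conn.execute("RELEASE SAVEPOINT sp%d" % self.level)

    def rollback(self):
        self.level -= 1
        if self.level == 0:
            self.conn.execute("ROLLBACK")
        else:
            self.conn.execute("ROLLBACK TO SAVEPOINT sp%d" % self.level)
            self.conn.execute("RELEASE SAVEPOINT sp%d" % self.level)

conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE page (title TEXT)")
tx = NestedTx(conn)
tx.begin()                                    # outer check-then-save bracket
conn.execute("INSERT INTO page VALUES ('Outer')")
tx.begin()                                    # inner doEdit()-style transaction
conn.execute("INSERT INTO page VALUES ('Inner')")
tx.rollback()                                 # inner failure...
tx.commit()                                   # ...outer work survives
print(conn.execute("SELECT title FROM page").fetchall())  # [('Outer',)]
```

With the "counting only" variant, the inner rollback would instead have to abort the whole stack; the savepoint variant is what lets the outer bracket survive.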




Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Tyler Romeo
> - OAuth: Well not actually OAuth. After getting a full understanding of
this topic
> implementation of actual OAuth (1&2) looks like a dark dead-end. Rather
than OAuth I'd like
>  to write a new auth standard that learns from all the good things and
the mistakes made in
> both versions of OAuth and takes note of all the things we really need.
And then implement it
> into MediaWiki and write a series of server and client libraries/sdks so
it's also easier to pick
> up than either OAuth.

Not a good idea: http://xkcd.com/927/
While OAuth has its problems, it's not a terrible protocol (or at least v1
isn't).

> Password reset tokens: It's unbelievable but we are STILL using temporary
passwords
> instead of reset tokens. Naturally this is less usable and also lowers
the security of our
> password reset system.

My focus lately has been on security, so I may take this on in the near
future.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com



On Fri, Aug 24, 2012 at 1:05 PM, Daniel Friesen
wrote:

> Meta discussions over community, Appreciation threads, GSoC wrapups,
> Deployment threads, and orthogonal questions.
> Lately wikitech-l seems to be almost void of one of the most important
> categories of discussion I like to see here.
>
> Discussions on adding new features to MediaWiki!
>
> So, just like Sumana's "Appreciation thread" how about a little thread
> dedicated to listing out things we'd like to see in MediaWiki or perhaps
> would like to write ourselves.
> Not really big things like VisualEditor, Wikidata, and Lua who have teams
> of people within WMF working on them. But rather those other important
> things a lot of us may want but always end up pushed to the side and
> forgotten.
>
> For me...
> Before I list the small stuff here are 3 big projects right now I wish I
> could work on but won't possibly have the time unless I find someone
> willing to pay me enough to drop a normal job an dedicate my programming
> time to writing things for MediaWiki:
> - Gareth: It's not exactly a MediaWiki feature. But with the Gerrit
> annoyances and talk about other review systems I've had a really good idea
> how to do a review system right this time around. It would be nice to spend
> a pile of time turning it into a system that we could actually use for our
> code review.
> - OAuth: Well not actually OAuth. After getting a full understanding of
> this topic implementation of actual OAuth (1&2) looks like a dark dead-end.
> Rather than OAuth I'd like to write a new auth standard that learns from
> all the good things and the mistakes made in both versions of OAuth and
> takes note of all the things we really need. And then implement it into
> MediaWiki and write a series of server and client libraries/sdks so it's
> also easier to pick up than either OAuth.
> - Machine-Learning based Anti-spam: Wikipedia has bots like ClueBot NG
> dealing with spam. It would be nice to have machine-learning based
> anti-spam built into a MediaWiki extension with a nice intuitive user
> interface usable outside of WMF so all wikis can have great anti-spam.
>
>
> Now some old and forgotten code topics:
> - 404 routing: I'd like us to get to the point where we can set
> ErrorDocument 404 /w/index.php and MediaWiki will automatically start doing
> short urls, outputting 404 pages for you, and acting as an implicit
> thumbnail handler.
> - Title rewrite: Aaaaincient topic... updating our handling of the page
> table and titles in general so that the case, whitespace, and all the stuff
> in a title that just get's normalized away is correctly remembered. So that
> [[iPod]], even though it's the same as [[IPod]] will always display as
> "iPod" even in lists outside of the page itself such as Special:Allpages
> - Password reset tokens: It's unbelievable but we are STILL using
> temporary passwords instead of reset tokens. Naturally this is less usable
> and also lowers the security of our password reset system.
> - An abstract revision system. The way we shove configuration into i18n,
> i18n into articles, scripts and stylesheets into articles, and extensions
> go and do the same. All just to get proper revisioning of things. Is
> horrible. Not to mention the extensions that don't and rely on our logging
> system which makes it harder to revert things. With all this together I'd
> like to see an abstract system that lets extensions have their own revision
> system outside of page content for whatever they need to do.
> 
> https://www.mediawiki.org/wiki/User:Dantman/Code_Ideas
> https://www.mediawiki.org/wiki/User:Dantman/Abstract_Revision_System
> https://www.mediawiki.org/wiki/User:Dantman/Code_Ideas/PageLayouts
> https://www.mediawiki.org/wi

Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Rob Lanphier
On Fri, Aug 24, 2012 at 4:59 AM, Mark A. Hershberger  wrote:
> I recall that one of Robla's standard articles from enwiki for
> demonstrating long rendering time was
> .  I just did a purge on it
> and it took 34s to render.

Hi Mark, thanks for pointing that out.  A better way to test this is
to preview without (necessarily) changing anything, since the article
will still be available to anyone else who requests it during the 34s
it takes to parse it.  Sucks that we should have to think about that,
but that'll hopefully be one of the things that this fixes.

> I haven't yet looked at how the article is written or the templates
> used, but perhaps that would be a good place to start looking.

I'm pretty sure that all of the {{Cite}} templates are the major
consumers on that page.  Maybe we can just get people to stop citing
their sources ;-)

Rob



Re: [Wikitech-l] Wikimedians are rightfully wary

2012-08-24 Thread Quim Gil
On Fri, Aug 24, 2012 at 12:10 AM, Strainu  wrote:
> I think the author of the original article said it best: "Agreement
> aside, we're seeing a disconnect right now between what the Foundation
> is spending resources on and the issues faced by the community." If we
> can't agree on the problem, we will have a very hard time finding
> solutions.

Is this perceived "disconnect" explained anywhere with some detail?
Even better if in the form of a compilation of ignored,
non-prioritized or dismissed problems, projects, tasks, bugs, etc.

The WMF engineering plan is defined at
http://www.mediawiki.org/wiki/Roadmap . It would be useful to know
what is found to be missing, pointless or having the wrong priority.
Not only to influence the plans of the WMF, but also to help current
and potential contributors find areas and tasks to contribute to.

Also, is that the roadmap of the WMF team alone or a roadmap for the
whole Wikimedia technical community, where anybody can get involved
from occasional tester or feedback provider to maintainer and person
in charge?

-- 
Quim Gil /// http://espiral.org



Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Nabil Maynard
On Fri, Aug 24, 2012 at 10:16 AM, Tyler Romeo  wrote:

> > - OAuth: Well not actually OAuth. After getting a full understanding of
> this topic
> > implementation of actual OAuth (1&2) looks like a dark dead-end. Rather
> than OAuth I'd like
> >  to write a new auth standard that learns from all the good things and
> the mistakes made in
> > both versions of OAuth and takes note of all the things we really need.
> And then implement it
> > into MediaWiki and write a series of server and client libraries/sdks so
> it's also easier to pick
> > up than either OAuth.
>
> Not a good idea: http://xkcd.com/927/
> While OAuth has its problems, it's not a terrible protocol (or at least v1
> isn't).
>
>
Seconded -- I'd rather see contributions to making OAuth less painful
than the invention of Yet Another Standard.

My personal wishlist:
 - Persona: Previously called BrowserID.  It's come a LONG way in the past
few months, and provides another fairly clean identity/authentication
system.
 - OpenBadges: I'd love to explore options for implementing an OpenBadges
solution for MW -- methods to encourage good editing and contribution, and
to identify those who have consistently demonstrated this capability seems
pretty worthwhile.

Nabil


Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Derric Atzrott
>Meta discussions over community, Appreciation threads, GSoC wrapups,  
>Deployment threads, and orthogonal questions.
>Lately wikitech-l seems to be almost void of one of the most important  
>categories of discussion I like to see here.
>
>Discussions on adding new features to MediaWiki!
>
>So, just like Sumana's "Appreciation thread" how about a little thread  
>dedicated to listing out things we'd like to see in MediaWiki or perhaps  
>would like to write ourselves.
>Not really big things like VisualEditor, Wikidata, and Lua who have teams  
>of people within WMF working on them. But rather those other important  
>things a lot of us may want but always end up pushed to the side and  
>forgotten.
>
>For me...
>
> ...
>
>- OAuth: Well not actually OAuth. After getting a full understanding of  
>this topic implementation of actual OAuth (1&2) looks like a dark  
>dead-end. Rather than OAuth I'd like to write a new auth standard that  
>learns from all the good things and the mistakes made in both versions of  
>OAuth and takes note of all the things we really need. And then implement  
>it into MediaWiki and write a series of server and client libraries/sdks  
>so it's also easier to pick up than either OAuth.

Obligatory XKCD: http://xkcd.com/927/

>
> ...
>
>Now some old and forgotten code topics:
>
> ...
>
>- Password reset tokens: It's unbelievable but we are STILL using  
>temporary passwords instead of reset tokens. Naturally this is less usable  
>and also lowers the security of our password reset system.

I had no idea we were doing that.  That /is/ really bad!

>- An abstract revision system. The way we shove configuration into i18n,  
>i18n into articles, scripts and stylesheets into articles, and extensions  
>go and do the same. All just to get proper revisioning of things. Is  
>horrible. Not to mention the extensions that don't and rely on our logging  
>system which makes it harder to revert things. With all this together I'd  
>like to see an abstract system that lets extensions have their own  
>revision system outside of page content for whatever they need to do.

This.  I would pay you for this one.  Not a living by any means, but I would be
willing to put $20-$30 towards whoever implements it, as a gift and a "Thank
you".  All my extensions at my job have to keep track of revisions and it is a
pain to reimplement it every time.  I still haven't gotten my history UIs
anywhere close to as nice as the one used by MediaWiki.

-

That all said, this a fantastic topic idea.  I can't wait to see where this
goes.

Thank you,
Derric Atzrott




Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Tyler Romeo
Wait a second. Concerning the password reset, currently it uses the
user_newpassword field, which means the user is required to reset their
password upon login. How is this any different than using a reset token,
where the user supplies the reset token and changes their password?

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com



On Fri, Aug 24, 2012 at 1:38 PM, Derric Atzrott <
datzr...@alizeepathology.com> wrote:

> >Meta discussions over community, Appreciation threads, GSoC wrapups,
> >Deployment threads, and orthogonal questions.
> >Lately wikitech-l seems to be almost void of one of the most important
> >categories of discussion I like to see here.
> >
> >Discussions on adding new features to MediaWiki!
> >
> >So, just like Sumana's "Appreciation thread" how about a little thread
> >dedicated to listing out things we'd like to see in MediaWiki or perhaps
> >would like to write ourselves.
> >Not really big things like VisualEditor, Wikidata, and Lua who have teams
> >of people within WMF working on them. But rather those other important
> >things a lot of us may want but always end up pushed to the side and
> >forgotten.
> >
> >For me...
> >
> > ...
> >
> >- OAuth: Well not actually OAuth. After getting a full understanding of
> >this topic implementation of actual OAuth (1&2) looks like a dark
> >dead-end. Rather than OAuth I'd like to write a new auth standard that
> >learns from all the good things and the mistakes made in both versions of
> >OAuth and takes note of all the things we really need. And then implement
> >it into MediaWiki and write a series of server and client libraries/sdks
> >so it's also easier to pick up than either OAuth.
>
> Obligatory XKCD: http://xkcd.com/927/
>
> >
> > ...
> >
> >Now some old and forgotten code topics:
> >
> > ...
> >
> >- Password reset tokens: It's unbelievable but we are STILL using
> >temporary passwords instead of reset tokens. Naturally this is less usable
> >and also lowers the security of our password reset system.
>
> I had no idea we were doing that.  That /is/ really bad!
>
> >- An abstract revision system. The way we shove configuration into i18n,
> >i18n into articles, scripts and stylesheets into articles, and extensions
> >go and do the same. All just to get proper revisioning of things. Is
> >horrible. Not to mention the extensions that don't and rely on our logging
> >system which makes it harder to revert things. With all this together I'd
> >like to see an abstract system that lets extensions have their own
> >revision system outside of page content for whatever they need to do.
>
> This.  I would pay you for this one.  Not a living by any means, but I
> would be
> willing to put $20-$30 towards whoever implements that as a gift and a
> "Thank
> you".  All my extensions at my job have to keep track of revisions and it
> is a
> pain to reimplement it every time.  I still haven't gotten my history UIs
> anywhere close to as nice as the one used by MediaWiki.
>
> -
>
> That all said, this is a fantastic topic idea.  I can't wait to see where this
> goes.
>
> Thank you,
> Derric Atzrott
>
>


Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Chad
On Fri, Aug 24, 2012 at 1:52 PM, Tyler Romeo  wrote:
> Wait a second. Concerning the password reset, currently it uses the
> user_newpassword field, which means the user is required to reset their
> password upon login. How is this any different than using a reset token,
> where the user supplies the reset token and changes their password?
>

Well I assume we'd put the token in the url we give the user,
so they don't have to type anything in.

-Chad



Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Tyler Romeo
Yes, but that's only increased convenience. I'm wondering exactly what
security implications there are to our current system v. a token reset
system.

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com





Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Thomas Morton
On 24 August 2012 18:57, Tyler Romeo  wrote:

> Yes, but that's only increased convenience. I'm wondering exactly what
> security implications there are to our current system v. a token reset
> system.
>
> *--*
> *Tyler Romeo*
> Stevens Institute of Technology, Class of 2015
> Major in Computer Science
> www.whizkidztech.com | tylerro...@gmail.com
>
>
>
How long is the generated password? Might be a brute force vulnerability.

Tom


Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Daniel Friesen
- Usability: Forcing the user to manually enter the token is a very bad  
user experience. We have good email confirmation tokens but don't bother  
to do password resets the same way.
- Security: Because the temporary password is being entered by the user it  
ends up being much shorter than it should be. The temporary passwords have  
really low entropy and if we expired them any later than we do now it  
would theoretically be possible to brute force a password reset. Frankly,
right now, if someone were persistent enough to brute force randomly and
make a second reset after the first expires, they may actually have a sane
enough chance at brute forcing their way into an account.
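To make that concrete, here is a back-of-envelope sketch of the entropy in a short, human-typeable temporary password. The alphabet size and length are illustrative assumptions, not MediaWiki's actual values:

```python
import math

def entropy_bits(alphabet_size, length):
    # Bits of entropy in a random string of `length` symbols drawn
    # uniformly from an alphabet of `alphabet_size` symbols.
    return length * math.log2(alphabet_size)

# Illustrative only: a 10-character password over a 32-symbol alphabet.
bits = entropy_bits(32, 10)          # 50.0 bits
expected_guesses = 2 ** bits / 2     # on average, half the keyspace
print(bits, expected_guesses)
```

Whether that is enough depends entirely on how long the password stays valid and how aggressively attempts are throttled, which is exactly the expiry concern above.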


On Fri, 24 Aug 2012 10:52:00 -0700, Tyler Romeo wrote:



> Wait a second. Concerning the password reset, currently it uses the
> user_newpassword field, which means the user is required to reset their
> password upon login. How is this any different than using a reset token,
> where the user supplies the reset token and changes their password?
>
> *--*
> *Tyler Romeo*
> Stevens Institute of Technology, Class of 2015
> Major in Computer Science
> www.whizkidztech.com | tylerro...@gmail.com






--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Roan Kattouw
On Thu, Aug 23, 2012 at 7:50 PM, Erik Moeller  wrote:
> Completely random appreciation for whoever implemented the "undo"
> feature in MediaWiki, one of its many hidden gems. :-)
>
Looks like the main contributors to this were Andrew Garrett in July
2006 [1] and December 2006 [2], Aaron Schulz in March 2007 [3], and
Andrew again in July 2007 [4].

So appreciation for them, the other old hands that have been around
for such an incredibly long time, and all the other committers that
created the tens of thousands of commits in MediaWiki's history [5];
it's always interesting to dig through them when looking up old stuff
like this.

Roan

[1] 
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=commitdiff;h=f062b09dc38a0612a92a5dba08dc01746b6f42f2
[2] 
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=commitdiff;h=c490d265fa711547ff110438540b4f3457079fa2
[3] 
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=commitdiff;h=0fe87673b78482cfe60b02e2937d1284d83ad4d8
[4] 
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=commitdiff;h=d53b216aa7819db154b3bde9cb797f20f42900f5
[5] For those of you wondering "shouldn't that be 'over 100k'?": there
were about 114k revisions in SVN at the time of the git migration, but
those weren't all MW core. At the time of this writing, there were
about 43k commits in the mediawiki/core.git history (excluding merge
commits).



Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Tyler Romeo
The password length is whatever $wgMinimalPasswordLength is set to, and
according to DefaultSettings.php it's 1 :P. Maybe we should increase the
length of passwords from User::randomPassword.

- Security: Because the temporary password is being entered by the user it
> ends up being much shorter than it should be. The temporary passwords have
> really low entropy and if we expired them any later than we do now it would
> theoretically be possible to brute force a password reset. Frankly right
> now if someone was persistent enough to brute force randomly and make a
> second reset after the first expires they may actually have a sane enough
> chance at brute forcing into an account.


Ah, I see, so in the end it's pretty much about brute force attacks. Well,
what we can do (in order to avoid schema changes) is keep the newpassword
field, increase temporary password lengths to something like 64, and then
shift the Special:ResetPassword and User::mailPasswordInternal logic to use
URLs instead of entering the password manually.
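A rough sketch of that flow (the helper name and URL shape here are hypothetical, not MediaWiki's actual API; the point is a long random token delivered in a URL, with only its hash stored server-side):

```python
import hashlib
import secrets

def make_reset_link(base_url, username):
    # Generate a high-entropy token and embed it in the emailed URL.
    token = secrets.token_urlsafe(48)  # 64 URL-safe characters
    # Store only a hash, so a leaked newpassword column does not
    # expose live reset links.
    stored_hash = hashlib.sha256(token.encode()).hexdigest()
    link = f"{base_url}/Special:ResetPassword?user={username}&token={token}"
    return link, stored_hash

link, stored_hash = make_reset_link("https://example.org/wiki", "Example")
print(link)
```

The user never types anything: clicking the link supplies the token, and the server compares its hash against the stored value.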

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com





Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Derric Atzrott

The other thing though that can be done with tokens that can't be done with
passwords (at least without violating user expectations) is making the token
expire.  Having the randomly generated token/password expire after a day or so
greatly reduces the amount of time available for an attack.

Thank you,
Derric Atzrott




Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Chris Steipp
On Fri, Aug 24, 2012 at 10:33 AM, Nabil Maynard  wrote:
> On Fri, Aug 24, 2012 at 10:16 AM, Tyler Romeo  wrote:
>> Not a good idea: http://xkcd.com/927/
>> While OAuth has its problems, it's not a terrible protocol (or at least v1
>> isn't).
>>
>>
> Seconded -- I'd rather see contributions to making OAuth less painful
> rather than invent Yet Another Standard.

I have to agree too. OAuth has problems, but it would allow several of
wmf's current integrations to be more secure overall, and that would
be a win for us. If Daniel is able to create a protocol that is as
secure, and easier for developers to use securely, then I will
definitely push to switch over. But until then, I'm still going to try
and get OAuth out.

I'd also love to see MediaWiki support SAML too, for our .edu/.gov users.

>
> My personal wishlist:
>  - Persona: Previously called BrowserID.  It's come a LONG way in the past
> few months, and provides another fairly clean identity/authentication
> system.

Mozilla is also interested in this. I don't think we can use it on wmf
sites, but if you're interested in working on it, I can probably get
you in touch with someone there. I think it would be a great feature.



Re: [Wikitech-l] Test error

2012-08-24 Thread Christian Aistleitner
Hi Jeroen,

On Fri, Aug 24, 2012 at 06:39:55PM +0200, Jeroen De Dauw wrote:
> Is
> it possible to change our test stuff to use non-temporary tables?

IIRC, you can use the "--use-normal-tables" option when running the
phpunit tests to avoid using temporary tables.

But of course, this only works locally. You'll have to skip that test
for Jenkins etc. :-(


All the best,
Christian



--
quelltextlich e.U.  \\  Christian Aistleitner
Companies' registry: 360296y in Linz

Gruendbergstrasze 65a    Email: christ...@quelltextlich.at
4040 Linz, Austria       Phone: +43 732 / 26 95 63
                         Fax:   +43 732 / 26 95 63
                         Homepage: http://quelltextlich.at/



Re: [Wikitech-l] Test error

2012-08-24 Thread Jeroen De Dauw
Hey,

> IIRC, you can use the "--use-normal-tables" option when running the
phpunit tests to avoid using temporary tables.

Awesome, that will work for me :)

> But of course, this only works locally. You'll have to skip that test for
Jenkins etc. :-(

Jenkins is running the tests using SQLite, so the problem does not occur
there.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--


Re: [Wikitech-l] [Wikitext-l] [X-POST][GSoC] Realtime Collaboration on VisualEditor

2012-08-24 Thread Sumana Harihareswara
Ashish, thanks for the writeup, and for your work!

I see in
https://www.mediawiki.org/wiki/Wikimedia_Engineering/2012-13_Goals#Visual_Editor
that integration of your work into VisualEditor is scheduled for
sometime in April-June 2013.  What will you be doing to avoid code rot
between now and then, to ensure that the VE team (including you) can
perform that integration 8 months from now?

Thanks again.

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation



Re: [Wikitech-l] Wikimedians are rightfully wary

2012-08-24 Thread Daniel Zahn
fwiw, my experience today trying to tell a community about a change we made:

we enabled WebFonts for my.wp:

https://gerrit.wikimedia.org/r/#/c/20727/1/wmf-config/InitialiseSettings.php

as requested in https://bugzilla.wikimedia.org/show_bug.cgi?id=34817

because it was assigned to me in Gerrit and looked like an easy
change. So after merging and pushing out to cluster..

First I joined the IRC channel #wikipedia-my. It was empty.

Then I checked for a mailing list. It did not exist.

Then I went to the wiki looking for the right place to drop a message.

I do not speak their language; actually I don't even have the right
fonts installed:

http://my.wikipedia.org/wiki/Special:RecentChanges

I just tried Village_Pump,
http://my.wikipedia.org/wiki/Wikipedia:Village_Pump but it appears
empty afaict.

At this point I gave up and relied on the comment in Bugzilla being enough.

The last part took way longer than the actual merge, of course.

If I could have a matrix with links, please, one for every project in
every language with just the right places to leave comments on, I
would do this more often...



Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Daniel Friesen
On Fri, 24 Aug 2012 11:24:46 -0700, Chris Steipp wrote:


> On Fri, Aug 24, 2012 at 10:33 AM, Nabil Maynard  wrote:
>> On Fri, Aug 24, 2012 at 10:16 AM, Tyler Romeo  wrote:
>>> Not a good idea: http://xkcd.com/927/
>>> While OAuth has its problems, it's not a terrible protocol (or at least v1
>>> isn't).


Randall is right in general about standards proliferation for standards' sake.

But that's primarily about writing a standard just for other people to use.
If the old standard has genuine issues, its only significant advantage is
that it already exists, and you intend to actually use the new standard
yourself rather than just throw it out for others to adopt, then that's a
valid situation in which to write a new standard.


- OAuth 1 had some important issues too. In particular the temporary  
credentials and the limitations on the user experience caused by the  
single flow.
- The flow limitation is probably a big one for us. And it is possible to  
work around the issue by separating OAuth into two parts. But by doing  
that you diverge from the spec and there isn't much more reason to stick  
with that standard. And afaik the libraries for doing OAuth 1 don't  
support these alternative types of flows.



>> Seconded -- I'd rather see contributions to making OAuth less painful
>> rather than invent Yet Another Standard.


> I have to agree too. OAuth has problems, but it would allow several of
> wmf's current integrations to be more secure overall, and that would
> be a win for us. If Daniel is able to create a protocol that is as
> secure, and easier for developers to use securely, then I will
> definitely push to switch over. But until then, I'm still going to try
> and get OAuth out.

Thanks. I already know what I have lying around in my head will keep OAuth
1 level security while making signatures easier to implement. Although the  
fundamental idea in this area is auth should always be done by a  
library/SDK anyways.
The stuff making my head spin actually isn't even any part of the basic  
auth. It's not even discovery itself. The hardest thing to figure out is  
what to do about making discovery and dynamic registration over HTTP  
secure.

Which frankly is something that no protocol has anyways.

The only problem with writing out an actual standard for the rest of the  
stuff is all my good hours are taken up by work. The leftovers wouldn't be  
enough to get out a good enough quality standard and reference/testing  
implementation.


Rather than jumping to "get OAuth out" what about first trying to get the  
fundamental base pieces we need for all of these into core. ie: Abstract  
authorizations and applications. Revocation pages. Attaching an  
authorization/application to changes like revisions, logs, etc... and  
tools to mass-revert by confidential application or multiple public  
authorizations.
We'll need that stuff no matter what we implement. And it's going to take  
awhile just to implement those things.
We can decide whether we want OAuth or something else when we finally get  
to that point.



> I'd also love to see MediaWiki support SAML too, for our .edu/.gov users.


Did these organizations need to use those SAML credentials directly for API
things or is this just another method we want to support for logging in?


>> My personal wishlist:
>>  - Persona: Previously called BrowserID.  It's come a LONG way in the past
>> few months, and provides another fairly clean identity/authentication
>> system.

> Mozilla is also interested in this. I don't think we can use it on wmf
> sites, but if you're interested in working on it, I can probably get
> you in touch with someone there. I think it would be a great feature.



--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] best way to clean up the wiki from 3Gb of spam

2012-08-24 Thread Platonides
Export everything, filter out revisions newer than the spam start,
import in a new db.





Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Dan Callahan

On 8/24/12 12:33 PM, Nabil Maynard wrote:

My personal wishlist:
  - Persona: Previously called BrowserID.  It's come a LONG way in the past
few months, and provides another fairly clean identity/authentication
system.


That's on Mozilla's wishlist, too!

For background, Persona is an open, decentralized identity system that 
tries to learn from and build on the foundation set by previous systems 
like OpenID.


Mozilla is using it in production on quite a few sites (MDN, Bugzilla, 
Mozillians, Marketplace, Firefox Affiliates, Popcorn, OpenBadges, etc), 
and we'd love to see Persona as an option for Mediawiki-based sites. 
Especially for wiki.mozilla.org.


From what I understand, one of the biggest hurdles is the account 
model. Persona replaces usernames and passwords with email addresses and 
cryptographic proofs of ownership. IIRC, MediaWiki doesn't necessarily 
collect email addresses for new accounts, so a plugin would have to have 
some sort of interactive migration built in for when a user first 
authenticates with Persona.


This isn't insurmountable (MDN dealt with a similar problem), but thus 
far a plugin hasn't materialized.
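A toy sketch of that interactive migration (all names hypothetical; no such MediaWiki plugin exists yet): when Persona hands the wiki a verified email, either log the mapped user in, or fall into a one-time "attach this email to your account" step.

```python
# Verified email -> wiki username; in MediaWiki this would live in the
# user table, not in a dict.
accounts_by_email = {}

def login_with_persona(verified_email):
    # Returns a username, or None meaning "run the migration dialog".
    return accounts_by_email.get(verified_email)

def attach_email(verified_email, username):
    # One-time step for accounts that were created without an email.
    accounts_by_email[verified_email] = username

print(login_with_persona("alice@example.org"))  # None: ask user to attach
attach_email("alice@example.org", "Alice")
print(login_with_persona("alice@example.org"))  # Alice
```

The cryptographic assertion check itself is done by the Persona verifier; the wiki only ever sees the verified address.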


Our docs are at https://developer.mozilla.org, we hang out in #identity 
on irc.mozilla.org, and our mailing list is at 
https://lists.mozilla.org/listinfo/dev-identity


As a member of the Identity team at Mozilla, I'm also personally 
available to help out / answer any questions an implementer might have.


Cheers,
-Dan





Re: [Wikitech-l] Wikimedians are rightfully wary

2012-08-24 Thread Platonides
On 24/08/12 23:02, Daniel Zahn wrote:
> I do not speak their language, actually i don't even have the right
> fonts installed:
> 
> http://my.wikipedia.org/wiki/Special:RecentChanges


Filtering at Wikipedia namespace,
http://my.wikipedia.org/w/index.php?namespace=4&title=Special%3ARecentChanges&uselang=en
the local Village Pump is probably at:
 
http://my.wikipedia.org/w/index.php?title=Wikipedia%3A%E1%80%9C%E1%80%80%E1%80%BA%E1%80%96%E1%80%80%E1%80%BA%E1%80%9B%E1%80%8A%E1%80%BA%E1%80%86%E1%80%AD%E1%80%AF%E1%80%84%E1%80%BA_%28%E1%80%A1%E1%80%AD%E1%80%AF%E1%80%84%E1%80%BA%E1%80%92%E1%80%AE%E1%80%9A%E1%80%AC%29#Solving_the_font_problem:_WebFonts



> I just tried Village_Pump,
> http://my.wikipedia.org/wiki/Wikipedia:Village_Pump but it appears
> empty afaict.
> 
> At this point i gave up and relied on the comment in Bugzilla being enough..
> 
> The last part took wy longer than the actual merge of course.
> 
> If i could have a matrix with links please, one for every project in
> every language with just the right places to leave comments on, i
> would more often do this...

http://meta.wikimedia.org/wiki/International_names_for_Village_Pump
http://meta.wikimedia.org/wiki/Distribution_list

(However, a Village Pump for mywiki is not listed there either)


BTW, we appreciated your kindness today in joining #wiktionary-es to
notify the deployment of c21203 :)






[Wikitech-l] Lists you might not know about

2012-08-24 Thread Sumana Harihareswara
I've just updated https://www.mediawiki.org/wiki/Mailing_lists and
https://meta.wikimedia.org/wiki/Mailing_lists/Overview#Mediawiki_and_technical
.  Sorry for the spam, but you really should take a moment to skim and
see whether there are lists there you should join.  I especially want to
single out:

design
mediawiki-api-announce
mediawiki-i18n -- localisation and internationalisation
labs-l -- for when you have a question or request re Wikimedia Labs
analytics
wikitext-l -- the new Visual Editor & parser

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation



Re: [Wikitech-l] Wikimedians are rightfully wary

2012-08-24 Thread Daniel Zahn
On Fri, Aug 24, 2012 at 2:50 PM, Platonides  wrote:

> http://meta.wikimedia.org/wiki/International_names_for_Village_Pump
> http://meta.wikimedia.org/wiki/Distribution_list

Thanks, this is exactly what i was looking for. :)



-- 
Daniel Zahn 
Operations Engineer



Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Platonides
I also thought about implementing Persona (BrowserID) for user login.
Although solving the "account model problem" by replacing email
addresses with SUL usernames. :)




Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Ryan Lane
> Did these organizations need to use those SAML credentials directly for API
> things or is this just another method we want to support for logging in?
>

https://meta.wikimedia.org/wiki/Wikimedia_Fellowships/Project_Ideas/The_Wikipedia_Library

That's the biggest current reason for wanting to support SAML.

- Ryan



Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Platonides
Tyler Romeo wrote:
> Wait a second. Concerning the password reset, currently it uses the
> user_newpassword field, which means the user is required to reset their
> password upon login. How is this any different than using a reset token,
> where the user supplies the reset token and changes their password?

Thanks Tyler, I was wondering the same.


Tyler Romeo wrote:
> The password length is whatever $wgMinimalPasswordLength is set to, and
> according to DefaultSettings.php it's 1 :P. Maybe we should increase the
> length of passwords from User::randomPassword.

But we never generate random passwords shorter than 10 characters.
(includes/User.php line 859) We can increase that hardcoded value as
much as we want.


Derric wrote:
> The other thing though that can be done with tokens that can't be done with
> passwords (at least without violating user expectations) is making the token
> expire.  Having the randomly generated token/password expire after a day or so
> greatly reduces the amount of time available for an attack.

Our temporary passwords do expire.


Daniel Friesen wrote:
> - Usability: Forcing the user to manually enter the token is a very bad
> user experience. We have good email confirmation tokens but don't bother
> to do password resets the same way.
> - Security: Because the temporary password is being entered by the user
> it ends up being much shorter than it should be. The temporary passwords
> have really low entropy
It's using MWCryptRand, 46 bits. It could be improved, but it's not that
bad.

> and if we expired them any later than we do now
> it would theoretically be possible to brute force a password reset.
> Frankly right now if someone was persistent enough to brute force
> randomly and make a second reset after the first expires they may
> actually have a sane enough chance at brute forcing into an account.
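As a sanity check on the 46-bit figure above, here is a rough online-attack estimate. The guess rate and window are assumptions for illustration; real numbers depend on rate limiting and the actual expiry:

```python
# 46 bits of entropy, per the MWCryptRand figure mentioned above.
keyspace = 2 ** 46

guesses_per_second = 100        # assumed, optimistic for an online attacker
window_seconds = 7 * 24 * 3600  # assume the token somehow lived a whole week

p_success = guesses_per_second * window_seconds / keyspace
print(f"{p_success:.1e}")  # ~8.6e-07 per reset attempt
```

Even with those generous assumptions the per-reset success chance is tiny, which supports "it's not that bad"; the risk only grows if resets can be repeated indefinitely without throttling.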




Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Platonides
Mailman integration: You just got promoted to sysop / steward /
checkuser/arbcom ?
You now have a new checkbox at [[Special:MailingLists]] for subscribing
to the relevant private list!
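A rough sketch of the mapping such a checkbox page would need, with all group names and list addresses invented for illustration (actually subscribing the user would still have to go through Mailman's own admin interface):

```python
# Hypothetical mapping from wiki user groups to private mailing lists;
# none of these list addresses are real.
GROUP_TO_LIST = {
    "sysop": "private-sysops@lists.example.org",
    "steward": "stewards-l@lists.example.org",
    "checkuser": "checkuser-l@lists.example.org",
}

def lists_for_groups(groups):
    """Return the private lists a user with these groups could opt into."""
    return sorted(GROUP_TO_LIST[g] for g in groups if g in GROUP_TO_LIST)
```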


(I suggest adding new ideas as direct children of the root post, and
using other descendants for discussing the ideas themselves.)





Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Daniel Friesen

On Fri, 24 Aug 2012 15:10:55 -0700, Ryan Lane wrote:

>> Did these organizations need to use those SAML credentials directly
>> for API things or is this just another method we want to support for
>> logging in?
>
> https://meta.wikimedia.org/wiki/Wikimedia_Fellowships/Project_Ideas/The_Wikipedia_Library
>
> That's the biggest current reason for wanting to support SAML.
>
> - Ryan

Oh... so not SAML logins or SAML assertions for the api but acting as a
provider of SAML accounts for logging into a 3rd party source with a
Wikipedia account?


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] Code ideas thread

2012-08-24 Thread Ryan Lane
> Oh... so not SAML logins or SAML assertions for the api but acting as a
> provider of SAML accounts for logging into a 3rd party source with a
> Wikipedia account?
>

Yeah. It's the more interesting situation at first. I'm sure there are
some interesting use cases the opposite way, but we have an immediate
use case to act as an identity provider.

- Ryan
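The identity-provider role can be pictured with a bare-bones, unsigned SAML assertion. A real IdP must sign its assertions using a proper SAML library, and the issuer, user, and audience values below are invented purely for illustration:

```python
import datetime
import uuid
import xml.etree.ElementTree as ET

SAML_NS = "urn:oasis:names:tc:SAML:2.0:assertion"

def build_assertion(issuer: str, username: str, audience: str) -> str:
    """Build a minimal (unsigned!) SAML 2.0 Assertion as an XML string."""
    ET.register_namespace("saml", SAML_NS)
    now = datetime.datetime.now(datetime.timezone.utc)
    assertion = ET.Element(f"{{{SAML_NS}}}Assertion", {
        "ID": "_" + uuid.uuid4().hex,
        "Version": "2.0",
        "IssueInstant": now.strftime("%Y-%m-%dT%H:%M:%SZ"),
    })
    ET.SubElement(assertion, f"{{{SAML_NS}}}Issuer").text = issuer
    subject = ET.SubElement(assertion, f"{{{SAML_NS}}}Subject")
    ET.SubElement(subject, f"{{{SAML_NS}}}NameID").text = username
    conditions = ET.SubElement(assertion, f"{{{SAML_NS}}}Conditions")
    restriction = ET.SubElement(conditions, f"{{{SAML_NS}}}AudienceRestriction")
    ET.SubElement(restriction, f"{{{SAML_NS}}}Audience").text = audience
    return ET.tostring(assertion, encoding="unicode")

# Hypothetical values: the wiki vouches for a user to a partner service.
xml_out = build_assertion("https://en.wikipedia.org",
                          "ExampleUser",
                          "https://library.example.org")
```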



Re: [Wikitech-l] [Labs-l] Lists you might not know about

2012-08-24 Thread Tomasz Finc
And mobile-l - Discussing upcoming changes for phones, tablets, apps, etc.

--tomasz


On Fri, Aug 24, 2012 at 2:59 PM, Sumana Harihareswara
 wrote:
> I've just updated https://www.mediawiki.org/wiki/Mailing_lists and
> https://meta.wikimedia.org/wiki/Mailing_lists/Overview#Mediawiki_and_technical
> .  Sorry for the spam, but you really should take a moment to skim and
> see whether there are lists there you should join.  I especially want to
> single out:
>
> design
> mediawiki-api-announce
> mediawiki-i18n -- localisation and internationalisation
> labs-l -- for when you have a question or request re Wikimedia Labs
> analytics
> wikitext-l -- the new Visual Editor & parser
>
> --
> Sumana Harihareswara
> Engineering Community Manager
> Wikimedia Foundation
>
> ___
> Labs-l mailing list
> lab...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/labs-l



[Wikitech-l] replacing Planet software soon

2012-08-24 Thread Daniel Zahn
Hi,

I am planning to replace the current Planet Wikimedia software early next week.

For those who might not even know Planet: what is Planet? -->
http://meta.wikimedia.org/wiki/Planet_Wikimedia

This is the current English planet as an example: -->
http://en.planet.wikimedia.org/

The original Planet software we have used up until now is
unfortunately unmaintained, not available as a distribution package,
and was never puppetized.

First there was the original Planet software (planetplanet.org), then
development stopped, and later it was continued as Planet 2.0. There
is also "Planet Venus", "a radical refactoring of Planet 2.0", and
that one is available as an Ubuntu package in universe :)

--> http://intertwingly.net/code/venus/  ,
http://packages.ubuntu.com/da/precise/planet-venus

---
quote from http://lwn.net/Articles/421348/:

".. However, Planet's development seems to have slowed considerably —
if not entirely stopped. The last updates in Jeff Waugh's repository
are dated early 2007.

Development seems to have carried on, somewhat quietly, with Planet
Venus. It's not reflected on the Planet site at all, but digging
through the mailing lists one finds development has continued under
the name Venus or Planet Venus. Venus is "a radical refactoring of
Planet 2.0," and development discussions continue on the old Planet
mailing lists."...
---

Planet Venus uses html5lib, XSLT and Django templates to parse the
feeds and create HTML.  You can read more about it here:
http://planet.wmflabs.org/html/

And here is a nice .svg showing the architecture it uses to parse
feeds:  http://planet.wmflabs.org/html/venus.svg

I had this running in labs for a while at http://planet.wmflabs.org/
and puppetized it.

You can find the puppet code in ./manifests/role/planet.pp and
./manifests/misc/planet.pp in the operations/puppet git repository.
And recent changes can be found under topic branch "planet".

https://gerrit.wikimedia.org/r/gitweb?p=operations/puppet.git;a=blob;f=manifests/role/planet.pp;hb=HEAD

https://gerrit.wikimedia.org/r/gitweb?p=operations/puppet.git;a=blob;f=manifests/misc/planet.pp;hb=HEAD

Additionally, with the help of James Alexander (thanks!), we recently
went through a major cleanup of feed URLs, fixing lots of
redirected/moved feed URLs and removing broken feeds.

This can be found here:
http://meta.wikimedia.org/wiki/Planet_Wikimedia#Requests_for_Update_or_Removal
 which also links to gerrit.

The new planet is already up here on a production host now:

http://zirconium.wikimedia.org/planet/

The English planet looks like this:  http://zirconium.wikimedia.org/planet/en/

That index.html page will disappear; it is just there to link to the
different language planets for testing. To go live, I will simply
switch DNS to point to the zirconium host and make the index redirect
to the page on Meta, as it does now.

The feeds are currently all updated at 00:00 UTC via cron.
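For readers unfamiliar with what such a cron run does, the core of a Planet-style aggregator is just "fetch every feed, merge the entries, sort newest first". The toy sketch below parses in-memory RSS 2.0 strings with the standard library only; the real Planet Venus adds fetching, caching, html5lib sanitisation, and templating on top of this:

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def extract_entries(rss_xml: str):
    """Yield (published, title) tuples from one RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        published = parsedate_to_datetime(item.findtext("pubDate"))
        yield (published, title)

def aggregate(feeds):
    """Merge entries from several feeds, newest first, like a planet page."""
    merged = [entry for rss_xml in feeds for entry in extract_entries(rss_xml)]
    return sorted(merged, reverse=True)
```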

If you see any issues with that, please speak up soon.

And have a nice weekend,

Daniel
-- 
Daniel Zahn 
Operations Engineer


Re: [Wikitech-l] Wikimedians are rightfully wary

2012-08-24 Thread MZMcBride
Daniel Zahn wrote:
> On Fri, Aug 24, 2012 at 2:50 PM, Platonides  wrote:
> 
>> http://meta.wikimedia.org/wiki/International_names_for_Village_Pump
>> http://meta.wikimedia.org/wiki/Distribution_list
> 
> Thanks, this is exactly what i was looking for. :)

This is by no means necessary, but my experience with global message
delivery has been that when you add a note about how you found that page
("I'm here because [[m:Distribution list]] said this was the appropriate
place to post."), it dramatically helps in keeping the distribution list
up-to-date. By giving people a pointer to your information source (and
giving them a way to edit the list themselves, of course), you empower them,
I've found. Just something to keep in mind.

Thanks, as always, for your work on these shell requests. Shell requests are
notoriously perilous (poor Jens and the English Wikipedia...). Hopefully
^demon will make a proper configuration system in short order and this work
can be given to the stewards. :-)

MZMcBride





Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Rob Lanphier
On Thu, Aug 23, 2012 at 2:30 AM, Niklas Laxström
 wrote:
> * Sumana for this idea.

Seconded.

Also thanks to:
*  Denny and Daniel for sending the very helpful summaries of Wikidata
blockers, and being patient with us as we review their work
*  Jack Phoenix for the well-ordered task list on Admin tools development[1]
*  Tim for letting me cut to the front of the line with a last minute
review request yesterday
*  Roan for jumping right on the jQuery issues this week

Of course, this is by no means comprehensive, just what I can think of
right now.

Rob

[1] http://www.mediawiki.org/wiki/Admin_tools_development



[Wikitech-l] Wiki Loves Monuments Android App v1.1 beta3

2012-08-24 Thread Philip Chang
Greetings WLM testers,

Below you'll find the last beta of our Android app before final launch in
one week.

As before, make sure to have "Unknown sources" in Settings => Applications
turned on.

Uploads will go to test wiki so feel free to upload whatever you like.

Download:

http://dumps.wikimedia.org/android/WLM-v1.1beta3.apk


Feedback:
http://www.mediawiki.org/wiki/Wiki_Loves_Monuments_mobile_application/Feedback

Please try the following:
* Browse by campaign and drill down to the desired region's monuments
* Sort the list by name and address
* Search with a search term
* Open a monument, click on "Get directions"
* Click on add a photo
* Login or create a Commons account
* Choose from gallery or take a photo
* Choose "Save for Later" on the Confirm Upload screen
* Go back to the opening screen
* Click on "Use my current location"
* Move around the map, open a cluster (a group of monuments close together)
* Click on a pin, open the monument
* Add a photo (login should be retained)
* Choose "Save for Later" on the Confirm Upload screen
* Click OK and choose or take another photo
* Go to Uploads and see the uploads saved for later
* Select some of the deferred uploads and upload them

Let us know what you think!

Known issues:

* The word Campaign will be changed to Country
* After deleting or uploading monuments saved for later, screen does not
refresh

Please forward this email as appropriate.

-- 
Phil Inje Chang
Product Manager, Mobile
Wikimedia Foundation
415-812-0854 m
415-882-7982 x 6810


Re: [Wikitech-l] Lua deployed to www.mediawiki.org

2012-08-24 Thread Birgitte_sb



> Date: Wed, 22 Aug 2012 16:01:04 -0400
> From: Chad 
> To: Wikimedia developers 
> Subject: Re: [Wikitech-l] Lua deployed to www.mediawiki.org
> 
> On Wed, Aug 22, 2012 at 3:51 PM, Tyler Romeo  wrote:
>> This is the exact kind of attitude the op-ed in the Signpost is addressing.
>> When making major feature decision, such as redoing the entire templating
>> system, we cannot just say to editors "oh, if you want some input, go and
>> join our mailing list". That's just a passive-aggressive way of pushing
>> editors out of the conversation. How many purely editors, i.e., not
>> developers, are on this list actively participating in discussion?
>> 
> 
> Which communities? Engaging N editing communities just doesn't
> scale. Nor, to be perfectly honest, do I think its the appropriate
> venue. I expect people to join the places technical discussions take
> place (this list + mediawiki.org), just as I expect I should have to
> join a wiki's discussion forums to discuss content/community things.
> I'm perfectly willing to engage anyone on anything I work on, but I
> don't want to repeat myself in 20 different places.
> 
> A long time ago, technical discussions happened on Meta. It was
> moved off of Meta since there's enough content to warrant its own
> wiki. Perhaps we can improve on getting notices out to people (hey,
> we're discussing FooBar, come talk with us [here]), but trying to
> shift the discussion to hundreds of individual wikis just doesn't work
> for me.
> 
> -Chad
> 

If people want to discuss the technical details of something, they should join
wikitech-l as you suggest. But I don't think others in this thread are asking
where the technical discussion of Lua took place. I think they are asking
about the *other* discussion, the one we rarely seem to have, which happens
before there are labs, or code, or mock-ups. Something like:

. . .

Dear wikimedia-l,

Templates have been horrendously painful for a long time, and it seems like I 
will finally have the time to focus on addressing this in the coming year. I 
know the biggest problem is pages that fail to load because of timeouts, and I 
hope to generally improve performance. The other things I anticipate 
addressing are (fill in the blank) about editing and using templates. I also 
plan on improving some backend stuff that is off-topic for this list. The 
downside is that to take advantage of these improvements, templates will have 
to be rewritten in a new way that no one is familiar with. But the good news 
is I couldn't make templates harder to write if I tried! It really shouldn't 
be that bad, because the old templates will still work just as well/poorly as 
they did before. So not every template will have to be rewritten in the new 
system. We can focus on just rewriting the ones that are most problematic, 
and if people want to use the new method to replace benign ones, it will be 
their choice. The other con of going this route is that it is a complete 
rewrite and may take a year or two before deployment. But honestly I don't 
see a better option for fixing the pages that are broken like this one.  LINK

So far I have started a page on MW. Some of it is pretty technical, but this 
link will take you to where I have listed the pros and cons of this solution 
and some features it may include.  LINK

Please pass this on to the people who work the most with templates in your 
communities. I am hoping that those most familiar with templates will add to 
this list in the next two weeks, so I will have the best information to 
finalize my plans. I have already posted this in the few places I could think 
of, so if you can think of a group that would like to know about this and 
doesn't already see this message there, please inform them.

After the discussion at MW is done, I will email a follow-up to wikimedia-l 
and wikitech-l to let you know whether this is something I will commit to 
taking the lead on right now, and share my firm plans for development and the 
priorities for feature inclusion. Right now I am committed to nothing except 
resolving the broken page timeouts. After the follow-up email you probably 
will not hear anything about this until there is something to test, or, if I 
have enough testers, maybe not until we start planning deployment. But feel 
free to poke the talk page on MW or email me for an update if you start to 
wonder how things are progressing.

. . .

Discussion about development need not be a technical discussion. 

To your other point, I don't think a single instance of repeating yourself in 
20 places, for a project you plan on spending a year of your life developing, 
is very onerous. This doesn't hold for updates, but it would be nice if we 
were better at announcing the beginning of a commitment to a project very 
widely. That can only make the project more successful. And I think we may 
agre