One use case could be looking for an article in one language without knowing the correct phrase or exact spelling, but knowing enough to use related categories to find the article on a different-language project, and going from there.
On Sunday, September 22, 2013, Gerard Meijssen wrote:
Hoi,
VE is a BAD idea; it's full of holes and bugs
On Tue, Dec 3, 2013 at 4:32 PM, David Gerard dger...@gmail.com wrote:
On 3 December 2013 21:26, Mark A. Hershberger m...@everybody.org wrote:
We've put together RC3 for 1.22.0. Please test the tarball and report
any bugs you find on Bugzilla:
was saying Tyler.
On Friday, December 6, 2013, Tyler Romeo wrote:
On Tue, Dec 3, 2013 at 4:39 PM, John phoenixoverr...@gmail.com wrote:
VE is a BAD idea; it's full of holes and bugs
Those are two separate concepts. Just because something has bugs does not
make
Editing via Tor is possible on WMF wikis if the account/user is trusted
On Monday, December 30, 2013, Thomas Gries wrote:
Hi,
during the 30C3 Congress [1] in Hamburg, where neither the Wikimedia
Foundation nor MediaWiki were formally present this year (but should be
next year),
Jacob
Give me 25 minutes and I'll join
On Monday, December 30, 2013, Thomas Gries wrote:
On 30.12.2013 23:01, John wrote:
Editing via Tor is possible on WMF wikis if the account/user is trusted
Can you explain this briefly, or send me a pointer?
This single piece of info could be a help for him.
Unless you are modifying the blacklisted URL, you should be able to edit
around it.
On Thu, Jan 2, 2014 at 10:36 AM, Tuszynski, Jaroslaw W.
jaroslaw.w.tuszyn...@leidos.com wrote:
Over the years while editing on Commons, I have run several times into
locked-down pages that can't be edited. The issue
All connections to tools.wmflabs are out; SSH, FTP, HTTP, and HTTPS are all
non-functional. I am getting zero response on IRC; can someone with Coren's
mobile number give him a call?
John
to be an issue on NFS again (IMHO); can't confirm that, of course...
Anyway, I can't even SSH there. I /could try/ to reboot some of the
boxes like tools-dev, but I am afraid it's not going to help anything.
On Sun, Feb 9, 2014 at 4:43 PM, John phoenixoverr...@gmail.com wrote:
All
It works for me.
On Monday, March 17, 2014, David Cuenca dacu...@gmail.com wrote:
Hi,
When I type bach in the top-right en.wp search box, I only have the
option to select Bach from the list. This option, however, takes me to
Bạch (with a dot under the a).
You can also use the localuser table in the CA database (see the sketch below).
On Thu, Mar 27, 2014 at 8:35 PM, Dan Andreescu dandree...@wikimedia.org wrote:
Thank you very much for the reply Max, +1 beer for next time we meet.
On Thu, Mar 27, 2014 at 5:10 PM, MZMcBride z...@mzmcbride.com wrote:
Teresa Cho
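A minimal sketch of that route (pymysql assumed; the host, credentials, and
username are hypothetical placeholders):

import pymysql

# List the wikis where a global account has an attached local account,
# straight from the localuser table in the centralauth database.
conn = pymysql.connect(host="centralauth.db.example", user="reader",
                       password="...", database="centralauth")
with conn.cursor() as cur:
    cur.execute("SELECT lu_wiki FROM localuser WHERE lu_name = %s",
                ("Example",))
    wikis = [row[0] for row in cur.fetchall()]
print(wikis)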
Probably a local DNS issue. Since the move, the associated IP addresses
changed.
On Mon, Mar 31, 2014 at 2:43 PM, Arthur Richards aricha...@wikimedia.org wrote:
+wikitech-l/qa
I presume this is a byproduct of the migration. This is urgent.
On Mon, Mar 31, 2014 at 11:25 AM, Ryan Kaldari
Works for me
On Mon, Mar 31, 2014 at 2:47 PM, Arthur Richards aricha...@wikimedia.org wrote:
en.wikipedia.beta.wmflabs.org and the mobile equivalent are not working.
I've tried just using the given IP addresses, but that results in a message
that says 'Domain not configured'. What is going on?
Looks like wikibugs is having issues and needs a poke
It should be returned; we had this same argument not that long ago.
On Sat, Apr 26, 2014 at 12:51 AM, Legoktm legoktm.wikipe...@gmail.com wrote:
On 04/25/2014 09:10 PM, John wrote:
Looks like wikibugs is having issues and needs a poke
The new wikibugs is now in #wikimedia-dev, and also
The discussion resulted in keeping wikibugs in #mediawiki
On Sat, Apr 26, 2014 at 8:11 AM, Bartosz Dziewoński matma@gmail.com wrote:
It should not; we had this same argument not that long ago.
--
Matma Rex
, regardless of how broken the code is.
On Sat, Apr 26, 2014 at 8:29 AM, Bartosz Dziewoński matma@gmail.com wrote:
On Sat, 26 Apr 2014 14:13:49 +0200, John phoenixoverr...@gmail.com
wrote:
The discussion resulted in keeping wikibugs in #mediawiki
Not true, we even had a patch merged
Correct, we migrated some of the bots but left wikibugs there. That was
the outcome of the previous discussions.
On Sat, Apr 26, 2014 at 4:04 PM, Antoine Musso hashar+...@free.fr wrote:
On 26/04/2014 06:51, Legoktm wrote:
On 04/25/2014 09:10 PM, John wrote:
Looks like wikibugs
Might be an issue with the JavaScript decoder/player process that is used
on-wiki, while direct access and downloads rely on other methods.
On Saturday, May 10, 2014, Daniel Mietchen daniel.mietc...@googlemail.com
wrote:
Dear all,
is anyone here at the hackathon in Zurich who can help us
Looks like JavaScript/CSS is intermittently failing.
http://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29#Something_is_broken
Here is an interesting deployment idea: roll out but leave as opt-in for a
period of time; after a month or so, set as default for anons and new
accounts. During the whole process keep track of who has enabled it and
then disabled it, vs. who never enabled it. After a period of time move everyone
who
Actually there are a few cases in the non-API interface where bots can assert not
being a bot, and there are some cases where non-bots can flag as bots for
specific cases (I know in the past it was used to suppress RC floods of
mass vandalism reverts by admins), so your picture isn't complete.
On Mon, May
Looks like something happened to the IRC bot. Haven't seen a bug report in
several hours.
How would this work for non-WMF wikis? What about executing JavaScript that
is posted to an approved wiki? This would make XSS and a whole host of other
problems a lot easier to pull off. So we whitelist commons.wikimedia.org; what's
stopping a user from making a user subpage with some JS code that
There is currently a patch in Gerrit which would do exactly that if merged.
On Tue, Jun 10, 2014 at 11:22 PM, Isarra Yos zhoris...@gmail.com wrote:
On 11/06/14 02:30, Tim Starling wrote:
In CR comments on https://gerrit.wikimedia.org/r/#/c/135290/
it has been proposed that we make a git
My suggestion: Flip the fuck out. It's a really bad idea that wasn't thought
through. If users want SwiftMailer support it should be done in an
extension, and not in core.
On Tue, Jun 10, 2014 at 11:30 PM, Isarra Yos zhoris...@gmail.com wrote:
On 11/06/14 03:24, John wrote:
there is currently
What about doing the reasonable thing and leaving core the hell alone? This
should be an extension and not shoved into core.
On Tue, Jun 10, 2014 at 11:43 PM, Daniel Friesen dan...@nadir-seen-fire.com
wrote:
On 2014-06-10, 7:30 PM, Tim Starling wrote:
I have suggested, as a compromise, to
, Isarra Yos zhoris...@gmail.com wrote:
On 11/06/14 03:31, John wrote:
My suggestion: Flip the fuck out. It's a really bad idea that wasn't
thought through. If users want SwiftMailer support it should be done in an
extension, and not in core.
On Tue, Jun 10, 2014 at 11:30 PM, Isarra Yos zhoris
Ouch, thanks for wasting a few of my brain cells. This is why we don't add
stupid code to core.
My web server doesn't have curl installed, nor does it have /usr/bin/local/
You haven't bothered to think your code through. Why don't you un-fuck your
code, configure it as an extension, and go from
No one has really addressed the point of making this an extension and not
adding the excessive overhead to core, especially for something that may
have such a wide impact.
I'm sending this from my mobile, but doesn't {{Cite doi}} generate an
external link?
On Saturday, June 14, 2014, Maximilian Klein isa...@gmail.com wrote:
That is a great idea for just dealing with those cases of {{Cite doi}}.
However, I just realized from your response that the scope of this
OK, taking a closer look, all you need to do is track external link
usage. It appears that all the cite templates use a
http://dx.doi.org/XX format URL; write a program to parse/keep track of
the uses. It shouldn't be that hard.
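A rough sketch of that tracking (assuming the Python requests library; the
endpoint and query string are just examples), using the list=exturlusage API
module:

import requests

API = "https://en.wikipedia.org/w/api.php"

def pages_linking_to(query, limit="max"):
    # Yield (title, url) for pages whose external links match `query`.
    resp = requests.get(API, params={
        "action": "query",
        "list": "exturlusage",
        "euquery": query,   # match URLs containing this string
        "eulimit": limit,
        "format": "json",
    })
    for hit in resp.json()["query"]["exturlusage"]:
        yield hit["title"], hit["url"]

for title, url in pages_linking_to("dx.doi.org"):
    print(title, url)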
Look up the assert parameter in the API
On Jun 19, 2014 6:52 AM, Petr Bena benap...@gmail.com wrote:
Hi,
Is there some effective way to do this? We are using only MW APIs in the
latest Huggle, and somehow it happens that when users are logged out
of MediaWiki, it still works (edits are done
token?
On Thu, Jun 19, 2014 at 12:54 PM, John phoenixoverr...@gmail.com wrote:
Look up the assert parameter in the API
On Jun 19, 2014 6:52 AM, Petr Bena benap...@gmail.com wrote:
Hi,
Is there some effective way to do this? We are using only MW APIs in the
latest Huggle, and somehow
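Roughly what that looks like from client code (a sketch, assuming the
requests library and an already-established login session): pass assert=user
and the API answers with an assertuserfailed error instead of silently
editing logged out.

import requests

session = requests.Session()  # would carry the login cookies in real use

resp = session.get("https://en.wikipedia.org/w/api.php", params={
    "action": "query",
    "meta": "userinfo",
    "assert": "user",   # reject the request if the session is not logged in
    "format": "json",
})
data = resp.json()
if data.get("error", {}).get("code") == "assertuserfailed":
    print("Logged out - re-login before editing")
else:
    print("Logged in as", data["query"]["userinfo"]["name"])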
Actually, the WMF servers run Ubuntu, and it refers to the apt-get install
mediawiki method as being unsupported.
On Sun, Jul 20, 2014 at 4:16 PM, Christopher Wilson gwsuper...@gmail.com
wrote:
I think that's referring to the package distributed by Ubuntu/Canonical,
not support for running the
Why not just move them to an extension? Moving them to their own repo begs
for headaches.
On Mon, Jul 28, 2014 at 4:50 PM, Maarten Dammers maar...@mdammers.nl
wrote:
Hi Bartosz,
Sounds good.
Bartosz Dziewoński wrote on 28-7-2014 20:53:
If you're upgrading a wiki or you're a developer
and forking left and right it gets ugly. We already have an existing
framework for adding modules to MediaWiki (extensions); let's use that vs.
re-inventing the wheel.
On Monday, July 28, 2014, Bartosz Dziewoński matma@gmail.com wrote:
On Mon, 28 Jul 2014 22:59:40 +0200, John phoenixoverr
Exactly what I warned about. Yet another example of poor thinking/execution
and exactly what I predicted.
On Thu, Aug 7, 2014 at 12:02 PM, James HK jamesin.hongkon...@gmail.com
wrote:
Hi,
I just went on to `git pull --rebase origin master` on getting MW
1.24 master and suddenly I see a
with the
LocalSettings.php from - I don't know - MW 1.16 or something. You
expect that to work? Really?
Maybe everybody could cool down a bit and keep things on a non-personal
level? I'd appreciate it.
On 7 August 2014 18:18, John phoenixoverr...@gmail.com wrote:
Exactly what I warned about. Yet
How feasible would it be to enable file access/linking to files on a given
filesystem without having to upload them?
Use case: I have a documentation system in /server/docs which I provide
access to internally via a file share to all users. However, remote users are
unable to access that share.
How
see https://bugzilla.wikimedia.org/show_bug.cgi?id=43210
On Mon, Dec 24, 2012 at 5:12 AM, Matma Rex matma@gmail.com wrote:
On Mon, 24 Dec 2012 11:00:36 +0100, Jon Robson jdlrob...@gmail.com wrote:
Last week I was working on a feature that I didn't want to surface on
a disambiguation page.
That is why I was suggesting adding a field to the page table, instead
of using page_props
On Wed, Dec 26, 2012 at 9:56 AM, Andrew Dunbar hippytr...@gmail.com wrote:
It would also be great if these pages were marked in the dump files too.
It should work exactly the same way as redirect pages
Can you go into some more detail?
On Wed, Jan 30, 2013 at 4:38 PM, rupert THURNER
rupert.thur...@gmail.com wrote:
hi,
is there any possibility to have a list of users with contributions similar
to:
I think there are still some serious issues with this extension. I
have checked several pages and used the max limit parameter, and all
it returns is a single thumb.
On Fri, Feb 1, 2013 at 8:20 AM, Max Semenik maxsem.w...@gmail.com wrote:
On 01.02.2013, 9:21 MZMcBride wrote:
Max Semenik wrote:
It's broken; on pages where there are multiple images it just shows the
first one.
On Friday, February 1, 2013, Max Semenik wrote:
On 01.02.2013, 18:14 John wrote:
I think there are still some serious issues with this extension. I
have checked several pages and used the max limit parameter
I have a basic pywikipedia script that can purge a watchlist (sketch below).
On Thursday, February 28, 2013, Katie Chan wrote:
On 28/02/2013 14:22, Tuszynski, Jaroslaw W. wrote:
I do a lot of work on Commons, while using a setting that all pages I
edit go to my watchlist. In the past I was able to control
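Something along these lines (a sketch against modern pywikibot rather than
the original pywikipedia script; the method names are worth double-checking):

import pywikibot

site = pywikibot.Site("commons", "commons")
# Walk the raw watchlist and unwatch every page on it.
for page in site.watched_pages():
    page.watch(unwatch=True)
    print("Unwatched", page.title())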
It's a matter of encoding; see my post on the bug.
On Thu, Mar 14, 2013 at 6:29 PM, Daniel Zahn dz...@wikimedia.org wrote:
note: this did not happen from the beginning and does not apply to
other languages (or at least not all of them),
so it depends which feeds you subscribe to and their
I know Aaron has spent a lot of time on the job queue. But I have
several observations and would like some feedback. The current workers
apparently select jobs from the queue at random. A FIFO method would
make far more sense. We have some jobs that can sit there in the queue
for extended periods
Feel free to drop me a mail off-list; I've got a TS account and will gladly
lend a hand with the reports.
On Wed, May 22, 2013 at 4:24 PM, Petr Onderka gsv...@gmail.com wrote:
This is probably not what you want to hear, but one way would be to get a
Toolserver account.
That way, you wouldn't
I'm a Python programmer; your whole approach to strings/Unicode needs help.
The encoding issue you have isn't due to the library but rather coder error.
If you want to jump on IRC, I can talk you through the issues (the usual
shape of the fix is sketched below).
On Sun, Jul 14, 2013 at 4:23 PM, Strainu strain...@gmail.com wrote:
2013/7/14
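For illustration, the usual shape of the fix in Python 2-era code (which is
what pywikipedia ran on): decode bytes to unicode once at the boundary and
encode only on output, instead of mixing str and unicode mid-program. The
sample string is hypothetical.

# -*- coding: utf-8 -*-
raw = "Str\xc4\x83inu"        # UTF-8 bytes as received (a Python 2 str)
text = raw.decode("utf-8")    # now a unicode object: u"Străinu"
assert len(text) == 7         # character count, not byte count
out = text.encode("utf-8")    # back to bytes only when writing out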
There are several ways to import/export. Are you looking to copy just page
content, or do you need images and/or user accounts? (A sketch for plain
page content is below.)
On Tue, Jul 16, 2013 at 10:16 PM, Ryan Rick Acta rie...@yahoo.com wrote:
Hi,
I am a technical writer and I have a lot of questions.
Our company has been using
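For plain page content, one of those ways is Special:Export; a sketch
(requests assumed), whose output XML can then be fed to Special:Import or
importDump.php on the target wiki:

import requests

def export_pages(wiki_base, titles):
    # Fetch MediaWiki export XML for the given page titles.
    resp = requests.post(
        wiki_base + "/index.php",
        params={"title": "Special:Export"},
        data={
            "pages": "\n".join(titles),  # one title per line
            "curonly": "1",              # current revisions only
        },
    )
    resp.raise_for_status()
    return resp.text

xml = export_pages("https://en.wikipedia.org/w", ["Help:Contents"])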
--
From: John phoenixoverr...@gmail.com
To: Ryan Rick Acta rie...@yahoo.com; Wikimedia developers wikitech-l@lists.wikimedia.org
Sent: Wednesday, July 17, 2013 10:29 AM
Subject: Re: [Wikitech-l] Export/Import Wiki questions
There are several ways to import
Without having the origin page, making the connection wouldn't be possible.
(You would just end up suggesting the most common result instead of the
most accurate.)
On Tue, Jul 16, 2013 at 10:37 PM, Lee Worden worden@gmail.com wrote:
Maybe it could be done with just the Referer field on the
for disabling the loading of the associated JavaScript.
If users want to test VE again they can, but let's put this piece of shit to
rest until it's not a piece of shit.
John
On Mon, Jul 22, 2013 at 12:51 PM, Tyler Romeo tylerro...@gmail.com wrote:
On that note, I think we should start forcing MediaWiki
Minimal JavaScript load, my ass; I guess you must be using a fiber-optic
connection. Most pages already have a lag due to the amount of JS needed to
run the site. Jumping pages have been a normal thing since ResourceLoader
(caused by lagging JS issues).
On Mon, Jul 22, 2013 at 2:36 PM, Erik
it is rarely a better
product)
John
Screen scraping is evil. I think your issue is HTTPS
On Fri, Aug 30, 2013 at 10:02 PM, Steve Summit s...@eskimo.com wrote:
I have a bot editing script that started having trouble logging
in to the English Wikipedia a few days ago. I think what's
happening is that the login process started
Could the GeoIP check also disable the preference checkmark?
On Tue, Sep 3, 2013 at 10:05 PM, Tyler Romeo tylerro...@gmail.com wrote:
On Tue, Sep 3, 2013 at 8:43 PM, Chris Steipp cste...@wikimedia.org
wrote:
Problem is (I think) we defaulted it on, so most users in China have the
developers design and host tools (quite a few of them are long-term
stable projects)? And what can we do to remedy this?
John
files, keeping the existing file information. (That is the biggest
drawback of http://commons.wikimedia.org/wiki/Commons:Tools/Commonist.)
John
On Fri, May 25, 2012 at 11:47 AM, Platonides platoni...@gmail.com wrote:
As some of you are already aware, I'm doing for this GSoC a Desktop tool
https://bugzilla.wikimedia.org/show_bug.cgi?id=28339 has just been sitting
there, stale, for quite a while. I know, as a toolserver user, that there is
potential for a lot of useful tools. Who do I need to bribe or murder in
order to facilitate this process?
John
Old logs/events
On Wednesday, August 8, 2012, Tyler Romeo wrote:
Hey,
Maybe I'm missing something here, but why does Wikipedia still use the
Oversight extension if it has since been superseded by core functionality?
I'm sure there's a simple explanation, I just can't find it. :)
--
Do you have a list of legitimate known good accounts?
On Fri, Aug 24, 2012 at 3:27 AM, Yury Katkov katkov.ju...@gmail.com wrote:
Hi everyone!
I have found myself in the following situation several times: I
created a wiki for some event or small project, everything works fine
and after the
Given enough facts it would be rather easy for me to write a script
that nukes said spam. I did something similar on
http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
all the pages to a brand new wiki
solves this problem. Are there any other solutions that do not involve
dropping my old spammed database?
-
Yury Katkov
On Fri, Aug 24, 2012 at 4:13 PM, John phoenixoverr...@gmail.com wrote:
Given enough facts it would be rather easy for me
database! Thanks John,
that's a perfect solution!
-
Yury Katkov
On Fri, Aug 24, 2012 at 7:51 PM, John phoenixoverr...@gmail.com wrote:
What can be done after mass deleting is to purge the archive database
table, which should reduce the database size significantly. If you take
a look
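If memory serves, core also ships maintenance/deleteArchivedRevisions.php
for this; the raw-database equivalent is simply emptying the archive table,
sketched here with pymysql (credentials are placeholders, and this
permanently discards deleted revisions, so back up first):

import pymysql

conn = pymysql.connect(host="localhost", user="wikiuser",
                       password="...", database="wikidb")
try:
    with conn.cursor() as cur:
        cur.execute("TRUNCATE TABLE archive")  # destructive!
    conn.commit()
finally:
    conn.close()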
katkov.ju...@gmail.com wrote:
Hi John, thanks! Take your time! If you already have such a script,
and can share it - please do! But if not - I think it will be a good
exercise in pywikipediabot or extension development for me.
-
Yury Katkov
On Fri, Aug 24, 2012 at 7:55 PM, John phoenixoverr
, 2012 at 8:07 PM, John phoenixoverr...@gmail.com wrote:
It's rather easy to write in pywiki; I just need some information from
you about your wiki (i.e., are all edits after X date bad, we only have
Y valid users and here are their names, etc.). Stuff like that allows me
to tailor the script to your needs.
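The tailored script would be something like this sketch (pywikibot, admin
rights assumed; the whitelist is purely hypothetical, and attribute names
are worth double-checking):

import pywikibot

GOOD_USERS = {"Yury Katkov", "WikiSysop"}   # hypothetical known-good users

site = pywikibot.Site()
for page in site.allpages():
    creator = page.oldest_revision.user     # who created the page
    if creator not in GOOD_USERS:
        page.delete(reason="Mass spam cleanup", prompt=False)
        print("Deleted", page.title())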
I've got a script, but would like to test it before I make it public. If
someone has a site with spam and would let me test it, it would be
appreciated.
On Fri, Aug 24, 2012 at 12:20 PM, Derric Atzrott
datzr...@alizeepathology.com wrote:
It's rather easy to write in pywiki; I just need some
access to
said tools?
John
in charge of getting it done...
On Sun, Aug 26, 2012 at 9:05 PM, John phoenixoverr...@gmail.com wrote:
CentralAuth has been around for about 5 years now and we still lack an
API to interact with it. There is no
blocking/unblocking/locking/unlocking ability at all. See
https
What purpose would the dump serve? You don't want to keep the full dump
on the device.
On Sun, Sep 9, 2012 at 2:34 PM, Roberto Flores f.roberto@gmail.com wrote:
Greetings,
I have developed an offline Wikipedia, Wikibooks, Wiktionary, etc. app for
the iPhone, which does a somewhat decent
Take a look at http://en.wikipedia.org/w/api.php?action=parse; it is
exactly what you are looking for (sketch below). Also, a 7GB app is
something you want to CLEARLY state, as eating up that much device
space/download bandwidth is probably a problem for most users.
On Sun, Sep 9, 2012 at 3:07 PM, Roberto Flores
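A minimal sketch of that call (requests assumed): fetch the rendered HTML
for one page, which an offline app can store per-article instead of shipping
a full dump.

import requests

resp = requests.get("https://en.wikipedia.org/w/api.php", params={
    "action": "parse",
    "page": "Wikipedia",
    "prop": "text",
    "format": "json",
})
html = resp.json()["parse"]["text"]["*"]  # the rendered page HTML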
What does autopratica mean?
On Sun, Sep 16, 2012 at 10:28 AM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
Hi,
If I search for the word autopratica [1] in the Italian Wikipedia,
the article Terapia [2] comes up as the first result. That word
doesn't appear in that article. Why does it
in an old version of a page? I couldn't find
it in any recent version.
--
Amir
2012/9/16 John phoenixoverr...@gmail.com:
What does autopratica mean?
On Sun, Sep 16, 2012 at 10:28 AM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
Hi,
If I search for the word autopratica [1
=ALkJrhg95PI450ibQEbFfvza4EX5zR_uXQ
On Sun, Sep 16, 2012 at 10:57 AM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
I'm not sure that I understand what you mean.
2012/9/16 John phoenixoverr...@gmail.com:
That would be because the word cure is a redirect to that page
On Sun, Sep 16, 2012 at 10:43 AM
CCing both lists: dab just rebooted s3, which is probably the cause
On Thu, Sep 27, 2012 at 12:28 PM, Jeremy Baron jer...@tuxmachine.com wrote:
On Sep 27, 2012 12:19 PM, Harsh Kothari harshkothari...@gmail.com wrote:
Toolserver is not responding on GU:WP, so the members of Gujarati
Wikipedia
I'm CCing Tparis on this; it should be running without issue.
On Fri, Sep 28, 2012 at 9:33 AM, Harsh Kothari
harshkothari...@gmail.com wrote:
When will it start working?
Thanks
Harsh
On 28-Sep-2012, at 12:31 AM, John wrote:
CCing both lists: dab just rebooted s3, which is probably the cause
His wiki is clean. I've found that the scripts require tweaking for each wiki.
On Sat, Oct 6, 2012 at 9:21 AM, Yury Katkov katkov.ju...@gmail.com wrote:
Tyler, how are the results? John, can you upload it to some
repository? Google Code, GitHub?
P.S. Sorry for that super-late response, I
It is fairly easy to create a new user group with just that one right.
On Sunday, October 7, 2012, Marcus Buck wrote:
Hello,
On Wikipedia the right to import articles via Special:Import is bound to
the user group 'administrator' by default.
On nds.wp we discussed that it would be useful for
According to http://dumps.wikimedia.org/backup-index.html the last dump was
2010-11-10 04:30:28. Is that page broken, or are dumps still halted?
It's just a matter of matching page titles: if there is a page in namespace 0
and a page in namespace 1 (article and article talk) with the same title, they
go together. It's fairly simple; see the sketch below.
John
On Sat, Jan 8, 2011 at 11:29 AM, Diederik van Liere dvanli...@gmail.com wrote:
Dear dev's,
I am
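The matching itself is a few lines; a sketch assuming page records come in
as (namespace_number, bare_title) pairs, e.g. parsed out of a dump:

def pair_articles_with_talk(pages):
    # pages: iterable of (namespace_number, bare_title) tuples.
    articles = {title for ns, title in pages if ns == 0}
    talk = {title for ns, title in pages if ns == 1}
    return {(t, "Talk:" + t) for t in articles & talk}

pairs = pair_articles_with_talk([(0, "Foo"), (1, "Foo"), (0, "Bar")])
# -> {('Foo', 'Talk:Foo')}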
Yeah, all you need to do is remove the incorrect links from all affected
articles.
You sorta did that with -localright. However, that just fixed the correct
article but still left some articles pointing to the wrong article. You need
to fix every article; as long as one page has the wrong link it
The issues should be fixed; if you continue to have issues, let me know.
Depends on the language; enwiki can take weeks to dump, while rue wiki may
only take 30 seconds.
On Tue, Feb 15, 2011 at 12:34 PM, Anthony Ventresque (Dr)
aventres...@ntu.edu.sg wrote:
On Tue, Feb 15, 2011 at 9:29 AM, Anthony Ventresque (Dr)
aventres...@ntu.edu.sg wrote:
I was indeed
What is your username?
John
On Thu, Mar 10, 2011 at 11:10 AM, William Allen Simpson
william.allen.simp...@gmail.com wrote:
Recently, I've had a rare need to check interwiki links. I've discovered
that many times, the universal login credentials are not working.
For example, after login
a
note on the mailing list
John
Reply from 208.80.152.2: bytes=32 time=164ms TTL=45
Ping statistics for 208.80.152.2:
    Packets: Sent = 4, Received = 1, Lost = 3 (75% loss),
Approximate round trip times in milli-seconds:
    Minimum = 164ms, Maximum = 164ms, Average = 164ms
Hope this helps someone figure out the problem
John
A user, Jan Kucera (Kozuch) (with email of garba...@seznam.cz), has been
mass-changing the priority of bugs based on a very crude vote count. This is
very disruptive and counterproductive. I would ask that one of our
devs mass revert this, please.
John
See bug 19311 (https://bugzilla.wikimedia.org/show_bug.cgi?id=19311); it's a
known issue.
John
On Tue, Aug 30, 2011 at 4:55 AM, Billinghurst billinghu...@gmail.com wrote:
I am trying to relate the edit counts at English Wikisource and I am
wondering whether one
or more of the namespace
I've fixed the offending article; it was a simple case of vandalism that was
reverted, but was also added to the persondata template and never removed
because it's not visible.
John
mysql> select count(*) from archive;
+----------+
| count(*) |
+----------+
| 33263574 |
+----------+
1 row in set (8 min 47.50 sec)
On Sun, Sep 25, 2011 at 5:13 PM, melvin_mm melvin...@gmx.de wrote:
Bryan Tong Minh bryan.tongminh at gmail.com writes:
On Sun, Sep 25, 2011 at 9:09 PM,
It would be rather easy to find those numbers; just parse the upload
logs (grab a dev or file a DBQ request:
https://jira.toolserver.org/browse/DBQ). A sketch via the API is below.
John
On Thu, Sep 29, 2011 at 4:53 PM, Strainu strain...@gmail.com wrote:
Hey guys,
I just wanna say a _big_ thank you to all the people
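The API route, for illustration (requests assumed; Commons as the example
endpoint): paging through list=logevents with letype=upload gives the same
numbers a DBQ query would, though scoping by date is advisable on a wiki
this size.

import requests

API = "https://commons.wikimedia.org/w/api.php"

def count_uploads():
    total, cont = 0, {}
    while True:
        params = {
            "action": "query",
            "list": "logevents",
            "letype": "upload",
            "lelimit": "max",
            "format": "json",
        }
        params.update(cont)
        resp = requests.get(API, params=params).json()
        total += len(resp["query"]["logevents"])
        if "continue" not in resp:
            return total
        cont = resp["continue"]

print(count_uploads())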
irc.wikimedia.org has been down for six hours without any information. Can
someone please take a look and give us some information?
John
We are currently having a widespread failure, with no one on IRC.
John
I submitted https://bugzilla.wikimedia.org/show_bug.cgi?id=31469 when 1.18
was rolled out. Is there any way that we can get this issue fixed for the
1.18 tarball?
Uh, creating sleeper accounts from good IPs, letting them go stale beyond
CU retention, and you have an infinite number of accounts you can then use
to skip past the softblocks on Tor and create havoc. Anything short of a
hard block won't stop open proxy abuse.
On Wed, Oct 1, 2014 at 10:44 AM,
And any kind of account creation block will cause issues with users who
work across multiple projects, as SUL auto-account creation is also blocked.
On Wed, Oct 1, 2014 at 10:57 AM, John phoenixoverr...@gmail.com wrote:
Uh, creating sleeper accounts from good IPs, letting them go stale beyond
Prior to Tor being enabled, we need to be able to flag both logged-in and
logged-out edits made via Tor.
On Wed, Oct 1, 2014 at 11:00 AM, Brian Wolff bawo...@gmail.com wrote:
On Oct 1, 2014 11:40 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org
wrote:
On Wed, Oct 1, 2014 at 10:29 AM, Brian
The AbuseFilter has no way of identifying Tor exit nodes, so it cannot
be used for this. Some developer will need to interface with the Tor
blocking code and use the same Tor identification methods to ID and label
both logged-in and logged-out edits made via Tor.
My example means that unless Tor is hard-blocked, attackers can create 6
accounts per day (the default per-IP account creation throttle) on their
home IP, just wait till they go stale, and then use 6 attack accounts per
day. There isn't a need for infinite accounts; it's just that soft blocking
is pointless in this case.
On Wednesday, October 1, 2014,