Tim Starling wrote:
> We still haven't heard from Faidon who, last I heard, still reads his
> emails by piping telnet into less or something. But I think he can
> make sense of multipart/alternative as long as it's not base-64
> encoded...
I'm not Faidon, and I'm not even a regular contributor to
This probably isn't the right place to report this, but if anyone
knows anyone who maintains GeoHack, today it's emitting KML
placemark names containing raw ampersands, which Google Earth
doesn't like. Example: [[Baltimore & Ohio Railroad Bridge,
Antietam Creek]]. Manually editing it to &amp;amp; solves
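The fix on the generator side is standard XML escaping; here is a minimal sketch (illustrative only, not GeoHack's actual code) showing that a raw "&" makes the KML ill-formed, while an escaped one parses:

```python
# Sketch: raw "&" is illegal in XML text nodes, so a placemark name
# like "Baltimore & Ohio Railroad Bridge" must be escaped to "&amp;"
# before being emitted as KML.
from xml.sax.saxutils import escape
import xml.etree.ElementTree as ET

name = "Baltimore & Ohio Railroad Bridge, Antietam Creek"

bad_kml = "<Placemark><name>%s</name></Placemark>" % name
good_kml = "<Placemark><name>%s</name></Placemark>" % escape(name)

# The unescaped version is not well-formed XML; the escaped one is.
try:
    ET.fromstring(bad_kml)
    bad_parses = True
except ET.ParseError:
    bad_parses = False

good_tree = ET.fromstring(good_kml)  # parses cleanly
```

Google Earth is within its rights to reject the unescaped form; any conforming XML parser must.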
bawolff wrote:
For comparison, how many revision control systems allow editing commit
messages?
Perforce does.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Martijn Hoekstra wrote:
Wow, that escalated quickly. How did we go from "hey, what's the deal
with this?" to "YOU'RE BURNING THE WIKI" in a few posts?
Easy: because it's a hard question, with excellent arguments on
both sides.
Clearly, people are going to make typos in edit summaries from
time to
bawolff wrote:
On Apr 21, 2014 9:21 PM, Lars Aronsson l...@aronsson.se wrote:
Do we have any guidelines for how to hand-write the
source code of SVG diagrams? Should we?
Really this seems like the domain of Commons to set standards for SVG
writing. I think this should be brought up over
That would be a good guess, but the script handles redirects and
https just fine -- or at least it did, when those changes went
into effect a month ago. It was working fine up until this past
Tuesday or Wednesday, when it stopped being able to log in.
Tyler Romeo wrote:
My guess is either the
Ah, spoke too soon.
It was handling redirects and https, but not always redirects
and https and POST. But if I simply reconfigure the script to
hit the https: addresses from the beginning (meaning the server
doesn't have to send any redirects at all), everything works
fine. Dunno why I didn't
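A plausible explanation (an assumption on my part, not confirmed in the thread): most HTTP clients, following historical browser behavior, rewrite a POST into a GET when chasing a 301/302 redirect, discarding the body. A login POST to the http: URL that gets redirected to https: then arrives as a bodyless GET, which is exactly why pointing the script straight at https: fixes it. A toy model of that client rule:

```python
# Hypothetical sketch of why "redirects + https + POST" can fail:
# on 301/302 (and 303), typical clients retry the redirected request
# as GET and drop the body; only 307 preserves the original method.

def method_after_redirect(method, status):
    """Return the method a typical HTTP client uses after a redirect."""
    if status in (301, 302, 303) and method == "POST":
        return "GET"          # body is discarded
    return method             # 307/308 (and non-POSTs) keep the method

print(method_after_redirect("POST", 302))  # GET  -> login data lost
print(method_after_redirect("POST", 307))  # POST -> would have worked
```

Had the server used 307 for the http-to-https redirect, the POSTing script might never have noticed.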
I have a bot editing script that started having trouble logging
in to the English Wikipedia a few days ago. I think what's
happening is that the login process started using Javascript in a
way it didn't before, and is detecting that my script doesn't do
Javascript (which it doesn't), and throwing
Today I'm noticing that if I visit someone else's user or talk
page (this is on en.wp), I see a little orange box saying
"Talk: you have new messages" even though I don't. Presumably
that user does, or something.
Risker wrote:
On 23 July 2013 15:32, Robert Rohde raro...@gmail.com wrote:
* Corrupted page content that appears to be caused by the unfamiliar
UI (e.g. <nowiki>[[Foo]]</nowiki>)
Why do you think those nowiki tags were added by the editors?
I assume that since it's VE's job to be wysiwyg, and to
Tyler Romeo wrote:
On Mon, Jul 22, 2013 at 9:35 PM, James Forrester
jforres...@wikimedia.org wrote:
Each added preference adds to the complexity of our software -
so increasing the cost and slowness of development and testing,
and the difficulty of user support.
Stop being so dramatic. This
S Page wrote:
Note Mediawiki.org doesn't have a Git tutorial. There are tons of those
on the web...
Git+Gerrit is fundamentally hard and complicated...
So perhaps there could be a little section somewhere saying
something like:
Using Git and Gerrit effectively requires understanding
David Gerard wrote:
This page came up with raw mathtex, then I saw a math rendering xx%
counter at bottom right, then 15 seconds later I had the page:
https://en.wikipedia.org/wiki/Noether's_theorem
I admit this sort of page would make a good stress test ...
Possibly related: the Math
MZMcBride wrote:
Ryan Lane wrote:
Again with the phrasing. Cut it out.
Sincerely, I'm still a little unclear what phrasing you object to here. Just
to be perfectly clear, it's the use of the word mess, right? If so, I can
make note not to use that word going forward on this list.
It may be
MZMcBride wrote:
Right... well, again, just like the OP, you're focusing on how you feel the
world should be while completely ignoring reality. It's not a matter of
catering to obstinate IT folks. It's a matter of being pragmatic about the
current landscape and its limitations.
This touches,
William Allen Simpson wrote:
This replacement password is much more easily guessed.
The account could have been stolen within minutes or hours.
Is this true? (Yes, I know that a fast machine can try zillions
of passwords per hour in theory, but for a reasonably designed
system, certainly not
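A back-of-envelope check supports the skepticism, at least for *online* guessing against a rate-limited server (the alphabet, length, and guess rate below are my assumptions, not measured values):

```python
# Rough arithmetic: expected time to guess a random temporary
# password by hammering the login endpoint remotely.
alphabet = 26 + 10          # lowercase letters + digits (assumption)
length = 8                  # assumed temp-password length
keyspace = alphabet ** length

guesses_per_second = 100    # generous for a remote, throttled server
expected_seconds = keyspace / 2 / guesses_per_second  # half the space

years = expected_seconds / (365 * 24 * 3600)
print(keyspace)             # 2821109907456
print(round(years))         # ~447 years on average
```

Offline cracking of a stolen hash is a different story entirely, but "stolen within minutes or hours" via the login form seems implausible under any sane rate limiting.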
Would I sound like a reactionary old crank if I asked why
the coding challenge welcome page requires JavaScript?
Without it, one can't even see the list of challenges!
Platonides wrote:
jida...@jidanni.org wrote:
On some of the Wikipedia sites, there are some messages near the top of
each page. [...] This causes the entire page to jerk up and down the
screen...
Do they change in the same page view, or when changing pages?
(I have seen the latter, when
Emmanuel Engelhart wrote:
Titles should be stored in the page table with the first letter uppercased...
Unfortunately, it seems that we have XML dumps (and consequently
mwdumper-generated SQL) containing titles with the first letter lowercased.
For example:
$wget
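The normalization being described can be sketched roughly as follows (simplified; MediaWiki's real title code also handles namespaces, Unicode case-folding special cases, and wikis that disable first-letter capitalization):

```python
# Rough sketch of first-letter uppercasing as applied to page titles
# on wikis with capitalized links enabled, before storage in the
# page table.
def ucfirst_title(title):
    """Uppercase only the first character of a title."""
    return title[:1].upper() + title[1:] if title else title

print(ucfirst_title("iPod"))        # IPod
print(ucfirst_title("main Page"))   # Main Page
```

A dump whose titles bypass this step will produce rows that the wiki itself would never have written.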
Roan Kattouw wrote:
[The volunteers'] role, IMO, is to keep the collaborative environment
positive. This means being welcoming to new staff, embracing them,
pat them on the shoulder when they do things right and correct them
when they do things wrong, while keeping their patience.
I feel
David Gerard wrote:
On 13 August 2010 22:05, Aryeh Gregor simetrical+wikil...@gmail.com wrote:
Oracle is only suing Google because Google is redistributing Java
without paying them, and because they're using a modified version (so
technically they're not covered by the patent grants), and
daniel wrote:
I have put some basic info about requiring the User-Agent header at...
This way, there's a place where we can point people for more info.
Thanks, but FWIW, the very first sentence:
Wikimedia sites require a HTTP User-Agent header for all requests.
is false. (As near as I
Yes, that's precisely the violation of Postel's Law I was
thinking of.
Steve, someone is sending us this User-Agent, is that you? :))
No. :-|
Let me tell you a story. Once upon a time, there was a browser
named SeaMonkey...
I have no idea what point you were trying to make there (I had
Domas wrote:
We don't use UA as first step of analysis, it was helpful tertiary tool...
But it's now being claimed (one might assume, in defense of the
new policy) that disallowing missing User-Agent strings is cutting
20-50% of the (presumably undesirable) load. Which sounds pretty
primary.
Conrad wrote:
Given the lack of any evidence, I assert that most of the percentage
of people who a) notice a problem, b) care, c) know how to fix it;
probably deserve to be using the resources anyway. Besides anyone who
doesn't deserve but still fixes the problem will likely be able to, and
Ariel Glenn wrote:
I understand it's aggravating to people who didn't get notice;
let's look forward. Please just add the UA header and your tools /
bots / etc. will be back to working. Thanks.
Well, sorry, no, it's not quite like that. A few of us -- though
I fear an inconsequential
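For anyone who does just want their tool working again, the fix Ariel describes is a one-liner in most clients. A minimal Python sketch (the header value is illustrative; use something identifying your own tool and a contact address):

```python
# Send a descriptive User-Agent instead of the library default.
import urllib.request

req = urllib.request.Request(
    "https://en.wikipedia.org/w/api.php"
    "?action=query&meta=siteinfo&format=json",
    headers={"User-Agent": "ExampleBot/1.0 (contact: you@example.org)"},
)
# urllib.request.urlopen(req) would now pass the User-Agent check.
print(req.get_header("User-agent"))
```

The same applies to curl (`curl -A 'ExampleBot/1.0 ...'`) and to any HTTP library with a headers parameter.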
Robert Rohde wrote:
If you're going to do such blocking, can we PLEASE finally find a way to
set up a more informative error message for blocked user agents...
When the new code blocks requests with missing User Agent strings
(which is, oddly, not all of the time), it is with a 403
Forbidden
Domas wrote:
from now on specific per-bot/per-software/per-client User-Agent
header is mandatory for contacting Wikimedia sites.
Oh, my. And not just to be a bot, or to edit the site manually,
but even to view it. You can't even fetch a single, simple page
now without supplying that header.
Domas wrote:
Hi Steve,
But why?
Because we need to identify malicious behavior.
You're trying to detect / guard against malicious behavior using
*User-Agent*?? Good grief. Have fun with the whack-a-mole game, then.
Since some sites have regexes that assume that major version is one
character long, the Opera developers had to resort to reporting a 9.x
version in the old place, and append the actual version later.
Good grief. That's one of the stupidest things I've heard in some time.
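To illustrate the breakage (the regex below is my own strawman, not any specific site's code): a pattern that hard-codes a one-digit major version misreads "Opera/10" as Opera 1, which is why Opera 10 kept "9.80" in the traditional slot and appended the real version at the end of the UA string.

```python
# A naive browser-sniffing regex that assumes a single-digit major
# version, applied to Opera 10's real and workaround UA strings.
import re

naive = re.compile(r"Opera/(\d)")   # grabs only the first digit

ua10 = "Opera/10.00 (Windows NT 6.0; U; en) Presto/2.2.0"
workaround = "Opera/9.80 (Windows NT 6.0; U; en) Presto/2.2.15 Version/10.00"

print(naive.search(ua10).group(1))        # "1" -> misread as Opera 1
print(naive.search(workaround).group(1))  # "9" -> hits the decoy slot
```

The same class of bug famously bit "Mozilla/4" sniffing and Windows 9x version checks; UA strings ossify around the worst parser in the wild.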
dgerard wrote:
Are you *sure* we can't put a narky message when iPhone users click a
video? Adobe do!
http://twitpic.com/kf361
I'm not up on the details of Flash, so this comment may be
misguided, but *if* the reason Apple restricts these unstated
technologies is for security reasons, then
Gregory Maxwell wrote:
For instance, take the UK service providers surreptitiously modifying
Wikipedia's responses on the fly to create a fake 404 when you hit
particular articles.
Urk. (Can someone cite the details?)
(2) You could script clients to kick users to a malware installer...
I don't know anything about link preview popups, but does the
issue discussed in this thread:
http://en.wikipedia.org/wiki/Wikipedia:Help_desk/Archives/2009_June_12#.27Adult.27_picture_on_the_Help_Desk.3F
indicate a buglet that could/should be fixed? Should the preview
code know about
Andrew Garrett wrote:
On Wed, Feb 18, 2009 at 7:12 PM, Steve Summit s...@eskimo.com wrote:
Sometime between yesterday and today, the edit summary field on
en.wp's edit page lost its type=text attribute.
Is it causing any problems?
No, just a curiosity.
It was part of some much-needed code
Sometime between yesterday and today, the edit summary field on
en.wp's edit page lost its type=text attribute. It now reads:
<input name="wpSummary" size="60" value="" id="wpSummary"
maxlength="200" tabindex="1" />
Lo and behold, type=text is the default, so it doesn't actually
break a
mizusumashi wrote:
Please see [[w:en:User:Mizusumashi/workspace]] with Firefox.
Don't you see the moved [edit] links near the second image?
I see two sections and two edit links, both of them moved down to
roughly the bottom edge of the first section's image. I see this
all the time on the real
Jeff Ferland wrote:
You'll need a quite impressive machine to host even just the current
revisions of the wiki. Expect to expend tens to hundreds of
gigabytes on the database alone for Wikipedia using only the current
versions.
No, no, no. You're looking at it all wrong. That's