entation and too much stuff that feels like it might change further.
I'll wait until the release notes indicate impending doom!
Kind regards,
- Mark Clements
(HappyDog)
---
> Best,
> Máté Szabó
> Sr. Software Engineer
> he - him - his
>
> Fandom Poland sp. z o.o. z siedz
an make your code ready for Parsoid-PHP.
Based on that comment, I suspect that further upgrade work will be required
in due course, but at least I have solved the immediate problem for now!
Thanks for your help,
- Mark Clements
(HappyDog)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
implementation. It would be useful if there were some
migration documentation to help extension developers migrate to the new
implementation. Is that on the roadmap, somewhere?
Kind regards,
- Mark Clements (HappyDog)
"Máté Szabó" wrote in message
news:80e88bac-ae9b-42ae-a0ba-834a39a7
od for manually expanding
template variables from within a parser hook?
Kind regards,
- Mark Clements (HappyDog)
--
Full example (with extension-specific code omitted):
--
function MyExtensionParserHook($Text, $Vars,
Thanks Gergo - that's very helpful.
- Mark
"Gergo Tisza" wrote in message
news:caevcxn08n4vu2ssyeatr7jfprhaiohovajpxgorrpgnqgho...@mail.gmail.com...
> On Tue, Jul 14, 2020 at 5:49 PM Mark Clements (HappyDog) <
> gm...@kennel17.co.uk> wrote:
>
>> I have an exte
get an answer. [1]
And I was literally coming onto this newsgroup to ask about this (and one
other thing, already posted), so it's very fortuitous timing! I'll take a
look and if I have any questions, I'll post on the talk page.
Thank you,
- Mark Clements (HappyDog)
[1] https://www.mediawiki.org/wi
edundant due to
LinksUpdate calls, or not?
Kind regards,
- Mark Clements (HappyDog)
(or technical limitations) that resulted in
MediaWiki handling character encodings in this manner.
- Mark Clements (HappyDog)
[1] https://www.mediawiki.org/wiki/Manual:$wgDBmysql5
Thanks for all your great responses and tips! I'm really looking forward to
it.
See you there!
- Mark Clements (HappyDog)
Tim Starling tstarl...@wikimedia.org wrote in message
news:lqhsp5$aco$1...@ger.gmane.org...
On 21/07/14 00:24, Marc A. Pelletier wrote:
On 07/19/2014 05:47 AM, Mark
Hi there,
As it's on my doorstep, I've got myself tickets to Wikimania 2014, including
the hackathon.
I've never been to a hackathon before! What should I expect? Are any of
you guys coming?
- Mark Clements
HappyDog
Andre Klapper aklap...@wikimedia.org wrote in message
news:1352303717.10307.18.ca...@embrace.foo...
On Tue, 2012-11-06 at 14:10 -0800, Quim Gil wrote:
Andre, I don't think we need a new resolution WAITING_FOR_UPSTREAM.
After reading Krinkle's and your email I agree that there is no urgent
dan entous d_ent...@yahoo.com wrote in message
news:75e303e6-3ec7-4a17-824e-c3d8d3378...@yahoo.com...
how about here :
http://www.mediawiki.org/wiki/Manual:Contents
... and add in any resources that may be missing to those pages.
with kind regards,
dan
I would much rather it went to a new
be no bad thing).
- Mark Clements (HappyDog)
will continue on
the subsequent pages. In books where all the photographic plates are
next to each other (which is very common, due to the printing/binding
process) the gap could be of an arbitrary length.
- Mark Clements (HappyDog)
at ci.tesla.usability.wikimedia.org is taking too long to
respond.
Have tried several times over the past few hours, and no joy.
- Mark Clements (HappyDog)
Mark A. Hershberger m...@everybody.org wrote in message
news:874of3r2xp@everybody.org...
Mark Clements (HappyDog) gm...@kennel17.co.uk writes:
http://ci.tesla.usability.wikimedia.org/
The connection has timed out
Have tried several times over the past few hours, and no joy.
Sorry
, not for support issues, so you are
correct that mediawiki-l is probably the right place to post this kind of
question.
- Mark Clements (HappyDog)
of the software, they should be
supported. It is absolutely bonkers to ship with unsupported features,
particularly for such a critical element as the user interface.
- Mark Clements (HappyDog)
Clements (HappyDog)
or they could update
http://www.mediawiki.org/wiki/Mailing_lists with the new admin details.
Cheers,
-- Mark Clements (HappyDog)
What happened here? I thought the list didn't accept attachments!
- Mark Clements (HappyDog)
Alexander Shulgin alex.shul...@gmail.com wrote in message
news:729abac10911170053t1c3147e5q49d2b8ac2b6c1...@mail.gmail.com...
Hello everyone,
I'm totally new to this list, so please pardon me
Robert Rohde raro...@gmail.com wrote in message
news:b4da1c6e0911120739o1636328j37f56a01ceccf...@mail.gmail.com...
On Thu, Nov 12, 2009 at 5:00 AM, Mark Clements (HappyDog)
gm...@kennel17.co.uk wrote:
snip
This was changed during the parser rewrite several versions ago. The
restructuring
Robert Rohde raro...@gmail.com wrote in message
news:b4da1c6e0910061725m326860f1i402edfd9e6629...@mail.gmail.com...
On Tue, Oct 6, 2009 at 4:47 PM, Mark Clements (HappyDog)
gm...@kennel17.co.uk wrote:
[SNIP - Re: ParserAfterStrip]
On MW 1.14 and above the code within the nowiki tags is parsed
Clements (HappyDog)
/testwiki/BugSquish
Cheers,
- Mark Clements (HappyDog)
parser handles just fine: '''Photo of L'''arc de triomphe'' by 'John
- Mark Clements (HappyDog)
[1] I'm ignoring all the document-structure requirements, plus
character-encoding issues, etc. that complicate things a bit.
talk:Foo
Template:Foo
User talk:Foo
User:Foo
This is probably desirable, but I thought it worth pointing out...
- Mark Clements (HappyDog)
as it is...
- Mark Clements (HappyDog)
be fixed.
- Mark Clements (HappyDog)
then include(../index.php) instead of
commandLine.inc, and voila - all code-duplication at the various entry
points has been removed.
- Mark Clements (HappyDog)
are unaware
of.
True - and not specifying a scope means the visibility defaults to public.
Therefore requiring a scope to be declared can only be a good thing.
- Mark Clements (HappyDog)
. All the name-conflict
problems that would occur in any attempt to resolve the changing image file
format problem would obviously apply here, but that might be better than
dropping the type information given by an extension altogether.
- Mark Clements (HappyDog)
known-to-fail tests which start to pass.
I don't see a need for any extra command-line args if implemented as
described above.
- Mark Clements (HappyDog)
Brion Vibber br...@wikimedia.org wrote in message
news:4a6dc653.9010...@wikimedia.org...
On 7/27/09 6:05 AM, Mark Clements (HappyDog) wrote:
As long as one requires files have explicit type suffixes (e.g.
.jpg, .svg, etc), one can use the allowed list to determine what
file names to translate
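The idea quoted above (use explicit type suffixes plus an allowed list to decide which file names to handle) can be sketched as follows. This is a toy illustration only, not MediaWiki's actual code; the function name and the contents of the allowed list are hypothetical.

```python
# Toy sketch (NOT MediaWiki's implementation): decide whether a file name
# should be handled, based on it carrying an explicitly allowed suffix.
ALLOWED_SUFFIXES = {".jpg", ".png", ".svg"}  # hypothetical allowed list

def should_translate(filename: str) -> bool:
    """True only when the name ends in an explicitly allowed type suffix."""
    dot = filename.rfind(".")
    if dot == -1:
        return False  # no explicit suffix, so we cannot classify the file
    return filename[dot:].lower() in ALLOWED_SUFFIXES

print(should_translate("Diagram.SVG"))  # suffix check is case-insensitive
print(should_translate("README"))       # no suffix at all, so skipped
```

Requiring the suffix up front is what makes the decision cheap: no content sniffing is needed, only a set lookup on the last dot-segment of the name.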
Platonides platoni...@gmail.com wrote in message
news:h2685k$3d...@ger.gmane.org...
Steve Bennett wrote:
On Thu, Jun 25, 2009 at 8:43 PM, Mark Clements
(HappyDog)gm...@kennel17.co.uk wrote:
I noticed the problem here:
http://en.wikipedia.org/wiki/Neil%27s_Heavy_Concept_Album
Now that you
Steve Bennett wrote in message
news:b8ceeef70906250229s1211f650nbb5dc10d4a13e...@mail.gmail.com...
On Wed, Jun 24, 2009 at 11:19 PM, Mark Clements (HappyDog) wrote:
I've just noticed that on English Wikipedia links such as [[Pink
Floyd]]'s
no longer include the 's as part of the resulting
I've just noticed that on English Wikipedia links such as [[Pink Floyd]]'s
no longer include the 's as part of the resulting link. Is this a parser
bug, or a deliberate change in the way links are being parsed? I don't
recall seeing an announcement about it.
- Mark Clements (HappyDog)
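The behaviour being asked about is governed by the parser's "link trail": characters immediately after `[[...]]` that match a per-language pattern get absorbed into the link text, and for English that pattern is roughly lowercase letters only, so an apostrophe stops the trail. The sketch below is a simplified approximation for illustration, not the real parser code; the function name and return shape are hypothetical.

```python
import re

# Simplified sketch of MediaWiki's "link trail" behaviour (NOT the real
# parser): letters matching the trail pattern directly after [[...]] are
# absorbed into the link text. English uses approximately /^([a-z]+)/,
# so the apostrophe in [[Pink Floyd]]'s ends the trail immediately.
LINKTRAIL = re.compile(r"^([a-z]+)")

def absorb_trail(target: str, following: str) -> tuple[str, str]:
    """Return (link_text, leftover_text) after absorbing the link trail."""
    m = LINKTRAIL.match(following)
    trail = m.group(1) if m else ""
    return target + trail, following[len(trail):]

print(absorb_trail("Pink Floyd", "'s fans"))  # apostrophe blocks the trail
print(absorb_trail("Pink Floyd", "s fans"))   # a bare "s" joins the link
```

Under this model, `[[Pink Floyd]]'s` linking only "Pink Floyd" is the pattern working as designed rather than a parser bug, since `'` is outside the trail character class.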
, etc.)
* Comments
* Category links
* Extension-supplied HTML-style tags.
For a relatively complete list, and some exploration of the problems, see
http://www.mediawiki.org/wiki/Markup_spec.
- Mark Clements (HappyDog)
.
http://ria.apri.nl/mediawiki-1.14.0/index.php?title=Main_Page&action=edit&oldid=25
(compare text in standard edit box before and after clicking 'save to
wikitext').
This is the kind of thing I'm talking about... :-)
- Mark Clements (HappyDog)
is, by definition
'the right result'.
No, because the parser currently behaves inconsistently with this particular
phrase (which I have logged as bug 18765 [1]).
- Mark Clements (HappyDog)
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=18765
Chad innocentkil...@gmail.com wrote in message
news:5924f50a0904220554n32c3a4ecrd1cbc8cebcd74...@mail.gmail.com...
On Wed, Apr 22, 2009 at 7:12 AM, Mark Clements (HappyDog)
gm...@kennel17.co.uk wrote:
[Description of WikiDB implementation of revision handling SNIPPED]
Perhaps we should add
the performance hit). Wouldn't solve the problem of non-PHP files
being updated, but would solve the rest.
- Mark Clements (HappyDog).
if it's relevant to Wikimedia
You're right - it is a wiki, which is why I trust the information that's
been there since 2006 rather than the information that was added this
morning.
- Mark Clements (HappyDog)
sure you'll come to terms with it eventually.
- Mark Clements (HappyDog)
methods if
memcached is not available? Obviously this is not applicable to WMF wikis,
but I imagine the majority of wikis run without any external caching (beyond
whatever is present in MW itself).
- Mark Clements (HappyDog)
be, but I'm sure there
are some uses, for example if you have had $wgMaxRedirects set to a high
number but the server load is too great, and so you want to reduce it. This
would allow you to find all affected redirects and fix them before you make
the change to the setting.
- Mark Clements (HappyDog)
a parser function that sets tab text directly would be better,
e.g. {{#settab:nstab|Main Page}}{{#settab:talk|Main Page discussion}}
And that should be in an extension.
- Mark Clements (HappyDog)
Aryeh Gregor simetrical+wikil...@gmail.com wrote in message
news:7c2a12e20901071221w43eef7b5ld19e3ba47760d...@mail.gmail.com...
On Wed, Jan 7, 2009 at 3:17 PM, Mark Clements (HappyDog)
gm...@kennel17.co.uk wrote:
And that should be in an extension.
Probably ParserFunctions.
I think
than are actually available, at least it does to me.
In software I have developed, we tend to use sysadmin and techadmin to
differentiate the two.
- Mark Clements (HappyDog).
Roan Kattouw [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]
Mark Clements (HappyDog) schreef:
It doesn't, and it isn't... :-(
That said, the PHP_Compat PEAR module contains a PHP4 version of the
clone
function (providing you use clone($wgParser)) which I might try
Platonides [EMAIL PROTECTED] wrote in
message news:[EMAIL PROTECTED]
Mark Clements (HappyDog) wrote:
I don't do this on page view (in the data tag handler), as it is a
relatively expensive operation to clear out the old entries and reparse
the
data tags, and the data will not have changed