be awesome!
Thanks a lot,
Aran
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Yep, I just had a look and all the searchindex tables are empty, and the
tables don't seem to be accessed at all when using $wgSearchType =
'DcsCirrusSearch', so there's definitely no need for any MyISAM anywhere :-)
On 28/07/19 2:33 PM, Aran via Wikitech-l wrote:
We're using InnoDB > 5.7 so full text search is definitely supported -
but also we use Elastic search anyway, does this mean we don't even need
the searchindex table?
On 28/07/19 1:07 PM, Manuel Arostegui wrote:
> On Sat, Jul 27, 2019 at 4:03 PM Bartosz Dziewoński
> wrote:
>
>> The
that InnoDB is the best option,
especially for recent versions. Would it be a good idea to change them
all to InnoDB?
Thanks,
Aran
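For anyone attempting this, the conversion itself is one statement per table; a minimal sketch, assuming shell access to MySQL (the table list here is illustrative, not the full MediaWiki schema — in practice you'd enumerate the MyISAM tables with SHOW TABLE STATUS WHERE Engine='MyISAM'):

```shell
# Emit one ALTER statement per table; pipe the output into the mysql client.
# The table names below are hypothetical examples.
tables="searchindex objectcache l10n_cache"
for t in $tables; do
  printf 'ALTER TABLE %s ENGINE=InnoDB;\n' "$t"
done
```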
of threads assigned to each
group...? But then why does the "gwt" group have runners=0?
Thanks,
Aran
the prioritisation of various job types differently. All
I've really been able to find about the configuration is the README in
the repo.
Thanks,
Aran
to always get executed almost immediately, which makes the
job-queue a workable solution.
Thanks,
Aran
On 18/06/19 11:41 AM, Kosta Harlan wrote:
> My understanding is that eventually there will be enforcement in the
> WMF production environments, but I’m not sure about MediaWiki itself.
>
In a MediaWiki-based project I'm working on I'm getting many of these
kinds of exceptions: [DBPerformance] Expectation (writes <= 0) by
MediaWiki::main not met (actual: 8)
I've read up on the Database transactions article on mediawiki.org and
can see that to remove the exceptions we'd need to
"No file by this name exists, but you can upload
it".
I've tried running all the related maintenance scripts such as
rebuildImages.php with no success :-(
Any ideas?
Thanks,
Aran
>
> Hope this helps!
> --scott
>
> On Fri, Oct 19, 2018 at 7:25 AM Aran via Wikitech-l
> <wikitech-l@lists.wikimedia.org> wrote:
>
> Hi,
>
> just wondering what the situation is with
> $wgVisualEditorParsoidForwardCookies these days. The doc
works fine without me
having set this or allowed anonymous edits by localhost. Is there no
longer any need to worry about settings specifically for locked down wikis?
Thanks,
Aran
erard wrote:
> https://wikimediafoundation.org/wiki/Ways_to_Give
>
> Thank you very much!
>
>
> - d.
>
> On 22 August 2017 at 23:58, Aran <a...@organicdesign.co.nz> wrote:
>> If you guys had a bitcoin option in your donation form you'd get more
>> donations (like from me!). I
If you guys had a bitcoin option in your donation form you'd get more
donations (like from me!). I don't have any balance in paypal, but
sending some bitcoin would be easy.
Perhaps 1.28's script already does this since my 1.28's didn't have this
problem, their backlinks were up to date immediately after running the
script.
On 02/06/17 16:34, zppix e wrote:
> Sounds like a decent compromise. I'm not too familiar with the code behind
> maintenance scripts to do this
made. Thus I needed to refresh the links manually with the
maintenance script.
On 31/05/17 15:17, Aran wrote:
> No I just ran the refreshLinks.php maintenance script from the shell.
>
>
> On 31/05/17 15:12, יגאל חיטרון wrote:
>> Did you use a bot? An API javascript? In any case
No I just ran the refreshLinks.php maintenance script from the shell.
On 31/05/17 15:12, יגאל חיטרון wrote:
> Did you use a bot? An API javascript? In any case, try the rest one.
> Igal
>
> On May 31, 2017 21:04, "Aran" <a...@organicdesign.co.nz> wrote:
>
>>
Strange, I've just noticed after coming back to it that the 1.27's are
looking fine now... does the refreshLinks script launch child processes
in the background or something?
On 31/05/17 15:03, Aran wrote:
> Hello,
>
> I have some wikis which have had some of their content
's in the same situation the links updated fine after
running the script... is this a known issue? is there any way I can
update the links in the 1.27's without upgrading the wiki?
Thanks,
Aran
and --batchSize and all sorts, but nothing allows it to do more than this
amount at a time. Anyone have any idea what might be going on here?
(It's happening on both MW 1.27 with Elasticsearch 1.7.5 and MW 1.28
with Elasticsearch 2.4.4)
Thanks,
Aran
Fri, Mar 31, 2017 at 8:25 AM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hello,
>>
>> I have some wikis running elastic search and after some development
>> updates I find I need to rebuild the search indexes which I do as follows:
>>
>> .../CirrusSea
them with the --namespace option?!
On 31/03/17 13:02, Erik Bernhardson wrote:
> On Fri, Mar 31, 2017 at 8:25 AM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hello,
>>
>> I have some wikis running elastic search and after some development
>> updates I find
search results...
What is the proper command I should run to completely nuke and rebuild
all indexes on a wiki?
Thanks,
Aran
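For reference, one common sequence for a from-scratch rebuild is something like the following — a sketch only, assuming a current CirrusSearch checkout (script names and flags may differ between versions; the document's own snippets use forceSearchIndex.php):

```shell
# Drop and recreate the index, then repopulate it in two passes.
php extensions/CirrusSearch/maintenance/updateSearchIndexConfig.php --startOver
php extensions/CirrusSearch/maintenance/forceSearchIndex.php --skipLinks --indexOnSkip
php extensions/CirrusSearch/maintenance/forceSearchIndex.php --skipParse
```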
This has finally been solved!
The problem was a parser-function that relied on the $wgTitle global
which was not available in the context of a command-line index update.
Now it's using $parser->getTitle() instead of relying on a global and
all results are present :-)
Thanks for your help,
Aran
wrote:
> On Thu, Jan 26, 2017 at 11:30 AM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hello,
>>
>> I'm managing some mediawiki 1.27.1's running CirrusSearch 0.2 with
>> Elasticsearch 1.7.5. I've been noticing that there are often search results
>> missing so
are:
forceSearchIndex.php --skipLinks --indexOnSkip
forceSearchIndex.php --skipParse
Is this the correct way to do a full index rebuild? Is there some
parameter that can ensure that no pages get missed?
Thanks,
Aran
You need to add the "new" class to a link for it to be a red link, but
there's no simple way to do this in standard wikitext markup. You'd have
to do something like enable raw HTML, add an extension that allows adding
attributes to link syntax, or maybe add a CSS rule so you could, for
example, put
the login form.
What is the proper way I should be making the login page determine login
without a login form?
Thanks,
Aran
Hi Guys,
I'm using CirrusSearch on mw1.27 and wondering if there's a way to add
custom parameters to the Elastica back-end such as for example,
"sort": ["namespace_text": "asc"]
Thanks,
Aran
>
> This example may be more complicated than what you need, but it's a
> starting point from which you can simplify. Here is a bit more
>
> -- Krinkle
>
>
>
>
> On Sat, Oct 22, 2016 at 7:12 PM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hi,
>
Can someone please tell me a way to override a system message such as
'nosuchuser' or 'wrongpassword' from PHP?
Oh sorry I mis-read your message, you mean the article not the file..
but still it would be preferable if the functionality could be all
packaged into the extension without needing to apply content as well.
On 22/10/16 16:15, James Hare wrote:
> On Saturday, October 22, 2016 at 2:12 PM, Aran wrote:
>> Hi,
>>
>> I have an extension that changes some of the default system messages, for
>> example "talk" to "comment", but since upgrading to 1.27 these messages
>> no longer change. I've tried re
these default messages any more. New messages that the extension
introduces can be changed no problem.
Does anyone here know the proper procedure for doing this?
Thanks,
Aran
I developed a MediaWikiLite system many years ago which worked
reasonably well. It was for having a wiki on a memory stick that
included the content and ability to edit it in the field without net
access. It ran on SQLite and used Nanoweb, a web server written in PHP, to
reduce dependencies further.
Ah yes that was the problem, the run rate was set to zero! Thanks a lot :-)
On 07/08/16 16:03, John wrote:
> Check your job queue?
>
> On Sun, Aug 7, 2016 at 2:43 PM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hi,
>>
>> I've installed the current CirrusS
$wgDisableSearchUpdate=true in LocalSettings.php and have
tried setting it to false too.
Any ideas what might be the trouble?
Thanks,
Aran
preview.
On 23/06/15 12:14, Ori Livneh wrote:
On Tue, Jun 23, 2015 at 7:23 AM, Aran a...@organicdesign.co.nz wrote:
Also for those that prefer the client to do the work I recently made an
extension to use the highlight.js module at highlightjs.org.
https://www.mediawiki.org/wiki
Also for those that prefer the client to do the work I recently made an
extension to use the highlight.js module at highlightjs.org.
https://www.mediawiki.org/wiki/Extension:HighlightJS
On 22/06/15 21:48, Ori Livneh wrote:
Hello,
Over the course of the next two days, a major update to the
I just noticed that Article::getContent is deprecated now, and the code
says that WikiPage::getContent is now the preferred method. What's the
recommended way to get the current text content of a normal wikitext
article now? Would it be this?
$text =
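Presumably something along these lines, assuming a 1.21+ content-handler setup; a sketch only, where $title is whatever Title object you already have:

```php
// Fetch the current wikitext of a page via WikiPage::getContent(),
// then unwrap the Content object back to a plain string.
$page = WikiPage::factory( $title );
$content = $page->getContent( Revision::RAW );
$text = ContentHandler::getContentText( $content );
```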
The CategoryWatch extension was created a few years ago for this, but it
hasn't been updated in a long time, so it may need some minor fixes by now.
https://www.mediawiki.org/wiki/Extension:CategoryWatch
On 30/03/15 12:40, Kai Nissen wrote:
Hi there,
I created a Request for comments[1] and a
Hello,
I'm trying to install parsoid on Ubuntu 12. I installed nodejs from
source, but when I try and install parsoid via apt-get it fails saying
that it depends on nodejs (>= 0.8.0) even though node --version returns
v0.10.31!
Anyone have any ideas what could be wrong?
Cheers,
Aran
Yeah I tried installing from apt-get first, but it installed 0.6.x,
Ubuntu 12 is quite old now.
On 28/08/14 13:23, Gabriel Wicke wrote:
On 08/28/2014 08:46 AM, Brad Jorsch (Anomie) wrote:
On Thu, Aug 28, 2014 at 11:25 AM, Aran a...@organicdesign.co.nz wrote:
I'm trying to install parsoid
Yeah that's what it installed, so then I uninstalled and did it from
source instead.
On 28/08/14 13:40, Brad Jorsch (Anomie) wrote:
On Thu, Aug 28, 2014 at 12:23 PM, Gabriel Wicke gwi...@wikimedia.org
wrote:
What happens when you just do a 'apt-get install nodejs' ?
Presumably it installs
:
On Thu, Aug 28, 2014 at 11:25 AM, Aran a...@organicdesign.co.nz wrote:
I'm trying to install parsoid on Ubuntu 12. I installed nodejs from
source, but when I try and install parsoid via apt-get it fails saying
that it depends on nodejs (>= 0.8.0) even though node --version returns
v0.10.31
are rendering with no problem...?
I noticed that the mysqldump command was using
--default-character-set=latin1, which may be a legacy setting now?
Help much appreciated
Thanks a lot,
Aran
Problem solved thanks!
Just needed the same default charset on import :-)
On 10/07/14 16:36, Aran wrote:
Hello,
I've recently had a server disaster and had to recover my wiki from
backup dumps. But after importing all the special characters are mangled
causing many links to be broken
in Foo will show up in the results,
but none of the sub-categories of Foo.
Thanks,
Aran
Sorry I phrased that very badly, I meant:
If I write the query {{#ask:[[Category:Foo]]}} then all the normal pages
which are in Category:Foo will show up in the results, but none of the
sub-categories of Foo (i.e. category pages which are members of
Category:Foo).
On 26/03/14 15:19, Aran
Wouldn't WebSocket be the better choice for a full duplex channel?
On 14/11/13 21:12, Lee Worden wrote:
In the MW extension development I'm doing, I'm thinking of writing
some operations that use [[Comet_(programming)]] to deliver continuous
updates to the client, rather than the Ajax pattern
My command was:
action = 'options',
token = $token,
format = 'xml',
change = 'realname=Foo Bar'
Which resulted in success but did nothing.
On 10/10/13 06:22, Andre Klapper wrote:
Hi Aran,
On Wed, 2013-10-09 at 16:53 -0300, Aran wrote:
I'm trying to set a user's realname
valid
option such as password etc). But when I look in the user's preferences,
or directly in the user db table there's nothing changed.
Is there something else I need to configure? or is this a bug?
Aran
Yep that's my problem, thanks a lot :-)
On 02/07/13 07:28, Bartosz Dziewoński wrote:
This is most likely bug 45054, fixed in MediaWiki 1.21. It has a
rather simple workaround, too, see
https://bugzilla.wikimedia.org/show_bug.cgi?id=45054 .
binary with no errors.
Any ideas what may be wrong?
Thanks,
Aran
anywhere to say why it's not been able to do it.
On 02/07/13 12:32, Aran wrote:
Hi Guys,
I've just upgraded my wiki from 1.19.2 to 1.21.1 to fix the SVG
rendering problem which now is all fine, but now my Math rendering has
broken. I'm getting the following error:
Failed to parse (PNG
://www.organicdesign.co.nz/File:Nginx-logo.svg
Thanks,
Aran
at 6:30 PM, Aran Dunkley a...@organicdesign.co.nz wrote:
The file was a .doc, but I've tried changing it to docx and get the same
result. Some other .doc and .docx files that are word 2007 upload no
problem. But I don't see how it can complain when I have the MimeType
verification disabled
Hello, can someone please help me with this .doc upload problem? I've
tried everything and even setting $wgVerifyMimeType to false fails to
solve it. No matter what I do I keep getting the following error when I
upload *some* word 2007 .doc files:
The file is a corrupt or otherwise unreadable
since only specific users can upload.
p.s. this is a MW 1.19.2 on Ubuntu Server 11.10 with PHP 5.3.6
On 07/01/13 21:21, Matma Rex wrote:
On Mon, 07 Jan 2013 23:47:58 +0100, Aran Dunkley
a...@organicdesign.co.nz wrote:
Hello, can someone please help me with this .doc upload problem? I've
tried
Hi Guys,
How do I get a specific version of an extension using git?
I want to get Validator 0.4.1.4 and Maps 1.0.5, but I can't figure out
how to use git to do this...
Thanks,
Aran
Hello, does anyone here know why the parser insists on adding p tags
to HTML results returned by parser functions?
I'm trying to upgrade the TreeAndMenu extension to work with MediaWiki
1.19 and I can't get the parser to stop adding p tags throughout the
result returned by the parser-function
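For context, the usual way to suppress that post-processing is to return the HTML with flags rather than as bare text. A sketch, assuming the function is registered via setFunctionHook (the function name and HTML below are illustrative):

```php
// Parser-function callback: mark the return value as raw HTML so the
// parser skips the wikitext post-processing that inserts the <p> tags.
function renderTree( Parser $parser /* , ...$args */ ) {
    $html = '<div class="tree">...</div>';
    return array( $html, 'isHTML' => true, 'noparse' => true );
}
```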
/2012 07:56, Aran Dunkley wrote:
Hello, does anyone here know why the parser insists on adding p tags
to HTML results returned by parser functions?
I'm trying to upgrade the TreeAndMenu extension to work with MediaWiki
1.19 and I can't get the parser to stop adding p tags throughout the
result
Hi,
I'm just wondering how to clone an extension for a particular branch...
e.g. using Subversion I could do this:
svn co
http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_18/extensions/CheckUser
What's the equivalent git command to get that same version of the extension?
Thanks,
Aran
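For the record, the Git equivalent is a clone followed by a branch checkout; a sketch, where the repository URL is an assumption about where the extension migrated (adjust it to the actual remote):

```shell
# Clone the extension, then switch to the 1.18 release branch.
git clone https://gerrit.wikimedia.org/r/p/mediawiki/extensions/CheckUser.git
cd CheckUser
git checkout -b REL1_18 origin/REL1_18
```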
creates a new revision, but
the manual process does something more which places the article into the
category page properly.
Cheers,
Aran
I noticed that new MediaWiki installations have a new meta tag which is
not in the HTML specification and means that MediaWiki sites cannot be
made W3C compliant.
For example, running the validator over mediawiki.org gives the
following error (amongst a few others complaining about deprecated
I've figured it out now thanks :) the problem was that my scripts had
things running inline that needed to be deferred until after the JS
modules had loaded.
On 02/08/11 18:10, Roan Kattouw wrote:
On Mon, Aug 1, 2011 at 1:55 PM, Aran Dunkley a...@organicdesign.co.nz wrote:
Yes I'm using both
( array( 'jquery.ui' ) );
But nothing I do will actually get the script to load and become
available, can anyone tell me the new syntax to get jQuery there?
Thanks,
Aran
to get the jQuery scripts to load. The browser's
error log just says that the $, jQuery and other functions aren't defined.
On 01/08/11 23:51, Jelle Zijlstra wrote:
Are you aware that jQuery and jQuery UI are two different things?
2011/8/1 Aran Dunkley a...@organicdesign.co.nz
Hello, I'm trying
Hello,
I have a lot of deleted pages in my wiki and was wondering how to free
up the database by getting rid of them completely. I don't want to lose
the history of the pages that aren't deleted though. Is there an
extension for this?
Thanks,
Aran
Hello, I've just upgraded the Cite extension on our wikis and it's now
giving me this error on all refs: "Cite error: Ran out of custom link
labels for group . Define more in the
[[MediaWiki:cite_link_label_group-]] message." Does anyone know the
quickest way to fix this?
it no
matter what I use for export or import.
A solution would be greatly appreciated as their site has been down for
almost a week now while trying to solve this.
Thanks,
Aran
`page_random` (`page_random`),
KEY `page_len` (`page_len`)
) ENGINE=MyISAM AUTO_INCREMENT=19105 DEFAULT CHARSET=latin1
COLLATE=latin1_spanish_ci
Platonides wrote:
Aran Dunkley wrote:
Hi I'm wondering if anyone can help with this multibyte character
corruption:
http
If the Google Wave Federation Protocol is really all it's cracked up to
be and becomes a properly open XMPP extension maintained by the IETF
then it could be good to support it at the level of the back end along
with the database classes...
Magnus Manske wrote:
On Mon, Aug 3, 2009 at 4:07 PM,
I've had this trouble too, I just commented out the offending line -
that could be a problem under some specific condition, but it works fine
for me as an interim solution.
O. O. wrote:
Brad Jorsch wrote:
On Mon, Mar 16, 2009 at 04:47:18PM -0700, O. O. wrote:
I have installed
Hi I've tried upgrading my 1.11 to 1.14 and get this illegal mix of
collations error. I went through the normal upgrade procedure first but
this failed, so I then tried exporting as XML and importing into a
completely fresh 1.14 install, and still I get the error!
I've found that by setting
Hi I'm just wondering what the policy is with regards to changes to
extension code in the svn in the case where the modification is
compatible only with recent versions. Shouldn't extensions be designed
to be as backward compatible as is practical rather than focussing
exclusively on
I've looked into this a little, but still in quite a pie-in-the-sky way.
I made the SQLite db layer with the idea of it being simpler to
incorporate into a client-based MediaWikiLite app. I made some notes
at these articles:
http://www.organicdesign.co.nz/MediaWikiLite