On Feb 20, 2013 at 3:54 pm, Tim Starling wrote:
> The idea of storing a database in a large string literal could
> be made fairly efficient and user-friendly if a helper module
> were written to do the parsing and a binary search.

I have implemented the above suggestion, with some promising results.
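
A minimal sketch of what such a helper might look like, assuming a
hypothetical record format of newline-separated "key,value" lines,
pre-sorted by key (the data string, format, and function names here are
all invented for illustration, not Johnuniq's actual implementation):

-- Hypothetical data: one "key,value" record per line, sorted by key.
local data = "ampere,A\nkelvin,K\nmetre,m\nsecond,s\n"

-- Record the byte offset at which each line starts.
local offsets = { 1 }
for pos in data:gmatch( '\n()' ) do
    offsets[#offsets + 1] = pos
end
offsets[#offsets] = nil  -- drop the offset past the trailing newline

-- Binary search over the sorted records. No table of parsed entries is
-- ever built, so memory use stays close to the size of the string itself.
local function lookup( key )
    local lo, hi = 1, #offsets
    while lo <= hi do
        local mid = math.floor( ( lo + hi ) / 2 )
        local k, v = data:match( '([^,\n]*),([^\n]*)', offsets[mid] )
        if key == k then
            return v
        elseif key < k then
            hi = mid - 1
        else
            lo = mid + 1
        end
    end
end

-- lookup( 'kelvin' ) --> 'K'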

On Fri, Feb 22, 2013 at 4:41 AM, Johnuniq <wp.johnu...@gmail.com> wrote:
> On Feb 20, 2013 at 3:54 pm, Tim Starling wrote:
>> The idea of storing a database in a large string literal could
>> be made fairly efficient and user-friendly if a helper module
>> were written to do the parsing and a binary search.

On Tuesday, February 19, 2013 at 4:27 AM, Tim Starling wrote:
> On 19/02/13 21:11, MZMcBride wrote:
>> Hi.
>>
>> In the context of https://bugzilla.wikimedia.org/show_bug.cgi?id=10621,
>> the concept of using wiki pages as databases has come up. We're already
>> beginning to see this:
>>
>> * https://en.wiktionary.org/wiki/Module:languages (over 30,000 lines)
>> * https://en.wikipedia.org/wiki/Module:Convertdata

On Feb 19, 2013 at 9:11 PM, MZMcBride wrote:
> * https://en.wikipedia.org/wiki/Module:Convertdata

I'm guilty of that, and what's been worrying me is that there are
hundreds more units to add. Some guidance on using Lua as a database
would be very desirable.

Quick tests suggest that if {{convert}} ...

On 19/02/13 21:11, MZMcBride wrote:
> Hi.
>
> In the context of https://bugzilla.wikimedia.org/show_bug.cgi?id=10621,
> the concept of using wiki pages as databases has come up. We're already
> beginning to see this:
>
> * https://en.wiktionary.org/wiki/Module:languages (over 30,000 lines)
> * https://en.wikipedia.org/wiki/Module:Convertdata

So unfortunately I don't have a clear idea of what the problem is,
primarily because I don't know anything about the Parser and its inner
workings, but as far as having all the data in one page, here's something.
Maybe this is a bad idea, but how about having a PHP-array content type?
In other ...

2013/2/19 Tim Starling <tstarl...@wikimedia.org>:
> On 19/02/13 21:11, MZMcBride wrote:
>> Has any thought been given to what to do about this? Will it require
>> manually paginating the data over collections of wiki pages? Will this be
>> something to use Wikidata for?
>>
>> Ultimately, I would like it to ...
>
> In the long term, Wikidata is probably the way to go on something like this.
>
> In the short term, as far as dividing things up, note that you can
> implement on-demand loading in Lua easily enough using the __index
> metamethod:
>
> local obj = {}
> setmetatable( obj, {
>     __index = function ( t, k ) ...
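
The quoted example is cut off in the archive. A minimal completed sketch
of the same idea, assuming entries live on hypothetical per-key subpages
such as Module:Data/kg (the subpage prefix and the caching step are
illustrative assumptions, not necessarily the original code):

local obj = {}
setmetatable( obj, {
    __index = function ( t, k )
        -- Hypothetical layout: one data subpage per key, e.g. Module:Data/kg.
        local v = require( 'Module:Data/' .. k )
        rawset( t, k, v )  -- cache, so later reads bypass __index entirely
        return v
    end
} )

-- obj.kg triggers a single require() on first access and behaves as an
-- ordinary table field afterwards.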

On 19/02/13 13:56, Tyler Romeo wrote:
> So unfortunately I don't have a clear idea of what the problem is,
> primarily because I don't know anything about the Parser and its inner
> workings, but as far as having all the data in one page, here's something.
> Maybe this is a bad idea, but how about ...

I wrote:
> The performance of #invoke should be OK for modules up to
> $wgMaxArticleSize (2MB). Whether the edit interface is usable at such
> a size is another question.

The Wiktionary folk are gnashing their teeth today when they
discovered that in fact, loading a 742KB module 1200 times in a single
page does in fact take a long time, and it trips the CPU limit after
about 450 invocations. ...

> You can already use subpages to store data. Access is then O(1). The
> problem is that then you have one page per entry.

I know. What I'm suggesting is an interface where the sub-pages aggregate
up the hierarchy, meaning you can still edit the main top-level page, and
the backend will simply ...
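
In Lua terms, the existing one-page-per-entry scheme amounts to a single
require per lookup (Module:Units and the key below are invented names):

-- Hypothetical: each entry lives on its own subpage, e.g. Module:Units/kg,
-- so a lookup fetches exactly one page no matter how many entries exist.
local unitCode = 'kg'
local entry = require( 'Module:Units/' .. unitCode )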

On 02/19/2013 06:21 PM, Tim Starling wrote:
> The Wiktionary folk are gnashing their teeth today when they
> discovered that in fact, loading a 742KB module 1200 times in a single
> page does in fact take a long time, and it trips the CPU limit after
> about 450 invocations. So, sorry for raising ...

On 20/02/13 15:07, Victor Vasiliev wrote:
> On 02/19/2013 06:21 PM, Tim Starling wrote:
>> The Wiktionary folk are gnashing their teeth today when they
>> discovered that in fact, loading a 742KB module 1200 times in a single
>> page does in fact take a long time, and it trips the CPU limit after
>> about 450 invocations. ...