Please STOP spamming
On Nov 21, 2013, at 11:06 AM, gabjgervais gabjgerv...@gmail.com wrote:
gabriel gervais (@gabou1696) https://twitter.com/gabou1696
https://twitter.com/gabou1696/status/403014599944847361
I AM SENDING YOU THIS FREE AFFILIATE LINK. ALL YOU HAVE TO DO IS
No, no, no... It is dynamically generated, but you're not supposed to
concat strings.
From PHP you use the Title class:
$title = Title::newFromText( 'Foo' );
if ( $title ) { // Invalid titles may be null
$title->getLocalURL();
}
And from JS make sure that the 'mediawiki.Title' module is loaded and use mw.Title, e.g. new mw.Title( 'Foo' ).getUrl().
2013/11/21 Bill Traynor btray...@gmail.com
On Thu, Nov 21, 2013 at 1:01 PM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:
No, no, no... It is dynamically generated, but you're not supposed to
concat strings.
All,
Figured it out myself. I just used concat to prepend the wiki's URI to
each page name.
On Thu, Nov 21, 2013 at 9:20 AM, Bill Traynor btray...@gmail.com wrote:
I've written an SQL query to extract some data from our wikidb as follows:
select
page_title,
u.user_name,
u.user_email,
Dear all,
today I realized that the objectcache table has 457,217 entries, 1.1 GiB in size.
But I have caching disabled!
My text table has 47,356 entries at 354.7 MiB.
All the entries in objectcache have the value "2038-01-19 03:14:07" in the "exptime" column.
In LocalSettings.php I have these
You can always safely truncate that table. (No idea what exactly might be
causing this behavior.)
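Concretely, that advice amounts to a single statement run against the wiki database (this assumes the default empty $wgDBprefix; if your wiki sets a prefix, add it to the table name):

```sql
-- Empties MediaWiki's object cache table. This is safe:
-- MediaWiki simply repopulates the cache as needed.
TRUNCATE TABLE objectcache;
```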
--
Matma Rex
___
MediaWiki-l mailing list
MediaWiki-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
On Thu, Nov 21, 2013 at 1:01 PM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:
No, no, no... It is dynamically generated, but you're not supposed to
concat strings.
All I'm after is a dump of some data from the wiki database that can be
imported to a spreadsheet. In that data, page name
I've written an SQL query to extract some data from our wikidb as follows:
select
page_title,
u.user_name,
u.user_email,
rev_timestamp,
c.cl_to
from page p
join revision r
on p.page_id = r.rev_page
join user u
on r.rev_user = u.user_id
-- the c alias is never defined above; joining categorylinks
-- (cl_from holds the page_id) is presumably what was intended
join categorylinks c
on p.page_id = c.cl_from;
On 11/21/2013 05:36 AM, Yan Seiner wrote:
I am using the collections extension
http://www.mediawiki.org/wiki/Extension:Collection to create PDFs of
selections of our wiki.
Is anyone aware of a way to create these PDFs offline using a cron job?
I'd like to run a midnight snapshot of the