[Wikitech-l] Downloading templates

2009-01-13 Thread Dawson
Hello,

I am looking for a way to download just the templates; even a way to
download only the basic templates needed to make an infobox would be fine.

I have set up a MediaWiki install and I'm looking to use the infobox
template from Wikipedia. I tried to copy the template manually but then
realised there were too many transcluded pages and this would take
forever, which led me to
http://en.wikipedia.org/wiki/Wikipedia_talk:Database_download#Downloading_templates
and it appears that quite a few other people are asking the same
question. One answer that's been given is to download
pages-articles.xml.bz2, however it's 4.1 GB and includes all Wikipedia
articles, not just the templates.

Could someone advise?

Thank you, Dawson

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Downloading templates

2009-01-13 Thread Dawson
Thank you Daniel, excellent!

2009/1/13 Daniel Friesen 

> Special:Export
> There is a checkbox for exporting templates that are used in the pages
> you select.
>
> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://nadir-seen-fire.com]
> -Nadir-Point (http://nadir-point.com)
> -Wiki-Tools (http://wiki-tools.com)
> -Animepedia (http://anime.wikia.com)
> -Narutopedia (http://naruto.wikia.com)
> -Soul Eater Wiki (http://souleater.wikia.com)
>
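
Special:Export can also be driven non-interactively. Below is a minimal
sketch that only builds such a request URL; the parameter names (pages,
templates, curonly) are assumptions taken from the export form fields, so
verify them against your MediaWiki version:

```python
from urllib.parse import urlencode

# Build a Special:Export URL non-interactively. The parameter names
# (pages, templates, curonly) are assumptions drawn from the export
# form; check the Special:Export documentation for your version.
def export_url(page, base="https://en.wikipedia.org/wiki/Special:Export"):
    params = {
        "pages": page,      # newline-separated list of page titles
        "templates": 1,     # also export templates used by the pages
        "curonly": 1,       # current revision only, not the full history
    }
    return base + "?" + urlencode(params)

print(export_url("Diabetes_mellitus"))
```

The XML this URL returns can then be fed to Special:Import on the target
wiki.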


[Wikitech-l] mwdumper ERROR Duplicate entry

2009-01-15 Thread Dawson
Hello,

I have used Special:Export on en.wikipedia to export
"Diabetes_mellitus" and ticked the box "include templates" (I'm only
really after the templates).

The resulting XML file is 40.1 MB, so I decided to go with mwdumper.jar
rather than Special:Import.

I'm working on a fresh build of MediaWiki on my local system. When
running the command:

java -jar mwdumper.jar --format=sql:1.5 Wikipedia-20090113203939.xml |
mysql -u root -p wiki

it returns the following error:

1 pages (0.102/sec), 1,000 revs (102.062/sec)
ERROR 1062 (23000) at line 99: Duplicate entry '45970' for key 1
Exception in thread "main" java.io.IOException: XML document structures must start and end within the same entity.
        at org.mediawiki.importer.XmlDumpReader.readDump(Unknown Source)
        at org.mediawiki.dumper.Dumper.main(Unknown Source)
Caused by: org.xml.sax.SAXParseException: XML document structures must start and end within the same entity.
        at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
        at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
        at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
        at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
        at org.apache.xerces.impl.XMLScanner.reportFatalError(Unknown Source)
        at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.endEntity(Unknown Source)
        at org.apache.xerces.impl.XMLDocumentScannerImpl.endEntity(Unknown Source)
        at org.apache.xerces.impl.XMLEntityManager.endEntity(Unknown Source)
        at org.apache.xerces.impl.XMLEntityScanner.load(Unknown Source)
        at org.apache.xerces.impl.XMLEntityScanner.scanContent(Unknown Source)
        at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanContent(Unknown Source)
        at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source)
        at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
        at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
        at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
        at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
        at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source)
        at javax.xml.parsers.SAXParser.parse(SAXParser.java:176)
        ... 2 more

Can anyone please advise? After some googling the only advice I  
managed to find was:

"Before you start, try clearing the tables that mwdumper works in:

DELETE FROM page; DELETE FROM revision; DELETE FROM text; "

I have done this and tried again, but the same error continues.

Many thanks, Dawson


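
The duplicate key that mwdumper reports can be hunted down in the export
XML itself. A rough sketch, assuming Python and the standard library only
(the regex is approximate; a large dump is better handled with a
streaming XML parser):

```python
import re
from collections import Counter

# Collect the <id> values that open each <revision> element and report
# any that occur more than once -- candidates for the "Duplicate entry"
# error that MySQL raises during the import.
def duplicate_revision_ids(xml_text):
    ids = re.findall(r"<revision>\s*<id>(\d+)</id>", xml_text)
    return [i for i, n in Counter(ids).items() if n > 1]

# Tiny inline sample standing in for a real export file.
sample = """
<page><title>A</title>
  <revision><id>45970</id><text>x</text></revision>
</page>
<page><title>B</title>
  <revision><id>45970</id><text>y</text></revision>
</page>
"""

print(duplicate_revision_ids(sample))  # -> ['45970']
```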


Re: [Wikitech-l] mwdumper ERROR Duplicate entry

2009-01-15 Thread Dawson
I figured I would go into the XML and manually remove the offending  
duplicate page/revision, but couldn't find it.

I have gone from top to bottom of the XML file and find no template  
information, even though "include templates" was ticked.

I know it's a lot to ask, but could you take a quick look, Daniel?
http://dawson.md/Wikipedia-20090113203939.xml.zip (XML, 1.9 MB)

Basically, I'm working on a wiki project that stores information about
diseases and I just want to use Wikipedia's Template:Infobox_Disease.
I tried to download it and all the associated transcluded templates
manually, but this was just too complicated and would have taken
forever. Someone on the list suggested I use Special:Export and tick
the "include templates" box. This is where I'm now up to.

All suggestions/help welcomed.

Thank you, Dawson

On 15 Jan 2009, at 12:22, Daniel Kinzler wrote:

> Dawson schrieb:
>> Hello,
>>
>> I have used Special:Export at en.wikipedia to export
>> "Diabetes_mellitus" and ticked the box "include templates" (I'm only
>> really after the templates).
>>
>> The resulting XML file is 40.1 MB so I decided to go with mwdumper.jar
>> rather than Special:Import.
>>
>> I'm working on a fresh build of mediawiki on my local system. When
>> running the command:
>>
>> java -jar mwdumper.jar --format=sql:1.5  
>> Wikipedia-20090113203939.xml |
>> mysql -u root -p wiki
>>
>> It is returning the following error:
>>
>> 1 pages (0.102/sec), 1,000 revs (102.062/sec)
>> ERROR 1062 (23000) at line 99: Duplicate entry '45970' for key 1
>
> This happens when the XML dump contains the same page twice (or was
> it the same revision, even?). Which shouldn't happen. And if it
> happens, mwdumper shouldn't crash and burn.
>
> I don't know a good way around this, really, sorry. The question is:
> *why* does the dump include the same page twice? Is that legal in
> terms of the dump format? If yes, why can't mwdumper cope with it?
>
> -- daniel
>


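
Whether any templates actually made it into the export can be checked by
listing the page titles and filtering for the Template: prefix. A sketch
under the assumption that the dump follows the standard MediaWiki export
schema (the inline sample below is illustrative, not a real dump):

```python
import xml.etree.ElementTree as ET

# List every page title in an export file, so we can see whether any
# templates (titles beginning "Template:") are present at all.
def page_titles(xml_text):
    root = ET.fromstring(xml_text)
    titles = []
    for elem in root.iter():
        # Export dumps declare an xmlns, so match on the local tag name.
        if elem.tag.rsplit("}", 1)[-1] == "title":
            titles.append(elem.text)
    return titles

sample = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.3/">
  <page><title>Diabetes mellitus</title></page>
  <page><title>Template:Infobox Disease</title></page>
</mediawiki>"""

templates = [t for t in page_titles(sample) if t.startswith("Template:")]
print(templates)  # -> ['Template:Infobox Disease']
```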


Re: [Wikitech-l] mwdumper ERROR Duplicate entry

2009-01-15 Thread Dawson
Hello Roan,

I did try this but it only occurs once:

  <revision>
    <id>45970</id>
    <timestamp>2002-03-17T04:46:17Z</timestamp>
    <contributor>
      <username>Redmist</username>
      <id>307</id>
    </contributor>
    <minor />
    <comment>*</comment>
    <text>See [[Diabetes]].</text>
  </revision>
Feel free to check out
http://dawson.md/Wikipedia-20090113203939.xml.zip (XML, 1.9 MB)
and see my last reply.

Thanks, Dawson

On 15 Jan 2009, at 12:49, Roan Kattouw wrote:

> Dawson schreef:
>> I figured I would go into the XML and manually remove the offending
>> duplicate page/revision, but couldn't find it.
>>
>> I have gone from top to bottom of the XML file and find no template
>> information, even though "include templates" was ticked.
> How about searching for "45970", which is the duplicate ID mwdumper
> complained about?
>
> Roan Kattouw (Catrope)
>



Re: [Wikitech-l] mwdumper ERROR Duplicate entry

2009-01-15 Thread Dawson
Thinking that perhaps it's the revisions causing the problem, I have  
returned to Special:Export for "Diabetes_mellitus" and this time ticked:

  > Include only the current revision, not the full history
  > Include templates
  > Save as file

The output file is dramatically smaller at 280 KB (since it omits old
revisions), however I'm still getting a similar error:

"3 pages (127.413/sec), 33 revs (127.413/sec)

ERROR 1062 (23000) at line 31: Duplicate entry '264148315' for key 1"

Dawson





Re: [Wikitech-l] mwdumper ERROR Duplicate entry

2009-01-15 Thread Dawson
Solution:

With the file now being only 280 KB, I can use Special:Import instead of
mwdumper.jar, which works as expected:

"* All revisions were previously imported.

Import finished! "

So this was a problem with mwdumper *shrug*, oh well.

Thanks for all your help, Dawson




[Wikitech-l] Template Special:Export/Import

2009-01-15 Thread Dawson
Hello,

I have done a Special:Export of the latest revision of
http://en.wikipedia.org/w/index.php?title=Diabetes_mellitus
(including templates), and copied:

{{Infobox Disease
  | Name   = TestSMW
  | Image  =
  | Caption=
  | DiseasesDB =
  | ICD10  = {{ICD10|Group|Major|minor|LinkGroup|LinkMajor}}
  | ICD9   = 00
  | ICDO   =
  | OMIM   =
  | MedlinePlus=
  | eMedicineSubj  =
  | eMedicineTopic =
  | MeshID =
}}

Into my test page http://wiki.medicalstudentblog.co.uk/index.php/TestSMW
-- however, as you can see, it comes out all garbled. Can anyone
advise? I should now have all the templates from the export/import;
perhaps I'm missing some other extension(s)?

Thanks, Dawson



Re: [Wikitech-l] Template Special:Export/Import

2009-01-15 Thread Dawson
Thanks Roan,

All working now.

On 15 Jan 2009, at 13:49, Roan Kattouw wrote:

> You're missing the ParserFunctions extension.




Re: [Wikitech-l] Template Special:Export/Import

2009-01-15 Thread Dawson
Hi Bryan, can you link me to this "File" namespace alias? I have no clue
what it is or what it will do =)

2009/1/15 Bryan Tong Minh 

> You should also create a namespace alias File for Image.
>
> On Thu, Jan 15, 2009 at 3:00 PM, Dawson  wrote:
> > Thanks Roan,
> >
> > All working now.
> >
> > On 15 Jan 2009, at 13:49, Roan Kattouw wrote:
> >
> >> You're missing the ParserFunctions extension.
> >


[Wikitech-l] MediaWiki Slow, what to look for?

2009-01-27 Thread Dawson
Hello, I have a couple of MediaWiki installations on two different slices
at Slicehost. Both slices also run ordinary websites with no speed
problems; however, the MediaWiki installs themselves run like dogs!
http://wiki.medicalstudentblog.co.uk/ Any ideas what to look for, or ways
to optimise them? I still can't get over that they need a 100 MB ini_set
in settings just to load, due to the messages or something.

Thank you, Dawson


Re: [Wikitech-l] MediaWiki Slow, what to look for?

2009-01-27 Thread Dawson
Modified config file as follows:

$wgUseDatabaseMessages = false;
$wgUseFileCache = true;
$wgMainCacheType = "CACHE_ACCEL";

I also installed xcache and eaccelerator. The improvement in speed is huge.

2009/1/27 Aryeh Gregor

> On Tue, Jan 27, 2009 at 5:31 AM, Dawson  wrote:
> > Hello, I have a couple of mediawiki installations on two different slices
> at
> > Slicehost, both of which run websites on the same slice with no speed
> > problems, however, the mediawiki themselves run like dogs!
> > http://wiki.medicalstudentblog.co.uk/ Any ideas what to look for or ways
> to
> > optimise them? I still can't get over they need a 100mb ini_set in
> settings
> > to just load due to the messages or something.
>
> If you haven't already, you should set up an opcode cache like APC or
> XCache, and a variable cache like APC or XCache (if using one
> application server) or memcached (if using multiple application
> servers).  Those are essential for decent performance.  If you want
> really snappy views, at least for logged-out users, you should use
> Squid too, although that's probably overkill for a small site.  It
> also might be useful to install wikidiff2 and use that for diffs.
>
> Of course, none of this works if you don't have root access.  (Well,
> maybe you could get memcached working with only shell . . .)  In that
> case, I'm not sure what advice to give.
>
> MediaWiki is a big, slow package, though.  For large sites, it has
> scalability features that are almost certainly unparalleled in any
> other wiki software, but it's probably not optimized as much for quick
> loading on small-scale, cheap hardware.  It's mainly meant for
> Wikipedia.  If you want to try digging into what's taking so long, you
> can try enabling profiling:
>
> http://www.mediawiki.org/wiki/Profiling#Profiling
>
> If you find something that helps a lot, it would be helpful to mention it.
>


Re: [Wikitech-l] MediaWiki Slow, what to look for?

2009-01-28 Thread Dawson
Thank you Platonides,

It seems now I get the error: "xcache.var_size is either 0 or too small to
enable var data caching in /var/www/includes/BagOStuff.php on line 643"

Googling hasn't provided much info on how to fix this; does anyone know?

2009/1/28 Platonides 

> Dawson wrote:
> > Modified config file as follows:
> >
> > $wgUseDatabaseMessages = false;
> > $wgUseFileCache = true;
> > $wgMainCacheType = "CACHE_ACCEL";
>
> This should be $wgMainCacheType = CACHE_ACCEL; (constant) not
> $wgMainCacheType = "CACHE_ACCEL"; (string)
>
>
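
The warning quoted above says XCache's variable cache is sized at zero.
On XCache builds this is controlled by ini settings in php.ini; the
setting names come from the XCache documentation, and the values below
are illustrative rather than recommendations:

```ini
; php.ini -- XCache section (illustrative values)
xcache.size     = 64M   ; opcode cache
xcache.var_size = 16M   ; variable cache; 0 disables it and triggers the
                        ; "var_size is either 0 or too small" warning
```

Restart the web server after changing these for the new sizes to take
effect.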


[Wikitech-l] Lightweight Wiki?

2009-02-03 Thread Dawson
Can anyone recommend a really lightweight wiki? Preferably PHP, but a
flat-file one would be considered too.

Thanks


Re: [Wikitech-l] Lightweight Wiki?

2009-02-03 Thread Dawson
dokuwiki looks good



Re: [Wikitech-l] Lightweight Wiki?

2009-02-03 Thread Dawson
Thanks Daniel

2009/2/3 Daniel Kinzler 

> Dawson schrieb:
> > Can anyone recommend a really lightweight wiki? Preferably PHP, but a
> > flat-file one would be considered too.
>
> http://en.wikipedia.org/wiki/Comparison_of_wiki_software
>
> http://www.wikimatrix.org/
>
> http://freewiki.info/
>
> -- daniel