[Xmldatadumps-l] Re: Is there a problem with the wikidata-dump?

2024-01-13 Thread Platonides
vironment, but wget is not > available any more in kubernetes (with toolforge jobs start …) :-( > > $ webservice php7.4 shell > tools.persondata@shell-1705135256:~$ wget > bash: wget: command not found > > > Wolfgang > > > Am Sa., 13. Jan. 2024 um 02:20 Uhr s

[Xmldatadumps-l] Re: Is there a problem with the wikidata-dump?

2024-01-12 Thread Platonides
Gerhard said that for him the downloading job ran for about 12 hours. It seems the connection was closed. I wouldn't be surprised if this was facing a similar problem as https://phabricator.wikimedia.org/T351876 With such long download time, it isn't that strange that there could be connection err

[Xmldatadumps-l] Re: Need Wikipedia data dump

2023-11-21 Thread Platonides
Hi You can retrieve those from https://dumps.wikimedia.org/ Regards On Mon, 20 Nov 2023 at 06:49, venkat ekkuluri wrote: > Hi team, > > I need XML data dumps for the Wikipedia website for analysis. Can you > please give me access to that. > > > Thanks & Regards > Narayana Ekkuluri

[Xmldatadumps-l] Re: Data about category views or uses

2023-10-17 Thread Platonides
s. In fact, my real > interest is in data about clickstream from category pages to article pages. > But I fear this kind of clickstream data is not provided. Is it? > > Regards. > > Miquel Centelles > > On Sat, 7 Oct 2023 at 18:33, Platonides wrote: > >> Hi Miquel >&g

[Xmldatadumps-l] Re: Data about category views or uses

2023-10-07 Thread Platonides
Hi Miquel You can use pageviews to view the data for the actual category pages. For example: https://pageviews.wmcloud.org/pageviews/?project=es.wikipedia.org&platform=all-access&agent=user&redirects=0&range=latest-20&pages=Categor%C3%ADa:Ciencias_naturales|Categor%C3%ADa:Historia Does this work
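The pageviews link above percent-encodes the category titles and separates multiple pages with `|`. A minimal sketch of building such a URL in Python (the host and query parameters are taken from the link above; the helper name is illustrative):

```python
from urllib.parse import quote

def pageviews_url(project, pages):
    """Build a pageviews.wmcloud.org URL for one or more page titles.

    Titles are percent-encoded individually and joined with '|',
    matching the 'pages' parameter in the link above.
    """
    encoded = "|".join(quote(p, safe=":_") for p in pages)
    return (
        "https://pageviews.wmcloud.org/pageviews/"
        f"?project={project}&platform=all-access&agent=user"
        f"&redirects=0&range=latest-20&pages={encoded}"
    )

url = pageviews_url("es.wikipedia.org",
                    ["Categoría:Ciencias_naturales", "Categoría:Historia"])
print(url)
```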

[Xmldatadumps-l] Re: Inconsistency of Wikipedia dump exports with content licenses

2023-07-25 Thread Platonides
On Tue, 25 Jul 2023 at 15:14, Dušan Kreheľ wrote: > Hello, Wikipedia export is not right licensed. Could this be brought > into compliance with the licenses? The wording of the violation is: > https://krehel.sk/Oprava_poruseni_licencei_CC_BY-SA_a_GFDL/ (Slovak). > > Dušan Kreheľ Hello Dušan I

[Xmldatadumps-l] Re: Which dump to use?

2023-03-02 Thread Platonides
You would need to download them separately. But, are you sure you need them? Perhaps kiwix would fit your needs? Or a local MediaWiki install configured to autofetch images? What is your goal? Regards On Fri, 3 Mar 2023 at 02:08, Research Pro wrote: > Hi - we need a complete copy of EN wikiped

[Xmldatadumps-l] Re: Custom dump

2023-02-03 Thread Platonides
On Fri, 3 Feb 2023 at 15:36, someone wrote: > Why is it showing my address? I set it to be hidden. > You were emailing me directly, but you now switched to sending it to the mailing list (so it gets to everyone subscribed). The name showed in the email is the one your mail client set. That's comp

[Xmldatadumps-l] Re: dataset for graph

2023-01-24 Thread Platonides
Hello Kiran Do you realise we have no idea what your research is about, what kind of dataset you might need, why https://dumps.wikimedia.org/ wouldn't work for you, etc.? Moreover, your message comes off quite rude, and it looks as if you didn't do even the most basic *research*...

[Xmldatadumps-l] Re: Custom dump

2023-01-24 Thread Platonides
I did such a script a long time ago. I should start by asking, how much disk space do you have for this? Local images are cheap:
+--------+---------+
| number | size    |
+--------+---------+
|     22 | 2108819 |
+--------+---------+
eswiktionary
+--------+------+
| number | size |
+--------+------+

[Xmldatadumps-l] Re: Access imageinfo data in a dump

2022-02-07 Thread Platonides
The metadata used to be included in the image table, but it was changed 6 months ago out to External Storage. See https://phabricator.wikimedia.org/T275268#7178983 On Fri, 4 Feb 2022 at 20:44, Mitar wrote: > Hi! > > Will do. Thanks. > > After going through the image table dump, it seems not all

[Xmldatadumps-l] Re: still only a partial dump for 20210801 for a lot of wikis

2021-09-01 Thread Platonides
On Wed, 1 Sept 2021 at 12:34, Fernando Ducha wrote: > Can someone help me and tell why can't i download the dump? it say's i > don't have permission who's able to give this permission? its this related > to @Ariel Glenn WMF response? > Which dump and file are you trying to download? I have tried

Re: [Xmldatadumps-l] Collecting data on page revisions over time

2017-08-31 Thread Platonides
On Thu, Aug 31, 2017 at 7:50 PM, Jérémie Roquet wrote: > Hi Platonides, > > 2017-08-31 19:40 GMT+02:00 Platonides : > > On Thu, Aug 31, 2017 at 3:10 PM, Jérémie Roquet > > wrote: > >> > >> PS : what could be incredibly useful to dive into articles histor

Re: [Xmldatadumps-l] Downloading images in the Wikipedia Dump

2016-05-16 Thread Platonides
On 16/05/16 04:11, Praveen Balaji wrote: Hello! Is there a way to tell whether an image is on the "commons" wiki or on "enwiki". Some images seem to be available on "wikipedia/commons", like so: https://upload.wikimedia.org/wikipedia/commons/a/a9/Example.jpg and some others on "wikipedia/en" li
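Whether a file lives under wikipedia/commons or wikipedia/en depends on which wiki it was uploaded to, which you have to look up per file. The two-level directory in the URL, though, is derived from the MD5 of the file name, which is why Example.jpg sits under a/a9/. A sketch of the standard MediaWiki hashing scheme (the helper name is illustrative; verify against your wiki's config):

```python
import hashlib

def upload_path(filename):
    """Return the hashed subdirectory path MediaWiki uses for an
    uploaded file: first hex digit of the MD5, then the first two,
    then the name. Spaces in titles are stored as underscores."""
    name = filename.replace(" ", "_")
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    return f"{digest[0]}/{digest[:2]}/{name}"

print(upload_path("Example.jpg"))  # matches the a/a9/Example.jpg URL above
```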

Re: [Xmldatadumps-l] Renewal of question for a dump of all .svg files in commons.

2016-04-03 Thread Platonides
D. Hansen wrote: Hi Platonides, I succeeded in downloading the file. (After some crashes.) As I'm sure you will need the hd-space, I want to use the checksum you provided, so I could tell you that the data actually arrived. As I'm fed up with using Windows for large files, I'l
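Verifying a multi-gigabyte download against a published checksum is cheap if done in chunks rather than reading the whole file into memory. A generic sketch, not specific to any particular dump file:

```python
import hashlib

def file_checksum(path, algo="md5", chunk_size=1 << 20):
    """Hash a file incrementally so arbitrarily large downloads
    can be verified without loading them into memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Compare the result against the checksum published alongside the file.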

Re: [Xmldatadumps-l] Renewal of question for a dump of all .svg files in commons.

2016-03-19 Thread Platonides
D. Hansen wrote: Hello Platonides! Thank you for being do kind to make the list. Originally I wanted to download the svg-files whenever my program is using the file, one after another. Turned out, that's just not feasible. Could I get a dump of all svg files, please? Not just the list o

Re: [Xmldatadumps-l] Renewal of question for a dump of all .svg files in commons.

2016-03-02 Thread Platonides
On 02/03/16 22:05, D. Hansen wrote: Hi! Last year I was provided very kindly with a list of all svg-files on commons, that is their then *real* http(s)-paths. (Either by John phoenixoverr...@gmail.com or by Ariel T. Glenn agl...@wikimedia.org agl...@wikimedia.org.) Could I get a current version

Re: [Xmldatadumps-l] bz2 tools

2016-01-17 Thread Platonides
On 18/01/16 00:36, Bernardo Sulzbach wrote: Platonides, we were both talking about 20151201 and you tested 20160113, am I correct? I said I reproduced the mentioned problem with that file, not that all files were problematic. Whoops. Sorry :( Seems I didn't notice and just click

Re: [Xmldatadumps-l] bz2 tools

2016-01-17 Thread Platonides
On 16/01/16 02:44, Platonides wrote: On 16/01/16 02:30, Richard Farmbrough wrote: I have problems bunzip2ing pages-articles files. WinRAR fails at 37G, and bunzip2 fails at some point >> 14g though it "helpfully" cleans up after itself. Bunzip2 v 1.0.6 >bunzip2 en

Re: [Xmldatadumps-l] bz2 tools

2016-01-15 Thread Platonides
On 16/01/16 02:30, Richard Farmbrough wrote: I have problems bunzip2ing pages-articles files. WinRAR fails at 37G, and bunzip2 fails at some point >> 14g though it "helpfully" cleans up after itself. Bunzip2 v 1.0.6 >bunzip2 enwiki-20151201-pages-articles.xml.bz2 bunzip2: I/O or other error,
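Tools that fail on very large .bz2 files can often be worked around by streaming the decompression instead of expanding the whole archive to disk. Python's bz2 module handles multi-stream files and reads incrementally; a sketch (the dump file name in the comment is just the one from the message above):

```python
import bz2

def stream_bz2(path, handle_chunk, chunk_size=1 << 20):
    """Decompress a (possibly multi-stream) .bz2 file incrementally,
    passing each decompressed chunk to a callback instead of
    materialising the whole output on disk or in memory."""
    with bz2.open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            handle_chunk(chunk)

# e.g. stream_bz2("enwiki-20151201-pages-articles.xml.bz2", parser.feed)
```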

Re: [Xmldatadumps-l] Use of dumps in mediawiki

2015-09-07 Thread Platonides
Federico Leva (Nemo) wrote: Yoni Lamri, 07/09/2015 16:50: Our company is creating a partnerchip with the wikipedia fondation and we can not use kiwix which is not comming from this fondation. The Wikipedia Foundation doesn't exist and Kiwix is an official Wikimedia tool. Nemo Plus I find h

Re: [Xmldatadumps-l] Getting display: mw:Help:Magic words#Other on the page

2014-10-13 Thread Platonides
On 13/10/14 21:36, Arquillos, Diana wrote: Hi, Since, the dump from August we updated the content and since then, our pages doesn't render properly. Seems that it doesn't render properly the modules/templates contains in double bracelets {{}} but not always. Soft redirects and some Infobox t

Re: [Xmldatadumps-l] relationship between logging and page_restrictions

2013-09-16 Thread Platonides
On 13/09/13 21:56, Xavier Vinyals Mirabent wrote: Thanks, Platonides! Do you know when did this started being applied? A quick check suggests September 2008: [enwiki_p]> select log_type, log_action, log_timestamp from logging_logindex where log_type='protect' and log_action =

Re: [Xmldatadumps-l] relationship between logging and page_restrictions

2013-09-12 Thread Platonides
On 12/09/13 20:01, Xavier Vinyals Mirabent wrote: Hi, Are the values in the columns pr_id and log_id equivalent? I'm trying to select all changes in editing protection status for Wikipedia articles but the table Page_restrictions doesn't contain a time stamp, and the table logging doesn't spe

Re: [Xmldatadumps-l] First preview version of incremental dumps

2013-09-05 Thread Platonides
On 29/08/13 16:07, Petr Onderka wrote: - The flags are a bit convoluted. Sometimes a flag is used for a feature being present, sometimes for a feature being absent, it can be mingled with options. What specifically do you mean? I think the only place where I didn't go with what was most logical

Re: [Xmldatadumps-l] First preview version of incremental dumps

2013-08-28 Thread Platonides
Petr Onderka wrote: The XML output is almost the same as existing XML dumps, but there are some differences [2]. The current state of the new format also now has a detailed specification [3] (this describes the current version, the format is still in flux and can change daily). I didn't partici

Re: [Xmldatadumps-l] rev_len and page_len

2013-08-28 Thread Platonides
On 22/08/13 18:29, Petr Onderka wrote: How did you load the dumps? From my experience, mwdumper does fill page_len, but not rev_len. If you want to fill rev_len, you can run the maintenance script maintenance/populateRevisionLength.php on your wiki. It seems there isn't a similar script for pag
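populateRevisionLength.php fills rev_len from the stored text; the value is simply the byte length of each revision's text. If you are loading dumps yourself, you can compute it while parsing the XML. A sketch (the namespace URI follows one version of the export schema and the helper is illustrative; real dumps should be streamed from a file rather than held in memory):

```python
import xml.etree.ElementTree as ET
from io import BytesIO

NS = "{http://www.mediawiki.org/xml/export-0.10/}"

def revision_lengths(source):
    """Yield the byte length of each <text> element in a pages
    dump, i.e. the value rev_len should hold for that revision."""
    for _, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == NS + "text":
            yield len((elem.text or "").encode("utf-8"))
            elem.clear()  # free memory as we go

sample = (b'<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">'
          b'<page><revision><text>abc</text></revision></page></mediawiki>')
print(list(revision_lengths(BytesIO(sample))))  # → [3]
```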

Re: [Xmldatadumps-l] wikidatawiki -- toooo many edits

2013-04-23 Thread Platonides
On 23/04/13 18:31, Ariel T. Glenn wrote: > The long version is that the pages-logging file is already about half > the size of en wp's table, and that the number of edits per minute is > much larger, see: > https://wikipulse.herokuapp.com/ > There's a lot of deletion and a lot of churn too due to t

Re: [Xmldatadumps-l] Finding images within dumps

2013-04-15 Thread Platonides
On 05/04/13 23:21, Keith Schacht wrote: > Hi, I've downloaded the latest set of wikimedia dumps. I'm trying to > understand where to find images within these dumps. I've studied the > database schema and it seems to make sense, but then I take a single > example such as: > > http://en.wikipedia.or

Re: [Xmldatadumps-l] Is Chinese Variants dump available

2013-04-15 Thread Platonides
On 22/03/13 01:16, Jiang BIAN wrote: > Thanks for detailed instructions. A few minor things still not clear to > me, inline: > > > On Thu, Mar 21, 2013 at 1:02 PM, Richard Farmbrough > mailto:rich...@farmbrough.co.uk>> wrote: > > The only fifth exception is Wikidata: > > So you need >

Re: [Xmldatadumps-l] I need Database tables Mapping to DB Dumps

2013-04-15 Thread Platonides
On 15/04/13 15:03, Imran Latif wrote: > And download all sql and XML files and populates my table using some utility, > then the whole Wikipedia data is configured ? > I mean to say that this dump provide me whole data of Wikipedia, including > content, revision history etc. Or i need something m

Re: [Xmldatadumps-l] Is Chinese Variants dump available

2013-03-21 Thread Platonides
On 21/03/13 21:02, Richard Farmbrough wrote: > The only fifth exception is Wikidata: > > So you need > 1. Mediawiki + the same extensions zh:Wikipedia uses > 2. The dumps (including cats and templates) > 3. Instacommons (hadn't heard of this before, sounds cool) > 4. The config setup of zh:Wikipe

Re: [Xmldatadumps-l] Is Chinese Variants dump available

2013-03-21 Thread Platonides
On 21/03/13 02:50, Jiang BIAN wrote: > If we render the content using MediaWiki softare and the extension, > will the content be same as Chinese Wikipedia? Yes... > I know some content also rely on the the site config/settings, > e.g. parsing the interwiki links like [[:File:Mediawiki.png]], >

Re: [Xmldatadumps-l] Is Chinese Variants dump available

2013-03-20 Thread Platonides
On 20/03/13 15:43, Jiang BIAN wrote: > Hi, > > Chinese Wikipedia supports a few variants, zh-cn, zh-tw, zh-hk, same > wikitext is rendered differently under these variants. e.g. "software" > in zh-cn [1] and "software" in zh-tw [2]. > But seems no HTML are included in dump file zhwiki. That's ri

Re: [Xmldatadumps-l] Embedded malware in media

2013-03-11 Thread Platonides
On 11/03/13 20:22, Kevin Day wrote: >> For the longer-term issue, the WMF is not currently scanning upload >> with a virus scanner, because of the performance and false positive >> rates. It would be great if we could get a bot to scan and flag files, >> so we can shorten the time to removing them.

Re: [Xmldatadumps-l] enwiki sql and tab delim files

2013-02-25 Thread Platonides
On 25/02/13 18:06, Ariel T. Glenn wrote: > I have the files from the February run for en wikipedia converted here: > > http://dumps.wikimedia.org/other/experimental/ > > In the sqlfiles directory are the page, revision and text tables in sql > format for MediaWiki 1.20, and in the tabfiles direct

Re: [Xmldatadumps-l] Weird page titles in page table

2013-02-11 Thread Platonides
Sorry, forgot to CC the list On 10/02/13 23:36, Platonides wrote: > On 10/02/13 23:08, Robert Crowe wrote: >> I'm seeing rows in the page table that have weird titles, and I'd like to be >> able to identify and filter them out, but I don't see properties that

Re: [Xmldatadumps-l] Weird page titles in page table

2013-02-11 Thread Platonides
On 11/02/13 00:58, Robert Crowe wrote: > Weird. Why is it that only some of the titles display as hex? I'm using > phpMyAdmin, and the column is varbinary(255). Maybe it's only doing so when it contains non-ascii chars? (in the case of “Egypt–Morocco_relations”, that would be the en-dash)
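Columns like page_title are varbinary, so clients such as phpMyAdmin fall back to showing hex once a value contains non-ASCII bytes (the en-dash in “Egypt–Morocco_relations”, for instance). The bytes are valid UTF-8 and decode cleanly; a sketch:

```python
def title_from_hex(hex_value):
    """Decode a varbinary page_title displayed as hex back to the
    UTF-8 string MediaWiki stored."""
    return bytes.fromhex(hex_value).decode("utf-8")

# round-trip the example title from the thread
hex_title = "Egypt–Morocco_relations".encode("utf-8").hex()
print(title_from_hex(hex_title))  # → Egypt–Morocco_relations
```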

Re: [Xmldatadumps-l] Question about the meaning of " ''' " mark in dumps

2013-01-28 Thread Platonides
On 28/01/13 09:20, Hydriz Wikipedia wrote: > Hi Chong Wang, > > That is the wiki markup for bolding text within the " ''' " that you see > inside the dumps. In the rendered page, it would show as bold. > > Hope this helps! In summary, you are viewing the content of http://en.wikipedia.org/w/inde
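As the reply says, ''' is wiki markup for bold text. A toy conversion to HTML shows the idea (this only handles the simple balanced case; MediaWiki's real parser does far more):

```python
import re

def bold_to_html(wikitext):
    """Replace balanced '''…''' spans with <b>…</b>.
    A toy sketch, not a full wikitext parser."""
    return re.sub(r"'''(.+?)'''", r"<b>\1</b>", wikitext)

print(bold_to_html("'''Anarchism''' is a political philosophy"))
# → <b>Anarchism</b> is a political philosophy
```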

Re: [Xmldatadumps-l] [WP-MIRROR] Questions regarding Metalink and SPDY

2013-01-02 Thread Platonides
On 03/01/13 00:02, wp mirror wrote: > Dear Platonides, > > Happy New Year. Thank you for your email. > > Here is a sketch of the issues with image files: > > 1) IMAGE FILE NAME > > Issue: The enwiki has many image files which names contain control > characters (

Re: [Xmldatadumps-l] [WP-MIRROR] Questions regarding Metalink and SPDY

2013-01-02 Thread Platonides
On 02/01/13 17:30, wp mirror wrote: > 2) METALINK > > WP-MIRROR 0.5 and prior version, had to deal with thousands of corrupt > image files. Most of these were partial downloads. cURL would > time-out and leave corrupt files. I currently deal with that by > validating the images. Validation, ho

Re: [Xmldatadumps-l] Issues importing Wikipedia XML dumps

2012-11-20 Thread Platonides
On 17/11/12 02:05, Christoph Sackl wrote: > When trying to import an XML dump (Nov 2011 dump) with importDump.php in > the maintenance folder of the MediaWiki installation, I get the > following error after about 2 seconds: > > “WikiRevision given a null title in import. You may need to adjust > $

Re: [Xmldatadumps-l] Format

2012-11-10 Thread Platonides
On 10/11/12 00:17, Federico Leva (Nemo) wrote: > Platonides, this reminds me: have you/we ever documented > https://gerrit.wikimedia.org/r/#/c/6717/ somewhere? I don't think so. Do we really need to document that "Import on 1.16 no longer breaks with 1.18 export format"?

Re: [Xmldatadumps-l] Format

2012-11-09 Thread Platonides
On 09/11/12 20:21, John wrote: > I am looking to create a script for creating manual dumps for those > wikis that either dont or wont publish their own dumps and that I dont > have server access to. To that end I am writing a python dump creator, > however I would like to ensure that my format is t

Re: [Xmldatadumps-l] Malware reported in mirror

2012-07-11 Thread Platonides
On 11/07/12 23:50, Kevin Day wrote: > > My final list of possibly naughty things uploaded. I know some of these are > pretty harmless (html being appended to jpegs), and most are just encrypted > RARs appended to images or encrypted PDF files. I don't know if there's a > policy on barring encry

Re: [Xmldatadumps-l] Malware reported in mirror

2012-07-03 Thread Platonides
On 03/07/12 18:47, Kevin Day wrote: > Even temporarily forgetting about the complexity of scanning PDFs, there's a > lot of weirdness in a lot of files that even ClamAV doesn't find. For > example: (replacing < and > with [ and ] so this doesn't trigger anyone's > mail spam filters) > > strings

Re: [Xmldatadumps-l] anonymous user account logs (account created / account blocked)

2012-07-02 Thread Platonides
> '20120600' and user_registration LIKE '2012%' order by rc_timestamp asc Some notes: - This will contain from June 2 to July 2. - No bytes_diff, but you have old and new byte len :) - It may have some non-edit log entries. File lives at http://toolserver.org/~pl

Re: [Xmldatadumps-l] Malware reported in mirror

2012-07-02 Thread Platonides
> On Jul 1, 2012, at 10:13 PM, Hydriz Wikipedia wrote: > >> As far as I know, the chances are rather slim, because the MediaWiki >> software has a malware checker (I think). >> >> Perhaps we shall see what outputs from the ClamAV checking, before we can >> know what is happening. MediaWiki supp

Re: [Xmldatadumps-l] anonymous user account logs (account created / account blocked)

2012-06-11 Thread Platonides
On 11/06/12 23:22, Gregor Martynus wrote: > Thanks again for your input, sounds like the Stub-meta-history dump is > exactly what we need. I'm already downloading it. > > I'm not sure if this is the place for such an suggestion, but it would > be great to have example versions of the real dumps, w

Re: [Xmldatadumps-l] anonymous user account logs (account created / account blocked)

2012-06-11 Thread Platonides
updated the type/action tree with the input by Platonides, feel > free to use / extend it: > https://gist.github.com/2906718 > > I was surprised that the pages-logging.xml dump does not contain events > about user contributions. My friend is searching for > > - users with fi

Re: [Xmldatadumps-l] anonymous user account logs (account created / account blocked)

2012-06-10 Thread Platonides
On 10/06/12 23:17, Gregor Martynus wrote: > Thanks a lot for the explanations, that helps a lot! > > I think one thing missing in the page-logging.xml dump is target of the > action (page column?). It could be a child node with an id and a name, > just like . Right now, I can filter events that ha

Re: [Xmldatadumps-l] anonymous user account logs (account created / account blocked)

2012-06-10 Thread Platonides
On 10/06/12 19:38, Gregor Martynus wrote: > Thanks Petr! > > Besides the XML dumps, are there also direct SQL dumps available? No. The dumps are in XML precisely to avoid publishing db-specific schemas in SQL files... and filtering the output in the process. I could make you a sql dump, but it'd

Re: [Xmldatadumps-l] anonymous user account logs (account created / account blocked)

2012-06-09 Thread Platonides
On 09/06/12 17:23, Gregor Martynus wrote: > Hi, > > for a dissertation study, I try to find a reliable datasource from where > I can extract user account events, specifically creation and blocking of > user accounts, with usernames, the event name and timestamps, > > Is such data available? If y

Re: [Xmldatadumps-l] Problems with frwiki dumps

2012-06-08 Thread Platonides
On 08/06/12 00:13, Felipe Ortega wrote: > > Thanks for your help, John. I really appreciate it. It looks like the new > copy is working fine, now. I will check again tomorrow morning. It should > finish overnight. > > Best, > Felipe. $ time 7z e -so frwiki-20120430-pages-meta-history.xml.7z |

Re: [Xmldatadumps-l] Problems with frwiki dumps

2012-06-07 Thread Platonides
On 06/06/12 20:22, Felipe Ortega wrote: > Hello. > > I'm finding strange issues when trying to decompress the 7z version of this > dump for the French Wikipedia: > > http://dumps.wikimedia.org/frwiki/20120430/ > > At some point around 3M revisions the 7z process stalls. After a long time > (fe

Re: [Xmldatadumps-l] rsync

2012-05-23 Thread Platonides
On 23/05/12 21:49, calbert wrote: > Is the rsync mirror supposed to improve download times? I've seen > "speedup 1.0" the last two times I've tried > rsync -tL > rsync://ftpmirror.your.org/wikimedia-dumps/enwiki/latest/enwiki-latest-pages-articles.xml.bz2 > > (and speedup 7000 when it doesn't

Re: [Xmldatadumps-l] [Wikitech-l] XML dumps/Media mirrors update

2012-05-17 Thread Platonides
On 17/05/12 14:23, Ariel T. Glenn wrote: > There's a few other reasons articles get deleted: copyright issues, > personal identifying data, etc. This makes maintaning the sort of > mirror you propose problematic, although a similar mirror is here: > http://deletionpedia.dbatley.com/w/index.php?tit

Re: [Xmldatadumps-l] uploaded media for WMF projects available via rsync

2012-04-02 Thread Platonides
On 02/04/12 21:55, Ariel T. Glenn wrote: > For "dumps" of images, we have no such thing; this rsync mirror is the > first thing out of the gate and we can't possibly generate multiple > copies of it on different dates as we do for the xml dumps. That's not too hard to do. You just copy the image t

Re: [Xmldatadumps-l] Views archive - Domas immortalising his blog

2012-03-04 Thread Platonides
On 04/03/12 15:31, Richard Farmbrough wrote: > The files from > > dumps.wikimedia.org/other/pagecounts-raw/2011/2011-10/pagecounts-20111008-180001.gz > > to > > dumps.wikimedia.org/other/pagecounts-raw/2011/2011-10/pagecounts-20111008-220001.gz > > appear to be Domas' (very informative) blog a

Re: [Xmldatadumps-l] Picture of the year

2012-02-09 Thread Platonides
ty images of wikipedia. The POTY images also make > great wallpaper packs! There's also https://toolserver.org/~platonides/catdown/catdown.php for helping you download image categories.

Re: [Xmldatadumps-l] pbzip2 proposal

2012-01-28 Thread Platonides
On 28/01/12 00:38, Richard Jelinek wrote: So to sum up: It's a no loose and two win situation if you migrate to pbzip2. And that just because pbunzip2 is slightly buggy. Isn't that interesting? :-) Note that pbzip2 files are usually larger. And with our dump sizes, a small percentage increase

Re: [Xmldatadumps-l] Import of an XML dump of the RU Wiktionary with mwdumper / problem

2012-01-23 Thread Platonides
On 11/01/12 07:13, Sébastien Druon wrote: (...) I have some questions: - in the SQL generated from mwdumper the following command does not return any result: grep "'сущ ru m ina 1c(1)" ruwiktionary.sql - The string is also not to be found in the generated DB, a

Re: [Xmldatadumps-l] List of all words of a wiktionary

2012-01-09 Thread Platonides
On 09/01/12 10:51, Jérémie Roquet wrote: > You'd still have to filter the output to only keep titles in the main > namespace. > > It should be possible using categories too, but this wouldn't be any > easier nor more reliable and would be much slower. > > Best regards, I disagree. You would only
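Filtering a title list down to the main namespace amounts to keeping rows whose namespace field is 0. A sketch over a two-column "page_namespace TAB page_title" layout, which is an assumption about the input file rather than a guarantee:

```python
def main_namespace_titles(lines):
    """Keep titles whose namespace field is 0, given lines of
    'page_namespace<TAB>page_title' (header rows are skipped
    automatically because their first field is not '0')."""
    for line in lines:
        ns, _, title = line.rstrip("\n").partition("\t")
        if ns == "0":
            yield title

sample = ["page_namespace\tpage_title",
          "0\tdictionary",
          "4\tWiktionary:Sandbox"]
print(list(main_namespace_titles(sample)))  # → ['dictionary']
```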

Re: [Xmldatadumps-l] enwiki-20111201-pages-articles.xml.bz2

2012-01-01 Thread Platonides
On 30/12/11 17:27, Richard Farmbrough wrote: > I checked the MD5, and I can successfully bunzip2 it on Linux. It's very > odd. What does "doesn't expand" mean? Assuming your bz2.exe is good, maybe it's some kind of problem writing to disk? (not enough space, filesystem doesn't allow files so big.

Re: [Xmldatadumps-l] Import of an XML dump of the RU Wiktionary with mwdumper / problem

2011-12-23 Thread Platonides
On 23/12/11 11:08, Sébastien Druon wrote: > Hello! > > I imported the ruwiktionary dump with mwdumper, but I cannot access any > word. > For each of the I get a similar page: > For example for the page http://localhost/mediawiki/index.php/Ребёнок > I get the following result: > > Ребёнок > > (