Hello. I'm trying to run the abstract extraction on the January French
Wikipedia dumps, and I noticed that some of the mapping files being
downloaded are huge. They had been downloading for several hours, so I had
to abort the process at this point:

>> 3.9G  Mapping_be.xml
   4.0K  Mapping_et.xml
>> 4.2G  Mapping_ko.xml
   472K  Mapping_pt.xml
>> 4.5G  Mapping_ru.xml
    16K  Mapping_sk.xml
   456K  Mapping_sl.xml
>> 5.6G  Mapping_sr.xml
   144K  Mapping_sv.xml

Has something changed? I ran the extraction on the English wiki earlier this
month, and the files were significantly smaller: a few MB at most.
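In case it helps anyone hitting the same problem: one way to avoid hours of
wasted downloading is to ask the server for each file's size first, via an
HTTP HEAD request, and skip anything suspiciously large. This is only a
sketch under my assumptions (the files are served over plain HTTP and the
server reports a Content-Length header; the URL below is illustrative, not
the actual dump location):

```python
import urllib.request

def remote_size(url):
    """Return the size in bytes the server reports for `url`,
    using a HEAD request so nothing is actually downloaded."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers["Content-Length"])

def human(n):
    """Format a byte count the way `ls -lh` roughly does (e.g. 4.2G)."""
    for unit in ("B", "K", "M", "G", "T"):
        if n < 1024:
            return f"{n:.1f}{unit}"
        n /= 1024
    return f"{n:.1f}P"

# Illustrative usage -- the URL is a placeholder, not the real dump server:
# size = remote_size("http://example.org/dumps/Mapping_ru.xml")
# if size > 100 * 1024**2:  # skip anything over ~100 MB
#     print(f"skipping, server reports {human(size)}")
```

Checking sizes up front would at least make it obvious that something
changed on the server side before committing to a multi-hour download.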

Any information is much appreciated.

Thanks
Praveen
_______________________________________________
DBpedia-discussion mailing list
DBpedia-discussion@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/dbpedia-discussion
