Oh good, good,
please let me know when it's ready :)

Oh, P.S.: is DBpedia working with the Wikimedia staff, or are the two
completely separate things?
I was wondering why Wikimedia releases dumps every month, while DBpedia
does so roughly once a year.

HaPPy VaCaTIOn (also to Ariel, who is looking into this stuff before (or
during :D ) party time :)))

see you guys & gals!


On Mon, Dec 28, 2015 at 4:05 PM, Hydriz Scholz <ad...@alphacorp.tk> wrote:

> Happy holidays!
>
> This issue has already been reported at T121348 [1], so you are not alone.
> Ariel is already looking into it.
>
> [1]: https://phabricator.wikimedia.org/T121348
>
> On 28 Dec 2015, at 18:18, Luigi Assom <luigi.as...@gmail.com> wrote:
>
> Hello Wikiteam!
>
> Just in time to say: have a good vacation and a happy 2016 :)
>
> Well, I am also here about corrupted files :)
>
> I downloaded this file three times, from different Wi-Fi networks and
> using Firefox's download manager:
>
> enwiki-20151201-pages-articles-multistream.xml.bz2
>
> and twice:
>
>  enwiki-latest-pages-articles-multistream.xml.bz2
>
>
> The MD5 checksum is correct (*-latest-* has the same checksum as
> *-20151201-*),
>
> but the file is corrupted.
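>
> In case it helps reproduce the check, here is a minimal Python 3 sketch
> (the expected hash is a placeholder for the value listed in the dump's
> md5sums file): it verifies the MD5 of the raw .bz2 file and then tries to
> stream-decompress it, so a bad download and a corrupt archive can be told
> apart.
>
>     import bz2
>     import hashlib
>
>     PATH = "enwiki-20151201-pages-articles-multistream.xml.bz2"
>     EXPECTED_MD5 = "<md5 listed for this file on dumps.wikimedia.org>"
>
>     # MD5 of the raw .bz2 file, read in chunks to keep memory use flat.
>     md5 = hashlib.md5()
>     with open(PATH, "rb") as f:
>         for chunk in iter(lambda: f.read(1 << 20), b""):
>             md5.update(chunk)
>     print("md5 matches:", md5.hexdigest() == EXPECTED_MD5)
>
>     # Decompression test: bz2.open handles the concatenated streams of
>     # the multistream dump; OSError or EOFError here means real corruption.
>     try:
>         with bz2.open(PATH, "rb") as f:
>             while f.read(1 << 20):
>                 pass
>         print("archive decompresses cleanly")
>     except (OSError, EOFError) as exc:
>         print("decompression failed:", exc)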
>
>
> I cannot use bzip2recover because the file is too large: I would have to
> recompile it, since the archive has more blocks than the limit it can
> handle... and I think it is much better to get a fixed file :D
>
>
> Could you please check if I am the only one having this issue?
>
> Dumps for other languages have worked fine for me; only en-* is problematic.
>
>
>
>
>
>
>


-- 
*Luigi Assom*

T +39 349 3033334 | +1 415 707 9684
Skype oggigigi
_______________________________________________
Xmldatadumps-l mailing list
Xmldatadumps-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l
