yep, same here!

Also, another question about the consistency of _IDs over time.
I was working with an old version of a Wikipedia dump, testing some data
models I built on the dump, using a few topics as pivots.
I might have corrupted data on my side, but just to be sure:
are article _IDs *persistent* over time, or are they subject to change?

Could it happen that, due to a rollback or merge in an article's history,
the ID changes?
E.g., as a test, the article "Mars" would first point to a version with
_ID = "4285430" and then change to "14640471".
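
For reference, a minimal sketch of the kind of check I mean (assuming
Python 3 with the requests package; the helper name and the choice of the
English Wikipedia API endpoint are just for illustration) - it asks the
live MediaWiki API for the current page ID and the latest revision ID of
an article, so the _ID from the dump can be compared against both:

import requests

API = "https://en.wikipedia.org/w/api.php"

def page_and_revision_id(title):
    """Return (page_id, latest_revision_id) for a title via the MediaWiki API."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "info|revisions",
        "rvprop": "ids",
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page["pageid"], page["revisions"][0]["revid"]

page_id, rev_id = page_and_revision_id("Mars")
print("page id:", page_id)            # identifier of the article itself
print("latest revision id:", rev_id)  # changes with every edit
# If the dump's _ID matches the revision ID rather than the page ID, the
# data model is keyed on revision IDs, which do change over time.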

I need to ensure _IDs will persist.
thank you!


*P.S. Sorry for cross-posting - I replied from the wrong email account.
Could you please delete the other message and keep only this email
address? Thank you!*

On Mon, Jan 11, 2016 at 11:05 AM, XDiscovery Team <i...@xdiscovery.com>
wrote:

> On Mon, Jan 11, 2016 at 6:22 AM, Tilman Bayer <tba...@wikimedia.org>
> wrote:
>
>> On Sun, Jan 10, 2016 at 4:05 PM, Bernardo Sulzbach <
>> mafagafogiga...@gmail.com> wrote:
>>
>>> On Sun, Jan 10, 2016 at 9:55 PM, Neil Harris <n...@tonal.clara.co.uk>
>>> wrote:
>>> > Hello! I've noticed that no enwiki dump seems to have been generated
>>> so far
>>> > this month. Is this by design, or has there been some sort of dump
>>> failure?
>>> > Does anyone know when the next enwiki dump might happen?
>>> >
>>>
>>> I would also be interested.
>>>
>>> --
>>> Bernardo Sulzbach
>>>
>>
>> CCing the Xmldatadumps mailing list
>> <https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l>, where
>> someone has already posted
>> <https://lists.wikimedia.org/pipermail/xmldatadumps-l/2016-January/001214.html>
>>  about
>> what might be the same issue.
>>
>> --
>> Tilman Bayer
>> Senior Analyst
>> Wikimedia Foundation
>> IRC (Freenode): HaeB
>>
>>
>
>
> --
> *Luigi Assom*
> Founder & CEO @ XDiscovery - Crazy on Human Knowledge
> *Corporate*
> www.xdiscovery.com
> *Mobile App for knowledge Discovery*
> APP STORE <http://tiny.cc/LearnDiscoveryApp>  | PR
> <http://tiny.cc/app_Mindmap_Wikipedia>  | WEB
> <http://www.learndiscovery.com/>
>
> T +39 349 3033334 | +1 415 707 9684
>



-- 
*Luigi Assom*

T +39 349 3033334 | +1 415 707 9684
Skype oggigigi
_______________________________________________
Xmldatadumps-l mailing list
Xmldatadumps-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l
