[Wikidata-bugs] [Maniphest] [Commented On] T231151: Moving from main space to user space does not reflect in the Wikidata item

2019-08-24 Thread Ashot1997
Ashot1997 added a comment.


  Here are all such items (only for hywiki). Looks like most of them are due 
to this.

TASK DETAIL
  https://phabricator.wikimedia.org/T231151

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Ashot1997
Cc: Aklapper, Ashot1997, darthmon_wmde, DannyS712, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Wikidata-bugs, 
aude, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Commented On] T231151: Moving from main space to user space does not reflect in the Wikidata item

2019-08-24 Thread Ashot1997
Ashot1997 added a comment.


  Looks like this is true at least since the beginning of this year (Quarry).

TASK DETAIL
  https://phabricator.wikimedia.org/T231151


To: Ashot1997
Cc: Aklapper, Ashot1997, darthmon_wmde, DannyS712, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Wikidata-bugs, 
aude, Mbch331


[Wikidata-bugs] [Maniphest] [Created] T231151: Moving from main space to user space does not reflect in the Wikidata item

2019-08-24 Thread Ashot1997
Ashot1997 created this task.
Ashot1997 added a project: Wikidata.
Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
  When a user moves a page from the main space to a userspace sandbox, the move 
is not reflected in the Wikidata item (the Wikidata sitelink should be removed 
automatically). Examples:
  
  - This move was done with a gadget (via the API) (check Wikidata)
  - This move was done "by hand", but the result is the same (check Wikidata)
  
  As a result, many sitelinks in Wikidata point to pages that no longer exist.
  
  P.S. Is there an easy way to find all such items?

TASK DETAIL
  https://phabricator.wikimedia.org/T231151


To: Ashot1997
Cc: Aklapper, Ashot1997, darthmon_wmde, DannyS712, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Wikidata-bugs, 
aude, Mbch331


[Wikidata] ShEx to validate Medical Wikidata

2019-08-24 Thread Houcemeddine A. Turki
Dear all,
I thank you for your efforts. I have to thank Dr. Andra Waagmeester, Dr. Finn 
Årup Nielsen and Dr. Egon Willighagen for their comments concerning the points 
I raised in Wikidata. These comments helped me learn about the ShEx project. I 
saw several papers about the project at http://labra.weso.es/ and I was 
honoured to see how it can provide schemas for Wikidata classes that can later 
be used to validate Wikidata. I ask whether someone is interested in creating 
shape expressions for Medical Wikidata classes. I also tried to access 
http://wikidata-shex.wmflabs.org/w/index.php, but it does not work, so I ask 
how to use Wikidata ShEx. Finally, I ask whether a tool could be developed to 
turn a Wikidata statement red if it does not correspond to the shape 
expressions of the class of its subject.
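As an illustration of what such a schema could look like, here is a minimal, 
hypothetical ShEx shape for one medical class. The property and item IDs are 
real Wikidata identifiers, but the shape itself is only a sketch, not an 
agreed-upon schema:

```shex
# Hypothetical shape for items that are diseases (Q12136); illustrative only.
PREFIX wd:  <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

start = @<#disease>

<#disease> {
  wdt:P31 [ wd:Q12136 ] ;   # instance of (P31): disease (Q12136)
  wdt:P279 IRI * ;          # subclass of (P279): zero or more items
  wdt:P494 LITERAL ?        # ICD-10 code (P494): optional
}
```

A statement-highlighting tool like the one suggested could then flag any 
statement that causes an item to fail validation against the shape of its 
class.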
Yours Sincerely,
Houcemeddine Turki (he/him)
Medical Student, Faculty of Medicine of Sfax, University of Sfax, Tunisia
Undergraduate Researcher, UR12SP36
GLAM and Education Coordinator, Wikimedia TN User Group
Member, WikiResearch Tunisia
Member, Wiki Project Med
Member, WikiIndaba Steering Committee
Member, Wikimedia and Library User Group Steering Committee
Co-Founder, WikiLingua Maghreb
Founder, TunSci

+21629499418
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Proposal for the introduction of a practicable Data Quality Indicator in Wikidata

2019-08-24 Thread Imre Samu
TL;DR: it would be useful, but extremely hard to create rules for every
domain.

>4. How to calculate and represent them?

imho: it depends on the data domain.

For geodata ( human settlements/rivers/mountains/... ) ( with GPS
coordinates ) my simple rules are:
- if it has a "local wikipedia page" or a wikipedia page in any big
language ["EN/FR/PT/ES/RU/.."], then it is OK.
- if it is only in "cebuano" AND outside the "cebuano BBOX" -> then this
is lower quality
- only {shwiki+srwiki} AND outside the "sh" & "sr" BBOX -> this is lower
quality
- only {huwiki} AND outside the CentralEuropeBBOX -> this is lower quality
- geodata without GPS coordinates -> ...
- ...
So my rules are based on wikipedia pages and language areas, and I prefer
wikidata items with local wikipedia pages.

This is based on my experience adding Wikidata ID concordances to
NaturalEarth ( https://www.naturalearthdata.com/blog/ ).
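The rules above can be sketched as code roughly like this. The item 
representation, tier names, and bounding-box coordinates are all illustrative 
assumptions, not an official metric:

```python
# Heuristic quality tiers for a geodata item, following the rules above.
# Sitelink sets and BBOX coordinates below are illustrative assumptions.

BIG_LANG_WIKIS = {"enwiki", "frwiki", "ptwiki", "eswiki", "ruwiki"}

# (min_lon, min_lat, max_lon, max_lat) -- rough, hypothetical boxes
BBOXES = {
    "cebwiki": (116.0, 4.0, 127.0, 21.0),       # Philippines area
    "shwiki/srwiki": (13.0, 40.0, 24.0, 47.0),  # "sh"/"sr" language area
    "huwiki": (5.0, 44.0, 30.0, 55.0),          # Central Europe
}

def in_bbox(lon, lat, bbox):
    min_lon, min_lat, max_lon, max_lat = bbox
    return min_lon <= lon <= max_lon and min_lat <= lat <= max_lat

def quality_tier(sitelinks, coord):
    """Return 'ok', 'lower', or 'suspect' for a geodata item.

    sitelinks: set of wiki IDs the item is linked to
    coord: (lon, lat) tuple, or None if the item has no GPS coordinate
    """
    if coord is None:
        return "suspect"        # geodata without GPS coordinates
    lon, lat = coord
    if BIG_LANG_WIKIS & sitelinks:
        return "ok"             # has a big-language wikipedia page
    if sitelinks == {"cebwiki"} and not in_bbox(lon, lat, BBOXES["cebwiki"]):
        return "lower"
    if (sitelinks == {"shwiki", "srwiki"}
            and not in_bbox(lon, lat, BBOXES["shwiki/srwiki"])):
        return "lower"
    if sitelinks == {"huwiki"} and not in_bbox(lon, lat, BBOXES["huwiki"]):
        return "lower"
    return "ok"

print(quality_tier({"enwiki", "cebwiki"}, (10.0, 50.0)))  # ok
print(quality_tier({"cebwiki"}, (2.3, 48.8)))             # lower
print(quality_tier({"huwiki"}, (19.0, 47.5)))             # ok
```

In practice the bounding boxes would need per-language tuning, which is exactly 
why rules like these are hard to generalize across domains.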


>5. Which is the most suitable way to further discuss and implement this
idea?

imho: load the wikidata dump into a local database,
and create
- some "proof of concept" data quality indicators,
- some "meta" rules,
- some "real" statistics,
so the community can decide whether it is useful or not.



Imre







Uwe Jung wrote (on Sat, 24 Aug 2019, 14:55):

> [original proposal quoted in full; snipped]


Re: [Wikidata] Proposal for the introduction of a practicable Data Quality Indicator in Wikidata

2019-08-24 Thread David Abián
Hi,

If we accept that the quality of the data is the "fitness for use",
which in my opinion is the best and most commonly used definition (as
stated in the article linked by Ettore), then it will never be possible
to define a number that objectively represents data quality. We can
define a number that is the result of an arbitrary weighted average of
different metrics related to various dimensions of quality arbitrarily
captured and transformed, and we can fool ourselves by saying that this
number represents data quality, but it will not, nor will it be an
approximation of what data quality means, nor will this number be able
to order Wikidata entities matching any common, understandable,
high-level criterion. The quality of the data depends on the use, it's
relative to each user, and can't be measured globally and objectively in
any way that is better than another.

As an alternative, however, I can suggest that you separately study some
quality dimensions assuming a particular use case for your study; this
will be correct, doable and greatly appreciated. :-) Please feel free to
ask for help in case you need it, either personally or via this list or
other means. And thanks for your interest in improving Wikidata!

Regards,
David


On 8/24/19 13:54, Uwe Jung wrote:
> [original proposal quoted in full; snipped]

-- 
David Abián



[Wikidata-bugs] [Maniphest] [Commented On] T172368: [Task] Make the Wikibase.git code base PSR-4 compatible

2019-08-24 Thread gerritbot
gerritbot added a comment.


  Change 532088 had a related patch set uploaded (by Ladsgroup; owner: 
Ladsgroup):
  [mediawiki/extensions/Wikibase@master] Adjust namespace of several classes in 
repo to make them follow PSR-4
  
  https://gerrit.wikimedia.org/r/532088

TASK DETAIL
  https://phabricator.wikimedia.org/T172368


To: gerritbot
Cc: Lucas_Werkmeister_WMDE, Eileenmcnaughton, gerritbot, Ricordisamoa, 
Aklapper, PokestarFan, WMDE-leszek, Jakob_WMDE, daniel, Aleksey_WMDE, 
Ladsgroup, Lydia_Pintscher, thiemowmde, Hook696, Daryl-TTMG, RomaAmorRoma, 
0010318400, E.S.A-Sheild, darthmon_wmde, joker88john, Dinadineke, DannyS712, 
CucyNoiD, Nandana, NebulousIris, Gaboe420, Versusxo, Majesticalreaper22, 
Giuliamocci, tabish.shaikh91, Adrian1985, Cpaulf30, Lahi, Gq86, Af420, 
Darkminds3113, Bsandipan, Lordiis, GoranSMilovanovic, Adik2382, Soteriaspace, 
Jayprakash12345, Th3d3v1ls, JakeTheDeveloper, Ramalepe, Liugev6, QZanden, 
merbst, LawExplorer, WSH1906, Lewizho99, Maathavan, _jensen, rosalieper, Izno, 
Wikidata-bugs, aude, Dinoguy1000, TheDJ, Mbch331, Jay8g


Re: [Wikidata] Proposal for the introduction of a practicable Data Quality Indicator in Wikidata

2019-08-24 Thread Gerard Meijssen
Hoi,
What is it that you hope to achieve by this? It will add to the time it
takes to process an edit, which is a luxury we cannot afford. It is also not
something that would influence my edits.
Thanks,
 GerardM

On Sat, 24 Aug 2019 at 13:55, Uwe Jung  wrote:

> [original proposal quoted in full; snipped]


[Wikidata-bugs] [Maniphest] [Updated] T228263: wire up store with styled TermTextField

2019-08-24 Thread Maintenance_bot
Maintenance_bot removed a project: Patch-For-Review.

TASK DETAIL
  https://phabricator.wikimedia.org/T228263


To: Michael, Maintenance_bot
Cc: Lucas_Werkmeister_WMDE, Aklapper, Lydia_Pintscher, Charlie_WMDE, 
Pablo-WMDE, darthmon_wmde, DannyS712, Nandana, Lahi, Gq86, GoranSMilovanovic, 
QZanden, LawExplorer, _jensen, rosalieper, Wikidata-bugs, aude, Mbch331, 
Hook696, Daryl-TTMG, RomaAmorRoma, 0010318400, E.S.A-Sheild, joker88john, 
CucyNoiD, NebulousIris, Gaboe420, Versusxo, Majesticalreaper22, Giuliamocci, 
Adrian1985, Cpaulf30, Af420, Darkminds3113, Bsandipan, Lordiis, Adik2382, 
Th3d3v1ls, Ramalepe, Liugev6, WSH1906, Lewizho99, Maathavan


Re: [Wikidata] Proposal for the introduction of a practicable Data Quality Indicator in Wikidata

2019-08-24 Thread Ettore RIZZA
Hello,

Very interesting idea. Just to feed the discussion, here is a very recent
literature survey on data quality in Wikidata:
https://opensym.org/wp-content/uploads/2019/08/os19-paper-A17-piscopo.pdf

Cheers,

Ettore Rizza



On Sat, 24 Aug 2019 at 13:55, Uwe Jung  wrote:

> [original proposal quoted in full; snipped]


[Wikidata] Proposal for the introduction of a practicable Data Quality Indicator in Wikidata

2019-08-24 Thread Uwe Jung
Hello,

As the importance of Wikidata increases, so do the demands on the quality
of the data. I would like to put the following proposal up for discussion.

Two basic ideas:

   1. Each Wikidata page (item) is scored after each edit. This score
   should express different dimensions of data quality in a quickly manageable
   way.
   2. A property is created via which the item refers to the score value.
   Certain qualifiers can be used for a more detailed description (e.g. the
   time of calculation, the algorithm used to calculate the score, etc.).


The score value can be calculated either within Wikibase after each data
change or "externally" by a bot. Among other things, the calculation can use
the number of constraints, the completeness of references, the degree of
completeness in relation to the underlying ontology, etc. There are already
some interesting discussions on the question of data quality which can be
used here (see https://www.wikidata.org/wiki/Wikidata:Item_quality;
https://www.wikidata.org/wiki/Wikidata:WikiProject_Data_Quality, etc.).

Advantages

   - Users get a quick overview of the quality of a page (item).
   - SPARQL can be used to query only those items that meet a certain
   quality level.
   - The idea would probably be relatively easy to implement.


Disadvantages:

   - In a way, the data model is abused by generating statements that no
   longer describe the item itself, but make statements about the
   representation of this item in Wikidata.
   - Additional computing power must be provided for the regular
   calculation of all changed items.
   - The score only reports the quality of a page; if it is insufficient,
   improvements still have to be made manually.


I would now be interested in the following:

   1. Is this idea suitable to effectively help solve existing quality
   problems?
   2. Which quality dimensions should the score value represent?
   3. Which quality dimension can be calculated with reasonable effort?
   4. How to calculate and represent them?
   5. Which is the most suitable way to further discuss and implement this
   idea?


Many thanks in advance.

Uwe Jung  (UJung )
www.archivfuehrer-kolonialzeit.de/thesaurus
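As one concrete (purely illustrative) reading of idea 1, the score could be a
weighted average of a few cheap-to-compute metrics. The metrics, weights, and
function below are hypothetical, not a proposed standard:

```python
# Illustrative composite quality score for an item, in [0, 1].
# Metric choice and weights are hypothetical assumptions.

def quality_score(n_statements, n_referenced, n_constraint_violations,
                  n_labels, n_target_languages=10):
    """Weighted average of reference completeness, constraint cleanliness,
    and label coverage."""
    # Fraction of statements that carry at least one reference.
    ref_completeness = n_referenced / n_statements if n_statements else 0.0
    # 1.0 with no constraint violations, decaying as violations grow.
    constraint_ok = 1.0 / (1.0 + n_constraint_violations)
    # Label coverage relative to a target number of languages, capped at 1.
    label_coverage = min(n_labels / n_target_languages, 1.0)
    weights = {"refs": 0.5, "constraints": 0.3, "labels": 0.2}
    return (weights["refs"] * ref_completeness
            + weights["constraints"] * constraint_ok
            + weights["labels"] * label_coverage)

# An item with 20 statements, 10 of them referenced, no violations, 5 labels:
score = quality_score(20, 10, 0, 5)
print(round(score, 2))  # 0.65
```

If such a score were stored as a statement (idea 2), a SPARQL filter along the
lines of `FILTER(?score >= 0.8)` could then select items above a chosen
threshold, as the "Advantages" list suggests.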


[Wikidata-bugs] [Maniphest] [Commented On] T228263: wire up store with styled TermTextField

2019-08-24 Thread gerritbot
gerritbot added a comment.


  Change 530569 **merged** by jenkins-bot:
  [mediawiki/extensions/Wikibase@master] bridge: integrate StringDataValue 
component into Bridge component
  
  https://gerrit.wikimedia.org/r/530569

TASK DETAIL
  https://phabricator.wikimedia.org/T228263


To: Michael, gerritbot
Cc: Lucas_Werkmeister_WMDE, Aklapper, Lydia_Pintscher, Charlie_WMDE, 
Pablo-WMDE, Hook696, Daryl-TTMG, RomaAmorRoma, 0010318400, E.S.A-Sheild, 
darthmon_wmde, joker88john, DannyS712, CucyNoiD, Nandana, NebulousIris, 
Gaboe420, Versusxo, Majesticalreaper22, Giuliamocci, Adrian1985, Cpaulf30, 
Lahi, Gq86, Af420, Darkminds3113, Bsandipan, Lordiis, GoranSMilovanovic, 
Adik2382, Th3d3v1ls, Ramalepe, Liugev6, QZanden, LawExplorer, WSH1906, 
Lewizho99, Maathavan, _jensen, rosalieper, Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] [Lowered Priority] T231103: WikibaseLexeme test broken by refactor of MediaWiki's Language class

2019-08-24 Thread Tarrow
Tarrow lowered the priority of this task from "Unbreak Now!" to "Normal".
Tarrow added a comment.


  Looks to me like this is no longer UBN.
  
  Leaving the task open for investigation as to how we ended up with a broken 
build. I'm still a little confused by the timeline.
  
  Thanks for looking at this @Jdforrester-WMF while WMDE was out of the office.

TASK DETAIL
  https://phabricator.wikimedia.org/T231103


To: Jdforrester-WMF, Tarrow
Cc: Tarrow, Mholloway, Liuxinyu970226, daniel, Simetrical, Jdforrester-WMF, 
darthmon_wmde, DannyS712, Nandana, Mringgaard, Lahi, Gq86, GoranSMilovanovic, 
QZanden, LawExplorer, _jensen, rosalieper, Wikidata-bugs, aude, Darkdadaah, 
Mbch331


[Wikidata-bugs] [Maniphest] [Created] T231128: allow to easily relate a published image tag with a wikibase-docker commit

2019-08-24 Thread Maxlath
Maxlath created this task.
Maxlath added a project: Wikibase-Containers.
Restricted Application added a subscriber: Aklapper.
Restricted Application added a project: Wikidata.

TASK DESCRIPTION
  wikibase-docker `docker-compose.yml` 
 uses 
images published on hub.docker.com, which are sometimes lagging behind/not in 
sync with the master branch. This has for effect that they can be quite hard to 
debug for new comers as you could assume that running `docker-compose -f 
docker-compose.yml` when checking out the branch master would include all the 
behaviors and patches that were already merged into master. As this isn't the 
case, it could save everyone's time to find a way to make this asynchronicity 
explicit (so that containers users don't report bugs that were already patched 
for instance). Ideally, the way to make it explicit should allow to link a 
given image (typically tagged with `latest`) to a given commit hash.
  
  cc @Addshore
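  One conventional way to record the image-to-commit link is the standard OCI 
revision label. This is only a sketch: wiring the build arg through the 
wikibase-docker build scripts is an assumption and would need to be added.

```dockerfile
# Dockerfile fragment: record the source commit in a standard OCI label.
# GIT_COMMIT must be passed at build time, e.g.:
#   docker build --build-arg GIT_COMMIT=$(git rev-parse HEAD) \
#     -t wikibase/wikibase:latest .
# Anyone can then recover the commit from a published image with:
#   docker inspect --format \
#     '{{ index .Config.Labels "org.opencontainers.image.revision" }}' \
#     wikibase/wikibase:latest
ARG GIT_COMMIT=unknown
LABEL org.opencontainers.image.revision=$GIT_COMMIT
```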

TASK DETAIL
  https://phabricator.wikimedia.org/T231128


To: Maxlath
Cc: Maxlath, Addshore, Aklapper, darthmon_wmde, Jelabra, DannyS712, Nandana, 
Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, 
Wikidata-bugs, aude, Mbch331