[Wikidata-bugs] [Maniphest] T297995: Remove authentication from Wikimedia Commons Query Services (WCQS)
Andrawaag added a comment.

> My impression from speaking with multiple people at the hackathon is that the authentication requirement is a barrier for people to even use the service in the first place, which defeats the point.

Exactly! Thanks for pointing this out. We should, however, be wary of reading those low usage numbers as a lack of potential. Once in a while I try the WCQS and I keep being impressed by its potential, only to be disappointed when trying to use it at scale. We should keep reiterating that introducing authentication on the SPARQL endpoint was a bad design choice.

TASK DETAIL
https://phabricator.wikimedia.org/T297995

EMAIL PREFERENCES
https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Andrawaag
Cc: Elfix, Yug, pere_prlpz, Naseweis520, Stang, Denengelse, Spinster, Alicia_Fagerving_WMSE, RhinosF1, Andrawaag, MPhamWMF, Jane023, Sannita, Bugreporter, GFontenelle_WMF, Susannaanas, Gehel, AxelPettersson_WMSE, Abbe98, Izno, Pharos, Krinkle, Nikki, Ainali, Chicocvenancio, LWyatt, Legoktm, Husky, LucasWerkmeister, Fuzheado, Aklapper, Multichill, Astuthiodit_1, AWesterinen, karapayneWMDE, Invadibot, maantietaja, Y.ssk, Muchiri124, CBogen, ItamarWMDE, Akuckartz, Nandana, Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, EBjune, merbst, LawExplorer, _jensen, rosalieper, Taiwania_Justo, Scott_WUaS, Jonas, Xmlizer, Ixocactus, Wong128hk, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, El_Grafo, Dinoguy1000, Manybubbles, Steinsplitter, Mbch331

___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org
[Wikidata-bugs] [Maniphest] T306783: Combining English dialects in SPARQL FILTER bug or feature?
Andrawaag closed this task as "Resolved". Andrawaag claimed this task. Andrawaag added a comment.

Perfect, thanks @Lucas_Werkmeister_WMDE. I agree it is a feature.

TASK DETAIL
https://phabricator.wikimedia.org/T306783

To: Andrawaag
Cc: Lucas_Werkmeister_WMDE, joshmoore, Andrawaag, Aklapper, Astuthiodit_1, karapayneWMDE, Invadibot, MPhamWMF, maantietaja, CBogen, ItamarWMDE, Akuckartz, Nandana, Namenlos314, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, merbst, LawExplorer, _jensen, rosalieper, Scott_WUaS, Jonas, Xmlizer, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, Mbch331
[Wikidata-bugs] [Maniphest] T306783: Combining English dialects in SPARQL FILTER bug or feature?
Andrawaag created this task. Andrawaag added projects: Wikidata, Wikidata-Query-Service. Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
When running the following DISTINCT query (https://w.wiki/56NX), the results contain 3 seemingly identical records. When the query is run outside of the WDQS, it becomes clear why: the records differ only in their English dialect/variant language tags, so they are rendered identically. F35066913: image.png <https://phabricator.wikimedia.org/F35066913> DISTINCT does work, although the WDQS display suggests it doesn't.

TASK DETAIL
https://phabricator.wikimedia.org/T306783

To: Andrawaag
Cc: Andrawaag, Aklapper, Astuthiodit_1, karapayneWMDE, Invadibot, MPhamWMF, maantietaja, CBogen, ItamarWMDE, Akuckartz, Nandana, Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, EBjune, merbst, LawExplorer, _jensen, rosalieper, Scott_WUaS, Jonas, Xmlizer, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, Mbch331
[Wikidata-bugs] [Maniphest] T297995: Remove authentication from Wikimedia Commons Query Services (WCQS)
Andrawaag added a comment.

Can the password feature on the WCQS please, please, please, please pretty please be removed/disabled? The WCQS is an epic feature, but almost useless thanks to the requirement to log in. Basically, Commons remains a data silo of its own. I keep running into issues when I am building a query that I want to share, reuse in a Jupyter notebook, or run as a federated query from Wikidata. The decision to use OAuth here is really a poor design choice.

TASK DETAIL
https://phabricator.wikimedia.org/T297995

To: Andrawaag
Cc: Alicia_Fagerving_WMSE, RhinosF1, Andrawaag, MPhamWMF, Jane023, Sannita, Bugreporter, GFontenelle_WMF, Susannaanas, Gehel, AxelPettersson_WMSE, Abbe98, Izno, Pharos, Krinkle, Nikki, Ainali, Chicocvenancio, LWyatt, Legoktm, Husky, LucasWerkmeister, Fuzheado, Aklapper, Multichill, Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, Y.ssk, Muchiri124, CBogen, ItamarWMDE, Akuckartz, Nandana, Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, EBjune, merbst, LawExplorer, _jensen, rosalieper, 4nn1l2, Taiwania_Justo, Scott_WUaS, Jonas, Xmlizer, Ixocactus, Wong128hk, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, El_Grafo, Dinoguy1000, Manybubbles, Steinsplitter, Mbch331
[Wikidata-bugs] [Maniphest] T306430: Add SDC prefixes to the SDCQS
Andrawaag created this task. Andrawaag added projects: SDC General, Commons, Wikidata-Query-Service. Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
**List of steps to reproduce** (step by step, including full links if applicable):

- To add prefixes while building a query, the WDQS has a neat feature where you can click on a thumbnail to add the prefixes to the query. Currently, that thumbnail-driven menu does not contain the SDC prefixes. These are:

  @prefix sdc: <https://commons.wikimedia.org/entity/> .
  @prefix sdcdata: <https://commons.wikimedia.org/wiki/Special:EntityData/> .
  @prefix sdcs: <https://commons.wikimedia.org/entity/statement/> .
  @prefix sdcref: <https://commons.wikimedia.org/reference/> .
  @prefix sdcv: <https://commons.wikimedia.org/value/> .
  @prefix sdct: <https://commons.wikimedia.org/prop/direct/> .
  @prefix sdctn: <https://commons.wikimedia.org/prop/direct-normalized/> .
  @prefix sdcp: <https://commons.wikimedia.org/prop/> .
  @prefix sdcps: <https://commons.wikimedia.org/prop/statement/> .
  @prefix sdcpsv: <https://commons.wikimedia.org/prop/statement/value/> .
  @prefix sdcpsn: <https://commons.wikimedia.org/prop/statement/value-normalized/> .
  @prefix sdcpq: <https://commons.wikimedia.org/prop/qualifier/> .
  @prefix sdcpqv: <https://commons.wikimedia.org/prop/qualifier/value/> .
  @prefix sdcpqn: <https://commons.wikimedia.org/prop/qualifier/value-normalized/> .
  @prefix sdcpr: <https://commons.wikimedia.org/prop/reference/> .
  @prefix sdcprv: <https://commons.wikimedia.org/prop/reference/value/> .
  @prefix sdcprn: <https://commons.wikimedia.org/prop/reference/value-normalized/> .
  @prefix sdcno: <https://commons.wikimedia.org/prop/novalue/> .

- This prevents building complete queries on the SDC query service.

**What happens?**:
I now have to fetch the SDC prefixes through a tedious process where I first identify the URI of a Commons file, add ".ttl" to that URI, and select the needed prefixes.

**What should have happened instead?**:
As with the WDQS, the prefixes should be selectable through the thumbnail selector in the vertical menu on the left.

**Software version (if not a Wikimedia wiki), browser information, screenshots, other information, etc.**:

TASK DETAIL
https://phabricator.wikimedia.org/T306430

To: Andrawaag
Cc: Aklapper, Andrawaag, GFontenelle_WMF, MPhamWMF, Y.ssk, FRomeo_WMF, Muchiri124, CBogen, Nintendofan885, JKSTNK, Namenlos314, Lahi, Gq86, E1presidente, Ramsey-WMF, Cparle, SandraF_WMF, Lucas_Werkmeister_WMDE, EBjune, Tramullas, Acer, merbst, Salgo60, Silverfish, 4nn1l2, Taiwania_Justo, Jonas, Xmlizer, Susannaanas, Ixocactus, Wong128hk, Jane023, jkroll, Wikidata-bugs, Jdouglas, Base, matthiasmullie, aude, Tobias1984, El_Grafo, Dinoguy1000, Manybubbles, Ricordisamoa, Wesalius, Lydia_Pintscher, Raymond, Steinsplitter
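[Editor's note] Until the thumbnail menu supports them, the declarations above can be prepended programmatically. A sketch in Python; the prefix-to-IRI pairs are copied from this task (abridged), while the helper name and the example query are mine:

```python
# Sketch: build a SPARQL PREFIX header for (a subset of) the SDC namespaces
# listed in the task. The mapping is copied from the task description; the
# helper name and the example query are illustrative.
SDC_PREFIXES = {
    "sdc": "https://commons.wikimedia.org/entity/",
    "sdcs": "https://commons.wikimedia.org/entity/statement/",
    "sdct": "https://commons.wikimedia.org/prop/direct/",
    "sdcp": "https://commons.wikimedia.org/prop/",
    "sdcps": "https://commons.wikimedia.org/prop/statement/",
    "sdcpq": "https://commons.wikimedia.org/prop/qualifier/",
}

def sparql_prefix_header(prefixes: dict) -> str:
    # One "PREFIX p: <iri>" line per entry, in insertion order.
    return "\n".join(f"PREFIX {p}: <{iri}>" for p, iri in prefixes.items())

query = sparql_prefix_header(SDC_PREFIXES) + "\nSELECT * WHERE { ?s sdct:P180 ?o } LIMIT 5"
```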
[Wikidata-bugs] [Maniphest] T305434: The WCQS can't be used in PAWS
Andrawaag created this task. Andrawaag added projects: Commons, Wikidata-Query-Service. Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
**List of steps to reproduce** (step by step, including full links if applicable):

- Running a SPARQL query on SDC from PAWS is not possible due to a required password.
- Running the query in the browser on Commons works: https://w.wiki/524v
- Running the same query in Wikimedia's Jupyter notebook with a SPARQL kernel does not: https://public.paws.wmcloud.org/User:Andrawaag/wikiproject_biodiversity/Untitled1.ipynb F35039236: image.png <https://phabricator.wikimedia.org/F35039236>

**What happens?**:
The results in PAWS are blocked by an authorisation step, which is hard - if at all possible - to perform from a SPARQL query.

**What should have happened instead?**:
The same results as when the query is submitted in the browser should be returned. Please remove the authorisation from the WCQS; it does not make sense in SPARQL, plus it goes against Wikimedia's prime objective <https://en.wikipedia.org/wiki/Wikipedia:Prime_objective>.

**Software version (if not a Wikimedia wiki), browser information, screenshots, other information, etc.**:

TASK DETAIL
https://phabricator.wikimedia.org/T305434

To: Andrawaag
Cc: Andrawaag, Aklapper, MPhamWMF, Y.ssk, Muchiri124, CBogen, Namenlos314, Gq86, Lucas_Werkmeister_WMDE, EBjune, merbst, 4nn1l2, Taiwania_Justo, Jonas, Xmlizer, Ixocactus, Wong128hk, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, El_Grafo, Dinoguy1000, Manybubbles, Steinsplitter
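[Editor's note] For non-browser clients, the practical consequence is an extra session-setup step that a SPARQL kernel cannot perform. A minimal sketch of that step, under the assumption that WCQS accepts an OAuth-derived token passed as a cookie; the cookie name "wcqsOauth", the endpoint URL, and the placeholder token are illustrative, not confirmed by this task:

```python
import requests

# Sketch of the extra step WCQS imposes on non-browser clients.
# Assumption: WCQS accepts an OAuth-derived token passed as a cookie
# (the cookie name "wcqsOauth" and the token placeholder are illustrative).
session = requests.Session()
session.cookies.set(
    "wcqsOauth",
    "<token obtained interactively>",
    domain="commons-query.wikimedia.org",
)

# The query itself would be unchanged; only this session setup differs from
# WDQS usage, and it is exactly the part a SPARQL kernel or a federated
# query cannot do. No request is sent here.
endpoint = "https://commons-query.wikimedia.org/sparql"
```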
[Wikidata-bugs] [Maniphest] T297995: Remove authentication from Wikimedia Commons Query Services (WCQS)
Andrawaag added a comment.

Adding authentication to the WCQS might have an unexpected effect. Since it is not possible to use the WCQS as a remote SPARQL endpoint in a federated query, federated queries can only be run directly on the WCQS, which means that the WCQS needs to deal with all the complexity of a query. Removing the login requirement would allow the majority of that complexity to be dealt with at a remote endpoint.

TASK DETAIL
https://phabricator.wikimedia.org/T297995

To: Andrawaag
Cc: Andrawaag, MPhamWMF, Jane023, Sannita, Bugreporter, GFontenelle_WMF, Susannaanas, Gehel, AxelPettersson_WMSE, Abbe98, Izno, Pharos, Krinkle, Nikki, Ainali, Chicocvenancio, LWyatt, Legoktm, Husky, LucasWerkmeister, Fuzheado, Aklapper, Multichill, karapayneWMDE, Invadibot, maantietaja, Y.ssk, Muchiri124, CBogen, Akuckartz, Nandana, Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, EBjune, merbst, LawExplorer, _jensen, rosalieper, 4nn1l2, Taiwania_Justo, Scott_WUaS, Jonas, Xmlizer, Ixocactus, Wong128hk, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, El_Grafo, Dinoguy1000, Manybubbles, Steinsplitter, Mbch331
[Wikidata-bugs] [Maniphest] T302635: Why does it require a Wikimedia login to use SPARQL on commons?
Andrawaag created this task. Andrawaag added projects: Commons, Wikidata-Query-Service, SDC General. Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
I noticed that one needs to log in on Commons to be able to use the query service linked to SDC on Commons. This works when writing queries in the browser, but not when the service is used in federated queries from other SPARQL endpoints (like Wikidata). Having to log in before using the query service on Commons also prevents using the SDC data in the SPARQL kernel of PAWS, Wikimedia's Jupyter server.

TASK DETAIL
https://phabricator.wikimedia.org/T302635

To: Andrawaag
Cc: Aklapper, Andrawaag, GFontenelle_WMF, MPhamWMF, Y.ssk, FRomeo_WMF, Muchiri124, CBogen, Nintendofan885, JKSTNK, Namenlos314, Lahi, Gq86, E1presidente, Ramsey-WMF, Cparle, SandraF_WMF, Lucas_Werkmeister_WMDE, EBjune, Tramullas, Acer, merbst, Salgo60, Silverfish, 4nn1l2, Taiwania_Justo, Jonas, Xmlizer, Susannaanas, Ixocactus, Wong128hk, Jane023, jkroll, Wikidata-bugs, Jdouglas, Base, matthiasmullie, aude, Tobias1984, El_Grafo, Dinoguy1000, Manybubbles, Ricordisamoa, Wesalius, Lydia_Pintscher, Raymond, Steinsplitter
[Wikidata-bugs] [Maniphest] T301089: Can the JSON format of a Wikidata item and that of a Structured Data on Commons item be aligned?
Andrawaag created this task. Andrawaag added projects: StructuredDataOnCommons, Wikidata. Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
**Feature summary** (what you would like to be able to do and where):
I noticed some slight differences between the JSON of a Wikidata item and that of a Structured Data on Commons item. At first glance they seem rather cosmetic (are they?). I was trying to reuse code from the WikidataIntegrator to interact with the SDC API, which was unsuccessful because of two minor differences between the two:

1. Where the Wikidata API output uses the term "claims", SDC uses "statements".
2. Wikidata JSON output has a field entities->Qxx->claims->Pxx->mainsnak->datatype, which does not exist in the output of SDC.

**Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution):
I was trying to reuse existing code to parse Wikidata items. Early on I noticed the difference between "claims" (wd) and "statements" (sdc). This can easily be fixed by duplicating the code and changing claims to statements. However, I got stuck when trying to assess the statement type.

**Benefits** (why should this be implemented?):
Aligning the JSON output of the wd API and the sdc API would allow the myriad of existing external Wikidata tools to be reused in an SDC context.

I used the following output as (arbitrary) JSON representations for comparison:

1. https://www.wikidata.org/wiki/Special:EntityData/P279.json
2. https://commons.wikimedia.org/wiki/Special:EntityData/M574781.json

TASK DETAIL
https://phabricator.wikimedia.org/T301089

To: Andrawaag
Cc: Andrawaag, Aklapper, Invadibot, maantietaja, FRomeo_WMF, CBogen, Nintendofan885, Akuckartz, Nandana, JKSTNK, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Lydia_Pintscher, Mbch331
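[Editor's note] The first difference described above can be bridged with a small normalization shim. A sketch, under the assumption that the "statements"/"claims" change is a pure top-level rename; the sample dict is illustrative, not real API output, and the missing mainsnak->datatype field (the second difference) is deliberately not filled in:

```python
# Sketch: normalize an SDC ("statements") entity JSON toward the Wikidata
# ("claims") shape so existing parsers can be reused. Assumption: the rename
# is purely at the top level. The missing mainsnak->datatype field cannot be
# recovered client-side, so it is intentionally left alone here.
def normalize_sdc(entity: dict) -> dict:
    e = dict(entity)  # shallow copy; the statement groups are shared
    if "statements" in e and "claims" not in e:
        e["claims"] = e.pop("statements")
    return e

sdc_like = {"id": "M574781", "statements": {"P180": []}}  # illustrative shape
print(normalize_sdc(sdc_like)["claims"])  # → {'P180': []}
```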
[Wikidata-bugs] [Maniphest] T297470: torrent file for Wikidata dumps
Andrawaag created this task. Andrawaag added projects: Wikidata, Dumps-Generation, Datasets-Archiving. Restricted Application added projects: Internet-Archive, wdwb-tech.

TASK DESCRIPTION
**Feature summary**:
Providing a torrent file <https://en.wikipedia.org/wiki/Torrent_file> would offer another way to access the Wikidata data dumps <https://www.wikidata.org/wiki/Wikidata:Database_download>. There are already torrent files for the various Wikipedias <https://meta.wikimedia.org/wiki/Data_dump_torrents>, but I did not find any for Wikidata.

**Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution):
Downloading data dumps from either Wikidata or the Internet Archive can be slow at times. Using BitTorrent might help spread the distribution and would allow faster download speeds.

**Benefits** (why should this be implemented?):
I am currently experimenting with creating subsets from Wikidata, where I often download a data dump to work on, or ask someone to repeat the applied method. There is always a delay in getting the dumps before I can proceed.

TASK DETAIL
https://phabricator.wikimedia.org/T297470

To: Andrawaag
Cc: Andrawaag, Invadibot, maantietaja, jannee_e, Biaoo, Philoserf, Nintendofan885, Akuckartz, Ironie, holger.knust, Nandana, Lahi, Gq86, GoranSMilovanovic, Lunewa, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, gnosygnu, mys_721tx, Wikidata-bugs, Hydriz, aude, Nemo_bis, Addshore, Mbch331, Jay8g
[Wikidata-bugs] [Maniphest] T283997: How are the hash values in Wikidata rdf generated?
Andrawaag added a comment.

You are completely right, the same hashes are not needed to apply EntitySchemas in memory prior to ingestion into Wikidata. I need the hashes as a sanity check that my script creates the exact same RDF as produced natively by Wikidata, so the hashes are only needed in the development phase of the script. Here is a notebook that contains the first prototype <https://public.paws.wmcloud.org/User:Andrawaag/Genewiki/Wikidata_json2ttl.ipynb>.

    from rdflib import Graph
    from rdflib.compare import graph_diff, to_isomorphic

    allRD = WDqidRDFEngine(qid="Q38", fetch_all=True)
    compareRDF = Graph()
    compareRDF.parse("http://www.wikidata.org/entity/Q38.ttl")
    inboth, left, right = graph_diff(to_isomorphic(compareRDF), to_isomorphic(allRD.rdf_item))
    print(len(left))
    print(len(compareRDF))

If my script works, there should be no difference in the lengths of both graphs. Currently, that is not the case. I checked various examples and, except for the hashes in those normalized statements, they seem equal. But if it is indeed difficult to reproduce those hashes, I should think of another test to verify.

In the actual validation script, not all RDF will be needed. Ignoring the labels, for example, slims down the RDF graph substantially. So I am currently building functionality into the WikidataIntegrator that allows selecting only certain parts (e.g. no truthy statements, or only truthy statements, no normalized values, etc.). A notebook with that code is here <https://public.paws.wmcloud.org/User:Andrawaag/Genewiki/wdi_rdf.ipynb>

My PHP skills are a bit rusty, but I will investigate and/or consider other test strategies.

TASK DETAIL
https://phabricator.wikimedia.org/T283997

To: Addshore, Andrawaag
Cc: Lucas_Werkmeister_WMDE, Aklapper, Andrawaag, Invadibot, maantietaja, Akuckartz, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Addshore, Mbch331
[Wikidata-bugs] [Maniphest] T283997: How are the hash values in Wikidata rdf generated?
Andrawaag added a comment.

I wasn't looking for guarantees about the hash values. They have value as a sanity check in a reverse engineering project <https://github.com/Wikidata/triplify-json> we are doing to reproduce the Wikidata/Wikibase RDF outside Wikibase itself. We need this to be able to apply EntitySchemas pre-ingestion; currently, EntitySchemas can only be applied after data ingestion. The script, as it currently works, builds the RDF from the JSON. That JSON object is enriched, and the idea is to then verify that the new JSON object still fits the EntitySchema before it is submitted to the API of Wikidata. In building that RDF script, the hash values have a role in verifying that the (reverse engineered) script does indeed produce the exact same RDF as is produced natively by Wikidata. For most snaks, the hash values are given in the JSON that is produced by the API of Wikidata. This is not the case for those values in the RDF that are not given by the JSON export, specifically the normalized values on time and globe coordinates. That is why I am interested in the algorithm that is used to produce those hash values internally.

TASK DETAIL
https://phabricator.wikimedia.org/T283997

To: Addshore, Andrawaag
Cc: Lucas_Werkmeister_WMDE, Aklapper, Andrawaag, Invadibot, maantietaja, Akuckartz, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Addshore, Mbch331
[Wikidata-bugs] [Maniphest] T281854: Get baseline measurements/expectations for splitting scholarly articles from Wikidata
Andrawaag added a comment.

I would not call it evicting scholarly articles. Scholarly articles are currently a major driving force for Wikidata; however, their volume is problematic because it is becoming more difficult to see other topics (sometimes unrelated to scholarly articles). I have been thinking about and working towards a federated landscape of linked Wikibases and other semantic web resources for a while now. Building such a federated landscape is already easy peasy. We have wbstack, wikibase docker, but also platforms like GraphDB, Virtuoso, Stardog (to mention just a few). It would take a simple hackathon and some motivated users to build a nice prototype. But setting up such a federated landscape is the easy part. What is more difficult is mapping between the different endpoints (Wikidata, Wikibases, other RDF stores). Given its size, the subgraph of scholarly articles simply deserves its own metal to excel beyond the current limitations. The main question then becomes how to align this new subgraph with the other parts of Wikidata, to which it intrinsically links (as @Daniel_Mietchen says). So I am actually in favour of separating the subgraph of scholarly articles from Wikidata (the incubator) into a node linked to Wikidata (the linked knowledge graph) and the global semantic web. I indeed said: moving away from Wikidata to Wikidata :) We need a new term for the knowledge graph where the current Wikidata is an index or a sort of DNS to other (semantic web) nodes.

TASK DETAIL
https://phabricator.wikimedia.org/T281854

To: Andrawaag
Cc: Andrawaag, Harej, Lydia_Pintscher, Mohammed_Sadat_WMDE, nichtich, EgonWillighagen, Fnielsen, Darwinius, Daniel_Mietchen, Lokal_Profil, GoEThe, Alicia_Fagerving_WMSE, PKM, LWyatt, Multichill, Aklapper, MPhamWMF, Invadibot, maantietaja, CBogen, Akuckartz, Nandana, Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, EBjune, merbst, LawExplorer, _jensen, rosalieper, Scott_WUaS, Jonas, Xmlizer, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, Mbch331
[Wikidata-bugs] [Maniphest] T283997: How are the hash values in Wikidata rdf generated?
Andrawaag created this task. Andrawaag added a project: Wikidata. Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
I am comparing the JSON output of Wikidata's API with its RDF equivalent. The RDF contains URIs with hash values that are also visible in the JSON output, e.g.

    s:Q35869-2E1F6A06-E14B-4533-B207-61DD01CB57D3 pqv:P580 v:0fe8fcb754ca9e8d828baee034479a75 .

The RDF, however, also contains "normalized" values that are not part of the JSON model. URIs in those normalized values also contain hash values. How are these generated?

TASK DETAIL
https://phabricator.wikimedia.org/T283997

WORKBOARD
https://phabricator.wikimedia.org/project/board/71/

To: Andrawaag
Cc: Aklapper, Andrawaag, Invadibot, maantietaja, Akuckartz, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
[Wikidata-bugs] [Maniphest] T283996: Is it possible to find the merge history of a Wikidata item using the action API's JSON output?
Andrawaag created this task. Andrawaag added a project: Wikidata. Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
When Wikidata items are merged, that is reflected in the RDF output of a Wikidata item, e.g.

    wd:Q12862823 owl:sameAs wd:Q35869 .

However, that information is not part of the JSON equivalent. How can the merge history of a Wikidata item be retrieved using the API?

TASK DETAIL
https://phabricator.wikimedia.org/T283996

WORKBOARD
https://phabricator.wikimedia.org/project/board/71/

To: Andrawaag
Cc: Andrawaag, Aklapper, Invadibot, maantietaja, Akuckartz, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
[Wikidata-bugs] [Maniphest] T277662: latest all rdf dump: bad IRI scheme
Andrawaag added a comment.

In the ShEx CG, the following fix was suggested:

    sed -i -E 's/(<.*)}(.*>)/\1\2/'
    sed -i -E 's/(<.*)\\n(.*>)/\1\2/'
    sed -i -E 's/(<.*)\|(.*>)/\1\2/'

TASK DETAIL
https://phabricator.wikimedia.org/T277662

To: Andrawaag
Cc: Andrawaag, jjkoehorst, Aklapper, maantietaja, jannee_e, Akuckartz, Nandana, Lahi, Gq86, GoranSMilovanovic, Lunewa, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, gnosygnu, abian, Wikidata-bugs, aude, Addshore, Mbch331

___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
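[Editor's note] For environments without sed, the same cleanup can be sketched in Python; the regular expressions mirror the sed ones above (including their greedy `.*` behaviour), and the function name is mine:

```python
import re

# Python equivalent of the sed one-liners suggested in the ShEx CG:
# drop "}", a literal backslash-n sequence, and "|" characters that
# appear inside IRIs (<...>) in a dump line.
def fix_iri_line(line: str) -> str:
    for bad in (r"\}", r"\\n", r"\|"):
        # Greedy (<.*) ... (.*>) mirrors the sed patterns above.
        line = re.sub(r"(<.*)" + bad + r"(.*>)", r"\1\2", line)
    return line

print(fix_iri_line("<http://example.org/a}b> ."))  # → <http://example.org/ab> .
```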
[Wikidata-bugs] [Maniphest] T277662: latest all rdf dump: bad IRI scheme
Andrawaag added a parent task: T179681: Add HDT dump of Wikidata.

TASK DETAIL
https://phabricator.wikimedia.org/T277662

To: Andrawaag
Cc: jjkoehorst, Aklapper, maantietaja, jannee_e, Akuckartz, Nandana, Lahi, Gq86, GoranSMilovanovic, Lunewa, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, gnosygnu, abian, Wikidata-bugs, aude, Addshore, Mbch331
[Wikidata-bugs] [Maniphest] T179681: Add HDT dump of Wikidata
Andrawaag added a subtask: T277662: latest all rdf dump: bad IRI scheme.

TASK DETAIL
https://phabricator.wikimedia.org/T179681

To: Andrawaag
Cc: jjkoehorst, MPhamWMF, Daniel_Mietchen, hoo, Addshore, Smalyshev, Ladsgroup, Arkanosis, Tarrow, Lucas_Werkmeister_WMDE, Aklapper, maantietaja, Akuckartz, Dinadineke, DannyS712, Nandana, tabish.shaikh91, Lahi, Gq86, GoranSMilovanovic, Soteriaspace, Jayprakash12345, JakeTheDeveloper, QZanden, merbst, LawExplorer, _jensen, rosalieper, Scott_WUaS, abian, Wikidata-bugs, aude, TheDJ, Mbch331
[Wikidata-bugs] [Maniphest] T277662: latest all rdf dump: bad IRI scheme
Andrawaag added projects: Wikidata, Dumps-Generation. Restricted Application added a project: wdwb-tech-focus.

TASK DETAIL
https://phabricator.wikimedia.org/T277662

To: Andrawaag
Cc: jjkoehorst, Aklapper, maantietaja, jannee_e, Akuckartz, Nandana, Lahi, Gq86, GoranSMilovanovic, Lunewa, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, gnosygnu, abian, Wikidata-bugs, aude, Addshore, Mbch331
[Wikidata-bugs] [Maniphest] T277108: Query service throws exception for non-English wikis
Andrawaag added a comment.

I have reproduced the issue by running a Wikibase in both the Japanese and Korean language versions, configured using https://github.com/andrawaag/wikibase_languages

    wdqs-updater_1 | 22:28:12.349 [main] INFO o.w.q.r.t.change.RecentChangesPoller - Got no real changes
    wdqs-updater_1 | 22:28:12.349 [main] INFO org.wikidata.query.rdf.tool.Updater - Sleeping for 10 secs
    wdqs-updater_1 | 22:28:22.401 [main] INFO o.w.q.r.t.change.RecentChangesPoller - Got 1 changes, from Q1@2@20210315222813|2 to Q1@2@20210315222813|2
    wdqs-updater_1 | 22:28:23.011 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized subjects: [http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1] while processing http://wikibase.svc/entity/Q1. Expected only sitelinks and subjects starting with http://wikibase.svc/wiki/Special:EntityData/ and [http://wikibase.svc/entity/]
    wdqs-updater_1 | 22:28:23.017 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://www.w3.org/1999/02/22-rdf-syntax-ns#type o:http://schema.org/Dataset
    wdqs-updater_1 | 22:28:23.018 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://schema.org/about o:http://wikibase.svc/entity/Q1
    wdqs-updater_1 | 22:28:23.018 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://creativecommons.org/ns#license o:http://creativecommons.org/publicdomain/zero/1.0/
    wdqs-updater_1 | 22:28:23.018 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://schema.org/softwareVersion o:"1.0.0"^^<http://www.w3.org/2001/XMLSchema#string>
    wdqs-updater_1 | 22:28:23.019 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://schema.org/version o:"2"^^<http://www.w3.org/2001/XMLSchema#integer>
    wdqs-updater_1 | 22:28:23.019 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://schema.org/dateModified o:"2021-03-15T22:28:13Z"^^<http://www.w3.org/2001/XMLSchema#dateTime>
    wdqs-updater_1 | 22:28:23.019 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://wikiba.se/ontology#statements o:"0"^^<http://www.w3.org/2001/XMLSchema#integer>
    wdqs-updater_1 | 22:28:23.019 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://wikiba.se/ontology#identifiers o:"0"^^<http://www.w3.org/2001/XMLSchema#integer>
    wdqs-updater_1 | 22:28:23.019 [update 0] INFO o.wikidata.query.rdf.tool.rdf.Munger - Unrecognized statement: s:http://wikibase.svc/wiki/%ED%8A%B9%EC%88%98:EntityData/Q1 p:http://wikiba.se/ontology#sitelinks o:"0"^^<http://www.w3.org/2001/XMLSchema#integer>
    wdqs-updater_1 | 22:28:23.511 [main] INFO org.wikidata.query.rdf.tool.Updater - Polled up to 2021-03-15T22:28:13Z at (0.0, 0.0, 0.0) updates per second and (0.0, 0.0, 0.0) milliseconds per second
    wdqs-updater_1 | 22:28:23.538 [main] INFO o.w.q.r.t.change.RecentChangesPoller - Got no real changes
    wdqs-updater_1 | 22:28:23.538 [main] INFO org.wikidata.query.rdf.tool.Updater - Sleeping for 10 secs

Steps I followed:

1. Pulled the above-mentioned repository from GitHub
2. Installed Docker & docker-compose
3. sudo docker-compose up

This leads to F34162460: image.png <https://phabricator.wikimedia.org/F34162460> and no updates on the WDQS.
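[Editor's note] The rejected subject prefix in the log above is simply the percent-encoded, localized Special: namespace, which can be verified offline. The encoded string is taken from the log; decoding it shows why the updater's check for subjects starting with "Special:EntityData" fails on a Korean-language wiki:

```python
from urllib.parse import unquote

# The subject namespace the munger rejects, taken verbatim from the log above.
encoded = "%ED%8A%B9%EC%88%98:EntityData/Q1"
print(unquote(encoded))  # → 특수:EntityData/Q1  ("특수" is Korean for "Special")
```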
TASK DETAIL https://phabricator.wikimedia.org/T277108
[Wikidata-bugs] [Maniphest] T271065: create external triplestore for EntitySchema
Andrawaag added subscribers: Jelabra, Andrawaag. Andrawaag added a comment. Yes, batch parsing of EntitySchemas is still difficult. There are, however, some tricks one can use to avoid having to parse the HTML. There is the option under SpecialPages <https://www.wikidata.org/wiki/Special:SpecialPages>: EntitySchemaText <https://www.wikidata.org/wiki/Special:EntitySchemaText>. You'll need to add an EntitySchema number to get the EntitySchema in ShExC (e.g. E42 <https://www.wikidata.org/wiki/Special:EntitySchemaText/E42>). Subsequently, JSON renderings of the ShExC can be obtained with parsers like shex-to-json <https://github.com/shexSpec/shex.js/blob/master/packages/shex-cli/bin/shex-to-json>. There is also wikishape <http://wikishape.weso.es/>, which brings some GUI to the process. I am not sure how to get JSON there, though; maybe @Jelabra knows? TASK DETAIL https://phabricator.wikimedia.org/T271065
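The EntitySchemaText trick above can be sketched as a tiny batch fetcher that never touches HTML. This is a hypothetical helper using only the standard library; the URL pattern follows the Special:EntitySchemaText links in the comment:

```python
import urllib.request


def entity_schema_text_url(schema_id: str) -> str:
    """Build the Special:EntitySchemaText URL that serves the raw ShExC."""
    return f"https://www.wikidata.org/wiki/Special:EntitySchemaText/{schema_id}"


def fetch_schema(schema_id: str) -> str:
    """Download the ShExC text for one EntitySchema, e.g. 'E42'."""
    with urllib.request.urlopen(entity_schema_text_url(schema_id)) as resp:
        return resp.read().decode("utf-8")


# Batch usage (network required):
#   for sid in ["E37", "E42"]:
#       print(sid, fetch_schema(sid)[:60])
```

The ShExC strings obtained this way can then be handed to a converter such as shex-to-json to get a JSON rendering.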
[Wikidata-bugs] [Maniphest] T205080: Make a canonical example of a painting of Woman with a hat Q1437492
Andrawaag closed this task as "Declined". TASK DETAIL https://phabricator.wikimedia.org/T205080
[Wikidata-bugs] [Maniphest] T205081: Extract a machine readable schema description from a canonical example
Andrawaag closed this task as "Resolved". Andrawaag added a comment. Various tools now exist to semi-automatically extract a schema from Wikidata, such as sheXer <https://github.com/DaniFdezAlvarez/shexer> and Shape Designer <https://gitlab.inria.fr/jdusart/shexjapp>. In this notebook <https://colab.research.google.com/drive/1qXXPl1v4fGcmyBe5rNWO3n3y_r_pE-xV?usp=sharing>, sheXer is used to extract the schema from a set of external identifiers. TASK DETAIL https://phabricator.wikimedia.org/T205081
[Wikidata-bugs] [Maniphest] T205081: Extract a machine readable schema description from a canonical example
Andrawaag added a comment. In T205081#4621716 <https://phabricator.wikimedia.org/T205081#4621716>, @Jan_Dittrich wrote: > I do not understand this ticket: What are concept types? A concept type is anything that we would like to extract from Wikidata, e.g. paintings, diseases, rivers, etc. TASK DETAIL https://phabricator.wikimedia.org/T205081
[Wikidata-bugs] [Maniphest] T267782: Incorrect use of blank node to represent unknown and no value in Wikidata
Andrawaag created this task. Andrawaag added a project: Wikidata. Restricted Application added a subscriber: Aklapper. TASK DESCRIPTION During the [[ https://github.com/elixir-europe/BioHackathon-projects-2020/tree/master/projects/35 | BioHackathon 2020 project ]], where we are working on subsetting Wikidata, we ran into the issue of blank nodes being used in the RDF of Wikidata to express unknown and no values. Unfortunately, this isn't consistent, because blank nodes are also used to express other things such as owl:complementOf (e.g. Q42). These blank nodes are also problematic for anything that traverses Wikidata node-by-node, such as faceted browsers or ShEx validators. It is not explicitly incorrect to have blank nodes in RDF data, but it is:
1. inconsistent with the approach that Wikidata has taken (which is to avoid blank nodes)
2. ambiguous, because in RDF, blank nodes do not imply unknown values; they are simply *unidentified* nodes in the graph.
Steps to Reproduce:
- GET http://www.wikidata.org/entity/Q313093.ttl
- look for "_:" (currently _:2d22892344b969be376b57170b5e495f)
- try a SPARQL query for all properties of that node: SELECT ?p ?o { _:2d22892344b969be376b57170b5e495f ?p ?o }
- because of the semantics of SPARQL, this will try to get every triple in the database.
Remedy: Invent a system-wide identifier for unknown values and use that Q identifier for all references to unknown values.
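The reproduction steps can be illustrated with a small sketch. The Turtle snippet below is shortened and hypothetical, modelled on the Q313093 dump; the point is that a blank-node label found this way is only meaningful inside the document it came from, so re-submitting it in a SPARQL pattern does not select that node (`_:` in a query behaves like a fresh variable and matches everything):

```python
import re

# Shortened, hypothetical Turtle modelled on http://www.wikidata.org/entity/Q313093.ttl
sample_ttl = """
wd:Q313093 p:P570 _:2d22892344b969be376b57170b5e495f .
wd:Q42 rdfs:label "Douglas Adams"@en .
"""


def blank_node_labels(ttl: str) -> list:
    """Collect the labels of blank nodes ("_:...") in a Turtle document."""
    return re.findall(r"_:([0-9A-Za-z]+)", ttl)


labels = blank_node_labels(sample_ttl)
print(labels)  # ['2d22892344b969be376b57170b5e495f']
```

Because the label is document-scoped, a downstream consumer has no stable identifier to follow, which is exactly the ambiguity the task description complains about.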
TASK DETAIL https://phabricator.wikimedia.org/T267782 WORKBOARD https://phabricator.wikimedia.org/project/board/71/
[Wikidata-bugs] [Maniphest] [Closed] T241876: W3C ShEx CG meeting in 2020
Andrawaag closed this task as "Resolved". Andrawaag claimed this task. TASK DETAIL https://phabricator.wikimedia.org/T241876
[Wikidata-bugs] [Maniphest] [Commented On] T245138: Have a "now is not a good time" flag on the Wikidata api
Andrawaag added a comment. Yes, I remember the 1000 suggestion, and we could certainly retry perpetually, but somehow that does not feel right. Every unsuccessful attempt is yet another request bothering the API. Would it not be better to simply stop after 25 attempts and wait until the API settles down? TASK DETAIL https://phabricator.wikimedia.org/T245138
[Wikidata-bugs] [Maniphest] [Created] T245138: Have a "now is not a good time" flag on the Wikidata api
Andrawaag created this task. Andrawaag added a project: Wikidata. Restricted Application added a subscriber: Aklapper. TASK DESCRIPTION When we run bots, we sometimes run into API requests to back off for a few seconds. Our bots respect that request and pause for the suggested number of seconds before resuming. However, the issue often persists, which leads to many repeated attempts. See for example this snippet from a recent run:

2020-02-11 22:29:03.462709: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:09.111319: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:14.735034: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:20.354678: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:25.998749: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:31.629414: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:37.255963: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:42.869407: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:48.503712: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:54.129682: maxlag. sleeping for 5.467 seconds
2020-02-11 22:29:59.730619: maxlag. sleeping for 5.467 seconds
2020-02-11 22:30:05.355369: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:11.371665: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:17.402557: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:23.387781: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:29.375381: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:35.377794: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:41.355254: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:47.319589: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:53.293700: maxlag. sleeping for 5.816 seconds
2020-02-11 22:30:59.300540: maxlag. sleeping for 5.816 seconds
2020-02-11 22:31:05.272313: maxlag. sleeping for 6.233 seconds
2020-02-11 22:31:11.697159: maxlag. sleeping for 6.233 seconds
2020-02-11 22:31:18.098883: maxlag. sleeping for 6.233 seconds
2020-02-11 22:31:24.493465: maxlag. sleeping for 6.233 seconds

We currently have a limit of 25 attempts, after which an exception is raised and the bot is terminated. I am a bit surprised that the requested backoff does not become more aggressive in incremental steps; it stays around 5-6 seconds. However, terminating based on this approach disrupts the overall workflow, so we are looking into different approaches. One would be to increase the maximum number of attempts from 25 to 100 or even 1000, or to exponentially increase the waiting time with each iteration. But I am wondering whether the API benefits from these repetitive attempts to connect. Another approach would be to not repeat at all, but to have some "now is not a good time" flag which our bots could consult before commencing a bot run. This way, we do not send unnecessary requests. Is this possible? TASK DETAIL https://phabricator.wikimedia.org/T245138 WORKBOARD https://phabricator.wikimedia.org/project/board/71/
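One of the alternatives mentioned above, exponentially increasing the waiting time per iteration, could look roughly like this. This is an illustrative sketch rather than the bot's actual code: `make_request` stands in for whatever performs the API call, and the error shape follows the MediaWiki maxlag error format (`{"error": {"code": "maxlag", ...}}`):

```python
import random
import time


def with_maxlag_backoff(make_request, max_attempts=25, base_delay=5.0,
                        cap=300.0, sleep=time.sleep):
    """Retry an API call on maxlag errors, doubling the wait each time
    instead of retrying at the server-suggested ~5 s forever."""
    for attempt in range(max_attempts):
        response = make_request()
        if response.get("error", {}).get("code") != "maxlag":
            return response  # success, or an unrelated error for the caller
        # Exponential backoff with jitter: ~5 s, ~10 s, ~20 s, ... capped at 5 min.
        delay = min(cap, base_delay * (2 ** attempt)) * random.uniform(0.8, 1.2)
        sleep(delay)
    raise RuntimeError(f"API still lagged after {max_attempts} attempts")
```

The jitter spreads out retries from many bots so they do not all hit the API at the same instant; the `sleep` parameter is injectable only to make the sketch testable.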
[Wikidata-bugs] [Maniphest] [Closed] T227079: Investigate Shape Expressions for GLAMs
Andrawaag closed this task as "Resolved". Andrawaag claimed this task. TASK DETAIL https://phabricator.wikimedia.org/T227079
[Wikidata-bugs] [Maniphest] [Created] T241876: W3C ShEx CG meeting in 2020
Andrawaag created this task. Andrawaag added a project: Shape Expressions. Restricted Application added a subscriber: Aklapper. Restricted Application added a project: Wikidata. TASK DESCRIPTION The schedule for the biweekly ShEx community group online meetings has been set. Everyone interested is cordially invited to attend. calendar <https://calendar.google.com/event?action=TEMPLATE=N2VyOGMyYjJnZTVma25qMWhlYWF2YmYycHFfMjAyMDAxMDhUMTMwMDAwWiBtaWNlbGlvLmJlX2FjM2xqNzNqdTA0YTY3OGIwaHRsMXBpamRvQGc=micelio.be_ac3lj73ju04a678b0htl1pijdo%40group.calendar.google.com=ALL> TASK DETAIL https://phabricator.wikimedia.org/T241876 WORKBOARD https://phabricator.wikimedia.org/project/board/3789/
[Wikidata-bugs] [Maniphest] [Commented On] T195817: How to model the continuity of a single project which is divided in reality in more projects "administratively".
Andrawaag added a comment. This issue will be reviewed in the ShEx CG meeting on January 8th: https://github.com/shexSpec/shex/blob/master/meetings/2020/20200108-agenda.md TASK DETAIL https://phabricator.wikimedia.org/T195817
[Wikidata-bugs] [Maniphest] [Commented On] T227079: Investigate Shape Expressions for GLAMs
Andrawaag added a comment. This issue will be reviewed in the ShEx CG meeting on January 8th: https://github.com/shexSpec/shex/blob/master/meetings/2020/20200108-agenda.md TASK DETAIL https://phabricator.wikimedia.org/T227079
[Wikidata-bugs] [Maniphest] [Created] T239326: The new checkShEx addition to Wikidata returns different results than the simple shex extension
Andrawaag created this task. Andrawaag added a project: Shape Expressions. Restricted Application added a subscriber: Aklapper. Restricted Application added a project: Wikidata. TASK DESCRIPTION First, this new ShEx extension is awesome!! However, when I try to run the human gene schema <https://www.wikidata.org/wiki/EntitySchema:E37> on the human gene RB1 <http://www.wikidata.org/entity/Q40108>, it returns a Fail. When running the test for conformance on Wikidata by following the link "check entities against this Schema", it approves. Steps to Reproduce: - Go to the [[ https://www.wikidata.org/wiki/EntitySchema:E37 | human gene entity schema ]] while the plugin is enabled. Add the Wikidata ID of an item on a human gene (e.g. Q40108). It will report a FAIL. When you press "check entities against this Schema" instead, you get directed to [[ https://tools.wmflabs.org/shex-simple/wikidata/packages/shex-webapp/doc/shex-simple.html?data=Endpoint:%20https://query.wikidata.org/sparql=[]=%2F%2Fwww.wikidata.org%2Fwiki%2FSpecial%3AEntitySchemaText%2FE37 | shex-simple ]]. There, the same Wikidata item conforms to E37 <https://phabricator.wikimedia.org/E37>. Actual Results: For shex-simple, it returns a green OK mark; for checkShEx, I see a red fail. Expected Results: Both either fail or succeed. TASK DETAIL https://phabricator.wikimedia.org/T239326 WORKBOARD https://phabricator.wikimedia.org/project/board/3789/
[Wikidata-bugs] [Maniphest] [Commented On] T192920: Where in the Wikibase ecosystem should ShEx manifests sit?
Andrawaag added a comment. I think it makes sense to close this issue. As @Addshore suggests, with the extension being available to any Wikibase, it is done. TASK DETAIL https://phabricator.wikimedia.org/T192920
[Wikidata-bugs] [Maniphest] [Closed] T192920: Where in the Wikibase ecosystem should ShEx manifests sit?
Andrawaag closed this task as "Resolved". Andrawaag claimed this task. TASK DETAIL https://phabricator.wikimedia.org/T192920
[Wikidata-bugs] [Maniphest] [Edited] T234631: Setting up a wikibase for WikidataCon2019
Andrawaag updated the task description. TASK DETAIL https://phabricator.wikimedia.org/T234631
[Wikidata-bugs] [Maniphest] [Created] T234631: Setting up a wikibase for WikidataCon2019
Andrawaag created this task. Andrawaag added a project: Wikibase-Containers. Restricted Application added a subscriber: Aklapper. Restricted Application added a project: Wikidata. TASK DESCRIPTION For WikidataCon we have created a Wikibase instance, available at Wikibase: http://185.54.115.247:8181/wiki/Main_Page WBQS: http://185.54.115.247:8282/ The Wikibase is still under development; suggestions and questions are welcome here. TASK DETAIL https://phabricator.wikimedia.org/T234631 WORKBOARD https://phabricator.wikimedia.org/project/board/3079/
[Wikidata-bugs] [Maniphest] [Commented On] T234116: How to get ttl and json pages
Andrawaag added a comment. This is enabled by changing localhost:8181/wiki/Qx -> localhost:8181/entity/Qx.ttl TASK DETAIL https://phabricator.wikimedia.org/T234116
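The rewrite described above can be captured in a tiny hypothetical helper; the host and port follow the Docker setup discussed in this task:

```python
def entity_data_url(page_url: str, fmt: str = "ttl") -> str:
    """Rewrite a Wikibase item page URL into its linked-data URL,
    e.g. http://localhost:8181/wiki/Q1 -> http://localhost:8181/entity/Q1.ttl"""
    base, qid = page_url.rsplit("/wiki/", 1)
    return f"{base}/entity/{qid}.{fmt}"


print(entity_data_url("http://localhost:8181/wiki/Q1"))          # http://localhost:8181/entity/Q1.ttl
print(entity_data_url("http://localhost:8181/wiki/Q1", "json"))  # http://localhost:8181/entity/Q1.json
```

The same pattern works for wikidata.org itself, where /wiki/Q42 maps to /entity/Q42.ttl and /entity/Q42.json.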
[Wikidata-bugs] [Maniphest] [Closed] T234116: How to get ttl and json pages
Andrawaag closed this task as "Resolved". Andrawaag claimed this task. TASK DETAIL https://phabricator.wikimedia.org/T234116
[Wikidata-bugs] [Maniphest] [Created] T234116: How to get ttl and json pages
Andrawaag created this task. Andrawaag added a project: Wikibase-Containers. Restricted Application added a subscriber: Aklapper. Restricted Application added a project: Wikidata. TASK DESCRIPTION When changing the URL of a Wikidata item to either .ttl or .json, I get the respective formats. E.g. for wd:Q42 I can get TTL <http://www.wikidata.org/entity/Q42.ttl> or JSON <http://www.wikidata.org/entity/Q42.json>. How is this done in a Wikibase installed through docker-compose? TASK DETAIL https://phabricator.wikimedia.org/T234116 WORKBOARD https://phabricator.wikimedia.org/project/board/3079/
[Wikidata-bugs] [Maniphest] [Closed] T231652: Discussion tab on a EntitySchema is not enabled
Andrawaag closed this task as "Resolved". Andrawaag claimed this task. Andrawaag added a comment. Yes! Solved, thank you. TASK DETAIL https://phabricator.wikimedia.org/T231652
[Wikidata-bugs] [Maniphest] [Commented On] T228561: Add a "List of EntitySchemas" to SpecialPages
Andrawaag added a comment. I would indeed welcome such a list. In the meantime, I am using the following script to get one; I don't know if that code can be integrated: https://colab.research.google.com/drive/1DWccb-3ZgwcjGjKKD0t669IDKvkc3QA9#scrollTo=rOWDHlV0KhYX TASK DETAIL https://phabricator.wikimedia.org/T228561
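A workaround like the notebook above can also be sketched directly against the MediaWiki API, by listing all pages in the EntitySchema namespace. This is a hypothetical sketch: the namespace ID 640 is an assumption about wikidata.org's configuration, not something confirmed in this task:

```python
import json
import urllib.parse
import urllib.request

API = "https://www.wikidata.org/w/api.php"
ENTITY_SCHEMA_NS = 640  # assumed namespace ID of EntitySchema pages on wikidata.org


def allpages_params(apcontinue=None):
    """Query parameters for one page of list=allpages in the EntitySchema namespace."""
    params = {"action": "query", "list": "allpages", "format": "json",
              "apnamespace": ENTITY_SCHEMA_NS, "aplimit": "500"}
    if apcontinue:
        params["apcontinue"] = apcontinue
    return params


def list_entity_schemas():
    """Yield titles like 'EntitySchema:E37', following API continuation."""
    cont = None
    while True:
        url = API + "?" + urllib.parse.urlencode(allpages_params(cont))
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        for page in data["query"]["allpages"]:
            yield page["title"]
        cont = data.get("continue", {}).get("apcontinue")
        if not cont:
            break


# Usage (network required):
#   print(list(list_entity_schemas()))
```

Unlike scraping Special:SpecialPages, this follows the API's standard continuation protocol, so it scales to any number of schemas.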
[Wikidata-bugs] [Maniphest] [Commented On] T227031: Update wikibase-docker configuration for wdqs-frontend
Andrawaag added a comment. @jjkoehorst's fix worked on my install: http://185.54.115.247:8282/#SELECT%20%2a%20WHERE%20%7B%0Awd%3AQ1%20%3Fp%20%3Fo%20.%0A%7D%20 TASK DETAIL https://phabricator.wikimedia.org/T227031
[Wikidata-bugs] [Maniphest] [Commented On] T227031: Update wikibase-docker configuration for wdqs-frontend
Andrawaag added a comment. Yes, I replicated this issue on a Wikibase I was installing for a 2-minute demo for the upcoming Wikibase workshop in Ghent. The included Wikibase query service is indeed returning results from Wikidata. TASK DETAIL https://phabricator.wikimedia.org/T227031
[Wikidata-bugs] [Maniphest] [Commented On] T192920: Where in the Wikibase ecosystem should ShEx manifests sit?
Andrawaag added a comment. @Addshore Good question. It makes sense to store those schemas in the ShEx extension; it would benefit federation and findability. However, the current extension uses the WDQS alone, so it would be difficult, if not impossible, to use a ShEx schema on a given Wikibase's data. TASK DETAIL https://phabricator.wikimedia.org/T192920
[Wikidata-bugs] [Maniphest] [Updated] T226238: Identical items can't be merged due to duplicate articles in ceb wikipedia
Andrawaag added a comment. In T226238#5273387 <https://phabricator.wikimedia.org/T226238#5273387>, @Jane023 wrote: > I am back and after checking, happy to report I was right. One is the town, and the other is the municipality. They are not the same and should not be merged. Can you elaborate a bit more on this? I am trying to understand these ceb-originated statements, but I don't speak ceb, and most of the ceb-originated statements are poorly referenced. If you look at the specific Wikidata page you argue should not be merged, it is hard to see what it actually is and why it should remain a Wikidata item on its own merit. Except for two statements (P31 <https://phabricator.wikimedia.org/P31> and P625 <https://phabricator.wikimedia.org/P625>), none of the statements has references. Based on both having P31 <https://phabricator.wikimedia.org/P31> -> city, it makes sense to merge them. TASK DETAIL https://phabricator.wikimedia.org/T226238
[Wikidata-bugs] [Maniphest] [Commented On] T226238: Identical items can't be merged due to duplicate articles in ceb wikipedia
Andrawaag added a comment. In T226238#5273312 <https://phabricator.wikimedia.org/T226238#5273312>, @Aklapper wrote: > @Andrawaag: What is "Merge Wizard"? Where to find / see Merge Wizard? Is this on wikidata.org ? Please include steps to reproduce and set a project tag, if possible, so someone else can find this task. Thanks! I added additional screenshots. I was assuming this is a default Wikidata feature, but I guess it's something I installed a long time ago. TASK DETAIL https://phabricator.wikimedia.org/T226238
[Wikidata-bugs] [Maniphest] [Edited] T226238: Identical items can't be merged due to duplicate articles in ceb wikipedia
Andrawaag updated the task description. TASK DETAIL https://phabricator.wikimedia.org/T226238
[Wikidata-bugs] [Maniphest] [Commented On] T225996: JSON results serializer in Wikidata Query Service generates an extra "datatype" field
Andrawaag added a comment. The Wikidata app in Cytoscape <http://apps.cytoscape.org/apps/wikidatascape> is also affected by this. F29609064: Screenshot 2019-06-21 at 10.46.41.png <https://phabricator.wikimedia.org/F29609064> TASK DETAIL https://phabricator.wikimedia.org/T225996
[Wikidata-bugs] [Maniphest] [Commented On] T221604: remove control menu in shex simple tool
Andrawaag added a comment. Did this also remove the result section? It seems it is now only possible to get a binary answer (True or False), but not the information needed to point to the actual location where the data does not fit the schema. Like here: https://colab.research.google.com/drive/1Y1Wv9TGVzEIe26vvXhQuxFUHXKqij-hk TASK DETAIL https://phabricator.wikimedia.org/T221604 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Rosalie_WMDE, Andrawaag Cc: Andrawaag, Aklapper, Lydia_Pintscher, darthmon_wmde, pdehaye, alaa_wmde, Michael, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, YULdigitalpreservation, LawExplorer, Salgo60, _jensen, rosalieper, Jonas, abian, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Updated] T223997: Add reserved IDs to EntitySchema
Andrawaag added a comment. In T223997#5201223 <https://phabricator.wikimedia.org/T223997#5201223>, @Esc3300 wrote: > Some not so random suggestions: > > - E55 <https://phabricator.wikimedia.org/E55> => scheme for European routes. Named for E55 <https://phabricator.wikimedia.org/E55>, a route that runs through Berlin I really like the suggestion for a schema for a highway. Is there a special story that makes E55 <https://phabricator.wikimedia.org/E55> an epic highway? E40 <https://phabricator.wikimedia.org/E40> might also be a nice one (https://www.wikidata.org/wiki/Q327162), because that will have to cover many multilingual features given it is an 8000 km highway starting in Calais and going all the way to Kazakhstan. > - E570 <https://phabricator.wikimedia.org/E570> => scheme for Q5 who died recently, with dob, dod, pob, pod, given name, occupation, manner of death. Named for Property:P570 <https://phabricator.wikimedia.org/P570> > - E734 <https://phabricator.wikimedia.org/E734> => basic scheme for family name items: with native label, writing system, Soundex, see also. Named for Property:P734 > - E735 <https://phabricator.wikimedia.org/E735> => basic scheme for given name items: with native label, writing system, see also. Named for Property:P735 <https://phabricator.wikimedia.org/P735> > - E3300 => basic scheme for Q5 with gender, occupation, given name, dob. Named for .. whatever > - E11424 => basic scheme for films with core properties <https://www.wikidata.org/wiki/Wikidata:WikiProject_Movies/Properties#Core_properties> (P31 <https://phabricator.wikimedia.org/P31>, P1476 <https://phabricator.wikimedia.org/P1476>, P577 <https://phabricator.wikimedia.org/P577>, P364 <https://phabricator.wikimedia.org/P364>, P136, P57 <https://phabricator.wikimedia.org/P57>, P161 <https://phabricator.wikimedia.org/P161>, P344 <https://phabricator.wikimedia.org/P344>, P38 <https://phabricator.wikimedia.org/P38>). Named for Item Q11424 Yes makes sense!! 
In T223997#5201248 <https://phabricator.wikimedia.org/T223997#5201248>, @Lea_Lacroix_WMDE wrote: > @Andrawaag What do you think? :) TASK DETAIL https://phabricator.wikimedia.org/T223997 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Michael, Andrawaag Cc: Ladsgroup, Salgo60, Andrawaag, Esc3300, Aklapper, Lea_Lacroix_WMDE, Lydia_Pintscher, Michael, darthmon_wmde, pdehaye, alaa_wmde, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, YULdigitalpreservation, LawExplorer, _jensen, rosalieper, Jonas, abian, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T197649: Add Scribunto extension to bundle
Andrawaag added a comment. Using Wikibase as a documentation server for federated queries requires the SPARQL2 template, which builds on Scribunto. TASK DETAIL https://phabricator.wikimedia.org/T197649 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Andrawaag, dbs, Aklapper, despens, Tarrow, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, Gstupp, LawExplorer, Abbe98, D3r1ck01, Wikidata-bugs, aude, Addshore, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Created] T209350: Find out how to add new SPARQL endpoints to the Whitelist on a Wikibase-container and document this
Andrawaag created this task. Andrawaag added a project: Wikibase-Containers. Restricted Application added a subscriber: Aklapper. Restricted Application added a project: Wikidata. TASK DESCRIPTION Using Wikibase as a documentation system for federated queries requires adding new SPARQL endpoints to the whitelist. We need to figure out how to do this and document it. TASK DETAIL https://phabricator.wikimedia.org/T209350 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: rajireturn, Andrawaag, Aklapper, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, Gstupp, LawExplorer, Abbe98, D3r1ck01, Wikidata-bugs, aude, Addshore, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
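A minimal sketch of what the documentation could cover, assuming the query service reads its federation whitelist from a plain-text file with one endpoint URL per line (the file name and container path below are assumptions, not verified against the wikibase-docker image):

```python
# Assumed format: one allowed SPARQL endpoint per line.
endpoints = [
    "https://query.wikidata.org/sparql",
    "https://sparql.example.org/sparql",  # hypothetical endpoint to add
]

with open("whitelist.txt", "w") as f:
    f.write("\n".join(endpoints) + "\n")

# Then copy the file into the running query-service container, e.g.:
#   docker cp whitelist.txt wikibasedocker_wdqs_1:/wdqs/whitelist.txt  (assumed path)
# and restart the service so it picks the file up.
```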
[Wikidata-bugs] [Maniphest] [Created] T209349: Importing the SPARQL2 template into a Wikibase instance from Wikidata
Andrawaag created this task. Andrawaag added a project: Wikibase-Containers. Restricted Application added a subscriber: Aklapper. Restricted Application added a project: Wikidata. TASK DESCRIPTION I am trying to use a custom Wikibase instance as a documentation system for federated queries. For this to work, the SPARQL2 template from Wikidata is needed; however, this template builds on many downstream templates. I started creating similar templates in the custom Wikibase, but it feels like going down a rabbit hole. Is it possible to fetch the SPARQL2 template from Wikidata together with all underlying templates in one step? TASK DETAIL https://phabricator.wikimedia.org/T209349 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Aklapper, Tarrow, Addshore, Andrawaag, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, Gstupp, LawExplorer, Abbe98, D3r1ck01, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
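One possible route, assuming MediaWiki's Special:Export behaves as I remember: it can bundle a page together with the templates it transcludes (the "templates" parameter, i.e. the "Include templates" checkbox), and the resulting XML can be loaded on the custom wiki via Special:Import. A small sketch building such an export URL:

```python
# Build a Special:Export URL that also includes transcluded templates.
from urllib.parse import urlencode

params = {"pages": "Template:SPARQL2", "templates": "1"}
url = "https://www.wikidata.org/wiki/Special:Export?" + urlencode(params)
print(url)
# Download the XML from this URL, then import it on the target wiki via
# Special:Import. Deeply nested template/module dependencies may still
# need further export rounds.
```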
[Wikidata-bugs] [Maniphest] [Commented On] T200153: Add translations to disease terms in all South African languages
Andrawaag added a comment. Compare translations from Wikipedia versus those in Wikidata: http://tinyurl.com/ycrfjx3v TASK DETAIL https://phabricator.wikimedia.org/T200153 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Magnus, Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, Jayprakash12345, QZanden, LawExplorer, Salgo60, Wikidata-bugs, aude, Jdforrester-WMF, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Updated] T183020: Investigate the possibility to release Wikidata queries
Andrawaag added subscribers: I9606, Esc3300, JAllemandou, mpopov, mforns, PokestarFan, Nuria, Lydia_Pintscher, debt, Jonas, AndrewSu.Andrawaag merged a task: T143819: Data request for logs from SparQL interface at query.wikidata.org. TASK DETAILhttps://phabricator.wikimedia.org/T183020EMAIL PREFERENCEShttps://phabricator.wikimedia.org/settings/panel/emailpreferences/To: AndrawaagCc: AndrewSu, Jonas, debt, Lydia_Pintscher, Nuria, PokestarFan, mforns, mpopov, JAllemandou, Esc3300, I9606, Gstupp, Andrawaag, EBjune, mkroetzsch, Smalyshev, DarTar, leila, Aklapper, Lahi, Gq86, NoohNaeem, GoranSMilovanovic, QZanden, LawExplorer, Guy13949413, Avner, mys_721tx, Wikidata-bugs, aude, Capt_Swing, Mbch331, Krenair___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Merged] T143819: Data request for logs from SparQL interface at query.wikidata.org
Andrawaag closed this task as a duplicate of T183020: Investigate the possibility to release Wikidata queries. TASK DETAILhttps://phabricator.wikimedia.org/T143819EMAIL PREFERENCEShttps://phabricator.wikimedia.org/settings/panel/emailpreferences/To: AndrawaagCc: Andrawaag, Esc3300, JAllemandou, mpopov, mforns, PokestarFan, Nuria, Lydia_Pintscher, mkroetzsch, leila, debt, Jonas, Smalyshev, AndrewSu, Aklapper, I9606, Lahi, Gq86, Darkminds3113, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, EBjune, merbst, LawExplorer, Avner, Gehel, FloNight, Xmlizer, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, Mbch331, jeremyb___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Created] T200153: Add translations to disease terms in all South African languages
Andrawaag created this task. Andrawaag added projects: Wikimania-Hackathon-2018, Wikidata. Herald added a subscriber: Aklapper. TASK DESCRIPTION The following query returns disease terms captured in Wikidata and their translations into 10 of the 11 official languages of South Africa [1]. Currently, there is only one term () which has a label in all South African languages. This is a knowledge gap we can reduce through Wikidata. First steps have been taken and have already led to more Xhosa and Zulu disease terms. Here I reached out to Wikimedia ZA, who collected an initial list of translations [2]. Adding new terms is possible following these steps [3]. [1] http://tinyurl.com/ycne27j6 [2] https://etherpad.wikimedia.org/p/wm_za [3] https://twitter.com/andrawaag/status/1020681344517656577 TASK DETAIL https://phabricator.wikimedia.org/T200153 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Andrawaag, Aklapper, Lahi, Gq86, samuelguebo, GoranSMilovanovic, Jayprakash12345, QZanden, LawExplorer, Salgo60, Wikidata-bugs, aude, Jdforrester-WMF, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
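An illustrative query in the same spirit (not the exact query behind the tinyurl link; P31/Q12136 are the standard Wikidata "instance of"/"disease" identifiers): list disease items that still lack a Zulu (zu) label.

```python
# Illustrative SPARQL: diseases on Wikidata with no Zulu label yet.
query = """
SELECT ?disease WHERE {
  ?disease wdt:P31 wd:Q12136 .
  FILTER NOT EXISTS { ?disease rdfs:label ?label . FILTER(LANG(?label) = "zu") }
}
LIMIT 10
""".strip()
print(query)
# Run it against https://query.wikidata.org/sparql (with
# Accept: application/sparql-results+json) to list part of the gap.
```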
[Wikidata-bugs] [Maniphest] [Created] T200149: Adding labels and description in South African language requires a babel template on your user page
Andrawaag created this task. Andrawaag added projects: Wikimania-Hackathon-2018, Wikidata, MediaWiki-extensions-Babel. Herald added a subscriber: Aklapper. TASK DESCRIPTION While at Wikimania I tried to add labels to Wikidata items in one of the South African languages [1]. At first, it was not possible to add new terms to Wikidata in a South African language. A quick fix is to add the languages to a Babel template on your user page [2]. This seems like an unnecessary step [3]. Can we remove this requirement? [1] https://twitter.com/andrawaag/status/1020681344517656577 [2] https://twitter.com/andrawaag/status/1020681346291847175 [3] https://twitter.com/pipeweed/status/1020682606403379206 TASK DETAIL https://phabricator.wikimedia.org/T200149 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Andrawaag, Aklapper, Lahi, Gq86, samuelguebo, GoranSMilovanovic, TheDragonFire, Jayprakash12345, QZanden, LawExplorer, Salgo60, Iniquity, Wikidata-bugs, aude, SPQRobin, Arrbee, KartikMistry, Jdforrester-WMF, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Raised Priority] T186161: What to do if the WDQS does not synchronise with Wikibase?
Andrawaag raised the priority of this task from "Normal" to "High". Andrawaag added a comment. @Addshore This is the issue I mentioned in Antwerp, which I could then not replicate. TASK DETAIL https://phabricator.wikimedia.org/T186161 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Considering.Different.Routes, DarTar, Addshore, Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, QZanden, Gstupp, LawExplorer, Abbe98, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Updated] T193720: Provide a web form to allow wikibase docker users to setup a docker file by answering simple question without having node / others.
Andrawaag added a project: Federated-Wikibase-Workshops@Berlin-workshop. TASK DETAILhttps://phabricator.wikimedia.org/T193720EMAIL PREFERENCEShttps://phabricator.wikimedia.org/settings/panel/emailpreferences/To: Matthias_Geisler_WMDE, AndrawaagCc: Tarrow, Nunomn, Andrawaag, Jonas, Matthias_Geisler_WMDE, Aklapper, Daniel_Mietchen, RazShuty, LJ, Lahi, Gq86, SandraF_WMF, GoranSMilovanovic, QZanden, Gstupp, LawExplorer, Salgo60, Abbe98, Wikidata-bugs, aude, Lydia_Pintscher, Addshore, Mbch331___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T193720: Provide a web form to allow wikibase docker users to setup a docker file by answering simple question without having node / others.
Andrawaag added a comment. Can we include setting the logos in both Wikibase and the WDQS? I currently use the following steps to do this.

Change logo at WDQS:
sudo docker cp wd.svg ubuntu_wdqs-frontend_1:/wd.svg
sudo docker exec -it ubuntu_wdqs-frontend_1 sh
cd /usr/share/nginx/html
mv /wd.svg logo.svg

Change logo at Wikibase:
sudo docker exec -it ubuntu_wikibase_1 bash
apt-get update
apt-get install vim

TASK DETAIL https://phabricator.wikimedia.org/T193720 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Matthias_Geisler_WMDE, Andrawaag Cc: Tarrow, Nunomn, Andrawaag, Jonas, Matthias_Geisler_WMDE, Aklapper, Daniel_Mietchen, RazShuty, LJ, Lahi, Gq86, SandraF_WMF, GoranSMilovanovic, QZanden, Gstupp, LawExplorer, Salgo60, Abbe98, Wikidata-bugs, aude, Lydia_Pintscher, Addshore, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T193720: Provide a web form to allow wikibase docker users to setup a docker file by answering simple question without having node / others.
Andrawaag added a comment.
- create an interface template for OpenStack so the docker-compose spins up a Wikibase on OpenStack
- create an interface template for DigitalOcean so the docker-compose spins up a Wikibase on DigitalOcean
- create an interface template for Google Cloud so the docker-compose spins up a Wikibase on Google Cloud
TASK DETAIL https://phabricator.wikimedia.org/T193720 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: RazShuty, Andrawaag Cc: Andrawaag, Jonas, Matthias_Geisler_WMDE, Aklapper, Daniel_Mietchen, RazShuty, LJ, Lahi, Gq86, SandraF_WMF, GoranSMilovanovic, QZanden, LawExplorer, Abbe98, Wikidata-bugs, aude, Lydia_Pintscher, Addshore, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Closed] T192598: How to annotate NCI Thesaurus ID (P1748) as datatype identifier
Andrawaag closed this task as "Resolved". Andrawaag claimed this task. Andrawaag added a comment. Moved to project chat to reach consensus. TASK DETAIL https://phabricator.wikimedia.org/T192598 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Addshore, Mbch331, Lydia_Pintscher, Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, Wikidata-bugs, aude ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T192598: How to annotate NCI Thesaurus ID (P1748) as datatype identifier
Andrawaag added a comment. I started a discussion on project chat to reach that consensus (or not): https://www.wikidata.org/wiki/Wikidata:Project_chat#Changing_NCI_Thesaurus_ID_(P1748)_to_data_type_%22External_identifier%22 TASK DETAIL https://phabricator.wikimedia.org/T192598 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Addshore, Mbch331, Lydia_Pintscher, Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, Wikidata-bugs, aude ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T192598: How to annotate NCI Thesaurus ID (P1748) as datatype identifier
Andrawaag added a comment. How is community consensus reached here? There are more examples of identifiers that didn't make it into the identifier datatype in the first round, and I am curious to learn how to proceed in future cases. Here are, for example, properties with the term "ID" in their name: http://tinyurl.com/y836akc8, suggesting that those are identifiers. I am not in favour of deprecating the old property and creating a new one. This would lead to new property numbers, and now that properties are used in internal and external SPARQL queries and other applications (e.g. federated queries) rely on them, such a step has the potential to break things. Deprecating existing property numbers should not be taken lightly (if at all) given the current mature state of Wikidata. TASK DETAIL https://phabricator.wikimedia.org/T192598 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Mbch331, Lydia_Pintscher, Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, Wikidata-bugs, aude ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Created] T192598: How to annotate NCI Thesaurus ID (P1748) as datatype identifier
Andrawaag created this task. Andrawaag added a project: Wikidata. Herald added a subscriber: Aklapper. TASK DESCRIPTION The property NCI Thesaurus ID (https://www.wikidata.org/wiki/Property:P1748) is currently not of datatype identifier. However, as its name suggests, it is an identifier. Is it possible to change its datatype to External identifier? TASK DETAIL https://phabricator.wikimedia.org/T192598 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T186161: What to do if the WDQS does not synchronise with Wikibase?
Andrawaag added a comment. I actually don't know when things went wrong. I launched all containers using docker-compose. Upon initiation, it seemed to work. I then launched a bot to populate the Wikibase instance, which ran overnight to around 180k items. When I then tried to run some SPARQL queries in the WDQS, the WDQS gave an nginx error, which I solved by relaunching all containers. I restarted everything with the following commands:

ubuntu@wikibase:~/wikibase-docker$ sudo /usr/local/bin/docker-compose down
Stopping wikibasedocker_wdqs-updater_1 ... done
Stopping wikibasedocker_wdqs-frontend_1 ... done
Stopping wikibasedocker_wikibase_1 ... done
Stopping wikibasedocker_wdqs-proxy_1 ... done
Stopping wikibasedocker_mysql_1 ... done
Stopping wikibasedocker_wdqs_1 ... done
Removing wikibasedocker_wdqs-frontend_1 ... done
Removing wikibasedocker_wikibase_1 ... done
Removing wikibasedocker_wdqs-proxy_1 ... done
Removing wikibasedocker_mysql_1 ... done
Removing wikibasedocker_wdqs_1 ... done

To be sure, a reboot, followed by:

ubuntu@wikibase:~/wikibase-docker$ sudo /usr/local/bin/docker-compose up --no-build -d
wikibasedocker_wdqs_1 is up-to-date
wikibasedocker_mysql_1 is up-to-date
wikibasedocker_wdqs-proxy_1 is up-to-date
wikibasedocker_wikibase_1 is up-to-date
wikibasedocker_wdqs-frontend_1 is up-to-date
wikibasedocker_wdqs-updater_1 is up-to-date

I waited an evening for the WDQS to pick up. Unfortunately, still no updates. Am I missing a specific command? TASK DETAIL https://phabricator.wikimedia.org/T186161 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Addshore, Andrawaag, Aklapper, Abbe98, Wikidata-bugs ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
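A small helper for quantifying the backlog described above (a sketch, not part of wikibase-docker; the idea of comparing the wiki's item count against the query service's item count, and where those counts would come from, are assumptions):

```python
# Hypothetical drift check between a Wikibase wiki and its query service.
def drift(wiki_items: int, wdqs_items: int) -> int:
    """How many items the query service is missing relative to the wiki."""
    return max(wiki_items - wdqs_items, 0)

# The counts could come from, e.g., action=query&meta=siteinfo&siprop=statistics
# on the wiki's API and a COUNT SPARQL query on the query service (assumed).
print(drift(176113, 0))  # the situation in this task: a full backlog
```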
[Wikidata-bugs] [Maniphest] [Created] T186161: What to do if the WDQS does not synchronise with Wikibase?
Andrawaag created this task. Andrawaag added a project: Wikibase-Containers. Herald added a subscriber: Aklapper. TASK DESCRIPTION I am running a wikibase-container at http://185.54.113.154:8181/wiki/Main_Page. I have populated this Wikibase instance with 176,113 items using a bot written with the Wikidata Integrator. However, after completion the WDQS remains empty. During the bot import the WDQS crashed and I had to restart the docker image. Can the WDQS be synchronized afterwards? TASK DETAIL https://phabricator.wikimedia.org/T186161 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Andrawaag, Aklapper, Abbe98, Wikidata-bugs, Addshore ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Closed] T184908: Pulling and building of the image leads to error messages
Andrawaag closed this task as "Resolved". Andrawaag added a comment. Yes, works smoothly. TASK DETAIL https://phabricator.wikimedia.org/T184908 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Addshore, Andrawaag Cc: Abbe98, Lydia_Pintscher, Addshore, Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T184908: Pulling and building of the image leads to error messages
Andrawaag added a comment. Okay, I will look at the new repo. I do like the setup script that came with the first image; it allowed very quick launching of an "empty Wikidata environment". I will see if I can do the same with this new repo. TASK DETAIL https://phabricator.wikimedia.org/T184908 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Addshore, Andrawaag Cc: Addshore, Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T184908: Pulling and building of the image leads to error messages
Andrawaag added a comment. Following the instructions on https://github.com/wmde/wikibase-docker/blob/master/README-compose.md:

git clone https://github.com/addshore/wikibase-docker.git
docker-compose pull
docker-compose up --no-build -d

Last time I did this (approx. 3 weeks ago), this was sufficient to launch a new Wikibase instance. TASK DETAIL https://phabricator.wikimedia.org/T184908 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Addshore, Andrawaag Cc: Addshore, Andrawaag, Aklapper, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Created] T184908: Pulling and building of the image leads to error messages
Andrawaag created this task. Andrawaag added a project: Wikibase-Containers. Herald added a subscriber: Aklapper. TASK DESCRIPTION When pulling the docker image on a new instance (http://185.54.114.71:8181/) the following error message is shown:

Warning: Class 'DataValues\Geo\Values\LatLongValue' not found in /var/www/html/extensions/Wikibase/vendor/data-values/geo/Geo.php on line 21
Warning: Class 'DataValues\Geo\Values\GlobeCoordinateValue' not found in /var/www/html/extensions/Wikibase/vendor/data-values/geo/Geo.php on line 22
Warning: Class 'DataValues\Geo\Formatters\LatLongFormatter' not found in /var/www/html/extensions/Wikibase/vendor/data-values/geo/Geo.php on line 27
Warning: Class 'DataValues\Geo\Parsers\LatLongParser' not found in /var/www/html/extensions/Wikibase/vendor/data-values/geo/Geo.php on line 31
Warning: Class 'Diff\Differ\CallbackListDiffer' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 14
Warning: Class 'Diff\Differ\Differ' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 15
Warning: Class 'Diff\Differ\ListDiffer' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 16
Warning: Class 'Diff\Differ\MapDiffer' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 17
Warning: Class 'Diff\Differ\OrderedListDiffer' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 18
Warning: Class 'Diff\Patcher\ListPatcher' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 20
Warning: Class 'Diff\Patcher\MapPatcher' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 21
Warning: Class 'Diff\Patcher\Patcher' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 22
Warning: Class 'Diff\Patcher\PatcherException' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 23
Warning: Class 'Diff\Patcher\PreviewablePatcher' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 24
Warning: Class 'Diff\Patcher\ThrowingPatcher' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 25
Warning: Class 'Diff\DiffOp\Diff\Diff' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 27
Warning: Class 'Diff\DiffOp\Diff\ListDiff' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 28
Warning: Class 'Diff\DiffOp\Diff\MapDiff' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 29
Warning: Class 'Diff\DiffOp\AtomicDiffOp' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 31
Warning: Class 'Diff\DiffOp\DiffOp' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 32
Warning: Class 'Diff\DiffOp\DiffOpAdd' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 33
Warning: Class 'Diff\DiffOp\DiffOpChange' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 34
Warning: Class 'Diff\DiffOp\DiffOpRemove' not found in /var/www/html/extensions/Wikibase/vendor/diff/diff/Diff.php on line 35

Fatal error: Uncaught Error: Call to a member function getCode() on null in /var/www/html/includes/user/User.php:1594
Stack trace:
#0 /var/www/html/includes/user/User.php(5272): User::getDefaultOptions()
#1 /var/www/html/includes/user/User.php(2884): User->loadOptions()
#2 /var/www/html/includes/context/RequestContext.php(364): User->getOption('language')
#3 /var/www/html/includes/Message.php(380): RequestContext->getLanguage()
#4 /var/www/html/includes/Message.php(1275): Message->getLanguage()
#5 /var/www/html/includes/Message.php(842): Message->fetchMessage()
#6 /var/www/html/includes/Message.php(934): Message->toString('text')
#7 /var/www/html/includes/exception/MWExceptionRenderer.php(200): Message->text()
#8 /var/www/html/includes/exception/MWExceptionRenderer.php(138): MWExceptionRenderer::msg('internalerror', 'Internal error')
#9 /var/www/html/includes/exception/MWExceptionRenderer.php(54): MWExceptionRenderer::reportHTML(Object(Error))
#10 /var/www/html/includes/exception/MWExceptionHandler.php(75): MWExceptionRen in /var/www/html/includes/user/User.php on line 1594

TASK DETAIL https://phabricator.wikimedia.org/T184908 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Andrawaag, Aklapper, Wikidata-bugs, Addshore ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Closed] T184274: Using the API in the wikibase container with or without https.
Andrawaag closed this task as "Resolved". Andrawaag claimed this task. TASK DETAIL https://phabricator.wikimedia.org/T184274 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Addshore, Aklapper, Andrawaag, Wikidata-bugs ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T184274: Using the API in the wikibase container with or without https.
Andrawaag added a comment. I have two instances running: one locally on my MacBook, the other at http://185.54.115.189:8181/. The Varnish message came from the localhost instance on my MacBook. In both cases, the docker images worked "out of the box". However, I can't find a log file :( I dived into the bot code and was able to disable the HTTPS feature. By doing so the bot was able to write to http://185.54.115.189:8181, so it indeed seemed to be an issue with the bot. Here is the code used: https://gist.github.com/andrawaag/737ece3f4f7d3e63715a0ecd7f9fb725 http://185.54.115.189:8181 is a sandbox, so feel free to test drive. TASK DETAIL https://phabricator.wikimedia.org/T184274 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Addshore, Aklapper, Andrawaag, Wikidata-bugs ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
[Wikidata-bugs] [Maniphest] [Commented On] T184274: Using the API in the wikibase container with or without https.
Andrawaag added a comment. When the docker image is running on a machine without an IPv6 address, I am able to populate that instance with data through its API (http://127.0.0.1:8181/w/api.php). When the image is running on a computer connected to a LAN that supports IPv6, or when the docker image is running on a cloud image (e.g. Amazon's EC2), I am not able to run a bot against it. I then get the error message that HTTPS is required: "Error: 403, Insecure Request Forbidden - use HTTPS -" TASK DETAIL https://phabricator.wikimedia.org/T184274 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Addshore, Aklapper, Andrawaag, Wikidata-bugs ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
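One way to narrow this down, independent of the bot, is to issue a plain-http request to the instance's API by hand and see whether the 403 comes back. A sketch building such a request URL (the siteinfo call is a standard MediaWiki API query; the localhost address is the one mentioned above):

```python
# Build a plain-http MediaWiki API request to test the 403 behaviour
# without the bot in the way. (The request itself is left as a comment;
# it needs a running instance.)
from urllib.parse import urlencode

api = "http://127.0.0.1:8181/w/api.php"
params = {"action": "query", "meta": "siteinfo", "format": "json"}
request_url = api + "?" + urlencode(params)
print(request_url)
# curl -i "$request_url" would show whether the response is JSON or the
# "Insecure Request Forbidden - use HTTPS" 403.
```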
[Wikidata-bugs] [Maniphest] [Created] T184274: Using the API in the wikibase container with or without https.
Andrawaag created this task. Andrawaag added a project: Wikibase-Containers. Herald added a subscriber: Aklapper. TASK DESCRIPTION When trying to add content to a Wikibase container instance through the API, the request is blocked with the following message: "Request from 2xxx::zzz:::::aaa via cp3031 frontend, Varnish XID 881824315 Upstream caches: cp3031 int Error: 403, Insecure Request Forbidden - use HTTPS -" I have tried to install OpenSSL on the machine, but failed to link it to the docker image being run. Is it possible to disable this HTTPS requirement, or how can I run the docker image so that it accepts HTTPS requests when launched on localhost? TASK DETAIL https://phabricator.wikimedia.org/T184274 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: Andrawaag Cc: Aklapper, Andrawaag, Wikidata-bugs, Addshore ___ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs