ArthurPSmith added a subtask: T360224: Improve Wikidata handling of duplicate
references in model and UI.
TASK DETAIL
https://phabricator.wikimedia.org/T78688
EMAIL PREFERENCES
https://phabricator.wikimedia.org/settings/panel/emailpreferences/
To: ArthurPSmith
Cc: Cirdan, Pintoch, Teslaton
ArthurPSmith added a subtask: T360224: Improve Wikidata handling of duplicate
references in model and UI.
TASK DETAIL
https://phabricator.wikimedia.org/T194305
To: ArthurPSmith
Cc: Aklapper, hoo
ArthurPSmith added a subtask: T360224: Improve Wikidata handling of duplicate
references in model and UI.
TASK DETAIL
https://phabricator.wikimedia.org/T224333
To: ArthurPSmith
Cc: Epidosis
ArthurPSmith added parent tasks: T78688: [Story] Detach source references from
Statements, T270375: Saving identical references with different retrieval dates
should be more difficult, T224333: It's possible to save a statement with
duplicate references, T194305: Track the number of (u
ArthurPSmith added a subtask: T360224: Improve Wikidata handling of duplicate
references in model and UI.
TASK DETAIL
https://phabricator.wikimedia.org/T270375
To: ArthurPSmith
Cc: Aklapper, Epidosis
ArthurPSmith created this task.
ArthurPSmith added projects: MediaWiki-extensions-WikibaseRepository, Wikidata,
MediaWiki-extensions-WikibaseClient.
Restricted Application added a subscriber: Aklapper.
TASK DESCRIPTION
**Feature summary** (what you would like to be able to do and where):
See
ArthurPSmith added a comment.
Ok, I got federation to work - sort of. From the main query service I can
query the scholarly subgraph - but if I try to use the resulting values I
always get a timeout.
select ?author WHERE {
SERVICE <https://query-schola
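The query above is cut off, but the federation pattern being described can be sketched as a plain query string built in Python. Everything here is illustrative: the subgraph endpoint is a placeholder (the real URL is truncated in the message), and P50 ("author") is used as the example property.

```python
# Sketch of a federated WDQS query: fetch article/author links from a
# scholarly subgraph endpoint, then resolve the author items in the
# main graph. The endpoint URL below is a hypothetical placeholder.
SCHOLARLY_ENDPOINT = "https://example.org/scholarly/sparql"

FEDERATED_QUERY = """
SELECT ?name WHERE {
  SERVICE <%s> {
    ?article wdt:P50 ?author .   # article data lives in the subgraph
  }
  ?author rdfs:label ?name .     # author items live in the main graph
}
""" % SCHOLARLY_ENDPOINT
```

Note that the values bound inside the SERVICE block have to be joined against the main graph afterwards, which is where the timeout described above can bite.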
ArthurPSmith added a comment.
Hi - how does the federation work? I'm experimenting with this by trying to
get the list of names of authors on a scholarly article - the article data
itself is in the scholarly article subgraph, but the human items for the
authors are in the main one.
ArthurPSmith added a comment.
It certainly would be good to get this fixed. However, I think this points up
a fundamental problem with some of the more complex data structures supported
by Wikidata (quantity ranges are a similar case, and probably some of the
lexeme structures as well
ArthurPSmith added a comment.
I see the merit of this idea at least for properties, but I'm wondering where
you envision the property discussion to take place? On the talk page? Would
that be preserved somehow (referring back to old proposal discussions is done
very often).
TASK D
ArthurPSmith added a comment.
Good points from @MisterSynergy and others above. One other case I often run
into is problems caused by item merges; if both original items had P279
statements this can cause significant trouble (for example i
ArthurPSmith added a comment.
Hmm - I agree with the above that P2860 should not be on this list. If we are
including the "partitive" properties like P361 and P527
ArthurPSmith added a comment.
Hmm, another strange case is search for L:Kelly - the 3 current matches are
for L404650, L361948 and L230178, none of which seem to have the string "kelly"
in them. So there's some sort of stemming going on here in addition to the case
insen
ArthurPSmith created this task.
ArthurPSmith added a project: Wikidata Lexicographical data.
Restricted Application added a project: Wikidata.
TASK DESCRIPTION
I know we want to keep the glosses short, but the box right now is too short
(at least when I use the Mac Safari desktop browser). I
ArthurPSmith created this task.
ArthurPSmith added a project: Wikidata Lexicographical data.
Restricted Application added a project: Wikidata.
TASK DESCRIPTION
Wikidata search for L:Anna matches many lexemes (due to their forms
containing "anna" or "Anna") but it would b
ArthurPSmith moved this task from Backlog to In progress on the
Wikidata-Lexicodays-2021 board.
ArthurPSmith added a comment.
Denny's posted this notebook:
https://public.paws.wmcloud.org/User:DVrandecic_(WMF)/Lexicographic%20coverage.ipynb
which does pretty much the above for the lan
ArthurPSmith added a comment.
Something seems to be going on very recently that's a different pattern - did
something change on the infrastructure side, or is there a change in usage
pattern for the last few hours? Basically maxlag (WDQS lag specifically) has
NOT gone below 5 (5 minute
ArthurPSmith added a comment.
Thanks for creating this! I'm not sure what the standard citation reference
for an external ID is, but what I've been using is:
- stated in (P248) the value of "subject item of this property" (P1629)
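The citation pattern described above can be sketched as the JSON reference structure the Wikibase API expects. This is a minimal sketch assuming the standard Wikibase snak JSON shape; the function name is invented, and the QID passed in would be whatever item the property's P1629 statement points to.

```python
def make_stated_in_reference(database_item_qid):
    """Build a reference whose "stated in" (P248) value is the database
    item linked from the external-id property via P1629.

    Sketch only: assumes the standard Wikibase reference JSON layout
    ("snaks" keyed by property id, plus "snaks-order").
    """
    return {
        "snaks": {
            "P248": [{
                "snaktype": "value",
                "property": "P248",
                "datavalue": {
                    "type": "wikibase-entityid",
                    "value": {"entity-type": "item", "id": database_item_qid},
                },
            }]
        },
        "snaks-order": ["P248"],
    }
```

A dict like this can be passed as one element of the `references` list when writing a claim through the API.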
ArthurPSmith closed subtask T160205: Add interstitial to
wikidata-externalid-url as "Declined".
TASK DETAIL
https://phabricator.wikimedia.org/T150939
To: ArthurPSmith
Cc: Ayack, Salgo60, Lew
ArthurPSmith closed this task as "Declined".
ArthurPSmith added a comment.
Wow, was that really almost 3 years ago. There doesn't seem to really be a
need for this, so I'm closing the request as declined.
TASK DETAIL
https://phabricator.wikimedia.org/T160205
EMAIL
ArthurPSmith added a comment.
Sorry I never got around to looking at this further. @DD063520 do you
understand the above comment from @thiemowmde about using the wbparsevalue api
rather than python internals?
TASK DETAIL
https://phabricator.wikimedia.org/T119226
ArthurPSmith added a comment.
@Bugreporter
> I think increase the factor will not make thing better, it only increase
the oscillating period
Yes that does seem to have happened - instead of a roughly 20 minute cycle,
we now have about a 1-hour cycle.
TASK DETAIL
ht
ArthurPSmith added a comment.
Possibly relevant comment here: I believe there is a plan also to move to
incremental updates (updating only the statements/triples that have changed) so
it is probably important that any parallelism in updating be coordinated so
that updates for the same item
ArthurPSmith added a comment.
In T243701#5855439 <https://phabricator.wikimedia.org/T243701#5855439>,
@ArielGlenn wrote:
> In T243701#5855352 <https://phabricator.wikimedia.org/T243701#5855352>,
@Lea_Lacroix_WMDE wrote:
>
>> Over the past weeks, we noticed a
ArthurPSmith added a comment.
@Addshore and others - the problem has deteriorated since Saturday - see this
discussion on Wikidata:
https://www.wikidata.org/wiki/Wikidata:Contact_the_development_team/Query_Service_and_search#WDQS_lag
TASK DETAIL
https://phabricator.wikimedia.org/T243701
ArthurPSmith added a comment.
In T221774#5815408 <https://phabricator.wikimedia.org/T221774#5815408>,
@Addshore wrote:
> [...]
> Note that this dashboard includes metrics for both pooled and depooled
servers.
> So whatever you read there will likely also be rep
ArthurPSmith added a comment.
@Bugreporter well something must have changed early today - was it previously
"mean" and is now "median"? I'm not sure which is better, but having WDQS hours
out of date (we're over 4 hours now) is NOT a good situation, and what th
ArthurPSmith added a comment.
Just saw this - I'm wondering technically how you would implement it? You
could generate a random number between 2.5 and 5, and if maxlag is greater than
your random number deny the edit?
TASK DETAIL
https://phabricator.wikimedia.org/T240442
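The probabilistic throttle suggested in that comment could be sketched as follows. The 2.5 and 5 second thresholds come straight from the comment above; the function name and shape are invented for illustration, not taken from any deployed MediaWiki code.

```python
import random

def should_defer_edit(maxlag_seconds, low=2.5, high=5.0, rng=random):
    """Probabilistic maxlag throttle (sketch).

    Draw a random threshold in [low, high]; deny (defer) the edit when
    the current maxlag exceeds it. Edits always go through below `low`,
    are always deferred above `high`, and are deferred with increasing
    probability in between.
    """
    threshold = rng.uniform(low, high)
    return maxlag_seconds > threshold
```

The effect is to smear the cutoff over a band instead of having every client stop and restart at exactly maxlag=5, which is one way to damp the oscillation discussed in the other tickets.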
ArthurPSmith added a comment.
Am I misreading this graph?
https://grafana.wikimedia.org/d/00489/wikidata-query-service?panelId=8&fullscreen&orgId=1&from=now-12h&to=now&refresh=10s
It looks like the query service lag for 3 of the servers has been growing
steadily fo
ArthurPSmith closed this task as "Resolved".
ArthurPSmith added a comment.
Marking as resolved...
TASK DETAIL
https://phabricator.wikimedia.org/T240371
To: ArthurPSmith
Cc: Bugreporter
ArthurPSmith closed subtask T240371: Maxlag=5 for Author Disambiguator as
"Resolved".
TASK DETAIL
https://phabricator.wikimedia.org/T240369
To: ArthurPSmith
Cc: Lydia_Pintscher, Framawiki, Sjo
ArthurPSmith added a comment.
I increased the default number of retries to 12, so it will now retry for up
to an hour. I think we're good here?
TASK DETAIL
https://phabricator.wikimedia.org/T240371
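The retry scheme described in these two comments (12 retries at a 5-minute delay, so up to an hour of waiting) can be sketched like this. `do_edit` is a hypothetical callable standing in for the tool's edit call; it is assumed to return False when the API rejects the edit for maxlag.

```python
import time

def edit_with_maxlag_retries(do_edit, retries=12, delay_seconds=300,
                             sleep=time.sleep):
    """Retry an edit that may be rejected due to maxlag (sketch).

    With the defaults from the comments above (12 retries, 300 s delay)
    an edit keeps retrying for up to an hour before finally failing.
    Returns True on success, False if every attempt was rejected.
    """
    for attempt in range(retries + 1):
        if do_edit():
            return True
        if attempt < retries:
            sleep(delay_seconds)
    return False
```

Injecting `sleep` as a parameter keeps the loop testable without actually waiting five minutes per attempt.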
ArthurPSmith added a comment.
(A) Pintoch's patch has been applied, and (B) I also increased the retry time
from 5 seconds to 5 minutes - that still means an edit will fail after 25
minutes if maxlag doesn't drop, with only 5 retries. Is there a consensus to
retry for an hour? Or
ArthurPSmith added a comment.
If you go to the search page and select "Lexeme" as the only namespace you
get the same error with "thanks" in the search box, but "thank" alone works
fine - the two lexemes that match are L3798 (verb) and L2846
ArthurPSmith added a comment.
The Basque collection is even more complete now!
I do think some customization may be needed for Lexemes due to the different
structure - the forms and senses etc. Perhaps the most useful link for a
wiktionary may be from words to senses to wikidata items via
ArthurPSmith added a comment.
I see the problem also (Safari browser). When you talk about it affecting
lexemes, where do you see that? I experimented with adding a form and that
seemed fine.
TASK DETAIL
https://phabricator.wikimedia.org/T229604
ArthurPSmith added a comment.
Can you add a test to the statement ID generation code that ensures it has an RDF-compatible format (except for the 1 character that's a problem now), and a note that this is required for RDF support?
TASK DETAIL
https://phabricator.wikimedia.org/T214680
ArthurPSmith added a comment.
> promise it will always be one-to-one, no matter what happens with internal IDs
Hmm - if it's NOT one-to-one, will that not break RDF? That is, if it's possible for 2 different statements to have the same ID, then you would have conflicting triples associate
ArthurPSmith added a comment.
Another thought - even better would be if the API could be adjusted so it accepts the WDQS statement ID format as it is (all -'s).
TASK DETAIL
https://phabricator.wikimedia.org/T214680
ArthurPSmith added a comment.
Thanks for creating this ticket! Actually, my use case is the opposite of Lucas's - I want to be able to go from the results of a WDQS query to fetch the full statement via the API, which requires the statement ID. So I would like to see the id conversion docum
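The id conversion being discussed can be sketched in a couple of lines. In the RDF/WDQS form the `$` separating the entity id from the UUID is written as `-` (e.g. `Q42-F078E5B3-...`), while the claim APIs expect `Q42$F078E5B3-...`. This is an illustration under that assumption, not a guaranteed round-trip for every entity type.

```python
def wdqs_statement_id_to_guid(wdqs_id):
    """Convert a WDQS/RDF statement node id to an API statement GUID
    (sketch): only the FIRST '-' is the escaped '$'; the remaining
    '-' characters belong to the UUID and must stay as they are.
    """
    return wdqs_id.replace('-', '$', 1)
```

Going the other way is the single substitution `guid.replace('$', '-')`, which is also why documenting the mapping (rather than leaving it implicit) matters.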
ArthurPSmith added a comment.
I didn't know about the "award token" option!
Yes, we should do something along these lines. However, I think there are a number of situations to be addressed:
(1) The edit may be a clean-up which has no material impact on the value of the statement
ArthurPSmith added a comment.
Just a note - WDQS query gives different results hopping up and down - sometimes 3004 (for English lexeme senses) and sometimes 2872, over about the last 10 minutes.
TASK DETAIL
https://phabricator.wikimedia.org/T210495
ArthurPSmith added a subscriber: Smalyshev.
ArthurPSmith added a comment.
@Smalyshev I'd forgotten there was a phabricator ticket for this - anyway, this is what I was referring to... Last night's update bumped the number down again to 2718; however when I run the query directly on WDQS
ArthurPSmith added a comment.
I ran a manual update and the total for English bumped up to 2819 - so it doesn't look as if we've actually lost lexeme senses, just that some of the query servers don't know about all of them?
TASK DETAIL
https://phabricator.wikimedia.org/T210495
ArthurPSmith added a comment.
I wouldn't be surprised if it's a WDQS problem, this is definitely generated from an RDF query.
TASK DETAIL
https://phabricator.wikimedia.org/T210495
ArthurPSmith added a comment.
According to https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/RDF_mapping a lexeme should be "a wikibase:Lexeme " as well as "a ontolex:LexicalEntry", but in the query service I can only find things via the latter relation. Sim
ArthurPSmith added a comment.
WDQS works for me! I'm not sure where that is of course - I guess I could check Phabricator!
TASK DETAIL
https://phabricator.wikimedia.org/T197145
ArthurPSmith added a comment.
Does "alphabetical" ordering even make sense for words in a collection of vastly different writing systems? If this is done I would recommend it be accompanied by some filtering - for language, part of speech, grammatical features, certain properties pe
ArthurPSmith added a comment.
I am in general favorable to Micru's proposal, and perhaps Pamputt's elaboration of it above: using wikidata items directly allows representation of the lemma language naturally in the user's own script/language for one, and other automatic bonuses
ArthurPSmith added a comment.
Here's a specific question that might be detailed enough in description: suppose we have a collection of facts (say the names, countries, inception dates, and official websites for a collection of organizations) that has been extracted from multiple so
ArthurPSmith added a comment.
Hmm, I'm not sure this is all that useful at least as it stands. Most external id's can be as easily found now via the Wikidata Resolver tool - https://tools.wmflabs.org/wikidata-todo/resolver.php - However, what I would find useful would be a way to
ArthurPSmith added a comment.
Hi - my most recent response was following MisterSynergy's comment on Denny's proposed questions, and specifically the meaning of "processes that in bulk extract facts from Wikipedia articles," - it sounds like from subsequent discussion that we ar
ArthurPSmith added a comment.
> based on the fact that we have ~42M “imported from” references and ~64M sitelinks in Wikidata
Hmm, I've added likely over 1000 of those "imported from" items myself by hand, for example for organization "official website" entries. So I would
ArthurPSmith added a comment.
Some references on why CC0 is essential for a free public database:
https://wiki.creativecommons.org/wiki/CC0_use_for_data
"Databases may contain facts that, in and of themselves, are not protected by copyright law. However, the copyright laws of many jurisdic
ArthurPSmith added a comment.
FYI I agree with VIGNERON on what it should look like - but at least something more than the id!
TASK DETAIL
https://phabricator.wikimedia.org/T195382
ArthurPSmith added a comment.
It has been asserted here several times that OSM data has been wholesale imported into Wikidata - do we know that has happened? Wikidata has two properties related to OSM, one that relates wikidata items to OSM tags like "lighthouse", and one that is e
ArthurPSmith added a comment.
Herald added a subscriber: PokestarFan.
Of course, now these examples I gave are working - probably because I updated them recently. However, I found more that are not now, or only partially - for example Q2256713:
SELECT ?item WHERE { ?item wdt:P856 http
ArthurPSmith created this task.
ArthurPSmith added projects: Wikidata, Discovery.
Herald added a subscriber: Aklapper.
TASK DESCRIPTION
I have found the query service to be consistently (*almost* always, over the past several weeks at least) missing some items - an example is Q30252826:
SELECT ?item
ArthurPSmith raised the priority of this task from "Lowest" to "Normal".
ArthurPSmith added a comment.
I don't understand why Multichill can unilaterally alter the priority on this request in the face of an active wikidata RFC where the voting has been 2:1 in support of t
ArthurPSmith added a comment.
Thanks! I did search through the open tasks first and didn't find anything on this.
TASK DETAIL
https://phabricator.wikimedia.org/T170614
ArthurPSmith created this task.
ArthurPSmith added a project: Wikibase-Quality-Constraints.
Herald added a subscriber: Aklapper.
Herald added a project: Wikidata.
TASK DESCRIPTION
Is this the place to report bugs? Whenever I look at a wikidata item with a P279 (subclass of) statement - for example for
ArthurPSmith added a comment.
The dummy user solution sounds good to me. Magnus Manske is doing something like this with his QuickStatementsBot so maybe a special purpose Bot account on wikidata for this?
TASK DETAIL
https://phabricator.wikimedia.org/T143486
ArthurPSmith added a comment.
I believe a way this could be done would be to allow the attachment of regular expressions to the formatter URL, and have the external id URL conversion code understand them. That is, if there was a qualifier property that specified "regex substitution" f
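The mechanism proposed in that comment can be sketched as a small function. The qualifier carrying the regex is hypothetical (the proposal is exactly that such a qualifier be added); here the substitution pattern and replacement are passed in directly, and `$1` is the standard formatter-URL placeholder.

```python
import re

def format_external_id(external_id, formatter_url,
                       pattern=None, replacement=None):
    """Apply an optional regex substitution to a raw external id before
    substituting it into a formatter URL (sketch of the proposal above).

    `pattern`/`replacement` stand in for values that would come from a
    hypothetical qualifier on the formatter URL statement.
    """
    if pattern is not None:
        external_id = re.sub(pattern, replacement, external_id)
    return formatter_url.replace('$1', external_id)
```

For example, an id whose stored form contains spaces could be rewritten to the underscore form a target site expects before the URL is built.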
ArthurPSmith added a comment.
As background, I'm seeing about 2000 "hits" per day on this service right now, with about a dozen properties linking through it to their databases.
TASK DETAIL
https://phabricator.wikimedia.org/T150939
ArthurPSmith added a comment.
@Esc3300 well, I developed this tool because links for IMDB and a handful of other properties were broken when we made the change from string to "external identifier" last year, where the wikidata UI started putting the links in directly (previously it had
ArthurPSmith closed this task as "Invalid".
ArthurPSmith added a comment.
@jeblad I'm resolving this as invalid as the initial claim of an information leak seems to be incorrect. However you might want to open up a separate phabricator ticket with your detailed suggestion on how
ArthurPSmith added a comment.
I see you've closed - looks good by the way. Anyway, on the question of
retaining WDQ - no I don't think that's necessary, I think Magnus would
like to shut it down eventually. I don't see that WDQ adds anything to this
tool now SPARQL is working
ArthurPSmith added a comment.
@Yurik and all, I'm glad to see all this work going on, I was pointed to this after I made a comment on a wikidata property proposal that I thought would be best addressed by somehow allowing a tabular data value rather than a single value. However, I'm wo
ArthurPSmith added a comment.
Excellent, thanks! I probably should have sent you an email...
TASK DETAIL
https://phabricator.wikimedia.org/T142432
ArthurPSmith triaged this task as "High" priority.
ArthurPSmith added a comment.
So I updated to https in my local copy and that definitely fixed the problem. Not sure if @Ricordisamoa is around? I don't have permission right now to do anything with ptable, but I do have an accou
ArthurPSmith added a comment.
Still broken (at least 3 days now). I can't see the error messages but I tried running my own copy and ran into:
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2016-May/000110.html
the code is using http not https:
base.py:WD_API =
ArthurPSmith created this task.
ArthurPSmith added projects: Tool-Labs-tools-Wikidata-Periodic-Table, Wikidata.
Herald added a subscriber: Aklapper.
TASK DESCRIPTION
https://tools.wmflabs.org/ptable has been returning a 500 Server Error since earlier today - possibly longer. Something recently broken
ArthurPSmith added a comment.
Ok, the WbRepresentation superclass looks like it might help simplify this. But FilePage, ItemPage and PropertyPage (and basestring) are not subclasses of that, so I think just returning the json hash would be best there. But the function could certainly run
ArthurPSmith added a comment.
@Multichill - could be, I'm not familiar with WbTime other than a glance at the code. Are there edge cases (eg. 10^20 years into the future?) that would break the "int/long" assumptions? But it definitely does NOT work for WbQuantity the way thing
ArthurPSmith added a comment.
In T112140#2435122 <https://phabricator.wikimedia.org/T112140#2435122>, @Multichill wrote:
> The function should return an object. Possibilities seem to be commonsMedia, globe-coordinate, monolingualtext, quantity, string, time, url, external-id, wikibase-item, wikibase-property, math
The parse API allows a l
ArthurPSmith added a comment.
See https://gerrit.wikimedia.org/r/#/c/297637/ for proposed implementation...
TASK DETAIL
https://phabricator.wikimedia.org/T112140
ArthurPSmith added a comment.
Ok, that echoes something Tobias has said also about using strings and avoiding IEEE fp. I'm going to look at getting T112140 working first and then see if I can bring that implementation to bear on this.
TASK DETAIL
https://phabricator.wikimedia.org/T119226
ArthurPSmith claimed this task.
ArthurPSmith added a comment.
I'm going to have a shot at implementing this - it looks like it will be useful for a number of other open phabricator issues for pywikibot. I was figuring a function that will take all the parameters the API offers (datatype - a s
ArthurPSmith added a comment.
You're the one who brought up JSON! It sounds like the issue is something different though - internal representation as strings? Anyway, are you recommending pywikibot use the wbparsevalue API for all (or at least numerical) input? That could be a good idea.
ArthurPSmith added a comment.
That restriction is NOT in the JSON spec: http://tools.ietf.org/html/rfc7159.html#section-6 - also the leading plus is not required by JSON. Is there some other reason for the limitation in the wikidata code? DataValues is a wikidata-specific PHP library right? I
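The point about the JSON grammar (RFC 7159, section 6) can be checked directly with Python's own parser: numbers have no size limit, and a leading plus sign is simply not valid JSON.

```python
import json

# Arbitrarily large integers parse fine -- the grammar has no bound.
assert json.loads('100000000000000000000') == 10**20

# A leading '+' is rejected: it is not part of the JSON number grammar.
try:
    json.loads('+1')
except ValueError:
    pass
else:
    raise AssertionError("leading '+' unexpectedly accepted")
```

So whatever rejects `+`-prefixed amounts is a constraint of the DataValues layer, not of JSON itself.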
ArthurPSmith added a comment.
Hmm. So is it a pywikibot problem or a wikibase API problem? Is pywikibot sending in JSON format?
TASK DETAIL
https://phabricator.wikimedia.org/T119226
ArthurPSmith added a comment.
As far as testing goes, I have (in my own copy) added the following to the pywikibot tests/wikibase_edit_tests.py file (within the class TestWikibaseMakeClaim):
def _check_quantity_claim(self, value, uncertainty):
"""Helper function to add and
ArthurPSmith added a comment.
Please note this is still an issue with the latest pywikibot code and current wikidata release - as of June 23, 2016. The following is the fix I have in the pywikibot core pywikibot/__init__.py file:
instead of
format(value, "+g")
we need:
if math.
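The patch itself is truncated in the email, so the following is only a guess at the intent: replace `format(value, "+g")` with a formatter that keeps the explicit sign but never emits exponent notation (forms like `+1e+20`, which the DataValues parser rejects). The function name is invented for illustration.

```python
import math

def format_amount(value):
    """Format a quantity amount as a signed decimal string with no
    exponent notation (sketch of the kind of fix described above).
    """
    if math.isinf(value) or math.isnan(value):
        raise ValueError("amount must be finite")
    if float(value).is_integer():
        # '%+d' keeps the sign and prints all digits, even for 1e20.
        return "%+d" % int(value)
    # '%+f' avoids exponents; strip trailing zeros and a bare dot.
    return ("%+f" % value).rstrip('0').rstrip('.')
```

With `"+g"`, `1e20` would come out as `'+1e+20'`; the sketch above yields the full digit string instead.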
ArthurPSmith added a comment.
Note this may be just a problem for the Freebase Identifier; if you replace
just the leading '%2f' with '/' the remaining '%2f' characters are correctly
interpreted by the server (as they should be) - the problem is the formatter
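The workaround described above amounts to decoding only the leading `%2f` and leaving the rest escaped; a minimal sketch (function name invented):

```python
def fix_freebase_id(encoded_id):
    """Decode only the LEADING '%2f' (an escaped '/') in an id
    (sketch of the workaround above). Later '%2f' sequences are
    interpreted correctly by the server and must stay escaped.
    """
    if encoded_id.lower().startswith('%2f'):
        return '/' + encoded_id[3:]
    return encoded_id
```

A blanket `urllib.parse.unquote` would be wrong here, since it would also decode the interior `%2f` characters that need to stay escaped.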
ArthurPSmith added a comment.
In https://phabricator.wikimedia.org/T91505#2015282, @Ricordisamoa wrote:
> In https://phabricator.wikimedia.org/T91505#2008209, @Swpb wrote:
>
> > this discussion
<https://www.wikidata.org/wiki/Wikidata:Property_proposal/Natural_science#R
ArthurPSmith closed this task as "Resolved".
ArthurPSmith claimed this task.
ArthurPSmith added a comment.
Herald added a subscriber: StudiesWorld.
Probably should close this - it's been up live for a week or so now! Ran into
a problem with query service bugs, but that seems
ArthurPSmith created this task.
ArthurPSmith added a subscriber: ArthurPSmith.
ArthurPSmith added projects: Tool-Labs, Wikidata-Periodic-Table.
Herald added subscribers: StudiesWorld, Aklapper.
Herald added projects: Labs, Wikidata.
TASK DESCRIPTION
https://tools.wmflabs.org/ptable/ has been
ArthurPSmith added a subscriber: ArthurPSmith.
TASK DETAIL
https://phabricator.wikimedia.org/T91505
To: ArthurPSmith
Cc: ArthurPSmith, gerritbot, Smalyshev, Shrutika719, MGChecker, Sannita,
Ricordisamoa
ArthurPSmith added a subscriber: ArthurPSmith.
TASK DETAIL
https://phabricator.wikimedia.org/T67397
To: Physikerwelt, ArthurPSmith
Cc: ArthurPSmith, TomT0m, Llyrian, WickieTheViking, Aklapper, MGChecker
ArthurPSmith added a subscriber: ArthurPSmith.
TASK DETAIL
https://phabricator.wikimedia.org/T110534
To: ArthurPSmith
Cc: ArthurPSmith, Tarrow, Addshore, Ricordisamoa, daniel, thiemowmde,
Lydia_Pintscher
ArthurPSmith closed this task as "Resolved".
TASK DETAIL
https://phabricator.wikimedia.org/T112130
To: ArthurPSmith
Cc: ArthurPSmith, zhuyifei1999, jayvdb, Ladsgroup, gerritbot, Aklapper,
pywi
ArthurPSmith created this task.
ArthurPSmith added a subscriber: ArthurPSmith.
ArthurPSmith added projects: Wikidata, Pywikibot-Wikidata, pywikibot-core.
Herald added subscribers: pywikibot-bugs-list, StudiesWorld, Aklapper.
TASK DESCRIPTION
pywikibot was recently updated to better handle
ArthurPSmith added a subscriber: ArthurPSmith.
ArthurPSmith added a comment.
I've been using pywikibot to handle quantities with units for the past few
weeks, it seems to work fine. I don't see what else needs to be done here?
TASK DETAIL
https://phabricator.wikimedia.org/T112
ArthurPSmith added a subscriber: ArthurPSmith.
ArthurPSmith added a comment.
Herald added a subscriber: StudiesWorld.
Just want to add support - this would be useful if possible! Of course it's not
possible in the web interface (claim has to be added first, then qualifiers &
sources in
ArthurPSmith created this task.
ArthurPSmith added a subscriber: ArthurPSmith.
ArthurPSmith added a project: Wikidata.
Herald added subscribers: StudiesWorld, Aklapper.
TASK DESCRIPTION
After entering a value for the Planck constant
(https://www.wikidata.org/wiki/Q122894) in terms of its SI
ArthurPSmith added a comment.
Ok - see https://gerrit.wikimedia.org/r/245591 for the change.
TASK DETAIL
https://phabricator.wikimedia.org/T114547
To: ArthurPSmith
Cc: ArthurPSmith, Pamputt, Tobias1984
ArthurPSmith added a comment.
Thanks! I'm partially set up but I need to do a bit of reading. I will most
likely get this in (with updates) Monday - hope that's ok!
TASK DETAIL
https://phabricator.wikimedia.org/T114547
ArthurPSmith added a comment.
I've never used Gerrit - I guess it's gerrit.wikimedia.org? phabricator/tools
is the project? How does one get an account there?
I've made a few changes but a couple of things still in progress - it's getting
closer, here's an image of
ArthurPSmith added a comment.
I hacked on the periodic table code to get a very bare-bones nuclides code
working... see files uploaded (nuclides.py has the main content, units.py is to
do something with half-life data for now, nu_app.py runs the flask app,
index.html is the template display
ArthurPSmith added a comment.
By the way, the standard chart uses neutron number on the horizontal axis and
proton number (i.e. the "atomic number" property) on the vertical. Every
legitimate nuclide in wikidata seems to have those set correctly (I had to
correct a handful last wee