aude created this task.
aude added subscribers: aude, aaron.
aude added projects: Wikidata, MediaWiki-extensions-WikibaseRepository.
Herald added a subscriber: Aklapper.

TASK DESCRIPTION
  We sometimes run into the PARSE_THRESHOLD_SEC limit when running 
refreshlinks jobs on Wikidata (not sure how often; perhaps we could at least 
log this).
  
  
https://github.com/wikimedia/mediawiki/blob/master/includes/jobqueue/jobs/RefreshLinksJob.php#L207-L215
 
  
  The ParserOutput is generated without HTML and then cached. When the item 
is subsequently viewed, the cached output has no HTML at all.
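  A minimal Python sketch of the failure mode as I understand it (the names, 
the threshold value, the cache shape, and the `parse_seconds` parameter are 
all illustrative stand-ins, not core's actual API):

```python
PARSE_THRESHOLD_SEC = 1.0  # illustrative; core's threshold for "expensive" parses


class ParserOutput:
    """Stand-in for MediaWiki's ParserOutput."""

    def __init__(self, links, html=None):
        self.links = links
        self.html = html  # stays None when HTML generation is skipped


parser_cache = {}  # stand-in for the parser cache


def parse(page, generate_html):
    """Pretend parse: always extracts links, renders HTML only on request."""
    links = [page + "#somelink"]
    html = "<p>rendered</p>" if generate_html else None
    return ParserOutput(links, html)


def refresh_links_job(page, parse_seconds):
    # The job only needs the links table, so it skips HTML generation.
    output = parse(page, generate_html=False)
    # If the parse was slow, the output is saved to the parser cache to
    # spare a future page view the same cost -- but it has no HTML.
    if parse_seconds >= PARSE_THRESHOLD_SEC:
        parser_cache[page] = output
    return output


def view_page(page):
    cached = parser_cache.get(page)
    if cached is not None:
        return cached.html  # may be None: the bug described above
    return parse(page, generate_html=True).html
```

  In this sketch, any job whose parse exceeds the threshold poisons the cache 
for the next view, which then gets back an HTML-less output.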
  
  Locally, generating the ParserOutput in refreshlinks takes ~1.1 seconds if 
my item has a statement; without statements it is ~0.9 seconds. Production 
machines are presumably more powerful than my laptop, so exceeding a second 
may be rarer there, but we have reports / cases of this bug happening 
(especially with larger items? and depending on the timing of refreshlinks).
  
  I am not sure we should be doing caching there without the HTML. Or maybe 
the parser options / cache key should be different? Or the ParserOutput 
should include HTML at that point?
  
  I suppose we can work around the problem in Wikibase for now, but I am not 
entirely happy with doing that versus changing how this is done in core.

TASK DETAIL
  https://phabricator.wikimedia.org/T120935


To: aude
Cc: Aklapper, aaron, aude, Wikidata-bugs, Mbch331
