[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-17 Thread gerritbot
gerritbot added a comment. Change 842363 **merged** by jenkins-bot: [pywikibot/core@master] [IMPR] Raise a generic ServerError if requests response is a ServerError https://gerrit.wikimedia.org/r/842363

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-17 Thread BrokenSegue
BrokenSegue added a comment. That said, it's preferable to merge the current fix.

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-13 Thread BrokenSegue
BrokenSegue added a comment. Yeah, this works better: it now throws an exception on server timeouts, but it still isn't throwing exceptions on malformed SPARQL. E.g.

>from pywikibot import pagegenerators as pg
>generator = pg.WikidataSPARQLPageGenerator("select blah")
WARNING: Http res
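For comparison, a minimal sketch (mine, not pywikibot code) that surfaces the malformed-query failure by calling the endpoint directly with requests; the User-Agent string is made up, and it is assumed that WDQS answers a bad query with HTTP 400:

```
import requests

WDQS = 'https://query.wikidata.org/sparql'

def run_query(query: str) -> dict:
    # Any non-2xx status (e.g. 400 for malformed SPARQL) raises
    # requests.HTTPError here instead of degrading to a warning.
    resp = requests.get(WDQS,
                        params={'query': query, 'format': 'json'},
                        headers={'User-Agent': 'sparql-check/0.1 (example)'},
                        timeout=60)
    resp.raise_for_status()
    return resp.json()

run_query('select blah')  # expected: requests.HTTPError (400 Bad Request)
```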

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-13 Thread gerritbot
gerritbot added a comment. Change 842363 had a related patch set uploaded (by Xqt; author: Xqt): [pywikibot/core@master] [IMPR] Raise a generic ServerError if requests response is a ServerError https://gerrit.wikimedia.org/r/842363

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-12 Thread BrokenSegue
BrokenSegue added a comment. That's probably because the timeout in your `user-config.py` is too low. The default value is probably also too low. If you set `socket_timeout = 240` then you get the behavior I described. I tried commenting that line out in my config and I got wha
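For reference, a minimal `user-config.py` sketch with the value quoted above; `socket_timeout` is pywikibot's HTTP timeout setting, and 240 is just the number from this comment, not a recommendation:

```
# user-config.py (excerpt): seconds pywikibot waits for an HTTP
# response before raising a client-side Timeout. Value taken from
# this thread, not a recommended default.
socket_timeout = 240
```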

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-12 Thread Xqt
Xqt added a comment. Wasn't able to reproduce the response 500 issue. The Timeout was raised for me (after max_retries tries):

WARNING: Waiting 20 seconds before retrying.
...
Traceback (most recent call last):
  File "D:\pwb\GIT\core\pywikibot\data\sparql.py", line 151, in
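The retry cadence in that log is governed by pywikibot's retry settings; a sketch of the relevant `user-config.py` knobs (option names from pywikibot's config module, values illustrative only):

```
# user-config.py (excerpt): illustrative values, not defaults.
max_retries = 5   # attempts before the Timeout is finally raised
retry_wait = 20   # seconds to wait between attempts, as in the WARNING
```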

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-12 Thread BrokenSegue
BrokenSegue added a comment. Sure. Here's a query that reliably times out:

SELECT distinct ?item WHERE {
  VALUES ?goodRanks { wikibase:NormalRank wikibase:PreferredRank }
  ?item p:P856 ?url.
  ?url wikibase:rank ?goodRanks.
  # don't look at dead urls
  FILT
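A hypothetical harness for reproducing this end to end; the self-join query below is a made-up stand-in for the truncated query above, chosen only because it should exceed the WDQS query deadline:

```
from pywikibot import pagegenerators as pg

# Made-up stand-in for the truncated query above: an unbounded
# self-join on P31 that should blow past the WDQS deadline.
QUERY = 'SELECT DISTINCT ?item WHERE { ?item wdt:P31 ?x . ?other wdt:P31 ?x . }'

# Before the fix, the server's timeout response was swallowed and this
# loop simply saw no items; with the fix it should raise instead.
for page in pg.WikidataSPARQLPageGenerator(QUERY):
    print(page)
```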

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-12 Thread Xqt
Xqt added a comment. @BrokenSegue: Ah, that means the Timeout isn't raised in such a case. Can you give me your `timeoutQuery` example so I can find a solution for such a 500 status response?

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-12 Thread BrokenSegue
BrokenSegue added a comment. I am using requests version "2.28.1". And unfortunately your solution doesn't work: that retry only happens if the HTTP request times out. But often when using Wikidata's SPARQL endpoint, the server itself times out and returns non-JSON as its response. This causes the
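A minimal sketch (mine) of the two failure modes being distinguished here, calling the endpoint directly with requests; the assumption is that a server-side timeout comes back as an HTTP 5xx with a non-JSON body:

```
import requests

def classify_failure(query: str) -> str:
    try:
        resp = requests.get('https://query.wikidata.org/sparql',
                            params={'query': query, 'format': 'json'},
                            headers={'User-Agent': 'sparql-check/0.1 (example)'},
                            timeout=10)
    except requests.exceptions.Timeout:
        # Client-side timeout: the only case the existing retry covered.
        return 'HTTP request timed out (retried by pywikibot)'
    if resp.status_code >= 500:
        # Server-side timeout: the request "succeeds", but the body is
        # not JSON, so nothing was raised and the failure was swallowed.
        return 'server error {}: non-JSON body'.format(resp.status_code)
    return 'ok'
```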

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-12 Thread Xqt
Xqt added a comment. @BrokenSegue: What is your requests version?

[Pywikipedia-bugs] [Maniphest] [Commented On] T320590: WikidataSPARQLPageGenerator Swallows Failures

2022-10-11 Thread BrokenSegue
BrokenSegue added a comment. The cause of the bug is this line in sparql.py. It's not clear to me how big the blast radius from fixing it there would be. Probably big.
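An illustrative sketch (mine, not pywikibot's actual sparql.py code) of the pattern being pointed at: a broad except around the request turns any server failure into an empty result, which the generator reads as "no items":

```
import json
import requests

def query_swallowing(endpoint: str, query: str):
    # Illustrative anti-pattern only: the broad except converts a 500
    # plus HTML error page (a JSONDecodeError) into None, so the caller
    # cannot tell a failure apart from an empty result set.
    try:
        resp = requests.get(endpoint,
                            params={'query': query, 'format': 'json'},
                            timeout=30)
        return json.loads(resp.text)
    except Exception:
        return None
```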