JanZerebecki added a comment.

Short: Because IE is broken.
Long: The problem here is that while 
https://tools.ietf.org/html/rfc2616#section-3.2.1 says "The HTTP protocol does 
not place any a priori limit on the length of a URI.", reality does not 
conform to this. On the server side, our caching layer is AFAIK even in 
violation of "SHOULD be able to handle URIs of unbounded length if they 
provide GET-based forms that could generate such URIs"; it probably caps out 
at around 16k. In practice we could probably live with that as the query 
length limit for the public queries.wikidata.org. But while we might manage to 
fix all the SPARQL tools, we cannot fix all the browsers, which leaves us 
with: "Servers ought to be cautious about depending on URI lengths above 255 
bytes".
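
For the clients we do control, the usual workaround is to pick the method per 
query: use GET while the encoded URI stays comfortably under a conservative 
limit, and fall back to POST otherwise. A minimal sketch in Python (the 
endpoint URL and the 2000-byte threshold are illustrative assumptions, not 
anything we have agreed on):

    # Sketch: send a SPARQL query by GET when the resulting URI is short
    # enough to be safe across browsers and caches, otherwise fall back
    # to POST. Endpoint and threshold are illustrative assumptions.
    import urllib.parse
    import urllib.request

    ENDPOINT = "https://query.example.org/sparql"  # placeholder endpoint
    MAX_SAFE_URI_LENGTH = 2000  # conservative; RFC 2616 only promises 255

    def run_query(sparql):
        encoded = urllib.parse.urlencode({"query": sparql})
        get_uri = ENDPOINT + "?" + encoded
        if len(get_uri) <= MAX_SAFE_URI_LENGTH:
            req = urllib.request.Request(get_uri)  # plain GET, cacheable
        else:
            # POST with the query in the body; semantically still a read.
            req = urllib.request.Request(
                ENDPOINT,
                data=encoded.encode("ascii"),
                headers={"Content-Type":
                         "application/x-www-form-urlencoded"})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

But that only fixes clients, which is exactly the limitation above: we cannot 
ship such a fallback into every browser or third-party tool.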

To best serve the listed requirements, we probably need generic handling for 
this in the layer that differentiates between POST and GET in the described 
multi-data-center way. That layer would recognize, e.g. by a certain query 
argument in the Request-URI, that a given POST request has GET semantics, with 
the actual resource identified by the Request-URI together with the request 
body. Should we then transform the request into a GET request, or handle this 
special case all the way down the stack? Any better ideas?
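
To make that concrete, here is a rough sketch of what such a layer could do, 
written as WSGI-style Python middleware. The marker argument name 
(get-semantics=1) and the idea of folding a digest of the body into the cache 
key are assumptions for illustration; the real layer would presumably live in 
the caching/routing tier rather than in application code:

    # Sketch: treat a POST marked with a special query argument as having
    # GET semantics. The resource is identified by the Request-URI plus
    # the request body, so the request can be cached and routed like a
    # read. Marker name and environ keys are illustrative assumptions.
    import hashlib
    from io import BytesIO
    from urllib.parse import parse_qs

    class GetSemanticsMiddleware:
        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            args = parse_qs(environ.get("QUERY_STRING", ""))
            if (environ["REQUEST_METHOD"] == "POST"
                    and args.get("get-semantics") == ["1"]):
                length = int(environ.get("CONTENT_LENGTH") or 0)
                body = environ["wsgi.input"].read(length)
                # The body is part of the resource identification here,
                # so it has to become part of the cache key.
                environ["routing.cache_key"] = "%s?%s#%s" % (
                    environ.get("PATH_INFO", ""),
                    environ.get("QUERY_STRING", ""),
                    hashlib.sha256(body).hexdigest())
                environ["routing.read_only"] = True  # safe for any DC
                # Replay the body for the wrapped application.
                environ["wsgi.input"] = BytesIO(body)
            return self.app(environ, start_response)

Whether the layer then rewrites the method to GET for the backends or keeps 
it as a specially-flagged POST is exactly the open question above.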


TASK DETAIL
  https://phabricator.wikimedia.org/T112151

To: Smalyshev, JanZerebecki
Cc: JanZerebecki, BBlack, Andrew, Deskana, Joe, gerritbot, nichtich, Jneubert, 
Karima, Aklapper, Smalyshev, JGirault, jkroll, Wikidata-bugs, Jdouglas, aude, 
Manybubbles