BBlack added a comment.

  Did some further testing on an isolated test machine, using our current 
varnish3 package.
  
  - Got a 2833-byte test file from uncorrupted (--compressed) output on prod. This is the exact compressed content bytes emitted by MW/Apache for the broken test URL.
  - Configured a test backend (nginx) to serve static files and to always set CE:gzip.
  - Placed the gzipped 2833-byte file in the test directory; fetched with curl --compressed, the md5sum comes out right.
  - When fetched through our varnishd with a default config and do_gzip turned on, varnish does decompress this file for the curl client, and there is no corruption (still the same md5sum). A rough repro of this comparison is sketched below.
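
  For reference, here is a minimal sketch of that check (the equivalent of curl --compressed plus md5sum against both endpoints). The hostnames, port, and file path are placeholders for illustration only, not the actual test setup.

```
#!/usr/bin/env python3
# Rough sketch of the md5 comparison described above. Hostnames and the
# test path are hypothetical placeholders.
import gzip
import hashlib
import urllib.request

TEST_PATH = "/test/broken-url.gz"               # hypothetical file name
DIRECT = "http://test-backend.example"          # nginx serving the static gzip
VIA_VARNISH = "http://test-varnish.example"     # varnishd in front of it


def fetch_decoded_md5(base_url):
    """Fetch like `curl --compressed`: advertise gzip, then decode the body."""
    req = urllib.request.Request(base_url + TEST_PATH,
                                 headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        # varnish may have already gunzipped for the client; only decode
        # if the response still says CE:gzip
        if resp.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)
    return hashlib.md5(body).hexdigest()


if __name__ == "__main__":
    direct_md5 = fetch_decoded_md5(DIRECT)
    varnish_md5 = fetch_decoded_md5(VIA_VARNISH)
    print("direct :", direct_md5)
    print("varnish:", varnish_md5)
    print("match" if direct_md5 == varnish_md5 else "CORRUPTION: md5s differ")
```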
  
  This rules out the possibility that this is some pure, data-sensitive varnish bug in gunzipping the content itself. However, the notable difference between this test and reality is that nginx serving a static pre-gzipped file (a) does not emit it as TE:chunked, and (b) even if it did, it probably wouldn't use the same chunk boundaries, nor is it likely to share any TE:chunked encoding bugs or varnish-bug-triggers...
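
  To close that gap, a possible next step would be a stand-in backend that serves the same pre-gzipped bytes with CE:gzip plus Transfer-Encoding: chunked. The sketch below is illustrative only: the filename, port, and chunk size are placeholders, and the real MW/Apache chunk boundaries are unknown.

```
#!/usr/bin/env python3
# Sketch of a stand-in backend that serves the pre-gzipped test file with
# Content-Encoding: gzip *and* Transfer-Encoding: chunked, covering the case
# the static-file nginx test above does not. Filename, port, and chunk size
# are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

GZIP_FILE = "test-2833.gz"  # the gzipped body captured from prod
CHUNK_SIZE = 512            # arbitrary; real chunk boundaries are unknown


class ChunkedGzipHandler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # chunked TE requires HTTP/1.1

    def do_GET(self):
        with open(GZIP_FILE, "rb") as f:
            body = f.read()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Encoding", "gzip")
        self.send_header("Transfer-Encoding", "chunked")
        self.end_headers()
        # Emit the gzipped bytes as explicit chunks: <hex len>\r\n<data>\r\n
        for i in range(0, len(body), CHUNK_SIZE):
            chunk = body[i:i + CHUNK_SIZE]
            self.wfile.write(b"%x\r\n" % len(chunk) + chunk + b"\r\n")
        self.wfile.write(b"0\r\n\r\n")  # terminating zero-length chunk


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8081), ChunkedGzipHandler).serve_forever()
```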

TASK DETAIL
  https://phabricator.wikimedia.org/T133866
