https://bugzilla.wikimedia.org/show_bug.cgi?id=52287

--- Comment #4 from nuria <nu...@wikimedia.org> ---
Commenting here in more detail on what we have been discussing on IRC:

Introducing client-side delays is not optimal because it produces a slower UX
than users would otherwise have. Worse than that, from a performance-data
standpoint those delays are impossible to detect.
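
To make that concrete, the pattern in question looks roughly like this (a
sketch, not EventLogging's actual code; the /log endpoint and the 100ms
figure are made up for illustration):

    document.addEventListener('click', (e: MouseEvent) => {
      const link = (e.target as HTMLElement).closest('a');
      if (!link) {
        return;
      }
      // Hold the navigation so the logging request has time to leave.
      e.preventDefault();
      new Image().src = '/log?href=' + encodeURIComponent(link.href);
      // This artificial wait is what the user pays for, and page-timing
      // instrumentation never sees it.
      setTimeout(() => { window.location.href = link.href; }, 100);
    });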

Example: the analytics team recently reported page timings for all pages
around the world. The delay introduced by this logging would not show up in
those numbers in any way. So we would be presenting data that says the
50th-percentile page load time in the US is 400ms, when that would not at all
be what the user saw if we introduce arbitrary client-side delays: in reality
the user is seeing 400ms plus the delay.

That being said, this is not an easy problem to solve for "full page" reloads
and, as ori noted above, the "good" solution is the beacon API.
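
For reference, this is roughly what the beacon approach looks like (a minimal
sketch; the /beacon endpoint is an assumption, not an existing EventLogging
URL, and the API itself is still a draft that few browsers ship):

    function logEvent(event: object): void {
      const payload = JSON.stringify(event);
      if (navigator.sendBeacon) {
        // The browser queues the request and delivers it in the
        // background, so the page can unload without any delay.
        navigator.sendBeacon('/beacon', payload);
      } else {
        // Fallback for browsers without the API: fire-and-forget XHR,
        // which may be dropped if the page unloads first.
        const xhr = new XMLHttpRequest();
        xhr.open('POST', '/beacon', true);
        xhr.send(payload);
      }
    }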

The preferable option I can see would be to store those events locally and
poll as needed to report them. This works well for logging internal page
clickthroughs. Even in browsers that do not have localStorage (e.g. IE7 and
older), events can be logged into an in-memory hashmap.
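
A minimal sketch of that approach; the storage key, flush endpoint, and poll
interval are illustrative assumptions, not part of EventLogging:

    const QUEUE_KEY = 'eventLogQueue';  // illustrative key name
    const FLUSH_URL = '/event';         // illustrative endpoint
    const hasStorage = typeof localStorage !== 'undefined';
    let memoryQueue: object[] = [];     // fallback for old browsers

    function enqueue(event: object): void {
      if (hasStorage) {
        const queue = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
        queue.push(event);
        localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
      } else {
        memoryQueue.push(event);
      }
    }

    function flush(): void {
      const queue: object[] = hasStorage
        ? JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]')
        : memoryQueue;
      if (queue.length === 0) {
        return;
      }
      const xhr = new XMLHttpRequest();
      xhr.open('POST', FLUSH_URL, true);
      xhr.setRequestHeader('Content-Type', 'application/json');
      xhr.send(JSON.stringify(queue));
      // Clear optimistically; a real implementation would wait for the
      // POST to succeed before discarding the events.
      if (hasStorage) {
        localStorage.removeItem(QUEUE_KEY);
      } else {
        memoryQueue = [];
      }
    }

    // Poll on a timer instead of delaying navigation.
    setInterval(flush, 5000);

Because localStorage persists across page loads, an event recorded just
before an internal navigation is simply flushed by a later page view, so no
artificial delay is needed.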

Now, the caveat is that this does not work as well for logging external
clicks: once the user follows a link off-site, there may be no subsequent
page view on our domain to flush the queue.
