----- Original Message -----
From: "Ed Park" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Thursday, January 04, 2001 6:52 PM
Subject: getting rid of multiple identical http requests (bad users
double-clicking)


> Does anyone out there have a clean, happy solution to the problem of users
> jamming on links & buttons? Analyzing our access logs, it is clear that it's
> relatively common for users to click 2,3,4+ times on a link if it doesn't
> come up right away. This is not good for the system, for obvious reasons.

The best solution is to make the page come up right away...  If that isn't
possible, try to make at least something show up.  If your page consists
of one big table, the browser may be waiting for the closing tag to
compute the column widths before it can render anything.
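
For example, flushing the top of the page before the slow part is ready
gives the user immediate feedback.  A sketch of the idea (TypeScript/Node
here purely for illustration; buildSlowReport and the port are made up):

    import * as http from "http";

    function buildSlowReport(): Promise<string> {
      // stand-in for the expensive work that makes the page slow
      return new Promise((resolve) =>
        setTimeout(() => resolve("<table>...full report...</table>"), 2000));
    }

    http.createServer((req, res) => {
      res.writeHead(200, { "Content-Type": "text/html" });
      // Ship the top of the page right away so the browser renders it
      res.write("<html><body><p>Loading report...</p>");
      buildSlowReport().then((report) => {
        res.end(report + "</body></html>");
      });
    }).listen(8080);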

> I can think of a few ways around this, but I was wondering if anyone else
> had come up with anything. Here are the avenues I'm exploring:
> 1. Implementing JavaScript disabling on the client side so that links become
> 'click-once' links.
> 2. Implement an MD5 hash of the request and store it on the server (e.g. in
> a MySQL server). When a new request comes in, check the MySQL server to see
> whether it matches an existing request and disallow as necessary. There
> might be some sort of timeout mechanism here, e.g. don't allow identical
> requests within the span of the last 20 seconds.
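
Option 1 can be as simple as swallowing every click after the first.
A rough sketch (TypeScript against the DOM; the "click-once" class
name is made up):

    // After the first click, ignore repeat clicks on the same link
    // until the next page actually arrives.
    document.querySelectorAll<HTMLAnchorElement>("a.click-once")
      .forEach((link) => {
        link.addEventListener("click", (event) => {
          if (link.dataset.clicked) {
            event.preventDefault(); // duplicate click: drop it
            return;
          }
          link.dataset.clicked = "1";
        });
      });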

Hashing the request might be worthwhile for trapping duplicate postings,
but unless your page requires a vast amount of server work, it is as much
trouble to check for the duplicate as to simply deliver the page again.
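
If you do want to trap duplicate postings that way, something like the
following would do it.  Note this sketch keeps the hashes in an in-memory
map rather than a MySQL server, just to stay self-contained, and
isDuplicate and WINDOW_MS are made-up names:

    import { createHash } from "crypto";

    // request hash -> time last seen; a real version would also
    // evict entries older than the window
    const recent = new Map<string, number>();
    const WINDOW_MS = 20_000; // the 20-second window from the post

    function isDuplicate(method: string, url: string, body: string): boolean {
      const key = createHash("md5").update(method + url + body).digest("hex");
      const now = Date.now();
      const seen = recent.get(key);
      recent.set(key, now);
      return seen !== undefined && now - seen < WINDOW_MS;
    }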

      Les Mikesell
        [EMAIL PROTECTED]

