Re: getting rid of multiple identical http requests (bad users double-clicking)

2001-01-07 Thread Stas Bekman

On Fri, 5 Jan 2001, Gunther Birznieks wrote:

 Sorry if this solution has been mentioned before (I didn't read the earlier
 parts of this thread), and I know it's not as perfect as a server-side
 solution...

 But I've also seen a lot of people use JavaScript to accomplish the same
 thing as a quick fix. Few browsers don't support JavaScript, and of the
 small number that don't, the intersection of browsers without JavaScript
 and users with an itchy trigger finger is very small. The advantage is
 that it's faster than mangling your own server-side code with extra logic
 to prevent double posting.

Nothing stops users from saving the form and resubmitting it without the
JS code. This may reduce the number of attempts, but it's a partial
solution and won't stop determined users.

_
Stas Bekman  JAm_pH --   Just Another mod_perl Hacker
http://stason.org/   mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]   http://apachetoday.com http://logilune.com/
http://singlesheaven.com http://perl.apache.org http://perlmonth.com/  





Re: getting rid of multiple identical http requests (bad users double-clicking)

2001-01-07 Thread James G Smith

Stas Bekman [EMAIL PROTECTED] wrote:
On Fri, 5 Jan 2001, Gunther Birznieks wrote:

 Sorry if this solution has been mentioned before (I didn't read the earlier
 parts of this thread), and I know it's not as perfect as a server-side
 solution...

 But I've also seen a lot of people use JavaScript to accomplish the same
 thing as a quick fix. Few browsers don't support JavaScript, and of the
 small number that don't, the intersection of browsers without JavaScript
 and users with an itchy trigger finger is very small. The advantage is
 that it's faster than mangling your own server-side code with extra logic
 to prevent double posting.

Nothing stops users from saving the form and resubmitting it without the
JS code. This may reduce the number of attempts, but it's a partial
solution and won't stop determined users.

Nothing dependent on the client can be considered a fail-safe 
solution.

I encountered this problem with some PHP pages, but the idea is 
the same regardless of the language.

Not all pages have problems with double submissions.  For 
example, a page that provides read-only access to data usually 
can be retrieved multiple times without damaging the data.  It's 
submitting changes to data that can become the problem.  I ended 
up locking on some identifying characteristic of the object whose 
data is being modified.  If I can't get the lock, I send back a 
page to the user explaining that there probably was a double 
submission and everything might have gone ok.  The user would 
need to go in and check the data to make sure.

In pseudo-Perl code:

sub get_lock {
  my($objecttype, $objectid) = @_;

  # $dir and $nullfile are assumed to be set up elsewhere
  my($sec, $min, $hr, $md, $mon, $yr) = gmtime(time);
  my $lockfile = sprintf("%s/%04d%02d%02d%02d%02d%02d-%s", $objecttype,
                         $yr + 1900, $mon + 1, $md, $hr, $min, $sec, $objectid);

  my $r;
  for (my $n = 0; $n < 1 && !$r; $n++) {
    # link() fails if the target already exists, so creating the hard
    # link is an atomic test-and-set on the lock file.
    $r = link("$dir/$nullfile", "$dir/$lockfile-$n.lock");
  }

  return $r;
}

So, for example, if I am trying to modify an entry for a test 
organization in our directory service, the lock is

  "/var/md/dsa/shadow/www-ldif-log/roles and 
organizations/20010107175816-luggage-org-0.lock"

  $dir = "/var/md/dsa/shadow/www-ldif-log";
  $objecttype = "roles and organizations";
  $objectid   = "luggage-org";

This is a specific example, but I'm sure other ways can have the 
same result -- basically serializing write access to individual 
objects, in this case, in our directory service.  Then, double 
submissions don't hurt anything.
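
To make the flow concrete, here is a rough sketch of how a handler might
use get_lock(); handle_update(), show_page() and apply_changes() are
illustrative names, not part of the code above:

sub handle_update {
  my($objecttype, $objectid, %changes) = @_;

  unless (get_lock($objecttype, $objectid)) {
    # Probably a double submission; the first request may already have
    # done the work, so ask the user to verify the data before retrying.
    return show_page("Your change may already have been applied. "
                   . "Please check the record before submitting again.");
  }

  apply_changes($objecttype, $objectid, %changes);
  return show_page("Change submitted.");
}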

Regarding the desire to not add code - never let down your guard 
when you are designing and programming.  Paranoid people should 
be inherently more secure.
+-
James Smith - [EMAIL PROTECTED] | http://www.jamesmith.com/
[EMAIL PROTECTED] | http://sourcegarden.org/
  [EMAIL PROTECTED]  | http://cis.tamu.edu/systems/opensystems/
+--



Re: getting rid of multiple identical http requests (bad users double-clicking)

2001-01-07 Thread James G Smith

James G Smith [EMAIL PROTECTED] wrote:
Stas Bekman [EMAIL PROTECTED] wrote:
On Fri, 5 Jan 2001, Gunther Birznieks wrote:

 Sorry if this solution has been mentioned before (I didn't read the earlier
 parts of this thread), and I know it's not as perfect as a server-side
 solution...

 But I've also seen a lot of people use JavaScript to accomplish the same
 thing as a quick fix. Few browsers don't support JavaScript, and of the
 small number that don't, the intersection of browsers without JavaScript
 and users with an itchy trigger finger is very small. The advantage is
 that it's faster than mangling your own server-side code with extra logic
 to prevent double posting.

Nothing stops users from saving the form and resubmitting it without the
JS code. This may reduce the number of attempts, but it's a partial
solution and won't stop determined users.

Nothing dependent on the client can be considered a fail-safe 
solution.

I encountered this problem with some PHP pages, but the idea is 
the same regardless of the language.

Not all pages have problems with double submissions.  For 
example, a page that provides read-only access to data usually 
can be retrieved multiple times without damaging the data.  It's 
submitting changes to data that can become the problem.  I ended 
up locking on some identifying characteristic of the object whose 
data is being modified.  If I can't get the lock, I send back a 
page to the user explaining that there probably was a double 
submission and everything might have gone ok.  The user would 
need to go in and check the data to make sure.

In pseudo-Perl code:

sub get_lock {
  my($objecttype, $objectid) = @_;

  # $dir and $nullfile are assumed to be set up elsewhere
  my($sec, $min, $hr, $md, $mon, $yr) = gmtime(time);
  my $lockfile = sprintf("%s/%04d%02d%02d%02d%02d%02d-%s", $objecttype,
                         $yr + 1900, $mon + 1, $md, $hr, $min, $sec, $objectid);

  my $r;
  for (my $n = 0; $n < 1 && !$r; $n++) {
    # link() fails if the target already exists, so creating the hard
    # link is an atomic test-and-set on the lock file.
    $r = link("$dir/$nullfile", "$dir/$lockfile-$n.lock");
  }

  return $r;
}

So, for example, if I am trying to modify an entry for a test 
organization in our directory service, the lock is

  "/var/md/dsa/shadow/www-ldif-log/roles and 
organizations/20010107175816-luggage-org-0.lock"

  $dir = "/var/md/dsa/shadow/www-ldif-log";
  $objecttype = "roles and organizations";
  $objectid   = "luggage-org";

I realized shortly after I sent this that I made a mistake...

The above code gives me a good filename for creating an LDIF to 
feed to ldapmodify.  To actually lock on an object, the code 
should be

sub get_lock {
  my($objecttype, $objectid) = @_;

  # $dir and $nullfile are assumed to be set up elsewhere
  my $lockfile = "$objecttype/$objectid.lock";
  return link("$dir/$nullfile", "$dir/$lockfile");
}

The resulting lockfile is

   "/var/md/dsa/shadow/www-ldif-log/roles and organizations/luggage-org.lock"
+-
James Smith - [EMAIL PROTECTED] | http://www.jamesmith.com/
[EMAIL PROTECTED] | http://sourcegarden.org/
  [EMAIL PROTECTED]  | http://cis.tamu.edu/systems/opensystems/
+--



Re: getting rid of multiple identical http requests (bad users double-clicking)

2001-01-04 Thread Randal L. Schwartz

 "Ed" == Ed Park [EMAIL PROTECTED] writes:

Ed> Has anyone else thought about this?

If you're generating the form on the fly (and who isn't, these days?),
just spit a serial number into a hidden field.  Then lock out two or
more submissions with the same serial number, with a 24-hour retention
of numbers you've generated.  That'll keep 'em from hitting "back" and
resubmitting too.

To keep DOS attacks at a minimum, it should be a cryptographically
secure MD5, to prevent others from lojacking your session.
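
A rough sketch of that scheme in Perl (the token store, a plain hash here,
and the function names are illustrative, not prescribed above; under
mod_perl the list of issued serials would live in a database or dbm file
shared by all server children):

use Digest::MD5 qw(md5_hex);

my %issued;                      # serial => time issued
my $secret = 'some server-side secret';

# When generating the form: embed a hard-to-guess serial number.
sub new_serial {
  my $serial = md5_hex($secret . $$ . time() . rand());
  $issued{$serial} = time();
  return qq{<input type="hidden" name="serial" value="$serial">};
}

# When handling the submission: accept each serial exactly once and
# forget anything older than 24 hours.
sub serial_ok {
  my($serial) = @_;
  delete $issued{$_} for grep { time() - $issued{$_} > 86400 } keys %issued;
  return defined $serial && exists $issued{$serial} ? delete $issued{$serial} : 0;
}

With a shared store, the same check also catches a duplicate landing on a
different server child, and expiring old entries gives the 24-hour
retention described above.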

-- 
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
[EMAIL PROTECTED] URL:http://www.stonehenge.com/merlyn/
Perl/Unix/security consulting, Technical writing, Comedy, etc. etc.
See PerlTraining.Stonehenge.com for onsite and open-enrollment Perl training!



Re: getting rid of multiple identical http requests (bad users double-clicking)

2001-01-04 Thread Gunther Birznieks

Sorry if this solution has been mentioned before (I didn't read the earlier
parts of this thread), and I know it's not as perfect as a server-side
solution...

But I've also seen a lot of people use JavaScript to accomplish the same
thing as a quick fix. Few browsers don't support JavaScript, and of the
small number that don't, the intersection of browsers without JavaScript
and users with an itchy trigger finger is very small. The advantage is
that it's faster than mangling your own server-side code with extra logic
to prevent double posting.

Add this to the top of the form:

 SCRIPT LANGUAGE="JavaScript"
 !--
 var clicks = 0;

 function submitOnce() {
 clicks ++;
 if (clicks  2) {
 return true;
 } else {
 // alert("You have already clicked the submit button. " + 
clicks + " clicks");
 return false;
 }
 }
 //--
 /SCRIPT

And then just hook the submitOnce() function into the form's submit event,
e.g. onSubmit="return submitOnce()" in the FORM tag.

At 05:26 PM 1/4/01 -0800, Randal L. Schwartz wrote:
  "Ed" == Ed Park [EMAIL PROTECTED] writes:

Ed> Has anyone else thought about this?

If you're generating the form on the fly (and who isn't, these days?),
just spit a serial number into a hidden field.  Then lock out two or
more submissions with the same serial number, with a 24-hour retention
of numbers you've generated.  That'll keep 'em from hitting "back" and
resubmitting too.

To keep DOS attacks at a minimum, it should be a cryptographically
secure MD5, to prevent others from lojacking your session.

--
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
[EMAIL PROTECTED] URL:http://www.stonehenge.com/merlyn/
Perl/Unix/security consulting, Technical writing, Comedy, etc. etc.
See PerlTraining.Stonehenge.com for onsite and open-enrollment Perl training!

__
Gunther Birznieks ([EMAIL PROTECTED])
eXtropia - The Web Technology Company
http://www.extropia.com/




Re: getting rid of multiple identical http requests (bad users double-clicking)

2001-01-04 Thread Les Mikesell


- Original Message -
From: "Ed Park" [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Thursday, January 04, 2001 6:52 PM
Subject: getting rid of multiple identical http requests (bad users
double-clicking)


 Does anyone out there have a clean, happy solution to the problem of users
 jamming on links & buttons? Analyzing our access logs, it is clear that it's
 relatively common for users to click 2, 3, 4+ times on a link if it doesn't
 come up right away. This is not good for the system for obvious reasons.

The best solution is to make the page come up right away...  If that isn't
possible, try to make at least something show up.  If your page consists
of a big table, the browser may be waiting for the closing tag to compute
the column widths before it can render anything.

 I can think of a few ways around this, but I was wondering if anyone else
 had come up with anything. Here are the avenues I'm exploring:
 1. Implementing JavaScript disabling on the client side so that links become
 'click-once' links.
 2. Implement an MD5 hash of the request and store it on the server (e.g. in
 a MySQL server). When a new request comes in, check the MySQL server to see
 whether it matches an existing request and disallow as necessary. There
 might be some sort of timeout mechanism here, e.g. don't allow identical
 requests within the span of the last 20 seconds.

This might be worthwhile to trap duplicate postings, but unless your page
requires a vast amount of server work you might as well just deliver the
page as go to this much trouble.
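
For what it's worth, a rough sketch of Ed's option 2 might look like the
following (the table name, columns and the 20-second window are
illustrative guesses, not something specified in the thread):

use Digest::MD5 qw(md5_hex);
use DBI;

# Assumes a table like:
#   CREATE TABLE recent_posts (digest CHAR(32) PRIMARY KEY, ts INT)
# Returns true if this exact request body has not been seen in the
# last 20 seconds.
sub first_time_seen {
  my($dbh, $request_body) = @_;
  my $digest = md5_hex($request_body);
  my $now    = time();

  # Expire old entries so the table stays small.
  $dbh->do("DELETE FROM recent_posts WHERE ts < ?", undef, $now - 20);

  # The primary key makes the INSERT fail if the same digest shows up
  # again inside the window, so the duplicate request is rejected.
  my $ok = eval {
    $dbh->do("INSERT INTO recent_posts (digest, ts) VALUES (?, ?)",
             undef, $digest, $now);
  };
  return $ok ? 1 : 0;
}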

  Les Mikesell
[EMAIL PROTECTED]