On Wednesday, April 25, 2007, 8:42 am, Varjü Tamás wrote:
> Dear Greg!
>
> The reason I did the test, and why I detailed the result, is that I
> could not reproduce any of the nasty things which can happen without
> locking.  I know that everyone suggests using the lock, but following
> your argument the test should have had a different outcome.
>
> Any idea how to improve the test so it reveals the potential data
> loss?  It would convince me beyond "to be on the safe side".
>
> Kanya

Your example, I think, features two processes accessing the same file. A
typical web application may have 2,000 to 2 million processes going at the
file at the same time ... something like a search engine, for example.

To test for or produce a race condition or overwriting, put the script
under a real-world load.

Fork 20 or 30 instances of your script at the same time and let them have a
go at the file, then look at the results and see if it's what you think it
should be. I bet it won't be. :)
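
Here is a minimal sketch of that kind of stress test. The file name and the
worker/round counts are placeholders, and the flock() line is commented out
so you can watch the race happen first, then uncomment it and watch it go
away:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $file    = 'counter.txt';    # placeholder test file
    my $workers = 30;
    my $rounds  = 100;

    # start the counter at zero
    open my $init, '>', $file or die "open: $!";
    print $init "0\n";
    close $init;

    for (1 .. $workers) {
        defined(my $pid = fork()) or die "fork: $!";
        next if $pid;               # parent keeps forking
        for (1 .. $rounds) {
            open my $fh, '+<', $file or die "open: $!";
            # flock($fh, LOCK_EX) or die "flock: $!";  # uncomment to close the race
            my $n = <$fh>;          # read the current count
            seek $fh, 0, 0;
            print $fh $n + 1, "\n"; # write it back, incremented
            close $fh;              # closing releases the lock, if held
        }
        exit 0;
    }
    wait() for 1 .. $workers;       # reap the children

    open my $fh, '<', $file or die "open: $!";
    print "expected ", $workers * $rounds, ", got ", scalar <$fh>;

Without the flock() the final count comes up short on almost every run,
because two children read the same value before either writes it back.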


If you are certain that no more than 2 processes will ever access the file,
then feel free to forgo file locking. I have personally seen this happen
with as few as three processes accessing the same file... There is a reason
everyone suggests using file locking: we have all "been there and done
that". It's a common mistake new programmers make.

Ask yourself "why do database programs have row locking?" It's the same
thing.
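
For completeness, the usual locking idiom is tiny. A sketch, where
'shared.log' is just a placeholder name:

    use Fcntl qw(:flock);

    open my $fh, '>>', 'shared.log' or die "open: $!";
    flock($fh, LOCK_EX) or die "flock: $!";   # block until we hold an exclusive lock
    print $fh "one whole record\n";
    close $fh;                                # close releases the lock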

Greg
 
