Hi,
   
I am trying to simulate a Gilbert-Elliott two-state error model in 802.11, in
which the ratio of the time spent in the good state to the time spent in the
bad state is 10:1. I am using the following Tcl code:
   
proc tserr {} {
    # sojourn in the good (error-free) state: exponential, mean 10
    set rvgood [new RandomVariable/Exponential]
    $rvgood set avg_ 10.0
    # sojourn in the bad (error) state: exponential, mean 1
    set rvbad [new RandomVariable/Exponential]
    $rvbad set avg_ 1.0
    # two-state error model driven by the two random variables, in packet units
    set err [new ErrorModel/TwoState $rvgood $rvbad pkt]
    return $err
}
   
  # configure base-station and all wireless nodes
$ns_ node-config -adhocRouting $opt(adhocRouting) \
                 -llType $opt(ll) \
                   ...
                   ...
                 -IncomingErrProc tserr
  
 
I'm expecting the average packet loss rate (assuming 100% loss in the bad
state) to be roughly 9%, i.e. 1/(10+1), but that is not what I observe. I'm
using CBR traffic, and the simulation runs for 350 seconds. I'm wondering what
I did wrong. Could anyone help me out?
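
Roughly, this is how I would measure the observed loss rate at the end of the
run (just a sketch, assuming the CBR stream terminates at an Agent/LossMonitor
sink; sink_ is only a placeholder handle):

proc report-loss {} {
    global sink_
    # npkts_ / nlost_ are the LossMonitor's received / lost packet counters
    set nrecv [$sink_ set npkts_]
    set nlost [$sink_ set nlost_]
    if { $nrecv + $nlost > 0 } {
        puts "observed loss rate: [expr double($nlost) / ($nlost + $nrecv)]"
    }
}
$ns_ at 350.0 "report-loss"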
   
Also, how do I change the 100% loss in the bad state to (say) an 80% loss?
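
My first guess is that I would have to build the two-state model directly on
top of ErrorModel/MultiState, with an explicit error model (and rate) per
state, along the lines of the multi-state example in the ns manual. Something
like the sketch below, but I am not sure the arguments are right (the periods
and transition matrix are placeholders):

# good state: no loss; bad state: 80% loss
set egood [new ErrorModel/Uniform 0 pkt]
set ebad  [new ErrorModel/Uniform 0.8 pkt]
set states  [list $egood $ebad]
# average time spent in each state (placeholders for the 10:1 ratio)
set periods [list 10.0 1.0]
# transition matrix: good <-> bad
set transmx { {0 1}
              {1 0} }
set em [new ErrorModel/MultiState $states $periods $transmx \
            pkt time 2 [lindex $states 0]]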
   
And finally, in the base error model, does rate_ represent the fraction of
packets in error? That is, does rate_ = 0.2 mean that 20% of packets are
marked as corrupted?
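
For example, with the base model set up like this (following the simple error
model example in the ns manual; the drop target is just a placeholder), would
roughly 20% of packets be dropped?

set em [new ErrorModel]
$em unit pkt                             ;# count errors per packet
$em set rate_ 0.2                        ;# intended: 20% of packets in error
$em ranvar [new RandomVariable/Uniform]  ;# uniform draw compared against rate_
$em drop-target [new Agent/Null]         ;# placeholder drop target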
   
  Thanks,
  Samarth.

                