Hi, I have a question about selecting n, the number of time steps, in a binomial option pricing model. (I suppose my question is not strictly related to R.) Since larger values should be more accurate, everything I've read on the subject simply suggests using a value of n that is sufficiently large for your purposes, so I've been trying to work out what "sufficiently large" means for mine. Is there any rule of thumb for choosing n?
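For what it's worth, one thing I've experimented with myself (I don't know whether this is standard practice) is simply doubling n until successive prices agree to within a tolerance. A rough sketch, using the same fOptions call and toy parameters as my example below; the helper names (price_at, choose_n) and the default tolerance are just my own choices, and given the oscillation I describe below, comparing only two refinements could in principle be fooled:

    require(fOptions)

    # Price the option on an n-step CRR tree (same parameters as below).
    price_at <- function(n) {
      CRRBinomialTreeOption(TypeFlag = "ca", S = 50, X = 50, Time = 1/12,
                            r = 0.02, b = 0.02, sigma = 0.18, n = n)@price
    }

    # My own heuristic stopping rule: double n until the price changes
    # by less than tol between successive refinements, up to a cap.
    choose_n <- function(tol = 1e-4, n = 25, n_max = 5000) {
      p <- price_at(n)
      while (n < n_max) {
        p2 <- price_at(2 * n)
        if (abs(p2 - p) < tol) return(2 * n)  # converged to within tol
        n <- 2 * n
        p <- p2
      }
      n  # hit the cap without converging
    }

    choose_n()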
When using the fOptions package's CRRBinomialTreeOption function with varying n, the price oscillates back and forth as it converges. This is easy to see in a plot:

    require(fOptions)

    x <- function(n) {
      CRRBinomialTreeOption(TypeFlag = "ca", S = 50, X = 50, Time = 1/12,
                            r = 0.02, b = 0.02, sigma = 0.18, n = n)@price
    }

    y <- sapply(1:100, x)
    mean(y)  # 1.079693
    plot(y)

Given this oscillation, my question is whether it would be "better" to average the prices from two smaller, consecutive values of n rather than compute one price with a large n. Or is there some other, better way? For example, with n = 1000 or n = 1001 the two prices are within five hundredths of a cent of each other, but either calculation is extremely slow:

    x(1000)                         # 1.077408
    x(1001)                         # 1.077926
    mean(sapply(1000:1001, x))      # 1.077667

By comparison, taking the mean of the n = 40 and n = 41 prices gives a value very close to the middle of that range, yet is much faster to compute:

    mean(sapply(40:41, x))          # 1.0776

It seems that averaging two smaller, consecutive values of n is essentially as accurate as using a large value of n, and far faster. I was hoping someone might have some insight into why this might or might not be a valid approach.

Thanks.

James
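P.S. One sanity check that occurred to me (assuming my reasoning here is right): with b = r there is no dividend yield, so the American call should never be exercised early and should price identically to the Black-Scholes European call, which GBSOption computes in closed form. That gives an exact benchmark to measure both estimates against. A sketch, reusing the x() defined above:

    require(fOptions)

    # Closed-form Black-Scholes price; with b = r (no dividends) the
    # American call has no early-exercise premium, so this should be
    # the exact value the tree is converging to.
    bs <- GBSOption(TypeFlag = "c", S = 50, X = 50, Time = 1/12,
                    r = 0.02, b = 0.02, sigma = 0.18)@price

    # Absolute error of the averaged small-n estimate and the
    # averaged large-n estimate, measured against the benchmark.
    abs(mean(sapply(40:41, x)) - bs)
    abs(mean(sapply(1000:1001, x)) - bs)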