Dear list,
I am trying to compare the fitted sigma of the realGARCH model with my realized
volatility input. However, these two time series have completely different
values: my realizedVol input has values of around 0.004-0.010, while the
fitted sigma output has values of around 0.001-0.003, and I am trying to
understand why.
Below I walk through my code:
My model specification and fit are below, where:

return_c2casxts - daily close-to-close return series (an xts object)
RVar_BN2008 - daily realized variance measure (using the realized kernel of Barndorff-Nielsen et al., 2008)

spec.realGARCH21 <- ugarchspec(variance.model = list(model = "realGARCH",
                                                     garchOrder = c(2, 1)),
                               mean.model = list(armaOrder = c(1, 1)),
                               distribution.model = "std")
garch_model <- ugarchfit(spec.realGARCH21, return_c2casxts,
                         solver = "hybrid",
                         realizedVol = sqrt(RVar_BN2008))
RVolfitted <- garch_model@fit$sigma
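For concreteness, the comparison I am making can be sketched as below. This is only a sketch: it assumes the garch_model fit and the RVar_BN2008 series from above, and it uses rugarch's sigma() extractor, which as far as I understand returns the same conditional sigma series as garch_model@fit$sigma:

```r
library(rugarch)  # sigma() extractor for the fitted model
library(xts)      # merge/plot for aligned time series

rv_input <- sqrt(RVar_BN2008)   # realized vol, as passed to ugarchfit()
sig_fit  <- sigma(garch_model)  # fitted conditional sigma (xts, same dates)

# Align the two series on their common dates and compare their scales
both <- merge(sig_fit, rv_input)
colnames(both) <- c("fitted_sigma", "realized_vol")
summary(coredata(both))  # summary statistics of each column
plot(both)               # visual comparison over the full sample
```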

Now, comparing RVolfitted and sqrt(RVar_BN2008) below, I get completely different
values throughout the series (3500 days in total):

RVolfitted:        0.0020383691 0.0018783530 0.0020688548 0.0015357892 0.0016052453 0.0016191145
sqrt(RVar_BN2008): 0.006596078  0.008683373  0.004344591  0.006393453  0.006223041  0.008463817
Is this a conceptual misunderstanding on my part about what sigma signifies here?
How can I obtain fitted values corresponding to the realizedVol input?
Best Regards,
Duco

_______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-sig-finance
