I'm trying to produce one-day ahead volatility forecasts for Bitcoin with
Realized GARCH(1,1) using the rugarch package in R. The realized variance
(data.xts$rv5) is aggregated at a 5-minute frequency, and the returns
(data.xts$ret) are close-to-close. Here is the spec:

rgarch.spec <- ugarchspec(mean.model = list(armaOrder = c(0, 0),
                                            include.mean = FALSE),
                          variance.model = list(model = 'realGARCH',
                                                garchOrder = c(1, 1)),
                          distribution.model = 'norm')

rgarchroll <- ugarchroll(spec = rgarch.spec,
                         data = data.xts$ret,
                         n.ahead = 1,
                         forecast.length = forecast_len,
                         refit.every = 5,
                         solver = 'hybrid',
                         realizedVol = data.xts$rv5,
                         VaR.alpha = c(0.01, 0.05, 0.10))

realized_vol <- sqrt(tail(data.xts$rv5, forecast_len))
rgarch.prediction_vol <- rgarchroll@forecast$density$Sigma

A plot of the results can be found here: https://i.stack.imgur.com/XAA3r.png

As you can see, the predicted volatility is consistently higher than the
realized volatility, and consequently the VaR predictions are badly off.
A standard GARCH(1,1) model fit to the same return data works fine.
What could be causing this?
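One thing worth checking before blaming the model: a sketch of a scale
diagnostic, using simulated stand-ins for data.xts$ret and data.xts$rv5
(substitute the real series). Close-to-close returns include the overnight
move, which 5-minute RV typically does not, and a percent-vs-decimal unit
mismatch between returns and rv5 would also push the realGARCH sigma
systematically above sqrt(rv5).

```r
# Stand-ins for the real series in data.xts (assumption: same scale intended).
set.seed(1)
ret <- rnorm(1000, sd = 0.04)   # stand-in for data.xts$ret (close-to-close)
rv5 <- rep(0.04^2, 1000)        # stand-in for data.xts$rv5 (5-min realized variance)

# If returns and RV are on the same scale, this ratio should be near 1.
# A ratio around 1e4 points to percent returns vs decimal RV (or vice versa);
# a moderate ratio above 1 is consistent with overnight variance being present
# in the returns but absent from the intraday RV.
var(ret) / mean(rv5)
```

If the ratio is materially above 1 on the real data, rescaling rv5 (or using
open-to-close returns) before fitting would be the first thing to try.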


_______________________________________________
R-SIG-Finance@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-sig-finance