-------- Original Message --------
> Date: Mon, 25 Apr 2011 17:21:17 -0400 (EDT)
> From: Allin Cottrell <cottrell(a)wfu.edu>
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Subject: Re: [Gretl-users] Activating HAC does not work

> On Mon, 25 Apr 2011, Artur Tarassow wrote:
> 
> > I am just estimating some VAR models and would like to use robust
> > standard errors. I am using the following lines to set up HAC...
> 
> You're right, these won't work to produce HAC for a VAR. It's not
> exactly a bug, but a semi-deliberate decision ;-)
> 
> That is, some time ago we replaced equation-by-equation estimation
> of VARs by a matrix method that does the whole thing in one go. At
> that time I rebuilt the HC variance estimator for the new method
> but I didn't bother rebuilding the HAC estimator. The reason
> (other than laziness) was that you'd generally expect a VAR to
> include enough lags to make HAC redundant. (Stock and Watson, for
> example, include several VARs in their undergraduate textbook and
> they always use a robust variance estimator, but they never use
> HAC for VARs: I asked them why not, and that's the answer they
> gave me.)
> 
> Anyway, it's easy enough to re-enable HAC for VARs if anyone
> really wants it. But if I do so, what should the default be?
> Should VARs be treated like regular models on time-series data
> with regard to the --robust option (that is, HAC unless you "set
> force_hc on")? Or vice versa (with a new VAR-specific "set"
> variable, "force_hac")?
> 
> What do people think?
> 

I tend to think it should be _possible_ to use HAC with VARs for demonstration 
purposes, even if it may not be wise to rely on it for real applications.

The robust default should probably be the "wise" one, i.e. HC but not HAC. 
However, there may also be a case for treating all time-series models alike, as 
you mention.

fwiw,
sven
