This question is not gretl-specific, but the answers might show off some gretl features, and I hope it will be of general interest.

My dependent variable is the quantity demanded of a certain good, an hourly time series extending over two years. It exhibits strong seasonality, both by hour of the day and season of the year; it has an upward trend; and it's clearly affected by various measures of weather (temperature, humidity, wind speed). Plain OLS produces quite a decent fit, but by inspecting the loglikelihood-for-level figure produced by gretl (via the Jacobian) I can see that taking the log of the dependent variable gives a better fit. All fine.
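For anyone unfamiliar with the comparison gretl reports: the log-likelihood of the log-dependent-variable model is made comparable with the levels model by subtracting the Jacobian term, sum(log y). A minimal sketch in Python with made-up data (not gretl code; the data-generating process and variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(1, 10, n)
# a truly log-linear DGP, so the logs model should win
y = np.exp(0.5 + 0.2 * x + rng.normal(0, 0.3, n))
X = np.column_stack([np.ones(n), x])

def ols_loglik(dep, X):
    """Gaussian log-likelihood of an OLS regression of dep on X."""
    b, *_ = np.linalg.lstsq(X, dep, rcond=None)
    u = dep - X @ b
    s2 = u @ u / len(dep)
    return -0.5 * len(dep) * (np.log(2 * np.pi * s2) + 1)

ll_levels = ols_loglik(y, X)             # model with y in levels
ll_logs = ols_loglik(np.log(y), X)       # model with log(y)
# Jacobian adjustment: express the log-model loglik on the levels scale
ll_logs_for_levels = ll_logs - np.log(y).sum()
print(ll_levels, ll_logs_for_levels)     # the larger value indicates the better fit
```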

However, total demand is the sum of demand from two classes of consumer -- call them A and B -- and I'm wondering whether a better fit can be obtained by summing the fitted values from separate regressions whose dependent variables are the demands from consumer classes A and B respectively. (Note: a Chow test in dummy-variable mode is not applicable here, since the unit of observation is the hour, not the transaction.)

My thought was: compute two SSRs in levels, using (restricted) the exponentiated fitted values from the overall model and (unrestricted) the sum of the exponentiated fitted values from the two consumer-class models, then calculate an F test based on the difference of SSRs in the usual way.
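To make the proposed computation concrete, here is a sketch in Python rather than hansl, with simulated data; the choice of q (the number of extra parameters gained by splitting into two equations) and the unrestricted degrees of freedom are my assumptions, and are part of what I'm asking about:

```python
import numpy as np

rng = np.random.default_rng(7)
n, k = 400, 3                       # k regressors, incl. constant, per equation
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
# simulated demand from the two consumer classes, each log-linear in X
y_a = np.exp(X @ np.array([1.0, 0.3, -0.2]) + rng.normal(0, 0.2, n))
y_b = np.exp(X @ np.array([0.5, -0.1, 0.4]) + rng.normal(0, 0.2, n))
y = y_a + y_b                       # total demand

def exp_fit(dep, X):
    """Exponentiated fitted values from OLS of log(dep) on X."""
    b, *_ = np.linalg.lstsq(X, np.log(dep), rcond=None)
    return np.exp(X @ b)

# restricted: one model for total demand; unrestricted: sum of two class models
ssr_r = np.sum((y - exp_fit(y, X)) ** 2)
ssr_u = np.sum((y - exp_fit(y_a, X) - exp_fit(y_b, X)) ** 2)

q = k                               # extra parameters from the split (assumed)
F = ((ssr_r - ssr_u) / q) / (ssr_u / (n - 2 * k))
print(ssr_r, ssr_u, F)
```

Note that neither set of exponentiated fitted values minimizes the levels-scale SSR, which is one reason I'm unsure the usual F distribution applies.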

Questions: Does this sound valid? Is there a better way of doing it?
[I'm aware of debate over how best to produce predictions of levels
from log-linear regression, but I'm not sure quite how it applies in this case.]
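For reference, one standard option in that debate is Duan's (1983) smearing estimator: multiply the naive exp(fitted) by the sample mean of the exponentiated log-scale residuals. A toy sketch (my own simulated data); whether it matters for the test above is unclear to me, since the same correction would apply to both the restricted and unrestricted fits:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x = rng.uniform(0, 5, n)
y = np.exp(1.0 + 0.4 * x + rng.normal(0, 0.5, n))
X = np.column_stack([np.ones(n), x])

b, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
u = np.log(y) - X @ b                # log-scale residuals (mean zero, given the constant)
naive = np.exp(X @ b)                # biased downward for E[y|x]
smear = naive * np.exp(u).mean()     # Duan's smearing correction
print(naive.mean(), smear.mean())
```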

Allin Cottrell
_______________________________________________
Gretl-users mailing list -- [email protected]
To unsubscribe send an email to [email protected]
Website: 
https://gretlml.univpm.it/postorius/lists/gretl-users.gretlml.univpm.it/
