Published online by Cambridge University Press: 05 March 2004
Redundancy of lagged regressors in a conditionally heteroskedastic time series regression—solution.
A solution to the problem is to calculate the greatest lower bound for the asymptotic variance of GMM estimators based on the moment functions $h(i;\beta) = (y_t - \beta x_t)x_{t-i}$, $i = 0, 1, 2, \ldots$. To do so, we apply Hansen (1985). In the sequel, we denote by $h(i)$ the moment function $h(i;\beta_0)$ and write $e_t = y_t - \beta_0 x_t$, so that $h(i) = e_t x_{t-i}$. Let $L^2(h)$ be the linear space spanned by $\{h(i) : i = 0, 1, \ldots\}$; it consists of elements of the form

$$G = \sum_{i=0}^{\infty} \alpha_i h(i) = \Big(\sum_{i=0}^{\infty} \alpha_i x_{t-i}\Big) e_t$$
for any real coefficients $\alpha_i$. Note that the $h(i)$, $i = 0, 1, \ldots$, are martingale difference sequences. Moreover, they are linearly independent and in that sense are complete in $L^2(h)$.
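To see the martingale difference property, one can argue by iterated expectations; the display below is a minimal sketch, assuming (as is standard for this setup, though not restated here) that $x_t$ is predetermined with respect to a filtration $\mathcal{F}_{t-1}$ and that $e_t = \sigma_t \eta_t$ with $\sigma_t \in \mathcal{F}_{t-1}$ and $\eta_t$ independent of $\mathcal{F}_{t-1}$:

\begin{align*}
E\left[h(i) \mid \mathcal{F}_{t-1}\right]
  &= x_{t-i}\, E\left[e_t \mid \mathcal{F}_{t-1}\right] && (x_{t-i} \in \mathcal{F}_{t-1})\\
  &= x_{t-i}\, \sigma_t\, E[\eta_t] = 0 && (\sigma_t \in \mathcal{F}_{t-1},\ E[\eta_t] = 0).
\end{align*}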
We are looking for the solution of equation (4.8) of Hansen (1985). That is, we are looking for the element $G$ in $L^2(h)$ (if it exists) such that

$$E[G\, h(i)] = E\!\left[\frac{\partial h(i;\beta_0)}{\partial \beta}\right] = -E[x_t x_{t-i}], \qquad i = 0, 1, 2, \ldots. \eqno(1)$$
Equivalently, writing $G = \sum_{j=0}^{\infty} \alpha_j h(j)$, we are looking for constants $\alpha_j$ that solve

$$\sum_{j=0}^{\infty} \alpha_j E\left[e_t^2 x_{t-j} x_{t-i}\right] = -E[x_t x_{t-i}], \qquad i = 0, 1, 2, \ldots. \eqno(2)$$
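The left-hand side of (2) is obtained by expanding $E[G\,h(i)]$ term by term; as a one-line sketch, since $h(j)h(i) = e_t^2 x_{t-j} x_{t-i}$,

$$E[G\, h(i)] = \sum_{j=0}^{\infty} \alpha_j E[h(j)\, h(i)] = \sum_{j=0}^{\infty} \alpha_j E\left[e_t^2 x_{t-j} x_{t-i}\right].$$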
Note that the solution may not exist in $L^2(h)$, but there is always a solution in the closure of $L^2(h)$. If a solution in $L^2(h)$ exists, it is unique because $\{h(i) : i = 0, 1, \ldots\}$ is complete. Let $\alpha_j = 0$ for $j = 1, 2, \ldots$; equation (2) then becomes

$$\alpha_0\, E\left[e_t^2 x_t x_{t-i}\right] = -E[x_t x_{t-i}], \qquad i = 0, 1, 2, \ldots.$$
Using the fact that the $\eta_t$ are i.i.d. standard normal, we have, for a constant $c > 0$ that does not depend on $i$,

$$E\left[e_t^2 x_t x_{t-i}\right] = c\, E[x_t x_{t-i}], \qquad i = 0, 1, 2, \ldots.$$
Hence, equation (2) becomes

$$\alpha_0\, c\, E[x_t x_{t-i}] = -E[x_t x_{t-i}], \qquad i = 0, 1, 2, \ldots,$$
which is satisfied for

$$\alpha_0 = -\frac{1}{c}. \eqno(3)$$
Therefore, we have found a solution to (1), which is $G_0 = \alpha_0 x_t e_t$ with $\alpha_0$ as in (3). By Lemma 4.3 of Hansen (1985), the greatest lower bound is given by

$$\left(E[G_0 G_0']\right)^{-1} = \left(\alpha_0^2\, E\left[e_t^2 x_t^2\right]\right)^{-1}.$$
Note that $\alpha_0$ can be rewritten as

$$\alpha_0 = -\frac{E[x_t^2]}{E\left[e_t^2 x_t^2\right]},$$

by taking $i = 0$ in the display above, which gives $c = E[e_t^2 x_t^2]/E[x_t^2]$.
Therefore, the GMM efficiency bound is

$$\left(\alpha_0^2\, E\left[e_t^2 x_t^2\right]\right)^{-1} = \frac{E\left[e_t^2 x_t^2\right]}{\left(E[x_t^2]\right)^2},$$
which corresponds to the asymptotic variance of the OLS estimator.
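For completeness, the asymptotic variance of OLS can be read off from the standard sandwich argument; the following display is a brief sketch, assuming stationarity and ergodicity (implicit in the setup) so that a law of large numbers and a martingale central limit theorem apply:

$$\sqrt{T}\left(\hat{\beta}_{OLS} - \beta_0\right) = \left(\frac{1}{T}\sum_{t} x_t^2\right)^{-1} \frac{1}{\sqrt{T}}\sum_{t} x_t e_t \ \xrightarrow{\ d\ }\ N\!\left(0,\ \frac{E\left[e_t^2 x_t^2\right]}{\left(E[x_t^2]\right)^2}\right),$$

where the numerator obeys a central limit theorem because $x_t e_t = h(0)$ is a martingale difference sequence with variance $E[e_t^2 x_t^2]$.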
We have shown that the OLS estimator is not only as efficient as any GMM estimator that uses an arbitrary fixed number of instruments from $\{x_t, x_{t-1}, \ldots\}$ but also as efficient as any GMM estimator that uses infinitely many instruments from $\{x_t, x_{t-1}, \ldots\}$.
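As a purely numerical illustration of this redundancy (not part of the original solution), the following Python sketch simulates one concrete data-generating process consistent with the setup, namely $x_t = e_{t-1}$ with ARCH(1) errors and Gaussian innovations, and compares the Monte Carlo variance of OLS with that of a two-step GMM estimator that adds the lagged instrument $x_{t-1}$; the model, parameter values, and sample sizes are assumptions chosen for the illustration.

# Monte Carlo sketch: OLS vs. two-step GMM with one extra lagged instrument.
# Assumed DGP (an illustration, not the problem's exact specification):
#   e_t = sigma_t * eta_t,  sigma_t^2 = omega + a * e_{t-1}^2,  eta_t i.i.d. N(0,1)
#   x_t = e_{t-1} (predetermined regressor),  y_t = beta0 * x_t + e_t
import numpy as np

rng = np.random.default_rng(0)
beta0, omega, a = 1.0, 1.0, 0.5   # assumed parameter values (a < 3**-0.5 keeps E[e^4] finite)
T, R = 1_000, 2_000               # sample size and Monte Carlo replications (assumed)

def simulate(T, burn=200):
    n = T + burn + 1
    eta = rng.standard_normal(n)
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = np.sqrt(omega + a * e[t - 1] ** 2) * eta[t]   # ARCH(1) recursion
    x = e[:-1]                     # x_t = e_{t-1}
    y = beta0 * x + e[1:]          # y_t = beta0 * x_t + e_t
    return x[burn:], y[burn:]

def gmm_two_step(y, x, z):
    # Linear GMM for E[z_t (y_t - beta x_t)] = 0: identity weight, then optimal weight.
    my, mx = z.T @ y / len(y), z.T @ x / len(y)
    b1 = (mx @ my) / (mx @ mx)                    # first step, W = I
    u = y - b1 * x
    S = (z * u[:, None] ** 2).T @ z / len(y)      # heteroskedasticity-robust weight matrix
    W = np.linalg.inv(S)
    return (mx @ W @ my) / (mx @ W @ mx)          # second step

b_ols, b_gmm = np.empty(R), np.empty(R)
for r in range(R):
    x, y = simulate(T)
    b_ols[r] = (x @ y) / (x @ x)                  # OLS = GMM with the single instrument x_t
    z = np.column_stack([x[1:], x[:-1]])          # instruments (x_t, x_{t-1})
    b_gmm[r] = gmm_two_step(y[1:], x[1:], z)

print("Monte Carlo variance of sqrt(T)(b - beta0):")
print("  OLS :", T * b_ols.var())
print("  GMM :", T * b_gmm.var())  # should be close to OLS: x_{t-1} is redundant

In line with the result above, the extra instrument should buy essentially nothing: both Monte Carlo variances should approximate the bound $E[e_t^2 x_t^2]/(E[x_t^2])^2$ for this assumed model.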