and $\Sigma_0^{-1}$ is the variance-covariance matrix of the random vector $\sqrt{n}(\hat{\beta}_n - \beta_0)$, as presented in Dupuy and Mesbah (2004).
The proof of the above theorem is straightforward, though technically cumbersome, and follows steps similar to those in the proof of Theorem 1. It involves showing that the score functions associated with the likelihoods defined in (1.7) and (4.1) are asymptotically jointly normal, which can be achieved by extending the argument used to establish the asymptotic normality of the score function for the likelihood in (1.7).
Now, let $\hat{D}_W$ be a consistent estimator of the matrix $D_W$. Such an estimator exists for $\Sigma_0^{-1}$, as shown by Dupuy and Mesbah (2002), and therefore naturally exists for $\hat{\Sigma}_0^{-1}$ and $\hat{D}_W$. Hence, we propose the following statistic for testing model (1.7):
From Theorem 2, under the null hypothesis of a correct model, $Q_W$ has an asymptotic chi-squared distribution with 2 degrees of freedom. Hence, we reject model (1.7) for large values of the test statistic.
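As a minimal sketch of this decision rule, the snippet below evaluates a chi-squared test with 2 degrees of freedom, for which the survival function has the closed form $P(X > q) = e^{-q/2}$. The value assigned to `q_w` is purely hypothetical, standing in for a computed value of the statistic; it is not taken from the source.

```python
import math

def chi2_sf_2df(q):
    """Survival function of the chi-squared distribution with 2 degrees
    of freedom: P(X > q) = exp(-q/2)."""
    return math.exp(-q / 2.0)

# Hypothetical value of the test statistic Q_W (illustration only).
q_w = 7.3

p_value = chi2_sf_2df(q_w)

# 5% critical value of chi-squared with 2 df: -2 * ln(0.05) ~= 5.991
critical_value = -2.0 * math.log(0.05)

# Rejecting for q_w > critical_value is equivalent to p_value < 0.05.
reject = q_w > critical_value
print(f"Q_W = {q_w}, p-value = {p_value:.4f}, reject model: {reject}")
```

The closed-form survival function is specific to 2 degrees of freedom; for other degrees of freedom one would use a library routine such as `scipy.stats.chi2.sf`.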