Estimation of the Parameters of a Single Equation in a Complete System of Stochastic Equations
Open Access
- 1 March 1949
- journal article
- Published by Institute of Mathematical Statistics in The Annals of Mathematical Statistics
- Vol. 20 (1), 46-63
- https://doi.org/10.1214/aoms/1177730090
Abstract
A method is given for estimating the coefficients of a single equation in a complete system of linear stochastic equations (see expression (2.1)), provided that a number of the coefficients of the selected equation are known to be zero. Under the assumptions that all variables in the system are observed and that the disturbances in the equations of the system are normally distributed, point estimates are derived from the regressions of the jointly dependent variables on the predetermined variables (Theorem 1). The vector of the estimates of the coefficients of the jointly dependent variables is the characteristic vector of a matrix involving the regression coefficients and the estimate of the covariance matrix of the residuals from the regression functions. The vector corresponding to the smallest characteristic root is taken. An efficient method of computing these estimates is given in Section 7. The asymptotic theory of these estimates is given in a following paper [2]. When the predetermined variables can be considered as fixed, confidence regions for the coefficients can be obtained on the basis of small sample theory (Theorem 3). A statistical test for the hypothesis of over-identification of the single equation can be based on the characteristic root associated with the vector of point estimates (Theorem 2) or on the expression for the small sample confidence region (Theorem 4). This hypothesis is equivalent to the hypothesis that the coefficients assumed to be zero actually are zero. The asymptotic distribution of the criterion is shown in a following paper [2] to be that of $\chi^2$.
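The central computation the abstract describes — taking the characteristic vector associated with the smallest characteristic root — amounts to solving a generalized eigenvalue problem. The following is a minimal sketch, not the paper's own Section 7 procedure; the matrices `A` and `B` are hypothetical stand-ins for the moment matrices built from the regression coefficients and the estimated residual covariance matrix.

```python
import numpy as np

def smallest_root_vector(A, B):
    """Solve the generalized eigenproblem A v = lam * B v and return
    (lam_min, v) for the smallest characteristic root.

    A, B : square arrays standing in for the matrices formed from the
    regression coefficients and the residual covariance estimate
    (illustrative placeholders, not the paper's notation).
    """
    # Reduce A v = lam B v to the ordinary eigenproblem (B^-1 A) v = lam v.
    vals, vecs = np.linalg.eig(np.linalg.solve(B, A))
    i = np.argmin(vals.real)  # the vector of the smallest root is taken
    lam, v = vals[i].real, vecs[:, i].real
    return lam, v / np.linalg.norm(v)  # normalized coefficient vector
```

For example, with `A = diag(2, 5)` and `B = I`, the smallest root is 2 and the associated vector is the first coordinate axis. The test of over-identification described in the abstract would then be based on this smallest root, whose criterion is asymptotically $\chi^2$-distributed.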