Proof That the Intercept Estimator Is Consistent in the Simple Regression Model
In the simple regression model of equation (5.16) in the text, we showed under the first four Gauss-Markov assumptions that an estimator of the form of equation (5.17) is a consistent estimator of the slope $\beta_1$. Given such an estimator, define an estimator of $\beta_0$ by $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1\bar{X}$. Show that $\text{plim}\,\hat{\beta}_0 = \beta_0$.

From equation (5.17) we have
$$\hat{\beta}_1 = \frac{\sum_{i=1}^n (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^n (X_i - \bar{X})^2}$$
First, we can write $\hat{\beta}_0$ as
$$\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$$
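As a quick numerical illustration of these two formulas, the following sketch computes $\hat{\beta}_1$ and $\hat{\beta}_0$ on a simulated sample (the data-generating process, parameter values, and use of NumPy are illustrative assumptions, not part of the text):

```python
import numpy as np

# Illustrative data-generating process (assumed, not from the text):
# Y_i = beta0 + beta1 * X_i + u_i with beta0 = 2.0 and beta1 = 0.5.
rng = np.random.default_rng(0)
n = 1_000
beta0_true, beta1_true = 2.0, 0.5
X = rng.normal(loc=5.0, scale=2.0, size=n)
u = rng.normal(loc=0.0, scale=1.0, size=n)
Y = beta0_true + beta1_true * X + u

# Slope: sum of cross-deviations divided by sum of squared deviations of X.
beta1_hat = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
# Intercept: Y-bar minus the estimated slope times X-bar.
beta0_hat = Y.mean() - beta1_hat * X.mean()

print(beta1_hat, beta0_hat)  # should land near 0.5 and 2.0, respectively
```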
Next, since the population model is $Y_i = \beta_0 + \beta_1 X_i + u_i$, averaging over $i$ gives $\bar{Y} = \beta_0 + \beta_1 \bar{X} + \bar{u}$, where $\bar{u} = \frac{1}{n}\sum_{i=1}^n u_i$. Substituting this into the expression for $\hat{\beta}_0$, we have

$$\begin{aligned}\hat{\beta}_0 - \beta_0 &= (\bar{Y} - \hat{\beta}_1 \bar{X}) - \beta_0 \\ &= (\beta_0 + \beta_1 \bar{X} + \bar{u} - \hat{\beta}_1 \bar{X}) - \beta_0 \\ &= (\beta_1 - \hat{\beta}_1)\bar{X} + \bar{u}\end{aligned}$$
Under the Gauss-Markov assumptions for model (5.16), the observations are a random sample with $E(u_i) = 0$ and $Var(u_i) = \sigma^2$. The law of large numbers therefore gives

$$\bar{u} = \frac{1}{n}\sum_{i=1}^n u_i \xrightarrow{p} E(u_i) = 0, \qquad \bar{X} = \frac{1}{n}\sum_{i=1}^n X_i \xrightarrow{p} E(X_i) = \mu_X$$
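The following minimal sketch illustrates this law-of-large-numbers step (the distributions of $u_i$ and $X_i$ are assumed for illustration only): both sample averages settle near their expectations as $n$ grows.

```python
import numpy as np

# Law-of-large-numbers check for u-bar and X-bar at increasing sample sizes.
# The distributions below are illustrative assumptions, not from the text.
rng = np.random.default_rng(1)
for n in (100, 10_000, 1_000_000):
    u = rng.normal(loc=0.0, scale=1.0, size=n)  # E(u_i) = 0
    X = rng.normal(loc=5.0, scale=2.0, size=n)  # E(X_i) = 5
    print(n, u.mean(), X.mean())
# u.mean() drifts toward 0 and X.mean() toward 5 as n increases.
```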
To show that $\hat{\beta}_0$ is a consistent estimator of $\beta_0$, we need to show that $\hat{\beta}_0 - \beta_0 \xrightarrow{p} 0$. Because $\frac{1}{n}\sum_{i=1}^n(X_i - \bar{X})(Y_i - \bar{Y}) \xrightarrow{p} Cov(X,Y)$ and $\frac{1}{n}\sum_{i=1}^n(X_i - \bar{X})^2 \xrightarrow{p} Var(X) > 0$, the estimator in (5.17) satisfies $\hat{\beta}_1 \xrightarrow{p} Cov(X,Y)/Var(X) = \beta_1$; that is, $\hat{\beta}_1$ is consistent and $\beta_1 - \hat{\beta}_1 \xrightarrow{p} 0$. Applying Slutsky's theorem (the algebra of probability limits) to the decomposition above, together with $\bar{X} \xrightarrow{p} \mu_X$ and $\bar{u} \xrightarrow{p} 0$,

$$\text{plim}(\hat{\beta}_0 - \beta_0) = \text{plim}\left[(\beta_1 - \hat{\beta}_1)\bar{X}\right] + \text{plim}\,\bar{u} = 0 \cdot \mu_X + 0 = 0$$
Therefore $\hat{\beta}_0 - \beta_0 \xrightarrow{p} 0$, i.e. $\text{plim}\,\hat{\beta}_0 = \beta_0$, so $\hat{\beta}_0$ is a consistent estimator of $\beta_0$.
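To see the consistency result numerically, here is a minimal Monte Carlo sketch (the data-generating process, sample sizes, and replication count are all illustrative assumptions): the typical deviation of $\hat{\beta}_0$ from $\beta_0$ shrinks as $n$ grows.

```python
import numpy as np

# Monte Carlo check of consistency: for each sample size n, compute beta0_hat on
# many simulated samples and report its mean absolute deviation from beta0.
rng = np.random.default_rng(2)
beta0_true, beta1_true = 2.0, 0.5
reps = 500

for n in (50, 500, 5_000, 50_000):
    errors = []
    for _ in range(reps):
        X = rng.normal(loc=5.0, scale=2.0, size=n)
        u = rng.normal(loc=0.0, scale=1.0, size=n)
        Y = beta0_true + beta1_true * X + u
        b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
        b0 = Y.mean() - b1 * X.mean()
        errors.append(abs(b0 - beta0_true))
    # The average |beta0_hat - beta0| shrinks toward 0 as n increases,
    # which is what consistency predicts.
    print(n, np.mean(errors))
```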