Statistics - PS2
This post collects what I learned and studied in the "Probability and Statistics (MATH230)" course. The full series is available at Probability and Statistics 🎲
This post works through the problems posed in the "Introduction to Linear Regression" post.
Theorem.
The sum of residuals is zero.
\[\sum_{i=1}^n e_i = \sum_{i=1}^n (y_i - \hat{y}_i) = 0\]
proof.
The least-squares estimates $b_0$ and $b_1$ minimize $\sum_{i=1}^n (y_i - b_0 - b_1 x_i)^2$, so the partial derivative with respect to $b_0$ vanishes at the solution:
\[-2 \sum_{i=1}^n (y_i - b_0 - b_1 x_i) = -2 \sum_{i=1}^n (y_i - \hat{y}_i) = 0\]
Dividing by $-2$ gives $\sum_{i=1}^n e_i = 0$.
$\blacksquare$
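As a quick numerical sanity check (not part of the original post), the sketch below fits a least-squares line to made-up data with `numpy.polyfit` and confirms that the residuals sum to zero up to floating-point error.

```python
# Illustrative check of Theorem 1 on made-up data: after a least-squares fit,
# the residuals e_i = y_i - y_hat_i sum to (numerically) zero.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1, b0 = np.polyfit(x, y, deg=1)   # slope and intercept of the fitted line
y_hat = b0 + b1 * x                # fitted values
e = y - y_hat                      # residuals

print(e.sum())                     # ~0 up to floating-point error
assert np.isclose(e.sum(), 0.0)
```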
Theorem.
The sum of the products of the residuals and the $x_i$ values is zero.
\[\sum_{i=1}^n x_i e_i = \sum_{i=1}^n x_i (y_i - \hat{y}_i) = 0\]
proof.
Setting the partial derivative of $\sum_{i=1}^n (y_i - b_0 - b_1 x_i)^2$ with respect to $b_1$ to zero at the least-squares solution gives
\[-2 \sum_{i=1}^n x_i (y_i - b_0 - b_1 x_i) = -2 \sum_{i=1}^n x_i (y_i - \hat{y}_i) = 0,\]
so $\sum_{i=1}^n x_i e_i = 0$.
$\blacksquare$
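The same kind of check works here. The sketch below (again with made-up data) computes $b_1 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}$ and $b_0 = \bar{y} - b_1 \bar{x}$ from the closed-form least-squares formulas and verifies that $\sum_i x_i e_i$ is numerically zero.

```python
# Illustrative check of Theorem 2: compute b1 and b0 from the closed-form
# least-squares formulas, then verify that sum(x_i * e_i) is numerically zero.
import numpy as np

x = np.array([0.5, 1.5, 2.5, 3.5, 4.5, 5.5])
y = np.array([1.0, 2.2, 2.9, 4.1, 4.8, 6.3])

x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

e = y - (b0 + b1 * x)              # residuals of the fitted line
print(np.sum(x * e))               # ~0 up to floating-point error
assert np.isclose(np.sum(x * e), 0.0)
```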
Theorem.
The total sum of squares decomposes into the regression and error sums of squares:
\[\sum_{i=1}^n (y_i - \bar{y})^2 = \sum_{i=1}^n (\hat{y}_i - \bar{y})^2 + \sum_{i=1}^n (y_i - \hat{y}_i)^2\]
proof.
(Spoiler) The proof uses the two propositions proved above!
\[\begin{aligned} \sum_{i=1}^n (y_i - \bar{y})^2 &= \sum_{i=1}^n (y_i - \hat{y}_i + \hat{y}_i - \bar{y})^2 \\ &= \sum_{i=1}^n \left((y_i - \hat{y}_i) + (\hat{y}_i - \bar{y})\right)^2 \\ &= \sum_{i=1}^n (y_i - \hat{y}_i)^2 + 2 \sum_{i=1}^n (y_i - \hat{y}_i)(\hat{y}_i - \bar{y}) + \sum_{i=1}^n (\hat{y}_i - \bar{y})^2 \end{aligned}\]Now isolate the middle term of the expression above. Substituting $\hat{y}_i = b_0 + b_1 x_i$ together with the intercept formula $b_0 = \bar{y} - b_1 \bar{x}$,
\[\begin{aligned} \sum_{i=1}^n (y_i - \hat{y}_i)(\hat{y}_i - \bar{y}) &= \sum_{i=1}^n (y_i - \hat{y}_i)(b_0 + b_1 x_i - \bar{y}) \\ &= \sum_{i=1}^n (y_i - \hat{y}_i)((\cancel{\bar{y}} - b_1 \bar{x}) + b_1 x_i - \cancel{\bar{y}}) \\ &= \sum_{i=1}^n (y_i - \hat{y}_i) \cdot b_1 (x_i - \bar{x}) \\ &= b_1 \cdot \left( \cancelto{0}{\sum_{i=1}^n (y_i - \hat{y}_i) x_i} - \bar{x} \cdot \cancelto{0}{\sum_{i=1}^n (y_i - \hat{y}_i)} \right) \\ &= 0 \end{aligned}\]Since the cross term vanishes, the decomposition follows:
\[\sum_{i=1}^n (y_i - \bar{y})^2 = \sum_{i=1}^n (y_i - \hat{y}_i)^2 + \sum_{i=1}^n (\hat{y}_i - \bar{y})^2\]
$\blacksquare$
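Finally, a small numerical check (illustrative data only, not from the post) of the decomposition just proved: the total sum of squares equals the regression plus error sums of squares up to floating-point error.

```python
# Illustrative check of the sum-of-squares decomposition SST = SSR + SSE
# for a least-squares line fitted to made-up data.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 2.8, 3.1, 4.9, 5.2, 6.8])

b1, b0 = np.polyfit(x, y, deg=1)
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)  # regression sum of squares
sse = np.sum((y - y_hat) ** 2)         # error (residual) sum of squares

print(sst, ssr + sse)                  # the two agree numerically
assert np.isclose(sst, ssr + sse)
```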