Statistics - PS2
This post organizes what I learned and studied in the "Probability and Statistics (MATH230)" course. You can find the whole series at Probability and Statistics 🎲
This post works through the theorems stated in the "Introduction to Linear Regression" post.
Theorem.
The sum of residuals is zero.
\[\sum_{i=1}^n e_i = \sum_{i=1}^n (y_i - \hat{y}_i) = 0\]

proof.

The least-squares estimates $b_0$, $b_1$ minimize $SSE = \sum_{i=1}^n (y_i - b_0 - b_1 x_i)^2$. Setting the partial derivative with respect to $b_0$ to zero gives

\[\frac{\partial SSE}{\partial b_0} = -2 \sum_{i=1}^n (y_i - b_0 - b_1 x_i) = -2 \sum_{i=1}^n e_i = 0\]

so $\sum_{i=1}^n e_i = 0$. $\blacksquare$
Theorem.
The sum of the products of the residuals and the $x_i$ is zero.
\[\sum_{i=1}^n x_i e_i = \sum_{i=1}^n x_i (y_i - \hat{y}_i) = 0\]

proof.

The least-squares estimates $b_0$, $b_1$ minimize $SSE = \sum_{i=1}^n (y_i - b_0 - b_1 x_i)^2$. Setting the partial derivative with respect to $b_1$ to zero gives

\[\frac{\partial SSE}{\partial b_1} = -2 \sum_{i=1}^n x_i (y_i - b_0 - b_1 x_i) = -2 \sum_{i=1}^n x_i e_i = 0\]

so $\sum_{i=1}^n x_i e_i = 0$. $\blacksquare$
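The two identities above can be checked numerically. Below is a minimal sketch in plain Python, assuming the usual least-squares formulas $b_1 = S_{xy}/S_{xx}$ and $b_0 = \bar{y} - b_1 \bar{x}$; the data points are made up for illustration:

```python
# Numerically verify sum(e_i) = 0 and sum(x_i * e_i) = 0
# for the least-squares line y-hat = b0 + b1 * x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]          # illustrative data
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Least-squares slope and intercept: b1 = Sxy / Sxx, b0 = y_bar - b1 * x_bar
s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
s_xx = sum((x - x_bar) ** 2 for x in xs)
b1 = s_xy / s_xx
b0 = y_bar - b1 * x_bar

# Residuals e_i = y_i - y-hat_i
residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

print(abs(sum(residuals)) < 1e-9)                             # True
print(abs(sum(x * e for x, e in zip(xs, residuals))) < 1e-9)  # True
```

Both sums vanish only because $(b_0, b_1)$ are the least-squares estimates; an arbitrary line through the data would not satisfy them.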
Theorem.
The total sum of squares decomposes into the regression and error sums of squares:
\[\sum_{i=1}^n (y_i - \bar{y})^2 = \sum_{i=1}^n (\hat{y}_i - \bar{y})^2 + \sum_{i=1}^n (y_i - \hat{y}_i)^2\]

that is, $SST = SSR + SSE$.

proof.
(Spoiler) The proof uses the two theorems proved above!
\[\begin{aligned} \sum_{i=1}^n (y_i - \bar{y})^2 &= \sum_{i=1}^n (y_i - \hat{y}_i + \hat{y}_i - \bar{y})^2 \\ &= \sum_{i=1}^n \left((y_i - \hat{y}_i) + (\hat{y}_i - \bar{y})\right)^2 \\ &= \sum_{i=1}^n (y_i - \hat{y}_i)^2 + 2 \sum_{i=1}^n (y_i - \hat{y}_i)(\hat{y}_i - \bar{y}) + \sum_{i=1}^n (\hat{y}_i - \bar{y})^2 \\ \end{aligned}\]

Now isolate the middle (cross) term. Substituting $\hat{y}_i = b_0 + b_1 x_i$ together with $b_0 = \bar{y} - b_1 \bar{x}$,
\[\begin{aligned} \sum_{i=1}^n (y_i - \hat{y}_i)(\hat{y}_i - \bar{y}) &= \sum_{i=1}^n (y_i - \hat{y}_i)(b_0 + b_1 x_i - \bar{y}) \\ &= \sum_{i=1}^n (y_i - \hat{y}_i)((\cancel{\bar{y}} - b_1 \bar{x}) + b_1 x_i - \cancel{\bar{y}}) \\ &= \sum_{i=1}^n (y_i - \hat{y}_i) \cdot b_1 (x_i - \bar{x}) \\ &= b_1 \cdot \left( \cancelto{0}{\sum_{i=1}^n (y_i - \hat{y}_i) x_i} - \bar{x} \cdot \cancelto{0}{\sum_{i=1}^n (y_i - \hat{y}_i)} \right) \\ &= 0 \end{aligned}\]

Since the cross term vanishes, only the other two terms remain:

\[\sum_{i=1}^n (y_i - \bar{y})^2 = \sum_{i=1}^n (y_i - \hat{y}_i)^2 + \sum_{i=1}^n (\hat{y}_i - \bar{y})^2\]

$\blacksquare$
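The decomposition $SST = SSE + SSR$ can also be checked numerically. A self-contained sketch in plain Python, using the least-squares formulas $b_1 = S_{xy}/S_{xx}$, $b_0 = \bar{y} - b_1 \bar{x}$ on made-up data:

```python
# Numerically verify SST = SSE + SSR for a least-squares fit.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]          # illustrative data
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Least-squares fit: b1 = Sxy / Sxx, b0 = y_bar - b1 * x_bar
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
     sum((x - x_bar) ** 2 for x in xs)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * x for x in xs]

sst = sum((y - y_bar) ** 2 for y in ys)                 # total sum of squares
sse = sum((y - yh) ** 2 for y, yh in zip(ys, y_hat))    # error sum of squares
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)            # regression sum of squares

print(abs(sst - (sse + ssr)) < 1e-9)  # True
```

The identity holds only for the least-squares fit, because the proof above relies on both residual identities, which characterize that fit.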