Gamma Distribution
This post summarizes what I learned and studied in the "Probability and Statistics (MATH230)" course. You can find the full series of posts at Probability and Statistics.
Series: Continuous Probability Distributions
Gamma Function
Definition. Gamma Function
The <Gamma Function> $\Gamma(\alpha): (0, \infty) \rightarrow (0, \infty)$ is defined as
\[\Gamma(\alpha) = \int^{\infty}_0 t^{\alpha - 1} e^{-t} dt \quad \text{for} \; \alpha > 0\]
The Gamma function, introduced by Euler around 1730, is one of the most famous functions in mathematics. It arose from the attempt to extend the <factorial> $n!$, originally defined only for integers, to the real and complex numbers. In other words, it can be regarded as a generalized version of the <factorial>.
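As a quick sanity check on this definition, one can evaluate the improper integral numerically and compare it against a library implementation. A minimal sketch in Python (the test values of $\alpha$ are arbitrary):

```python
import math
from scipy.integrate import quad

def gamma_by_integral(alpha: float) -> float:
    """Evaluate the defining integral of the Gamma function numerically."""
    value, _err = quad(lambda t: t ** (alpha - 1) * math.exp(-t), 0, math.inf)
    return value

# compare the numerical integral with math.gamma at a few arbitrary points
for alpha in (1.0, 2.5, 5.0):
    print(alpha, gamma_by_integral(alpha), math.gamma(alpha))
```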
Remark.
1. base case
\[\begin{aligned} \Gamma(1) &= \int^{\infty}_0 t^{0} e^{-t} dt = \int^{\infty}_0 e^{-t} dt \\ &= \left.- e^{-t} \right]^{\infty}_0 = 1 \end{aligned}\]
2. successive case
\[\begin{aligned} \Gamma(\alpha + 1) &= \int^{\infty}_0 t^{\alpha} e^{-t} dt \\ &= \cancel{\left. t^{\alpha} (- e^{-t}) \right]^{\infty}_0} + \alpha \int^{\infty}_0 t^{\alpha - 1} e^{-t} dt \\ &= \alpha \Gamma(\alpha) \end{aligned}\]
3. factorial
\[\begin{aligned} \Gamma(n) &= (n-1) \cdot \Gamma(n-1) = (n-1)(n-2) \cdot \Gamma(n-2) = \cdots \\ &= \left((n-1)(n-2) \cdots 1\right) \cdot \Gamma(1) \\ &= (n-1)! \end{aligned}\]
4. (special case) normal distribution
\[\begin{aligned} \Gamma(1/2) &= \int^{\infty}_0 t^{-\frac{1}{2}} e^{-t} dt \\ &= \int^{\infty}_0 \frac{e^{-t}}{\sqrt{t}} dt \end{aligned}\]
Here, substitute $t = \dfrac{x^2}{2}$. Then $dt = x \; dx \iff \dfrac{dt}{\sqrt{2t}} = dx$, so
\[\begin{aligned} \Gamma(1/2) &= \int^{\infty}_0 \frac{e^{-t}}{\sqrt{t}} dt \\ &= \sqrt{2} \int^{\infty}_0 e^{-\frac{x^2}{2}} dx \\ &= \sqrt{2} \cdot \sqrt{2\pi} \cdot \left( \frac{1}{\sqrt{2\pi}} \int^{\infty}_0 e^{-\frac{x^2}{2}} dx \right) \\ &= \sqrt{2} \cdot \sqrt{2\pi} \cdot 0.5 = \sqrt{\pi} \end{aligned}\]
(The bracketed factor is $P(Z > 0) = 0.5$ for a standard normal $Z$, which is how the normal distribution enters.)
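All four remarks are easy to confirm numerically. A minimal sketch using `scipy.special.gamma` (the test values below are arbitrary):

```python
import math
from scipy.special import gamma

# 1. base case: Gamma(1) = 1
print(gamma(1.0))

# 2. successive case: Gamma(alpha + 1) = alpha * Gamma(alpha)
for alpha in (0.3, 1.7, 4.2):
    print(gamma(alpha + 1), alpha * gamma(alpha))

# 3. factorial: Gamma(n) = (n - 1)!
for n in range(1, 6):
    print(gamma(n), math.factorial(n - 1))

# 4. special case: Gamma(1/2) = sqrt(pi)
print(gamma(0.5), math.sqrt(math.pi))
```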
Gamma Distribution
Definition. Gamma Distribution
Let $\alpha > 0$ and $\beta > 0$. We say that $X$ has a <Gamma Distribution> with a shape parameter $\alpha$ and a scale parameter $\beta$, if its pdf is given by
\[f(x; \alpha, \beta) = \begin{cases} C_{\alpha, \beta} \cdot x^{\alpha-1} e^{-\frac{x}{\beta}} & \text{for } x > 0 \\ \quad 0 & \text{else} \end{cases}\]
Here, the coefficient $C_{\alpha, \beta}$ is chosen so that
\[C_{\alpha, \beta} \cdot \int^{\infty}_0 x^{\alpha - 1} e^{-\frac{x}{\beta}} \; dx = 1\]
holds, i.e. so that the pdf integrates to one. Working this out,
\[C_{\alpha, \beta} = \frac{1}{\displaystyle \int^{\infty}_0 x^{\alpha - 1} e^{-\frac{x}{\beta}} \; dx} = \frac{1}{\Gamma(\alpha) \cdot \beta^{\alpha}}\]
(Indeed, the substitution $u = x/\beta$ gives $\int^{\infty}_0 x^{\alpha - 1} e^{-\frac{x}{\beta}} \; dx = \beta^{\alpha} \int^{\infty}_0 u^{\alpha - 1} e^{-u} \; du = \Gamma(\alpha) \cdot \beta^{\alpha}$.)
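The closed form of $C_{\alpha, \beta}$ can also be verified numerically by computing the integral directly. A small sketch (the values of $\alpha$ and $\beta$ are arbitrary):

```python
import math
from scipy.integrate import quad
from scipy.special import gamma

alpha, beta = 2.5, 1.7  # arbitrary shape and scale parameters

# numerically integrate x^(alpha - 1) * exp(-x / beta) over (0, inf)
integral, _err = quad(lambda x: x ** (alpha - 1) * math.exp(-x / beta), 0, math.inf)

print(integral)                       # the normalizing integral
print(gamma(alpha) * beta ** alpha)   # closed form Gamma(alpha) * beta^alpha
print(1.0 / integral)                 # and its reciprocal is C_{alpha, beta}
```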
Rewriting the Gamma distribution with this constant,
\[\text{Gamma}(x; \alpha, \beta) = \frac{1}{\Gamma(\alpha) \beta^{\alpha}} \cdot x^{\alpha - 1} e^{-\frac{x}{\beta}} \quad \text{for } x > 0\]
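This pdf agrees with SciPy's Gamma distribution when $\alpha$ is passed as the shape parameter `a` and $\beta$ as `scale`. A minimal sketch comparing the formula above with `scipy.stats.gamma.pdf` (parameter values and evaluation points are arbitrary):

```python
import numpy as np
from scipy.special import gamma as gamma_fn
from scipy.stats import gamma

alpha, beta = 3.0, 2.0          # arbitrary parameters
x = np.linspace(0.1, 20.0, 50)  # arbitrary evaluation points

# the pdf exactly as written above
manual = x ** (alpha - 1) * np.exp(-x / beta) / (gamma_fn(alpha) * beta ** alpha)

# SciPy's parameterization: shape a = alpha, scale = beta
library = gamma.pdf(x, a=alpha, scale=beta)

print(np.allclose(manual, library))  # -> True
```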
Remarks.
1. $\text{Gamma}(1, \beta) \overset{D}{=} \text{EXP}(\beta)$
If $\alpha = 1$, then
\[C_{1, \beta} = \frac{1}{\Gamma(1) \beta} = \frac{1}{\beta}\]
so the pdf becomes
\[f(x) = \frac{1}{\beta} e^{-\frac{x}{\beta}}\]
Therefore, $\text{Gamma}(1, \beta) \overset{D}{=} \text{EXP}(\beta)$; this is checked numerically in the sketch after these remarks.
2. $\text{Gamma}(n, \beta)$; Generalization of <Exponential Distribution>
If $\alpha = n$, then $X \sim \text{Gamma}(n, \beta)$ can be used to model the amount of time until $n$ events occur.
In fact, $X$ can be written as $X = X_1 + \cdots + X_n$, where the $X_i$'s are independent $\text{EXP}(\beta)$ RVs.
For example, this says that the distribution of the time it takes for $3$ events to occur is $\text{Gamma}(3, \beta)$! In this sense, the Gamma distribution can be viewed as a generalization of the <Exponential Distribution>.
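Both remarks lend themselves to a numerical check: for $\alpha = 1$ the Gamma pdf should coincide with the exponential pdf, and for $\alpha = n$ the sum of $n$ independent exponential waiting times should follow $\text{Gamma}(n, \beta)$. A minimal simulation sketch (the values of $\beta$, $n$, and the sample size are arbitrary):

```python
import numpy as np
from scipy import stats

beta = 2.0                      # arbitrary scale parameter
x = np.linspace(0.0, 15.0, 200)

# Remark 1: Gamma(1, beta) has the same pdf as EXP(beta)
print(np.allclose(stats.gamma.pdf(x, a=1, scale=beta),
                  stats.expon.pdf(x, scale=beta)))       # -> True

# Remark 2: X = X_1 + ... + X_n with X_i iid EXP(beta) is Gamma(n, beta)
rng = np.random.default_rng(0)
n = 3                           # arbitrary number of events
waits = rng.exponential(scale=beta, size=(100_000, n))   # waiting times between events
total = waits.sum(axis=1)                                # time until the n-th event

print(total.mean(), n * beta)        # sample mean vs n * beta
print(total.var(), n * beta ** 2)    # sample variance vs n * beta^2
print(stats.kstest(total, "gamma", args=(n, 0, beta)))   # distributional check
```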
Theorem.
If $X \sim \text{Gamma}(\alpha, \beta)$, then
- $E[X] = \alpha \beta$
- $\text{Var}(X) = \alpha \beta^2$
Proof.
Using <Remark 2> for the Gamma function, $E[X]$ works out as follows.
\[\begin{aligned} E[X] &= \int^{\infty}_0 x f(x) dx = C_{\alpha, \beta} \int^{\infty}_0 x^{\alpha} e^{-x/\beta} dx \\ &= C_{\alpha, \beta} \cdot \frac{1}{C_{\alpha+1, \beta}} = \frac{\Gamma(\alpha+1) \beta^{\alpha+1}}{\Gamma(\alpha) \beta^{\alpha}} \\ &= \frac{\Gamma(\alpha+1)}{\Gamma(\alpha)} \beta = \alpha \beta \end{aligned}\]
For the variance, we also need $E[X^2]$.
\[\begin{aligned} E[X^2] &= \int^{\infty}_0 x^2 f(x) dx \\ &= C_{\alpha, \beta} \int^{\infty}_0 x^{\alpha + 1} e^{-x/\beta} dx \\ &= C_{\alpha, \beta} \cdot \left( \frac{1}{C_{\alpha+2, \beta}} \cancelto{1}{\int^{\infty}_0 C_{\alpha+2, \beta} \; x^{\alpha + 1} e^{-x/\beta} dx} \right) \\ &= C_{\alpha, \beta} \cdot \frac{1}{C_{\alpha+2, \beta}} \\ &= \frac{\Gamma(\alpha + 2)}{\Gamma(\alpha)} \beta^2 \\ &= \alpha (\alpha + 1) \beta^2 \end{aligned}\]
Therefore,
\[\begin{aligned} \text{Var}(X) &= E[X^2] - (E[X])^2 \\ &= (\alpha^2 + \alpha) \beta^2 - \alpha^2 \beta^2 \\ &= \alpha \beta^2 \end{aligned}\]
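Both moment formulas can be confirmed numerically by integrating against the pdf. A minimal sketch (parameter values are arbitrary):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma

alpha, beta = 4.5, 1.3  # arbitrary parameters

# E[X] and E[X^2] by numerical integration against the Gamma pdf
mean, _ = quad(lambda x: x * gamma.pdf(x, a=alpha, scale=beta), 0, np.inf)
second, _ = quad(lambda x: x ** 2 * gamma.pdf(x, a=alpha, scale=beta), 0, np.inf)

print(mean, alpha * beta)                     # E[X]   vs alpha * beta
print(second - mean ** 2, alpha * beta ** 2)  # Var(X) vs alpha * beta^2
```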
Relation to Poisson Process
Let $N(t)$ be a <Poisson process> with rate $\lambda$. Let $X$ be the time to the $n$-th event in the <Poisson process>.
Claim: $X \sim \text{Gamma}(n, \beta)$ where $\beta = 1/\lambda$
The $n$-th event occurs after time $t$ if and only if fewer than $n$ events have occurred by time $t$, so $P(X > t) = P(N(t) < n)$. Since $N(t) \sim \text{POI}(\lambda t)$,
\[\begin{aligned} P(N(t) < n) &= \sum^{n-1}_{k=0} P(N(t) = k) \\ &= \sum^{n-1}_{k=0} e^{-\lambda t} \frac{(\lambda t)^k}{k!} \end{aligned}\]
The expression above determines the cdf of $X$ via $P(X \le t) = 1 - P(N(t) < n)$, so let us differentiate it to derive the pdf of $X$.
\[\begin{aligned} \frac{d}{dt} P(X \le t) &= - \frac{d}{dt} P(X > t) \\ &= - \left( \sum^{n-1}_{k=0} (-\lambda) e^{-\lambda t} \frac{(\lambda t)^k}{k!} + \sum^{n-1}_{k=1} \lambda e^{-\lambda t} \frac{(\lambda t)^{(k-1)}}{(k-1)!}\right) \\ &= \lambda e^{-\lambda t} \cdot \left( \sum^{n-1}_{k=0} \frac{(\lambda t)^k}{k!} - \sum^{n-1}_{k=1} \frac{(\lambda t)^{(k-1)}}{(k-1)!} \right) \\ &= \lambda e^{-\lambda t} \frac{(\lambda t)^{(n-1)}}{(n-1)!} \\ &= \frac{\lambda^n}{(n-1)!} \cdot t^{n-1} e^{-\lambda t} \\ &= \frac{\lambda^n}{\Gamma(n)} \cdot t^{n-1} e^{-\lambda t} \\ &= \frac{1}{\Gamma(n) \beta^n} \cdot t^{n-1} e^{-t/\beta} \\ &= C_{n, \beta} \cdot t^{n-1} e^{-t/\beta} \\ &= f(t; n, \beta) \end{aligned}\]
That is, $X \sim \text{Gamma}(n, \beta)$. $\blacksquare$
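The identity driving this proof, $P(X > t) = P(N(t) < n)$, can also be checked directly: the survival function of $\text{Gamma}(n, 1/\lambda)$ should equal the Poisson cdf evaluated at $n - 1$. A minimal sketch (the values of $\lambda$, $n$, and the time grid are arbitrary):

```python
import numpy as np
from scipy.stats import gamma, poisson

lam, n = 1.5, 4                 # arbitrary rate and event index
beta = 1.0 / lam
t = np.linspace(0.1, 10.0, 50)  # arbitrary time grid

# P(X > t) for X ~ Gamma(n, beta): the survival function
surv = gamma.sf(t, a=n, scale=beta)

# P(N(t) < n) = P(N(t) <= n - 1) for N(t) ~ POI(lambda * t)
fewer = poisson.cdf(n - 1, mu=lam * t)

print(np.allclose(surv, fewer))  # -> True
```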
The next post covers the <Chi-square distribution>, a notable special case of the Gamma distribution, together with the <Beta distribution> and the <Log-normal distribution>.
Chi-square, Beta and Log-normal Distribution