Transformations of Random Variables - 1
This post is a summary of what I learned and studied in the "Probability and Statistics (MATH230)" course. The full series of posts can be found at Probability and Statistics 🎲
In this chapter, "Ch07. Functions of Random Variables", we take an RV $X$, apply a function $f(x)$ to it to define a new RV $Y = f(X)$, and study the distribution of this new RV $Y$. In other words, we find the pdf and cdf of $f(X)$.
We will work from easy cases up to more involved ones, looking first at the case where $f(x)$ is a 1-1 function and then at the case where $f(x)$ is not 1-1. In the latter part we also look at the <MGF; Moment Generating Function> $M_X(t)$, a tool that makes it easy to compute the moments of an RV.
1-1 Transformation
Discrete Case
Example.
Let $X$ be a RV with $P(X = \pm 1) = 1/2$.
Let $Y = 3X - 5$, find the pmf of $Y$.
\[P(Y = y) = \begin{cases} 1/2 & y=-2 \\ 1/2 & y=-8 \\ \; 0 & \text{else} \end{cases}\]Let's unpack the computation above a bit. We have
\[\begin{aligned} P(Y = y) &= P(3X - 5 = y)\\ &= P(f(X) = y) \\ &= P(X = f^{-1}(y)) \end{aligned}\]In other words, using the inverse function $f^{-1}$, we can easily derive the pmf of $Y$ from the pmf of $X$!
\[P(Y = y) = P(X = f^{-1}(y))\]Theorem. Discrete Case
1. Supp. $X$ has pmf $f_X (x)$.
Let $Y=g(X)$ where $g$ is 1-1 function with the inverse $x = g^{-1}(y)$.
Then,
\[f_Y (y) = f_X (g^{-1}(y))\]2. Supp. $(X_1, X_2)$ has joint pmf $f_{X_1, X_2} (x_1, x_2)$.
Let $Y_1 := u_1 (X_1, X_2)$ and $Y_2 := u_2 (X_1, X_2)$, with the inverse transformation $X_1 = w_1(Y_1, Y_2)$ and $X_2 = w_2 (Y_1, Y_2)$.
Then,
\[f_{Y_1, Y_2} (y_1, y_2) = f_{X_1, X_2} \left( w_1(y_1, y_2), w_2(y_1, y_2) \right)\]In other words, we take the equations that define $Y_1$, $Y_2$ in terms of $X_1$, $X_2$, solve them to express $X_1$, $X_2$ in terms of $Y_1$, $Y_2$ (these are $w_1$, $w_2$), and use that to write down the pmf. Think of it as nothing more than a formal statement of the obvious approach.
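As a quick sanity check of part 1, here is a minimal Python sketch (the dict-based pmf and the helper name `transform_pmf` are just illustrative, not anything from the lecture) that pushes the pmf of $X$ through the 1-1 map $g(x) = 3x - 5$ from the example above.

```python
# Sketch: pmf of Y = g(X) for a 1-1 map g, via f_Y(y) = f_X(g^{-1}(y)).
# A pmf is represented as a plain dict {value: probability}.

def transform_pmf(pmf_x, g):
    """Push a pmf through a 1-1 function g; each support point x becomes g(x)."""
    return {g(x): p for x, p in pmf_x.items()}

pmf_x = {-1: 0.5, 1: 0.5}                      # P(X = ±1) = 1/2
pmf_y = transform_pmf(pmf_x, lambda x: 3 * x - 5)

print(pmf_y)                                   # {-8: 0.5, -2: 0.5}
```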
Example.
Let $X \sim \text{Poi}(\lambda)$ and $Y \sim \text{Poi}(\mu)$, and $X \perp Y$.
Find the distribution of $X + Y$.
For instance, to compute $P(X+Y = 5)$ we would find the probabilities $P(X = 0, Y=5)$, $P(X=1, Y=4)$, …, $P(X=5, Y=0)$ and add them up. Based on this idea, let's expand the expression as follows!
\[\begin{aligned} P(X+Y = n) &= \sum^{n}_{k=0} P(X=n-k, Y = k) \\ &= \sum^{n}_{k=0} P(X=n-k) P(Y = k) \quad (\text{independence}) \\ &= \sum^{n}_{k=0} e^{-\lambda} \frac{\lambda^{n-k}}{(n-k)!} \cdot e^{-\mu} \frac{\mu^k}{k!} \\ &= \frac{e^{-(\lambda + \mu)}}{n!} \sum^{n}_{k=0} \frac{n!}{(n-k)!k!} \lambda^{n-k} \mu^k \\ &= \frac{e^{-(\lambda + \mu)}}{n!} \cdot (\lambda + \mu)^n = e^{-(\lambda + \mu)} \frac{(\lambda + \mu)^n}{n!} \end{aligned}\]In other words, $P(X+Y = n)$ is exactly the pmf of $\text{Poi}(\lambda+\mu)$! $\blacksquare$
Alternatively, we can use the two-variable theorem above: we first find the joint pmf of $(X, X+Y)$, and then find the marginal pmf of $X+Y$.
Let $U = X$, $V = X+Y$, then $X = U$, $Y = V - U$.
Therefore, the joint pmf of $U$ and $V$ is
\[\begin{aligned} f_{U, V} (u, v) &= f_{X, Y} (u, v-u) \\ &= f_X (u) f_Y(v-u) \quad (\text{independence}) \\ &= e^{-\lambda} \frac{\lambda^u}{u!} \cdot e^{-\mu} \frac{\mu^{v-u}}{(v-u)!} \\ &= e^{-(\lambda + \mu)} \frac{\lambda^u}{u!} \frac{\mu^{v-u}}{(v-u)!} \end{aligned}\]Now that we have the joint pmf of $U$ and $V$, let's find the marginal pmf of $V$.
\[\begin{aligned} f_V (v) &= \sum^{v}_{u=0} f_{U, V} (u, v) \\ &= \sum^{v}_{u=0} e^{-(\lambda+\mu)} \frac{\lambda^u}{u!} \frac{\mu^{v-u}}{(v-u)!} \\ &= e^{-(\lambda+\mu)} \sum^{v}_{u=0} \frac{\lambda^u}{u!} \frac{\mu^{v-u}}{(v-u)!} \\ &= \frac{e^{-(\lambda+\mu)}}{v!} \sum^{v}_{u=0} \frac{v!}{u!(v-u)!} \lambda^u \mu^{v-u} \\ &= \frac{e^{-(\lambda+\mu)}}{v!} \cdot (\mu + \lambda)^v \\ &= e^{-(\lambda+\mu)} \frac{(\mu + \lambda)^v}{v!} \end{aligned}\]That is, the pmf of $V = X+Y$ is again that of $\text{Poi}(\lambda + \mu)$! $\blacksquare$
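A small numerical cross-check of this result, assuming `numpy` and `scipy` are available (the values of $\lambda$, $\mu$ are arbitrary): the convolution sum above should match the $\text{Poi}(\lambda+\mu)$ pmf term by term.

```python
import numpy as np
from scipy.stats import poisson

lam, mu = 2.0, 3.5
n = np.arange(0, 30)

# Convolution: P(X+Y = n) = sum_{k=0}^{n} P(X = n-k) P(Y = k)
conv = np.array([sum(poisson.pmf(v - k, lam) * poisson.pmf(k, mu) for k in range(v + 1))
                 for v in n])

print(np.allclose(conv, poisson.pmf(n, lam + mu)))   # True
```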
Continuous Case
Theorem.
Let $X$ be a continuous RV with pdf $f_X (x)$.
Let $Y := g(X)$ where $g$ is 1-1 with inverse $x = h(y)$.
Then,
\[f_Y (y) = f_X (h(y)) \cdot \left| h'(y) \right|\]🔥 Note that in the continuous case an extra factor $\left| h'(y) \right|$ gets multiplied in! 🔥
Example.
1. Let $X \sim N(\mu, \sigma^2)$ and let $Y := \dfrac{X - \mu}{\sigma}$.
Then,
\[\begin{aligned} f_Y (y) &= f_X (h(y)) \cdot \left| h'(y) \right| \\ &= \frac{1}{\sqrt{2\pi} \sigma} \cdot \exp \left({- \frac{(h(y)-\mu)^2}{2\sigma^2}}\right) \cdot \left| \sigma \right|\\ &= \frac{1}{\sqrt{2\pi} \cancel{\sigma}} \cdot \exp \left( - \frac{(\cancel{\sigma} y + \cancel{\mu} - \cancel{\mu})^2}{2\cancel{\sigma^2}} \right) \cdot \cancel{\sigma} \\ &= \frac{1}{\sqrt{2\pi}} \cdot \exp \left( - y^2 / 2\right) \end{aligned}\]2. Let $X \sim \text{Gamma}(\alpha, 1)$, and let $Y := \beta X$.
Claim: $Y \sim \text{Gamma}(\alpha, \beta)$
\[y = \beta x \iff x = \frac{y}{\beta} = h(y)\]and
\[f_X (x) = \frac{1}{\Gamma(\alpha)} x^{\alpha - 1} e^{-x}\]then,
\[\begin{aligned} f_Y (y) &= f_X (h(y)) \cdot \left| h'(y) \right| \\ &= \frac{1}{\Gamma(\alpha)} h(y)^{\alpha - 1} e^{-h(y)} \cdot \left| \frac{1}{\beta} \right| \\ &= \frac{1}{\Gamma(\alpha) \beta} \cdot \left( \frac{y}{\beta}\right)^{\alpha-1} e^{-y/\beta} \\ &= \frac{1}{\Gamma(\alpha) \beta^{\alpha}} \cdot y^{\alpha-1} e^{-y/\beta} \end{aligned}\]Therefore, starting from $X \sim \text{Gamma}(\alpha, 1)$ and applying the transformation $Y = \beta X$, we obtain $Y \sim \text{Gamma}(\alpha, \beta)$.
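A quick check of this scaling property, assuming `scipy` is available; note that the $\text{Gamma}(\alpha, \beta)$ parametrization used here treats $\beta$ as a scale parameter, which corresponds to `scale=beta` in `scipy.stats.gamma` (the values of $\alpha$, $\beta$ below are arbitrary).

```python
import numpy as np
from scipy.stats import gamma

alpha, beta = 2.5, 1.7
y = np.linspace(0.01, 15, 200)

# Transformation formula: f_Y(y) = f_X(y/beta) * (1/beta), with X ~ Gamma(alpha, 1)
f_y = gamma.pdf(y / beta, a=alpha) / beta

# Compare with the Gamma(alpha, beta) pdf in the scale parametrization
print(np.allclose(f_y, gamma.pdf(y, a=alpha, scale=beta)))   # True
```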
3. Let $\theta \sim \text{Unif}(-\pi/2, \pi/2)$, and let $X = \tan \theta$.
Find the pdf of $X$.
\[h(x) = \arctan x \quad \text{and} \quad h'(x) = \frac{1}{1+x^2}\]Deriving the distribution of $X$ according to the rule above,
\[\begin{aligned} f_X (x) &= f_\theta (h(x)) \cdot \left| h'(x) \right| \\ &= \cancelto{\frac{1}{\pi}}{f_\theta (\arctan x)} \cdot \frac{1}{1+x^2} \quad (\text{Uniform distribution})\\ &= \frac{1}{\pi} \frac{1}{1+x^2} \end{aligned}\]For reference, a distribution of this form is called the <Cauchy Distribution>.
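Again a one-line numerical check, assuming `scipy`: evaluating the transformation formula should reproduce the standard Cauchy pdf.

```python
import numpy as np
from scipy.stats import uniform, cauchy

x = np.linspace(-10, 10, 201)

# f_X(x) = f_theta(arctan x) * |d/dx arctan x|, with theta ~ Unif(-pi/2, pi/2)
f_theta = uniform(loc=-np.pi / 2, scale=np.pi).pdf
f_x = f_theta(np.arctan(x)) / (1 + x ** 2)

print(np.allclose(f_x, cauchy.pdf(x)))   # True: standard Cauchy pdf
```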
Theorem.
Let $X$ be a RV with cdf $F_X (x)$ which is strictly increasing.
Let $U \sim \text{Unif}(0, 1)$. Then,
1. $Y := F_X^{-1}(U)$ has the same distribution as $X$.
2. $Z := F_X(X)$ has the same distribution as $U$.
Personally, the statement didn't quite click for me at first, so I went through the proofs first.
Proof. 1
Since $F_X$ is strictly increasing, $F_X^{-1}$ is well-defined and, for any $y$,
\[P(Y \le y) = P(F_X^{-1}(U) \le y) = P(U \le F_X(y)) = F_X(y)\]Therefore, $Y = F_X^{-1}(U)$ has the same distribution as $X$. $\blacksquare$
Proof. 2
For $0 < z < 1$,
\[P(Z \le z) = P(F_X(X) \le z) = P(X \le F_X^{-1}(z)) = F_X(F_X^{-1}(z)) = z\]Therefore, $Z = F_X(X)$ follows the uniform distribution $\text{Unif}(0, 1)$. $\blacksquare$
Example.
Let $X \sim \text{Exp}(\lambda)$, whose cdf is $F_X(x) = 1 - e^{-\lambda x}$.
This means we can use $F_X^{-1}$ and $U$ to construct an RV that has the same distribution as $X$!!
By above theorem,
\[F_X^{-1}(U) = \frac{-\ln (1-U)}{\lambda} \sim X\]To put it another way: we want a transformation that takes $U$ to $X$, and simply setting it to $F_X^{-1}$ gives us such a transform from $U$ to $X$ very easily!! 🤩
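This is exactly the idea behind inverse-transform sampling: feed uniform random numbers through $F_X^{-1}$ to get samples with the distribution of $X$. A minimal sketch (the value of $\lambda$, the seed, and the sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0

# Inverse-transform sampling: U ~ Unif(0,1)  ->  F_X^{-1}(U) = -ln(1-U)/lam ~ Exp(lam)
u = rng.uniform(size=100_000)
x = -np.log(1 - u) / lam

# Sample mean and variance should be close to 1/lam and 1/lam^2
print(x.mean(), x.var())   # ≈ 0.5, ≈ 0.25
```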
Theorem. Continuous case - Two Random Variables
Let $(X, Y) \mapsto \left( u(X, Y), v(X, Y) \right)$ with the inverse $(U, V) \mapsto \left(w_1(U, V), w_2(U, V)\right)$.
If $(X, Y)$ has joint pdf $f_{X, Y}(x, y)$, then $(U, V)$ has joint pdf
\[f_{U, V} (u, v) = f_{X, Y} \left( w_1(u, v), w_2(u, v) \right) \cdot \left| J \right|\]where $\left| J \right|$ is the absolute value of the determinant of the Jacobian matrix
\[J = \begin{pmatrix} \frac{\partial w_1}{\partial u} & \frac{\partial w_1}{\partial v} \\ \frac{\partial w_2}{\partial u} & \frac{\partial w_2}{\partial v} \\ \end{pmatrix}\]For reference, the Jacobian $J$ is the same object that appears when changing variables in an integral.
If we set $x = w_1(u, v)$, $y = w_2(u, v)$ and change the variables of integration to $u$ and $v$, then
\[\int \int_{X, Y} f(x, y) \, dxdy = \int \int_{U, V} f\left( w_1 (u, v), w_2(u, v) \right) \left| J \right| \, dudv\]So looking closely, the pdf $f_{U, V}(u, v)$ of $(U, V)$ is easily seen to be exactly the integrand on the right-hand side of the equation above!!
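The mechanics of the theorem (solve for the inverse map, then take the determinant of its Jacobian) can also be done symbolically. A small `sympy` sketch, using the map $(u, v) = (x + y, x - y)$ that appears in the next example:

```python
import sympy as sp

x, y, u, v = sp.symbols('x y u v', real=True)

# Forward map (u, v) = (x + y, x - y); solve for the inverse (w1, w2)
sol = sp.solve([sp.Eq(u, x + y), sp.Eq(v, x - y)], [x, y], dict=True)[0]
w1, w2 = sol[x], sol[y]                      # w1 = (u+v)/2, w2 = (u-v)/2

# Jacobian determinant of the inverse map (u, v) -> (w1, w2)
J = sp.Matrix([w1, w2]).jacobian([u, v]).det()
print(w1, w2, sp.Abs(J))                     # (u+v)/2, (u-v)/2, 1/2
```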
Example. [1]
Let $X \sim N(0, 1)$ and $Y \sim N(0, 1)$, and $X \perp Y$.
Let $U := X + Y$ and $V := X - Y$.
1. Find the joint pdf of $(U, V)$
First, we find $w_1(u, v)$ and $w_2(u, v)$.
\[\begin{aligned} x &= \frac{u+v}{2} = w_1(u, v) \\ y &= \frac{u-v}{2} = w_2(u, v) \end{aligned}\]Plugging this straight into the theorem,
\[\begin{aligned} f_{U, V}(u, v) &= f_{X, Y} \left( w_1(u, v), w_2(u, v) \right) \cdot \cancelto{1/2}{\left| \begin{matrix} 1/2 & 1/2 \\ 1/2 & -1/2 \end{matrix}\right|} \\ &= f_{X, Y} \left( \frac{u+v}{2}, \frac{u-v}{2} \right) \frac{1}{2} \\ &= f_X \left( \frac{u+v}{2} \right) f_Y \left(\frac{u-v}{2}\right) \frac{1}{2} \qquad (X \perp Y) \\ &= \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{(u+v)^2}{8} \right) \cdot \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{(u-v)^2}{8} \right) \cdot \frac{1}{2} \\ &= \frac{1}{4\pi} \exp \left( - \frac{u^2 + v^2}{4}\right) \end{aligned}\]2. Are $U$ and $V$ independent?
A. Yes!
\[\begin{aligned} f_{U, V} (u, v) &= \frac{1}{4\pi} \exp \left( - \frac{u^2 + v^2}{4}\right) \\ \end{aligned}\]Deriving the marginal $f_U (u)$ from this expression for $f_{U, V} (u, v)$,
\[\begin{aligned} f_U(u) = \frac{1}{\sqrt{2\pi} \cdot \sqrt{2}} \cdot \exp \left( - \frac{u^2}{2 \cdot (\sqrt{2})^2}\right) \end{aligned}\]That is, $U$ has the $N(0, (\sqrt{2})^2)$ distribution! By symmetry, $V \sim N(0, (\sqrt{2})^2)$ as well, and since $f_{U, V}(u, v) = f_U(u) \, f_V(v)$, $U$ and $V$ are indeed independent.
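A quick Monte Carlo sanity check (sample size and seed are arbitrary): since $(U, V)$ is jointly Gaussian, zero covariance is equivalent to independence, so the empirical covariance matrix should be close to $2I$.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)
y = rng.standard_normal(200_000)
u, v = x + y, x - y

print(np.cov(u, v))   # ≈ [[2, 0], [0, 2]]: Var(U) = Var(V) = 2, Cov(U, V) = 0
```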
3. Let $Z := \dfrac{Y}{X}$. Find the pdf of $Z$.
Let $U := X$, and $V := \dfrac{Y}{X}$, then
\[\begin{aligned} X &= U \\ Y &= UV \end{aligned}\]and the Jacobian is
\[\left| J \right| = \left| \begin{matrix} 1 & 0 \\ v & u \end{matrix}\right| = \left| u \right|\]Now that all the ingredients are in place, let's find the pdf $f_{U, V}(u, v)$.
\[\begin{aligned} f_{U, V}(u, v) &= f_{X, Y} (u, uv) \cdot \left| u \right| \\ &= f_X (u) \cdot f_Y (uv) \cdot \left| u \right| \\ &= \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{u^2}{2} \right) \cdot \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{u^2v^2}{2}\right) \cdot \left| u \right| \\ &= \frac{1}{2\pi} \cdot \exp \left( - \frac{u^2(1+v^2)}{2}\right) \cdot \left| u \right| \end{aligned}\]Now, to get the distribution we are after, namely $Z$, i.e. $V := \dfrac{Y}{X}$, we marginalize $U$ out of $f_{U, V}(u, v)$.
\[\begin{aligned} f_Z (z) &= f_V (v) = \int f_{U, V} (u, v) \, du \\ &= \frac{1}{2\pi} \int^{\infty}_{-\infty} \left| u \right| \cdot \exp \left( - \frac{u^2(1+v^2)}{2}\right) \, du \\ &= \frac{1}{2\pi} \cdot 2 \int^{\infty}_0 \left| u \right| \cdot \exp \left( - \frac{u^2(1+v^2)}{2}\right) \, du \\ &= \frac{1}{2\pi} \cdot 2 \int^{\infty}_0 \frac{1}{2} \cdot \exp \left( - \frac{t(1+v^2)}{2}\right) \, dt \qquad (u^2 = t) \\ &= \frac{1}{2\pi} \cdot \left( \left. \frac{2}{-(1+v^2)} \cdot \exp \left( - \frac{t(1+v^2)}{2}\right) \right]^{\infty}_0 \right) \\ &= \frac{1}{2\pi} \cdot \frac{2}{-(1+v^2)} \left\{ 0 - 1\right\} \\ &= \frac{1}{\pi} \cdot \frac{1}{1+v^2} \end{aligned}\]So we obtain the distribution of $Z := \dfrac{Y}{X}$ as above!! For reference, this is exactly the <Cauchy Distribution> mentioned earlier!
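A simulation check of this, assuming `scipy`: the ratio of two independent standard normal samples should be indistinguishable from standard Cauchy samples, here compared via a Kolmogorov–Smirnov test (sample size and seed are arbitrary).

```python
import numpy as np
from scipy.stats import cauchy, kstest

rng = np.random.default_rng(2)
x = rng.standard_normal(100_000)
y = rng.standard_normal(100_000)
z = y / x                           # Z = Y/X

# KS test against the standard Cauchy distribution
print(kstest(z, cauchy.cdf))        # p-value typically well above 0.05
```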
Example. [2]
Let $(X, Y)$ have the joint pdf
\[f_{X, Y}(x, y) = \begin{cases} 4xy & \text{for } 0 < x < 1 \text{ and } 0 < y < 1\\ 0 & \text{else} \end{cases}\]1. Find the joint pdf of $(X^2, XY)$.
\[\begin{aligned} U &= X^2 \\ V &= XY \end{aligned}\]then the inverse relation is
\[\begin{aligned} x &= \sqrt{u} \\ y &= \frac{v}{\sqrt{u}} \\ 0 < \sqrt{u} < 1 \quad &\text{and} \quad 0 < \frac{v}{\sqrt{u}} < 1 \end{aligned}\]and the Jacobian is
\[\left| J \right| = \left| \begin{matrix} \frac{1}{2\sqrt{u}} & 0 \\ -\frac{v}{2u^{3/2}} & \frac{1}{\sqrt{u}} \end{matrix} \right| = \frac{1}{2u}\]Therefore, the pdf $f_{U, V} (u, v)$ is
\[\begin{aligned} f_{U, V} (u, v) &= f_{X, Y} \left(\sqrt{u}, \frac{v}{\sqrt{u}}\right) \cdot \left| \frac{1}{2u} \right|\\ &= 4 \cancel{\sqrt{u}} \frac{v}{\cancel{\sqrt{u}}} \cdot \frac{1}{2u} \\ &= \frac{2v}{u} \qquad \text{for } 0 < u < 1 \text{ and } 0 < v < \sqrt{u} \end{aligned}\]2. Find the marginal pdf of $X^2$ and $XY$.
(1) $f_U(u) = f_{X^2}(u)$
\[\begin{aligned} f_{X^2} (u) &= \int^{\sqrt{u}}_0 \frac{2v}{u} \, dv \\ &= \frac{1}{u} \cdot (\sqrt{u})^2 = 1 \end{aligned}\]Therefore, $f_{X^2}(u) = 1$ for $0 < u < 1$, i.e. $X^2$ follows the $\text{Unif}(0, 1)$ distribution!
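A symbolic cross-check of this marginal with `sympy` (the variable names are just for illustration):

```python
import sympy as sp

u, v = sp.symbols('u v', positive=True)
f_uv = 2 * v / u                                  # joint pdf derived above, 0 < u < 1, 0 < v < sqrt(u)

f_u = sp.integrate(f_uv, (v, 0, sp.sqrt(u)))      # marginal pdf of U = X^2
print(f_u)                                        # 1  ->  Unif(0, 1) on 0 < u < 1
```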
(2) $f_V(v) = f_{XY}(v)$
\[\begin{aligned} f_{XY}(v) &= \int^1_{v^2} \frac{2v}{u} \, du \\ &= 2v \cdot \left( \ln 1 - \ln v^2 \right) = -4v \ln v \qquad \text{for } 0 < v < 1 \end{aligned}\]In the next post, we continue with transformations of random variables, looking in more detail at the case where the mapping is not 1-1.