This post organizes what I learned and studied in the "Probability and Statistics (MATH230)" course. You can find the full series of posts at Probability and Statistics 🎲

In this chapter, "Ch07. Functions of Random Variables", we take a RV $X$, apply a function $f$ to it to define a new RV $Y = f(X)$, and look at the distribution of this $Y$. In other words, we derive the pdf and cdf of $f(X)$.

We proceed from the easy cases to the more involved ones, covering the case where $f(x)$ is a 1-1 function and then the case where $f(x)$ is not 1-1. Later on we also look at the <MGF; Moment Generating Function> $M_X(t)$, a tool that makes it easy to compute the moments of a RV.


1-1 Transformation

Discrete Case

Example.

Let $X$ be a RV with $P(X = \pm 1) = 1/2$.

Let $Y = 3X - 5$, find the pmf of $Y$.

\[P(Y = y) = \begin{cases} 1/2 & y=-2 \\ 1/2 & y=-8 \\ \; 0 & \text{else} \end{cases}\]

Let's unpack the calculation above a little. We have

\[\begin{aligned} P(Y = y) &= P(3X - 5 = y)\\ &= P(f(X) = y) \\ &= P(X = f^{-1}(y)) \end{aligned}\]

That is, using the inverse function $f^{-1}$ of $f$, we can easily derive the pmf of $Y$ from the pmf of $X$!

\[P(Y = y) = P(X = f^{-1}(y))\]
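
To make the recipe concrete, here is a minimal plain-Python sketch (the dictionary representation of a pmf and the helper name `push_forward` are my own illustrative choices, not from the lecture): it pushes each atom of $X$'s pmf through $f$ to obtain the pmf of $Y = f(X)$.

```python
# Minimal sketch: pmf of Y = f(X) for a discrete RV X.
# `pmf_X` and `push_forward` are illustrative names, not from the course notes.

def push_forward(pmf_X, f):
    """Return the pmf of Y = f(X) as a dict {y: P(Y = y)}."""
    pmf_Y = {}
    for x, p in pmf_X.items():
        y = f(x)
        # For a 1-1 f each y comes from exactly one x; using += also covers
        # the general (not 1-1) case, where masses are merged.
        pmf_Y[y] = pmf_Y.get(y, 0.0) + p
    return pmf_Y

pmf_X = {-1: 0.5, 1: 0.5}                          # P(X = ±1) = 1/2
print(push_forward(pmf_X, lambda x: 3 * x - 5))    # {-8: 0.5, -2: 0.5}
```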


Theorem. Discrete Case

1. Supp. $X$ has pmf $f_X (x)$.

Let $Y=g(X)$ where $g$ is 1-1 function with the inverse $x = g^{-1}(y)$.

Then,

\[f_Y (y) = f_X (g^{-1}(y))\]


2. Supp. $(X_1, X_2)$ has joint pmf $f_{X_1, X_2} (x_1, x_2)$.

Let $Y_1 := u_1 (X_1, X_2)$ and $Y_2 := u_2 (X_1, X_2)$, where the map is 1-1 with inverse $X_1 = w_1(Y_1, Y_2)$ and $X_2 = w_2 (Y_1, Y_2)$.

Then,

\[f_{Y_1, Y_2} (y_1, y_2) = f_{X_1, X_2} \left( w_1(y_1, y_2), w_2(y_1, y_2) \right)\]

In other words, we take the equations defining $Y_1$, $Y_2$ in terms of $X_1$, $X_2$, solve them to express $X_1$, $X_2$ in terms of $Y_1$, $Y_2$ (these are $w_1$, $w_2$), and use that to obtain the pmf. It is nothing more than a formal statement of the obvious approach.


Example.

Let $X \sim \text{Poi}(\lambda)$ and $Y \sim \text{Poi}(\mu)$, and $X \perp Y$.

Find the distribution of $X + Y$.

If we wanted $P(X+Y = 5)$, we would compute $P(X = 0, Y=5)$, $P(X=1, Y=4)$, …, $P(X=5, Y=0)$ and add them up. Based on this idea, let's expand the expression as follows!

\[\begin{aligned} P(X+Y = n) &= \sum^{n}_{k=0} P(X=n-k, Y = k) \\ &= \sum^{n}_{k=0} P(X=n-k) P(Y = k) \quad (\text{independence}) \\ &= \sum^{n}_{k=0} e^{-\lambda} \frac{\lambda^{n-k}}{(n-k)!} \cdot e^{-\mu} \frac{\mu^k}{k!} \\ &= \frac{e^{-(\lambda + \mu)}}{n!} \sum^{n}_{k=0} \frac{n!}{(n-k)!k!} \lambda^{n-k} \mu^k \\ &= \frac{e^{-(\lambda + \mu)}}{n!} \cdot (\lambda + \mu)^n = e^{-(\lambda + \mu)} \frac{(\lambda + \mu)^n}{n!} \end{aligned}\]

That is, $P(X+Y = n)$ is exactly the pmf of $\text{Poi}(\lambda+\mu)$! $\blacksquare$

Alternatively, we can use the two-variable theorem above: we first find the joint pmf of $(X, X+Y)$, and then find the marginal pmf of $X+Y$.

Let $U = X$, $V = X+Y$, then $X = U$, $Y = V - U$.

Therefore, the joint pmf of $U$ and $V$ is

\[\begin{aligned} f_{U, V} (u, v) &= f_{X, Y} (u, v-u) \\ &= f_X (u) f_Y(v-u) \quad (\text{independence}) \\ &= e^{-\lambda} \frac{\lambda^u}{u!} \cdot e^{-\mu} \frac{\mu^{v-u}}{(v-u)!} \\ &= e^{-(\lambda + \mu)} \frac{\lambda^u}{u!} \frac{\mu^{v-u}}{(v-u)!} \end{aligned}\]

Having the joint pmf of $U$ and $V$, let's now derive the marginal pmf of $V$.

\[\begin{aligned} f_V (v) &= \sum_u f_{U, V} (u, v) \\ &= \sum_u e^{-(\lambda+\mu)} \frac{\lambda^u}{u!} \frac{\mu^{v-u}}{(v-u)!} \\ &= e^{-(\lambda+\mu)} \sum_u \frac{\lambda^u}{u!} \frac{\mu^{v-u}}{(v-u)!} \\ &= \frac{e^{-(\lambda+\mu)}}{v!} \sum_u \frac{v!}{u!(v-u)!} \lambda^u \mu^{v-u} \\ &= \frac{e^{-(\lambda+\mu)}}{v!} \cdot (\mu + \lambda)^v \\ &= e^{-(\lambda+\mu)} \frac{(\mu + \lambda)^v}{v!} \end{aligned}\]

That is, the pmf of $V = X+Y$ is indeed that of $\text{Poi}(\lambda + \mu)$! $\blacksquare$
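
A quick Monte Carlo check of this result (a sketch using numpy and scipy, which are not part of the lecture; $\lambda = 2$, $\mu = 3$, the sample size, and the seed are arbitrary choices): the empirical pmf of $X + Y$ should line up with the $\text{Poi}(\lambda + \mu)$ pmf.

```python
# Sketch: empirical pmf of X + Y vs. the Poi(lambda + mu) pmf.
# lam = 2.0, mu = 3.0, the sample size, and the seed are illustrative choices.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam, mu, n = 2.0, 3.0, 200_000

s = rng.poisson(lam, n) + rng.poisson(mu, n)        # samples of X + Y
for k in range(10):
    print(k, (s == k).mean(), poisson.pmf(k, lam + mu))
```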


Continuous Case

Theorem.

Let $X$ be a continuous RV with pdf $f_X (x)$.

Let $Y := g(X)$ where $g$ is 1-1 with inverse $x = h(y)$.

Then,

\[f_Y (y) = f_X (h(y)) \cdot \left| h'(y) \right|\]

💥 Note that in the continuous case, the extra factor $\left| h'(y) \right|$ gets multiplied in! 🔥


Example.

1. Let $X \sim N(\mu, \sigma^2)$ and let $Y := \dfrac{X - \mu}{\sigma}$.

Then,

\[\begin{aligned} f_Y (y) &= f_X (h(y)) \cdot \left| h'(y) \right| \\ &= \frac{1}{\sqrt{2\pi} \sigma} \cdot \exp \left({- \frac{(h(y)-\mu)^2}{2\sigma^2}}\right) \cdot \left| \sigma \right|\\ &= \frac{1}{\sqrt{2\pi} \cancel{\sigma}} \cdot \exp \left( - \frac{(\cancel{\sigma} y + \cancel{\mu} - \cancel{\mu})^2}{2\cancel{\sigma^2}} \right) \cdot \cancel{\sigma} \\ &= \frac{1}{\sqrt{2\pi}} \cdot \exp \left( - y^2 / 2\right) \end{aligned}\]

2. Let $X \sim \text{Gamma}(\alpha, 1)$, and let $Y := \beta X$.

Claim: $Y \sim \text{Gamma}(\alpha, \beta)$

\[y = \beta x \iff x = \frac{y}{\beta} = h(y)\]

and

\[f_X (x) = \frac{1}{\Gamma(\alpha)} x^{\alpha - 1} e^{-x}\]

then,

\[\begin{aligned} f_Y (y) &= f_X (h(y)) \cdot \left| h'(y) \right| \\ &= \frac{1}{\Gamma(\alpha)} h(y)^{\alpha - 1} e^{-h(y)} \cdot \left| \frac{1}{\beta} \right| \\ &= \frac{1}{\Gamma(\alpha) \beta} \cdot \left( \frac{y}{\beta}\right)^{\alpha-1} e^{-y/\beta} \\ &= \frac{1}{\Gamma(\alpha) \beta^{\alpha}} \cdot y^{\alpha-1} e^{-y/\beta} \end{aligned}\]

Therefore, starting from $X \sim \text{Gamma}(\alpha, 1)$ and applying the transformation $Y = \beta X$, we obtain the distribution $Y \sim \text{Gamma}(\alpha, \beta)$.
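
This claim is easy to check numerically (a sketch with numpy/scipy, which are my own tooling choices; $\alpha = 2.5$, $\beta = 1.7$, the sample size, and the seed are arbitrary): scipy's `gamma` uses the same shape/scale parametrization as the pdf above, so $\beta X$ should match `gamma(a=alpha, scale=beta)`.

```python
# Sketch: scaling X ~ Gamma(alpha, 1) by beta gives Gamma(alpha, beta) (scale parametrization).
# alpha = 2.5, beta = 1.7, the sample size, and the seed are illustrative choices.
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)
alpha, beta = 2.5, 1.7

y = beta * rng.gamma(shape=alpha, scale=1.0, size=100_000)   # Y = beta * X
qs = [0.1, 0.5, 0.9]
print(np.quantile(y, qs))                  # empirical quantiles of Y
print(gamma.ppf(qs, a=alpha, scale=beta))  # Gamma(alpha, scale=beta) quantiles
```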



3. Let $\theta \sim \text{Unif}(-\pi/2, \pi/2)$, and let $X = \tan \theta$.

Find the pdf of $X$.

\[h(x) = \arctan x \quad \text{and} \quad h'(x) = \frac{1}{1+x^2}\]

Deriving the distribution of $X$ by the rule above,

\[\begin{aligned} f_X (x) &= f_\theta (h(x)) \cdot \left| h'(x) \right| \\ &= \cancelto{\frac{1}{\pi}}{f_\theta (\arctan x)} \cdot \frac{1}{1+x^2} \quad (\text{Uniform distribution})\\ &= \frac{1}{\pi} \frac{1}{1+x^2} \end{aligned}\]

For reference, the distribution above is called the <Cauchy Distribution>.
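
To see this numerically, one can transform uniform angles and compare against the standard Cauchy law (a sketch; numpy/scipy, the sample size, and the seed are my own illustrative choices):

```python
# Sketch: X = tan(theta) with theta ~ Unif(-pi/2, pi/2) follows the standard Cauchy law.
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(0)
theta = rng.uniform(-np.pi / 2, np.pi / 2, 200_000)
x = np.tan(theta)

# Kolmogorov-Smirnov test against the standard Cauchy cdf; a large p-value
# is consistent with X having the Cauchy distribution.
print(kstest(x, "cauchy"))
```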



Theorem.

Let $X$ be a RV with cdf $F_X (x)$ which is strictly increasing.

Let $U \sim \text{Unif}(0, 1)$. Then,

1. $Y := F_X^{-1}(U)$ has the same distribution as $X$.

2. $Z := F_X(X)$ has the same distribution as $U$.

Personally, the statement didn't quite click for me at first, so I worked through the proofs to understand it.


Proof. 1

\[\begin{aligned} P(Y \le y) &= P(F_X^{-1} (U) \le y) \\ &= P(U \le F_X (y)) \\ &= F_X (y) \quad (\text{by cdf of uniform $U$}) \end{aligned}\]

Therefore, $Y = F_X^{-1}(U)$ has the same distribution as $X$. $\blacksquare$

Proof. 2

\[\begin{aligned} P(Z \le x) &= P(F_X(X) \le x) \\ &= P(X \le F_X^{-1}(x)) \\ &= F_X \left( F_X^{-1} (x) \right) = x \quad \text{for } 0 \le x \le 1 \end{aligned}\]

Therefore, $Z = F_X(X)$ follows the uniform distribution $\text{Unif}(0, 1)$. $\blacksquare$

Example.

Let $X \sim \text{Exp}(\lambda)$, then cdf is $F_X(x) = 1 - e^{-\lambda x}$.

So we can construct a RV with the same distribution as $X$ using only $F_X^{-1}$ and $U$!!

By above theorem,

\[F_X^{-1}(U) = \frac{-\ln (1-U)}{\lambda} \sim \text{Exp}(\lambda)\]

To put it another way: we want a transformation that takes $U$ to $X$, and simply choosing it to be $F_X^{-1}$ hands us that transformation essentially for free!! 🤩
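
This is exactly inverse-transform sampling in practice; a minimal numpy sketch of the recipe above ($\lambda = 0.5$, the sample size, and the seed are arbitrary choices):

```python
# Sketch: inverse-transform sampling for Exp(lambda) from Unif(0,1).
# lam = 0.5, the sample size, and the seed are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5

u = rng.uniform(0.0, 1.0, 100_000)
x = -np.log(1.0 - u) / lam          # F_X^{-1}(U)

print(x.mean(), 1 / lam)            # sample mean should be close to E[X] = 1/lambda
```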


Theorem. Continuous case - Two Random Variables

Let $(X, Y) \mapsto \left( u(X, Y), v(X, Y) \right) =: (U, V)$ be a 1-1 transformation with inverse $(U, V) \mapsto \left(w_1(U, V), w_2(U, V)\right)$.

If $(X, Y)$ has joint pdf $f_{X, Y}(x, y)$, then $(U, V)$ has joint pdf

\[f_{U, V} (u, v) = f_{X, Y} \left( w_1(u, v), w_2(u, v) \right) \cdot \left| J \right|\]

where $J$ is the Jacobian matrix of the inverse transformation and $\left| J \right|$ is the absolute value of its determinant:

\[J = \begin{pmatrix} \frac{\partial w_1}{\partial u} & \frac{\partial w_1}{\partial v} \\ \frac{\partial w_2}{\partial u} & \frac{\partial w_2}{\partial v} \\ \end{pmatrix}\]

For reference, the Jacobian $J$ is exactly what appears when changing variables in a double integral.

Writing $x = w_1(u, v)$, $y = w_2(u, v)$ and changing the variables of integration to $u$, $v$,

\[\int \int_{X, Y} f(x, y) \, dxdy = \int \int_{U, V} f\left( w_1 (u, v), w_2(u, v) \right) \left| J \right| \, dudv\]

Looking closely, the pdf $f_{U, V}(u, v)$ of $(U, V)$ is exactly the integrand on the right-hand side of this equation!!


Example. [1]

Let $X \sim N(0, 1)$ and $Y \sim N(0, 1)$, and $X \perp Y$.

Let $U := X + Y$ and $V := X - Y$.

1. Find the joint pdf of $(U, V)$

First, find $w_1(u, v)$ and $w_2(u, v)$.

\[\begin{aligned} x &= \frac{u+v}{2} = w_1(u, v) \\ y &= \frac{u-v}{2} = w_2(u, v) \end{aligned}\]

Plugging these straight into the theorem,

\[\begin{aligned} f_{U, V}(u, v) &= f_{X, Y} \left( w_1(u, v), w_2(u, v) \right) \cdot \cancelto{1/2}{\left| \begin{matrix} 1/2 & 1/2 \\ 1/2 & -1/2 \end{matrix}\right|} \\ &= f_{X, Y} \left( \frac{u+v}{2}, \frac{u-v}{2} \right) \frac{1}{2} \\ &= f_X \left( \frac{u+v}{2} \right) f_Y \left(\frac{u-v}{2}\right) \frac{1}{2} \qquad (X \perp Y) \\ &= \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{(u+v)^2}{8} \right) \cdot \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{(u-v)^2}{8} \right) \cdot \frac{1}{2} \\ &= \frac{1}{4\pi} \exp \left( - \frac{u^2 + v^2}{4}\right) \end{aligned}\]


2. Are $U$ and $V$ independent?

A. Yes!

\[\begin{aligned} f_{U, V} (u, v) &= \frac{1}{4\pi} \exp \left( - \frac{u^2 + v^2}{4}\right) \\ \end{aligned}\]

The joint pdf factors into a function of $u$ times a function of $v$, so $U$ and $V$ are independent. Deriving $f_U (u)$ from $f_{U, V} (u, v)$ by integrating out $v$ (a Gaussian integral),

\[\begin{aligned} f_U(u) = \frac{1}{\sqrt{2\pi} \cdot \sqrt{2}} \cdot \exp \left( - \frac{u^2}{2 \cdot (\sqrt{2})^2}\right) \end{aligned}\]

That is, $U$ follows the $N(0, (\sqrt{2})^2)$ distribution! By symmetry, $V \sim N(0, 2)$ as well.
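
A quick simulation is consistent with this (a numpy sketch; the sample size and seed are arbitrary choices): the sample covariance of $U$ and $V$ should be near $0$ and both sample variances near $2$.

```python
# Sketch: with X, Y iid N(0,1), U = X + Y and V = X - Y are uncorrelated
# (and, being jointly Gaussian, independent), each with variance 2.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = rng.standard_normal(200_000)
u, v = x + y, x - y

print(np.cov(u, v))    # off-diagonal ~ 0, diagonal entries ~ 2
```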


3. Let $Z := \dfrac{Y}{X}$. Find the pdf of $Z$.

Let $U := X$, and $V := \dfrac{Y}{X}$, then

\[\begin{aligned} X &= U \\ Y &= UV \end{aligned}\]

and the Jacobian is

\[\left| J \right| = \left| \begin{matrix} 1 & 0 \\ \cdot & u \end{matrix}\right| = \left| u \right|\]

Now that we have all the ingredients, let's compute the pdf $f_{U, V}(u, v)$.

\[\begin{aligned} f_{U, V}(u, v) &= f_{X, Y} (u, uv) \cdot \left| u \right| \\ &= f_X (u) \cdot f_Y (uv) \cdot \left| u \right| \\ &= \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{u^2}{2} \right) \cdot \frac{1}{\sqrt{2\pi}} \exp \left( - \frac{u^2v^2}{2}\right) \cdot \left| u \right| \\ &= \frac{1}{2\pi} \cdot \exp \left( - \frac{u^2(1+v^2)}{2}\right) \cdot \left| u \right| \end{aligned}\]

Now, to get the distribution we are after, namely $Z$, i.e. $V := \dfrac{Y}{X}$, we integrate $u$ out of $f_{U, V}(u, v)$.

\[\begin{aligned} f_Z (z) &= f_V (v) = \int f_{U, V} (u, v) \, du \\ &= \frac{1}{2\pi} \int^{\infty}_{-\infty} \left| u \right| \cdot \exp \left( - \frac{u^2(1+v^2)}{2}\right) \, du \\ &= \frac{1}{2\pi} \cdot 2 \int^{\infty}_0 \left| u \right| \cdot \exp \left( - \frac{u^2(1+v^2)}{2}\right) \, du \\ &= \frac{1}{2\pi} \cdot 2 \int^{\infty}_0 \frac{1}{2} \cdot \exp \left( - \frac{t(1+v^2)}{2}\right) \, dt \qquad (u^2 = t) \\ &= \frac{1}{2\pi} \cdot \left( \left. \frac{2}{-(1+v^2)} \cdot \exp \left( - \frac{t(1+v^2)}{2}\right) \right]^{\infty}_0 \right) \\ &= \frac{1}{2\pi} \cdot \frac{2}{-(1+v^2)} \left\{ 0 - 1\right\} \\ &= \frac{1}{\pi} \cdot \frac{1}{1+v^2} \end{aligned}\]

This gives us the distribution of $Z := \dfrac{Y}{X}$!! Note that this is exactly the <Cauchy Distribution> mentioned briefly above!
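
The same shape appears in simulation (a numpy sketch; the quantile levels, sample size, and seed are arbitrary choices): sample quantiles of $Y/X$ should track the standard Cauchy quantiles $\tan\left(\pi\left(p - \tfrac{1}{2}\right)\right)$.

```python
# Sketch: Z = Y / X for independent standard normals X, Y is standard Cauchy.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(500_000)
y = rng.standard_normal(500_000)
z = y / x

ps = np.array([0.1, 0.25, 0.5, 0.75, 0.9])
print(np.quantile(z, ps))            # empirical quantiles of Y/X
print(np.tan(np.pi * (ps - 0.5)))    # standard Cauchy quantiles
```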


Example. [2]

Let $(X, Y)$ have the joint pdf

\[f_{X, Y}(x, y) = \begin{cases} 4xy & \text{for } 0 < x < 1 \text{ and } 0 < y < 1\\ 0 & \text{else} \end{cases}\]

1. Find the joint pdf of $(X^2, XY)$.

\[\begin{aligned} U &= X^2 \\ V &= XY \end{aligned}\]

then the inverse relation is

\[\begin{aligned} x &= \sqrt{u} \\ y &= \frac{v}{\sqrt{u}} \\ 0 < \sqrt{u} < 1 \quad &\text{and} \quad 0 < \frac{v}{\sqrt{u}} < 1 \end{aligned}\]

and the Jacobian is

\[\left| J \right| = \left| \begin{matrix} \frac{1}{2\sqrt{u}} & 0 \\ \cdot & \frac{1}{\sqrt{u}} \end{matrix} \right| = \frac{1}{2u}\]

Therefore, the pdf $f_{U, V} (u, v)$ is

\[\begin{aligned} f_{U, V} (u, v) &= f_{X, Y} \left(\sqrt{u}, \frac{v}{\sqrt{u}}\right) \cdot \left| \frac{1}{2u} \right|\\ &= 4 \cancel{\sqrt{u}} \frac{v}{\cancel{\sqrt{u}}} \cdot \frac{1}{2u} \\ &= \frac{2v}{u} \qquad \text{for } 0 < u < 1 \text{ and } 0 < v < \sqrt{u} \end{aligned}\]

2. Find the marginal pdf of $X^2$ and $XY$.

(1) $f_U(u) = f_{X^2}(u)$

\[\begin{aligned} f_{X^2} (u) &= \int^{\sqrt{u}}_0 \frac{2v}{u} \, dv \\ &= \frac{1}{u} \cdot (\sqrt{u})^2 = 1 \qquad \text{for } 0 < u < 1 \end{aligned}\]

Therefore, $X^2$ follows the $\text{Unif}(0, 1)$ distribution!

(2) $f_V(v) = f_{XY}(v)$

\[\begin{aligned} f_{XY}(v) &= \int^1_{v^2} \frac{2v}{u} \, du \\ &= 2v \cdot \left( \ln 1 - \ln v^2 \right) = -4v \ln v \qquad \text{for } 0 < v < 1 \end{aligned}\]
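
As a numerical cross-check of both marginals (a sketch; it uses the fact that $4xy = (2x)(2y)$, so $X$ and $Y$ are independent with cdf $t^2$ on $(0,1)$ and can be sampled as square roots of uniforms; numpy, the sample size, and the seed are my own choices):

```python
# Sketch: sample (X, Y) with joint pdf 4xy on (0,1)^2 and check the two marginals.
# Since 4xy = (2x)(2y), X and Y are independent with cdf t^2, so X = sqrt(U).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = np.sqrt(rng.uniform(size=n))
y = np.sqrt(rng.uniform(size=n))

u = x**2            # should be Unif(0, 1)
v = x * y           # should have density -4 v ln(v) on (0, 1)

print(np.quantile(u, [0.25, 0.5, 0.75]))   # ~ [0.25, 0.5, 0.75]
print(v.mean(), 4 / 9)                     # E[XY] = E[X] E[Y] = (2/3)^2 = 4/9
```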

The next post continues with transformations of random variables, looking more closely at the case where the mapping is not 1-1.

👉 Transformations of Random Variable - 2