Joint Probability Distribution
This post organizes what I learned and studied in the course "Probability and Statistics (MATH230)". You can find the full series of posts at Probability and Statistics 🎲
Joint Probability Distribution
Previously, we looked at the probability distribution of a single RV. In practice, however, we often need to consider the outcomes of two or more RVs at the same time. The <Joint Probability Distribution> is the mathematical formalization of this situation.
Joint probability is defined by the <Joint pmf> for discrete RVs and by the <Joint pdf> for continuous RVs.
Definition. Joint pmf
The function $f(x, y)$ is a <joint probability distribution> or <joint pmf> of the discrete RVs $X$ and $Y$ if
- $f(x, y) \ge 0$ for all $(x, y)$.
- $\displaystyle \sum_x \sum_y f(x, y) = 1$
- $P(X=x, Y=y) = f(x, y)$
Also, for any region $A$ in the $xy$ plane, $\displaystyle P[(X, Y) \in A] = \sum \sum_A f(x, y)$
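As a quick sanity check, here is a minimal Python sketch with a made-up pmf table (the numbers are illustrative, not from the course) that verifies the conditions above and computes a region probability:

```python
# Made-up joint pmf of two discrete RVs X and Y, stored as a dict.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.25,
    (2, 0): 0.05, (2, 1): 0.10,
}

# Condition 1: f(x, y) >= 0 for all (x, y).
assert all(p >= 0 for p in joint_pmf.values())

# Condition 2: the probabilities sum to 1.
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12

# P[(X, Y) in A]: sum f(x, y) over a region A, here A = {X + Y <= 1}.
p_A = sum(p for (x, y), p in joint_pmf.items() if x + y <= 1)
print(f"P(X + Y <= 1) = {p_A:.2f}")  # 0.60
```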
Definition. Joint pdf
The function $f(x, y)$ is a <joint density function> of the continuous RVs $X$ and $Y$ if
- $f(x, y) \ge 0$, for all $(x, y)$.
- $\displaystyle \int^\infty_{-\infty} \int^\infty_{-\infty} f(x, y) \; dx \, dy = 1$
- $\displaystyle P[(X, Y) \in A] = \int \int_A f(x, y) \; dx dy$, for any region $A$ in the $xy$ plane.
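Analogously for the continuous case, here is a minimal numerical sketch. It uses a made-up pdf $f(x, y) = x + y$ on the unit square (which does integrate to 1) and approximates the integrals with a midpoint Riemann sum:

```python
import numpy as np

# Made-up joint pdf: f(x, y) = x + y on [0, 1] x [0, 1], 0 elsewhere.
f = lambda x, y: x + y

n = 1000
mid = (np.arange(n) + 0.5) / n       # midpoints of n equal subintervals
X, Y = np.meshgrid(mid, mid)

# Condition 2: the pdf integrates to 1 over its support (here, the square).
total = f(X, Y).sum() / n**2
print(f"double integral of f ~ {total:.6f}")   # ~ 1.000000

# P[(X, Y) in A] for the region A = {x < y}: integrate f only over A.
p_A = (f(X, Y) * (X < Y)).sum() / n**2
print(f"P(X < Y) ~ {p_A:.4f}")                 # ~ 0.5, by symmetry of x + y
```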
Marginal Distribution
Definition. Marginal Distribution
The <marginal distributions> of $X$ alone and of $Y$ alone are
\[g(x) = \sum_y f(x, y) \quad \text{and} \quad h(y) = \sum_x f(x, y)\]for the discrete case, and
\[g(x) = \int^\infty_{-\infty} f(x, y) \; dy \quad \text{and} \quad h(y) = \int^\infty_{-\infty} f(x, y) \; dx\]for the continuous case.
Note: underlying the <Marginal Distribution of a Discrete RV> is the <Law of Total Probability>!
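In code, a marginal is just the joint pmf with one variable summed out. Reusing the made-up table from above:

```python
from collections import defaultdict

joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.25,
    (2, 0): 0.05, (2, 1): 0.10,
}

g = defaultdict(float)   # g(x) = sum_y f(x, y)
h = defaultdict(float)   # h(y) = sum_x f(x, y)
for (x, y), p in joint_pmf.items():
    g[x] += p
    h[y] += p

print(dict(g))  # {0: 0.30, 1: 0.55, 2: 0.15} (up to float rounding)
print(dict(h))  # {0: 0.45, 1: 0.55}
# Each marginal sums to 1 itself -- the Law of Total Probability at work.
```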
Conditional Probability Distribution
Earlier we covered the <Conditional Probability> $P(Y \mid X)$. To compute this <Conditional Probability> more systematically, we can derive it from the probability distributions of the RVs $X$ and $Y$ as follows!
\[P(Y = y \mid X = x) = \frac{P(X=x, Y=y)}{P(X=x)} = \frac{f(x, y)}{f_{X} (x)}, \quad \text{provided} \; f_X (x) > 0\]Describing a <Conditional Probability> in the form of a "distribution" like this is called a <Conditional Probability Distribution>!
Definition. Conditional Probability Distribution
Let $X$ and $Y$ be two random variables, discrete or continuous. The <conditional distribution of the RV $Y$ given that $X = x$> is
\[f(y \mid x) = \frac{f(x, y)}{f_X (x)}, \quad \text{provided} \; f_X (x) > 0\]Similarly, the <conditional distribution of the RV $X$ given that $Y=y$> is
\[f(x \mid y) = \frac{f(x, y)}{f_Y (y)}, \quad \text{provided} \; f_Y (y) > 0\]
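A minimal sketch of the discrete case, again with the made-up table from above; note how the conditional pmf just renormalizes one slice of the joint table:

```python
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.25,
    (2, 0): 0.05, (2, 1): 0.10,
}

def f_X(x):
    """Marginal of X: f_X(x) = sum_y f(x, y)."""
    return sum(p for (xi, y), p in joint_pmf.items() if xi == x)

def f_y_given_x(y, x):
    """Conditional pmf f(y | x) = f(x, y) / f_X(x), provided f_X(x) > 0."""
    return joint_pmf.get((x, y), 0.0) / f_X(x)

# For each fixed x, f(y | x) is a genuine pmf in y: it sums to 1.
for x in (0, 1, 2):
    total = sum(f_y_given_x(y, x) for y in (0, 1))
    print(f"sum_y f(y | x={x}) = {total:.2f}")  # 1.00 each
```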
Statistical Independence
The notion of an <Independent Event>, defined back in <Conditional Probability>, can be applied to the <Conditional Probability Distribution> as well!!
Definition. Statistical Independence
Let $X$ and $Y$ be two RVs, discrete or continuous, with joint probability distribution $f(x, y)$ and marginal distributions $f_X (x)$ and $f_Y (y)$, respectively.
The RVs $X$ and $Y$ are said to be <statistically independent> if and only if
\[f(x, y) = f_X (x) f_Y (y)\]for all $(x, y)$ within their range.
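Checking this definition on the made-up table used so far is mechanical: compare $f(x, y)$ with $f_X (x) f_Y (y)$ at every point.

```python
import itertools

joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.25,
    (2, 0): 0.05, (2, 1): 0.10,
}
xs, ys = (0, 1, 2), (0, 1)

f_X = {x: sum(joint_pmf[(x, y)] for y in ys) for x in xs}
f_Y = {y: sum(joint_pmf[(x, y)] for x in xs) for y in ys}

# Independent iff f(x, y) == f_X(x) * f_Y(y) for every (x, y).
independent = all(
    abs(joint_pmf[(x, y)] - f_X[x] * f_Y[y]) < 1e-12
    for x, y in itertools.product(xs, ys)
)
print(independent)  # False: f(0,0) = 0.10 but f_X(0)*f_Y(0) = 0.30*0.45 = 0.135
```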
We can also look at it this way. If the conditional distribution $f(x \mid y)$ does not depend on $y$, that is, if the RVs are independent, then $f(x \mid y)$ should not be affected by the outcome of $y$ at all. For that to hold, no term involving $y$ may remain in $f(x \mid y)$!
In other words, every $y$ term in $\dfrac{f(x, y)}{f_Y (y)}$ must cancel out. Viewed differently, this means $f(x, y)$ factors so that the $y$ part separates completely into $f_Y (y)$:
\[f(x, y) = f_Y (y) \cdot g(x)\]Running the same argument on $f(y \mid x)$ gives $f(x, y) = f_X (x) \cdot h(y)$ this time. Combining the two results, it becomes natural to define <independence> as above: the joint <probability distribution> is the product of the <marginal distributions>! 😁
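Incidentally, the factor $g(x)$ in $f(x, y) = f_Y (y) \cdot g(x)$ is exactly the marginal $f_X (x)$: integrating (or summing, in the discrete case) both sides over $y$ gives
\[f_X (x) = \int^\infty_{-\infty} f(x, y) \; dy = g(x) \int^\infty_{-\infty} f_Y (y) \; dy = g(x)\]since $f_Y$ integrates to 1.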
Generalizing this to $n$ random variables gives the following.
Definition. Mutual Statistical Independence
Let $X_1, X_2, \dots, X_n$ be $n$ random variables, discrete or continuous, with joint probability distribution $f(x_1, x_2, \dots, x_n)$ and marginal distributions $f_1(x_1), f_2(x_2), \dots, f_n (x_n)$, respectively. The random variables $X_1, X_2, \dots, X_n$ are said to be <mutually statistically independent> if and only if
\[f(x_1, x_2, \dots, x_n) = f_1(x_1) f_2(x_2) \cdots f_n (x_n)\]for all $(x_1, x_2, \dots, x_n)$ within their range.
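For intuition, here is a small sketch (with made-up marginals) that builds a mutually independent joint pmf of three binary RVs as an outer product of its marginals, then verifies the factorization point by point:

```python
import numpy as np

# Made-up marginals of three binary RVs X1, X2, X3.
p1 = np.array([0.3, 0.7])
p2 = np.array([0.5, 0.5])
p3 = np.array([0.2, 0.8])

# Joint pmf as an outer product => mutually independent by construction.
joint = np.einsum('i,j,k->ijk', p1, p2, p3)

# Recover each marginal by summing out the other axes, then check
# f(x1, x2, x3) = f1(x1) f2(x2) f3(x3) at every point of the 2x2x2 grid.
f1, f2, f3 = joint.sum((1, 2)), joint.sum((0, 2)), joint.sum((0, 1))
print(np.allclose(joint, np.einsum('i,j,k->ijk', f1, f2, f3)))  # True
```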
Let's try to answer the following question about the <Marginal Distribution>.
Q. We know the marginal pmfs $f_X (x)$ and $f_Y (y)$; can we find the joint pmf $f(x, y)$?
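The answer is no in general: the marginals do not determine the joint pmf unless we also know something like independence. A minimal numerical counterexample (made-up numbers) with two different joints sharing identical marginals:

```python
# Two different joint pmfs on {0, 1} x {0, 1}.
joint_a = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}  # independent
joint_b = {(0, 0): 0.40, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.40}  # dependent

def marginals(joint):
    g = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
    h = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}
    return g, h

print(marginals(joint_a) == marginals(joint_b))  # True: identical marginals...
print(joint_a == joint_b)                        # False: ...different joints!
```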
Example.
Let $(X, Y)$ have joint pdf
\[f(x, y) = \begin{cases} 1 & (x, y) \in [0,1] \times [0, 1] \\ 0 & \text{otherwise} \end{cases}\](a) Are $X$ and $Y$ independent?
(b) Let $Z := \max (X, Y)$. Find the distribution of $Z$. (Hint: Find cdf of $Z$)
(c) Let $W := \min (X, Y)$. Find the distribution of $W$. (Hint: Find cdf of $W$)
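A quick Monte Carlo sanity check for (b) and (c): working the hint with independence gives $F_Z(z) = P(X \le z)P(Y \le z) = z^2$ and $F_W(w) = 1 - P(X > w)P(Y > w) = 1 - (1-w)^2$ on $[0, 1]$, and simulation agrees:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x, y = rng.random(n), rng.random(n)      # X, Y ~ Uniform[0, 1], independent
z, w = np.maximum(x, y), np.minimum(x, y)

for t in (0.25, 0.50, 0.75):
    print(f"P(Z <= {t}) ~ {(z <= t).mean():.4f}  vs  z^2       = {t**2:.4f}")
    print(f"P(W <= {t}) ~ {(w <= t).mean():.4f}  vs  1-(1-w)^2 = {1 - (1 - t)**2:.4f}")
```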
In this part, we need to do a bit of integration to compute the <Joint Probability>. The integrals are not that difficult, though, so a few rounds of practice will make them feel familiar!! 😤
In the next post, we use the probabilities of RVs to derive the <mean>, <variance>, and <covariance>!