This post summarizes what I studied in the β€˜Algorithms’ course at university in the spring semester of 2020. Corrections are always welcome :)



Divide and Conquer

<Divide and Conquer> is one approach to solving problems. Its process can be briefly described as follows.

  1. Break the problem into β€œsubproblems”.
  2. Solve them recursively.
  3. Combine the solutions to get the answer to the problem.

Divide and conquer is fundamentally based on <Recursion>. We split the problem into <subproblems>, and split those subproblems into even smaller ones. The division continues until we reach a base case that can be handled easily. Then, following the way the problem was divided, we combine the results of the subproblems back up level by level to obtain the answer to the original problem.
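As a concrete illustration of the three steps above, here is a minimal merge sort sketch in Python (my own example, not taken from the lecture): it breaks the input into two halves, sorts each half recursively, and combines the sorted halves.

```python
def merge_sort(arr):
    # Base case: a list of length 0 or 1 is already sorted.
    if len(arr) <= 1:
        return arr

    # 1. Break the problem into two subproblems of half the size.
    mid = len(arr) // 2
    # 2. Solve them recursively.
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    # 3. Combine the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```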


Master Theorem: Recurrence Relations

이 λ‹¨λ½μ—μ„œλŠ” <λΆ„ν•  정볡>의 μ‹œκ°„ λ³΅μž‘λ„λ₯Ό μœ λ„ν•˜λŠ” 쒋은 도ꡬ인 <Master Theorem>에 λŒ€ν•΄ 닀룬닀.

First, consider the case of splitting a problem of size $n$ into $a$ subproblems of size $n/b$. More specifically, this is a situation where the <branching factor> is $a$ and the <depth> of the division is $\log_b {n}$. $O(n^c)$ is the fixed cost incurred at each step.

Expressed as a recurrence, this becomes:

\[T(n) = a \cdot T(\lceil n/b \rceil) + O(n^c)\]

The <Master Theorem> states that the computational complexity of a divide-and-conquer algorithm is determined by the relationship between $a$, $b$, and $c$ as follows.

\[T(n) = \begin{cases} O(n^c) &\mbox{if } c > \log_b {a} \\ O(n^c \log {n}) & \mbox{if } c = \log_b {a} \\ O(n^{\log_b {a}}) & \mbox{if } c < \log_b {a} \end{cases}\]
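As a quick worked example (added for illustration), merge sort splits a problem of size $n$ into $a = 2$ subproblems of size $n/2$ ($b = 2$) and merges them in $O(n)$ time ($c = 1$):

\[T(n) = 2 \cdot T(\lceil n/2 \rceil) + O(n), \quad c = 1 = \log_2 {2} \implies T(n) = O(n \log n)\]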

If we just know the values of $a$, $b$, and $c$, we can obtain the computational complexity right away! What a convenient theorem!! Let's prove it! 🀩

proof.

At the $k$-th level, we get $a^k$ subproblems of size $n/b^k$. We also need to account for the fixed cost: since the problem size at the $k$-th level is $n/b^k$, the fixed cost per subproblem is $O\left(\left(n/b^k\right)^c\right)$.

Then, the cost at $k$-th level is

\[a^k \cdot O\left(\left(n/b^k\right)^c\right)\]

We slightly rearrange this cost as below.

\[a^k \cdot O\left(\left(n/b^k\right)^c\right) = O(n^c) \cdot \left( a/b^c \right)^k\]
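Expanding the power makes this rearrangement explicit:

\[a^k \cdot \left(\frac{n}{b^k}\right)^c = a^k \cdot \frac{n^c}{b^{kc}} = n^c \cdot \left(\frac{a}{b^c}\right)^k\]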

There are $\log_b {n}$ levels in the division process, so let's sum up the cost over all levels.

\[\sum^{\log_b {n}}_{k=0} O(n^c) \cdot \left( a/b^c \right)^k\]

Then the given series is a geometric series with ratio $a/b^c$.

A simple way to understand a geometric series is to look at its ratio!

Case 1. If the ratio $a/b^c < 1$, then the terms of the series decrease.

Case 2. If the ratio $a/b^c = 1$, then every term is the same constant.

Case 3. If the ratio $a/b^c > 1$, then the terms of the series increase.

  • In Case 1, the sum is dominated by the first term $\implies$ $O(n^c)$
  • In Case 2, every term is equal, so sum them all $\implies$ $O(n^c) \cdot \log_b {n} = O(n^c \log_b {n}) = O(n^c \log n)$
  • In Case 3, the sum is dominated by the last term $\implies$ $O(n^c) \cdot \left(a/b^c\right)^{\log_b {n}} = O(n^c) \cdot \frac{a^{\log_b n}}{n^c} = O\left( n^{\log_b a} \right)$
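The last equality in Case 3 uses the identity $a^{\log_b n} = n^{\log_b a}$, which can be checked by taking $\log_b$ of both sides:

\[\log_b \left( a^{\log_b n} \right) = \log_b n \cdot \log_b a = \log_b \left( n^{\log_b a} \right)\]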

Therefore,

\[T(n) = \begin{cases} O(n^c) &\mbox{if } c > \log_b {a} \\ O(n^c \log {n}) & \mbox{if } c = \log_b {a} \\ O(n^{\log_b {a}}) & \mbox{if } c < \log_b {a} \end{cases}\]

$\blacksquare$
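As a small sanity check (my own sketch, not part of the course material), the snippet below evaluates the recurrence $T(n) = a \cdot T(\lceil n/b \rceil) + n^c$ directly and compares it against the bound predicted by the Master Theorem; the ratio should stay bounded as $n$ grows.

```python
import math
from functools import lru_cache


@lru_cache(maxsize=None)
def T(n, a, b, c):
    """Evaluate T(n) = a * T(ceil(n/b)) + n^c with T(1) = 1."""
    if n <= 1:
        return 1
    return a * T(math.ceil(n / b), a, b, c) + n ** c


def master_bound(n, a, b, c):
    """Asymptotic bound predicted by the Master Theorem (ignoring constants)."""
    log_b_a = math.log(a, b)
    if c > log_b_a:
        return n ** c
    elif c == log_b_a:  # exact comparison is fine for these simple integer cases
        return n ** c * math.log(n)
    else:
        return n ** log_b_a


# Merge sort: a = 2, b = 2, c = 1  ->  c = log_b(a), so T(n) = O(n log n)
for n in [2 ** 10, 2 ** 14, 2 ** 18]:
    ratio = T(n, 2, 2, 1) / master_bound(n, 2, 2, 1)
    print(f"n = {n:>7}: T(n) / (n log n) β‰ˆ {ratio:.2f}")  # ratio stays bounded
```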


Below are posts covering representative topics solved with <Divide and Conquer>.

  1. Multiplication Algorithm
  2. Binary Search
  3. Merge Sort
  4. Matrix Multiplication: Strassen Algorithm
  5. Quick Selection
  6. Closest pair of points
