
2.2 Probability Theory

yuuuun 2020. 11. 4. 16:47

2.2.1 Basic Concepts and Formulas

  • Random variable
    • a variable that takes its value at random.
    • If X can take only the two values $x_1$ and $x_2$, then $$P(X=x_1) + P(X=x_2)=1$$
  • Joint probability
    • the probability that two random variables X and Y take the values $x_1$ and $y_1$ simultaneously: $P(X=x_1, Y=y_1)$
  • Conditional Probability
    • the probability that $X=x_1$ given that $Y=y_1$ has been observed
    • $P(X=x_1|Y=y_1)$
    • fundamental rules
      • sum rule: $P(X=x)=\sum_y P(X=x, Y=y)$
      • product rule: $P(X=x, Y=y)=P(Y=y|X=x)P(X=x)$
      • Bayes formula $$P(Y=y|X=x)=\frac{P(X=x, Y=y)}{P(X=x)} = \frac{P(X=x|Y=y)P(Y=y)}{P(X=x)}$$ $$P(X_i = x_i | Y=y) = \frac{P(Y=y|X_i=x_i)P(X_i = x_i)}{\sum_{j=1}^{n}P(Y=y|X_j=x_j)P(X_j=x_j)}$$
      • Chain Rule $$P(X_1 = x_1, \cdots, X_n=x_n)=P(X_1=x_1) \prod_{i=2}^{n} P(X_i =x_i|X_1 = x_1, \cdots,X_{i-1}=x_{i-1})$$
    • Expectation of $f(x)$ : $\mathbb{E}[f(x)] = \sum_x P(x)f(x)$
    • Variance of $f(x)$: $Var(f(x))=\mathbb{E}[(f(x)-\mathbb{E}[f(x)])^2] = \mathbb{E}[f(x)^2] - \mathbb{E}[f(x)]^2$
    • Standard deviation
      • the square root of variance
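The sum rule, product rule, and Bayes formula above can be verified numerically on a small joint distribution. A minimal sketch in Python; the 2×2 joint table below is made up for illustration, not taken from the text:

```python
# Joint distribution over two binary random variables X and Y,
# stored as a table P[x][y] (illustrative numbers).
P = [[0.1, 0.2],
     [0.3, 0.4]]

# Sum rule: P(X=x) = sum_y P(X=x, Y=y)  (marginalization)
P_X = [sum(row) for row in P]
P_Y = [sum(P[x][y] for x in range(2)) for y in range(2)]

# Product rule: P(X=x, Y=y) = P(Y=y | X=x) P(X=x)
P_Y_given_X = [[P[x][y] / P_X[x] for y in range(2)] for x in range(2)]
assert abs(P_Y_given_X[0][1] * P_X[0] - P[0][1]) < 1e-12

# Bayes formula: P(Y=y | X=x) = P(X=x | Y=y) P(Y=y) / P(X=x)
P_X_given_Y = [[P[x][y] / P_Y[y] for y in range(2)] for x in range(2)]
bayes = P_X_given_Y[0][1] * P_Y[1] / P_X[0]
assert abs(bayes - P_Y_given_X[0][1]) < 1e-12
```

Any nonnegative table summing to 1 works here; the identities hold for every joint distribution.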
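Likewise, the variance shortcut $Var(f(x))=\mathbb{E}[f(x)^2] - \mathbb{E}[f(x)]^2$ can be checked directly; the small discrete distribution and the choice $f(x)=x^2$ below are illustrative assumptions:

```python
import math

# A small discrete distribution over states 0, 1, 2 (illustrative numbers).
P = {0: 0.2, 1: 0.5, 2: 0.3}
f = lambda x: x ** 2

# Expectation: E[f(x)] = sum_x P(x) f(x)
E_f = sum(p * f(x) for x, p in P.items())

# Variance by definition: E[(f(x) - E[f(x)])^2]
var_def = sum(p * (f(x) - E_f) ** 2 for x, p in P.items())

# Variance by the shortcut: E[f(x)^2] - E[f(x)]^2
E_f2 = sum(p * f(x) ** 2 for x, p in P.items())
var_shortcut = E_f2 - E_f ** 2
assert abs(var_def - var_shortcut) < 1e-9

# Standard deviation: the square root of the variance
std = math.sqrt(var_def)
```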

2.2.2 Probability Distributions

A probability distribution describes the probability that a random variable (or several random variables) takes each of its possible states.

  • Examples of distributions
    • Gaussian distribution
    • Bernoulli distribution
    • Binomial distribution
    • Laplace distribution
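As a quick reference for the four distributions listed above, here is a sketch of their standard density/mass functions in plain Python (the parameters $\mu, \sigma, p, n, b$ follow the usual textbook conventions; they are not defined in the text itself):

```python
import math
import random

# Gaussian (normal) density with mean mu and standard deviation sigma
def gaussian_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Bernoulli mass: success probability p for k=1, else 1-p
def bernoulli_pmf(k, p=0.5):
    return p if k == 1 else 1 - p

# Binomial mass: probability of k successes in n independent Bernoulli(p) trials
def binomial_pmf(k, n=10, p=0.5):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

# Laplace density with location mu and scale b
def laplace_pdf(x, mu=0.0, b=1.0):
    return math.exp(-abs(x - mu) / b) / (2 * b)

# A Binomial(n, p) variable is the sum of n independent Bernoulli(p) draws:
random.seed(0)
sample = sum(random.random() < 0.5 for _ in range(10))
```

For instance, `sum(binomial_pmf(k) for k in range(11))` is 1, since the mass function sums to 1 over all states.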

 

