
2.2 Probability Theory

yuuuun 2020. 11. 4. 16:47

2.2.1 Basic Concepts and Formulas

  • Random variable
    • a variable that has a random value.
    • If X can take only the two values x_1 and x_2, then P(X=x_1) + P(X=x_2) = 1.
  • Joint probability
    • For two random variables X and Y, the probability that X=x_1 and Y=y_1 occur together: P(X=x_1, Y=y_1)
  • Conditional Probability
    • the probability that X=x_1 occurs given that Y=y_1 has occurred
    • P(X=x_1 | Y=y_1)
    • fundamental rules
      • sum rule: P(X=x) = Σ_y P(X=x, Y=y)
      • product rule: P(X=x, Y=y) = P(Y=y | X=x) P(X=x)
      • Bayes formula: P(Y=y | X=x) = P(X=x, Y=y) / P(X=x) = P(X=x | Y=y) P(Y=y) / P(X=x), and for a partition X_1, …, X_n: P(X_i=x_i | Y=y) = P(Y=y | X_i=x_i) P(X_i=x_i) / Σ_{j=1}^{n} P(Y=y | X_j=x_j) P(X_j=x_j)
      • chain rule: P(X_1=x_1, …, X_n=x_n) = P(X_1=x_1) ∏_{i=2}^{n} P(X_i=x_i | X_1=x_1, …, X_{i-1}=x_{i-1})
    • Expectation of f(x): E[f(x)] = Σ_x P(x) f(x)
    • Variance of f(x): Var(f(x)) = E[(f(x) − E[f(x)])²] = E[f(x)²] − E[f(x)]²
    • Standard deviation
      • the square root of variance
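The sum rule, product rule, and Bayes formula can be checked numerically on a small hypothetical joint distribution; the 2×2 table below is made up purely for illustration:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y) over two binary variables;
# rows index the values of X, columns index the values of Y.
joint = np.array([[0.10, 0.30],
                  [0.40, 0.20]])

# Sum rule: marginalize one variable out.
# P(X=x) = Σ_y P(X=x, Y=y)
p_x = joint.sum(axis=1)
p_y = joint.sum(axis=0)

# Product rule: P(X=x, Y=y) = P(Y=y | X=x) P(X=x)
p_y_given_x = joint / p_x[:, None]
assert np.allclose(p_y_given_x * p_x[:, None], joint)

# Bayes formula: P(X=x | Y=y) = P(Y=y | X=x) P(X=x) / P(Y=y)
p_x_given_y = (p_y_given_x * p_x[:, None]) / p_y[None, :]

# Each conditional distribution over X sums to 1.
assert np.allclose(p_x_given_y.sum(axis=0), 1.0)
print(p_x_given_y)
```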
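Likewise, the expectation and variance formulas can be verified on a small hypothetical discrete distribution (the values, probabilities, and choice of f are arbitrary illustrative picks):

```python
import numpy as np

# Hypothetical discrete random variable: states x with probabilities P(x).
x = np.array([0.0, 1.0, 2.0])
p = np.array([0.2, 0.5, 0.3])
f = x ** 2  # an arbitrary function f(x)

# Expectation: E[f(x)] = Σ_x P(x) f(x)
e_f = np.sum(p * f)

# Variance: Var(f(x)) = E[(f(x) − E[f(x)])²] = E[f(x)²] − E[f(x)]²
var_direct = np.sum(p * (f - e_f) ** 2)
var_shortcut = np.sum(p * f ** 2) - e_f ** 2
assert np.isclose(var_direct, var_shortcut)

# Standard deviation: the square root of the variance.
std_f = np.sqrt(var_direct)
```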

2.2.2 Probability Distributions

A probability distribution specifies the probability of a random variable (or of several random variables jointly) taking each possible state.

  • Examples of distributions
    • Gaussian distribution
    • Bernoulli distribution
    • Binomial distribution
    • Laplace distribution
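The four distributions listed above can be sampled with NumPy's random generator; the parameters below (mean, scale, success probability, trial count) are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # sample size, chosen large so sample statistics are stable

# Gaussian: mean 0, standard deviation 1
gauss = rng.normal(loc=0.0, scale=1.0, size=n)
# Bernoulli: a single trial with success probability p = 0.3
bern = rng.binomial(n=1, p=0.3, size=n)
# Binomial: 10 trials, each with success probability p = 0.3
binom = rng.binomial(n=10, p=0.3, size=n)
# Laplace: location 0, scale 1 (variance = 2 * scale²)
lap = rng.laplace(loc=0.0, scale=1.0, size=n)

# Sample statistics approach the theoretical values:
# E[gauss] → 0, E[bern] → 0.3, E[binom] → 3.0, Var[lap] → 2.0
print(gauss.mean(), bern.mean(), binom.mean(), lap.var())
```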

 
