
graph deep learning/#2 Basics of Math and Graph

2.1 Linear Algebra

yuuuun 2020. 11. 3. 11:58

2.1.1 Basic Concepts

  • Scalar: a single number
  • Vector: a column of ordered numbers $x = [x_1, x_2, \dots, x_n]^T$
    • Norm of a vector: measures its length
      • $L_p$ norm: $\|x\|_p = \left(\sum_{i=1}^{n} |x_i|^p\right)^{1/p}$
      • $L_1$ norm: $\|x\|_1 = \sum_{i=1}^{n} |x_i|$
      • $L_2$ norm: $\|x\|_2 = \sqrt{\sum_{i=1}^{n} x_i^2}$ --> most commonly used to measure the length of vectors
      • $L_\infty$ norm: $\|x\|_\infty = \max_i |x_i|$
    • The distance between two vectors $x_1, x_2$ is $D_p(x_1, x_2) = \|x_1 - x_2\|_p$
    • A set of vectors $x_1, x_2, \dots, x_m$ is linearly independent iff there does not exist a set of scalars $\lambda_1, \lambda_2, \dots, \lambda_m$, not all zero, such that $\lambda_1 x_1 + \lambda_2 x_2 + \dots + \lambda_m x_m = 0$
  • Matrix: a 2-dimensional array $A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$, where $A \in \mathbb{R}^{m \times n}$
  • Matrix product: for $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$, the matrix product $C = AB \in \mathbb{R}^{m \times p}$ is calculated as $C_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$
    • Properties
      • $(AB)C = A(BC)$ --> always true (associative)
      • $AB = BA$ --> not always true (not commutative)
  • Determinant: $\det(A) = \sum_{k_1 k_2 \cdots k_n} (-1)^{\tau(k_1 k_2 \cdots k_n)} a_{1k_1} a_{2k_2} \cdots a_{nk_n}$, where the sum runs over all permutations $k_1 k_2 \cdots k_n$ of $1, \dots, n$
  • Inverse matrix: if $A$ is a square matrix, the inverse matrix of $A$ satisfies $A^{-1}A = I$
  • Transpose: $A^T_{ij} = A_{ji}$
  • Hadamard product: $C_{ij} = A_{ij} B_{ij}$, where $A \in \mathbb{R}^{m \times n}$, $B \in \mathbb{R}^{m \times n}$, and $C \in \mathbb{R}^{m \times n}$
 

Hadamard product (matrices) - Wikipedia (en.wikipedia.org): the Hadamard product (also known as the element-wise, entrywise, or Schur product) operates on identically shaped matrices and produces a third matrix of the same dimensions.


2.1.2 Eigendecomposition

  • Let $A \in \mathbb{R}^{n \times n}$, $v \in \mathbb{C}^n$ with $v \neq 0$, and $\lambda \in \mathbb{C}$ such that $Av = \lambda v$
    • $\lambda$: an eigenvalue of $A$
    • $v$: an eigenvector of $A$

Eigenvector & eigenvalue expressed as an equation

  • Let $V = [v_1, v_2, \dots, v_n]$ be the matrix whose columns are the eigenvectors of $A$, and assume $V$ is invertible
    • Eigendecomposition (diagonalization) of $A$: $A = V \operatorname{diag}(\lambda) V^{-1}$, where $\lambda = [\lambda_1, \lambda_2, \dots, \lambda_n]$ are the corresponding eigenvalues
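The diagonalization above can be verified numerically (a sketch; the symmetric test matrix is my own choice, picked so that real eigenvectors are guaranteed):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of V are the eigenvectors v_i; eigvals holds the lambda_i
eigvals, V = np.linalg.eig(A)

# Each pair satisfies A v = lambda v
for lam, v in zip(eigvals, V.T):
    assert np.allclose(A @ v, lam * v)

# Since V is invertible here, A = V diag(lambda) V^{-1}
A_rebuilt = V @ np.diag(eigvals) @ np.linalg.inv(V)
assert np.allclose(A, A_rebuilt)
```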

2.1.3 Singular Value Decomposition (SVD)

Like eigendecomposition, SVD is a method of diagonalizing a matrix.

  • Singular values
    • $r$: the rank of $A^T A$
    • There exist $\sigma_1 \geq \sigma_2 \geq \dots \geq \sigma_r > 0$ such that for $1 \leq i \leq r$, $v_i$ is an eigenvector of $A^T A$ with corresponding eigenvalue $\sigma_i^2$
    • $\sigma_1, \sigma_2, \dots, \sigma_r$ are the singular values of $A$
  • Singular value decomposition: $A = U \Sigma V^T$
    • $U \in \mathbb{R}^{m \times m}$, $V \in \mathbb{R}^{n \times n}$, and $\Sigma \in \mathbb{R}^{m \times n}$, where $\Sigma_{ij} = \sigma_i$ if $i = j \leq r$ and $\Sigma_{ij} = 0$ otherwise
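A quick numerical check of the decomposition (a sketch; the rectangular test matrix is my own choice):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])   # A in R^{2x3}

# s holds the singular values sigma_1 >= sigma_2 >= ... > 0
U, s, Vt = np.linalg.svd(A)

# Rebuild the m x n Sigma with sigma_i on the diagonal, zeros elsewhere
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vt)   # A = U Sigma V^T

# Each v_i (row of V^T) is an eigenvector of A^T A with eigenvalue sigma_i^2
AtA = A.T @ A
for sigma, v in zip(s, Vt):
    assert np.allclose(AtA @ v, sigma ** 2 * v)
```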

