2.1 Linear Algebra
2.1.1 Basic Concepts
- Scalar: A number
- Vector: A column of ordered numbers $$ x=\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\x_n \end{bmatrix}$$
- Norm of a vector: a measure of its length
- $L_p$ norm: $||x||_p = (\sum_{i=1}^{n}|x_i|^p)^{1/p}$
- $L_1$ norm: $||x||_1 = \sum_{i=1}^{n}|x_i|$
- $L_2$ norm: $||x||_2 = \sqrt{\sum_{i=1}^{n}{x_i}^2}$ --> the most common choice for measuring the length of a vector
- $L_\infty$ norm: $||x||_\infty = \max_i |x_i|$
- the distance between two vectors $x_1, x_2$ is $$D_p(x_1, x_2) = ||x_1 - x_2||_p$$ (a quick numerical check follows below)
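A minimal numpy sketch of the norm and distance formulas above (the example vectors are my own choice):

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])

l1   = np.sum(np.abs(x))          # L1 norm: sum of absolute values -> 7.0
l2   = np.sqrt(np.sum(x ** 2))    # L2 norm: Euclidean length -> 5.0
linf = np.max(np.abs(x))          # L-infinity norm: largest |x_i| -> 4.0

# np.linalg.norm computes the same quantities.
assert np.isclose(l1,   np.linalg.norm(x, 1))
assert np.isclose(l2,   np.linalg.norm(x, 2))
assert np.isclose(linf, np.linalg.norm(x, np.inf))

# The distance between two vectors is the norm of their difference.
x1, x2 = np.array([1.0, 2.0]), np.array([4.0, 6.0])
print(np.linalg.norm(x1 - x2, 2))  # D_2(x1, x2) = 5.0
```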
- a set of vectors $x_1, x_2, \cdots, x_m$ is linearly independent iff there does not exist a set of scalars $\lambda_1, \lambda_2, \cdots, \lambda_m$, not all 0, such that $$\lambda_1 x_1 + \lambda_2 x_2 + \cdots + \lambda_m x_m = 0$$
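Linear independence can be checked numerically: stack the vectors into a matrix and compare its rank to the number of vectors. A small sketch (the vectors are made up for illustration):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2                      # deliberately dependent on v1 and v2

M = np.stack([v1, v2, v3])        # one vector per row
print(np.linalg.matrix_rank(M))   # 2 < 3, so the set is linearly dependent
```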
- Matrix: a 2-dimensional array $$ A=\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$$ where $A\in \mathbb{R}^{m\times n}$
- Matrix Product: For $A\in \mathbb{R}^{m\times n}$ and $B\in \mathbb{R}^{n\times p}$, the matrix product $C = AB \in \mathbb{R}^{m\times p}$ is calculated as $$C_{ij} = \sum_{k=1}^{n}A_{ik}B_{kj}$$
- Properties
- $(AB)C = A(BC)$ --> always true (associativity)
- $AB = BA$ --> not always true (matrix multiplication is not commutative; see the sketch below)
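A short sketch checking these two properties with random matrices (shapes chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 2))

# C_ij = sum_k A_ik B_kj, i.e. ordinary matrix multiplication.
print(np.allclose(A @ B, np.einsum('ik,kj->ij', A, B)))  # True

# Associativity always holds (up to floating-point error).
print(np.allclose((A @ B) @ C, A @ (B @ C)))             # True

# Commutativity does not hold in general.
S, T = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))
print(np.allclose(S @ T, T @ S))                         # False
```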
- Other basic matrix operations
- Determinant: $\det(A) = \sum_{k_1 k_2 \cdots k_n} (-1)^{\tau(k_1 k_2 \cdots k_n)} a_{1k_1} a_{2k_2} \cdots a_{nk_n}$, where the sum runs over all permutations $k_1 k_2 \cdots k_n$ of $1, 2, \cdots, n$ and $\tau(\cdot)$ counts the inversions of the permutation
- Inverse matrix: If a square matrix A is nonsingular, its inverse $A^{-1}$ satisfies $A^{-1}A=I$
- Transpose: $(A^T)_{ij} = A_{ji}$
- Hadamard product (element-wise product): $C_{ij}=A_{ij}B_{ij}$ where $A, B, C \in \mathbb{R}^{m\times n}$
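A numpy sketch exercising these operations on a small example matrix (my own choice):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

print(np.linalg.det(A))                    # determinant: -2.0
A_inv = np.linalg.inv(A)                   # inverse (A is nonsingular)
print(np.allclose(A_inv @ A, np.eye(2)))   # A^{-1} A = I -> True
print(A.T[0, 1] == A[1, 0])                # (A^T)_{ij} = A_{ji} -> True
print(A * B)                               # Hadamard (element-wise) product
```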
2.1.2 Eigendecomposition
- Let A be a matrix in $\mathbb{R}^{n\times n}$. A nonzero vector $v\in \mathbb{C}^n$ and a scalar $\lambda \in \mathbb{C}$ satisfying $$Av=\lambda v$$
- $\lambda$ : eigenvalue of A
- v: eigenvector of A
- If A has n linearly independent eigenvectors, let $V=[v_1\ v_2\ \cdots\ v_n]$ (V: invertible matrix)
- Eigendecomposition of A (diagonalization): $$ A = V diag(\lambda) V^{-1} $$ where $\lambda = [\lambda_1, \lambda_2, \cdots, \lambda_n]^T$
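A minimal sketch, assuming a small symmetric matrix so that n independent eigenvectors exist, verifying $Av = \lambda v$ and $A = V diag(\lambda) V^{-1}$ with numpy:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, V = np.linalg.eig(A)                 # columns of V are eigenvectors
print(np.allclose(A @ V[:, 0], lam[0] * V[:, 0]))            # A v = lambda v
print(np.allclose(V @ np.diag(lam) @ np.linalg.inv(V), A))   # A = V diag(lam) V^{-1}
```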
2.1.3 Singular Value Decomposition (SVD)
Like eigendecomposition, SVD is a method of diagonalizing a matrix (but it applies to any $m \times n$ matrix).
- singular values
- r: the rank of $A^T A$
- there exist $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0$ such that for $1 \leq i \leq r$, $v_i$ is an eigenvector of $A^T A$ with corresponding eigenvalue $\sigma_i^2$
- the singular values of A are $\sigma_1, \sigma_2, \cdots, \sigma_r$
- singular value decomposition $$ A = U \Sigma V^T $$
- $U\in \mathbb{R}^{m\times m}$, $V\in \mathbb{R}^{n \times n}$, and $\Sigma \in \mathbb{R}^{m\times n}$ where $\Sigma_{ij} = \sigma_i$ iff $i = j \leq r$ (and $\Sigma_{ij} = 0$ otherwise)
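A sketch with a random $4 \times 3$ matrix (my own example) confirming both $A = U \Sigma V^T$ and $\sigma_i = \sqrt{\lambda_i}$, where $\lambda_i$ are the eigenvalues of $A^T A$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A)               # s holds sigma_1 >= ... >= sigma_r
Sigma = np.zeros_like(A)                  # Sigma has the same m x n shape as A
Sigma[:len(s), :len(s)] = np.diag(s)      # embed sigma_i on the diagonal

print(np.allclose(U @ Sigma @ Vt, A))                        # A = U Sigma V^T
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]                  # descending order
print(np.allclose(np.sqrt(np.clip(eigvals, 0, None)), s))    # sigma_i = sqrt(lambda_i)
```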