
4.2 Model

yuuuun 2020. 11. 7. 11:29

h_v = f(x_v, x_{co[v]}, h_{ne[v]}, x_{ne[v]})

o_v = g(h_v, x_v)

  • Functions
    • f: the local transition function, shared among all nodes
    • g: the local output function
  • Symbols
    • x: the input feature
    • h: the hidden state
    • co[v]: the set of edges connected to node v
    • ne[v]: the set of neighbors of node v
    • x_v: the features of v
    • x_{co[v]}: the features of its edges
    • h_{ne[v]}: the states of the nodes in the neighborhood of v
    • x_{ne[v]}: the features of the nodes in the neighborhood of v
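The local functions f and g can be sketched as follows. This is a minimal illustration, not the paper's exact parameterization: the linear layers, tanh nonlinearity, sum aggregation, and the dimensionality DIM are all assumptions (the model only requires f to be a contraction map so the fixed point exists).

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # hypothetical feature/state dimensionality

# Weights are shared among all nodes, as the text states for f.
W_f = rng.normal(scale=0.1, size=(4 * DIM, DIM))
W_g = rng.normal(scale=0.1, size=(2 * DIM, DIM))

def f(x_v, x_co, h_ne, x_ne):
    """Local transition: combines the node's features, the features of its
    incident edges, and the states/features of its neighbors (summed)."""
    z = np.concatenate([x_v, x_co.sum(axis=0), h_ne.sum(axis=0), x_ne.sum(axis=0)])
    return np.tanh(z @ W_f)

def g(h_v, x_v):
    """Local output: maps the node's state and features to its output."""
    return np.concatenate([h_v, x_v]) @ W_g
```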

An example of the graph

Example for node l1:

  • x_{l1}: the input feature
  • co[l1]: {l_(1,4), l_(1,6), l_(3,1), l_(1,2)}
  • ne[l1]: {l2, l3, l4, l6}
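The sets co[v] and ne[v] for the example can be computed directly from an edge list. The representation of the graph as a list of node-name pairs is an assumption made for illustration; the names follow the figure.

```python
# Edges of the example graph that touch l1, written as node pairs:
# l_(1,4), l_(1,6), l_(3,1), l_(1,2).
edges = [("l1", "l4"), ("l1", "l6"), ("l3", "l1"), ("l1", "l2")]

def co(v, edges):
    """co[v]: the set of edges connected to node v."""
    return [e for e in edges if v in e]

def ne(v, edges):
    """ne[v]: the set of neighbors of node v."""
    return sorted({u for e in edges if v in e for u in e if u != v})
```

For node l1 this recovers the sets listed above: `co("l1", edges)` returns all four edges and `ne("l1", edges)` returns `['l2', 'l3', 'l4', 'l6']`.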

For all nodes together, this can be written as:

H = F(H, X)

O = G(H, X_N)

  • H: the matrix constructed by stacking all the states (all intermediate results)
  • O: the matrix constructed by stacking all the outputs (the final outputs)
  • X: the matrix constructed by stacking all the features
  • X_N: the matrix constructed by stacking all the node features
  • F: the global transition function
  • G: the global output function

When the update is applied repeatedly (over multiple iterations/layers): H^{t+1} = F(H^t, X)
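The repeated update H^{t+1} = F(H^t, X) is a fixed-point iteration, which converges when F is a contraction. A runnable sketch, assuming an illustrative F built from a row-normalized adjacency matrix and small tanh-layer weights (all made up here, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
N, DIM = 6, 4
A = (rng.random((N, N)) < 0.4).astype(float)
A = A / np.maximum(A.sum(axis=1, keepdims=True), 1)  # row-normalize
X = rng.normal(size=(N, DIM))
W = rng.normal(scale=0.1, size=(2 * DIM, DIM))  # small weights -> contraction

def F(H, X):
    # Each node mixes its neighbors' states (A @ H) with its own features.
    return np.tanh(np.concatenate([A @ H, X], axis=1) @ W)

H = np.zeros((N, DIM))
for t in range(200):  # iterate H^{t+1} = F(H^t, X) until H(T) ≈ H
    H_next = F(H, X)
    if np.abs(H_next - H).max() < 1e-6:
        break
    H = H_next
```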

Loss Function

loss = Σ_{i=1}^{p} (t_i - o_i)

Here p denotes the number of supervised nodes.
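Spelled out in code, the loss sums the target-minus-output differences over only the supervised nodes; the node indices, targets, and outputs below are made-up values for illustration (in practice a squared or absolute difference is typically minimized).

```python
# p = 3 supervised nodes; t_i are their targets, o_i the outputs from g.
targets = {0: 1.0, 2: -1.0, 5: 0.5}
outputs = {0: 0.8, 2: -0.6, 5: 0.9}

loss = sum(t_i - outputs[i] for i, t_i in targets.items())
# (1.0 - 0.8) + (-1.0 + 0.6) + (0.5 - 0.9) = -0.6
```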

Learning algorithm

The learning algorithm is based on a gradient-descent strategy and is composed of the following steps:

  • The states h_v^t are iteratively updated by h_v = f(x_v, x_{co[v]}, h_{ne[v]}, x_{ne[v]}) until a time step T, yielding an approximate fixed-point solution of H = F(H, X): H(T) ≈ H.
  • The gradient of the weights W is computed from the loss.
  • The weights W are updated according to the gradient computed in the previous step.
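The three steps can be sketched end to end on a toy problem. Everything below is an assumption for illustration: a path graph, a linear-tanh transition, a squared supervised loss, and a numerical (finite-difference) gradient in place of the Almeida-Pineda-style backpropagation used in the original GNN paper.

```python
import numpy as np

rng = np.random.default_rng(2)
N, DIM = 5, 3
A = np.eye(N, k=1) + np.eye(N, k=-1)            # path graph
A = A / np.maximum(A.sum(1, keepdims=True), 1)  # row-normalize
X = rng.normal(size=(N, DIM))
targets = {0: 1.0, 4: -1.0}                     # supervised nodes

def forward(W, T=30):
    H = np.zeros((N, DIM))
    for _ in range(T):                          # step 1: iterate to H(T) ≈ H
        H = np.tanh((A @ H + X) @ W["f"])
    O = H @ W["g"]                              # outputs o_v at the fixed point
    return sum((t - O[i, 0]) ** 2 for i, t in targets.items())

W = {"f": rng.normal(scale=0.1, size=(DIM, DIM)),
     "g": rng.normal(scale=0.1, size=(DIM, 1))}
loss_before = forward(W)

for step in range(60):
    for name, M in W.items():
        grad = np.zeros_like(M)
        for idx in np.ndindex(*M.shape):        # step 2: gradient of the loss
            eps = 1e-5
            M[idx] += eps; hi = forward(W)
            M[idx] -= 2 * eps; lo = forward(W)
            M[idx] += eps
            grad[idx] = (hi - lo) / (2 * eps)
        W[name] -= 0.05 * grad                  # step 3: gradient-descent update

loss_after = forward(W)
```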
