yuns
6.4 Sentence LSTM
- The Sentence LSTM (S-LSTM) was proposed to improve text encoding.
- It converts the text into a graph and then applies a Graph LSTM to learn the representation.
- The S-LSTM model regards each word as a node in the graph and adds a supernode.
- Each word node aggregates information from its adjacent words as well as from the supernode.
- The supernode aggregates information from all of the word nodes as well as from itself.
- NLP
- The hidden states of the words can be used for word-level tasks such as sequence labeling and POS tagging, while the hidden state of the supernode can be used for sentence-level tasks.
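The aggregation pattern described above can be sketched as a single simplified message-passing step. This is an illustrative sketch only, not the full gated S-LSTM update from the original paper: the function name `s_lstm_step`, the `window` parameter, and the use of a plain mean in place of the learned gates are all assumptions made for clarity.

```python
import numpy as np

def s_lstm_step(word_states, super_state, window=1):
    """One simplified S-LSTM message-passing step (illustrative sketch).

    Each word state becomes the mean of its own state, its neighbors
    within `window`, and the supernode state; the supernode state
    becomes the mean of itself and all word states.
    """
    n = len(word_states)
    new_words = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        neighborhood = word_states[lo:hi]            # word i and its adjacent words
        msgs = np.vstack([neighborhood, super_state[None, :]])
        new_words.append(msgs.mean(axis=0))          # aggregate neighbors + supernode
    # Supernode aggregates from all word nodes as well as from itself.
    new_super = np.vstack([word_states, super_state[None, :]]).mean(axis=0)
    return np.stack(new_words), new_super

# Toy sentence of 4 words with 3-dimensional hidden states.
words = np.eye(4, 3)
sup = np.zeros(3)
words2, sup2 = s_lstm_step(words, sup)
```

After the step, `words2[i]` can feed word-level predictions (e.g. a tagging head) and `sup2` a sentence-level classifier, matching the division of labor described above.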