PDF4PRO ⚡AMP

A modern search engine that looks for books and documents around the web


Graph Transformer Networks - NeurIPS

Graph Transformer Networks

Seongjun Yun, Minbyul Jeong, Raehyun Kim, Jaewoo Kang, Hyunwoo J. Kim
Department of Computer Science and Engineering, Korea University
{ysj5419, minbyuljeong, raehyun, kangj, ...}

Abstract: Graph Neural Networks (GNNs) have been widely used in representation learning on graphs and achieved state-of-the-art performance in tasks such as node classification and link prediction. However, most existing GNNs are designed to learn node representations on the fixed and homogeneous graphs. The limitations especially become problematic when learning representations on a misspecified graph or a heterogeneous graph that consists of various types of nodes and edges.

… remarkable success in representation learning, GNNs learn a powerful representation for given tasks and data. To improve performance or scalability, generalized convolution based on spectral convolution [4, 26], attention mechanism on neighbors [25, 33], subsampling [6, 7] and inductive representation for a large graph [14] have been studied.
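To make the neighborhood-aggregation idea behind these GNN variants concrete, below is a minimal illustrative sketch of a single graph-convolution layer in Python/NumPy. It is not code from the paper; the names A (adjacency matrix), X (node features) and W (weight matrix) follow common GCN conventions and are assumptions for this example.

    import numpy as np

    def gcn_layer(A, X, W):
        """One propagation step: aggregate each node's neighbours (plus itself),
        normalise by degree, then apply a learned linear map with ReLU."""
        A_hat = A + np.eye(A.shape[0])             # add self-loops
        d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(1))   # D^{-1/2}
        A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
        return np.maximum(A_norm @ X @ W, 0.0)     # ReLU(D^{-1/2} A_hat D^{-1/2} X W)

    # Toy homogeneous graph: 4 nodes, 3-dim input features, 2-dim output.
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 1],
                  [0, 1, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    X = np.random.randn(4, 3)
    W = np.random.randn(3, 2)
    print(gcn_layer(A, X, W).shape)  # (4, 2): one representation per node

This sketch assumes a single, fixed, homogeneous graph; the heterogeneous setting that motivates the paper involves multiple node and edge types (one adjacency matrix per relation), which a layer like this does not handle by itself.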



