
Self-supervised Heterogeneous Graph Neural Network with Co-contrastive Learning

Xiao Wang, Nian Liu, Hui Han (hanhui@bupt.edu.cn), Chuan Shi∗ (shichuan@bupt.edu.cn)
Beijing University of Posts and Telecommunications, Beijing, China
∗ Corresponding author.

ABSTRACT

Heterogeneous graph neural networks (HGNNs), as an emerging technique, have shown a superior capacity for dealing with heterogeneous information networks (HINs). However, most HGNNs follow a semi-supervised learning manner, which notably limits their wide use in reality, since labels are usually scarce in real applications. Recently, contrastive learning, a self-supervised method, has become one of the most exciting learning paradigms and shows great potential when there are no labels.




In this paper, we study the problem of self-supervised HGNNs and propose a novel co-contrastive learning mechanism for HGNNs, named HeCo. Different from traditional contrastive learning, which only focuses on contrasting positive and negative samples, HeCo employs a cross-view contrastive mechanism. Specifically, two views of a HIN (the network schema view and the meta-path view) are proposed to learn node embeddings, so as to capture both local and high-order structures simultaneously. Then cross-view contrastive learning, together with a view mask mechanism, is proposed, which is able to extract positive and negative embeddings from the two views. This enables the two views to collaboratively supervise each other and finally learn high-level node embeddings.

Moreover, two extensions of HeCo are designed to generate harder negative samples with high quality, which further boosts the performance of HeCo. Extensive experiments conducted on a variety of real-world networks show the superior performance of the proposed methods over the state-of-the-arts.

CCS CONCEPTS
• Computing methodologies → Machine learning; • Networks → Network algorithms.

KEYWORDS
Heterogeneous information network, heterogeneous graph neural network, contrastive learning

ACM Reference Format:
Xiao Wang, Nian Liu, Hui Han, and Chuan Shi. 2018. Self-supervised Heterogeneous Graph Neural Network with Co-contrastive Learning. In Woodstock '18: ACM Symposium on Neural Gaze Detection, June 03-05, 2018, Woodstock, NY. ACM, New York, NY, USA, 11 pages.

1 INTRODUCTION

In the real world, the heterogeneous information network (HIN), or heterogeneous graph (HG) [30], is ubiquitous, due to its capacity for modeling various types of nodes and the diverse interactions between them, such as bibliographic networks [15], biomedical networks [3], and so on.

Recently, heterogeneous graph neural networks (HGNNs) have achieved great success in dealing with HIN data, because they are able to effectively combine the mechanism of message passing with the complex heterogeneity, so that the complex structures and rich semantics can be well captured. So far, HGNNs have significantly promoted the development of HIN analysis towards real-world applications, e.g., recommender systems [6] and security systems [7].

Basically, most HGNN studies belong to the semi-supervised learning paradigm; i.e., they usually design different heterogeneous message passing mechanisms to learn node embeddings, and the learning procedure is then supervised by a part of the node labels. However, the requirement that some node labels be known beforehand is frequently violated in practice, because it is very challenging or expensive to obtain labels in some real-world environments.

For example, labeling an unknown gene accurately usually requires enormous knowledge of molecular biology, which is not easy even for veteran researchers [15]. Recently, self-supervised learning, which aims to spontaneously find supervision signals in the data itself, has become a promising solution for settings without explicit labels [24]. Contrastive learning, as one typical technique of self-supervised learning, has attracted considerable attention [2, 12, 13, 25, 33]. By extracting positive and negative samples from the data, contrastive learning aims at maximizing the similarity between positive samples while minimizing the similarity between negative samples.
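The objective just described is commonly instantiated as an InfoNCE-style loss. The sketch below is a generic illustration only, not the loss used in HeCo; the function name, cosine similarity choice, and temperature value are assumptions made for the example:

```python
import numpy as np

def info_nce(anchor, positives, negatives, tau=0.5):
    """Generic InfoNCE-style contrastive loss for a single anchor.

    anchor:    (d,)   embedding of the target node
    positives: (p, d) embeddings treated as positive samples
    negatives: (n, d) embeddings treated as negative samples
    """
    def cos(a, b):
        # Cosine similarity between `a` and each row of `b`.
        a = a / np.linalg.norm(a)
        b = b / np.linalg.norm(b, axis=1, keepdims=True)
        return b @ a

    pos = np.exp(cos(anchor, positives) / tau)
    neg = np.exp(cos(anchor, negatives) / tau)
    # Low loss when positives are similar to the anchor and negatives are not.
    return -np.log(pos.sum() / (pos.sum() + neg.sum()))
```

The temperature `tau` controls how sharply the loss concentrates on hard samples; the loss drops as positives align with the anchor and rises as negatives do.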

In this way, contrastive learning is able to learn discriminative embeddings even without labels. Despite the wide use of contrastive learning in computer vision [2, 13] and natural language processing [4, 21], little effort has been made towards investigating its great potential on heterogeneous graphs. In practice, designing heterogeneous graph neural networks with contrastive learning is non-trivial: we need to carefully consider the characteristics of both HINs and contrastive learning. This requires us to address the following three fundamental problems:

(1) How to design a heterogeneous contrastive mechanism? A HIN consists of multiple types of nodes and relations, which naturally implies that it possesses very complex structures.

For example, the meta-path, a composition of multiple relations, is usually used to capture the long-range structure in a HIN [31]. Different meta-paths represent different semantics, each of which reflects one aspect of the HIN. To learn an effective node embedding that can fully encode these semantics, performing contrastive learning only on a single meta-path view [26] is far from sufficient. Therefore, investigating a heterogeneous cross-view contrastive mechanism is especially important for HGNNs.

(2) How to select proper views in a HIN? As mentioned before, cross-view contrastive learning is desired for HGNNs. Although one can extract many different views from a HIN because of its heterogeneity, one fundamental requirement is that the selected views should cover both the local and the high-order structures. The network schema, a meta template of a HIN [30], reflects the direct connections between nodes, which naturally captures the local structure.
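To make the meta-path idea concrete: because a meta-path is a composition of relations, reachability along it can be computed by multiplying the corresponding adjacency matrices. The tiny paper-author network below is an invented toy example, not data from the paper:

```python
import numpy as np

# Toy HIN: 3 papers, 2 authors. A_pa[i, j] = 1 if author j wrote paper i.
A_pa = np.array([[1, 0],
                 [1, 1],
                 [0, 1]])

# Meta-path PAP (paper-author-paper): compose the paper-author relation
# with its transpose; a nonzero entry means the two papers share an author.
pap = (A_pa @ A_pa.T) > 0
np.fill_diagonal(pap, False)  # drop trivial self-connections
print(pap.astype(int))
# p0-p1 share author a0, p1-p2 share author a1, but p0-p2 share no author,
# so PAP links p0-p1 and p1-p2 only.
```

Entries of `pap` are exactly the meta-path neighbors: pairs of papers connected through at least one shared author, i.e., the high-order structure a single relation alone cannot express.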

By contrast, the meta-path is widely used to extract the high-order structure. As a consequence, both the network schema view and the meta-path structure view should be carefully considered.

(3) How to set a difficult contrastive task? It is well known that a proper contrastive task further promotes learning more discriminative embeddings [1, 2, 32]. If the two views are too similar, the supervision signal will be too weak to learn informative embeddings. Hence, we need to make the contrastive learning on these two views more difficult. For example, one strategy is to enhance the information diversity of the two views, and another is to generate harder negative samples of high quality. In short, designing a proper contrastive task is very crucial.

In this paper, we study the problem of self-supervised learning on HINs and propose a novel heterogeneous graph neural network with co-contrastive learning (HeCo).

Specifically, different from previous contrastive learning, which contrasts the original network and a corrupted network, we choose the network schema and the meta-path structure as two views to collaboratively supervise each other. In the network schema view, the node embedding is learned by aggregating information from its direct neighbors, which is able to capture the local structure. In the meta-path view, the node embedding is learned by passing messages along multiple meta-paths, which aims at capturing the high-order structure. In this way, we design a novel contrastive mechanism that captures the complex structures in a HIN. To make the contrast harder, we propose a view mask mechanism that hides different parts of the network schema and the meta-paths, respectively, which further enhances the diversity of the two views and helps extract higher-level factors from them.
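A minimal sketch of the cross-view idea follows. The encoders, adjacencies, and loss here are deliberately simplified stand-ins (single-layer mean aggregation, random weights, made-up toy adjacencies), not the authors' architecture; the sketch only shows how two views of the same nodes can supervise each other through a cross-view contrast:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(features, adj, W):
    """One-layer mean aggregation over a view's adjacency: a crude
    stand-in for the schema-view and meta-path-view encoders."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    return np.tanh((adj @ features / deg) @ W)

n, d, h = 4, 5, 3                      # nodes, input dim, embedding dim
x = rng.normal(size=(n, d))            # toy node features
adj_schema = np.eye(n)                 # placeholder schema-view (local) adjacency
adj_meta = np.ones((n, n))             # placeholder meta-path-view adjacency
W1 = rng.normal(size=(d, h))
W2 = rng.normal(size=(d, h))

z_sc = encode(x, adj_schema, W1)       # schema-view embeddings
z_mp = encode(x, adj_meta, W2)         # meta-path-view embeddings

# Cross-view contrast: each node's schema-view embedding should be closer
# to its own meta-path-view embedding than to those of other nodes.
sim = z_sc @ z_mp.T
loss = -np.mean(np.log(np.exp(np.diag(sim)) / np.exp(sim).sum(axis=1)))
```

Minimizing `loss` pulls the two views of the same node together while pushing apart embeddings of different nodes, which is the sense in which the two views "collaboratively supervise each other".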

