
Graph Representation Learning - McGill University School ...

Graph Representation Learning

William L. Hamilton
McGill University
2020

Pre-publication draft of a book to be published by Morgan & Claypool. This version is released with the relevant copyrights, held by the author and publisher, extending to this pre-publication draft.

Citation: William L. Hamilton. (2020). Graph Representation Learning. Synthesis Lectures on Artificial Intelligence and Machine Learning, Vol. 14, No. 3.

Abstract

Graph-structured data is ubiquitous throughout the natural and social sciences, from telecommunication networks to quantum chemistry. Building relational inductive biases into deep learning architectures is crucial if we want systems that can learn, reason, and generalize from this kind of data. Recent years have seen a surge in research on graph representation learning, including techniques for deep graph embeddings, generalizations of convolutional neural networks to graph-structured data, and neural message-passing approaches inspired by belief propagation.





These advances in graph representation learning have led to new state-of-the-art results in numerous domains, including chemical synthesis, 3D vision, recommender systems, question answering, and social network analysis.

The goal of this book is to provide a synthesis and overview of graph representation learning. We begin with a discussion of the goals of graph representation learning, as well as key methodological foundations in graph theory and network analysis. Following this, we introduce and review methods for learning node embeddings, including random-walk-based methods and applications to knowledge graphs. We then provide a technical synthesis and introduction to the highly successful graph neural network (GNN) formalism, which has become a dominant and fast-growing paradigm for deep learning with graph data.
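The message-passing idea behind the GNN formalism mentioned above can be sketched in a few lines: each node sums its neighbors' embeddings and combines them with its own embedding through shared learnable weights. The following NumPy toy is illustrative only; the function name, the sum aggregator, and the ReLU update are choices made for this sketch, not the book's notation.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def message_passing_layer(A, H, W_self, W_neigh):
    """One round of neural message passing (illustrative sketch).

    A: (n, n) binary adjacency matrix
    H: (n, d_in) node embedding matrix
    W_self, W_neigh: (d_in, d_out) learnable weight matrices
    """
    messages = A @ H                                # sum of each node's neighbor embeddings
    return relu(H @ W_self + messages @ W_neigh)    # shared update applied at every node

# Tiny example: a path graph 0 - 1 - 2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))          # initial 4-dimensional node features
W_self = rng.normal(size=(4, 2))
W_neigh = rng.normal(size=(4, 2))

H_next = message_passing_layer(A, H, W_self, W_neigh)
print(H_next.shape)  # (3, 2): one new 2-dimensional embedding per node
```

Stacking k such layers lets information propagate k hops across the graph, which is the intuition developed at length in Part II of the book.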

The book concludes with a synthesis of recent advancements in deep generative models for graphs, a nascent but quickly growing subset of graph representation learning.

Contents

1 Introduction
  1.1 What is a graph? (Multi-relational graphs; Feature information)
  1.2 Machine learning on graphs (Node classification; Relation prediction; Clustering and community detection; Graph classification, regression, and clustering)
2 Background and Traditional Approaches
  2.1 Graph statistics and kernel methods (Node-level statistics and features; Graph-level features and graph kernels)
  2.2 Neighborhood overlap detection (Local overlap measures; Global overlap measures)
  2.3 Graph Laplacians and spectral methods (Graph Laplacians; Graph cuts and clustering; Generalized spectral clustering)
  2.4 Towards learned representations

Part I: Node Embeddings

3 Neighborhood Reconstruction Methods
  3.1 An encoder-decoder perspective (The encoder; The decoder; Optimizing an encoder-decoder model; Overview of the encoder-decoder approach)
  3.2 Factorization-based approaches
  3.3 Random walk embeddings (Random walk methods and matrix factorization)
  3.4 Limitations of shallow embeddings
4 Multi-relational Data and Knowledge Graphs
  4.1 Reconstructing multi-relational data
  4.2 Loss functions
  4.3 Multi-relational decoders (Representational abilities)

Part II: Graph Neural Networks

5 The Graph Neural Network Model
  5.1 Neural message passing (Overview of the message passing framework; Motivations and intuitions; The basic GNN; Message passing with self-loops)
  5.2 Generalized neighborhood aggregation (Neighborhood normalization; Set aggregators; Neighborhood attention)
  5.3 Generalized update methods (Concatenation and skip-connections; Gated updates; Jumping knowledge connections)
  5.4 Edge features and multi-relational GNNs (Relational graph neural networks; Attention and feature concatenation)
  5.5 Graph pooling
  5.6 Generalized message passing
6 Graph Neural Networks in Practice
  6.1 Applications and loss functions (GNNs for node classification; GNNs for graph classification; GNNs for relation prediction; Pre-training GNNs)
  6.2 Efficiency concerns and node sampling (Graph-level implementations; Subsampling and mini-batching)
  6.3 Parameter sharing and regularization
7 Theoretical Motivations
  7.1 GNNs and graph convolutions (Convolutions and the Fourier transform; From time signals to graph signals; Spectral graph convolutions; Convolution-inspired GNNs)
  7.2 GNNs and probabilistic graphical models (Hilbert space embeddings of distributions; Graphs as graphical models; Embedding mean-field inference; GNNs and PGMs more generally)
  7.3 GNNs and graph isomorphism (Graph isomorphism; Graph isomorphism and representational capacity; The Weisfeiler-Lehman algorithm; GNNs and the WL algorithm; Beyond the WL algorithm)

Part III: Generative Graph Models

8 Traditional Graph Generation Approaches
  8.1 Overview of traditional approaches
  8.2 The Erdős–Rényi model
  8.3 Stochastic block models
  8.4 Preferential attachment
  8.5 Traditional applications
9 Deep Generative Models
  9.1 Variational autoencoder approaches (Node-level latents; Graph-level latents)
  9.2 Adversarial approaches
  9.3 Autoregressive methods (Modeling edge dependencies; Recurrent models for graph generation)
  9.4 Evaluating graph generation
  9.5 Molecule generation

Conclusion
Bibliography

Preface

The field of graph representation learning has grown at an incredible, and sometimes unwieldy, pace over the past seven years. I first encountered this area as a graduate student in 2013, during the time when many researchers began investigating deep learning methods for embedding graph-structured data. In the years since 2013, the field of graph representation learning has witnessed a truly impressive rise and expansion, from the development of the standard graph neural network paradigm to the nascent work on deep generative models of graph-structured data.

The field has transformed from a small subset of researchers working on a relatively niche topic to one of the fastest growing sub-areas of deep learning. However, as the field has grown, our understanding of the methods and theories underlying graph representation learning has also stretched backwards through time. We can now view the popular node embedding methods as well-understood extensions of classic work on dimensionality reduction. We now have an understanding and appreciation for how graph neural networks evolved somewhat independently from historically rich lines of work on spectral graph theory, harmonic analysis, variational inference, and the theory of graph isomorphism. This book is my attempt to synthesize and summarize these methodological threads in a practical way.

My hope is to introduce the reader to the current practice of the field, while also connecting this practice to broader lines of historical research in machine learning and statistics.

Intended audience. This book is intended for a graduate-level researcher in machine learning or an advanced undergraduate student. The discussions of graph-structured data and graph properties are relatively self-contained. However, the book does assume a background in machine learning and a familiarity with modern deep learning methods (e.g., convolutional and recurrent neural networks). Generally, the book assumes a level of machine learning and deep learning knowledge that one would obtain from a textbook such as Goodfellow et al. [2016]'s Deep Learning.

William L. Hamilton
August 2020

Acknowledgments

Over the past several years, I have had the good fortune to work with many outstanding collaborators on topics related to graph representation learning, many of whom have made seminal contributions to this nascent field.

I am deeply indebted to all these collaborators and friends: my colleagues at Stanford, McGill, the University of Toronto, and elsewhere; my graduate students at McGill, who taught me more than anyone else the value of pedagogical writing; and my PhD advisors, Dan Jurafsky and Jure Leskovec, who encouraged and seeded this path for my research.

I also owe a great debt of gratitude to the students of my winter 2020 graduate seminar at McGill University. These students were the early beta testers of this material, and this book would not exist without their feedback and encouragement. In a similar vein, the exceptionally detailed feedback provided by Petar Veličković, as well as comments by Mariá C. V. Nascimento, Jian Tang, Àlex Ferrer Campo, Seyed Mohammad Sadegh Mahdavi, Yawei Li, Xiaofeng Chen, and Gabriele Corso, were invaluable during revisions. No book is written in a vacuum.

