Graph Theory and Network Science for Natural Language Processing – Part 4, TextGraphs & Graph Neural Networks

We close our series on Graph Theory and Network Science for NLP with an overview of recent developments in the field. That means deep learning finally enters the scene.

TextGraphs

Text graphs are graph representations of texts, based on word co-occurrence or semantic similarity. As we saw in the previous posts, this is a very powerful representation. The annual TextGraphs workshops are the best place to find out what’s going on in this field. Here we just cherry-pick a few notable algorithms.

TextRank

The TextRank algorithm for keyphrase extraction and summarization is probably the most famous Graph-Based NLP algorithm. Pure-Python implementations exist, and it is available in or as a plugin for the main NLP packages (e.g. pytextrank for spaCy, gensim). Its basic idea is very simple: a text graph is constructed from skip-grams, and a centrality measure (usually PageRank) is used to find the most important words, phrases, or sentences. Recent variants of the algorithm construct the text graph from word or sentence embeddings instead. TextRank remains one of the best unsupervised keyphrase extractors and extractive summarizers available these days.
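
To make the recipe concrete, here is a minimal sketch of the TextRank idea using networkx: build a co-occurrence graph over a sliding window of tokens and rank the nodes with PageRank. The tokenization, the window size, and the toy token list are our own illustrative assumptions, not the published configuration of the algorithm.

import networkx as nx

def textrank_keywords(tokens, window=3, top_k=5):
    # Build an undirected co-occurrence graph: connect each token
    # to the tokens that follow it within the sliding window.
    g = nx.Graph()
    for i, word in enumerate(tokens):
        for other in tokens[i + 1 : i + window]:
            if word != other:
                g.add_edge(word, other)
    # PageRank assigns a centrality score to every word node.
    scores = nx.pagerank(g)
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Toy input: in practice, tokens come from a tokenizer after
# stop-word and part-of-speech filtering.
tokens = ["graph", "theory", "network", "science", "text",
          "graph", "neural", "network", "text", "graph"]
print(textrank_keywords(tokens))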

corpus2graph

These days everyone is talking about neural embeddings – word, sentence, or text. These embeddings are cool, but they usually need a huge corpus. In this case huge means really huge – on the scale of terabytes. Training the billions of parameters takes hundreds of GPU hours. Consequently, most of the time we use pre-trained models and fine-tune them on downstream tasks. But sometimes fine-tuning doesn’t give good results, or there is no neural embedding for the language we are working with. corpus2graph is a wonderful tool for converting your corpus into a text graph representation. It’s worth reading the paper on this tool. Once you have your corpus in text graph form, you can do lots of interesting things – e.g. use TextRank to find keyphrases and/or to summarize your documents.

Semantic similarity graphs

If you can embed your words or sentences, you can turn them into a graph. For example, using scikit-learn’s NearestNeighbors, you can build a graph in which edges connect nearest neighbors whose similarity exceeds a given threshold, with cosine similarity serving as the edge weight.
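
As a minimal sketch (the k and threshold values are arbitrary choices for illustration), such a graph can be built with scikit-learn and networkx:

import networkx as nx
import numpy as np
from sklearn.neighbors import NearestNeighbors

def similarity_graph(embeddings, labels, k=5, threshold=0.2):
    # Find the k nearest neighbors of every embedding under cosine distance.
    nn = NearestNeighbors(n_neighbors=k + 1, metric="cosine").fit(embeddings)
    distances, indices = nn.kneighbors(embeddings)
    g = nx.Graph()
    g.add_nodes_from(labels)
    for i, (dists, idxs) in enumerate(zip(distances, indices)):
        for dist, j in zip(dists[1:], idxs[1:]):  # [1:] skips the self-match
            similarity = 1.0 - dist  # cosine similarity from cosine distance
            if similarity >= threshold:
                g.add_edge(labels[i], labels[j], weight=similarity)
    return g

# Toy input: random vectors stand in for real sentence embeddings.
rng = np.random.default_rng(42)
embeddings = rng.standard_normal((10, 50))
g = similarity_graph(embeddings, [f"sent_{i}" for i in range(10)])
print(g.number_of_edges())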

Knowledge Graphs

Semantic similarity is a good way to extract knowledge from unstructured text; grammatical parsing and relation extraction, however, can be even more powerful.
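
As a taste of the latter, here is a deliberately naive subject-verb-object triple extractor sketched with spaCy’s dependency parse. Real relation-extraction systems are far more sophisticated, and the dependency labels below assume the English models.

import spacy

# Assumes the model has been installed:
# python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

def extract_svo_triples(text):
    # Collect (subject, verb, object) triples from the dependency parse.
    triples = []
    for sentence in nlp(text).sents:
        for token in sentence:
            if token.pos_ == "VERB":
                subjects = [c for c in token.children
                            if c.dep_ in ("nsubj", "nsubjpass")]
                objects = [c for c in token.children
                           if c.dep_ in ("dobj", "attr")]
                for s in subjects:
                    for o in objects:
                        triples.append((s.text, token.lemma_, o.text))
    return triples

print(extract_svo_triples("Marie Curie discovered radium. Graphs encode knowledge."))
# e.g. [('Curie', 'discover', 'radium'), ('Graphs', 'encode', 'knowledge')]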

Graph Neural Networks

Historically, graph representation learning – the precursor of modern Graph Neural Networks (GNNs) – was inspired by word2vec. The basic idea is simply to construct sequences from random walks in the graph, so you can treat them as sentences in word2vec; this is what DeepWalk and node2vec do. You can embed not only the nodes, but the edges as well (see e.g. edge2vec). This paper on arXiv gives a great overview of Graph Neural Networks. If you want to get into the nitty-gritty details, follow this collection of must-read papers on graph neural networks. You can do really amazing things with GNNs in NLP: you can learn syntactic relations, or you can make your knowledge graphs better. Read this short post on the use of GNNs in NLP to get more ideas.
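
Here is a minimal DeepWalk-style sketch of the random-walk idea, combining networkx for the walks with gensim’s Word2Vec for the embedding. The walks are uniform (node2vec would bias them), the hyperparameters are arbitrary, and vector_size is the gensim 4.x parameter name (formerly size).

import random
import networkx as nx
from gensim.models import Word2Vec

def random_walks(g, num_walks=10, walk_length=20, seed=42):
    # Generate uniform random walks starting from every node.
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for node in g.nodes():
            walk = [node]
            while len(walk) < walk_length:
                neighbors = list(g.neighbors(walk[-1]))
                if not neighbors:
                    break
                walk.append(rng.choice(neighbors))
            # Word2Vec expects sentences of string tokens.
            walks.append([str(n) for n in walk])
    return walks

g = nx.karate_club_graph()  # a classic toy graph
model = Word2Vec(random_walks(g), vector_size=64, window=5, min_count=0, sg=1)
print(model.wv["0"].shape)  # 64-dimensional embedding of node 0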

GNNs & Python

Our absolute favorite is PyTorch Geometric. We are biased towards PyTorch, and we have to admit that PyTorch Geometric is not the most beginner-friendly library, but we encourage you to give it a try.
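
As a taste of the library, here is a minimal two-layer GCN on a toy graph; the dimensions and the random node features are our own illustrative choices.

import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# A toy graph with 3 nodes; undirected edges are stored as two directed edges.
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.randn(3, 16)  # 16-dimensional node features
data = Data(x=x, edge_index=edge_index)

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, out_dim)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)

model = GCN(16, 32, 2)
out = model(data.x, data.edge_index)
print(out.shape)  # torch.Size([3, 2]), e.g. logits for 2 node classes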

StellarGraph is a beautiful library for working with GNNs. It’s built on top of networkx, uses TensorFlow as a backend, and is compatible with scikit-learn.

Spektral is probably the most user-friendly GNN library. It is based on Keras and TensorFlow.

If you are looking for a robust and scalable solution, use DGL (the Deep Graph Library). It works with every major tensor backend (PyTorch, TensorFlow, and MXNet), which means you can easily deploy it on Azure, AWS, and Google Cloud.
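
A minimal DGL sketch looks quite similar to the PyTorch Geometric one above; again, the toy graph and the feature sizes are assumptions for illustration.

import torch
import dgl
from dgl.nn import GraphConv

# A toy directed graph with 3 nodes and edges 0->1 and 1->2.
g = dgl.graph((torch.tensor([0, 1]), torch.tensor([1, 2])))
g = dgl.add_self_loop(g)  # GraphConv dislikes zero-in-degree nodes
features = torch.randn(3, 16)  # 16-dimensional node features

conv = GraphConv(16, 8)  # a single graph convolution layer
h = conv(g, features)
print(h.shape)  # torch.Size([3, 8])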

Conclusion

We hope you enjoyed our journey into the field of Graph-Based NLP. Since this field is very versatile, we couldn’t cover everything; we just wanted to show you the big picture and give you some hints on how to get started with graph theory and network science in NLP. We wish you happy hacking with graphs!
