# Graphgen-redux: a Fast and Lightweight Recurrent Model for labeled Graph Generation

```bibtex
@article{Podda2021GraphgenreduxAF,
  title   = {Graphgen-redux: a Fast and Lightweight Recurrent Model for labeled Graph Generation},
  author  = {Marco Podda and Davide Bacciu},
  journal = {2021 International Joint Conference on Neural Networks (IJCNN)},
  year    = {2021},
  pages   = {1-8}
}
```

The problem of labeled graph generation is gaining attention in the Deep Learning community. The task is challenging due to the sparse and discrete nature of graph spaces. Several approaches have been proposed in the literature, most of which require transforming the graphs into sequences that encode their structure and labels, and learning the distribution of such sequences through an auto-regressive generative model. Among this family of approaches, we focus on the Graphgen model. The…
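The pipeline the abstract describes — serialize each labeled graph into a sequence that captures both structure and labels, then model that sequence autoregressively — can be illustrated with a minimal sketch. This is a simplified DFS-based edge serialization for illustration only, not Graphgen's actual minimum-DFS-code algorithm; the graph, labels, and tuple format are hypothetical:

```python
# Sketch: serialize a small labeled graph into a DFS-ordered edge sequence,
# the kind of flat representation an autoregressive model could be trained on.
# NOT Graphgen's minimum DFS code; a simplified stand-in for illustration.

def dfs_edge_sequence(adj, node_labels, edge_labels, start):
    """Return (t_u, t_v, l_u, l_e, l_v) tuples: DFS discovery times of the
    two endpoints plus node and edge labels, one tuple per edge."""
    times = {start: 0}          # node -> DFS discovery time
    seq = []                    # output edge sequence
    stack = [start]
    seen_edges = set()
    while stack:
        u = stack.pop()
        for v in adj[u]:
            e = frozenset((u, v))
            if e in seen_edges:
                continue
            seen_edges.add(e)
            if v not in times:          # first visit: assign next time step
                times[v] = len(times)
                stack.append(v)
            seq.append((times[u], times[v],
                        node_labels[u], edge_labels[e], node_labels[v]))
    return seq

# Toy labeled triangle (labels loosely styled after atoms/bonds).
adj = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
node_labels = {"a": "C", "b": "C", "c": "O"}
edge_labels = {frozenset(("a", "b")): "-",
               frozenset(("a", "c")): "=",
               frozenset(("b", "c")): "-"}

seq = dfs_edge_sequence(adj, node_labels, edge_labels, "a")
# seq -> [(0, 1, 'C', '-', 'C'), (0, 2, 'C', '=', 'O'), (2, 1, 'O', '-', 'C')]
```

An autoregressive model would then factor the probability of a graph as the product of per-tuple conditionals over such a sequence, predicting each edge tuple given the tuples emitted so far.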

#### References

Showing 1–10 of 37 references.

Edge-based sequential graph generation with recurrent neural networks

- Computer Science, Mathematics
- Neurocomputing
- 2020

This work proposes to cast the generative process of a graph into a sequential one, relying on a node ordering procedure, to design a novel generative model composed of two recurrent neural networks that learn to predict the edges of graphs.

GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models

- Computer Science
- ICML
- 2018

The experiments show that GraphRNN significantly outperforms all baselines, learning to generate diverse graphs that match the structural characteristics of a target set, while also scaling to graphs 50 times larger than previous deep models.

Learning Deep Generative Models of Graphs

- Computer Science, Mathematics
- ICLR
- 2018

This work is the first and most general approach for learning generative models over arbitrary graphs, and opens new directions for moving away from restrictions of vector- and sequence-like knowledge representations, toward more expressive and flexible relational data structures.

GRAM: Scalable Generative Models for Graphs with Graph Attention Mechanism

- Computer Science, Mathematics
- ArXiv
- 2019

This paper proposes GRAM, a generative model for graphs that is scalable in all three contexts, especially in training, and aims to achieve scalability by employing a novel graph attention mechanism, formulating the likelihood of graphs in a simple and general manner.

A Systematic Survey on Deep Generative Models for Graph Generation

- Computer Science, Mathematics
- ArXiv
- 2020

An extensive overview of the literature on deep generative models for graph generation is provided, and two taxonomies of deep generative models for unconditional and conditional graph generation, respectively, are proposed.

GraphGen: A Scalable Approach to Domain-agnostic Labeled Graph Generation

- Computer Science, Mathematics
- WWW
- 2020

Extensive experiments on million-sized, real graph datasets show GraphGen to be 4 times faster on average than state-of-the-art techniques while being significantly better in quality across a comprehensive set of 11 different metrics.

GraphGAN: Graph Representation Learning with Generative Adversarial Nets

- Computer Science, Mathematics
- AAAI
- 2018

GraphGAN is proposed, an innovative graph representation learning framework unifying the above two classes of methods, in which the generative model and the discriminative model play a game-theoretical minimax game.

Graphite: Iterative Generative Modeling of Graphs

- Computer Science, Mathematics
- ICML
- 2019

This work proposes Graphite, an algorithmic framework for unsupervised learning of node representations in large graphs using deep latent variable generative models. It parameterizes variational autoencoders (VAEs) with graph neural networks and uses a novel iterative graph refinement strategy, inspired by low-rank approximations, for decoding.

Graph generation by sequential edge prediction

- Computer Science
- ESANN
- 2019

A recurrent Deep Learning based model is proposed that generates graphs by learning to predict their ordered edge sequence, outperforming canonical generative models from graph theory and reaching performance comparable to the current state of the art in graph generation.

Learning Graphical State Transitions

- Computer Science
- ICLR
- 2017

The Gated Graph Transformer Neural Network (GGTNN) is introduced, an extension of GGS-NNs that uses graph-structured data as an intermediate representation; it can learn to construct and modify graphs in sophisticated ways based on textual input, and to use the graphs to produce a variety of outputs.