
Graph Autoencoder for Graph Compression and Representation Learning
Apr 1, 2021 · We propose a novel Graph Autoencoder structure, MIAGAE, which achieves state-of-the-art performance on graph compression and representation learning tasks.
Pangyk/Graph_AE - GitHub
This repository contains an implementation of the models introduced in the paper Graph Autoencoder for Graph Compression and Representation Learning by Ge et al. The model …
HC-GAE: The Hierarchical Cluster-based Graph Auto-Encoder for Graph ...
Graph Auto-Encoders (GAEs) are powerful tools for graph representation learning. In this paper, we develop a novel Hierarchical Cluster-based GAE (HC-GAE), that can learn effective …
In this paper, we propose an auto graph encoder-decoder model compression (AGMC) method combined with graph neural networks (GNN) and reinforcement learning (RL) to find the best …
Graph Convolutional Networks With Autoencoder-Based Compression …
Apr 27, 2022 · The proposed architecture, named Autoencoder-Aided GCN (AA-GCN), compresses the convolutional features in an information-rich embedding at multiple hidden …
Spotlight 8: Yunhao Ge, Graph Autoencoder for Graph Compression …
To fill this gap, we propose Multi-kernel Inductive Attention Graph Autoencoder (MIAGAE), which, instead of compressing nodes/edges separately, utilizes the node similarity and graph …
Graph Autoencoders: A Deep Learning Approach
Graph Generation: Given a set of example graphs, we can use a graph autoencoder to learn a compressed representation of the graphs that captures their most important features. We can …
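The idea in the snippet above — learning a compressed representation of a graph and reconstructing it — can be sketched with the common GCN-encoder / inner-product-decoder design (as in Kipf & Welling's GAE). This is a minimal illustrative sketch in NumPy, not the MIAGAE architecture from the paper; the layer sizes and the toy path graph are assumptions for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def encode(A, X, W):
    # One GCN layer: compresses node features into low-dim embeddings Z
    return np.tanh(normalize_adj(A) @ X @ W)

def decode(Z):
    # Inner-product decoder: reconstructed edge probabilities
    return sigmoid(Z @ Z.T)

# Toy 4-node path graph 0-1-2-3 with identity node features (assumed example)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))  # compress 4-dim features to a 2-dim embedding

Z = encode(A, X, W)      # (4, 2) compressed node representations
A_rec = decode(Z)        # (4, 4) reconstructed adjacency probabilities
```

Training would then minimize a reconstruction loss (e.g. binary cross-entropy between `A_rec` and `A`) with respect to `W`; sampling new latent codes and decoding them is what enables graph generation.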
Deep graph embedding learning based on multi-variational graph ...
propose a dual-decoder graph autoencoder model (DGAE) for unsupervised graph representation learning, which encodes the graph structure and node attributes with GCNs, …
Attention Graph Autoencoder (MIAGAE), a GAE structure with GNN for graph compression and graph representation learning (Hamilton et al., 2017b; Goyal & Ferrara, 2018) (Fig. 1). …