Multichannel Adaptive Data Mixture Augmentation for Graph Neural Networks

Zhonglin Ye, Lin Zhou, Mingyuan Li, Wei Zhang, Zhen Liu, Haixing Zhao
Copyright © 2024 | Pages: 14
DOI: 10.4018/IJDWM.349975

Abstract

Graph neural networks (GNNs) have demonstrated significant potential in analyzing complex graph-structured data. However, conventional GNNs struggle to effectively incorporate both global and local features. This paper therefore introduces a novel approach for GNNs called multichannel adaptive data mixture augmentation (MAME-GNN). It enhances the GNN by adopting a multi-channel architecture and interactive learning to capture and coordinate the interrelationships between local and global graph structures. Additionally, this paper introduces a polynomial–Gaussian mixture graph interpolation method to address the problem of homogeneous and sparse graph data; it generates diverse, nonlinearly transformed samples, improving the model's generalization ability. The proposed MAME-GNN is validated through extensive experiments on publicly available datasets. Compared to existing GNN models, MAME-GNN exhibits superior performance, significantly improving robustness and generalization.
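To illustrate the general idea of mixture-based graph interpolation mentioned in the abstract, the following minimal NumPy sketch mixes two aligned node-feature matrices using a polynomially shaped mixing weight plus an additive Gaussian perturbation. The weighting scheme, noise scale, and the function name mix_graph_features are illustrative assumptions, not the paper's actual formulation.

import numpy as np

def mix_graph_features(x_a, x_b, degree=2, noise_std=0.1, rng=None):
    # x_a, x_b:  (num_nodes, num_features) node-feature matrices of two graphs/views.
    # degree:    exponent shaping the mixing weight (assumed polynomial form).
    # noise_std: standard deviation of the additive Gaussian perturbation (assumed).
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.uniform(0.0, 1.0) ** degree                     # polynomially shaped mixing weight
    mixed = lam * x_a + (1.0 - lam) * x_b                     # convex combination of features
    mixed = mixed + rng.normal(0.0, noise_std, mixed.shape)   # Gaussian component
    return mixed, lam

# Usage: enlarge a sparse training set by mixing pairs of node-feature matrices.
x1, x2 = np.random.rand(5, 8), np.random.rand(5, 8)
x_aug, lam = mix_graph_features(x1, x2)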

Introduction

With the rapid evolution of information technology and the pervasive integration of the Internet, the graph data structure has become instrumental in modeling many structured or relational systems. In recent years, convolutional neural networks (CNNs) have gained widespread traction in image processing and computer vision (Jiao et al., 2022), encompassing tasks such as image detection and recognition (Wang & Zhu, 2023; Sobha & Latifi, 2023). Because images are regular, grid-based data, convolution operations on them lend themselves to straightforward definitions. However, the real world presents a plethora of data with irregular graph structures, including social networks, citation networks, and biological networks, where nodes represent entities and edges denote relationships between them. To leverage the potential inherent in such graph-structured data, researchers have devised a powerful deep learning tool known as graph neural networks (GNNs; Ding et al., 2019). GNNs typically adhere to a recursive message-passing framework, enabling the extraction of profound insights from graph-structured data. They have garnered significant acclaim for their exceptional performance, demonstrating remarkable success across diverse tasks such as node classification (Wang et al., 2023), link prediction (Barros et al., 2021), and graph classification (Zhou et al., 2020).
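As a point of reference for the recursive message-passing framework mentioned above, the following minimal sketch implements one generic GNN layer: mean aggregation over neighbors followed by a learnable linear transform and a ReLU. The aggregation rule and dimensions are illustrative assumptions and are not tied to any specific model discussed in this paper.

import numpy as np

def message_passing_step(adj, h, weight):
    # adj:    (n, n) adjacency matrix with 0/1 entries.
    # h:      (n, d_in) node features from the previous layer.
    # weight: (d_in, d_out) learnable projection matrix.
    deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)  # node degrees, guarded against zero
    aggregated = (adj @ h) / deg                         # mean of neighbor features
    return np.maximum(aggregated @ weight, 0.0)          # linear transform + ReLU

adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)
h0 = np.random.rand(3, 4)              # initial node features
w = np.random.rand(4, 2)               # layer weights
h1 = message_passing_step(adj, h0, w)  # updated node representations, shape (3, 2)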

The primary advantage of GNNs lies in their efficacy in processing graph-structured data, capturing intricate relationships among nodes and edges while providing a holistic understanding of a graph's global structure. Nevertheless, despite this proficiency, GNNs face certain challenges. For instance, they are susceptible to over-smoothing, whereby node representations become excessively similar over the course of multiple message-passing iterations, making it difficult for the model to differentiate between distinct nodes. Furthermore, the computational complexity of GNNs poses a challenge, especially when dealing with large-scale graphs. To address these challenges, researchers have proposed numerous enhanced GNN models and optimization techniques, which have achieved noteworthy advancements across various tasks.

However, GNNs also encounter certain challenges, prompting the emergence of data augmentation techniques (Ding et al., 2022). Data augmentation enhances a model's robustness and generalization capability by introducing diverse forms of randomness into the training process, and it has seen extensive application in fields such as computer vision and natural language processing. Due to the irregularity and non-Euclidean structure of graph data, however, the data augmentation techniques used in computer vision and natural language processing (Gao et al., 2022; Nadkarni et al., 2011; Wang et al., 2023; Bastings et al., 2017; Zhang et al., 2022) are difficult to apply directly to graphs (Bayer et al., 2023; Tsaregorodtsev & Belagiannis, 2023; Sultana & Ohashi, 2023). Consequently, researchers have turned their attention to developing data augmentation methods tailored to GNNs to address these challenges (Xia et al., 2021; Gaudelet et al., 2021).
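For context, the sketch below shows one widely used graph-specific augmentation, random edge dropping, which perturbs the graph structure rather than pixel or token values. It is a generic technique and not the augmentation method proposed in this paper.

import numpy as np

def drop_edges(adj, drop_prob=0.1, rng=None):
    # Randomly removes each edge of an undirected graph with probability drop_prob.
    rng = np.random.default_rng() if rng is None else rng
    upper = np.triu(adj, k=1)                    # consider each undirected edge once
    mask = rng.random(upper.shape) >= drop_prob  # keep an edge with probability 1 - drop_prob
    kept = upper * mask
    return kept + kept.T                         # restore symmetry

adj = np.array([[0, 1, 1],
                [1, 0, 1],
                [1, 1, 0]], dtype=float)
adj_aug = drop_edges(adj, drop_prob=0.2)         # perturbed adjacency matrix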
