Graph Attention Networks: A Comprehensive Review of Methods and Applications
Abstract
1. Introduction
2. Graph Neural Networks
2.1. Graph Convolution Networks
2.2. Graph Attention Networks
2.3. Graph Attention Network Version 2 (GATv2)
3. Graph Attention Network Categories
3.1. Global Attention Networks
3.2. Multi-Layer Graph Attention Networks
3.3. Graph-Embedding GATs
3.4. Spatial GATs
3.5. Variational GATs
3.6. Hybrid GATs
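The contrast between Sections 2.2 and 2.3 comes down to where the nonlinearity sits in the attention score: GAT applies the LeakyReLU after the attention vector, while GATv2 applies it before, which yields the "dynamic" attention argued for by Brody et al. A self-contained sketch of the two scoring functions followed by the neighborhood softmax (all weights, feature sizes, and the toy neighborhood are illustrative assumptions, not taken from any surveyed model):

```python
import math
import random

random.seed(0)
F_IN, F_OUT = 4, 3  # toy feature sizes

def rand_matrix(rows, cols):
    return [[random.uniform(-1, 1) for _ in range(cols)] for _ in range(rows)]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def leaky_relu(x, slope=0.2):
    return x if x > 0 else slope * x

# --- GAT (Velickovic et al.): e_ij = LeakyReLU(a^T [W h_i || W h_j]) ---
W = rand_matrix(F_OUT, F_IN)                      # shared linear transform
a = [random.uniform(-1, 1) for _ in range(2 * F_OUT)]

def gat_score(h_i, h_j):
    z = matvec(W, h_i) + matvec(W, h_j)           # concatenation [Wh_i || Wh_j]
    return leaky_relu(sum(ai * zi for ai, zi in zip(a, z)))

# --- GATv2 (Brody et al.): e_ij = a^T LeakyReLU(W [h_i || h_j]) ---
W2 = rand_matrix(F_OUT, 2 * F_IN)                 # transforms the concatenation
a2 = [random.uniform(-1, 1) for _ in range(F_OUT)]

def gatv2_score(h_i, h_j):
    z = matvec(W2, h_i + h_j)
    return sum(ai * leaky_relu(zi) for ai, zi in zip(a2, z))

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Attention coefficients of node 0 over a toy neighborhood {0, 1, 2}
h = [[random.uniform(-1, 1) for _ in range(F_IN)] for _ in range(3)]
alpha = softmax([gat_score(h[0], h_j) for h_j in h])
alpha_v2 = softmax([gatv2_score(h[0], h_j) for h_j in h])
```

Because GATv2 applies the nonlinearity before the dot product with `a2`, the ranking of neighbors can change per query node, whereas in GAT the attention vector induces an essentially static ranking.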
| Model Name | Year | Citation Count | Cites Per Year |
|---|---|---|---|
| N/A [29] | 2019 | 115 | 28.75 |
| MGAT [84] | 2020 | 44 | 14.67 |
| N/A [68] | 2022 | 40 | 40.00 |
| WFCG [114] | 2022 | 39 | 39.00 |
| HGAT [77] | 2020 | 37 | 12.33 |
| HGAT [28] | 2021 | 34 | 17.00 |
| MGAT [83] | 2020 | 32 | 10.67 |
| N/A [123] | 2020 | 31 | 10.33 |
| MAGAT [124] | 2021 | 28 | 14.00 |
| N/A [125] | 2021 | 25 | 12.50 |
| GATrust [43] | 2022 | 25 | 25.00 |
| N/A [103] | 2022 | 23 | 23.00 |
| GATMDA [126] | 2021 | 22 | 11.00 |
| MGA-Net [52] | 2022 | 20 | 20.00 |
| Hawk [127] | 2021 | 19 | 9.50 |
| ResGAT [26] | 2021 | 19 | 9.50 |
| MEGAN [122] | 2021 | 19 | 9.50 |
| GANLDA [40] | 2022 | 19 | 19.00 |
| SRGAT [128] | 2021 | 18 | 9.00 |
| PD-RGAT [109] | 2022 | 18 | 18.00 |
| HLGAT [76] | 2021 | 18 | 9.00 |
| HGAT [11] | 2020 | 16 | 5.33 |
| RRL-GAT [19] | 2022 | 16 | 16.00 |
| N/A [82] | 2021 | 16 | 8.00 |
| HEAT [18] | 2022 | 15 | 15.00 |
| ASTGAT [66] | 2022 | 14 | 14.00 |
| RA-AGAT [9] | 2022 | 14 | 14.00 |
| SSGAT [72] | 2022 | 12 | 12.00 |
| MDGAT [25] | 2021 | 12 | 6.00 |
| FTPG [65] | 2022 | 12 | 12.00 |
| GCHGAT [49] | 2022 | 11 | 11.00 |
| HGATLDA [50] | 2022 | 10 | 10.00 |
| STGGAT [70] | 2022 | 10 | 10.00 |
| PSCR [81] | 2021 | 10 | 5.00 |
| EGAT [71] | 2022 | 10 | 10.00 |
| KGAT [35] | 2022 | 10 | 10.00 |
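The "Cites Per Year" column is consistent with the citation count divided by the number of years since publication, counted up to 2023 and rounded to two decimals — a derivation inferred from the table values rather than stated explicitly:

```python
# Reference year 2023 is an assumption inferred from the table
# (e.g., 2019 papers divide by 4, 2022 papers by 1).
REFERENCE_YEAR = 2023

def cites_per_year(citations: int, pub_year: int, ref_year: int = REFERENCE_YEAR) -> float:
    """Average citations per year since publication, rounded as in the table."""
    years = ref_year - pub_year
    return round(citations / years, 2)

# Spot-checking rows from the table:
print(cites_per_year(115, 2019))  # 28.75
print(cites_per_year(44, 2020))   # 14.67
print(cites_per_year(40, 2022))   # 40.0
```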
4. Applications of Graph Attention Networks
4.1. Recommendation
4.2. Biomarker–Disease Association
4.3. Sentiment Analysis
4.4. Image Analysis
4.5. Anomaly Detection
5. Discussion
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Barabási, A.L. Network Science; Cambridge University Press: Cambridge, UK, 2016. [Google Scholar]
- Labonne, M. Hands-On Graph Neural Networks Using Python; Packt: Birmingham, UK, 2023. [Google Scholar]
- Kipf, T.N.; Welling, M. Semi-Supervised Classification with Graph Convolutional Networks. arXiv 2016, arXiv:1609.02907. [Google Scholar]
- Abdel-Basset, M.; Moustafa, N.; Hawash, H.; Tari, Z. Responsible Graph Neural Networks; CRC Press: Boca Raton, FL, USA, 2023. [Google Scholar]
- Liu, Y.; Yang, S.; Xu, Y.; Miao, C.; Wu, M.; Zhang, J. Contextualized Graph Attention Network for Recommendation With Item Knowledge Graph. IEEE Trans. Knowl. Data Eng. 2023, 35, 181–195. [Google Scholar] [CrossRef]
- Shan, Y.; Che, C.; Wei, X.; Wang, X.; Zhu, Y.; Jin, B. Bi-graph attention network for aspect category sentiment classification. Knowl.-Based Syst. 2022, 258, 109972. [Google Scholar] [CrossRef]
- Yang, Z.; Liu, J.; Shah, H.A.; Feng, J. A novel hybrid framework for metabolic pathways prediction based on the graph attention network. BMC Bioinform. 2022, 23, 329. [Google Scholar] [CrossRef] [PubMed]
- Fang, L.; Sun, T.; Wang, S.; Fan, H.; Li, J. A graph attention network for road marking classification from mobile LiDAR point clouds. Int. J. Appl. Earth Obs. Geoinf. 2022, 108, 102735. [Google Scholar] [CrossRef]
- Feng, S.; Xu, C.; Zuo, Y.; Chen, G.; Lin, F.; XiaHou, J. Relation-aware dynamic attributed graph attention network for stocks recommendation. Pattern Recognit. 2022, 121, 108119. [Google Scholar] [CrossRef]
- Qin, C.; Zhang, Y.; Liu, Y.; Coleman, S.; Du, H.; Kerr, D. A visual place recognition approach using learnable feature map filtering and graph attention networks. Neurocomputing 2021, 457, 277–292. [Google Scholar] [CrossRef]
- Li, K.; Feng, Y.; Gao, Y.; Qiu, J. Hierarchical graph attention networks for semi-supervised node classification. Appl. Intell. 2020, 50, 3441–3451. [Google Scholar] [CrossRef]
- Rassil, A.; Chougrad, H.; Zouaki, H. Holistic Graph Neural Networks based on a global-based attention mechanism. Knowl.-Based Syst. 2022, 240, 108105. [Google Scholar] [CrossRef]
- Hsu, Y.L.; Tsai, Y.C.; Li, C.T. FinGAT: Financial Graph Attention Networks for Recommending Top-K Profitable Stocks. IEEE Trans. Knowl. Data Eng. 2023, 35, 469–481. [Google Scholar] [CrossRef]
- Ye, Y.; Ji, S. Sparse Graph Attention Networks. IEEE Trans. Knowl. Data Eng. 2023, 35, 905–916. [Google Scholar] [CrossRef]
- Xu, Y.; Fang, Y.; Huang, C.; Liu, Z. HGHAN: Hacker group identification based on heterogeneous graph attention network. Inf. Sci. 2022, 612, 848–863. [Google Scholar] [CrossRef]
- Cao, R.; He, C.; Wei, P.; Su, Y.; Xia, J.; Zheng, C. Prediction of circRNA-Disease Associations Based on the Combination of Multi-Head Graph Attention Network and Graph Convolutional Network. Biomolecules 2022, 12, 932. [Google Scholar] [CrossRef]
- Xie, Z.; Zhu, R.; Zhao, K.; Liu, J.; Zhou, G.; Huang, J.X. Dual Gated Graph Attention Networks with Dynamic Iterative Training for Cross-Lingual Entity Alignment. ACM Trans. Inf. Syst. 2021, 40, 1165. [Google Scholar] [CrossRef]
- Mo, X.; Huang, Z.; Xing, Y.; Lv, C. Multi-Agent Trajectory Prediction With Heterogeneous Edge-Enhanced Graph Attention Network. IEEE Trans. Intell. Transp. Syst. 2022, 23, 9554–9567. [Google Scholar] [CrossRef]
- Hu, B.; Guo, K.; Wang, X.; Zhang, J.; Zhou, D. RRL-GAT: Graph Attention Network-Driven Multilabel Image Robust Representation Learning. IEEE Internet Things J. 2022, 9, 9167–9178. [Google Scholar] [CrossRef]
- Li, Z.; Zhong, T.; Huang, D.; You, Z.H.; Nie, R. Hierarchical graph attention network for miRNA-disease association prediction. Mol. Ther. 2022, 30, 1775–1786. [Google Scholar] [CrossRef] [PubMed]
- Yan, H.; Wang, J.; Chen, J.; Liu, Z.; Feng, Y. Virtual sensor-based imputed graph attention network for anomaly detection of equipment with incomplete data. J. Manuf. Syst. 2022, 63, 52–63. [Google Scholar] [CrossRef]
- Chen, L.; Cao, J.; Wang, Y.; Liang, W.; Zhu, G. Multi-view Graph Attention Network for Travel Recommendation. Expert Syst. Appl. 2022, 191, 116234. [Google Scholar] [CrossRef]
- Buterez, D.; Bica, I.; Tariq, I.; Andrés-Terré, H.; Liò, P. CellVGAE: An unsupervised scRNA-seq analysis workflow with graph attention networks. Bioinformatics 2021, 38, 1277–1286. [Google Scholar] [CrossRef]
- Hu, J.; Cao, L.; Li, T.; Dong, S.; Li, P. GAT-LI: A graph attention network based learning and interpreting method for functional brain network classification. BMC Bioinform. 2021, 22, 379. [Google Scholar] [CrossRef]
- Shi, C.; Chen, X.; Huang, K.; Xiao, J.; Lu, H.; Stachniss, C. Keypoint Matching for Point Cloud Registration Using Multiplex Dynamic Graph Attention Networks. IEEE Robot. Autom. Lett. 2021, 6, 8221–8228. [Google Scholar] [CrossRef]
- Huang, J.; Guan, L.; Su, Y.; Yao, H.; Guo, M.; Zhong, Z. A topology adaptive high-speed transient stability assessment scheme based on multi-graph attention network with residual structure. Int. J. Electr. Power Energy Syst. 2021, 130, 106948. [Google Scholar] [CrossRef]
- Ji, C.; Wang, Y.; Ni, J.; Zheng, C.; Su, Y. Predicting miRNA-Disease Associations Based on Heterogeneous Graph Attention Networks. Front. Genet. 2021, 12, 727744. [Google Scholar] [CrossRef]
- Yang, T.; Hu, L.; Shi, C.; Ji, H.; Li, X.; Nie, L. HGAT: Heterogeneous Graph Attention Networks for Semi-Supervised Short Text Classification. ACM Trans. Inf. Syst. 2021, 39, 32. [Google Scholar] [CrossRef]
- Wang, P.; Wu, Q.; Cao, J.; Shen, C.; Gao, L.; Hengel, A.v.d. Neighbourhood Watch: Referring Expression Comprehension via Language-Guided Graph Attention Networks. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 1960–1968. [Google Scholar] [CrossRef]
- Huang, J.; Luo, K.; Cao, L.; Wen, Y.; Zhong, S. Learning Multiaspect Traffic Couplings by Multirelational Graph Attention Networks for Traffic Prediction. IEEE Trans. Intell. Transp. Syst. 2022, 23, 20681–20695. [Google Scholar] [CrossRef]
- Li, Z.; Zhao, Y.; Zhang, Y.; Zhang, Z. Multi-relational graph attention networks for knowledge graph completion. Knowl.-Based Syst. 2022, 251, 109262. [Google Scholar] [CrossRef]
- Wang, W.; Chen, H. Predicting miRNA-disease associations based on graph attention networks and dual Laplacian regularized least squares. Briefings Bioinform. 2022, 23, bbac292. [Google Scholar] [CrossRef]
- Zhao, K.; Liu, J.; Xu, Z.; Liu, X.; Xue, L.; Xie, Z.; Zhou, Y.; Wang, X. Graph4Web: A relation-aware graph attention network for web service classification. J. Syst. Softw. 2022, 190, 111324. [Google Scholar] [CrossRef]
- Yuan, J.; Cao, M.; Cheng, H.; Yu, H.; Xie, J.; Wang, C. A unified structure learning framework for graph attention networks. Neurocomputing 2022, 495, 194–204. [Google Scholar] [CrossRef]
- Shimizu, R.; Matsutani, M.; Goto, M. An explainable recommendation framework based on an improved knowledge graph attention network with massive volumes of side information. Knowl.-Based Syst. 2022, 239, 107970. [Google Scholar] [CrossRef]
- Jiang, L.; Sun, J.; Wang, Y.; Ning, Q.; Luo, N.; Yin, M. Identifying drug–target interactions via heterogeneous graph attention networks combined with cross-modal similarities. Briefings Bioinform. 2022, 23, bbac016. [Google Scholar] [CrossRef] [PubMed]
- Safai, A.; Vakharia, N.; Prasad, S.; Saini, J.; Shah, A.; Lenka, A.; Pal, P.K.; Ingalhalikar, M. Multimodal Brain Connectomics-Based Prediction of Parkinson’s Disease Using Graph Attention Networks. Front. Neurosci. 2022, 15, 741489. [Google Scholar] [CrossRef] [PubMed]
- Zhao, Z.; Yang, B.; Li, G.; Liu, H.; Jin, Z. Precise Learning of Source Code Contextual Semantics via Hierarchical Dependence Structure and Graph Attention Networks. J. Syst. Softw. 2022, 184, 111108. [Google Scholar] [CrossRef]
- Long, Y.; Zhang, Y.; Wu, M.; Peng, S.; Kwoh, C.K.; Luo, J.; Li, X. Heterogeneous graph attention networks for drug virus association prediction. Methods 2022, 198, 11–18. [Google Scholar] [CrossRef]
- Lan, W.; Wu, X.; Chen, Q.; Peng, W.; Wang, J.; Chen, Y.P. GANLDA: Graph attention network for lncRNA-disease associations prediction. Neurocomputing 2022, 469, 384–393. [Google Scholar] [CrossRef]
- Wang, X.; Liu, X.; Wu, H.; Liu, J.; Chen, X.; Xu, Z. Jointly learning invocations and descriptions for context-aware mashup tagging with graph attention network. World Wide Web 2022, 26, 1295–1322. [Google Scholar] [CrossRef]
- Long, J.; Zhang, R.; Yang, Z.; Huang, Y.; Liu, Y.; Li, C. Self-Adaptation Graph Attention Network via Meta-Learning for Machinery Fault Diagnosis With Few Labeled Data. IEEE Trans. Instrum. Meas. 2022, 71, 1–11. [Google Scholar] [CrossRef]
- Jiang, N.; Jie, W.; Li, J.; Liu, X.; Jin, D. GATrust: A Multi-Aspect Graph Attention Network Model for Trust Assessment in OSNs. IEEE Trans. Knowl. Data Eng. 2022, 35, 5865–5878. [Google Scholar] [CrossRef]
- Zhou, Y.; Shen, J.; Zhang, X.; Yang, W.; Han, T.; Chen, T. Automatic source code summarization with graph attention networks. J. Syst. Softw. 2022, 188, 111257. [Google Scholar] [CrossRef]
- Feng, Y.Y.; Yu, H.; Feng, Y.H.; Shi, J.Y. Directed graph attention networks for predicting asymmetric drug–drug interactions. Briefings Bioinform. 2022, 23, bbac151. [Google Scholar] [CrossRef]
- Xu, D.; Alameda-Pineda, X.; Ouyang, W.; Ricci, E.; Wang, X.; Sebe, N. Probabilistic Graph Attention Network With Conditional Kernels for Pixel-Wise Prediction. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 2673–2688. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Z.; Huang, J.; Tan, Q. Association Rules Enhanced Knowledge Graph Attention Network. Knowl.-Based Syst. 2022, 239, 108038. [Google Scholar] [CrossRef]
- Lai, B.; Xu, J. Accurate protein function prediction via graph attention networks with predicted structure information. Briefings Bioinform. 2021, 23, bbab502. [Google Scholar] [CrossRef] [PubMed]
- Lan, W.; Dong, Y.; Chen, Q.; Zheng, R.; Liu, J.; Pan, Y.; Chen, Y.P.P. KGANCDA: Predicting circRNA-disease associations based on knowledge graph attention network. Briefings Bioinform. 2021, 23, bbab494. [Google Scholar] [CrossRef]
- Zhao, X.; Zhao, X.; Yin, M. Heterogeneous graph attention network based on meta-paths for lncRNA–disease association prediction. Briefings Bioinform. 2021, 23, bbab407. [Google Scholar] [CrossRef]
- Zhao, Y.; Zhou, H.; Zhang, A.; Xie, R.; Li, Q.; Zhuang, F. Connecting Embeddings Based on Multiplex Relational Graph Attention Networks for Knowledge Graph Entity Typing. IEEE Trans. Knowl. Data Eng. 2023, 35, 4608–4620. [Google Scholar] [CrossRef]
- Yang, M.; Bai, X.; Wang, L.; Zhou, F. Mixed Loss Graph Attention Network for Few-Shot SAR Target Classification. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–13. [Google Scholar] [CrossRef]
- Chen, H.; Mei, X.; Ma, Z.; Wu, X.; Wei, Y. Spatial–temporal graph attention network for video anomaly detection. Image Vis. Comput. 2023, 131, 104629. [Google Scholar] [CrossRef]
- Gao, J.; Gao, J.; Ying, X.; Lu, M.; Wang, J. Higher-order Interaction Goes Neural: A Substructure Assembling Graph Attention Network for Graph Classification. IEEE Trans. Knowl. Data Eng. 2021, 35, 1594–1608. [Google Scholar] [CrossRef]
- Zhao, M.; Wang, L.; Jiang, Z.; Li, R.; Lu, X.; Hu, Z. Multi-task learning with graph attention networks for multi-domain task-oriented dialogue systems. Knowl.-Based Syst. 2023, 259, 110069. [Google Scholar] [CrossRef]
- Guan, X.; Xing, W.; Li, J.; Wu, H. HGAT-VCA: Integrating high-order graph attention network with vector cellular automata for urban growth simulation. Comput. Environ. Urban Syst. 2023, 99, 101900. [Google Scholar] [CrossRef]
- Yang, J.; Yang, L.T.; Wang, H.; Gao, Y. Multirelational Tensor Graph Attention Networks for Knowledge Fusion in Smart Enterprise Systems. IEEE Trans. Ind. Inform. 2023, 19, 616–625. [Google Scholar] [CrossRef]
- Wang, L.; Zhong, C. gGATLDA: LncRNA-disease association prediction based on graph-level graph attention network. BMC Bioinform. 2022, 23, 11. [Google Scholar] [CrossRef] [PubMed]
- Zhao, C.; Song, A.; Du, Y.; Yang, B. TrajGAT: A map-embedded graph attention network for real-time vehicle trajectory imputation of roadside perception. Transp. Res. Part C: Emerg. Technol. 2022, 142, 103787. [Google Scholar] [CrossRef]
- He, J.; Cui, J.; Zhang, G.; Xue, M.; Chu, D.; Zhao, Y. Spatial–temporal seizure detection with graph attention network and bi-directional LSTM architecture. Biomed. Signal Process. Control 2022, 78, 103908. [Google Scholar] [CrossRef]
- Zhang, X.; Xu, Y.; Shao, Y. Forecasting Traffic Flow with Spatial–Temporal Convolutional Graph Attention Networks. Neural Comput. Appl. 2022, 34, 15457–15479. [Google Scholar] [CrossRef]
- Wang, M.; Wu, L.; Li, M.; Wu, D.; Shi, X.; Ma, C. Meta-learning based spatial-temporal graph attention network for traffic signal control. Knowl.-Based Syst. 2022, 250, 109166. [Google Scholar] [CrossRef]
- Wang, Y.; Jing, C.; Xu, S.; Guo, T. Attention based spatiotemporal graph attention networks for traffic flow forecasting. Inf. Sci. 2022, 607, 869–883. [Google Scholar] [CrossRef]
- Tang, H.; Wei, P.; Li, J.; Zheng, N. EvoSTGAT: Evolving spatiotemporal graph attention networks for pedestrian trajectory prediction. Neurocomputing 2022, 491, 333–342. [Google Scholar] [CrossRef]
- Fang, M.; Tang, L.; Yang, X.; Chen, Y.; Li, C.; Li, Q. FTPG: A Fine-Grained Traffic Prediction Method With Graph Attention Network Using Big Trace Data. IEEE Trans. Intell. Transp. Syst. 2022, 23, 5163–5175. [Google Scholar] [CrossRef]
- Kong, X.; Zhang, J.; Wei, X.; Xing, W.; Lu, W. Adaptive spatial-temporal graph attention networks for traffic flow forecasting. Appl. Intell. 2022, 52, 4300–4316. [Google Scholar] [CrossRef]
- Yang, J.; Sun, X.; Wang, R.G.; Xue, L.X. PTPGC: Pedestrian trajectory prediction by graph attention network with ConvLSTM. Robot. Auton. Syst. 2022, 148, 103931. [Google Scholar] [CrossRef]
- Gao, H.; Xiao, J.; Yin, Y.; Liu, T.; Shi, J. A Mutually Supervised Graph Attention Network for Few-Shot Segmentation: The Perspective of Fully Utilizing Limited Samples. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 4826–4838. [Google Scholar] [CrossRef] [PubMed]
- Wang, Z.; Liu, C.; Gombolay, M. Heterogeneous graph attention networks for scalable multi-robot scheduling with temporospatial constraints. Auton. Robot. 2022, 46, 249–268. [Google Scholar] [CrossRef]
- Tang, J.; Zeng, J. Spatiotemporal gated graph attention network for urban traffic flow prediction based on license plate recognition data. Comput.-Aided Civ. Infrastruct. Eng. 2022, 37, 3–23. [Google Scholar] [CrossRef]
- Tian, S.; Kang, L.; Xing, X.; Tian, J.; Fan, C.; Zhang, Y. A Relation-Augmented Embedded Graph Attention Network for Remote Sensing Object Detection. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–18. [Google Scholar] [CrossRef]
- Zhao, Z.; Wang, H.; Yu, X. Spectral–Spatial Graph Attention Network for Semisupervised Hyperspectral Image Classification. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5. [Google Scholar] [CrossRef]
- Jiang, T.; Sun, J.; Liu, S.; Zhang, X.; Wu, Q.; Wang, Y. Hierarchical semantic segmentation of urban scene point clouds via group proposal and graph attention network. Int. J. Appl. Earth Obs. Geoinf. 2021, 105, 102626. [Google Scholar] [CrossRef]
- Wang, Y.; Wang, H.; He, J.; Lu, W.; Gao, S. TAGAT: Type-Aware Graph Attention neTworks for reasoning over knowledge graphs. Knowl.-Based Syst. 2021, 233, 107500. [Google Scholar] [CrossRef]
- Yu, X.; Shi, S.; Xu, L. A spatial–temporal graph attention network approach for air temperature forecasting. Appl. Soft Comput. 2021, 113, 107888. [Google Scholar] [CrossRef]
- Zhang, Z.; Zhang, H.; Liu, S. Person re-identification using heterogeneous local graph attention networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 12136–12145. [Google Scholar]
- Mi, L.; Chen, Z. Hierarchical graph attention network for visual relationship detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 13886–13895. [Google Scholar]
- Zhang, R.; Yao, W.; Shi, Z.; Zeng, L.; Tang, Y.; Wen, J. A graph attention networks-based model to distinguish the transient rotor angle instability and short-term voltage instability in power systems. Int. J. Electr. Power Energy Syst. 2022, 137, 107783. [Google Scholar] [CrossRef]
- Hao, J.; Liu, J.; Pereira, E.; Liu, R.; Zhang, J.; Zhang, Y.; Yan, K.; Gong, Y.; Zheng, J.; Zhang, J.; et al. Uncertainty-guided graph attention network for parapneumonic effusion diagnosis. Med. Image Anal. 2022, 75, 102217. [Google Scholar] [CrossRef] [PubMed]
- Yan, P.; Li, L.; Zeng, D. Quantum Probability-inspired Graph Attention Network for Modeling Complex Text Interaction. Knowl.-Based Syst. 2021, 234, 107557. [Google Scholar] [CrossRef]
- Yang, C.; Wang, P.; Tan, J.; Liu, Q.; Li, X. Autism spectrum disorder diagnosis using graph attention network based on spatial-constrained sparse functional brain networks. Comput. Biol. Med. 2021, 139, 104963. [Google Scholar] [CrossRef]
- Zhao, Y.; Zhang, G.; Dong, C.; Yuan, Q.; Xu, F.; Zheng, Y. Graph Attention Network with Focal Loss for Seizure Detection on Electroencephalography Signals. Int. J. Neural Syst. 2021, 31, 2150027. [Google Scholar] [CrossRef] [PubMed]
- Xie, Y.; Zhang, Y.; Gong, M.; Tang, Z.; Han, C. MGAT: Multi-view Graph Attention Networks. Neural Networks 2020, 132, 180–189. [Google Scholar] [CrossRef]
- Tao, Z.; Wei, Y.; Wang, X.; He, X.; Huang, X.; Chua, T.S. MGAT: Multimodal Graph Attention Network for Recommendation. Inf. Process. Manag. 2020, 57, 102277. [Google Scholar] [CrossRef]
- Li, G.; Fang, T.; Zhang, Y.; Liang, C.; Xiao, Q.; Luo, J. Predicting miRNA-disease associations based on graph attention network with multi-source information. BMC Bioinform. 2022, 23, 244. [Google Scholar] [CrossRef]
- Cai, P.; Wang, H.; Sun, Y.; Liu, M. DQ-GAT: Towards Safe and Efficient Autonomous Driving With Deep Q-Learning and Graph Attention Networks. IEEE Trans. Intell. Transp. Syst. 2022, 23, 21102–21112. [Google Scholar] [CrossRef]
- Dai, G.; Wang, X.; Zou, X.; Liu, C.; Cen, S. MRGAT: Multi-Relational Graph Attention Network for knowledge graph completion. Neural Networks 2022, 154, 234–245. [Google Scholar] [CrossRef] [PubMed]
- Wang, Y.; Lu, L.; Wu, Y.; Chen, Y. Polymorphic graph attention network for Chinese NER. Expert Syst. Appl. 2022, 203, 117467. [Google Scholar] [CrossRef]
- Zhang, X.; Ma, H.; Gao, Z.; Li, Z.; Chang, L. Exploiting cross-session information for knowledge-aware session-based recommendation via graph attention networks. Int. J. Intell. Syst. 2022, 37, 7614–7637. [Google Scholar] [CrossRef]
- Peng, S.; Nie, J.; Shu, X.; Ruan, Z.; Wang, L.; Sheng, Y.; Xuan, Q. A multi-view framework for BGP anomaly detection via graph attention network. Comput. Networks 2022, 214, 109129. [Google Scholar] [CrossRef]
- Yang, M.; Huang, Z.A.; Gu, W.; Han, K.; Pan, W.; Yang, X.; Zhu, Z. Prediction of biomarker–disease associations based on graph attention network and text representation. Briefings Bioinform. 2022, 23, bbac298. [Google Scholar] [CrossRef]
- Baul, S.; Ahmed, K.T.; Filipek, J.; Zhang, W. omicsGAT: Graph Attention Network for Cancer Subtype Analyses. Int. J. Mol. Sci. 2022, 23, 10220. [Google Scholar] [CrossRef]
- Liu, Z.; Ma, Y.; Cheng, Q.; Liu, Z. Finding Asymptomatic Spreaders in a COVID-19 Transmission Network by Graph Attention Networks. Viruses 2022, 14, 1659. [Google Scholar] [CrossRef]
- Jiang, J.; Wang, T.; Wang, B.; Ma, L.; Guan, Y. Gated Tree-based Graph Attention Network (GTGAT) for medical knowledge graph reasoning. Artif. Intell. Med. 2022, 130, 102329. [Google Scholar] [CrossRef]
- Jiang, X.; Wang, Y.; Fan, A.; Ma, J. Learning for mismatch removal via graph attention networks. ISPRS J. Photogramm. Remote Sens. 2022, 190, 181–195. [Google Scholar] [CrossRef]
- Zhou, L.; Zhao, Y.; Yang, D.; Liu, J. GCHGAT: Pedestrian trajectory prediction using group constrained hierarchical graph attention networks. Appl. Intell. 2022, 52, 11434–11447. [Google Scholar] [CrossRef]
- Inan, E. ZoKa: A fake news detection method using edge-weighted graph attention network with transfer models. Neural Comput. Appl. 2022, 34, 11669–11677. [Google Scholar] [CrossRef]
- Yang, J.; Yang, L.T.; Wang, H.; Gao, Y.; Liu, H.; Xie, X. Tensor Graph Attention Network for Knowledge Reasoning in Internet of Things. IEEE Internet Things J. 2022, 9, 9128–9137. [Google Scholar] [CrossRef]
- Li, F.; Feng, J.; Yan, H.; Jin, D.; Li, Y. Crowd Flow Prediction for Irregular Regions with Semantic Graph Attention Network. ACM Trans. Intell. Syst. Technol. 2022, 13, 81. [Google Scholar] [CrossRef]
- Shi, Y.; Zhou, K.; Li, S.; Zhou, M.; Liu, W. Heterogeneous graph attention network for food safety risk prediction. J. Food Eng. 2022, 323, 111005. [Google Scholar] [CrossRef]
- Karbalayghareh, A.; Sahin, M.; Leslie, C.S. Chromatin interaction–aware gene regulatory modeling with graph attention networks. Genome Res. 2022, 32, 930–944. [Google Scholar] [CrossRef]
- Peng, Y.; Tan, G.; Si, H.; Li, J. DRL-GAT-SA: Deep reinforcement learning for autonomous driving planning based on graph attention networks and simplex architecture. J. Syst. Archit. 2022, 126, 102505. [Google Scholar] [CrossRef]
- Yang, X.; Deng, C.; Liu, T.; Tao, D. Heterogeneous Graph Attention Network for Unsupervised Multiple-Target Domain Adaptation. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 1992–2003. [Google Scholar] [CrossRef]
- Dong, Z.; Li, Z.; Yan, Y.; Calinon, S.; Chen, F. Passive Bimanual Skills Learning From Demonstration With Motion Graph Attention Networks. IEEE Robot. Autom. Lett. 2022, 7, 4917–4923. [Google Scholar] [CrossRef]
- Mahbub, S.; Bayzid, M.S. EGRET: Edge aggregated graph attention networks and transfer learning improve protein–protein interaction site prediction. Briefings Bioinform. 2022, 23, bbab578. [Google Scholar] [CrossRef]
- Tekbiyik, K.; Yurduseven, O.; Kurt, G.K. Graph Attention Network-Based Single-Pixel Compressive Direction of Arrival Estimation. IEEE Commun. Lett. 2022, 26, 562–566. [Google Scholar] [CrossRef]
- Zhou, H.; Yang, Y.; Luo, T.; Zhang, J.; Li, S. A unified deep sparse graph attention network for scene graph generation. Pattern Recognit. 2022, 123, 108367. [Google Scholar] [CrossRef]
- Zhang, D.; Liu, Z.; Jia, W.; Liu, H.; Tan, J. Path Enhanced Bidirectional Graph Attention Network for Quality Prediction in Multistage Manufacturing Process. IEEE Trans. Ind. Inform. 2022, 18, 1018–1027. [Google Scholar] [CrossRef]
- Wu, H.; Zhang, Z.; Shi, S.; Wu, Q.; Song, H. Phrase dependency relational graph attention network for Aspect-based Sentiment Analysis. Knowl.-Based Syst. 2022, 236, 107736. [Google Scholar] [CrossRef]
- Wang, S.; Wang, F.; Qiao, S.; Zhuang, Y.; Zhang, K.; Pang, S.; Nowak, R.; Lv, Z. MSHGANMDA: Meta-Subgraphs Heterogeneous Graph Attention Network for miRNA-Disease Association Prediction. IEEE J. Biomed. Health Inform. 2022, 27, 4639–4648. [Google Scholar] [CrossRef]
- Wei, Q.; Li, Y.; Zhang, J.; Wang, F.Y. VGN: Value Decomposition With Graph Attention Networks for Multiagent Reinforcement Learning. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 182–195. [Google Scholar] [CrossRef] [PubMed]
- Wang, H.; Wang, Z.; Chen, J.; Liu, W. Graph Attention Network Model with Defined Applicability Domains for Screening PBT Chemicals. Environ. Sci. Technol. 2022, 56, 6774–6785. [Google Scholar] [CrossRef] [PubMed]
- Chen, Y.; Yan, J.; Jiang, M.; Zhang, T.; Zhao, Z.; Zhao, W.; Zheng, J.; Yao, D.; Zhang, R.; Kendrick, K.M.; et al. Adversarial Learning Based Node-Edge Graph Attention Networks for Autism Spectrum Disorder Identification. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 7275–7286. [Google Scholar] [CrossRef]
- Dong, Y.; Liu, Q.; Du, B.; Zhang, L. Weighted Feature Fusion of Convolutional Neural Network and Graph Attention Network for Hyperspectral Image Classification. IEEE Trans. Image Process. 2022, 31, 1559–1572. [Google Scholar] [CrossRef]
- Li, X.; Tan, J.; Wang, P.; Liu, H.; Li, Z.; Wang, W. Anatomically constrained squeeze-and-excitation graph attention network for cortical surface parcellation. Comput. Biol. Med. 2022, 140, 105113. [Google Scholar] [CrossRef]
- Liu, S.; Duan, L.; Zhang, Z.; Cao, X.; Durrani, T.S. Ground-Based Remote Sensing Cloud Classification via Context Graph Attention Network. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–11. [Google Scholar] [CrossRef]
- Xu, C.; Cai, L.; Gao, J. An efficient scRNA-seq dropout imputation method using graph attention network. BMC Bioinform. 2021, 22, 582. [Google Scholar] [CrossRef] [PubMed]
- Yang, Y.; Walker, T.M.; Kouchaki, S.; Wang, C.; Peto, T.E.A.; Crook, D.W.; Consortium, C.; Clifton, D.A. An end-to-end heterogeneous graph attention network for Mycobacterium tuberculosis drug-resistance prediction. Briefings Bioinform. 2021, 22, bbab299. [Google Scholar] [CrossRef] [PubMed]
- Zeng, J.; Liu, T.; Jia, W.; Zhou, J. Fine-grained Question-Answer sentiment classification with hierarchical graph attention network. Neurocomputing 2021, 457, 214–224. [Google Scholar] [CrossRef]
- Shao, Y.; Li, R.; Hu, B.; Wu, Y.; Zhao, Z.; Zhang, H. Graph Attention Network-Based Multi-Agent Reinforcement Learning for Slicing Resource Management in Dense Cellular Network. IEEE Trans. Veh. Technol. 2021, 70, 10792–10803. [Google Scholar] [CrossRef]
- Ji, C.; Liu, Z.; Wang, Y.; Ni, J.; Zheng, C. GATNNCDA: A method based on graph attention network and multi-layer neural network for predicting circRNA-disease associations. Int. J. Mol. Sci. 2021, 22, 8505. [Google Scholar] [CrossRef] [PubMed]
- Sacha, M.; Błaz, M.; Byrski, P.; Dabrowski-Tumanski, P.; Chrominski, M.; Loska, R.; Włodarczyk-Pruszynski, P.; Jastrzebski, S. Molecule edit graph attention network: Modeling chemical reactions as sequences of graph edits. J. Chem. Inf. Model. 2021, 61, 3273–3284. [Google Scholar] [CrossRef]
- Wang, Z.; Gombolay, M. Learning Scheduling Policies for Multi-Robot Coordination With Graph Attention Networks. IEEE Robot. Autom. Lett. 2020, 5, 4509–4516. [Google Scholar] [CrossRef]
- Li, Q.; Lin, W.; Liu, Z.; Prorok, A. Message-Aware Graph Attention Networks for Large-Scale Multi-Robot Path Planning. IEEE Robot. Autom. Lett. 2021, 6, 5533–5540. [Google Scholar] [CrossRef]
- Sha, A.; Wang, B.; Wu, X.; Zhang, L. Semisupervised Classification for Hyperspectral Images Using Graph Attention Networks. IEEE Geosci. Remote Sens. Lett. 2021, 18, 157–161. [Google Scholar] [CrossRef]
- Long, Y.; Luo, J.; Zhang, Y.; Xia, Y. Predicting human microbe–disease associations via graph attention networks with inductive matrix completion. Briefings Bioinform. 2020, 22, bbaa146. [Google Scholar] [CrossRef]
- Hei, Y.; Yang, R.; Peng, H.; Wang, L.; Xu, X.; Liu, J.; Liu, H.; Xu, J.; Sun, L. Hawk: Rapid Android Malware Detection Through Heterogeneous Graph Attention Networks. IEEE Trans. Neural Netw. Learn. Syst. 2021, 35, 4703–4717. [Google Scholar] [CrossRef] [PubMed]
- Yan, Y.; Ren, W.; Hu, X.; Li, K.; Shen, H.; Cao, X. SRGAT: Single Image Super-Resolution With Graph Attention Network. IEEE Trans. Image Process. 2021, 30, 4905–4918. [Google Scholar] [CrossRef]
- Tang, Y.; Zhang, X.; Zhai, Y.; Qin, G.; Song, D.; Huang, S.; Long, Z. Rotating Machine Systems Fault Diagnosis Using Semisupervised Conditional Random Field-Based Graph Attention Network. IEEE Trans. Instrum. Meas. 2021, 70, 1–10. [Google Scholar] [CrossRef]
- Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Liò, P.; Bengio, Y. Graph Attention Networks. arXiv 2018, arXiv:1710.10903. [Google Scholar]
- Fey, M.; Lenssen, J.E. Fast Graph Representation Learning with PyTorch Geometric. arXiv 2019, arXiv:1903.02428. [Google Scholar]
- Jayasiri, V.; Wickramanayake, N. labml.ai Annotated Paper Implementations. 2020. Available online: https://nn.labml.ai/ (accessed on 29 August 2024).
- Wang, M.; Zheng, D.; Ye, Z.; Gan, Q.; Li, M.; Song, X.; Zhou, J.; Ma, C.; Yu, L.; Gai, Y.; et al. Deep Graph Library: A Graph-Centric, Highly-Performant Package for Graph Neural Networks. arXiv 2019, arXiv:1909.01315. [Google Scholar]
- Li, M.; Zhou, J.; Hu, J.; Fan, W.; Zhang, Y.; Gu, Y.; Karypis, G. DGL-LifeSci: An Open-Source Toolkit for Deep Learning on Graphs in Life Science. ACS Omega 2021, 6, 27233–27238. [Google Scholar] [CrossRef]
- Zheng, D.; Song, X.; Ma, C.; Tan, Z.; Ye, Z.; Dong, J.; Xiong, H.; Zhang, Z.; Karypis, G. DGL-KE: Training Knowledge Graph Embeddings at Scale. In Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, New York, NY, USA, 25 July 2020; pp. 739–748. [Google Scholar]
- Dwivedi, V.P.; Joshi, C.K.; Luu, A.T.; Laurent, T.; Bengio, Y.; Bresson, X. Benchmarking Graph Neural Networks. arXiv 2020, arXiv:2003.00982. [Google Scholar]
- Wu, L.; Chen, Y.; Shen, K.; Guo, X.; Gao, H.; Li, S.; Pei, J.; Long, B. Graph Neural Networks for Natural Language Processing: A Survey. arXiv 2021, arXiv:2106.06090. [Google Scholar]
- Jin, Z.; Wang, Y.; Wang, Q.; Ming, Y.; Ma, T.; Qu, H. GNNLens: A visual analytics approach for prediction error diagnosis of graph neural networks. IEEE Trans. Vis. Comput. Graph. 2022, 29, 3024–3038. [Google Scholar] [CrossRef]
- Leontis, N.B.; Zirbel, C.L. Nonredundant 3D structure datasets for RNA knowledge extraction and benchmarking. RNA 3D Struct. Anal. Predict. 2012, 27, 281–298. [Google Scholar]
- Han, H.; Zhao, T.; Yang, C.; Zhang, H.; Liu, Y.; Wang, X.; Shi, C. OpenHGNN: An open source toolkit for heterogeneous graph neural network. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, Atlanta, GA, USA, 17–22 October 2022; pp. 3993–3997. [Google Scholar]
- Zhou, H.; Zheng, D.; Nisa, I.; Ioannidis, V.; Song, X.; Karypis, G. TGL: A General Framework for Temporal GNN Training on Billion-Scale Graphs. Proc. VLDB Endow. 2022, 15. [Google Scholar] [CrossRef]
- Sammut, C.; Webb, G.I. Encyclopedia of Machine Learning; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2011. [Google Scholar]
- Gao, C.; Zheng, Y.; Li, N.; Li, Y.; Qin, Y.; Piao, J.; Quan, Y.; Chang, J.; Jin, D.; He, X.; et al. A Survey of Graph Neural Networks for Recommender Systems: Challenges, Methods, and Directions. ACM Trans. Recomm. Syst. 2023, 1, 3. [Google Scholar] [CrossRef]
- Ameen, T.; Ali, A.A. Graph Attention Network for Movie Recommendation. Int. J. Intell. Eng. Syst. 2022, 15, 49. [Google Scholar] [CrossRef]
- Xu, A.; Zhong, P.; Kang, Y.; Duan, J.; Wang, A.; Lu, M.; Shi, C. THAN: Multimodal Transportation Recommendation With Heterogeneous Graph Attention Networks. IEEE Trans. Intell. Transp. Syst. 2023, 24, 1533–1543. [Google Scholar] [CrossRef]
- Wang, C.; Ren, J.; Liang, H. MSGraph: Modeling multi-scale K-line sequences with graph attention network for profitable indices recommendation. Electron. Res. Arch. 2023, 31, 2626–2650. [Google Scholar] [CrossRef]
- Jin, Y.; Ji, W.; Shi, Y.; Wang, X.; Yang, X. Meta-path guided graph attention network for explainable herb recommendation. Health Inf. Sci. Syst. 2023, 11, 5. [Google Scholar] [CrossRef]
- Li, X.; Zhang, X.; Wang, P.; Cao, Z. Web Services Recommendation Based on Metapath-Guided Graph Attention Network. J. Supercomput. 2022, 78, 12621–12647. [Google Scholar] [CrossRef]
- Xie, F.; Xu, Y.; Zheng, A.; Chen, L.; Zheng, Z. Service recommendation through graph attention network in heterogeneous information networks. Int. J. Comput. Sci. Eng. 2022, 25, 643. [Google Scholar] [CrossRef]
- Lu, W.; Jiang, N.; Jin, D.; Chen, H.; Liu, X. Learning Distinct Relationship in Package Recommendation With Graph Attention Networks. IEEE Trans. Comput. Soc. Syst. 2022, 10, 3308–3320. [Google Scholar] [CrossRef]
- Song, T.; Guo, F.; Jiang, H.; Ma, W.; Feng, Z.; Guo, L. HGAT-BR: Hyperedge-based graph attention network for basket recommendation. Appl. Intell. 2022, 53, 1435–1451. [Google Scholar] [CrossRef]
- Kouhsar, M.; Kashaninia, E.; Mardani, B.; Rabiee, H.R. CircWalk: A novel approach to predict CircRNA-disease association based on heterogeneous network representation learning. BMC Bioinform. 2022, 23, 331. [Google Scholar] [CrossRef]
- Aznaourova, M.; Schmerer, N.; Schmeck, B.; Schulte, L.N. Disease-Causing Mutations and Rearrangements in Long Non-coding RNA Gene Loci. Front Genet 2020, 11, 527484. [Google Scholar] [CrossRef] [PubMed]
- Bartel, D.P. Metazoan MicroRNAs. Cell 2018, 173, 20–51. [Google Scholar] [CrossRef] [PubMed]
- Ozata, D.M.; Gainetdinov, I.; Zoch, A.; O’Carroll, D.; Zamore, P.D. PIWI-interacting RNAs: Small RNAs with big functions. Nat. Rev. Genet. 2019, 20, 89–108. [Google Scholar] [CrossRef] [PubMed]
- Peng, L.; Yang, C.; Chen, Y.; Liu, W. Predicting CircRNA-Disease associations via feature convolution learning with heterogeneous graph attention network. IEEE J. Biomed. Health Inform. 2023, 27, 3072–3082. [Google Scholar] [CrossRef] [PubMed]
- Zhao, X.; Wu, J.; Zhao, X.; Yin, M. Multi-view contrastive heterogeneous graph attention network for lncRNA–disease association prediction. Briefings Bioinform. 2022, 24, bbac548. [Google Scholar] [CrossRef]
- Zhao, H.; Li, Z.; You, Z.H.; Nie, R.; Zhong, T. Predicting MiRNA-Disease Associations Based on Neighbor Selection Graph Attention Networks. IEEE/ACM Trans. Comput. Biol. Bioinform. 2023, 20, 1298–1307. [Google Scholar] [CrossRef]
- Zheng, K.; Zhang, X.L.; Wang, L.; You, Z.H.; Zhan, Z.H.; Li, H.Y. Line graph attention networks for predicting disease-associated Piwi-interacting RNAs. Briefings Bioinform. 2022, 23, bbac393. [Google Scholar] [CrossRef]
- Dayun, L.; Junyi, L.; Yi, L.; Qihua, H.; Deng, L. MGATMDA: Predicting microbe-disease associations via multi-component graph attention network. IEEE/ACM Trans. Comput. Biol. Bioinform. 2021, 19, 3578–3585. [Google Scholar] [CrossRef]
- Lu, J.; Shi, L.; Liu, G.; Zhan, X. Dual-Channel Edge-Featured Graph Attention Networks for Aspect-Based Sentiment Analysis. Electronics 2023, 12, 624. [Google Scholar] [CrossRef]
- Miao, Y.; Luo, R.; Zhu, L.; Liu, T.; Zhang, W.; Cai, G.; Zhou, M. Contextual Graph Attention Network for Aspect-Level Sentiment Classification. Mathematics 2022, 10, 2473. [Google Scholar] [CrossRef]
- Wang, P.; Zhao, Z. Improving context and syntactic dependency for aspect-based sentiment analysis using a fused graph attention network. Evol. Intell. 2023, 17, 589–598. [Google Scholar] [CrossRef]
- Wang, Y.; Yang, N.; Miao, D.; Chen, Q. Dual-channel and multi-granularity gated graph attention network for aspect-based sentiment analysis. Appl. Intell. 2022, 53, 13145–13157. [Google Scholar] [CrossRef]
- Yuan, L.; Wang, J.; Yu, L.C.; Zhang, X. Syntactic Graph Attention Network for Aspect-Level Sentiment Analysis. IEEE Trans. Artif. Intell. 2022, 5, 140–153. [Google Scholar] [CrossRef]
- Zhang, X.; Yu, L.; Tian, S. BGAT: Aspect-based sentiment analysis based on bidirectional GRU and graph attention network. J. Intell. Fuzzy Syst. 2023, 44, 3115–3126. [Google Scholar] [CrossRef]
- Zhou, X.; Zhang, T.; Cheng, C.; Song, S. Dynamic multichannel fusion mechanism based on a graph attention network and BERT for aspect-based sentiment classification. Appl. Intell. 2022, 53, 6800–6813. [Google Scholar] [CrossRef]
- Leng, J.; Tang, X. Graph Attention Networks for Multiple Pairs of Entities and Aspects Sentiment Analysis in Long Texts. J. Syst. Sci. Inf. 2022, 10, 203–215. [Google Scholar] [CrossRef]
- Xu, K.; Zhao, Y.; Zhang, L.; Gao, C.; Huang, H. Spectral–Spatial Residual Graph Attention Network for Hyperspectral Image Classification. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5. [Google Scholar] [CrossRef]
- Liu, C.; Dong, Y. CNN-Enhanced graph attention network for hyperspectral image super-resolution using non-local self-similarity. Int. J. Remote Sens. 2022, 43, 4810–4835. [Google Scholar] [CrossRef]
- Shen, W. A Novel Conditional Generative Adversarial Network Based On Graph Attention Network For Moving Image Denoising. J. Appl. Sci. Eng. 2022, 26, 829–839. [Google Scholar] [CrossRef]
- Shuai, W.; Jiang, F.; Zheng, H.; Li, J. MSGATN: A Superpixel-Based Multi-Scale Siamese Graph Attention Network for Change Detection in Remote Sensing Images. Appl. Sci. 2022, 12, 5158. [Google Scholar] [CrossRef]
- Zhou, W.; Xia, Z.; Dou, P.; Su, T.; Hu, H. Double Attention Based on Graph Attention Network for Image Multi-Label Classification. ACM Trans. Multimed. Comput. Commun. Appl. 2023, 19, 1–23. [Google Scholar] [CrossRef]
- Chandola, V.; Banerjee, A.; Kumar, V. Anomaly Detection: A Survey. ACM Comput. Surv. 2009, 41, 15. [Google Scholar] [CrossRef]
- Al-Saffar, A.; Guo, L.; Abbosh, A. Graph Attention Network in Microwave Imaging for Anomaly Localization. IEEE J. Electromagn. RF Microwaves Med. Biol. 2022, 6, 212–218. [Google Scholar] [CrossRef]
- Ding, C.; Sun, S.; Zhao, J. MST-GAT: A multimodal spatial–temporal graph attention network for time series anomaly detection. Inf. Fusion 2023, 89, 527–536. [Google Scholar] [CrossRef]
- Zhou, L.; Zeng, Q.; Li, B. Hybrid Anomaly Detection via Multihead Dynamic Graph Attention Networks for Multivariate Time Series. IEEE Access 2022, 10, 40967–40978. [Google Scholar] [CrossRef]
- Li, D.; Ru, Y.; Liu, J. GATBoost: Mining graph attention networks-based important substructures of polymers for a better property prediction. Mater. Today Commun. 2024, 38, 107577. [Google Scholar] [CrossRef]
- Wu, H.; Liu, J.; Jiang, T.; Zou, Q.; Qi, S.; Cui, Z.; Tiwari, P.; Ding, Y. AttentionMGT-DTA: A multi-modal drug-target affinity prediction using graph transformer and attention mechanism. Neural Networks 2024, 169, 623–636. [Google Scholar] [CrossRef]
- Wang, C.; Wang, Y.; Ding, P.; Li, S.; Yu, X.; Yu, B. ML-FGAT: Identification of multi-label protein subcellular localization by interpretable graph attention networks and feature-generative adversarial networks. Comput. Biol. Med. 2024, 170, 107944. [Google Scholar] [CrossRef]
- Liao, Y.; Zhang, X.M.; Ferrie, C. Graph Neural Networks on Quantum Computers. arXiv 2024, arXiv:2405.17060. [Google Scholar]
Domain | Case Study | Problem | Applications of GATs |
---|---|---|---|
Healthcare and Bioinformatics | Drug–Drug Interaction Prediction | Predicting potential interactions between drugs is crucial for drug safety and efficacy. Traditional methods may not fully capture the complex relationships between different drugs and their effects on the human body. | GATs can model drug–drug interaction networks by treating drugs as nodes and interactions as edges. The attention mechanism helps to focus on the most relevant interactions, improving the accuracy of predictions. |
Healthcare and Bioinformatics | Protein–Protein Interaction Networks | Understanding protein interactions is essential for drug discovery and understanding biological processes. Protein–protein interaction (PPI) networks are complex and require sophisticated models to accurately predict interactions. | GATs are applied to PPI networks by treating proteins as nodes and their interactions as edges. The attention mechanism enables the model to focus on the most biologically relevant interactions, improving predictive performance. |
Social Network Analysis | Community Detection | Identifying communities within social networks is important for understanding the structure and dynamics of social groups. Traditional methods often struggle with the overlapping and hierarchical nature of communities in large social networks. | GATs can be used to detect communities by focusing on the most influential connections within a network. The attention mechanism allows the model to distinguish between strong and weak ties, which is crucial for accurately identifying communities. |
Social Network Analysis | Fake News Detection | The spread of fake news on social media is a significant problem, and identifying fake news early is critical. Traditional methods may not effectively capture the complex relationships between users and the content they share. | GATs can be applied to social networks where nodes represent users or news articles, and edges represent interactions (e.g., shares or likes). The attention mechanism allows the model to focus on the most suspicious interactions, improving the detection of fake news. |
Finance and Economics | Fraud Detection in Financial Transactions | Detecting fraudulent transactions in financial networks is challenging due to the complex and evolving nature of financial interactions. Traditional methods may fail to capture subtle patterns indicative of fraud. | GATs can be used to model financial transaction networks, where nodes represent entities (e.g., accounts) and edges represent transactions. The attention mechanism helps in focusing on unusual patterns of transactions that are likely to be fraudulent. |
Finance and Economics | Stock Market Prediction | Predicting stock market movements involves analyzing complex relationships between different stocks, sectors, and external factors. Traditional models may not effectively capture these relationships. | GATs can be applied to stock market graphs, where nodes represent stocks and edges represent relationships (e.g., co-movement or industry links). The attention mechanism helps in identifying the most influential factors affecting stock prices. |
Natural Language Processing (NLP) | Document Classification | Classifying documents based on their content can be challenging when the documents have complex structures or when the relationships between different parts of the text are important. | GATs can be applied to document graphs, where nodes represent words or sentences, and edges represent syntactic or semantic relationships. The attention mechanism helps in focusing on the most relevant parts of the document for classification. |
Natural Language Processing (NLP) | Machine Translation | Machine translation requires understanding the relationships between words and phrases in sentences. Traditional methods may struggle to capture these relationships effectively, especially in complex sentences. | GATs can be used in translation models by treating words as nodes and their relationships as edges in a sentence graph. The attention mechanism allows the model to focus on the most important word relationships, improving translation quality. |
Autonomous Vehicles and Robotics | Traffic Flow Prediction | Predicting traffic flow in urban environments is complex due to the dynamic nature of traffic and the numerous factors that influence it, such as road networks, weather, and accidents. | GATs can be applied to traffic networks, where nodes represent intersections or road segments, and edges represent traffic flow between them. The attention mechanism allows the model to focus on the most critical road segments, improving the accuracy of traffic predictions. |
Autonomous Vehicles and Robotics | Path Planning for Autonomous Robots | Autonomous robots need to navigate complex environments, which requires efficient path planning. Traditional methods may not effectively capture the complex relationships between different parts of the environment. | GATs can be used to model the environment as a graph, where nodes represent locations and edges represent possible paths. The attention mechanism helps the robot focus on the most relevant paths for efficient navigation. |
Chemistry and Material Science | Molecular Property Prediction | Predicting the properties of molecules, such as their toxicity, reactivity, or solubility, is a key task in drug discovery and material science. Traditional models may not fully capture the complex interactions between atoms in a molecule. | GATs can be applied to molecular graphs, where nodes represent atoms and edges represent chemical bonds. The attention mechanism helps in focusing on the most important atomic interactions, improving the accuracy of property predictions. |
Telecommunications | Network Anomaly Detection | Detecting anomalies in telecommunication networks is crucial for maintaining network security and performance. Traditional methods may not effectively capture complex, evolving patterns of network traffic. | GATs can be used to model telecommunication networks, where nodes represent devices or servers, and edges represent communication links. The attention mechanism helps in focusing on abnormal patterns, improving the detection of anomalies. |
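A common thread across the case studies above is that each domain problem must first be encoded as node features plus an adjacency structure before a GAT can attend over it. The sketch below illustrates this for the molecular-property case, encoding a toy molecule as GAT input; the variable names, atom vocabulary, and one-hot featurisation are our own illustrative assumptions, not taken from any cited work:

```python
import numpy as np

# Toy molecule: the three heavy atoms of ethanol, C-C-O.
atoms = ["C", "C", "O"]          # nodes (one per atom)
bonds = [(0, 1), (1, 2)]         # undirected edges (chemical bonds)

# One-hot node features over a small atom vocabulary (an assumed
# featurisation; real pipelines add degree, charge, hybridisation, etc.).
vocab = {"C": 0, "O": 1, "N": 2}
X = np.zeros((len(atoms), len(vocab)))
for i, sym in enumerate(atoms):
    X[i, vocab[sym]] = 1.0

# Symmetric adjacency with self-loops, since GATs normally include each
# node in its own attention neighbourhood.
A = np.eye(len(atoms))
for i, j in bonds:
    A[i, j] = A[j, i] = 1.0
```

The pair (X, A) is exactly the input a GAT layer consumes; the same recipe applies to the other rows of the table, with nodes standing for accounts, users, road segments, or words instead of atoms.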
Model | Core Idea | Attention Mechanism | Advantages |
---|---|---|---|
Original GAT | Introduces attention mechanisms to graph neural networks (GNNs), allowing the model to learn the importance (attention weights) of neighboring nodes when aggregating information. The attention mechanism is applied to each pair of nodes and their edges. | Uses a single-layer feedforward neural network to compute attention scores, followed by a softmax function to normalize these scores. | Suitable for small to moderately sized graphs but can become computationally expensive for very large graphs due to the pairwise attention calculation. |
Multi-Head Attention GAT | Extends the original GAT by using multiple attention mechanisms (heads) in parallel. This allows the model to capture more complex relationships by combining different attention heads. | Each head computes its own attention scores, and the outputs are either concatenated or averaged. | Improves the expressive power and stabilizes the learning process, making the model more robust to noise. |
GATv2 | An improvement over the original GAT that redefines the attention mechanism to make it more expressive and less sensitive to the order of node pairs. | Instead of computing attention scores as a single linear combination of features, GATv2 computes them using a more flexible approach that allows for asymmetric attention scores, which better captures complex node relationships. | Provides better performance on certain tasks, particularly where the direction of the edge plays a significant role. |
Sparse GAT | A variation designed to handle large-scale graphs with many nodes and edges. Sparse GATs reduce the computational burden by focusing only on a subset of neighbors when computing attention, instead of all possible neighbors. | Often uses techniques like sampling or clustering to limit the number of neighbors considered during attention calculation. | Scalable to much larger graphs while maintaining reasonable performance, making them more practical for real-world applications like social networks or biological networks. |
Hierarchical GAT (H-GAT) | Introduces a hierarchical structure to GATs, where attention is computed at multiple levels of graph granularity. This approach captures both local and global graph structures. | Combines attention scores at different hierarchical levels, allowing the model to learn from different scales of the graph. | Particularly useful for large and complex graphs, where both micro (local node connections) and macro (overall graph structure) views are important. |
Temporal GAT | Adapts GATs for dynamic graphs where the structure evolves over time. It incorporates temporal information into the attention mechanism. | Combines traditional attention with time-aware mechanisms, such as temporal encoding or recurrent neural networks (RNNs), to handle the evolving nature of the graph. | Essential for applications like transaction networks, where the sequence and timing of interactions are crucial. |
Edge-Weighted GAT | Incorporates edge weights directly into the attention mechanism, making the model more sensitive to the strength or significance of connections between nodes. | Modifies the attention computation to include edge weights, which influence the importance of neighboring nodes during information aggregation. | Useful for graphs where edges have varying levels of importance, such as in recommendation systems or weighted social networks. |
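The attention computations summarised in the table above can be made concrete in a few lines. The following NumPy sketch implements one single-head layer with the original GAT scoring rule, plus the GATv2 variant of the score for comparison; it is a minimal illustration under our own naming and shape conventions, and omits multi-head concatenation, dropout, and the output nonlinearity:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    # LeakyReLU, the nonlinearity used inside GAT attention scores.
    return np.where(x > 0, x, slope * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_layer(H, A, W, a):
    """One single-head GAT aggregation step (original GAT scoring).
    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, F') shared linear transform; a: (2*F',) attention vector."""
    Z = H @ W                                   # transform all nodes, (N, F')
    out = np.zeros_like(Z)
    for i in range(Z.shape[0]):
        nbrs = np.nonzero(A[i])[0]              # attention neighbourhood of node i
        # e_ij = LeakyReLU(a^T [W h_i || W h_j]) for every neighbour j
        e = np.array([float(leaky_relu(a @ np.concatenate([Z[i], Z[j]])))
                      for j in nbrs])
        alpha = softmax(e)                      # normalise over the neighbourhood
        out[i] = alpha @ Z[nbrs]                # attention-weighted aggregation
    return out

def gatv2_score(a, W2, h_i, h_j):
    # GATv2 moves the nonlinearity inside the score:
    # e_ij = a^T LeakyReLU(W2 [h_i || h_j]), yielding "dynamic" attention
    # that can rank neighbours differently per query node.
    return float(a @ leaky_relu(W2 @ np.concatenate([h_i, h_j])))
```

A multi-head variant would run several independent (W, a) pairs and concatenate or average the resulting outputs, and a sparse variant would restrict `nbrs` to a sampled subset, as described in the table.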
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Vrahatis, A.G.; Lazaros, K.; Kotsiantis, S. Graph Attention Networks: A Comprehensive Review of Methods and Applications. Future Internet 2024, 16, 318. https://doi.org/10.3390/fi16090318