Article

TransE-MTP: A New Representation Learning Method for Knowledge Graph Embedding with Multi-Translation Principles and TransE

1 College of Information Science and Engineering, Henan University of Technology, Zhengzhou 450001, China
2 Key Laboratory of Grain Information Processing and Control, Henan University of Technology, Ministry of Education, Zhengzhou 450001, China
3 Henan Engineering Research Center of Grain Condition Intelligent Detection and Application, Henan University of Technology, Zhengzhou 450001, China
* Author to whom correspondence should be addressed.
Electronics 2024, 13(16), 3171; https://doi.org/10.3390/electronics13163171
Submission received: 29 June 2024 / Revised: 1 August 2024 / Accepted: 8 August 2024 / Published: 11 August 2024

Abstract

The purpose of representation learning is to encode the entities and relations in a knowledge graph as low-dimensional, real-valued vectors through machine learning. Traditional representation learning methods such as TransE, which models a relation as a translation operating on the low-dimensional embeddings of a graph’s entities, are effective for learning the embeddings of knowledge bases, but struggle to model complex relations such as one-to-many, many-to-one, and many-to-many. To overcome this limitation, we introduce a new method for knowledge representation, reasoning, and completion based on multi-translation principles and TransE (TransE-MTP). By defining separate translation principles (MTPs) for simple one-to-one relations and for complex one-to-many, many-to-one, and many-to-many relations, and combining these principles with TransE, a typical translation-based model for multi-relational data, the proposed method, TransE-MTP, can target and optimize multiple objectives during training on complex relations, thereby providing superior prediction performance. We implement a prototype of TransE-MTP to demonstrate its effectiveness at link prediction and triplet classification on two prominent knowledge graph datasets: Freebase and WordNet. Our experimental results show that the proposed method outperforms both TransE and TransH (knowledge graph embedding by translating on hyperplanes), which confirms its effectiveness and competitiveness.
Keywords: representation learning; knowledge graph; translation principle; link prediction; triplet classification
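
Since the full text is not reproduced on this page, the following is a minimal, illustrative sketch of the ideas the abstract describes: the TransE translation principle (h + r ≈ t scored by ||h + r − t||) and a per-relation-type dispatch in the spirit of multiple translation principles. The function names (transe_score, relation_type, mtp_score), the 1.5 bucketing threshold, and the use of a TransH-style hyperplane projection for complex relations are assumptions made for illustration, not the authors' actual formulation.

```python
import numpy as np

def transe_score(h: np.ndarray, r: np.ndarray, t: np.ndarray, norm: int = 1) -> float:
    """TransE scores a triple (h, r, t) by how well the relation vector
    translates the head embedding onto the tail: ||h + r - t||."""
    return float(np.linalg.norm(h + r - t, ord=norm))

def relation_type(tails_per_head: float, heads_per_tail: float,
                  threshold: float = 1.5) -> str:
    """Relations are conventionally bucketed by the average number of tails
    per head and heads per tail observed in the training triples.
    (The 1.5 threshold is a common convention, assumed here.)"""
    many_t = tails_per_head >= threshold
    many_h = heads_per_tail >= threshold
    if many_t and many_h:
        return "N-N"
    if many_t:
        return "1-N"
    if many_h:
        return "N-1"
    return "1-1"

def mtp_score(h: np.ndarray, r: np.ndarray, t: np.ndarray,
              rel_type: str, w_r: np.ndarray | None = None,
              norm: int = 1) -> float:
    """Hypothetical multi-translation dispatch: plain translation for 1-1
    relations; for complex relations, one plausible reading of 'multiple
    translation principles' is a relation-specific projection before
    translating, in the spirit of TransH's hyperplane projection."""
    if rel_type != "1-1" and w_r is not None:
        # Project head and tail onto the hyperplane with unit normal w_r.
        h = h - np.dot(w_r, h) * w_r
        t = t - np.dot(w_r, t) * w_r
    return transe_score(h, r, t, norm)
```

In practice, such embeddings are trained with a margin-based ranking loss over corrupted triples; the sketch covers only the scoring side, which is what the link prediction and triplet classification evaluations in the abstract rank by.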

Share and Cite

MDPI and ACS Style

Li, Y.; Zhu, C. TransE-MTP: A New Representation Learning Method for Knowledge Graph Embedding with Multi-Translation Principles and TransE. Electronics 2024, 13, 3171. https://doi.org/10.3390/electronics13163171


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.

