Article

Geometry Interaction Embeddings for Interpolation Temporal Knowledge Graph Completion

1 School of Data and Computer Science, Shandong Women’s University, Jinan 250300, China
2 School of Journalism and Communication, Tsinghua University, Beijing 100190, China
* Author to whom correspondence should be addressed.
Mathematics 2024, 12(13), 2022; https://doi.org/10.3390/math12132022
Submission received: 12 May 2024 / Revised: 22 June 2024 / Accepted: 27 June 2024 / Published: 28 June 2024

Abstract
Knowledge graphs (KGs) have become a cornerstone for structuring vast amounts of information, enabling sophisticated AI applications across domains. The progression to temporal knowledge graphs (TKGs) introduces time as an essential dimension, allowing for a dynamic representation of entity relationships. Despite their potential, TKGs often suffer from incompleteness, necessitating the development of temporal knowledge graph completion (TKGC) techniques. These methods, particularly focusing on interpolation within the known timeframe, aim to infer missing temporal facts and enhance the predictive capabilities of TKGs. The prevalent reliance on Euclidean space modeling in TKGC methods presents challenges in capturing the complex, hierarchical, and time-varying nature of TKGs. To overcome these limitations, we introduce the attention-based geometry interaction embedding (ATGIE) method, a novel approach that leverages the strengths of multiple geometric spaces, i.e., Euclidean, hyperbolic, and hypersphere, to model the intricacies of TKGs more effectively. ATGIE employs an attention mechanism to dynamically weigh the contributions of different geometric spaces, allowing it to adaptively form reliable spatial structures based on interactive geometric information. This multi-space modeling not only captures the diverse relationships within TKGs but also facilitates a nuanced understanding of how entities and their relationships evolve over time. Through extensive experiments, we demonstrate ATGIE’s superiority in TKGC tasks, showcasing its improvement over existing methods, robustness to noise, and sensitivity to temporal dynamics. The results highlight ATGIE’s potential to advance the state-of-the-art in TKGC, offering a promising direction for research and application in the field.

1. Introduction

Knowledge graphs (KGs) are a revolutionary framework for the systematic structuring and utilization of information, where data are encapsulated within a network of nodes representing entities and edges symbolizing the relationships between them [1,2,3]. These graphs empower Artificial Intelligence (AI) systems to comprehend and infer the intricate connections spanning a multitude of fields, ranging from healthcare and financial services to social media platforms and further afield. The graph-based representation of KGs allows for the development of AI applications that are more context-sensitive and sophisticated [4,5], leading to advancements in areas such as natural language understanding and personalized recommendation algorithms. Expanding on the groundbreaking work of KGs, temporal knowledge graphs (TKGs) incorporate the critical element of time [6]. In TKGs, data are articulated as a quadruple, denoted as (s, r, o, t), indicating that a relationship r exists between a subject entity s and an object entity o at a specific time point t. This temporal component transcends the limitations of static KGs by embedding time-varying interactions among entities, thereby equipping AI systems with the ability to perceive the evolution of relationships over time. The integration of temporality is indispensable for applications that necessitate a dynamic representation of knowledge, such as forecasting future occurrences, examining patterns over time, and tracing the progression of ideas. TKGs, by capturing the temporal flux of entities and their relationships, mark a significant advancement towards crafting AI systems that are more anticipatory, agile, and perceptive.
Although TKGs are prevalent, their frequent incompleteness poses a considerable challenge, primarily due to the constraints of human understanding. This limitation has motivated the advancement of temporal knowledge graph completion (TKGC) techniques, which are designed to deduce missing factual connections from the existing corpus of knowledge. The body of work on TKGC is rich with strategies, predominantly categorized into interpolation and extrapolation methods, as elaborated in various scholarly articles for interpolation [7,8,9,10] and extrapolation [11,12,13]. In the context of a TKG that includes data from time t_0 to time t_T, the extrapolation method endeavors to foresee occurrences that could arise after t_T, in contrast to interpolation, which aims to uncover omissions within this period. However, the reliability and applicability of extrapolation forecasts in practical scenarios are often questionable [14]. In light of these issues, our research has concentrated on the interpolation facet of TKGC. This targeted effort is intended to fill the voids in the current timeline of TKGs, with the goal of augmenting their precision and thoroughness to provide more reliable and utilizable knowledge. Our specific objective is to address queries of the form (s, r, ?, t) and (?, r, o, t) within a TKG that is timestamped from t_0 to t_T. In these queries, the placeholder “?” denotes an entity whose identity is to be ascertained, predicated on its association with another entity and the particular time instant within the stipulated temporal frame.
Most existing methods for TKGC rely primarily on Euclidean space modeling for knowledge representation and reasoning. However, the unique characteristics of temporal knowledge graphs pose challenges that can limit the effectiveness of Euclidean space modeling [15,16]. The interconnectedness and evolving nature of TKGs often involve complex hierarchical structures and long-range dependencies. Euclidean space modeling struggles to effectively capture these intricate relationships [17,18]. Additionally, Euclidean spaces may not be well-suited for representing the underlying geometry of the data in TKGs, further hindering their ability to capture the dynamics of these graphs [15]. To overcome these limitations, researchers have explored the use of non-Euclidean spaces, such as hyperbolic and hypersphere spaces [16,19]. By leveraging these alternative geometries, these approaches offer potential solutions to the challenges posed by Euclidean space modeling. Non-Euclidean spaces can better capture hierarchical structures, handle long-range dependencies, and represent the intrinsic geometry of the data more effectively [18,20]. However, most of these approaches focus on modeling a single geometric space and do not fully address the heterogeneity that exists among different spaces within a temporal knowledge graph [18,21]. This limitation restricts their ability to perform pattern reasoning across multiple spaces. In other words, they may not adequately capture the complex interactions and dependencies that span different types of entities or relationships in a heterogeneous TKG. Therefore, although non-Euclidean space modeling presents a promising solution to the limitations of Euclidean space modeling, there is still ongoing research to explore the full potential of multiple geometries in TKGs.
To this end, we introduce a novel attention-based geometry interaction embedding (ATGIE) method for temporal knowledge graph completion. Our ATGIE focuses on developing a geometric interaction embedding model specifically tailored for TKGC. Unlike previous approaches that primarily targeted a single geometric space, this novel method aims to obtain interactive embeddings across multiple geometric spaces, thereby addressing the heterogeneity and complexity inherent in TKGs. One key aspect of this approach is the utilization of various geometric spaces, including Euclidean, hyperbolic, and hypersphere spaces, to model the structural and temporal dependencies within the graphs. By leveraging multiple geometric spaces, the approach can capture the diverse and intricate relationships present in temporal knowledge graphs more effectively. By combining the strengths of different geometric spaces, the approach enhances its ability to capture long-range dependencies, hierarchical structures, and temporal dynamics within the graph, leading to more robust and informed predictions and inferences. Furthermore, by incorporating interactive embeddings across multiple geometric spaces, the approach facilitates a more nuanced understanding of how entities and relationships evolve over time. This comprehensive modeling framework has the potential to significantly advance the field of temporal knowledge graph completion by enabling more accurate predictions, improved analysis of temporal dependencies, and enhanced capabilities for reasoning across heterogeneous spaces.
We summarize our main contributions as follows:
  • We introduce a novel attention-based geometry interaction embedding (ATGIE) method for temporal knowledge graph completion, which leverages multiple geometric spaces to capture the intricate structural and temporal dependencies present in temporal knowledge graphs.
  • We analyze the interactions between different geometric spaces in order to tackle the inherent heterogeneity and complexity within TKGs.
  • Extensive experiments demonstrate the promising capacity of our ATGIE from four aspects, i.e., superiority, improvement, robustness, and sensitivity.

2. Related Work

2.1. Static KG Completion

In the realm of static knowledge graph (KG) completion, a plethora of methods has emerged, categorized into three distinct types. The initial type encompasses translational models that employ vector addition to represent entity relationships, with TransE [22] and its derivatives [23,24,25] being prominent examples. The second type involves semantic matching models, which utilize triangular norms to evaluate the likelihood of relationships; DistMult [26] and ComplEx [27] are notable for their use of bilinear and complex space asymmetry functions, respectively. The third type focuses on embedding-based reasoning, with models referenced in [28,29] striving to learn distributed representations for KG entities and relations. Extensions to graph convolutional networks (GCNs), such as R-GCN [30] and CompGCN [31], have also been developed for enhanced relation-awareness in KGs. Despite their success, these models predominantly address static reasoning, often overlooking the temporal dynamics present in knowledge graphs.

2.2. Extrapolation TKG Completion

Extrapolation methods in TKG reasoning, aimed at forecasting future facts, have gained traction in recent years. RE-GCN [32] models TKGs by considering structural dependencies and temporal features alongside static attributes without separate query encoding. RE-NET [11] and CyGNet [33] utilize subgraph aggregators and sequential copy networks, respectively, to model sequences of facts. TANGO-TuckER [34] and TANGO-DistMult [34] introduce the use of neural ODEs in GCNs to encode continuous-time dynamics for improved extrapolation. TLogic [13] offers an interpretable framework based on temporal logical rules extracted through random walks, while xERTE [35] employs a temporal association attention mechanism for causality preservation. TITer [12] employs a reinforcement learning strategy, defining a relative time coding function and a Dirichlet distribution-based reward to enhance learning. CEN [36] addresses evolutionary patterns using a length-aware CNN and curriculum learning, adapting to pattern changes over time. TiRGN [37] models historical dependencies with local and global recurrent graph encoders, providing deeper historical insights for extrapolation tasks.

2.3. Interpolation TKG Completion

Interpolation methods in temporal knowledge graph (TKG) completion extend static KG models by integrating temporal features. TTransE [38], an extension of TransE, incorporates temporal order constraints to enhance the modeling of time-sensitive relationships. HyTE [39] directly incorporates time by associating timestamps with hyperplanes, merging temporal aspects within the entity-relationship framework. TA-DistMult [40] further extends this concept by integrating relations and timestamps via a character LSTM, addressing the variability in time expressions. TComplEx [8], expanding on ComplEx, uses tensor decomposition to generate timestamp embeddings, improving the model’s temporal adaptability. ATiSE [41] introduces time series decomposition to embed time information, mapping TKG representations into multidimensional Gaussian spaces for uncertainty. TeRo [7] proposes dual complex embeddings for relations to manage the start and end of temporal intervals. TIMEPLEX-base [42] unifies entity, relation, and time embeddings, capitalizing on recurrent facts and temporal interactions for enhanced link and time prediction. RotateQVS [10] and T-GAP [43] prioritize interpretability, with the former using quaternion space for temporal evolution and the latter focusing on temporal displacement for path-based inference. TeLM [9] examines the impact of time granularity through tensor decomposition, furthering the understanding of temporal knowledge graphs.

3. Preliminaries on Non-Euclidean Geometry

The exploration of temporal knowledge graphs is often enriched by the integration of non-Euclidean geometries, which provide a sophisticated framework for representing intricate data structures [15,18,20]. Within this context, Riemannian manifolds serve as a mathematical cornerstone for comprehending spaces with constant curvature. These manifolds are distinguished by the presence of a tangent space at every point, enabling the definition of fundamental operations such as addition and multiplication, which are essential for TKG embeddings.
A pivotal attribute that differentiates these spaces is their curvature. While Euclidean spaces are characterized by zero curvature, furnishing a flat geometrical plane, non-Euclidean spaces present distinct curvatures, positive or negative, that offer varied geometrical properties. Hypersphere spaces, exemplified by positive curvature (c > 0), and hyperbolic spaces, marked by negative curvature (c < 0), are especially pertinent for encapsulating specific relational constructs within TKGs.
In the realm of hyperbolic geometry, various models facilitate mathematical operations. These operations include vector addition and multiplication, which are integral to the representation of geometric structures. For instance, the Möbius addition operation, fundamental to hyperbolic space, is articulated by the equation:
x \oplus_c y = \frac{(1 + 2c\langle x, y \rangle + c\|y\|^2)\, x + (1 - c\|x\|^2)\, y}{1 + 2c\langle x, y \rangle + c^2\|x\|^2\|y\|^2},
where x, y \in \mathbb{H}_c^d represent points within the hyperbolic space, and \langle \cdot, \cdot \rangle signifies the inner product. The geodesic distance, a natural metric within hyperbolic space, is articulated as follows:
d_c(x, y) = \frac{2}{\sqrt{c}} \operatorname{arctanh}\left(\sqrt{c}\, \|{-x} \oplus_c y\|\right).
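To make these operations concrete, here is a minimal NumPy sketch of Möbius addition (Equation (1)) and the geodesic distance (Equation (2)); the function names are our own illustration, not from the paper's code:

```python
import numpy as np

def mobius_add(x, y, c):
    """Mobius addition x (+)_c y in hyperbolic space of curvature parameter c."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c**2 * x2 * y2
    return num / den

def hyp_distance(x, y, c):
    """Geodesic distance d_c(x, y) = (2/sqrt(c)) * arctanh(sqrt(c) * ||(-x) (+)_c y||)."""
    diff = mobius_add(-x, y, c)
    return (2.0 / np.sqrt(c)) * np.arctanh(np.sqrt(c) * np.linalg.norm(diff))
```

As a sanity check, the distance from a point to itself is zero, and as c approaches zero Möbius addition degenerates to ordinary vector addition, recovering the Euclidean case.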
For computational efficiency, the tangent space at a hyperbolic space point, denoted as \mathcal{T}_x \mathbb{H}_c^d, is frequently utilized. This tangent space is a Euclidean space that facilitates the use of exponential and logarithmic maps for transitioning between the hyperbolic space and the tangent space. These maps are mathematically expressed as follows:
\exp_0^c(v) = \tanh(\sqrt{c}\,\|v\|) \frac{v}{\sqrt{c}\,\|v\|},
\log_0^c(y) = \operatorname{arctanh}(\sqrt{c}\,\|y\|) \frac{y}{\sqrt{c}\,\|y\|}.
Additionally, hyperbolic space accommodates the definition of scalar multiplication, which scales vectors in accordance with the hyperbolic metric:
r \otimes_c x = \tanh\left(r \operatorname{arctanh}(\sqrt{c}\,\|x\|)\right) \frac{x}{\sqrt{c}\,\|x\|}.
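The exponential and logarithmic maps of Equations (3) and (4) can be sketched in the same way; the small-norm guard is an implementation detail we add to avoid division by zero, and is not part of the paper's formulas:

```python
import numpy as np

def expmap0(v, c):
    """exp_0^c(v): map a tangent vector at the origin into hyperbolic space."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return np.zeros_like(v)  # the origin maps to itself
    return np.tanh(np.sqrt(c) * n) * v / (np.sqrt(c) * n)

def logmap0(y, c):
    """log_0^c(y): map a hyperbolic point back to the tangent space at the origin."""
    n = np.linalg.norm(y)
    if n < 1e-12:
        return np.zeros_like(y)
    return np.arctanh(np.sqrt(c) * n) * y / (np.sqrt(c) * n)
```

The two maps are mutual inverses at the origin, which is what makes them usable as bridges between spaces: log(exp(v)) recovers v.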
The introduction of these operations is paramount for the development of TKG embeddings capable of effectively modeling the intricacies of real-world data, inclusive of hierarchical and relational structures.

4. Proposed Method

4.1. Notations and Definitions

A temporal knowledge graph (TKG) is an extensive collection of temporal facts, which can be formally represented as a subset of the Cartesian product of entities, relations, entities, and time stamps, denoted as G ⊆ E × R × E × T. In this context:
  • E symbolizes the set of all entities within the TKG.
  • R represents the set of all relations that connect entities.
  • T denotes the set of time stamps, which are used to indicate when a particular fact occurred.
Each temporal fact in G is expressed as a quadruple (e_s, r, e_o, t), where:
  • e_s ∈ E denotes the subject entity.
  • r ∈ R denotes the relation that exists between the subject and the object entities.
  • e_o ∈ E denotes the object entity.
  • t ∈ T denotes the time stamp indicating the moment at which the relation r held between e_s and e_o.
In the domain of TKG completion, the objective is to enhance our understanding of the underlying temporal dynamics within a knowledge graph by predicting missing entities that once participated in a relation at a specific point in time. Given a query q = (e_q, r_q, ?, t_q), the interpolation task is to identify the missing object entity. This is achieved by leveraging the available information from the known subject entity e_q ∈ E, the relation r_q ∈ R, and the timestamp t_q ∈ T. We use d to denote the embedding dimension.
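As an illustration only (the paper does not prescribe a data structure), a quadruple and an interpolation query could be represented as:

```python
from typing import NamedTuple

class Quadruple(NamedTuple):
    """A temporal fact (e_s, r, e_o, t) from G ⊆ E × R × E × T, stored as indices."""
    subject: int    # index into the entity set E
    relation: int   # index into the relation set R
    object: int     # index into the entity set E
    timestamp: int  # index into the time-stamp set T

def make_query(fact: Quadruple) -> tuple:
    """Build an interpolation query (e_q, r_q, ?, t_q) by hiding the object entity."""
    return (fact.subject, fact.relation, None, fact.timestamp)
```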

4.2. Various Geometry Embeddings

The spatial properties of Euclidean, hyperbolic, and hypersphere spaces are fundamentally different, each with its own spatial metric that dictates the rules of distance and angle measurements. The Euclidean metric is the most familiar, derived from the Pythagorean theorem, and is used in the standard Euclidean space. In contrast, the hyperbolic metric is characterized by constant negative curvature, and the hypersphere space is characterized by constant positive curvature. These metrics not only shape the geometry of the spaces but also influence how geometric information propagates between them. The exponential and logarithmic mappings are crucial in transferring information across these spaces. They act as bridges, allowing the embedding of entities in one space to be translated into another. This translation is essential for integrating knowledge from different geometric perspectives and for performing complex operations that span multiple spaces.
In the realm of geometric embeddings, the concept of rotations emerges as an integral mechanism for understanding and manipulating the structure inherent within and across various geometric domains. Rotations are pivotal for learning and encoding the intrinsic geometric relationships. Within our attention-based geometry interaction embedding (ATGIE) framework, these rotations are facilitated through the employment of 2 × 2 Givens transformation matrices. The selection of these matrices is deliberate, capitalizing on their distinctive capacity to maintain the integrity of relative distances within Riemannian non-Euclidean spaces. This attribute renders them particularly well-suited for the embeddings of hyperbolic and hypersphere spaces, thereby enhancing the model’s geometric fidelity and its ability to represent complex spatial relationships.
The block-diagonal matrix W_{\Theta_r}^{rot} is composed of elements G^+(\theta_{r,i}) on its diagonal, where each G^+ is a rotation matrix parameterized by an angle \theta_{r,i}:
W_{\Theta_r}^{rot} = \operatorname{diag}\left(G^+(\theta_{r,1}), \ldots, G^+(\theta_{r,d/2})\right),
with each G^+ defined as
G^+(\theta_{r,i}) = \begin{pmatrix} \cos\theta_{r,i} & -\sin\theta_{r,i} \\ \sin\theta_{r,i} & \cos\theta_{r,i} \end{pmatrix}.
By applying these rotations to the embeddings, we obtain rotated results for each geometric space:
q_{rot}^{H} = W_{\Theta_r}^{rot}\, e_s^{H}.
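A minimal sketch of the block-diagonal Givens rotation of Equations (6)-(8): pairing up embedding dimensions and rotating each pair independently is equivalent to multiplying by the block-diagonal matrix, and it preserves the norm, which is the distance-preserving property the text highlights. This is our own illustration, not the released implementation:

```python
import numpy as np

def givens_rotation(entity, angles):
    """Apply the block-diagonal rotation W^rot (2x2 Givens blocks) to an embedding.

    entity: shape (d,); angles: shape (d/2,), one angle theta_{r,i} per 2x2 block.
    """
    pairs = entity.reshape(-1, 2)              # group dimensions into 2D blocks
    cos, sin = np.cos(angles), np.sin(angles)
    rotated = np.stack([cos * pairs[:, 0] - sin * pairs[:, 1],
                        sin * pairs[:, 0] + cos * pairs[:, 1]], axis=1)
    return rotated.reshape(-1)
```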
Drawing inspiration from translation-based [22,23] and rotation-based [25,44] methods for static KG completion, our model incorporates both transformations to embed relationship and temporal information into the temporal knowledge graph. The head entity embedding is first translated over time to create a temporal embedding, which is then rotated along the relationship vector to produce a transformed head entity embedding.
For the Euclidean space, this translation and rotation is achieved via the Hadamard product:
s^E = (s + t) \odot r,
where s is the entity embedding, t is the time embedding, r is the relationship embedding, and \odot denotes the Hadamard product. For non-Euclidean spaces, namely, hyperbolic and hypersphere, we apply Möbius addition for translation and Givens transformations for rotation. These operations are then mapped to tangent-space representations to facilitate interactions with the Euclidean space:
s^B = \log_0^{c_b}\left(B_{\Theta_r}^{rot}\,(h \oplus_{c_b} t)\right),
s^H = \log_0^{c_h}\left(H_{\Theta_r}^{rot}\,(h \oplus_{c_h} t)\right),
where h is the head entity embedding, and B_{\Theta_r}^{rot} and H_{\Theta_r}^{rot} represent the block-diagonal matrices calculated via Equation (6) for the hyperbolic and hypersphere spaces, respectively. \oplus_{c_b} and \oplus_{c_h} represent Möbius addition with curvatures c_b and c_h, respectively, and \log_0^{c_b} and \log_0^{c_h} are the corresponding logarithmic maps to the tangent spaces. These transformations allow for a comprehensive representation of entities and their temporal dynamics across multiple geometric spaces, thus enriching the expressive power of the knowledge graph.
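Putting the pieces together, here is a hypothetical sketch of the translate-then-rotate head-entity transformation of Equations (9) and (10), covering the Euclidean and hyperbolic branches (the hypersphere branch is analogous with curvature c_h); all helper functions below are our own illustrative versions:

```python
import numpy as np

def mobius_add(x, y, c):
    xy, x2, y2 = x @ y, x @ x, y @ y
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    return num / (1 + 2 * c * xy + c**2 * x2 * y2)

def logmap0(y, c):
    n = np.linalg.norm(y)
    return y if n < 1e-12 else np.arctanh(np.sqrt(c) * n) * y / (np.sqrt(c) * n)

def rotate(x, angles):
    p = x.reshape(-1, 2)
    cos, sin = np.cos(angles), np.sin(angles)
    return np.stack([cos * p[:, 0] - sin * p[:, 1],
                     sin * p[:, 0] + cos * p[:, 1]], axis=1).reshape(-1)

def transform_head(s, t, r_vec, angles, c_b):
    """Translate the head entity over time, then rotate along the relation:
    Euclidean branch via Hadamard product, hyperbolic branch via Mobius
    addition + Givens rotation, mapped back to the tangent space."""
    s_E = (s + t) * r_vec                                        # Equation (9)
    s_B = logmap0(rotate(mobius_add(s, t, c_b), angles), c_b)    # Equation (10)
    return s_E, s_B
```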

4.3. Attention Mechanism for Geometry Interaction

The interaction of multiple geometric spaces in TKGs offers a rich framework for representing and reasoning about complex data. However, the utility of these embeddings can vary depending on the specific query or task at hand. Some spaces may be more adept at capturing certain types of relationships or temporal dynamics. To leverage the strengths of each geometric representation and to create a more robust and versatile model, we introduce an attention mechanism.
The attention mechanism allows our model to dynamically assign importance to each geometric space based on the query. This is achieved by computing attention weights (\alpha_E, \alpha_B, \alpha_H) for the Euclidean, hyperbolic, and hypersphere spaces, respectively. The weights are derived from the interaction between an attention vector a and the embeddings s^E, s^B, and s^H of the respective spaces:
(\alpha_E, \alpha_B, \alpha_H) = \operatorname{Softmax}\left(a^\top s^E,\; a^\top s^B,\; a^\top s^H\right),
where the Softmax function ensures that the attention weights sum to one, providing a normalized measure of the relevance of each space for the given query.
With the attention weights computed, we can construct a weighted sum of the embeddings from each space to form a single geometric interaction embedding G s . This embedding combines the information from all spaces, weighted by their computed importance:
G_s = \alpha_E\, s^E + \alpha_B\, s^B + \alpha_H\, s^H.
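The attention step of Equations (12) and (13) is a softmax-weighted sum; a minimal sketch, with names of our own choosing:

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def geometry_interaction(a, s_E, s_B, s_H):
    """Attention over the three geometric embeddings (Equations (12)-(13)):
    weights alpha from a^T s per space, then the weighted sum G_s."""
    alphas = softmax(np.array([a @ s_E, a @ s_B, a @ s_H]))
    G_s = alphas[0] * s_E + alphas[1] * s_B + alphas[2] * s_H
    return alphas, G_s
```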
This results in a comprehensive representation that encapsulates the multi-faceted geometric interactions within the temporal knowledge graph. Finally, query embeddings are compared to target tail embeddings using the distance (Equation (2)) in a hyperbolic space with curvature c r . The resulting scoring function is
s(q, e_o) = d_{c_r}(G_s, e_o)^2 + b,
where b is a bias term that acts as a margin in the scoring function [17,45].

4.4. Training with Cross-Entropy Loss

Our ATGIE framework is refined through the optimization of a cross-entropy loss function. This function quantifies the discrepancy between the anticipated and actual results for a specified query. Our approach incorporates uniform negative sampling, which generates negative instances for a query q = (e_q, r_q, ?, t_q) by uniformly selecting alternative entities from the space of all potential triples, thereby corrupting the correct object entity e_o:
L = \sum_{(q, e) \in \mathcal{T}_{\mathrm{train}}} \log\left(1 + \exp\left(Y_e \cdot s(q, e)\right)\right),
where Y_e = 1 if e = e_o, and Y_e = -1 otherwise,
where \mathcal{T}_{\mathrm{train}} is the set of training queries. This formulation encourages the model to correctly identify the true object entity e_o while differentiating it from the negative samples. Optimization in hyperbolic space presents unique challenges owing to its non-Euclidean nature. To address this, we adopt a strategy where all parameters are defined in the tangent space at the origin of the hyperbolic space. This allows us to leverage standard Euclidean optimization techniques during training. The embeddings are then mapped back to the hyperbolic space using the exponential map when necessary [46].
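A sketch of the loss of Equation (15), under the sign convention as printed (Y_e = 1 for the true object, -1 for negatives, with lower scores indicating better matches per Equation (14)); this is our reading of the formula, not the authors' released code:

```python
import numpy as np

def cross_entropy_loss(scores, true_idx):
    """Cross-entropy loss over one query's candidates.

    scores: array of s(q, e) for every candidate entity e.
    true_idx: index of the true object entity e_o.
    Minimizing log(1 + exp(Y_e * s)) pushes the true entity's score down
    and the negatives' scores up.
    """
    y = -np.ones_like(scores)
    y[true_idx] = 1.0
    return float(np.sum(np.log1p(np.exp(y * scores))))
```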
This approach strikes a balance between the theoretical benefits of hyperbolic embeddings and the practical advantages of Euclidean computation, enabling efficient training of our model while maintaining the geometric properties of the hyperbolic space. The attention mechanism for geometric interaction, as described, provides a sophisticated approach to leveraging the strengths of multiple geometric embeddings within a TKG. By dynamically weighting the contributions of each space, our model can adapt to the requirements of different queries and tasks, leading to improved performance and more nuanced insights into the structured data.

5. Experiments

In this section, we describe a series of experiments conducted to assess the performance of our proposed ATGIE. The evaluation is designed to test the effectiveness of ATGIE across a spectrum of scenarios that are representative of the challenges present in the TKG interpolation task.

5.1. Experimental Setup

5.1.1. Benchmark Datasets

To evaluate our proposed method, we performed the link prediction task on four commonly used TKG benchmark datasets, namely, ICEWS14, ICEWS05-15 [40], YAGO11k [39], and GDELT [47]. Table 1 encapsulates a summary of these datasets, highlighting the ICEWS14 and ICEWS05-15 datasets’ greater number of relations in comparison to YAGO11k and GDELT. The ICEWS [48] serves as a comprehensive archive of political events, each indexed with a distinct timestamp. The datasets ICEWS14 and ICEWS05-15 [40] are derived subsets of the ICEWS corpus, representing events from the year 2014 and spanning the years 2005 to 2015, respectively. YAGO11k [39] is an excerpt from the YAGO3 [49] dataset, with temporal annotations articulated as time intervals. We derive the dataset from HyTE [39] to obtain the same year-level granularity by dropping the month and date information, which results in 70 different time stamps. For the GDELT dataset, we utilize the subset defined by [47], encompassing events from 1 April 2015 to 31 March 2016. Consistent with the methodology of [50], we applied identical preprocessing steps to the train, validation, and test sets, thereby framing the problem as a TKG completion (TKGC) task rather than one of extrapolation.

5.1.2. Evaluation Protocol

The evaluation of our proposed model for TKG completion is conducted through a link prediction task. This task is specifically designed to deduce incomplete temporal facts with a missing entity. It targets scenarios where the facts are represented by either a source entity s, relation r, and a missing target entity at time t, denoted as (s, r, ?, t), or vice versa, where the source entity is missing, represented as (?, r, o, t). During the inference phase, we adhere to the candidate generation strategy as outlined in [7]. This strategy involves the creation of candidate quadruples C, which are formed by substituting either the subject s or the object o with every entity from the entity set E, resulting in the set C = {(s, r, ō, t) : ō ∈ E} ∪ {(s̄, r, o, t) : s̄ ∈ E}. Subsequently, these quadruples are ordered in ascending order by their scores (Equation (14)). We opt for the time-wise filtered setting as reported in [41,50] to present our experimental outcomes. This method ensures that facts not observed at time t remain eligible for assessment when evaluating a given test quadruple, as outlined in [7]. This approach is particularly well-suited for TKG reasoning, as it maintains the temporal integrity of the data while facilitating a more accurate analysis. All experiments are conducted on a Linux machine with 16 CPU cores, 32 GB RAM, and an NVIDIA 3090Ti GPU (Nvidia Corporation, Santa Clara, CA, USA). We set the dimension of the embedding to 200.
To assess the effectiveness of our approach, we rely on conventional evaluation metrics, which include the ratio of accurate triples positioned within the top ranks (Hits@1, Hits@3, and Hits@10), as well as the Mean Reciprocal Rank (MRR). In the context of these metrics, higher values are indicative of superior performance. For the purpose of our experiments, we present the mean results obtained from five independent runs, electing to exclude the variance due to its typically minimal impact.
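The reported metrics can be computed directly from the filtered ranks of the true entities (rank 1 = best); this is a standard implementation, not the authors' evaluation script:

```python
import numpy as np

def ranking_metrics(ranks):
    """MRR and Hits@k from the filtered ranks of the true entities."""
    ranks = np.asarray(ranks, dtype=float)
    return {
        "MRR": float(np.mean(1.0 / ranks)),
        "Hits@1": float(np.mean(ranks <= 1)),
        "Hits@3": float(np.mean(ranks <= 3)),
        "Hits@10": float(np.mean(ranks <= 10)),
    }
```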

5.1.3. Baselines

We compare against both SOTA static and temporal KGE baselines. For static baselines, we use TransE [22], DistMult [26], RotatE [25], and QuatE [51]. For TKGE methods, we consider TTransE [38], HyTE [39], TA-DistMult [40], DE-SimplE [50], ATiSE [41], TeRo [7], RotateQVS [10], and TGeomE++ [52].

5.2. Performance Comparison

The experimental results over the four benchmark datasets are listed in Table 2. Overall, temporal knowledge graph completion methods are better than static knowledge graph completion methods, which shows the effectiveness of modeling temporal information. For the proposed ATGIE, we observe that our model outperforms all the baseline models over the four datasets across all metrics consistently. Despite the significant differences in scale, number of entities, and number of relations between the four TKGs used in our evaluation (see Table 1), the results demonstrate that our proposed model exhibits efficient reasoning performance on TKGs of different sizes and complexity levels. This showcases the versatility and efficiency of our model in addressing knowledge graph completion tasks in diverse temporal contexts.

5.3. Ablation Study of Performance

Ablation studies were conducted on two datasets, namely, ICEWS14 and ICEWS05-15, to investigate the impact of individual components on the performance of our ATGIE. The MRR performances are shown in Figure 1, where seven sub-models are compared, including (1) the original ATGIE model; (2) ATGIE without Euclidean space, denoted as “-E”; (3) ATGIE without hyperbolic space, denoted as “-B”; (4) ATGIE without hypersphere space, denoted as “-H”; (5) ATGIE without both the Euclidean and hyperbolic spaces, denoted as “-E-B”; (6) ATGIE without both the Euclidean and hypersphere spaces, denoted as “-E-H”; and (7) ATGIE without both the hyperbolic and hypersphere spaces, denoted as “-B-H”. Based on the experimental results, it can be concluded that the embeddings of each space, including Euclidean, hyperbolic, and hypersphere, play a crucial role in determining the performance metrics of ATGIE. The ablation study helps us understand the importance of each component and highlights their contributions to the overall performance of the ATGIE.

5.4. Robustness Evaluation

To rigorously assess the robustness of the ATGIE framework, we conducted experiments under two distinct conditions: data sparsity and data noise. This was achieved by systematically manipulating the ICEWS14 dataset through the deletion or insertion of facts. Specifically, we altered the training set by randomly removing existing facts or adding new ones, targeting proportions of 25%, 50%, and 75%. Figure 2 and Figure 3 present a comparative analysis of performance across these varying levels of data sparsity and noise. In this comparison, we evaluate our ATGIE model against two baselines: TeRo [7] and RotateQVS [10]. The results showcased in Figure 2 and Figure 3 clearly illustrate that ATGIE significantly outperforms the baseline models across an extensive range of data sparsity and noise conditions. Notably, ATGIE maintains a high level of performance even under extreme conditions of 75% data sparsity or noise, achieving a Hits@1 value exceeding 0.45. This outcome underscores ATGIE’s robustness and its capability to handle both sparse and noisy datasets effectively. The superior performance of ATGIE can be attributed to its multi-space attention design, which helps mitigate the adverse effects of data imperfections. This makes ATGIE a highly resilient tool for applications where data quality cannot be guaranteed, offering significant advantages over conventional models in handling challenging real-world scenarios involving incomplete or inaccurate information.

6. Conclusions

In this paper, we introduce a novel attention-based geometry interaction embedding (ATGIE) method, tailored for temporal knowledge graph completion. This innovative method harnesses the collective advantages of distinct geometric domains, i.e., Euclidean, hyperbolic, and hypersphere, to more adeptly represent the complexities inherent in TKGs. ATGIE employs an attention mechanism to dynamically weigh the contributions of different geometric spaces, allowing it to adaptively form reliable spatial structures based on interactive geometric information. This multi-space modeling not only captures the diverse relationships within TKGs but also facilitates a nuanced understanding of how entities and their relationships evolve over time. Through extensive experiments, we demonstrate ATGIE’s superiority in TKGC tasks, showcasing its improvement over existing methods, robustness to noise, and sensitivity to temporal dynamics. The results highlight ATGIE’s potential to advance the state of the art in TKGC, offering a promising direction for research and application in the field.
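The attention-based weighting of the three geometric spaces can be illustrated with a simplified NumPy sketch. This is not the actual ATGIE scoring function; the per-space feature summaries and the learned query vector are hypothetical stand-ins used only to show the fusion step:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_space_scores(space_feats, space_scores, query):
    """Attention-weighted fusion of per-space plausibility scores.

    space_feats: (n_spaces, dim) feature summaries from the Euclidean,
        hyperbolic, and hypersphere components for one quadruple.
    space_scores: (n_spaces,) per-space plausibility scores.
    query: (dim,) learned attention query vector (hypothetical).
    """
    logits = space_feats @ query          # one attention logit per space
    alpha = softmax(logits)               # weights are positive, sum to 1
    fused = float(alpha @ space_scores)   # convex combination of scores
    return fused, alpha
```

Because the attention weights form a convex combination, the fused score always lies between the best and worst per-space scores, which is one way such a mechanism can hedge against a single unreliable geometry.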

7. Limitations and Future Work

Limitations: The ATGIE method, despite its innovative approach to temporal knowledge graph completion, comes with a set of limitations that warrant attention. A key aspect of our model is the attention mechanism, which dynamically weighs the contributions from distinct geometric domains. While this feature enhances the model’s adaptability, it also introduces sensitivity to hyperparameter settings. The process of tuning these hyperparameters is critical and can be complex, requiring a delicate balance to avoid overfitting. The model’s reliance on the attention mechanism may also make the optimization process more susceptible to local minima, thus demanding sophisticated initialization methods and advanced hyperparameter optimization techniques. Furthermore, although ATGIE demonstrates robustness against data sparsity and noise, the model’s adaptability across different domains and its scalability when applied to extensive TKGs are yet to be thoroughly investigated. The empirical validation of ATGIE’s performance on a variety of datasets and its capacity to handle large-scale TKGs with millions of entities and relationships is an area that requires further research.
Future Work: Looking ahead, several avenues for future work present themselves. First, addressing the scalability of ATGIE to very large TKGs will be a focal point. This includes developing efficient learning strategies, such as mini-batch processing or distributed computing approaches, to manage the computational demands of big data environments. Second, we recognize the need to enhance the interpretability of the attention mechanism within ATGIE. Future work will explore visualization and explanation methods to provide insights into how different geometric spaces contribute to the model’s decisions, thereby increasing trust and transparency in the model’s predictions. Lastly, we plan to investigate the long-term stability and convergence properties of the model. This includes studying the effects of different optimization algorithms and learning rate schedules on the training process to ensure reliable and consistent performance.

Author Contributions

Methodology, X.Z. and S.P.; writing—original draft, J.M.; writing—review & editing, F.Y.; Funding acquisition, S.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Cultivation Foundation of Shandong Women’s University (grant number 2022GSPSJ03).

Data Availability Statement

The data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Mao, Y.; Chen, H. Rule-guided compositional representation learning on knowledge graphs with hierarchical types. Mathematics 2021, 9, 1978. [Google Scholar] [CrossRef]
  2. Zhang, P.; Chen, D.; Fang, Y.; Zhao, X.; Xiao, W. CIST: Differentiating Concepts and Instances Based on Spatial Transformation for Knowledge Graph Embedding. Mathematics 2022, 10, 3161. [Google Scholar] [CrossRef]
  3. Liu, X.; Mao, S.; Wang, X.; Bu, J. Generative Transformer with Knowledge-Guided Decoding for Academic Knowledge Graph Completion. Mathematics 2023, 11, 1073. [Google Scholar] [CrossRef]
  4. Sun, K.; Yu, S.; Peng, C.; Wang, Y.; Alfarraj, O.; Tolba, A.; Xia, F. Relational structure-aware knowledge graph representation in complex space. Mathematics 2022, 10, 1930. [Google Scholar] [CrossRef]
  5. Yuan, X.; Chen, J.; Wang, Y.; Chen, A.; Huang, Y.; Zhao, W.; Yu, S. Semantic-Enhanced Knowledge Graph Completion. Mathematics 2024, 12, 450. [Google Scholar] [CrossRef]
  6. Jiang, T.; Liu, T.; Ge, T.; Sha, L.; Chang, B.; Li, S.; Sui, Z. Towards Time-Aware Knowledge Graph Completion. In Proceedings of the COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, Osaka, Japan, 11–16 December 2016; ACL: Kerrville, TX, USA, 2016; pp. 1715–1724. [Google Scholar]
  7. Xu, C.; Nayyeri, M.; Alkhoury, F.; Shariat Yazdi, H.; Lehmann, J. TeRo: A Time-aware Knowledge Graph Embedding via Temporal Rotation. In Proceedings of the 28th International Conference on Computational Linguistics, Barcelona, Spain (Online), 8–13 December 2020; International Committee on Computational Linguistics: New York, NY, USA, 2020; pp. 1583–1593. [Google Scholar]
  8. Lacroix, T.; Obozinski, G.; Usunier, N. Tensor Decompositions for Temporal Knowledge Base Completion. In Proceedings of the 8th International Conference on Learning Representations, ICLR 2020, Addis Ababa, Ethiopia, 26–30 April 2020. [Google Scholar]
  9. Xu, C.; Chen, Y.; Nayyeri, M.; Lehmann, J. Temporal Knowledge Graph Completion using a Linear Temporal Regularizer and Multivector Embeddings. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2021, Online, 6–11 June 2021; Toutanova, K., Rumshisky, A., Zettlemoyer, L., Hakkani-Tür, D., Beltagy, I., Bethard, S., Cotterell, R., Chakraborty, T., Zhou, Y., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2021; pp. 2569–2578. [Google Scholar] [CrossRef]
  10. Chen, K.; Wang, Y.; Li, Y.; Li, A. RotateQVS: Representing Temporal Information as Rotations in Quaternion Vector Space for Temporal Knowledge Graph Completion. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2022, Dublin, Ireland, 22–27 May 2022; Muresan, S., Nakov, P., Villavicencio, A., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2022; pp. 5843–5857. [Google Scholar] [CrossRef]
  11. Jin, W.; Qu, M.; Jin, X.; Ren, X. Recurrent Event Network: Autoregressive Structure Inference over Temporal Knowledge Graphs. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), Online, 16–20 November 2020; pp. 6669–6683. [Google Scholar]
  12. Sun, H.; Zhong, J.; Ma, Y.; Han, Z.; He, K. TimeTraveler: Reinforcement Learning for Temporal Knowledge Graph Forecasting. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021, Virtual Event/Punta Cana, Dominican Republic, 7–11 November 2021; Moens, M., Huang, X., Specia, L., Yih, S.W., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2021; pp. 8306–8319. [Google Scholar] [CrossRef]
  13. Liu, Y.; Ma, Y.; Hildebrandt, M.; Joblin, M.; Tresp, V. TLogic: Temporal Logical Rules for Explainable Link Forecasting on Temporal Knowledge Graphs. In Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence, AAAI 2022, Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence, IAAI 2022, The Twelfth Symposium on Educational Advances in Artificial Intelligence, EAAI 2022, Virtual, 22 February–1 March 2022; AAAI Press: Washington, DC, USA, 2022; pp. 4120–4127. [Google Scholar]
  14. Ren, X.; Bai, L.; Xiao, Q.; Meng, X. Hierarchical Self-Attention Embedding for Temporal Knowledge Graph Completion. In Proceedings of the ACM Web Conference 2023, Austin, TX, USA, 30 April 2023–4 May 2023; pp. 2539–2547. [Google Scholar]
  15. Han, Z.; Chen, P.; Ma, Y.; Tresp, V. DyERNIE: Dynamic Evolution of Riemannian Manifold Embeddings for Temporal Knowledge Graph Completion. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Online, 16–20 November 2020; Webber, B., Cohn, T., He, Y., Liu, Y., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2020; pp. 7301–7316. [Google Scholar] [CrossRef]
  16. Montella, S.; Rojas-Barahona, L.M.; Heinecke, J. Hyperbolic Temporal Knowledge Graph Embeddings with Relational and Time Curvatures. In Proceedings of the Findings of the Association for Computational Linguistics: ACL/IJCNLP 2021, Online, 1–6 August 2021; Zong, C., Xia, F., Li, W., Navigli, R., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2021; pp. 3296–3308. [Google Scholar] [CrossRef]
  17. Balažević, I.; Allen, C.; Hospedales, T. Multi-relational Poincaré Graph Embeddings. In Proceedings of the Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8–14 December 2019; pp. 4465–4475. [Google Scholar]
  18. Zhu, S.; Pan, S.; Zhou, C.; Wu, J.; Cao, Y.; Wang, B. Graph Geometry Interaction Learning. In Proceedings of the 34th International Conference on Neural Information Processing Systems, NeurIPS 2020, Virtual, 6–12 December 2020; pp. 7548–7558. [Google Scholar]
  19. Jia, Y.; Lin, M.; Wang, Y.; Li, J.; Chen, K.; Siebert, J.; Zhang, G.Z.; Liao, Q. Extrapolation over temporal knowledge graph via hyperbolic embedding. CAAI Trans. Intell. Technol. 2023, 8, 418–429. [Google Scholar] [CrossRef]
  20. Chami, I.; Wolf, A.; Juan, D.; Sala, F.; Ravi, S.; Ré, C. Low-Dimensional Hyperbolic Knowledge Graph Embeddings. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020, Online, 5–10 July 2020; Jurafsky, D., Chai, J., Schluter, N., Tetreault, J.R., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2020; pp. 6901–6914. [Google Scholar] [CrossRef]
  21. Shang, B.; Zhao, Y.; Liu, J.; Wang, D. Mixed Geometry Message and Trainable Convolutional Attention Network for Knowledge Graph Completion. In Proceedings of the Thirty-Eighth AAAI Conference on Artificial Intelligence, AAAI 2024, Thirty-Sixth Conference on Innovative Applications of Artificial Intelligence, IAAI 2024, Fourteenth Symposium on Educational Advances in Artificial Intelligence, EAAI 2024, Vancouver, BC, Canada, 20–27 February 2024; Wooldridge, M.J., Dy, J.G., Natarajan, S., Eds.; AAAI Press: Washington, DC, USA, 2024; pp. 8966–8974. [Google Scholar] [CrossRef]
  22. Bordes, A.; Usunier, N.; Garcia-Duran, A.; Weston, J.; Yakhnenko, O. Translating embeddings for modeling multi-relational data. In Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA, 5–10 December 2013; pp. 2787–2795. [Google Scholar]
  23. Wang, Z.; Zhang, J.; Feng, J.; Chen, Z. Knowledge Graph Embedding by Translating on Hyperplanes. In Proceedings of the Twenty-Eighth AAAI Conference on Artificial Intelligence, Québec City, QC, Canada, 27–31 July 2014; Brodley, C.E., Stone, P., Eds.; AAAI Press: Washington, DC, USA, 2014; pp. 1112–1119. [Google Scholar]
  24. Lin, Y.; Liu, Z.; Sun, M.; Liu, Y.; Zhu, X. Learning Entity and Relation Embeddings for Knowledge Graph Completion. In Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence, Austin, TX, USA, 25–30 January 2015; Bonet, B., Koenig, S., Eds.; AAAI Press: Washington, DC, USA, 2015; pp. 2181–2187. [Google Scholar]
  25. Sun, Z.; Deng, Z.; Nie, J.; Tang, J. RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space. In Proceedings of the 7th International Conference on Learning Representations, ICLR 2019, New Orleans, LA, USA, 6–9 May 2019. [Google Scholar]
  26. Yang, B.; Yih, W.; He, X.; Gao, J.; Deng, L. Embedding Entities and Relations for Learning and Inference in Knowledge Bases. In Proceedings of the 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, 7–9 May 2015. [Google Scholar]
  27. Trouillon, T.; Welbl, J.; Riedel, S.; Gaussier, É.; Bouchard, G. Complex Embeddings for Simple Link Prediction. In Proceedings of the 33rd International Conference on Machine Learning, ICML 2016, New York, NY, USA, 19–24 June 2016; JMLR Workshop and Conference Proceedings. Volume 48, pp. 2071–2080. [Google Scholar]
  28. Dettmers, T.; Minervini, P.; Stenetorp, P.; Riedel, S. Convolutional 2D Knowledge Graph Embeddings. In Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018; pp. 1811–1818. [Google Scholar]
  29. Shang, C.; Tang, Y.; Huang, J.; Bi, J.; He, X.; Zhou, B. End-to-End Structure-Aware Convolutional Networks for Knowledge Base Completion. In Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; pp. 3060–3067. [Google Scholar]
  30. Schlichtkrull, M.S.; Kipf, T.N.; Bloem, P.; van den Berg, R.; Titov, I.; Welling, M. Modeling Relational Data with Graph Convolutional Networks. In Proceedings of the Semantic Web—15th International Conference, ESWC 2018, Heraklion, Greece, 3–7 June 2018; Volume 10843, pp. 593–607. [Google Scholar]
  31. Vashishth, S.; Sanyal, S.; Nitin, V.; Talukdar, P.P. Composition-based Multi-Relational Graph Convolutional Networks. In Proceedings of the 8th International Conference on Learning Representations, ICLR 2020, Addis Ababa, Ethiopia, 26–30 April 2020. [Google Scholar]
  32. Li, Z.; Jin, X.; Li, W.; Guan, S.; Guo, J.; Shen, H.; Wang, Y.; Cheng, X. Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning. In Proceedings of the SIGIR ’21: The 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, Virtual, 11–15 July 2021; Diaz, F., Shah, C., Suel, T., Castells, P., Jones, R., Sakai, T., Eds.; ACM: New York, NY, USA, 2021; pp. 408–417. [Google Scholar] [CrossRef]
  33. Zhu, C.; Chen, M.; Fan, C.; Cheng, G.; Zhang, Y. Learning from History: Modeling Temporal Knowledge Graphs with Sequential Copy-Generation Networks. In Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence, AAAI 2021, Thirty-Third Conference on Innovative Applications of Artificial Intelligence, IAAI 2021, The Eleventh Symposium on Educational Advances in Artificial Intelligence, EAAI 2021, Virtual, 2–9 February 2021; AAAI Press: Washington, DC, USA, 2021; pp. 4732–4740. [Google Scholar]
  34. Han, Z.; Ding, Z.; Ma, Y.; Gu, Y.; Tresp, V. Learning Neural Ordinary Equations for Forecasting Future Links on Temporal Knowledge Graphs. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021, Punta Cana, Dominican Republic, 7–11 November 2021; Moens, M., Huang, X., Specia, L., Yih, S.W., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2021; pp. 8352–8364. [Google Scholar] [CrossRef]
  35. Han, Z.; Chen, P.; Ma, Y.; Tresp, V. Explainable Subgraph Reasoning for Forecasting on Temporal Knowledge Graphs. In Proceedings of the 9th International Conference on Learning Representations, ICLR 2021, Virtual, 3–7 May 2021. [Google Scholar]
  36. Li, Z.; Guan, S.; Jin, X.; Peng, W.; Lyu, Y.; Zhu, Y.; Bai, L.; Li, W.; Guo, J.; Cheng, X. Complex Evolutional Pattern Learning for Temporal Knowledge Graph Reasoning. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), ACL 2022, Dublin, Ireland, 22–27 May 2022; Muresan, S., Nakov, P., Villavicencio, A., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2022; pp. 290–296. [Google Scholar] [CrossRef]
  37. Li, Y.; Sun, S.; Zhao, J. TiRGN: Time-Guided Recurrent Graph Network with Local-Global Historical Patterns for Temporal Knowledge Graph Reasoning. In Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, IJCAI 2022, Vienna, Austria, 23–29 July 2022; Raedt, L.D., Ed.; IJCAI: Somerset, NJ, USA, 2022; pp. 2152–2158. [Google Scholar] [CrossRef]
  38. Leblay, J.; Chekol, M.W. Deriving validity time in knowledge graph. In Proceedings of the Companion Proceedings of The Web Conference 2018, Lyon, France, 23–27 April 2018; pp. 1771–1776. [Google Scholar]
  39. Dasgupta, S.S.; Ray, S.N.; Talukdar, P. Hyte: Hyperplane-based temporally aware knowledge graph embedding. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, 31 October–4 November 2018; pp. 2001–2011. [Google Scholar]
  40. García-Durán, A.; Dumancic, S.; Niepert, M. Learning Sequence Encoders for Temporal Knowledge Graph Completion. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, 31 October–4 November 2018; Riloff, E., Chiang, D., Hockenmaier, J., Tsujii, J., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2018; pp. 4816–4821. [Google Scholar]
  41. Xu, C.; Nayyeri, M.; Alkhoury, F.; Lehmann, J.; Yazdi, H.S. Temporal Knowledge Graph Embedding Model based on Additive Time Series Decomposition. arXiv 2019, arXiv:1911.07893. [Google Scholar]
  42. Jain, P.; Rathi, S.; Mausam; Chakrabarti, S. Temporal Knowledge Base Completion: New Algorithms and Evaluation Protocols. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Online, 16–20 November 2020; Webber, B., Cohn, T., He, Y., Liu, Y., Eds.; Association for Computational Linguistics: Kerrville, TX, USA, 2020; pp. 3733–3747. [Google Scholar] [CrossRef]
  43. Jung, J.; Jung, J.; Kang, U. Learning to Walk across Time for Interpretable Temporal Knowledge Graph Completion. In Proceedings of the KDD ’21: The 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Virtual, Singapore, 14–18 August 2021; Zhu, F., Ooi, B.C., Miao, C., Eds.; ACM: New York, NY, USA, 2021; pp. 786–795. [Google Scholar] [CrossRef]
  44. Gao, C.; Sun, C.; Shan, L.; Lin, L.; Wang, M. Rotate3D: Representing Relations as Rotations in Three-Dimensional Space for Knowledge Graph Embedding. In Proceedings of the 29th ACM International Conference on Information & Knowledge Management, Virtual, 19–23 October 2020; pp. 385–394. [Google Scholar]
  45. Tifrea, A.; Bécigneul, G.; Ganea, O.E. Poincaré GloVe: Hyperbolic Word Embeddings. In Proceedings of the International Conference on Learning Representations, New Orleans, LA, USA, 6–9 May 2019. [Google Scholar]
  46. Chami, I.; Ying, Z.; Ré, C.; Leskovec, J. Hyperbolic graph convolutional neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8–14 December 2019; pp. 4869–4880. [Google Scholar]
  47. Trivedi, R.; Dai, H.; Wang, Y.; Song, L. Know-Evolve: Deep Temporal Reasoning for Dynamic Knowledge Graphs. In Proceedings of the 34th International Conference on Machine Learning, ICML 2017, Sydney, NSW, Australia, 6–11 August 2017; Precup, D., Teh, Y.W., Eds.; Proceedings of Machine Learning Research: New York, NY, USA, 2017; Volume 70, pp. 3462–3471. [Google Scholar]
  48. Lautenschlager, J.; Shellman, S.; Ward, M. ICEWS Events and Aggregations. Harv. Dataverse 2015, 3. [Google Scholar]
  49. Mahdisoltani, F.; Biega, J.; Suchanek, F.M. YAGO3: A Knowledge Base from Multilingual Wikipedias. In Proceedings of the CIDR 2015, Seventh Biennial Conference on Innovative Data Systems Research, Asilomar, CA, USA, 4–7 January 2015. [Google Scholar]
  50. Goel, R.; Kazemi, S.M.; Brubaker, M.; Poupart, P. Diachronic embedding for temporal knowledge graph completion. Proc. AAAI Conf. Artif. Intell. 2020, 34, 3988–3995. [Google Scholar] [CrossRef]
  51. Zhang, S.; Tay, Y.; Yao, L.; Liu, Q. Quaternion knowledge graph embeddings. In Proceedings of the Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8–14 December 2019; pp. 2735–2745. [Google Scholar]
  52. Xu, C.; Nayyeri, M.; Chen, Y.Y.; Lehmann, J. Geometric Algebra Based Embeddings for Static and Temporal Knowledge Graph Completion. IEEE Trans. Knowl. Data Eng. 2023, 35, 4838–4851. [Google Scholar] [CrossRef]
Figure 1. Ablation studies on ICEWS14 and ICEWS05-15.
Figure 2. Data sparsity evaluation on ICEWS14.
Figure 3. Data noise evaluation on ICEWS14.
Table 1. Statistics of the four experimented datasets.

| Dataset | ICEWS14 | ICEWS05-15 | YAGO11k | GDELT |
|---|---|---|---|---|
| Entities | 7128 | 10,488 | 10,623 | 500 |
| Relations | 230 | 251 | 10 | 20 |
| Time Stamps | 365 | 4017 | 70 | 366 |
| Train | 72,826 | 386,962 | 16,408 | 2,735,685 |
| Validation | 8941 | 46,275 | 2050 | 341,961 |
| Test | 8963 | 46,092 | 2051 | 341,961 |
Table 2. Results on the link prediction task over the four experimented datasets. The best score is in bold.

ICEWS14 (left) and ICEWS05-15 (right):

| Method | Hits@1 | Hits@3 | Hits@10 | MRR | Hits@1 | Hits@3 | Hits@10 | MRR |
|---|---|---|---|---|---|---|---|---|
| TransE [22] | 0.094 | - | 0.637 | 0.280 | 0.090 | - | 0.663 | 0.294 |
| DistMult [26] | 0.323 | - | 0.672 | 0.439 | 0.337 | - | 0.691 | 0.456 |
| RotatE [25] | 0.291 | 0.478 | 0.690 | 0.418 | 0.164 | 0.355 | 0.595 | 0.304 |
| QuatE [51] | 0.353 | 0.530 | 0.712 | 0.471 | 0.370 | 0.529 | 0.727 | 0.482 |
| TTransE [38] | 0.074 | - | 0.601 | 0.255 | 0.084 | - | 0.616 | 0.271 |
| HyTE [39] | 0.108 | 0.416 | 0.655 | 0.297 | 0.116 | 0.445 | 0.681 | 0.316 |
| TA-DistMult [40] | 0.363 | - | 0.686 | 0.477 | 0.346 | - | 0.728 | 0.474 |
| DE-SimplE [50] | 0.418 | 0.592 | 0.725 | 0.526 | 0.392 | 0.578 | 0.748 | 0.513 |
| ATiSE [41] | 0.436 | 0.629 | 0.750 | 0.550 | 0.378 | 0.606 | 0.794 | 0.519 |
| TeRo [7] | 0.468 | 0.621 | 0.732 | 0.562 | 0.469 | 0.668 | 0.795 | 0.586 |
| RotateQVS [10] | 0.507 | 0.642 | 0.754 | 0.591 | 0.529 | 0.709 | 0.813 | 0.633 |
| TGeomE++ [52] | 0.546 | 0.680 | 0.780 | 0.629 | 0.605 | 0.736 | 0.833 | 0.686 |
| ATGIE (ours) | **0.563** | **0.691** | **0.797** | **0.642** | **0.616** | **0.749** | **0.840** | **0.691** |

YAGO11k (left) and GDELT (right):

| Method | Hits@1 | Hits@3 | Hits@10 | MRR | Hits@1 | Hits@3 | Hits@10 | MRR |
|---|---|---|---|---|---|---|---|---|
| TransE | 0.015 | 0.138 | 0.244 | 0.100 | 0.0 | 0.158 | 0.312 | 0.113 |
| DistMult | 0.107 | 0.161 | 0.268 | 0.158 | 0.117 | 0.208 | 0.348 | 0.196 |
| RotatE | 0.103 | 0.167 | 0.305 | 0.167 | - | - | - | - |
| QuatE | 0.107 | 0.148 | 0.270 | 0.164 | - | - | - | - |
| TTransE | 0.020 | 0.150 | 0.251 | 0.108 | 0.0 | 0.160 | 0.318 | 0.115 |
| HyTE | 0.015 | 0.143 | 0.272 | 0.105 | 0.0 | 0.165 | 0.326 | 0.118 |
| TA-DistMult | 0.103 | 0.171 | 0.292 | 0.161 | 0.124 | 0.219 | 0.365 | 0.206 |
| DE-SimplE | - | - | - | - | 0.141 | 0.248 | 0.403 | 0.230 |
| ATiSE | 0.110 | 0.171 | 0.288 | 0.170 | - | - | - | - |
| TeRo | 0.121 | 0.197 | 0.319 | 0.187 | - | - | - | - |
| RotateQVS | 0.124 | 0.199 | 0.323 | 0.189 | 0.175 | 0.293 | 0.458 | 0.270 |
| TGeomE++ | 0.130 | 0.196 | 0.326 | 0.195 | - | - | - | - |
| ATGIE (ours) | **0.155** | **0.232** | **0.357** | **0.223** | **0.198** | **0.313** | **0.480** | **0.294** |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Zhao, X.; Miao, J.; Yang, F.; Pang, S. Geometry Interaction Embeddings for Interpolation Temporal Knowledge Graph Completion. Mathematics 2024, 12, 2022. https://doi.org/10.3390/math12132022
