WWW 2023 KRACL: Contrastive Learning with Graph Context Modeling for Sparse Knowledge Graph Completion

Key idea: negative samples and loss functions

  1. Abstract
    1. Task: predict entities that appear less frequently in knowledge graphs.
    2. Contribution 1: propose a knowledge relational attention network that leverages graph context by simultaneously projecting neighboring triples into different latent spaces and jointly aggregating messages with an attention mechanism (sketched below)
    3. Contribution 2: propose a knowledge contrastive loss that combines contrastive loss with cross-entropy loss, introducing more negative samples and thus richer feedback for sparse entities
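A minimal sketch of how the relation-aware attention aggregation in Contribution 1 could look, as I understand it (my own PyTorch simplification, not the paper's exact architecture; `RelAttnAggregator` and its signature are hypothetical): each neighboring triple is projected by a relation-specific matrix into its own latent space, then messages are pooled with attention weights computed against the center entity.

```python
# Hypothetical sketch of relation-aware attention aggregation, assuming
# one projection matrix per relation and softmax attention over neighbors.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelAttnAggregator(nn.Module):
    def __init__(self, num_relations: int, dim: int):
        super().__init__()
        # One projection per relation: neighboring triples are mapped
        # to different latent spaces depending on their relation type.
        self.rel_proj = nn.Parameter(torch.randn(num_relations, dim, dim) * 0.02)
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, center, nbr_ent, nbr_rel):
        # center:  (dim,)     embedding of the center entity
        # nbr_ent: (n, dim)   embeddings of neighboring entities
        # nbr_rel: (n,) long  relation ids of the connecting edges
        msgs = torch.einsum('nd,nde->ne', nbr_ent, self.rel_proj[nbr_rel])
        scores = self.attn(torch.cat([center.expand_as(msgs), msgs], dim=-1))
        alpha = F.softmax(scores.squeeze(-1), dim=0)    # attention over neighbors
        return (alpha.unsqueeze(-1) * msgs).sum(dim=0)  # aggregated graph context

agg = RelAttnAggregator(num_relations=10, dim=16)
out = agg(torch.randn(16), torch.randn(5, 16), torch.randint(0, 10, (5,)))
```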
  2. Intro
    1. Motivation: predicting entities that rarely appear in knowledge graphs is still challenging. They investigate the relation between entity in-degree and link prediction performance (maybe in-degree is a point I can investigate more; quick snippet below)
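Since entity in-degree keeps coming up, a quick way to compute it from raw (head, relation, tail) triples, just to have on hand for my own analysis (plain Python, nothing paper-specific):

```python
# Count entity in-degrees from (head, relation, tail) triples; entities
# with low in-degree are the "sparse" ones the paper targets.
from collections import Counter

def in_degree(triples):
    return Counter(t for _, _, t in triples)

triples = [("a", "r1", "b"), ("c", "r1", "b"), ("a", "r2", "c")]
print(in_degree(triples))  # Counter({'b': 2, 'c': 1})
```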
  3. Methodology
  4. Experiments
    1. sparse knowledge graphs performance
    2. Entity In-degree Analysis
    3. Combination of Different GNN Encoder and Projection Head
  5. Conclusion
    1. They present the KRACL model to alleviate the widespread sparsity problem in knowledge graph completion
    2. introduce a knowledge contrastive loss that brings in more negative samples, hence more feedback is provided to sparse entities (rough sketch of how this could be wired up below)
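How the knowledge contrastive loss might combine a contrastive term with cross entropy, as a rough sketch of my InfoNCE-style reading (not the paper's exact formulation; the weight `lam` and temperature `tau` are assumed hyperparameters, and every other entity in the vocabulary acts as a negative):

```python
# Hypothetical sketch: InfoNCE-style contrastive term over all candidate
# entities plus a standard cross-entropy term, mixed by `lam` (assumed).
import torch
import torch.nn.functional as F

def knowledge_contrastive_loss(query, ent_emb, target, tau=0.1, lam=0.5):
    # query:   (b, d)  context embedding of each (head, relation) pair
    # ent_emb: (E, d)  embeddings of all candidate entities
    # target:  (b,)    indices of the gold tail entities
    # Cross-entropy term over unnormalized scores.
    ce = F.cross_entropy(query @ ent_emb.t(), target)
    # Contrastive term: cosine similarities with temperature, so every
    # non-gold entity serves as a negative sample for the gold tail.
    sim = F.normalize(query, dim=-1) @ F.normalize(ent_emb, dim=-1).t()
    nce = F.cross_entropy(sim / tau, target)
    return lam * nce + (1 - lam) * ce
```

Because each batch scores the query against all entities, even rarely-linked entities get gradient on every step, which would match the "more feedback to sparse entities" claim.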
