Key idea: negative samples and loss functions
- Abstract
- Task: predict entities that appear less frequently in knowledge graphs.
- Contribution 1: propose a knowledge relational attention network that leverages graph context by projecting neighboring triples into different latent spaces and jointly aggregating the messages with an attention mechanism (sketch after the Abstract notes)
- Contribution 2: propose a knowledge contrastive loss that combines contrastive loss with cross-entropy loss, introducing more negative samples and thus enriching the feedback to sparse entities
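A minimal sketch of how I read Contribution 1, assuming PyTorch; the class name, the per-relation projection matrices, and the concat-based attention scoring are my own guesses at the mechanism, not the paper's actual implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationalAttentionLayer(nn.Module):
    """Toy layer: project each neighboring triple into a relation-specific
    latent space, then aggregate the messages with attention weights."""
    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        # one projection matrix per relation -> "different latent spaces"
        self.rel_proj = nn.Parameter(0.01 * torch.randn(num_relations, dim, dim))
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self,
                center: torch.Tensor,     # (d,)   center entity embedding
                neighbors: torch.Tensor,  # (n, d) neighbor entity embeddings
                rel_ids: torch.Tensor     # (n,)   relation id of each triple
                ) -> torch.Tensor:
        # project each neighbor message through its relation's matrix
        msgs = torch.einsum('nd,nde->ne', neighbors, self.rel_proj[rel_ids])
        # score each projected message against the center entity
        scores = self.attn(torch.cat([center.expand_as(msgs), msgs], dim=-1))
        alpha = F.softmax(scores, dim=0)  # (n, 1) attention weights
        # weighted sum = jointly aggregated graph context
        return (alpha * msgs).sum(dim=0)
```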
- Intro
- Motivation: predicting entities that rarely appear in knowledge graphs is still challenging. They investigate the relation between entity in-degree and link prediction performance (maybe in-degree is a point I could investigate further)
- Methodology
- Experiments
- Performance on sparse knowledge graphs
- Entity In-degree Analysis
- Combination of Different GNN Encoder and Projection Head
- Conclusion
- They present the KRACL model to alleviate the widespread sparsity problem in knowledge graph completion
- introduce the knowledge contrastive loss, which provides more negative samples and hence more feedback to sparse entities (sketch below)
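A minimal sketch of how the knowledge contrastive loss might combine an InfoNCE-style contrastive term with plain cross entropy, assuming PyTorch; the temperature `tau`, the mixing weight `lam`, and the function name are my assumptions, not the paper's exact formulation:

```python
import torch
import torch.nn.functional as F

def knowledge_contrastive_loss(query: torch.Tensor,    # (B, d) encoded (head, relation) queries
                               ent_emb: torch.Tensor,  # (N, d) embeddings of all candidate entities
                               target: torch.Tensor,   # (B,)   gold tail-entity ids
                               tau: float = 0.1,       # assumed temperature
                               lam: float = 0.5        # assumed mixing weight
                               ) -> torch.Tensor:
    scores = query @ ent_emb.t()                  # (B, N) similarity to every entity
    # contrastive term: temperature-scaled softmax in which every
    # non-gold entity acts as a negative sample
    contrastive = F.cross_entropy(scores / tau, target)
    # plain cross-entropy term on the unscaled scores
    ce = F.cross_entropy(scores, target)
    return lam * contrastive + (1 - lam) * ce
```

As I read it, the point is that with all N entities in the softmax, even a sparse entity receives gradient whenever it shows up as a negative for some other query.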