KGEPapers

Must-read papers on Knowledge Graph Embedding (KGE)

Translational Distance Models

  1. Learning Structured Embeddings of Knowledge Bases. AAAI 2011. [Paper]
    Antoine Bordes, Jason Weston, Ronan Collobert, Yoshua Bengio.

    This paper proposes Structured Embeddings (SE), which assumes that the head and tail entities of a valid triple are close to each other after being projected into a relation-specific subspace: f_r(h, t) = ||M_{r,1} h - M_{r,2} t||.
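
    A minimal NumPy sketch of the SE score in its commonly cited form, assuming two relation-specific projection matrices and an L1 distance (the function and argument names are illustrative):

    ```python
    import numpy as np

    def se_score(h, t, M_r1, M_r2):
        """Structured Embeddings: project head and tail with their own
        relation-specific matrices; a lower distance means a more plausible triple."""
        return np.linalg.norm(M_r1 @ h - M_r2 @ t, ord=1)
    ```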

  2. Translating Embeddings for Modeling Multi-relational Data. NIPS 2013. [Paper]
    Antoine Bordes, Nicolas Usunier, Alberto Garcia-Duran, Jason Weston, Oksana Yakhnenko.

    This paper proposes TransE, which models relations as translation operations between head and tail entities: h + r ≈ t.
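
    A minimal sketch of the TransE distance-based score, assuming plain NumPy vectors and an L1 norm (the paper also uses L2; names are illustrative):

    ```python
    import numpy as np

    def transe_score(h, r, t, p=1):
        """TransE: a valid triple should satisfy h + r ≈ t, so the score is the
        L_p distance between the translated head and the tail (lower is better)."""
        return np.linalg.norm(h + r - t, ord=p)
    ```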

  3. Knowledge Graph Embedding by Translating on Hyperplanes. AAAI 2014. [Paper]
    Zhen Wang, Jianwen Zhang, Jianlin Feng, Zheng Chen.

    This paper proposes TransH to better model relations with 1-to-N, N-to-1, and N-to-N mapping properties. It interprets a relation as: 1) a relation-specific hyperplane; 2) a translation operation between the head and tail entities projected onto that hyperplane: (h - w_r^T h w_r) + d_r ≈ (t - w_r^T t w_r).
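
    A sketch of the TransH score, assuming w_r is the unit normal of the relation hyperplane and d_r the translation vector on it:

    ```python
    import numpy as np

    def transh_score(h, t, w_r, d_r):
        """TransH: project head and tail onto the relation-specific hyperplane
        (unit normal w_r), then translate by d_r within that hyperplane."""
        h_proj = h - (w_r @ h) * w_r   # component of h lying on the hyperplane
        t_proj = t - (w_r @ t) * w_r
        return np.linalg.norm(h_proj + d_r - t_proj)
    ```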

  4. Learning Entity and Relation Embeddings for Knowledge Graph Completion. AAAI 2015. [Paper]
    Yankai Lin, Zhiyuan Liu, Maosong Sun, Yang Liu, Xuan Zhu.

    This paper proposes TransR/CTransR, which interprets a relation as: 1) a relation-specific space; 2) a translation operation between the head and tail entities after they are projected into that space: M_r h + r ≈ M_r t.
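
    A sketch of the TransR score, assuming a single projection matrix M_r that maps entity vectors into the relation-specific space:

    ```python
    import numpy as np

    def transr_score(h, r, t, M_r):
        """TransR: map both entities into the relation space with M_r,
        then require a translation by r in that space."""
        return np.linalg.norm(M_r @ h + r - M_r @ t)
    ```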

  5. Knowledge Graph Embedding via Dynamic Mapping Matrix. ACL 2015. [Paper]
    Guoliang Ji, Shizhu He, Liheng Xu, Kang Liu, Jun Zhao.

    This paper proposes TransD to improve TransR/CTransR. It uses two vectors to represent each entity and each relation: the first represents the meaning of the entity (or relation), and the other is used to construct the mapping matrix dynamically. Compared with TransR/CTransR, TransD considers the diversity of not only relations but also entities, has fewer parameters, and requires no matrix-vector multiplication.
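
    A sketch of the TransD projection computed without materializing the mapping matrix M = r_p e_p^T + I, which is the point of the dynamic construction (entity and relation dimensions may differ; names are illustrative):

    ```python
    import numpy as np

    def transd_project(e, e_p, r_p):
        """Apply the dynamic mapping M = r_p e_p^T + I to entity e without
        building M: M e = (e_p . e) r_p + e resized to the relation dimension."""
        d = len(r_p)
        e_resized = np.zeros(d)
        k = min(d, len(e))
        e_resized[:k] = e[:k]
        return (e_p @ e) * r_p + e_resized

    def transd_score(h, h_p, r, r_p, t, t_p):
        """TransD score: translate the projected head by r toward the projected tail."""
        return np.linalg.norm(transd_project(h, h_p, r_p) + r - transd_project(t, t_p, r_p))
    ```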

  6. Learning to Represent Knowledge Graphs with Gaussian Embedding. CIKM 2015. [Paper]
    Shizhu He, Kang Liu, Guoliang Ji and Jun Zhao.

    This paper proposes KG2E to explicitly model the certainty of entities and relations by learning representations of KGs in the space of multi-dimensional Gaussian distributions. Each entity/relation is represented by a Gaussian distribution, where the mean denotes its position and the covariance (restricted to be diagonal) represents its certainty.

  7. Modeling Relation Paths for Representation Learning of Knowledge Bases. EMNLP 2015. [Paper]
    Yankai Lin, Zhiyuan Liu, Huanbo Luan, Maosong Sun, Siwei Rao, Song Liu.

    This paper proposes PTransE, which takes relation paths into account as translations between entities for representation learning, and addresses two key challenges: (1) it designs a path-constraint resource allocation algorithm to measure the reliability of relation paths; (2) it represents relation paths via semantic composition of relation embeddings.

  8. Composing Relationships with Translations. EMNLP 2015. [Paper]
    Alberto García-Durán, Antoine Bordes, Nicolas Usunier.

    This paper proposes RTransE, an extension of TransE that learns to explicitly model the composition of relationships via the addition of their corresponding translation vectors.

  9. From One Point to A Manifold: Knowledge Graph Embedding For Precise Link Prediction. IJCAI 2016. [Paper]
    Han Xiao, Minlie Huang, Xiaoyan Zhu.

    This paper proposes a manifold-based embedding principle (ManifoldE), which can be treated as a well-posed algebraic system that expands the point-wise modeling of current models to manifold-wise modeling. The score function measures the distance of a triple from its manifold.

  10. A Generative Mixture Model for Knowledge Graph Embedding. ACL 2016. [Paper]
    Han Xiao, Minlie Huang, Xiaoyan Zhu.

    This paper proposes TransG to address the issue of multiple relation semantics: a relation may have multiple meanings revealed by the entity pairs associated with its triples. TransG can discover the latent semantics of a relation and leverages a mixture of relation-specific component vectors to embed a fact triple.

  11. Knowledge Graph Completion with Adaptive Sparse Transfer Matrix. AAAI 2016. [Paper]
    Guoliang Ji, Kang Liu, Shizhu He, Jun Zhao.

    This paper proposes TranSparse to model the heterogeneity (some relations link many entity pairs and others do not) and the imbalance (the number of head entities and that of tail entities in a relation could be different) of knowledge graphs. In TranSparse, transfer matrices are replaced by adaptive sparse matrices, whose sparse degrees are determined by the number of entities (or entity pairs) linked by relations.

  12. Knowledge Graph Embedding on a Lie Group. AAAI 2018. [Paper]
    Takuma Ebisu, Ryutaro Ichise.

    This paper proposes TorusE to solve the regularization problem of TransE. The principle of TransE can be defined on any Lie group, and a torus, which is a compact Lie group, can be chosen as the embedding space to avoid regularization.

  13. Knowledge Graph Embedding by Relational Rotation in Complex Space. ICLR 2019. [Paper] [Code]
    Zhiqing Sun, Zhi-Hong Deng, Jian-Yun Nie, Jian Tang.

    This paper proposes RotatE to model the three relation patterns: symmetry/antisymmetry, inversion, and composition. RotatE defines each relation as a rotation from the source entity to the target entity in the complex vector space.
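
    A sketch of the RotatE score, assuming complex-valued entity embeddings and a relation given by element-wise phases (so each relation component has modulus 1):

    ```python
    import numpy as np

    def rotate_score(h, r_phase, t):
        """RotatE: rotate each component of the complex head embedding by the
        relation phase and measure the distance to the tail (lower is better)."""
        r = np.exp(1j * r_phase)          # unit-modulus rotation per dimension
        return np.sum(np.abs(h * r - t))  # L1-style distance over complex components
    ```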

  14. Relation Embedding with Dihedral Group in Knowledge Graph. ACL 2019. [Paper]
    Canran Xu, Ruijiang Li.

    This paper proposes DihEdral, named after the dihedral symmetry group. The model learns knowledge graph embeddings that capture relation compositions by nature. Furthermore, DihEdral parametrizes the relation embeddings with discrete values, thereby drastically reducing the solution space.

  15. Multi-relational Poincaré Graph Embeddings. NeurIPS 2019. [Paper]
    Ivana Balaževic, Carl Allen, Timothy Hospedales.

    This paper proposes MuRP to capture multiple simultaneous hierarchies. MuRP embeds multi-relational graph data in the Poincaré ball model of hyperbolic space.

  16. Quaternion Knowledge Graph Embeddings. NeurIPS 2019. [Paper]
    Shuai Zhang, Yi Tay, Lina Yao, Qi Liu.

    This paper proposes QuatE to model relations as rotations in the quaternion space.

  17. Learning Hierarchy-Aware Knowledge Graph Embeddings for Link Prediction. AAAI 2020. [Paper] [Code]
    Zhanqiu Zhang, Jianyu Cai, Yongdong Zhang, Jie Wang.

    This paper proposes a novel knowledge graph embedding model---namely, Hierarchy-Aware Knowledge Graph Embedding (HAKE)---which maps entities into the polar coordinate system. HAKE is inspired by the fact that concentric circles in the polar coordinate system can naturally reflect the hierarchy.
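
    A sketch of the HAKE score in its usual two-part formulation, where the modulus part captures depth in the hierarchy and the phase part separates entities at the same level (lam is a weight balancing the two parts; names are illustrative):

    ```python
    import numpy as np

    def hake_score(h_mod, h_phase, r_mod, r_phase, t_mod, t_phase, lam=1.0):
        """HAKE: combine a modulus distance (hierarchy level) with a phase
        distance (same-level distinction); a higher score is more plausible."""
        dist_mod = np.linalg.norm(h_mod * r_mod - t_mod, ord=2)
        dist_phase = np.linalg.norm(np.sin((h_phase + r_phase - t_phase) / 2), ord=1)
        return -(dist_mod + lam * dist_phase)
    ```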

Semantic Matching Models

  1. RESCAL: A Three-Way Model for Collective Learning on Multi-Relational Data. ICML 2011. [Paper]
    Maximilian Nickel, Volker Tresp, Hans-Peter Kriegel.

    This paper proposes RESCAL, which performs relational learning based on the factorization of a three-way tensor. RESCAL is able to perform collective learning via the latent components of the model, and the paper provides an efficient algorithm to compute the factorization.
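
    A sketch of the RESCAL triple score, where each relation owns a full d x d interaction matrix:

    ```python
    import numpy as np

    def rescal_score(h, M_r, t):
        """RESCAL: bilinear score h^T M_r t; M_r captures pairwise interactions
        between the latent components of head and tail."""
        return h @ M_r @ t
    ```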

  2. A Latent Factor Model for Highly Multi-relational Data. NIPS 2012. [Paper]
    Rodolphe Jenatton, Nicolas L. Roux, Antoine Bordes, Guillaume R. Obozinski.

    This paper proposes LFM for modeling large multi-relational datasets, with possibly thousands of relations. LFM is based on a bilinear structure, which captures various orders of interaction of the data, and also shares sparse latent factors across different relations.

  3. Embedding Entities and Relations for Learning and Inference in Knowledge Bases. ICLR 2015. [Paper]
    Bishan Yang, Wen-tau Yih, Xiaodong He, Jianfeng Gao, Li Deng.

    This paper proposes DistMult to simplify RESCAL by restricting the semantic matching matrices to diagonal matrices.
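
    A sketch of the DistMult score, i.e. RESCAL with a diagonal relation matrix; note that this form is symmetric in head and tail, so it cannot distinguish antisymmetric relations:

    ```python
    import numpy as np

    def distmult_score(h, r, t):
        """DistMult: three-way element-wise product, equivalent to h^T diag(r) t."""
        return np.sum(h * r * t)
    ```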

  4. Holographic Embeddings of Knowledge Graphs. AAAI 2016. [Paper]
    Maximilian Nickel, Lorenzo Rosasco, Tomaso A. Poggio.

    This paper proposes holographic embeddings (HolE), which employ circular correlation to create compositional representations.
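
    A sketch of the HolE score, computing the circular correlation of head and tail via the FFT and matching the result against the relation vector:

    ```python
    import numpy as np

    def hole_score(h, r, t):
        """HolE: score = r . corr(h, t), where corr is circular correlation,
        computed here as ifft(conj(fft(h)) * fft(t))."""
        corr = np.fft.ifft(np.conj(np.fft.fft(h)) * np.fft.fft(t)).real
        return r @ corr
    ```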

  5. Complex Embeddings for Simple Link Prediction. ICML 2016. [Paper]
    Théo Trouillon, Johannes Welbl, Sebastian Riedel, Éric Gaussier, Guillaume Bouchard.

    This paper proposes ComplEx, which embeds entities and relations in a complex space to effectively capture antisymmetric relations.
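
    A sketch of the ComplEx score, assuming complex-valued embeddings; taking the real part of the trilinear product with the conjugated tail is what breaks the symmetry of DistMult:

    ```python
    import numpy as np

    def complex_score(h, r, t):
        """ComplEx: Re(<h, r, conj(t)>); swapping h and t conjugates the product,
        so antisymmetric relations can receive different scores."""
        return np.real(np.sum(h * r * np.conj(t)))
    ```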

  6. Knowledge Graph Completion via Complex Tensor Factorization. JMLR 2017. [Paper]
    Théo Trouillon, Christopher R. Dance, Johannes Welbl, Sebastian Riedel, Éric Gaussier, Guillaume Bouchard.

    This paper is the JMLR version of ComplEx.

  7. Analogical Inference for Multi-relational Embeddings. ICML 2017. [Paper]
    Hanxiao Liu, Yuexin Wu, Yiming Yang.

    This paper proposes ANALOGY to model analogical properties of the embedded entities and relations.

  8. On Multi-Relational Link Prediction with Bilinear Models. AAAI 2018. [Paper]
    Yanjie Wang, Rainer Gemulla, Hui Li.

    This paper explores the expressiveness of and the connections between various bilinear models proposed in the literature.

  9. Canonical Tensor Decomposition for Knowledge Base Completion. ICML 2018. [Paper] [Code]
    Timothée Lacroix, Nicolas Usunier, Guillaume Obozinski.

    This paper motivates and tests a novel regularizer, N3, based on tensor nuclear p-norms. It then presents a reformulation of the problem that makes it invariant to arbitrary choices in the inclusion of predicates or their reciprocals in the dataset.
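
    A sketch of the N3 penalty as it is commonly implemented: a weighted sum of cubed absolute values of the factors appearing in each training triple, used as a tractable per-triple surrogate for the tensor nuclear 3-norm (the weight value is illustrative):

    ```python
    import numpy as np

    def n3_penalty(h, r, t, weight=1e-2):
        """N3 regularizer (per-triple surrogate): weight * sum_i |x_i|^3
        over the head, relation, and tail factors of the triple."""
        return weight * sum(np.sum(np.abs(x) ** 3) for x in (h, r, t))
    ```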

  10. SimplE Embedding for Link Prediction in Knowledge Graphs. NeurIPS 2018. [Paper]
    Seyed Mehran Kazemi, David Poole.

    This paper presents a simple enhancement of CP (called SimplE) to allow the two embeddings of each entity to be learned dependently.
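
    A sketch of the SimplE score, assuming each entity i keeps a head embedding h_i and a tail embedding t_i, and each relation has a forward embedding r and an inverse embedding r_inv:

    ```python
    import numpy as np

    def simple_score(h_i, t_i, h_j, t_j, r, r_inv):
        """SimplE: average the CP score of (i, r, j) and of the reciprocal triple
        (j, r_inv, i), which ties each entity's two embeddings together."""
        return 0.5 * (np.sum(h_i * r * t_j) + np.sum(h_j * r_inv * t_i))
    ```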

  11. Tensor Factorization for Knowledge Graph Completion. EMNLP-IJCNLP 2019. [Paper]
    Ivana Balazevic, Carl Allen, Timothy Hospedales.

    This paper proposes TuckER, a relatively straightforward but powerful linear model based on Tucker decomposition of the binary tensor representation of knowledge graph triples.
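
    A sketch of the TuckER score: the shared core tensor W (shape d_e x d_r x d_e) is contracted with the head, relation, and tail embeddings:

    ```python
    import numpy as np

    def tucker_score(h, r, t, W):
        """TuckER: score = W x_1 h x_2 r x_3 t, written as one tensor contraction."""
        return np.einsum('i,j,k,ijk->', h, r, t, W)
    ```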

Neural Network Models

  1. Reasoning With Neural Tensor Networks for Knowledge Base Completion. NIPS 2013. [Paper]
    Richard Socher, Danqi Chen, Christopher D. Manning, Andrew Ng.

    This paper introduces an expressive Neural Tensor Network (NTN), which is suitable for reasoning over relationships between two entities.
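
    A sketch of the NTN score, assuming k tensor slices in W_r (shape k x d x d), a linear layer V_r over the concatenated entities, a bias b_r, and an output vector u_r (names are illustrative):

    ```python
    import numpy as np

    def ntn_score(h, t, W_r, V_r, b_r, u_r):
        """NTN: combine a bilinear tensor term (one value per slice of W_r)
        with a standard linear term, apply tanh, then project with u_r."""
        bilinear = np.einsum('i,kij,j->k', h, W_r, t)
        linear = V_r @ np.concatenate([h, t]) + b_r
        return u_r @ np.tanh(bilinear + linear)
    ```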

  2. Embedding Projection for Knowledge Graph Completion. AAAI 2017. [Paper]
    Baoxu Shi, Tim Weninger.

    This paper presents a shared-variable neural network model called ProjE that fills in missing information in a knowledge graph by learning joint embeddings of the graph's entities and edges, combined with subtle but important changes to the standard loss function.

  3. ConvE: Convolutional 2D Knowledge Graph Embeddings. AAAI 2018. [Paper] [Code]
    Tim Dettmers, Pasquale Minervini, Pontus Stenetorp, Sebastian Riedel.

  4. ConvKB: A Novel Embedding Model for Knowledge Base Completion Based on Convolutional Neural Network. NAACL-HLT 2018. [Paper] [Code]
    Dai Quoc Nguyen, Tu Dinh Nguyen, Dat Quoc Nguyen, Dinh Phung.

  5. R-GCN: Modeling Relational Data with Graph Convolutional Networks. ESWC 2018. [Paper]
    Michael Schlichtkrull, Thomas N. Kipf, Peter Bloem, Rianne van den Berg, Ivan Titov, Max Welling.

  6. KBGAT: Learning Attention-based Embeddings for Relation Prediction in Knowledge Graphs. ACL 2019. [Paper] [Code]
    Deepak Nathani, Jatin Chauhan, Charu Sharma, Manohar Kaul.

  7. RSN: Learning to Exploit Long-term Relational Dependencies in Knowledge Graphs. ICML 2019. [Paper] [Code]
    Lingbing Guo, Zequn Sun, Wei Hu.

  8. CapsE: A Capsule Network-based Embedding Model for Knowledge Graph Completion and Search Personalization. NAACL-HLT 2019. [Paper] [Code]
    Dai Quoc Nguyen, Thanh Vu, Tu Dinh Nguyen, Dat Quoc Nguyen, Dinh Q. Phung.

  9. InteractE: Improving Convolution-based Knowledge Graph Embeddings by Increasing Feature Interactions. AAAI 2020. [Paper]
    Shikhar Vashishth, Soumya Sanyal, Vikram Nitin, Nilesh Agrawal, Partha Talukdar.

Training and Evaluation

  1. You CAN Teach an Old Dog New Tricks! On Training Knowledge Graph Embeddings. ICLR 2020. [Paper] [Code]
    Daniel Ruffinelli, Samuel Broscheit, Rainer Gemulla.

  2. A Re-evaluation of Knowledge Graph Completion Methods. ACL 2020. [Paper] [Code]
    Zhiqing Sun, Shikhar Vashishth, Soumya Sanyal, Partha Talukdar, Yiming Yang.