Knowledge graphs link entities through relations to provide a structured representation of real-world facts. However, they are often incomplete because they capture only a small fraction of all plausible facts. The task of knowledge graph completion via link prediction aims to overcome this limitation by inferring missing facts, represented as links between entities. Current approaches to link prediction leverage tensor factorization and/or deep learning. Factorization methods train and deploy rapidly thanks to their small number of parameters, but their underlying linear methodology limits their expressiveness. Deep learning methods are more expressive, but they are also computationally expensive and prone to overfitting because of their large number of trainable parameters. We propose Neural Powered Tucker Network (NePTuNe), a new hybrid link prediction model that couples the expressiveness of deep models with the speed and size of linear models. We demonstrate that NePTuNe achieves state-of-the-art performance on the FB15K-237 dataset and near state-of-the-art performance on the WN18RR dataset.
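To make the hybrid idea concrete, the sketch below illustrates the general form of a Tucker-style bilinear scorer combined with a neural nonlinearity. It is not the authors' exact NePTuNe architecture; the embedding dimensions d_e and d_r, the initialization, and the choice of tanh as the nonlinearity are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn

class TuckerStyleScorer(nn.Module):
    """Illustrative Tucker-style scorer with a nonlinearity.

    A minimal sketch of the hybrid idea (linear Tucker factorization
    plus a neural nonlinearity); NOT the exact NePTuNe model.
    d_e, d_r, and tanh are illustrative assumptions.
    """
    def __init__(self, num_entities, num_relations, d_e=200, d_r=30):
        super().__init__()
        self.entity = nn.Embedding(num_entities, d_e)
        self.relation = nn.Embedding(num_relations, d_r)
        # Core tensor of the Tucker decomposition (d_r x d_e x d_e).
        self.core = nn.Parameter(torch.randn(d_r, d_e, d_e) * 0.1)

    def forward(self, subject_idx, relation_idx):
        e_s = self.entity(subject_idx)        # (batch, d_e)
        w_r = self.relation(relation_idx)     # (batch, d_r)
        # Contract the core tensor with the relation embedding:
        # W_r = sum_k w_r[k] * core[k]  ->  (batch, d_e, d_e)
        W_r = torch.einsum('bk,kij->bij', w_r, self.core)
        # Bilinear interaction with the subject embedding, followed by
        # a nonlinearity (the "neural" part of the hybrid model).
        hidden = torch.tanh(torch.einsum('bi,bij->bj', e_s, W_r))
        # Score every candidate object entity at once.
        return hidden @ self.entity.weight.t()  # (batch, num_entities)
```

Because the only trainable parameters are the two embedding tables and the shared core tensor, a model of this form keeps the small parameter count and fast training of factorization methods, while the nonlinearity supplies the added expressiveness associated with deep models.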