Few-shot relational learning on knowledge graphs (KGs) aims to perform reasoning over relations given only a few training examples. While current methods focus primarily on leveraging relation-specific information, the rich semantics inherent in KGs have been largely overlooked. To bridge this gap, we propose PromptMeta, a novel prompted meta-learning framework that seamlessly integrates meta-semantics with relational information for few-shot relational learning. PromptMeta introduces two core innovations: (1) a Meta-Semantic Prompt (MSP) pool that learns and consolidates high-level meta-semantics shared across tasks, enabling effective knowledge transfer and adaptation to newly emerging relations; and (2) a learnable fusion mechanism that dynamically combines meta-semantics with task-specific relational information, tailored to different few-shot tasks. Both components are optimized jointly with model parameters within a meta-learning framework. Extensive experiments and analyses on two real-world KG benchmarks validate the effectiveness of PromptMeta in adapting to new relations with limited supervision.
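For intuition only, below is a minimal PyTorch sketch of how a shared prompt pool and a learnable fusion gate might be wired together. The class name, pool size, attention-based retrieval, and sigmoid gating are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaSemanticPromptPool(nn.Module):
    """Hypothetical sketch: a pool of learnable meta-semantic prompts.

    A task-specific relation embedding attends over the pool to retrieve
    shared meta-semantics; a learnable gate then fuses the retrieved
    semantics with the relation embedding.
    """

    def __init__(self, num_prompts: int = 16, dim: int = 128):
        super().__init__()
        # Learnable prompt vectors shared across few-shot tasks.
        self.prompts = nn.Parameter(torch.randn(num_prompts, dim) * 0.02)
        # Gate producing per-dimension fusion weights.
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, rel_emb: torch.Tensor) -> torch.Tensor:
        # rel_emb: (batch, dim) relation embedding derived from the
        # support set of a few-shot task.
        scores = rel_emb @ self.prompts.T / rel_emb.size(-1) ** 0.5
        attn = F.softmax(scores, dim=-1)          # (batch, num_prompts)
        meta = attn @ self.prompts                # (batch, dim) retrieved meta-semantics
        g = torch.sigmoid(self.gate(torch.cat([rel_emb, meta], dim=-1)))
        return g * meta + (1 - g) * rel_emb       # gated fusion


# Usage: fuse retrieved meta-semantics into relation embeddings for a batch of tasks.
pool = MetaSemanticPromptPool(num_prompts=16, dim=128)
rel = torch.randn(4, 128)   # relation embeddings for 4 few-shot tasks
fused = pool(rel)           # (4, 128)
```

In a full meta-learning loop, both `self.prompts` and `self.gate` would be updated jointly with the rest of the model parameters across training episodes, mirroring the joint optimization described above.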