Knowledge graphs (KGs) require continual updates as new information emerges, but neural embedding models suffer from catastrophic forgetting when trained on new tasks sequentially. We evaluate Elastic Weight Consolidation (EWC), a regularization-based continual learning method, on KG link prediction with TransE embeddings on FB15k-237. Across experiments with five random seeds, EWC reduces catastrophic forgetting from 12.62% to 6.85%, a 45.7% relative reduction over naive sequential training. We also find that the task partitioning strategy affects the magnitude of forgetting: relation-based partitioning (grouping triples by relation type) yields forgetting 9.8 percentage points higher than random partitioning (12.62% vs. 2.81%), indicating that task construction influences evaluation outcomes. Although our study covers a single embedding model and dataset, the results demonstrate that EWC effectively mitigates catastrophic forgetting in KG continual learning and underscore the importance of evaluation protocol design.
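
To make the mechanism concrete, the sketch below shows how an EWC penalty can attach to TransE training in PyTorch. This is a minimal illustration under assumed settings, not the paper's implementation: the embedding dimension, margin, and `ewc_lambda` are placeholder values, and `fisher_diagonal` and `ewc_penalty` are hypothetical helper names.

```python
import torch
import torch.nn as nn

class TransE(nn.Module):
    def __init__(self, n_entities, n_relations, dim=50):  # dim is a placeholder
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)

    def score(self, h, r, t):
        # TransE scores a triple by the L2 distance ||h + r - t||; lower is more plausible.
        return (self.ent(h) + self.rel(r) - self.ent(t)).norm(p=2, dim=-1)

def fisher_diagonal(model, old_batches, margin=1.0):
    # Squared-gradient proxy for the diagonal Fisher information,
    # accumulated over batches from the task that just finished.
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for h, r, t, h_neg, t_neg in old_batches:
        model.zero_grad()
        loss = torch.relu(margin + model.score(h, r, t)
                          - model.score(h_neg, r, t_neg)).sum()
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    n_batches = max(len(old_batches), 1)
    return {n: f / n_batches for n, f in fisher.items()}

def ewc_penalty(model, fisher, anchor, ewc_lambda=100.0):  # ewc_lambda is illustrative
    # EWC regularizer: (lambda / 2) * sum_i F_i * (theta_i - theta*_i)^2,
    # where theta* are the parameters saved at the end of the old task.
    penalty = sum((fisher[n] * (p - anchor[n]) ** 2).sum()
                  for n, p in model.named_parameters())
    return 0.5 * ewc_lambda * penalty

# Sequential-training skeleton: after task 1, snapshot parameters and Fisher,
# then add the EWC penalty to the ranking loss while training on task 2.
model = TransE(n_entities=14541, n_relations=237)  # standard FB15k-237 sizes
dummy = tuple(torch.randint(0, 237 if i == 1 else 14541, (32,))
              for i in range(5))                   # toy (h, r, t, h_neg, t_neg) batch
anchor = {n: p.detach().clone() for n, p in model.named_parameters()}
fisher = fisher_diagonal(model, [dummy])

model.zero_grad()
h, r, t, h_neg, t_neg = dummy
loss = (torch.relu(1.0 + model.score(h, r, t) - model.score(h_neg, r, t_neg)).sum()
        + ewc_penalty(model, fisher, anchor))
loss.backward()
```

In the sequential setting evaluated above, the parameter snapshot and Fisher estimate would be refreshed at the end of each task, so that training on later tasks is pulled back toward the parameters most important for earlier ones.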