MetAug: Meta Feature Augmentation for Contrastive Learning

Paper: MetAug: Contrastive Learning via Meta Feature Augmentation [1]
Venue: ICML 2022
Code: MetAug
一、Motivation

From the paper: "We argue that contrastive learning heavily relies on informative features, or 'hard' (positive or negative) features. The informativeness of features learned from such augmented data is limited. We perform a meta learning technique to build the augmentation generator that updates its network parameters by considering the performance of the encoder." In short, the authors argue that contrastive learning depends heavily on informative features (hard positives and negatives), and that features obtained from random data augmentation carry limited information. They therefore use a meta-learning technique to build an augmentation generator whose parameters are updated according to the encoder's performance.
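The bilevel structure of this meta update can be illustrated with a minimal scalar sketch. Everything below is a stand-in, not the paper's implementation: `inner_loss` plays the role of the contrastive loss on generated augmentations, `encoder_perf` plays the role of "the performance of the encoder", and the meta gradient through the inner step is approximated by a finite difference instead of autograd.

```python
# Toy bilevel sketch of the meta update (all functions are stand-ins):
# the inner step updates the encoder on the generator's augmentations, then
# the generator is updated through the encoder's post-update performance.

def inner_loss(theta, w):
    # stand-in for the contrastive loss on features augmented by generator w
    return (theta - w) ** 2

def encoder_perf(theta):
    # stand-in for "performance of the encoder" (lower is better)
    return (theta - 1.0) ** 2

lr, eps = 0.1, 1e-5
theta, w = 0.0, 0.5          # toy encoder / generator parameters

for _ in range(200):
    # inner step: the encoder follows the generator's augmentations
    g_theta = 2 * (theta - w)            # d inner_loss / d theta
    theta_new = theta - lr * g_theta

    # meta step: differentiate encoder performance w.r.t. w *through* the
    # inner update, approximated here by a central finite difference
    def lookahead(w_):
        t = theta - lr * 2 * (theta - w_)
        return encoder_perf(t)
    g_w = (lookahead(w + eps) - lookahead(w - eps)) / (2 * eps)

    theta, w = theta_new, w - lr * g_w

print(round(theta, 3), round(w, 3))
```

Both parameters converge near 1.0: the generator learns to steer the encoder toward good downstream performance, which is the mechanism the sentence above describes.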

二、Model

(figure: model overview) Compared with standard contrastive learning, the model adds two augmentation generators, $a_{w_j}$ and $a_{w_{j'}}$, which are used to produce more informative features for the contrastive objective.
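To see where such generated features enter the objective, here is a minimal numpy sketch of a simplified InfoNCE loss in which generator-produced features simply join the positive and negative sets. The "generators" here are hand-written stand-ins (an interpolation toward the anchor), not the learned networks $a_{w_j}$, $a_{w_{j'}}$ from the paper.

```python
import numpy as np

def info_nce(anchor, positives, negatives, tau=0.5):
    """Simplified InfoNCE: the anchor is pulled toward each positive and
    pushed away from every negative (cosine similarity, temperature tau)."""
    def sim(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    neg_exp = sum(np.exp(sim(anchor, n) / tau) for n in negatives)
    loss = 0.0
    for p in positives:
        pos_exp = np.exp(sim(anchor, p) / tau)
        loss += -np.log(pos_exp / (pos_exp + neg_exp))
    return loss / len(positives)

rng = np.random.default_rng(0)
h = rng.normal(size=8)                     # encoder feature of the anchor view
h_pos = h + 0.1 * rng.normal(size=8)       # feature of the other view (positive)
negs = [rng.normal(size=8) for _ in range(4)]

# Stand-ins for the augmentation generators: inject extra "hard" positives
# and harder negatives (pushed toward the anchor) into the same loss.
aug_pos = 0.5 * (h + h_pos)
aug_negs = [0.9 * n + 0.1 * h for n in negs]

base = info_nce(h, [h_pos], negs)
with_aug = info_nce(h, [h_pos, aug_pos], negs + aug_negs)
print(base, with_aug)
```

The point of the sketch is only structural: the generated features are consumed by the same contrastive loss as the ordinary view features, so the generators can be trained end to end against it.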

三、Performance

(figure: benchmark results) The experimental results demonstrate the effectiveness of the method.

四、Ablation Study

(figure: ablation results) As to why the features without the learned augmentation actually perform better, the authors point to two possible reasons:

  1. the augmented features are generated to lead the encoders to learn discriminative representations (e.g., $h_j$), which indicates that the augmented features contribute to the improvement of the encoders, but this does not mean that the augmented features are discriminative for downstream tasks;

  2. in the test of using augmented features, we do not discard the projection head $g_{\vartheta_j}(\cdot)$, and recent works prove that the approach of using a projection head in training and discarding such head in the test can significantly improve the performance of the model on downstream tasks.

The first reason is that the augmented features may help train the encoder yet hurt downstream tasks (e.g., node classification or node clustering); the second is that the projection head was not discarded when evaluating with augmented features, whereas it normally should be. One open question remains: why do the authors not report a result with the projection head removed at evaluation time?
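The train/test asymmetry behind the second reason is easy to state in code. Below is a minimal sketch with toy linear maps standing in for the encoder $f$ and projection head $g$; the names and shapes are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
W_enc = rng.normal(size=(16, 8))   # toy linear "encoder" f
W_proj = rng.normal(size=(8, 4))   # toy projection head g (training only)

def encode(x):
    return np.tanh(x @ W_enc)      # representation h, kept for downstream tasks

def project(h):
    return np.tanh(h @ W_proj)     # z = g(h), fed to the contrastive loss

x = rng.normal(size=(5, 16))
h = encode(x)    # test time: evaluate on h and discard the projection head
z = project(h)   # training time: the contrastive loss is computed on z

print(h.shape, z.shape)  # → (5, 8) (5, 4)
```

SimCLR-style pipelines apply the contrastive loss to `z` during training but hand `h` to downstream evaluation, which is exactly the convention the ablation in question violates.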

五、Conclusion

The experimental results show the effectiveness of the method, and the framework could in principle be applied to any task that relies on contrastive learning.

六、References

[1] Jiangmeng Li, Wenwen Qiang, Changwen Zheng, Bing Su, and Hui Xiong. MetAug: Contrastive Learning via Meta Feature Augmentation. In ICML 2022, pp. 12964–12978.