Informative knowledge distillation for image anomaly segmentation

Author: Qian Wan

Unsupervised anomaly segmentation methods based on knowledge distillation have recently been developed and show superior segmentation performance. However, little attention has been paid to the overfitting problem caused by the mismatch between the capacity of the neural network and the amount of knowledge in this scheme. This paper proposes a novel method named Informative Knowledge Distillation (IKD) to address the overfitting problem by increasing knowledge and offering a stronger supervisory signal. Technically, a novel Context Similarity Loss (CSL) is proposed to capture context information from normal data manifolds. In addition, a novel Adaptive Hard Sample Mining (AHSM) scheme is proposed to focus attention on hard samples that carry valuable information. With IKD, informative knowledge can be distilled, so that the overfitting problem is well mitigated and performance is further improved. The proposed method outperforms state-of-the-art methods on several categories of the well-known MVTec AD dataset in terms of AU-ROC, achieving 97.81% overall across 15 categories. Extensive ablation studies are also conducted to demonstrate the effectiveness of IKD in alleviating the overfitting problem.
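The abstract names the two components, CSL and AHSM, without giving their exact forms. As a rough illustration of how a teacher-student distillation loss with a context term and hard-sample weighting might be wired up, here is a minimal PyTorch sketch; the pairwise-similarity context loss and the error-proportional weights below are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch of teacher-student feature distillation for anomaly
# segmentation. The context-similarity term and the hard-sample weighting
# are assumed forms for illustration only.
import torch
import torch.nn.functional as F

def distillation_loss(t_feat, s_feat, hard_cap=2.0):
    """t_feat, s_feat: (B, C, H, W) teacher / student feature maps."""
    # Per-position regression error between normalized feature vectors.
    err = (F.normalize(t_feat, dim=1) - F.normalize(s_feat, dim=1)).pow(2).sum(1)  # (B, H, W)

    # Assumed AHSM-style weighting: up-weight positions whose error exceeds
    # the batch mean, so hard samples dominate the gradient signal.
    with torch.no_grad():
        w = (err / (err.mean() + 1e-8)).clamp(max=hard_cap)
    point_loss = (w * err).mean()

    # Assumed CSL-style term: match pairwise cosine similarities between
    # spatial positions, so the student also learns contextual structure
    # of the normal data manifold, not just per-position features.
    def sim_matrix(f):
        v = F.normalize(f.flatten(2), dim=1)    # (B, C, H*W)
        return torch.bmm(v.transpose(1, 2), v)  # (B, HW, HW)
    context_loss = F.mse_loss(sim_matrix(s_feat), sim_matrix(t_feat))

    return point_loss + context_loss
```

In the standard distillation scheme this builds on, the teacher is a frozen pretrained network and the student is trained on normal images only; at test time, the per-position error map (upsampled to image resolution) would serve as the anomaly score for segmentation.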

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. When reposting, please include a link to the original source and this notice.
Original link: https://flyai.com/paper_detail/1171