Omurice's Memorandum

Notes on mathematics, statistics, machine learning, and programming

[Machine Learning] Metric Learning / Distance Learning #Summary #01

Index

Metric Learning / Distance Learning

This post focuses on the loss functions in particular.

Loss Functions / Loss

NCA Loss / 2004

Introduces the Mahalanobis distance (Neighbourhood Components Analysis).
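NCA learns a linear map A and compares points with the squared Mahalanobis distance (x − y)ᵀAᵀA(x − y). A minimal NumPy sketch of that distance itself (the matrix M below is an illustrative PSD matrix, not one learned by NCA):

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y) for a PSD matrix M."""
    d = x - y
    return float(d @ M @ d)

x = np.array([1.0, 2.0])
y = np.array([2.0, 0.0])
# with M = I this reduces to the squared Euclidean distance
print(mahalanobis_sq(x, y, np.eye(2)))            # 5.0
# a non-identity M reweights the axes
print(mahalanobis_sq(x, y, np.diag([4.0, 1.0])))  # 8.0
```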

Contrastive Loss / 2005

A loss designed to map data so that samples of "different" classes end up far apart (large distance) and samples of the "same" class end up close together (small distance).

Used with the Siamese architecture.
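A minimal NumPy sketch of this pairwise loss in the form of Hadsell et al. (y·d² for a same-class pair, max(0, m − d)² for a different-class pair; the margin value here is illustrative):

```python
import numpy as np

def contrastive_loss(x1, x2, same, margin=1.0):
    """Contrastive loss for a single pair of embeddings:
    pull same-class pairs together, push different-class pairs
    until they are at least `margin` apart."""
    d = np.linalg.norm(x1 - x2)                 # Euclidean distance
    if same:
        return float(d ** 2)                    # attract similar pairs
    return float(max(0.0, margin - d) ** 2)     # repel dissimilar pairs

a = np.array([0.0, 0.0])
b = np.array([0.6, 0.8])                        # distance 1.0 from a
print(contrastive_loss(a, b, same=True))        # ~1.0
print(contrastive_loss(a, b, same=False))       # 0.0: already at the margin
```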


On margin-based losses

Triplet Loss / Triplet Margin Loss / 2005

Used with the Triplet architecture / Triplet network.
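A minimal NumPy sketch of the triplet margin loss max(0, d(a, p) − d(a, n) + m), where a, p, n are anchor, positive, and negative embeddings (margin value illustrative):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet margin loss: the anchor-positive distance should be
    smaller than the anchor-negative distance by at least `margin`."""
    d_ap = np.linalg.norm(anchor - positive)
    d_an = np.linalg.norm(anchor - negative)
    return float(max(0.0, d_ap - d_an + margin))

a = np.zeros(2)
p = np.array([0.1, 0.0])      # close positive
n = np.array([1.0, 0.0])      # far negative
print(triplet_loss(a, p, n))  # 0.0: the triplet already satisfies the margin
print(triplet_loss(a, n, p))  # positive loss: the "positive" is farther than the "negative"
```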



Ranking Loss / 2013

  • “Devise: A deep visual-semantic embedding model,” in NIPS, 2013.

  • “Deep fragment embeddings for bidirectional image sentence mapping,” in NIPS, 2014.

  • “Multimodal convolutional neural networks for matching image and sentence,” in ICCV, 2015.

  • “Learning deep structure-preserving image-text embeddings,” in CVPR, 2016.

  • “Dual attention networks for multimodal reasoning and matching,” in CVPR, 2017.

  • “Learning deep representations of fine-grained visual descriptions,” in CVPR, 2016.

Lifted Structure Loss / 2015



Multi-Class N-Pair Loss / 2016

Introduces the Tuplet Loss and the Multi-Class N-Pair Loss.
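A minimal NumPy sketch of the multi-class N-pair idea, assuming inner-product similarity and, for each anchor, one positive plus N − 1 negatives (the softmax cross-entropy form):

```python
import numpy as np

def n_pair_loss(anchor, positive, negatives):
    """Multi-class N-pair loss for one anchor: softmax cross-entropy
    over one positive and N-1 negatives, with inner-product similarity."""
    sims = np.array([anchor @ positive] + [anchor @ n for n in negatives])
    # -log softmax probability of the positive (index 0)
    return float(-sims[0] + np.log(np.exp(sims).sum()))

a = np.array([1.0, 0.0])
p = np.array([1.0, 0.0])                                # aligned positive
negs = [np.array([0.0, 1.0]), np.array([-1.0, 0.0])]    # orthogonal / opposite negatives
print(n_pair_loss(a, p, negs))                          # small loss, ~0.41
```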

NTXent Loss / 2018

A generalization of the N-Pair Loss.

Also called InfoNCE.

  • Representation Learning with Contrastive Predictive Coding

  • Momentum Contrast for Unsupervised Visual Representation Learning

  • A Simple Framework for Contrastive Learning of Visual Representations
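A minimal NumPy sketch of NT-Xent as used in SimCLR, assuming `z1[i]` and `z2[i]` are embeddings of two augmented views of the same sample (temperature value illustrative):

```python
import numpy as np

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent (InfoNCE) loss over a batch of paired views: each embedding
    must identify its partner view among all 2N - 1 other embeddings."""
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity
    sim = (z @ z.T) / temperature
    n = len(z1)
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])  # partner indices
    log_prob = sim[np.arange(2 * n), pos] - np.log(np.exp(sim).sum(axis=1))
    return float(-log_prob.mean())

z1 = np.array([[1.0, 0.0], [0.0, 1.0]])
z2 = np.array([[0.9, 0.1], [0.1, 0.9]])  # slightly perturbed views of z1
print(nt_xent(z1, z2))                   # relatively low: positives are well aligned
```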

Large Margin Softmax Loss / 2016



  • Large-Margin Softmax Loss for Convolutional Neural Networks

Angular Loss / 2017



  • Deep Metric Learning with Angular Loss

Generalized Lifted Structure Loss / 2017

A study on Person Re-Identification.

Instance Loss / 2017

A study that uses image and text embeddings.

  • Dual-Path Convolutional Image-Text Embeddings with Instance Loss
    • [2017]
    • 4 PROPOSED INSTANCE LOSS
      • 4.2 Instance Loss

Margin Loss / 2017

A study on data sampling.



  • Sampling Matters in Deep Embedding Learning

Proxy NCA Loss / Proxy Ranking Loss / 2017



  • No Fuss Distance Metric Learning using Proxies
    • [2017]
    • 3 Metric Learning using Proxies
      • 3.2 Proxy Ranking Loss

Normalized Softmax Loss / 2018



  • Classification is a Strong Baseline for Deep Metric Learning

Ranked List Loss / 2019

Used with the Siamese architecture.


  • Ranked List Loss for Deep Metric Learning

Fast AP Loss / 2019

Multi Similarity Loss / 2019



  • Multi-Similarity Loss with General Pair Weighting for Deep Metric Learning

Signal To Noise Ratio Contrastive Loss / 2019

SNR Contrastive Loss

SoftTriple Loss / 2019

An improvement over the conventional Softmax Loss.



Tuplet Loss / Tuplet Margin Loss / 2019

A study on sampling from the batch.

Intra Pair Variance Loss / 2019

A study on sampling from the batch.

Circle Loss / 2020



  • Circle Loss: A Unified Perspective of Pair Similarity Optimization

Proxy Anchor Loss / 2020



  • Proxy Anchor Loss for Deep Metric Learning

Sup Con Loss / 2020



  • Supervised Contrastive Learning

Centroid Triplet Loss / 2021

VIC Reg Loss / 2021



  • VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning

Face Recognition

Loss functions used for face recognition.

Sphere Face Loss / 2017

  • SphereFace: Deep Hypersphere Embedding for Face Recognition

Arc Face Loss / 2018
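As a sketch of the idea: ArcFace replaces the ground-truth logit s·cos(θ) of a normalized softmax with s·cos(θ + m), an additive angular margin (s = 64, m = 0.5 are the typical values from the paper):

```python
import numpy as np

def arcface_logit(cos_theta, is_target, scale=64.0, margin=0.5):
    """ArcFace logit: add an angular margin m to the angle between the
    feature and its ground-truth class weight, then rescale by s."""
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    if is_target:
        theta = theta + margin          # penalize only the target class
    return float(scale * np.cos(theta))

print(arcface_logit(0.8, is_target=False))  # ~51.2 (= 64 * 0.8)
print(arcface_logit(0.8, is_target=True))   # smaller: the margin shrinks the logit
```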

Cos Face Loss / 2018

  • CosFace: Large Margin Cosine Loss for Deep Face Recognition

Sub Center Arc Face Loss / 2020