[Transfer Learning] Domain Adaptation and Metric Learning: Fundamentals and a Curated List of Blog Posts

Definitions

Metric learning

  • supervised metric learning methods
    • the basic idea is to find a metric under which data from the same class are close while data from different classes are far apart [1] (a minimal Mahalanobis-distance sketch is shown after this list)
  • unsupervised metric learning methods
    • the basic idea is to discover a low-dimensional space or metric that preserves the geometric relationships of the data in the original space [1]
  • domain adaptation approaches
    • adapt auxiliary data or classifiers to a new domain to optimize the learning objective in the target domain.[1]
  • metric
    • In mathematics, a metric (or distance function) is a function that defines the distance between elements of a set. A set equipped with a metric is called a metric space. [2]
  • metric learning
    • aims to improve classification accuracy by capturing the spatial structure of the training set
    • To handle the wide variety of notions of feature similarity, we could select suitable features for a specific task and hand-craft a distance function. However, this requires substantial manual effort and may not be robust to changes in the data. Metric learning is an appealing alternative: it can automatically learn a distance function tailored to a particular task. [2]
  • domain adaptation metric learning
    • DAML uses all labeled samples in the source domain to learn an optimal transformation, so that the discriminative metric information of the source domain can be adapted to the target domain. [1]
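
A common parameterization in supervised metric learning is the Mahalanobis distance d_M(x, y) = sqrt((x - y)^T M (x - y)) with a positive semi-definite matrix M, so that "learning the metric" amounts to fitting M. Below is a minimal NumPy sketch of evaluating such a distance; the matrix M used here is a hypothetical stand-in for a learned parameter, not the output of the algorithm in [1].

```python
import numpy as np

def mahalanobis_distance(x, y, M):
    """d_M(x, y) = sqrt((x - y)^T M (x - y)) for a positive semi-definite M.

    With M = I this is the ordinary Euclidean distance; metric learning
    replaces I with a matrix fitted so that same-class pairs become close
    and different-class pairs become far apart.
    """
    diff = x - y
    return float(np.sqrt(diff @ M @ diff))

# Toy usage with a hypothetical "learned" M (here just a diagonal re-weighting).
x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
M = np.diag([4.0, 0.25])                      # stand-in for a learned PSD matrix
print(mahalanobis_distance(x, y, M))          # re-weighted distance (~4.03)
print(mahalanobis_distance(x, y, np.eye(2)))  # plain Euclidean distance (~2.24)
```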

Mathematical Background

  • MMD (Maximum Mean Discrepancy)
    • Measures the discrepancy between two distributions (see the kernel-based sketch after this list).
    • A good explanation:
      • https://blog.csdn.net/a1154761720/article/details/51516273
    • A code walkthrough of MMD:
      • https://blog.csdn.net/a529975125/article/details/81176029
    • Related background:
      • Reproducing Kernel Hilbert Space (RKHS)
        • https://blog.csdn.net/haolexiao/article/details/72171523?utm_source=itdadao&utm_medium=referral
      • The relationship between MMD and the RKHS
        • https://zhuanlan.zhihu.com/p/25418364
      • Kernel functions
        • https://blog.csdn.net/sunshine_in_moon/article/details/51322285
      • The Lagrangian (Lagrange function)
  • Triplet Loss
    • Compared with the conventional softmax loss, the triplet loss adds an explicit measure of dissimilarity between samples, which helps with fine-grained discrimination (see the sketch after this list).
    • blog - https://blog.csdn.net/tangwei2014/article/details/46788025
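
To make the MMD definition above concrete, here is a minimal sketch of the standard (biased) kernel two-sample estimate MMD^2(X, Y) = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)] with a Gaussian (RBF) kernel. The bandwidth sigma = 1.0 and the toy data are arbitrary choices for illustration, not part of any particular method from the links above.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-||A_i - B_j||^2 / (2 * sigma^2))."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Biased estimate of squared MMD between samples X ~ p and Y ~ q:
    MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)].
    """
    return (rbf_kernel(X, X, sigma).mean()
            + rbf_kernel(Y, Y, sigma).mean()
            - 2.0 * rbf_kernel(X, Y, sigma).mean())

# Toy usage: identically distributed samples give an MMD near 0,
# while samples from a shifted distribution give a clearly larger value.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y_same = rng.normal(0.0, 1.0, size=(200, 2))
Y_shifted = rng.normal(2.0, 1.0, size=(200, 2))
print(mmd2(X, Y_same))     # close to 0
print(mmd2(X, Y_shifted))  # substantially larger
```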
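The triplet loss mentioned above is usually written with an anchor a, a positive p (same class), and a negative n (different class) as L = max(0, d(a, p) - d(a, n) + margin). A minimal NumPy sketch of that standard form follows; the margin value and the embedding vectors are made up for illustration.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge-style triplet loss on Euclidean distances.

    Encourages d(anchor, positive) + margin <= d(anchor, negative),
    i.e. the anchor ends up closer to the same-class sample than to
    the different-class sample by at least the margin.
    """
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

# Toy usage with hypothetical embedding vectors.
anchor        = np.array([0.0, 0.0])
positive      = np.array([0.1, 0.0])  # same class, close to the anchor
near_negative = np.array([0.2, 0.1])  # different class but still too close -> positive loss
far_negative  = np.array([1.0, 1.0])  # different class and already far     -> zero loss
print(triplet_loss(anchor, positive, near_negative))  # ~0.08
print(triplet_loss(anchor, positive, far_negative))   # 0.0
```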

References

[1] Geng, B., Tao, D., & Xu, C. (2011). DAML: Domain Adaptation Metric Learning. IEEE Transactions on Image Processing, 20(10), 2980.
[2] http://blog.csdn.net/nehemiah_li/article/details/44230053
