Deep Learning: Backpropagation
Usage of loss functions
Official PyTorch loss source code: https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/loss.py
A summary of all losses: https://blog.csdn.net/u011995719/article/details/85107524
triplet loss
class TripletMarginLoss(Module):
r"""Creates a criterion that measures the triplet loss given an input
tensors x1, x2, x3 and a margin with a value greater than 0.
This is used for measuring a relative similarity between samples. A triplet
is composed by `a`, `p` and `n`: anchor, positive examples and negative
example respectively. The shape of all input variables should be
:math:`(N, D)`.
The distance swap is described in detail in the paper `Learning shallow
convolutional feature descriptors with triplet losses`_ by
V. Balntas, E. Riba et al.
Args:
anchor: anchor input tensor
positive: positive input tensor
negative: negative input tensor
p: the norm degree. Default: 2
Shape:
- Input: :math:`(N, D)` where `D = vector dimension`
- Output: :math:`(N, 1)`
Balntas, V., Riba, E., Ponsa, D., & Mikolajczyk, K. (2016). Learning local feature descriptors with triplets and shallow convolutional neural networks. British Machine Vision Conference 2016.
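A minimal usage sketch of nn.TripletMarginLoss as described by the docstring above (the batch size, embedding dimension and margin are illustrative):
>>> import torch
>>> import torch.nn as nn
>>> triplet_loss = nn.TripletMarginLoss(margin=1.0, p=2)
>>> anchor = torch.randn(100, 128, requires_grad=True)     # (N, D)
>>> positive = torch.randn(100, 128, requires_grad=True)
>>> negative = torch.randn(100, 128, requires_grad=True)
>>> output = triplet_loss(anchor, positive, negative)      # scalar loss (mean over the batch)
>>> output.backward()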
Data loading: construct a triplet-specific dataloader (a rough sketch follows the reference links below)
refs:
https://github.com/chencodeX/triplet-loss-pytorch
https://www.zhihu.com/question/272153256/answer/370823997
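As referenced above, the dataloader needs to yield (anchor, positive, negative) groups. A rough sketch, not taken from the linked repo, assuming the raw data is a list of (feature, label) pairs and using naive random sampling of positives and negatives:
import random
import torch
from torch.utils.data import Dataset, DataLoader

class TripletDataset(Dataset):
    """Yields (anchor, positive, negative) triplets from labelled samples."""

    def __init__(self, samples):
        self.samples = samples
        # Group sample indices by label so positives/negatives can be drawn quickly.
        self.by_label = {}
        for idx, (_, label) in enumerate(samples):
            self.by_label.setdefault(label, []).append(idx)

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        anchor, label = self.samples[idx]
        # Positive: another sample with the same label (may coincide with the anchor
        # when a class has only one sample).
        pos_idx = random.choice(self.by_label[label])
        # Negative: a sample drawn from any other label.
        neg_label = random.choice([l for l in self.by_label if l != label])
        neg_idx = random.choice(self.by_label[neg_label])
        return anchor, self.samples[pos_idx][0], self.samples[neg_idx][0]

# Illustrative usage: 100 random 128-d features over two classes.
samples = [(torch.randn(128), i % 2) for i in range(100)]
loader = DataLoader(TripletDataset(samples), batch_size=16, shuffle=True)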
TripletMarginWithDistanceLoss
The loss for each sample in the batch is

.. math::
    l_i = \max \{ d(a_i, p_i) - d(a_i, n_i) + \mathrm{margin}, 0 \}

where :math:`N` is the batch size; :math:`d` is a nonnegative, real-valued function quantifying the closeness of two tensors, referred to as the :attr:`distance_function`; and :math:`margin` is a nonnegative margin representing the minimum difference between the positive and negative distances that is required for the loss to be 0.
>>> import torch
>>> import torch.nn as nn
>>> import torch.nn.functional as F
>>>
>>> # Initialize embeddings
>>> embedding = nn.Embedding(1000, 128)
>>> anchor_ids = torch.randint(0, 1000, (1,))
>>> positive_ids = torch.randint(0, 1000, (1,))
>>> negative_ids = torch.randint(0, 1000, (1,))
>>> anchor = embedding(anchor_ids)
>>> positive = embedding(positive_ids)
>>> negative = embedding(negative_ids)
>>>
>>> # Built-in distance function: nn.PairwiseDistance()
>>> triplet_loss = \
>>> nn.TripletMarginWithDistanceLoss(distance_function=nn.PairwiseDistance())
>>> output = triplet_loss(anchor, positive, negative)
>>> output.backward()
>>>
>>> # Custom distance function
>>> def l_infinity(x1, x2):
>>> return torch.max(torch.abs(x1 - x2), dim=1).values
>>>
>>> triplet_loss = \
>>> nn.TripletMarginWithDistanceLoss(distance_function=l_infinity, margin=1.5)
>>> output = triplet_loss(anchor, positive, negative)
>>> output.backward()
>>>
>>> # Custom Distance Function (Lambda)
>>> triplet_loss = \
>>> nn.TripletMarginWithDistanceLoss(
>>> distance_function=lambda x, y: 1.0 - F.cosine_similarity(x, y))
>>> output = triplet_loss(anchor, positive, negative)
>>> output.backward()
Center Loss
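A minimal sketch of the center-loss idea (penalize the squared distance between each feature and its learnable class center); the class count and feature dimension below are illustrative:
import torch
import torch.nn as nn

class CenterLoss(nn.Module):
    """L_C = 1/2 * sum_i ||x_i - c_{y_i}||^2 with learnable class centers."""

    def __init__(self, num_classes=10, feat_dim=128):   # illustrative sizes
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features, labels):
        # Select each sample's class center and penalize the squared distance to it.
        centers_batch = self.centers[labels]             # (N, feat_dim)
        return 0.5 * ((features - centers_batch) ** 2).sum(dim=1).mean()

# Typically combined with softmax cross-entropy: total_loss = ce_loss + lambda * center_loss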
CosineEmbeddingLoss
Creates a criterion that measures the loss given input tensors x1, x2 and a Tensor label y with values 1 or -1. This is used for measuring whether two inputs are similar or dissimilar, using the cosine distance, and is typically used for learning nonlinear embeddings or for semi-supervised learning.
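A minimal usage sketch of nn.CosineEmbeddingLoss (the batch size, dimension and margin are illustrative):
>>> import torch
>>> import torch.nn as nn
>>> cosine_loss = nn.CosineEmbeddingLoss(margin=0.5)
>>> x1 = torch.randn(8, 128, requires_grad=True)
>>> x2 = torch.randn(8, 128, requires_grad=True)
>>> y = (torch.randint(0, 2, (8,)) * 2 - 1).float()   # 1 = similar pair, -1 = dissimilar pair
>>> loss = cosine_loss(x1, x2, y)
>>> loss.backward()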
Additive Angular Margin Loss (ArcFace)
ArcFace maximizes the classification margin directly in the angle space θ, whereas CosFace maximizes the margin in the cosine space cos(θ).
ArcFace: Additive Angular Margin Loss for Deep Face Recognition
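A rough sketch of the distinction described above, showing where each method injects the margin m into the target-class logit before cross-entropy; the scale s and margin m are illustrative hyperparameters:
import torch
import torch.nn.functional as F

def margin_logits(features, weights, labels, s=64.0, m=0.5, mode="arcface"):
    """features: (N, D) embeddings; weights: (C, D) class weights; labels: (N,)."""
    # Cosine of the angle between each feature and each class-weight vector.
    cos_theta = F.linear(F.normalize(features), F.normalize(weights))   # (N, C)
    target = F.one_hot(labels, num_classes=weights.size(0)).bool()
    if mode == "arcface":
        # ArcFace: add the margin to the angle itself -> cos(theta + m) for the true class.
        theta = torch.acos(cos_theta.clamp(-1 + 1e-7, 1 - 1e-7))
        cos_target = torch.cos(theta + m)
    else:
        # CosFace: subtract the margin in cosine space -> cos(theta) - m for the true class.
        cos_target = cos_theta - m
    logits = torch.where(target, cos_target, cos_theta)
    return s * logits   # feed into F.cross_entropy(logits, labels)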
Gradient descent
Subgradient
Gradient-free optimization algorithms:
Genetic algorithm
Particle swarm optimization
Convex optimization
fmincon
https://www.zhihu.com/question/426888224
https://www.zhihu.com/question/304133157