A loss function is a function which computes the difference (gap) between a model's predictions and the true values to evaluate how good a model is. *Loss function is also called Cost Function or Error Function.
There are popular loss functions as shown below:
(1) L1 Loss:
- can compute the average of the absolute differences between a model's predictions and the true values.
- 's formula is as shown below, where $\hat{y}_i$ is a prediction, $y_i$ is a true value and $n$ is the number of elements:
$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\lvert \hat{y}_i - y_i \rvert$
- is also called Mean Absolute Error(MAE).
- is L1Loss() in PyTorch, as sketched below.
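For example, a minimal sketch of L1Loss(), using made-up prediction and true-value tensors just for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical values, just for illustration.
pred = torch.tensor([2.5, 0.0, 2.0, 8.0])   # a model's predictions
true = torch.tensor([3.0, -0.5, 2.0, 7.0])  # true values

mae = nn.L1Loss()  # reduction='mean' by default
print(mae(pred, true))  # tensor(0.5000) = (0.5 + 0.5 + 0.0 + 1.0) / 4
```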
(2) L2 Loss:
- can compute the average of the squared differences between a model's predictions and the true values.
- 's formula is as shown below:
$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}(\hat{y}_i - y_i)^2$
- is also called Mean Squared Error(MSE).
- is MSELoss() in PyTorch, as sketched below.
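For example, a minimal sketch of MSELoss(), reusing the same made-up tensors as above:

```python
import torch
import torch.nn as nn

pred = torch.tensor([2.5, 0.0, 2.0, 8.0])   # a model's predictions
true = torch.tensor([3.0, -0.5, 2.0, 7.0])  # true values

mse = nn.MSELoss()  # reduction='mean' by default
print(mse(pred, true))  # tensor(0.3750) = (0.25 + 0.25 + 0.0 + 1.0) / 4
```

Because the differences are squared, MSE penalizes large errors more heavily than MAE, which is why L2 Loss is more sensitive to outliers.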
(3) Huber Loss:
- can do a similar computation to either L1 Loss or L2 Loss, depending on how the absolute difference between a model's prediction and the true value compares with delta.
*Memos:
- delta is 1.0 by default.
- Be careful, the computation is not exactly the same as L1 Loss or L2 Loss, according to the formulas below.
- 's formula is as shown below. *The 1st case is the L2 Loss-like one and the 2nd case is the L1 Loss-like one:
$l_i = \begin{cases} \frac{1}{2}(\hat{y}_i - y_i)^2 & \text{if } \lvert \hat{y}_i - y_i \rvert < \delta \\ \delta \left( \lvert \hat{y}_i - y_i \rvert - \frac{1}{2}\delta \right) & \text{otherwise} \end{cases} \qquad \mathrm{HuberLoss} = \frac{1}{n}\sum_{i=1}^{n} l_i$
- is HuberLoss() in PyTorch.
- 's delta of 1.0 is the same as Smooth L1 Loss, which is SmoothL1Loss() in PyTorch, as sketched below.
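For example, a minimal sketch of HuberLoss() with the default delta of 1.0, using made-up values chosen so that each case of the formula is hit once:

```python
import torch
import torch.nn as nn

pred = torch.tensor([0.0, 0.0])  # a model's predictions
true = torch.tensor([0.5, 2.0])  # true values

huber = nn.HuberLoss(delta=1.0)  # delta=1.0 is the default

# |0.0 - 0.5| = 0.5 <  delta -> L2 Loss-like: 0.5 * 0.5**2 = 0.125
# |0.0 - 2.0| = 2.0 >= delta -> L1 Loss-like: 1.0 * (2.0 - 0.5 * 1.0) = 1.5
print(huber(pred, true))  # tensor(0.8125) = (0.125 + 1.5) / 2
```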
(4) BCE(Binary Cross Entropy) Loss:
- can compute the difference between a model's predicted probabilities and the true binary values.
- 's formula is as shown below:
$\mathrm{BCE} = -\frac{1}{n}\sum_{i=1}^{n}\left[\, y_i \log(\hat{y}_i) + (1 - y_i)\log(1 - \hat{y}_i) \,\right]$
- is also called Binary Cross Entropy or Log(Logarithmic) Loss.
- is BCELoss() in PyTorch, as sketched below. *There is also BCEWithLogitsLoss() which is the combination of BCE Loss and Sigmoid Activation Function in PyTorch.
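For example, a minimal sketch of BCELoss() and BCEWithLogitsLoss(), using made-up probabilities and binary labels:

```python
import torch
import torch.nn as nn

# Hypothetical values: probabilities in (0, 1), e.g. the output of Sigmoid.
pred = torch.tensor([0.8, 0.3, 0.9])  # a model's predicted probabilities
true = torch.tensor([1.0, 0.0, 1.0])  # true binary values

bce = nn.BCELoss()
print(bce(pred, true))  # tensor(0.2284) = -(ln(0.8) + ln(0.7) + ln(0.9)) / 3

# BCEWithLogitsLoss applies Sigmoid internally, so it takes raw logits instead.
logits = torch.log(pred / (1 - pred))  # logits whose sigmoid equals pred
bce_logits = nn.BCEWithLogitsLoss()
print(bce_logits(logits, true))  # same value as above
```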