Losses

Implementation of new loss functions, especially stochastic loss functions. The package is also compatible with all the loss functions from TensorFlow, as well as custom loss functions. There are two main ways to use loss functions in TensorFlow:

  1. Use the loss function directly with the tf.keras API:

    >>> model.compile(
    ...     loss=gaussian_negative_log_likelihood,
    ...     ...
    ... )
    
  2. Use the loss function as a standalone function:

    >>> y_true = [[5], [10], [35]]
    >>> pred = [[[6, 1]], [[7, 1]], [[36, 0.8]]]
    >>> gaussian_negative_log_likelihood(y_true, pred).numpy()
    2.756748
    

Here is the list of the new losses:

Gaussian Negative Log Likelihood

purestochastic.common.losses.gaussian_negative_log_likelihood(y, prediction)[source]

Gaussian negative log likelihood loss function.

The loss function computes the Gaussian negative log likelihood between the ground truth values \(y=(y_1, \ldots, y_n)\) and the predictions, which are the mean and the variance of a Gaussian distribution:

  • \(\hat{\mu} = \text{prediction}[ : , \cdots , : , 0]\)

  • \(\hat{\sigma}^2 = \text{prediction}[ : , \cdots , : , 1]\)

Mathematically, the loss function is defined as:

\[\mathcal{L}(y, \hat{\mu}, \hat{\sigma}^2) = \frac{1}{2n} \displaystyle\sum_{i=1}^{n} \Bigg[\ln(\hat{\sigma}^2(x_i,\theta))+\frac{(y_i-\hat{\mu}(x_i,\theta))^2}{\hat{\sigma}^2(x_i,\theta)} \Bigg] + \frac{1}{2}\log(2 \pi)\]
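As an illustration, the formula above can be reproduced with a short NumPy sketch. This is a hand-rolled check, not the package's implementation, and the function name `gaussian_nll` is hypothetical:

```python
import numpy as np

def gaussian_nll(y, prediction):
    # Hypothetical re-implementation of the formula above, for illustration.
    y = np.asarray(y, dtype=np.float64)
    prediction = np.asarray(prediction, dtype=np.float64)
    mu = prediction[..., 0]   # predicted means
    var = prediction[..., 1]  # predicted variances
    # Mean of the log-variance and squared-error terms, plus the constant.
    return 0.5 * np.mean(np.log(var) + (y - mu) ** 2 / var) \
        + 0.5 * np.log(2.0 * np.pi)

y_true = [[5], [10], [35]]
pred = [[[6, 1]], [[7, 1]], [[36, 0.8]]]
print(gaussian_nll(y_true, pred))  # ≈ 2.756748, matching the standalone example above
```

Note that the mean is taken over every dimension except the trailing one of `prediction`, which matches the \(\frac{1}{2n}\sum\) in the formula.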

If y is not a 1D array, the batch dimension must come first, and the output value is the mean over all other dimensions.

Parameters
  • y (tf.Tensor or np.ndarray) – Ground truth values.

  • prediction (tf.Tensor or np.ndarray) – Predicted means and variances, stacked along the last axis.

Returns

Value of the Gaussian negative log likelihood.

Return type

tf.float32

Input shape

y: (N+2)-D tensor with shape: [batch_size, d_0, ..., d_N].

prediction: (N+3)-D tensor with shape: [batch_size, d_0, ..., d_N, 2].

Output shape

Scalar tf.float32 tensor.

Note

The likelihood is a product of density function values, which can lie anywhere in \([0, \infty)\). The negative log likelihood is therefore negative whenever the likelihood is greater than 1. Don't worry if your loss function is negative: it is often the case.
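To make this concrete, here is a small plain-Python check with hypothetical values: for a sample where the predicted mean is exact and the predicted variance is small, the per-sample term of the formula above is negative:

```python
import math

# One sample with y == mu (zero squared error) and a small variance.
sigma2 = 0.01
loss = 0.5 * (math.log(sigma2) + 0.0 / sigma2) + 0.5 * math.log(2 * math.pi)
print(loss)  # ≈ -1.384: the density at the mean exceeds 1, so the NLL is negative
```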