tslearn-team/tslearn

Can NeuralProphet use the soft-DTW loss function?

weidongzhou1994 opened this issue · 4 comments

As title

Hello @weidongzhou1994,

NeuralProphet (https://github.com/ourownstory/neural_prophet/blob/main/neuralprophet/forecaster.py) can use loss functions from PyTorch. Indeed, the NeuralProphet class has an optional loss_func parameter:

        loss_func : str, torch.nn.functional.loss
            Type of loss to use:

            Options
                * (default) ``Huber``: Huber loss function
                * ``MSE``: Mean Squared Error loss function
                * ``MAE``: Mean Absolute Error loss function
                * ``torch.nn.functional.loss.``: loss or callable for custom loss, eg. L1-Loss

            Examples
            --------
            >>> from neuralprophet import NeuralProphet
            >>> import torch
            >>> import torch.nn as nn
            >>> m = NeuralProphet(loss_func=torch.nn.L1Loss)
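
Note that the last option above means that loss_func also accepts a plain callable for a custom loss. A minimal sketch (the name mae_loss and the argument order are only illustrative assumptions):

import torch
from neuralprophet import NeuralProphet

# Hypothetical custom loss passed as a plain callable;
# NeuralProphet calls it on the prediction and target tensors.
def mae_loss(y_pred, y_true):
    return torch.mean(torch.abs(y_pred - y_true))

m = NeuralProphet(loss_func=mae_loss)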

However, when I run the code:

from neuralprophet import NeuralProphet
from tslearn.metrics import SoftDTWLossPyTorch
m = NeuralProphet(loss_func=SoftDTWLossPyTorch)

I obtain the following error message:

Traceback (most recent call last):
  File "/home/ycabanes/work/tslearn/codes/try_neuralprophet_with_softdtwlosspytorch.py", line 15, in <module>
    m = NeuralProphet(loss_func=SoftDTWLossPyTorch)
  File "/home/ycabanes/.local/lib/python3.8/site-packages/neuralprophet/forecaster.py", line 398, in __init__
    self.config_train = configure.Train(
  File "<string>", line 18, in __init__
  File "/home/ycabanes/.local/lib/python3.8/site-packages/neuralprophet/configure.py", line 112, in __post_init__
    self.set_loss_func()
  File "/home/ycabanes/.local/lib/python3.8/site-packages/neuralprophet/configure.py", line 134, in set_loss_func
    raise NotImplementedError(f"Loss function {self.loss_func} not found")
NotImplementedError: Loss function <class 'tslearn.metrics.soft_dtw_loss_pytorch.SoftDTWLossPyTorch'> not found

It seems that NeuralProphet does not accept the SoftDTWLossPyTorch class itself, but it does accept a plain callable. I have succeeded in combining SoftDTWLossPyTorch from tslearn with NeuralProphet from neuralprophet by defining:

from tslearn.metrics import SoftDTWLossPyTorch
from tslearn.metrics.soft_dtw_loss_pytorch import _SoftDTWLossPyTorch

def soft_dtw_loss_function(x, y, dist_func=SoftDTWLossPyTorch._euclidean_squared_dist, gamma=0.1):
    # Pairwise squared Euclidean distance matrix between the two batches of series
    d_xy = dist_func(x, y)
    # Soft-DTW computed on the distance matrix via tslearn's autograd function
    return _SoftDTWLossPyTorch.apply(d_xy, gamma)

and then:

m = NeuralProphet(loss_func=soft_dtw_loss_function)
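
For reference, here is a minimal standalone sketch of how SoftDTWLossPyTorch is normally called on batched tensors of shape (batch, timesteps, features); the shapes below are only illustrative. The wrapper above essentially reproduces this forward pass as a plain function, so that NeuralProphet accepts it as loss_func:

import torch
from tslearn.metrics import SoftDTWLossPyTorch

# Batches of univariate time series: shape (batch, timesteps, features)
x = torch.randn(4, 10, 1, requires_grad=True)
y = torch.randn(4, 10, 1)

loss = SoftDTWLossPyTorch(gamma=0.1)(x, y)  # one soft-DTW value per pair of series
loss.mean().backward()                      # gradients flow back to x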

Here is the full code of a notebook that runs on Google Colab, inspired by a tutorial notebook available in the NeuralProphet repository (https://github.com/ourownstory/neural_prophet/blob/main/docs/source/tutorials/tutorial10.ipynb):

Install the modules

try:
    import neuralprophet
except ImportError:
    !pip install neuralprophet[live]

try:
    import tslearn
except ImportError:
    !pip install tslearn

Import the modules

import pandas as pd
import torch
from neuralprophet import NeuralProphet, set_log_level
from tslearn.metrics import SoftDTWLossPyTorch
from tslearn.metrics.soft_dtw_loss_pytorch import _SoftDTWLossPyTorch

Define a SoftDTW loss function using tslearn

def soft_dtw_loss_function(x, y, dist_func=SoftDTWLossPyTorch._euclidean_squared_dist, gamma=0.1):
    d_xy = dist_func(x, y)
    return _SoftDTWLossPyTorch.apply(d_xy, gamma)

# Load the dataset from the CSV file using pandas
df = pd.read_csv("https://github.com/ourownstory/neuralprophet-data/raw/main/kaggle-energy/datasets/tutorial01.csv")

# Disable logging messages unless there is an error
set_log_level("ERROR")

# Model and prediction
m = NeuralProphet(loss_func=soft_dtw_loss_function)
m.set_plotting_backend("plotly")

df_train, df_val = m.split_df(df, valid_p=0.2)

print("Dataset size:", len(df))
print("Train dataset size:", len(df_train))
print("Validation dataset size:", len(df_val))

metrics = m.fit(df_train, validation_df=df_val, progress=None)
metrics

forecast = m.predict(df)
m.plot(forecast)

This helped me and my team a lot! Thank you @YannCabanes