Why do you use conv2d instead of conv1d with time-series data?
blacksnail789521 opened this issue · 3 comments
Within your implementation, you guys used conv2d in ResNet. Just out of curiosity, why don't you use conv1d, which is the norm in the time-series domain? Thanks.
Actually, both conv2D (https://github.com/cauchyturing/UCR_Time_Series_Classification_Deep_Learning_Baseline/blob/master/ResNet.py) and conv1D (https://github.com/hfawaz/dl-4-tsc/blob/master/classifiers/resnet.py) are adopted in time series modeling. Since most ResNet implementations use conv2D (which makes sense for images), we simply adopt the standard conv2D ResNet as the backbone in our implementation. In this case, we treat each whole time series as one row of a two-dimensional image, so that a batch of time series forms a stack of 2D images. If you focus on timestamp-level representations instead, we recommend referring to the paper "Time Series Classification from Scratch with Deep Neural Networks: A Strong Baseline" for a ResNet implementation specifically adjusted for time series.
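For intuition, here is a minimal PyTorch sketch (illustrative only, not our repository's actual code; all shapes and layer configurations are assumptions) showing that treating a series as a one-row "image" and applying conv2D with a 1×k kernel computes exactly what conv1D with kernel size k would:

```python
import torch
import torch.nn as nn

batch, channels, timesteps = 8, 3, 128
x = torch.randn(batch, channels, timesteps)         # (N, C, T) multivariate series

conv1d = nn.Conv1d(channels, 16, kernel_size=7, padding=3)
conv2d = nn.Conv2d(channels, 16, kernel_size=(1, 7), padding=(0, 3))

# Copy the 1D weights into the 2D layer so the two layers are numerically identical.
with torch.no_grad():
    conv2d.weight.copy_(conv1d.weight.unsqueeze(2))  # (16, C, 7) -> (16, C, 1, 7)
    conv2d.bias.copy_(conv1d.bias)

out1 = conv1d(x)                                     # (N, 16, T)
out2 = conv2d(x.unsqueeze(2)).squeeze(2)             # series as a one-row "image"
print(torch.allclose(out1, out2, atol=1e-6))         # True
```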
I'm sorry, but the paper clearly states that they use 1-D kernels.
For SOTA method implementations, you can check tsai.
They consistently choose conv1d over conv2d.
Thanks for your time.
Thanks for your advice.
What we want to emphasize is that there is no strict rule about using 1D or 2D convolutions when modeling time series. In existing studies, conv1D is more common in convolutional models for time series because of its better interpretability when "time" is the only dimension. But the convolution can be 2D or higher if you can find other axes along which convolving makes sense.
There are two distinct cases for using 2D convolution with time series: 1) processing multiple time series together, and 2) processing several consecutive intervals of a single time series. If you work with video stream data and want to apply convolution filters, you can even use a 3D convolution, as sketched below.
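Here is an illustrative PyTorch sketch of those cases (shapes, window sizes, and layer configurations are assumptions for demonstration, not from our codebase):

```python
import torch
import torch.nn as nn

# Case 1: several related univariate series stacked into a (series, time) grid;
# the 2D kernel mixes information across neighboring series and timestamps.
stack = torch.randn(8, 1, 10, 128)        # (N, 1 channel, n_series, T)
conv2d = nn.Conv2d(1, 16, kernel_size=(3, 7), padding=(1, 3))
print(conv2d(stack).shape)                # torch.Size([8, 16, 10, 128])

# Case 2: consecutive intervals of one long series arranged as rows of an
# "image": a 1280-step series cut into 10 windows of 128 steps, same layout.
single = torch.randn(8, 1, 1280)
windows = single.view(8, 1, 10, 128)      # (N, 1, n_windows, window_len)
print(conv2d(windows).shape)              # torch.Size([8, 16, 10, 128])

# Video: frames add height/width on top of time, so a 3D convolution over
# (time, height, width) becomes natural.
video = torch.randn(8, 3, 16, 64, 64)     # (N, RGB, frames, H, W)
conv3d = nn.Conv3d(3, 16, kernel_size=3, padding=1)
print(conv3d(video).shape)                # torch.Size([8, 16, 16, 64, 64])
```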
Hope this helps. Thanks.