Repository of the Deep Multitask Texture Classifier (MTL-TCNN), created as part of an independent research study under Prof. (Dr.) Dapeng Oliver Wu, ECE, University of Florida, USA, in Spring 2020 (February - April 2020).
This project uses the paper "Using filter banks in Convolutional Neural Networks for texture classification" [arXiv] by V. Andrearczyk & Paul F. Whelan as its baseline model.
My PyTorch implementation of TCNN3 as a single-task classifier, developed as part of this study, can be found at the following location.
The report of this research is available at the following location.
Texture conveys important characteristics of many types of images in computer vision, and classifying textures is one of the most challenging problems in pattern recognition, one that has drawn the attention of computer vision researchers for decades. Recently, with the popularity of deep learning algorithms, particularly the Convolutional Neural Network (CNN), researchers can extract features that improve the performance of tasks such as object detection and recognition significantly over previous handcrafted features. In texture classification, the CNN layers can be used as filter banks for feature extraction, whose complexity increases with the depth of the network. In this study, we introduce a novel multitask texture classifier (MTL-TCNN) that uses multitask learning instead of pretraining, sharing a feature representation between two tasks: identifying objects from the ImageNet dataset using AlexNet, and classifying textures using TCNN3. For evaluation, we used two standard benchmark datasets for texture classification (KTH-TIPS and DTD). Our experiments demonstrate improved texture classification performance over TCNN3.
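The following is a minimal PyTorch sketch (not the repository's actual code) of the multitask idea described above: a shared convolutional trunk acts as a learned filter bank and feeds two task-specific heads, an AlexNet-style object classification head and a TCNN3-style texture head that globally average-pools the feature maps. All class names, layer sizes, and defaults here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MTLTCNNSketch(nn.Module):
    """Illustrative multitask model: shared conv trunk + two task heads."""

    def __init__(self, num_objects=1000, num_textures=47):  # 47 = DTD class count
        super().__init__()
        # Shared AlexNet-like convolutional feature extractor (the "filter banks").
        self.shared = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(64, 192, kernel_size=5, padding=2), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(192, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        # Object head: spatial pooling + fully connected classifier (AlexNet-style).
        self.object_head = nn.Sequential(
            nn.AdaptiveAvgPool2d((6, 6)), nn.Flatten(),
            nn.Linear(384 * 6 * 6, 4096), nn.ReLU(inplace=True),
            nn.Linear(4096, num_objects),
        )
        # Texture head: global average pooling over feature maps (TCNN3-style),
        # treating the conv responses as orderless filter-bank statistics.
        self.texture_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(384, num_textures),
        )

    def forward(self, x, task="texture"):
        feats = self.shared(x)
        return self.object_head(feats) if task == "object" else self.texture_head(feats)
```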
ImageNet: The ImageNet dataset files can be accessed from the location. Download the files and place them in the /Dataset/ImageNet folder.
DTD: The DTD dataset files can be accessed from the location. Download the files and place them in the /Dataset/Texture/DTD folder.
KTH: The KTH-TIPS dataset files can be accessed from the location. Download the files and place them in the /Dataset/Texture/kth folder.
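As a hedged illustration (not the repository's actual loading code), the texture datasets could be read from the folders above with torchvision's ImageFolder, assuming each archive is extracted into a class-per-subfolder layout:

```python
import torch
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((227, 227)),  # AlexNet-style input size (assumption)
    transforms.ToTensor(),
])

# Paths follow the folder layout described above.
dtd = datasets.ImageFolder("Dataset/Texture/DTD", transform=transform)
kth = datasets.ImageFolder("Dataset/Texture/kth", transform=transform)

dtd_loader = torch.utils.data.DataLoader(dtd, batch_size=32, shuffle=True)
kth_loader = torch.utils.data.DataLoader(kth, batch_size=32, shuffle=True)
```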
To reproduce the experiments described in the report, first download the datasets as described above and then run the following command:
python3 main_texture_classifier.py
Epochs (DTD): 400
Epochs (KTH): 400
Learning rate: 0.0001
Batch size: 32
Weight decay: 0.0005
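Below is an illustrative training loop wiring the hyperparameters above to the sketches from the earlier sections; the optimizer choice (Adam) and loss function are assumptions, and main_texture_classifier.py defines the actual setup.

```python
import torch

model = MTLTCNNSketch()            # hypothetical model from the sketch above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=5e-4)
criterion = torch.nn.CrossEntropyLoss()

for epoch in range(400):           # 400 epochs for both DTD and KTH
    for images, labels in dtd_loader:   # batch size 32 from the loader above
        optimizer.zero_grad()
        loss = criterion(model(images, task="texture"), labels)
        loss.backward()
        optimizer.step()
```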
beingshantanu2406@gmail.com
shantanu.ghosh@ufl.edu
© Shantanu Ghosh, University of Florida
Licensed under the MIT License