In fault diagnosis, varying working conditions, such as changes in rotational speed or load, pose a significant challenge because of the domain gap they introduce. Data-driven deep learning methods often fail to generalize to unseen domains, leading to a severe drop in fault diagnosis performance. One potential way to improve model generalization is masked autoencoder (MAE) pretraining. This repository explores the feasibility of this approach for handling varying working conditions. The pipeline is shown in the figure below.
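The core of MAE pretraining is hiding a large fraction of the input and training the encoder to support reconstruction of the missing parts. Below is a minimal, hypothetical sketch of the random-masking step for a 1-D vibration signal split into patches; the function name and shapes are illustrative, not this repository's API.

```python
import numpy as np

def random_mask(patches, mask_ratio=0.75, rng=None):
    """Randomly mask a fraction of patches, as in MAE-style pretraining.

    patches: (num_patches, patch_len) array of signal segments.
    Returns the visible patches and a boolean mask (True = masked).
    """
    rng = rng or np.random.default_rng(0)
    n = patches.shape[0]
    n_masked = int(n * mask_ratio)
    perm = rng.permutation(n)
    mask = np.zeros(n, dtype=bool)
    mask[perm[:n_masked]] = True
    # Only the visible (unmasked) patches are fed to the encoder;
    # the decoder later reconstructs the masked ones.
    return patches[~mask], mask

# Example: split a synthetic 1-D signal into 16 patches of length 64
signal = np.sin(np.linspace(0, 100, 1024))
patches = signal.reshape(16, 64)
visible, mask = random_mask(patches, mask_ratio=0.75)
print(visible.shape)  # (4, 64): only 25% of patches reach the encoder
print(mask.sum())     # 12 patches are masked and must be reconstructed
```

With a 75% mask ratio only a quarter of the patches are encoded, which is what makes MAE pretraining cheap relative to full-sequence training.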
We train the classifier on the LinGang dataset under variable working conditions, e.g., using 1800 rpm data as the training domain and 1200 rpm data as the testing domain. Results are averaged over 20 repeated experiments. With the MAE-pretrained parameters loaded, the encoder is frozen except for its last two layers. The experiments show a 2-3 percentage point improvement in average accuracy over the baseline.
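Freezing all encoder layers except the last two can be sketched as follows. This is an illustrative PyTorch snippet, not the repository's actual model code; the layer count and dimensions are assumptions.

```python
import torch.nn as nn

# A stand-in transformer encoder with 6 layers (dimensions are illustrative)
encoder = nn.ModuleList([
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
    for _ in range(6)
])

# Freeze every layer except the last two, so only those
# (plus the downstream classifier head) receive gradient updates
for layer in encoder[:-2]:
    for p in layer.parameters():
        p.requires_grad = False

n_trainable = sum(p.numel() for l in encoder for p in l.parameters()
                  if p.requires_grad)
n_total = sum(p.numel() for l in encoder for p in l.parameters())
print(f"trainable: {n_trainable}/{n_total} encoder parameters")
```

When building the optimizer, pass only the parameters with `requires_grad=True` (e.g. via `filter(lambda p: p.requires_grad, model.parameters())`).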
You can install the dependencies with the following command.

```shell
pip install -e .
```
Download the mechanical fault diagnosis datasets, i.e., the CRWU bearing dataset, HUST, and LinGang. Put the datasets under a single folder; its path is passed to the training scripts.
```
your/directory
--CRWU
--HUST
--LinGang
```
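To catch path mistakes before launching training, the expected layout can be verified with a short, hypothetical helper (not part of this repository):

```python
from pathlib import Path
import tempfile

# Dataset folder names expected under the data root (from the layout above)
EXPECTED = ["CRWU", "HUST", "LinGang"]

def check_layout(root):
    """Return the list of expected dataset folders missing under root."""
    root = Path(root)
    return [d for d in EXPECTED if not (root / d).is_dir()]

# Example with a temporary directory standing in for your data root
with tempfile.TemporaryDirectory() as tmp:
    for name in EXPECTED:
        (Path(tmp) / name).mkdir()
    print(check_layout(tmp))  # [] -> all three dataset folders found
```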
```shell
export PYTHONPATH=/path/to/project
# Stage 1: MAE pretraining
python tools/train_mae.py --datapath /path/to/your/data
# Stage 2: classifier training with the pretrained encoder
python tools/train_classifier.py --datapath /path/to/your/data
```