For transfer learning, the pretrained CNN model is used as a feature extractor. Only the last block, the classification block, is trained on CIFAR-10. There are two kinds of transfer learning:
- classical-to-classical (c2c): the last block is a classical neural network.
- classical-to-quantum (c2q): the last block contains at least one layer of quantum neurons.
- Python 3.9
- PennyLane 0.32.0
- Torch 2.1.0
- macOS 12.1 with an Intel CPU
The first argument is a boolean flag that decides whether to include a quantum layer (1) or not (0).
The second argument is the batch size. For example:
python transf.py 0 16
runs classical transfer learning with batch_size=16.
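The two positional arguments could be parsed with a small helper like the one below; this is a sketch of the interface described above, and `transf.py`'s actual parsing may differ.

```python
import sys

def parse_args(argv):
    """Parse the two positional arguments: quantum flag and batch size."""
    use_quantum = bool(int(argv[0]))  # 1 -> add a quantum layer, 0 -> classical
    batch_size = int(argv[1])
    return use_quantum, batch_size

if __name__ == "__main__":
    use_quantum, batch_size = parse_args(sys.argv[1:])
    print(use_quantum, batch_size)
```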
For the original image size, i.e. (3, 32, 32), two-class classification with batch_size=16, classical-to-classical transfer learning takes 20 sec. per epoch.
Classical-to-quantum transfer learning takes 3 min. per epoch.
python scrat.py --Quantum 1
The argument decides whether to add a quantum layer (1) or not (0).
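The `--Quantum` flag could be defined with `argparse` as sketched below; the description string is illustrative, and `scrat.py` may define the option differently.

```python
import argparse

def build_parser():
    # Sketch of the CLI: --Quantum 1 adds a quantum layer, 0 (the
    # assumed default) keeps the model fully classical.
    parser = argparse.ArgumentParser(description="Train on CIFAR-10 from scratch")
    parser.add_argument("--Quantum", type=int, choices=[0, 1], default=0,
                        help="1 adds a quantum layer, 0 does not")
    return parser

args = build_parser().parse_args(["--Quantum", "1"])
```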
[1] Andrea Mari, Thomas R. Bromley, Josh Izaac, Maria Schuld, and Nathan Killoran.
Transfer learning in hybrid classical-quantum neural networks. Quantum 4, 340 (2020).
[2] Reference code of [1]. https://github.com/XanaduAI/quantum-transfer-learning/tree/master
[3] PyTorch tutorial. https://pytorch.org/tutorials/beginner/blitz/cifar10_tutorial.html
[4] PyTorch example for MNIST. https://github.com/pytorch/examples/blob/main/mnist/main.py