This is the implementation of the paper *Deep Learning Enabled Semantic Communication Systems* (IEEE Transactions on Signal Processing, 2021).
Requirements:

```
tensorflow-gpu==2.1.0
bert4keras==0.4.2
```
Preprocess the raw text to create the input sequences:

```bash
mkdir data
wget http://www.statmt.org/europarl/v7/europarl.tgz
tar zxvf europarl.tgz
python dataset/preprocess_text.py
```
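The details live in `dataset/preprocess_text.py`; conceptually, this step turns raw Europarl lines into token-ID sequences. The sketch below illustrates that kind of pipeline under assumed choices (lower-casing, punctuation stripping, a length filter, and a frequency-sorted vocabulary); it is not the repo's actual script.

```python
# Illustrative text-preprocessing sketch (assumed, not the repo's script):
# clean each line, keep moderate-length sentences, map words to integer IDs.
import re
from collections import Counter

def clean(line):
    """Lower-case a line and strip everything but letters and apostrophes."""
    return re.sub(r"[^a-z' ]+", " ", line.lower()).split()

def build_sequences(lines, min_len=4, max_len=30):
    """Return token-ID sequences and the vocabulary built from `lines`."""
    sents = [toks for toks in (clean(l) for l in lines)
             if min_len <= len(toks) <= max_len]
    counts = Counter(tok for s in sents for tok in s)
    vocab = {w: i + 2 for i, (w, _) in enumerate(counts.most_common())}
    vocab['<PAD>'], vocab['<UNK>'] = 0, 1  # reserved IDs
    return [[vocab.get(tok, vocab['<UNK>']) for tok in s] for s in sents], vocab
```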
The training options are listed below:

| Parameter | Description |
| --- | --- |
| `--train-snr` | Training SNR (dB) |
| `--train-with-mine` | Train with the MINE model to maximize mutual information (see the sketch after the example below) |
| `--channel` | Training channel (e.g., AWGN) |
| `--bs` | Training batch size |
| `--lr` | Training learning rate |
| `--checkpoint-path` | Path to save the model |

Example:

```bash
python main.py --bs=64 --train-snr=6 --channel=AWGN --train-with-mine --checkpoint-path=./checkpoint
```
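For reference, the sketch below shows the MINE (Mutual Information Neural Estimation) objective that `--train-with-mine` turns on: a small statistics network trained to maximize the Donsker-Varadhan lower bound on the mutual information between the channel input and output. The names `MineNet` and `mine_lower_bound` are illustrative, not the repo's actual API.

```python
import tensorflow as tf

class MineNet(tf.keras.Model):
    """Statistics network T(x, y) for the Donsker-Varadhan bound."""
    def __init__(self, hidden=256):
        super().__init__()
        self.h1 = tf.keras.layers.Dense(hidden, activation='relu')
        self.h2 = tf.keras.layers.Dense(hidden, activation='relu')
        self.out = tf.keras.layers.Dense(1)

    def call(self, x, y):
        return self.out(self.h2(self.h1(tf.concat([x, y], axis=-1))))

def mine_lower_bound(net, x, y):
    """I(X;Y) >= E_p(x,y)[T(x,y)] - log E_p(x)p(y)[exp(T(x,y))]."""
    # Shuffle y across the batch to approximate samples from p(x)p(y).
    idx = tf.random.shuffle(tf.range(tf.shape(y)[0]))
    t_joint = net(x, y)
    t_marginal = net(x, tf.gather(y, idx))
    return (tf.reduce_mean(t_joint)
            - tf.math.log(tf.reduce_mean(tf.exp(t_marginal)) + 1e-8))
```

During training, this bound is maximized alongside the reconstruction loss, so the transmitted symbols retain as much information as possible about the received ones.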
The evaluation options are listed below:

| Parameter | Description |
| --- | --- |
| `--test-snr` | Test SNR (dB) |
| `--channel` | Test channel (e.g., AWGN) |
| `--bs` | Evaluation batch size |
| `--checkpoint-path` | Path to the saved model |

Example:

```bash
python evaluation.py --bs=256 --test-snr=6 --channel=AWGN --checkpoint-path=./checkpoint
```
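For reference, the AWGN channel selected by `--channel=AWGN` can be simulated as below. This is a minimal sketch of the standard construction (SNR in dB converted to a noise variance), not necessarily the repo's exact channel layer.

```python
import tensorflow as tf

def awgn_channel(x, snr_db):
    """Pass symbols x through an AWGN channel at the given SNR (dB)."""
    snr = 10.0 ** (snr_db / 10.0)                # dB -> linear scale
    signal_power = tf.reduce_mean(tf.square(x))  # empirical signal power
    noise_std = tf.sqrt(signal_power / snr)
    return x + tf.random.normal(tf.shape(x), stddev=noise_std)
```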
- If you want to compute the sentence similarity metric, please download the pre-trained BERT model first; a sketch of the similarity computation follows.
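In the paper, sentence similarity is measured as the cosine similarity between the BERT embeddings of the transmitted and the recovered sentence. Assuming the embeddings have already been extracted with the downloaded BERT model, the final step reduces to the following (the helper name is hypothetical):

```python
import numpy as np

def sentence_similarity(emb_tx, emb_rx):
    """Cosine similarity between two BERT sentence embeddings."""
    a = np.asarray(emb_tx, dtype=np.float32)
    b = np.asarray(emb_rx, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))
```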
Citation:

```bibtex
@article{xie2021deep,
  author={H. Xie and Z. Qin and G. Y. Li and B.-H. Juang},
  journal={IEEE Transactions on Signal Processing},
  title={Deep Learning Enabled Semantic Communication Systems},
  year={2021},
  month={April},
  volume={69},
  pages={2663-2675}
}
```