- `TimeLSTMLayer.py` and `LSTM_AlexGraves.py` are the first versions of the code. They are now obsolete because they ignore the information exchange among different neurons.
- `LSTM_v1.py` is a self-written LSTM, used to test our ability and to show that weight initialisation is important.
- `TimeLSTM_v1.py` is our replication of the Alex Graves LSTM.
- `TimeLSTM_v2.py` uses fewer parameters.
- `TimeLSTM_v3.py` uses three types of time gates.
- `TimeLSTM_v4.py` introduces the prediction end time as an additive output.
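As a rough illustration of what a time gate looks like, here is a minimal sketch. It is not the exact formulation used in the `TimeLSTM_v*.py` files; the layer names and the choice of `tanh` on the time term are assumptions for illustration only:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of a single time gate: a sigmoid gate that mixes
# the input x_t with the elapsed time delta_t since the last event.
# Not the authors' exact formulation.
class TimeGate(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.W_x = nn.Linear(input_size, hidden_size, bias=False)
        self.w_t = nn.Linear(1, hidden_size)  # acts on the time interval

    def forward(self, x, delta_t):
        # delta_t: (batch, 1) elapsed time since the last observation
        return torch.sigmoid(self.W_x(x) + torch.tanh(self.w_t(delta_t)))

gate = TimeGate(input_size=8, hidden_size=16)
x = torch.randn(4, 8)
dt = torch.rand(4, 1)
g = gate(x, dt)  # gate activations in (0, 1), shape (4, 16)
```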
- In our model, `nn.Parameter` tensors are kept on the CPU while only the computation is performed on the GPU in a differentiable manner. That way, gradients automatically accumulate back onto the CPU parameters. See the explanation by albanD.
- Our model also supports multiple GPUs, as described in the Stack Overflow explanation.
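The CPU-parameter pattern above can be sketched as follows. The module and variable names are illustrative, not taken from this repository; the key point is that `.to()` is differentiable, so gradients flow back to the CPU-resident parameter:

```python
import torch
import torch.nn as nn

# Sketch: keep the parameter on the CPU and move it to the compute
# device inside forward(); autograd routes the gradient back to the
# CPU-resident nn.Parameter automatically.
class CPUParamLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        # Parameter lives on the CPU.
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)

    def forward(self, x):
        # .to() is differentiable, so the on-device copy stays in the
        # autograd graph and the gradient reaches the CPU parameter.
        w = self.weight.to(x.device)
        return x @ w.t()

device = "cuda" if torch.cuda.is_available() else "cpu"
layer = CPUParamLinear(4, 2)
x = torch.randn(3, 4, device=device)
out = layer(x)
out.sum().backward()
# The gradient has accumulated on the CPU parameter.
print(layer.weight.grad.device)  # cpu
```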
Hidden states -> predicted RSSI

- The 1st solution is to use a DNN to extract the predicted RSSI for the next available beacon interval. Its input is the hidden-state features at step n. We build another feature vector, `unknown_feature ... unknown_feature, rssi_last, left_available_time`, together with the time hidden states at step n; the latter is created by another DNN from the difference between the estimated time period and the last available time point (two differences).
- The 2nd solution is to take the difference between the estimated time period and the last available time point (two differences),
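The 1st solution could be sketched as below. The layer sizes, and the use of `rssi_last` and `left_available_time` as the two extra features, are assumptions for illustration; the repository's actual DNN may differ:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the 1st solution: a small DNN head that maps
# the LSTM hidden state at step n plus extra features (e.g. rssi_last,
# left_available_time) to the predicted RSSI for the next beacon interval.
class RSSIHead(nn.Module):
    def __init__(self, hidden_size, extra_features=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_size + extra_features, 32),
            nn.ReLU(),
            nn.Linear(32, 1),  # predicted RSSI (scalar per sample)
        )

    def forward(self, hidden, extra):
        # hidden: (batch, hidden_size), extra: (batch, extra_features)
        return self.net(torch.cat([hidden, extra], dim=-1))

head = RSSIHead(hidden_size=16)
hidden = torch.randn(4, 16)      # hidden states at step n
extra = torch.randn(4, 2)        # e.g. rssi_last, left_available_time
pred_rssi = head(hidden, extra)  # shape (4, 1)
```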