0bserver07/One-Hundred-Layers-Tiramisu

The number of trainable parameters is much larger than the FC-DenseNet in the paper

Shaun2016 opened this issue · 1 comment

I used this code to check the number of parameters in the model:

from keras import models

def parameter_num(model_structure):
    # Load the architecture from its saved JSON file; summary() prints the
    # total / trainable / non-trainable parameter counts directly.
    with open(model_structure) as model_file:
        model = models.model_from_json(model_file.read())
    model.summary()
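
For example, pointing it at one of the repo's saved architecture files (the filename here is hypothetical):

parameter_num('tiramisu_fc_dense103_model.json')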
but the results are much larger than those reported in Simon Jegou et al.'s paper:
Model 4:
Total params: 33,927,266
Trainable params: 33,902,114
Non-trainable params: 25,152

Model 5:
Total params: 59,332,562
Trainable params: 59,295,842
Non-trainable params: 36,720

Model 6:
Total params: 95,044,274
Trainable params: 94,993,778
Non-trainable params: 50,496

This issue is caused by the missing concatenation layer in the dense block: without concatenation, each convolution has to produce the full channel width itself instead of just growth_rate new feature maps, so the layers carry far more weights than the paper's FC-DenseNet.
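
For reference, here is a minimal sketch of a dense block with the concatenation wired in, written against the Keras 2 functional API; the input shape, number of layers, and growth rate are illustrative assumptions, not the repo's actual configuration:

from keras.layers import Input, Conv2D, BatchNormalization, Activation, concatenate
from keras.models import Model

def dense_block(x, num_layers=4, growth_rate=16):
    # Each layer only outputs `growth_rate` new feature maps; concatenating
    # them onto the running input is what keeps the weight count small.
    for _ in range(num_layers):
        y = BatchNormalization()(x)
        y = Activation('relu')(y)
        y = Conv2D(growth_rate, (3, 3), padding='same')(y)
        x = concatenate([x, y])  # the concat this comment says is missing
    return x

inputs = Input(shape=(224, 224, 48))   # assumed input shape, for illustration
outputs = dense_block(inputs)          # 48 -> 48 + 4 * 16 = 112 channels
Model(inputs, outputs).summary()

With the concatenation in place, channel counts grow linearly by growth_rate per layer; dropping it forces every convolution to reproduce the full channel width, which is where the inflated parameter counts above come from.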