BICLab/SpikeYOLO

Running an example with sample data and understanding the spike neuron usage and reset

gwgknudayanga opened this issue · 3 comments

Hi,

Could you please help me with the following questions related to the code?

  1. You have mentioned two files (train_SpikeYOLO.py and test_SpikeYOLO.py) for training and testing respectively, but those two files are not available. Is anything missing?
  2. You are also using functional.reset_net(model), which comes from spikingjelly, but you are not using any neurons from the spikingjelly package since you use your custom neuron 'mem_update'. Why do you call functional.reset_net, and what is its importance?
  3. The YOLO() model accepts a .yaml config file as its constructor argument. How can we pass a saved '.pt' checkpoint at test time?
  4. Is there any encoded/prepared input data that you used, so that we can feed it to the code base at first and get some initial familiarity?
  5. In snn_yolov8.yaml I can see the backbone and head parts, and they are parsed with parse_model in tasks.py to form the model. But where is the 'Neck' module implemented (it seems it is also in the head definition)?
  6. Did you also do mosaic augmentation for the Prophesee GEN1 DVS data? It would be better if we could have the dataset class that you used to load the data for the spiking neural network.
  7. In 'SpikeDFL(nn.Module)', although a spike neuron is defined, it seems it is not used. So it is the same as the normal DFL without any spiking, isn't it?

Thank you.

Best Rgds,
Udayanga

Thank you for your questions; here are the answers:

  1. The README.md instructions were incorrect; just executing train.py or test.py is fine. Thank you for the reminder, we have updated the README.md.
  2. As you said, "functional.reset_net" is a function from spikingjelly; it is there because we are also experimenting with spikingjelly neurons. In fact, this line has no effect in the code we published (see the reset_net sketch after this list).
  3. Change the ".yaml" path to the path of the ".pt" file you want to load (see the loading sketch after this list).
  4. The COCO dataset can be used for verification, and it is processed in the same way as in the other YOLO-series models. We have also uploaded "coco2yolo.py" for data preparation (a sketch of that conversion follows this list).
  5. The neck is also built in tasks.py.
  6. The results reported in our paper do not use it, but mosaic data augmentation is used in our latest work. Feel free to follow our group's work.
  7. Yes, it is (see the DFL sketch after this list).
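
Regarding (2), here is a minimal sketch of how functional.reset_net is normally used with spikingjelly's stateful neurons (the import path assumes a recent spikingjelly release; older releases expose the same function under spikingjelly.clock_driven):

```python
import torch
import torch.nn as nn
from spikingjelly.activation_based import neuron, functional

net = nn.Sequential(nn.Linear(10, 10), neuron.LIFNode())

for x in torch.rand(4, 2, 10):   # pretend mini-batches
    out = net(x)                 # LIFNode accumulates membrane potential across calls
    functional.reset_net(net)    # clear that state before the next sample

# A custom neuron such as SpikeYOLO's mem_update does not inherit from
# spikingjelly's stateful base module, so reset_net finds nothing to reset,
# which appears to be why the call is a no-op in the published code.
```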
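Regarding (3), a minimal sketch, assuming SpikeYOLO keeps the stock ultralytics YOLO() interface; the checkpoint path below is hypothetical:

```python
from ultralytics import YOLO

# Training: build the architecture from the .yaml config
model = YOLO("snn_yolov8.yaml")

# Testing: pass the saved checkpoint path instead of the .yaml
model = YOLO("runs/detect/train/weights/best.pt")  # hypothetical path
metrics = model.val(data="coco.yaml")              # evaluate on COCO
```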
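Regarding (4), the conversion a coco2yolo-style script performs is the generic COCO-to-YOLO label mapping; the helper below is an illustrative sketch, not necessarily the repo's exact coco2yolo.py:

```python
def coco_bbox_to_yolo(bbox, img_w, img_h):
    """COCO bbox [x_min, y_min, w, h] in pixels -> YOLO (cx, cy, w, h), normalized to [0, 1]."""
    x, y, w, h = bbox
    cx = (x + w / 2) / img_w
    cy = (y + h / 2) / img_h
    return cx, cy, w / img_w, h / img_h

# Each image then gets one "<class_id> cx cy w h" line per object in its label .txt file.
```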
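Regarding (7), since the declared spike neuron is never called, SpikeDFL's forward pass reduces to the standard ultralytics DFL decoding, roughly:

```python
import torch
import torch.nn as nn

class DFL(nn.Module):
    """Integral module of Distribution Focal Loss: decodes a discrete
    distribution over c1 bins into an expected box offset."""
    def __init__(self, c1=16):
        super().__init__()
        self.c1 = c1
        # Fixed 1x1 conv whose weights are 0..c1-1, i.e. the expectation operator
        self.conv = nn.Conv2d(c1, 1, 1, bias=False).requires_grad_(False)
        self.conv.weight.data[:] = torch.arange(c1, dtype=torch.float).view(1, c1, 1, 1)

    def forward(self, x):
        # x: (batch, 4*c1, anchors) -> softmax over the c1 bins per box side,
        # then take the expectation to get (batch, 4, anchors)
        b, _, a = x.shape
        return self.conv(x.view(b, 4, self.c1, a).transpose(2, 1).softmax(1)).view(b, 4, a)
```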

Thank you for your reply. Could you please also provide the dataset.py class used to load the Prophesee GEN1 DVS data?

We will upload the GEN1 dataset and its processing method in a few weeks.