Count in case of very Different objects
Closed this issue · 18 comments
@zhiyuanyou
Can the model count effectively if there is more than one class of objects in the image, with the classes varying considerably in shape and color?
Hi~
FSC-147 contains some images with the features you mention. Frankly speaking, the counting performance on these images is worse than on images with only one class of objects, but I think the model still outputs reasonable results.
@zhiyuanyou Isn't the model's methodology generic enough to count objects falling into different classes?
That is because we do not design the backbone ourselves.
Therefore, if the backbone-extracted features for different objects are quite distinct, counting works fine. However, if different kinds of objects look alike and share similar features, the performance drops a lot.
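The intuition above can be sketched with a toy cosine-similarity check on pooled features. The vectors below are purely illustrative, not actual backbone outputs:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical pooled features for an exemplar and two candidate objects.
exemplar = [0.9, 0.1, 0.2]
same_class = [0.85, 0.15, 0.25]   # visually similar object
other_class = [0.1, 0.9, 0.3]     # visually distinct object

print(cosine_similarity(exemplar, same_class))   # high -> contributes to the count
print(cosine_similarity(exemplar, other_class))  # low  -> ignored
```

When two classes produce features as close as `exemplar` and `same_class`, the model cannot separate them, which is where the performance drop comes from.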
@zhiyuanyou
Thank you for providing the details
Could you please share links to the pretrained models? The one below is not working:
https://drive.google.com/file/d/1mbV0xJdORIpSLlMCwlgENMB9Y1kUOhk2/view?usp=sharing
Also, will an FSC-dataset-based model be the best fit for retail-store items? That is where I plan to use it.
Regards,
Jaideep
I have tried, and I could download the model from https://drive.google.com/file/d/1mbV0xJdORIpSLlMCwlgENMB9Y1kUOhk2/view?usp=sharing.
What do you mean by "not working"?
Now I can download it. Is its extension .pth only, or .rar? I am unable to extract it using a rar extractor.
Well, you do not need to extract this file.
What you need to do is set config.saver.load_path to the path of the downloaded file, i.e., /xxx/xxx/xxx/ckpt_best.pth.tar. Then you can use this pre-trained model.
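For reference, the relevant part of config.yaml might look something like this (the path is a placeholder for wherever you saved the checkpoint):

```yaml
saver:
  load_path: /path/to/ckpt_best.pth.tar  # downloaded checkpoint; do not extract it
```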
@zhiyuanyou Sure, thank you.
What do we have to do in order to run on a single image, or a handful of unseen images, whose density maps I don't have? Is there a demo script of some kind?
Do we have to build a fresh ./config.yaml file? It would need to contain default values for some important config attributes, e.g.:
```python
import importlib

def build_network(config):
    # "builder" is a dotted path "module.function"; kwargs are passed through.
    mtype = config["builder"]
    module_name, fn_name = mtype.rsplit(".", 1)
    module = importlib.import_module(module_name)
    model_builder = getattr(module, fn_name)
    kwargs = config.get("kwargs", {})
    model = model_builder(**kwargs)
    return model
```
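To illustrate how the builder string is resolved, here is a minimal, self-contained run of the same pattern, using a standard-library callable in place of a real model builder:

```python
import importlib

# Resolve a dotted "module.function" builder string, as build_network() does.
# collections.Counter stands in for a model constructor here.
config = {"builder": "collections.Counter", "kwargs": {}}
module_name, fn_name = config["builder"].rsplit(".", 1)
builder = getattr(importlib.import_module(module_name), fn_name)
obj = builder(**config.get("kwargs", {}))
print(type(obj).__name__)  # Counter
```

In the real config, "builder" would point at the model-building function shipped with the repo, and "kwargs" would carry its constructor arguments.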
As I understand it, you want to use the FSC-147 pre-trained model to infer on your own data (without ground truth).
(1) In this case, you need to write an infer() function, with reference to the eval() function in main.py, so that it only outputs results and does not evaluate the metrics.
(2) Yes. You need to make a new directory holding a new config.yaml and an sh file. You should also make your data format the same as the example.
If you are not willing to write an infer() function, you could instead generate a density map whose sum equals the object count, then run eval() to infer on your own data.
Note that for eval(), the only requirement is that the sum of the density map equals the object count, since only the sum of the density map is used to evaluate performance. For train(), however, you must generate a proper density map.
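Since eval() only checks the sum, a placeholder density map for an unlabeled image can simply spread the count uniformly. A minimal sketch (array shape and on-disk format must of course match what the dataloader expects):

```python
def dummy_density_map(count, height, width):
    # Uniform map whose entries sum exactly to `count`;
    # good enough for eval(), but NOT for train().
    value = count / (height * width)
    return [[value] * width for _ in range(height)]

dm = dummy_density_map(12, 4, 6)
total = sum(sum(row) for row in dm)
print(total)  # 12.0
```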
@zhiyuanyou
There are three versions of SAFECount available: safecount, safeexemplar, and safecountcrossdata. Which one should I pick for the FSC dataset?
I was trying to figure out the right value of the builder attribute that one needs to pass in to build the right kind of model:
mtype = config["builder"]
module_name, fn_name = mtype.rsplit(".", 1)
It is safecount.py. Please see Part 2 in the README.
@zhiyuanyou Yes, I can see the desired values in experiments now.
Two other things:
- Do we have to download a resnet18 checkpoint specific to your task? You have given a path in your config files:
pretrained_model: /mnt/lustre/share/DSK/model_zoo/pytorch/imagenet/resnet18-5c106cde.pth
self.resnet.load_state_dict(torch.load(pretrained_model))
- Can we use ResNet50 as the backbone, or do we need to retrain the model on the FSC dataset?
- ResNet will be downloaded automatically.
- You can use ResNet50, of course.
- You do not need to retrain the model.
- I was able to get decent test results with multi-class object images. Thanks for your support.
- During training with ResNet18 as the backbone, wasn't the backbone fine-tuned to the objects of FSC-147?
- If I use ResNet50, ImageNet weights will be loaded into it. Will the model still work fine without fine-tuning the backbone on FSC objects?
- Could you help me understand what meta['points'] is? What is the difference between density_maps and points?
- For my own full dataset, if I have to fine-tune your model, how do I generate the density maps and points? Any idea?
- Can the accuracy of the overall model be improved by using other state-of-the-art backbones like EfficientNet, ResNeSt101d, ResNeSt200d, RegNet, etc.?
- Congratulations.
- The backbone is always frozen during training.
- You could give it a try. Note that the backbone is always frozen during training. However, other parameters may need to be re-tuned on FSC-147 to work with the new backbone.
- You could take a detailed look at the FSC-147 dataset. ["points"] means the object centers, and the density map is generated from the object centers.
- You can follow gen_gt_density.py under the sub-directories of data.
- Actually, we aim to compare with other works using similar backbones. Other works adopt resnet18 and resnet50, so we do not consider other backbones.
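The relationship between ["points"] and the density map can be sketched as placing a normalized Gaussian at each object center, so the map still sums to the object count. This is a simplified stand-in for what gen_gt_density.py does; the kernel size and sigma here are arbitrary choices:

```python
import math

def points_to_density(points, height, width, sigma=1.0, radius=3):
    density = [[0.0] * width for _ in range(height)]
    for (py, px) in points:
        # Build a local Gaussian kernel and normalize it to sum to 1,
        # so each point contributes exactly one count to the map.
        kernel, coords = [], []
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                y, x = py + dy, px + dx
                if 0 <= y < height and 0 <= x < width:
                    kernel.append(math.exp(-(dy * dy + dx * dx) / (2 * sigma * sigma)))
                    coords.append((y, x))
        total = sum(kernel)
        for w, (y, x) in zip(kernel, coords):
            density[y][x] += w / total
    return density

points = [(5, 5), (10, 12)]  # object centers as (row, col)
density = points_to_density(points, 20, 20)
print(sum(sum(row) for row in density))  # sums to 2, one per object
```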
@zhiyuanyou How did you generate the bounding boxes? In the dataset I could only find the density maps. Do you have a script to generate the JSON files?
What do you mean by "the bounding boxes"? Do you mean the exemplar objects? If so, they should come from your own annotation, i.e., you should annotate three objects as exemplars for counting.
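A minimal annotation for one image might then look like the sketch below. The field names and box layout are illustrative assumptions; check the examples under data/ in the repo for the exact schema:

```python
import json

# Hypothetical annotation: three exemplar boxes, each as [y1, x1, y2, x2].
annotation = {
    "filename": "shelf_001.jpg",
    "boxes": [
        [34, 50, 80, 96],
        [120, 40, 166, 88],
        [210, 55, 255, 100],
    ],
}
print(json.dumps(annotation))
```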