Borda/pyImSegm

how to use GPU to compute?

Jian-danai opened this issue · 4 comments

I found that the computing process is slow and no GPU is used. How can I implement the training with a GPU?

Borda commented

Thanks for your comment.
Indeed, no GPU is used, since this was considered more or less a research package.
I agree that some computation could be moved to the GPU; on the other hand, it is built on standard libraries, so if you find a GPU-backed alternative, the switch should not be hard...
@Jian-danai May I ask what training you have in mind? Computing features, or fitting the classifier/model?

blasco commented

I'm struggling with the same issue: when texture metrics are used, the training is too slow, and, as said, no GPU is used. It would be a great addition.

Borda commented

@blasco Thank you for your interest. Unfortunately, this project is no longer actively developed, as it was my PhD project several years back, but any PR with improvements would be very welcome 🐰

Borda commented

@Jian-danai @blasco Feel free to reopen if you want to implement such a GPU extension :]
I agree that it would be great, but I do not have time to do it... maybe some application of Kornia?
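To illustrate the kind of port discussed above, here is a minimal sketch of a texture metric (a local-variance map, a common texture feature) written with NumPy. The box sums below map one-to-one onto GPU ops such as `torch.nn.functional.avg_pool2d` or `kornia.filters.box_blur`, which is the sort of drop-in swap suggested in the thread. The function name and window parameter are illustrative, not part of the pyImSegm API.

```python
import numpy as np

def local_variance(img, win=5):
    """Local-variance texture map computed via integral images.

    On GPU, the two box means below correspond directly to an
    average-pooling / box-blur op (e.g. in PyTorch or Kornia),
    so the whole feature becomes a couple of tensor calls.
    """
    pad = win // 2
    padded = np.pad(img.astype(np.float64), pad, mode="reflect")
    h, w = img.shape

    def box_mean(a):
        # Integral image gives O(1) box sums per pixel.
        ii = np.cumsum(np.cumsum(a, axis=0), axis=1)
        ii = np.pad(ii, ((1, 0), (1, 0)))
        s = (ii[win:win + h, win:win + w] - ii[:h, win:win + w]
             - ii[win:win + h, :w] + ii[:h, :w])
        return s / (win * win)

    mean = box_mean(padded)
    mean_sq = box_mean(padded ** 2)
    return mean_sq - mean ** 2  # Var[X] = E[X^2] - E[X]^2
```

A port would keep exactly this structure, replacing the `box_mean` helper with the GPU-backed pooling call and moving the image to the device once up front.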