It is working great but classification of sexy images is poor. How can I improve?
FurkanGozukara opened this issue · 1 comment
I would like it to classify sexy / explicit images better
Are there any newer models?
The currently used model version is: 1.7.1+4cf622eb13c07947a38e6e3336221657007e635d
Here are some incorrectly classified images. These are supposed to be classified as sexy, but they are not, and there are many more like them.
For example, this site is much better at classification with its best model: https://nsfwjs.com/
https://github.com/NsfwSpy/NsfwSpy.NET/assets/19240467/b410003d-1cbb-482f-ba00-744740aed93d
https://github.com/NsfwSpy/NsfwSpy.NET/assets/19240467/21653348-5ff8-4260-8f04-488ec19bc196
Hey,
I'm not quite sure which version of the model that is. If you're using version 3.5.0 from NuGet, you will have the latest model.
I've just taken a look at both of those images, and from what I can see they're being classified correctly.
We define "sexy" images as "images of people in their underwear and men who are topless". Those pictures are neither, so classifying them as "Neutral" is what I would expect.
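If it helps, here's a minimal sketch of how you can inspect the per-category scores yourself rather than relying only on the predicted label (this assumes the `NsfwSpy`/`ClassifyImage` API and `NsfwSpyNS` namespace from the library's README; the image path is a placeholder):

```csharp
using System;
using NsfwSpyNS;

// Classify a single image and inspect the per-category confidence scores.
// The path below is a placeholder — replace it with a real image file.
var nsfwSpy = new NsfwSpy();
var result = nsfwSpy.ClassifyImage(@"C:\images\example.jpg");

// Each category score is a confidence value; the predicted label is
// simply the category with the highest score.
Console.WriteLine($"Neutral:     {result.Neutral}");
Console.WriteLine($"Sexy:        {result.Sexy}");
Console.WriteLine($"Pornography: {result.Pornography}");
Console.WriteLine($"Hentai:      {result.Hentai}");
Console.WriteLine($"Predicted:   {result.PredictedLabel}");
```

Looking at the raw scores can show whether a borderline image scored close between "Sexy" and "Neutral", which is useful when deciding if a custom threshold would suit your use case better than the default label.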
Let me know if you have any more questions or if I can help in some other way 🙂