Trained on 60+ Gigs of data to identify:
- `drawings` - safe for work drawings (including anime)
- `hentai` - hentai and pornographic drawings
- `neutral` - safe for work neutral images
- `porn` - pornographic images, sexual acts
- `sexy` - sexually explicit images, not pornography
91% accuracy, based on Inception V3 (see the confusion matrix graphic in the repo).
Review the `_art` folder for previous incarnations of this model.
- keras (tested with versions > 2.0.0)
- tensorflow (not specified in setup.py; install separately)
```python
from nsfw_detector import NSFWDetector

detector = NSFWDetector('./nsfw.299x299.h5')

# Predict a single image
detector.predict('2.jpg')
# {'2.jpg': {'sexy': 4.3454722e-05, 'neutral': 0.00026579265, 'porn': 0.0007733492, 'hentai': 0.14751932, 'drawings': 0.85139805}}

# Predict multiple images at once using Keras batch prediction
detector.predict(['/Users/bedapudi/Desktop/2.jpg', '/Users/bedapudi/Desktop/6.jpg'], batch_size=32)
# {'2.jpg': {'sexy': 4.3454795e-05, 'neutral': 0.00026579312, 'porn': 0.0007733498, 'hentai': 0.14751942, 'drawings': 0.8513979}, '6.jpg': {'drawings': 0.004214506, 'hentai': 0.013342537, 'neutral': 0.01834045, 'porn': 0.4431829, 'sexy': 0.5209196}}
```
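Each prediction is a dict of per-class probabilities, so picking the most likely category is just a max over the scores. A minimal sketch (the `top_label` helper below is hypothetical, not part of the library):

```python
# Hypothetical helper (not part of nsfw_detector): pick the most likely
# class from a score dict like those returned by detector.predict().
def top_label(scores):
    """Return (label, probability) for the highest-scoring class."""
    label = max(scores, key=scores.get)
    return label, scores[label]

# Scores taken from the single-image example above
preds = {'sexy': 4.3454722e-05, 'neutral': 0.00026579265,
         'porn': 0.0007733492, 'hentai': 0.14751932, 'drawings': 0.85139805}
print(top_label(preds))  # ('drawings', 0.85139805)
```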
Please feel free to use this model in your products!
If you'd like to say thanks for creating this, I'll take a donation for hosting costs.
- Keras 299x299 Image Model
- TensorflowJS 299x299 Image Model
- TensorflowJS Quantized 299x299 Image Model
- Tensorflow 299x299 Image Model
- Want to contribute? Convert the model!
Simple description of the scripts used to create this model:

- `train_inception_model.py` - the code used to train the Keras-based Inception V3 transfer-learned model.
- `visuals.py` - the code to create the confusion matrix graphic.
- `self_clense.py` - the training data came down with some significant inaccuracies. Self-clense let me use early iterations of the model to cross-validate errors in the training data in reasonable time. The better the model got, the better I could use it to clean the training data manually. Most importantly, this also allowed me to clean the validation dataset and get a real indication of generalized performance.
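The self-cleaning loop described above can be sketched roughly like this; it is an illustrative sketch, not the repo's actual `self_clense.py`, and the `flag_suspects` helper and its confidence threshold are assumptions. The idea: images where a confident model prediction disagrees with the assigned label get queued for human review.

```python
# Illustrative sketch of the cross-validation cleaning idea, NOT the
# actual self_clense.py script.
# predictions: filename -> class-score dict (as NSFWDetector.predict returns)
# labels:      filename -> class assigned in the training/validation data
def flag_suspects(predictions, labels, confidence=0.9):
    """Return (filename, assigned_label, predicted_label) triples where a
    confident model prediction disagrees with the dataset's label."""
    suspects = []
    for name, scores in predictions.items():
        top = max(scores, key=scores.get)
        if top != labels[name] and scores[top] >= confidence:
            suspects.append((name, labels[name], top))
    return suspects
```

Flagged files are then inspected by hand; as the model improves, the flags get more precise and cleaning goes faster.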
There's no easy way to distribute the training data, but if you'd like to help with this model or train other models, get in touch with me and we can work together.
Advancements in this model power the quantized TFJS module on https://nsfwjs.com/
My Twitter is @GantLaborde - I'm a School Of AI Wizard in New Orleans. I also run the Twitter account @FunMachineLearn.
Learn more about me and the company I work for.
Special thanks to the nsfw_data_scraper for the training data. If you're interested in a more detailed analysis of types of NSFW images, you could probably use this repo code with this data.
Thanks goes to these wonderful people (emoji key):
| Gant Laborde 💻 📖 🤔 | Bedapudi Praneeth 💻 🤔 |
| --- | --- |
This project follows the all-contributors specification. Contributions of any kind welcome!