trumanwong / ComfyUI-NSFW-Detection
This project is designed to detect whether images generated by ComfyUI are Not Safe For Work (NSFW). It uses a machine learning model to classify images as either safe or not safe for work. If an image is classified as NSFW, an alternative image is returned.
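The gate described above (classify, then substitute a replacement image on an NSFW result) can be sketched as a small helper. This is a minimal illustration, not the project's actual API: `filter_nsfw`, `classify`, and `PLACEHOLDER` are hypothetical names, and the classifier is stubbed out.

```python
# Minimal sketch of an NSFW gate: run the generated image through a
# classifier and substitute a placeholder when it is flagged.
# `classify` stands in for any model returning "nsfw" or "safe".

PLACEHOLDER = "blocked.png"  # hypothetical replacement image path

def filter_nsfw(image, classify, placeholder=PLACEHOLDER):
    """Return `placeholder` if the classifier flags `image` as NSFW,
    otherwise return the image unchanged."""
    return placeholder if classify(image) == "nsfw" else image

# Usage with stub classifiers:
filter_nsfw("gen.png", lambda img: "nsfw")   # returns "blocked.png"
filter_nsfw("gen.png", lambda img: "safe")   # returns "gen.png"
```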
50 stars · Last updated Apr 21, 2025

Alternatives and similar repositories for ComfyUI-NSFW-Detection

Users interested in ComfyUI-NSFW-Detection are comparing it to the libraries listed below.

