trumanwong / ComfyUI-NSFW-Detection

This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). It uses a machine learning model to classify each generated image as safe or NSFW; if an image is classified as NSFW, an alternative image is returned instead.
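The classify-then-substitute flow described above can be sketched as follows. This is an illustrative sketch, not the project's actual API: the function names, the 0.5 threshold, and the placeholder value are all assumptions.

```python
def filter_nsfw(image, nsfw_score, threshold=0.5, alternative="placeholder.png"):
    """Return the original image when the classifier's NSFW score is
    below the threshold; otherwise return the alternative image.

    `nsfw_score` is assumed to be a probability in [0, 1] produced by
    an upstream image-classification model (hypothetical interface).
    """
    if nsfw_score >= threshold:
        # Classified as NSFW: substitute the alternative image.
        return alternative
    # Classified as safe: pass the generated image through unchanged.
    return image


# Example usage with hypothetical scores:
safe_result = filter_nsfw("generated.png", nsfw_score=0.12)
blocked_result = filter_nsfw("generated.png", nsfw_score=0.91)
```

In a real pipeline the score would come from a model inference step; here it is passed in directly to keep the sketch self-contained.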
50 · Apr 21, 2025 · Updated 9 months ago

Alternatives and similar repositories for ComfyUI-NSFW-Detection

Users interested in ComfyUI-NSFW-Detection are comparing it to the libraries listed below.

