trumanwong / ComfyUI-NSFW-Detection

This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). It uses a machine learning model to classify each image as either safe or NSFW; if an image is classified as NSFW, an alternative image is returned instead.
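Conceptually, the gating logic can be sketched as below. This is a minimal illustration under stated assumptions, not the repository's actual code: it assumes a Hugging Face `transformers` image-classification pipeline and an example model name (`Falconsai/nsfw_image_detection`), neither of which is confirmed by this listing.

```python
# Conceptual sketch of NSFW gating (illustrative only, not this repository's code).
# Assumes the Hugging Face `transformers` image-classification pipeline and an
# example model; the actual node may use a different model and API.
from PIL import Image
from transformers import pipeline

classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")

def filter_nsfw(image: Image.Image, fallback: Image.Image, threshold: float = 0.5) -> Image.Image:
    """Return `image` if it is classified as safe; otherwise return `fallback`."""
    scores = {result["label"]: result["score"] for result in classifier(image)}
    # The label name "nsfw" is an assumption tied to the example model above.
    if scores.get("nsfw", 0.0) >= threshold:
        return fallback
    return image
```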

Alternatives and similar repositories for ComfyUI-NSFW-Detection

Users interested in ComfyUI-NSFW-Detection are comparing it to the repositories listed below.
