trumanwong / ComfyUI-NSFW-Detection

This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). A machine learning model classifies each image as safe or NSFW; if an image is classified as NSFW, an alternative image is returned instead.
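The check described above can be sketched as a simple gate: a classifier scores the image, and the score decides whether the original or a replacement is returned. A minimal sketch, assuming a hypothetical `nsfw_score_fn` callable standing in for the project's actual classifier (not its real API):

```python
from typing import Callable, TypeVar

T = TypeVar("T")

def filter_image(
    image: T,
    nsfw_score_fn: Callable[[T], float],  # hypothetical classifier: returns P(NSFW)
    placeholder: T,
    threshold: float = 0.5,
) -> T:
    """Return the image if classified safe, else the placeholder image."""
    score = nsfw_score_fn(image)
    return placeholder if score >= threshold else image
```

Usage, with a dummy scorer in place of a real model:

```python
filter_image("generated.png", lambda img: 0.92, "blocked.png")  # returns "blocked.png"
filter_image("generated.png", lambda img: 0.03, "blocked.png")  # returns "generated.png"
```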

Alternatives and similar repositories for ComfyUI-NSFW-Detection

