This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). A machine learning model classifies each image as safe or NSFW; if an image is classified as NSFW, an alternative image is returned in its place.
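The gating flow described above can be sketched as follows. This is a minimal illustration, not the repo's actual code: `nsfw_score` is a hypothetical stub standing in for the real ML classifier, and `ALTERNATIVE` stands in for whatever replacement image the node returns.

```python
from dataclasses import dataclass

@dataclass
class Image:
    """Placeholder for actual image data (e.g. a decoded tensor in ComfyUI)."""
    pixels: bytes

# Stand-in for the replacement image returned when content is flagged.
ALTERNATIVE = Image(pixels=b"safe-placeholder")

def nsfw_score(image: Image) -> float:
    """Hypothetical stub: a real implementation would run an ML
    image classifier and return the probability the image is NSFW."""
    return 0.0  # assume safe for this sketch

def filter_output(image: Image, threshold: float = 0.5) -> Image:
    """Pass the generated image through if classified safe;
    otherwise return the alternative image instead."""
    if nsfw_score(image) > threshold:
        return ALTERNATIVE
    return image
```

The design choice here mirrors the description: the pipeline never blocks output entirely, it swaps a flagged image for a safe placeholder so downstream nodes always receive an image.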
☆ 50 · Apr 21, 2025 · Updated 10 months ago
Alternatives and similar repositories for ComfyUI-NSFW-Detection
Users who are interested in ComfyUI-NSFW-Detection are comparing it to the repositories listed below.
- ☆ 11 · Nov 26, 2024 · Updated last year
- Custom nodes for ComfyUI for handling generation results in cycles · ☆ 34 · Jan 12, 2025 · Updated last year
- Provides ComfyUI nodes for OIDN image denoising · ☆ 10 · Nov 27, 2024 · Updated last year
- ☆ 14 · Apr 8, 2025 · Updated 11 months ago
- A tool for browsing and managing images generated by ComfyUI (images from other directories work as well) · ☆ 10 · Dec 20, 2024 · Updated last year
- ComfyUI version of WithAnyone