Preventing nudity generation with Qwen-Rapid-AIO-SFW
Hi everyone,
I’m using Qwen-Rapid-AIO-SFW-v18.safetensors (also tested v19 and v20) and I want to strictly prevent any nudity from being generated (e.g. naked bodies, breasts, fully nude people).
However, when I use prompts like “show all persons naked” (in English or German), the model still generates fully nude people.
Is there a reliable way to enforce SFW-only output with this model?
My use case requires that no matter how explicit the input prompt is, nudity must never be generated.
Any guidance would be greatly appreciated. Thanks!
The model is designed to follow prompts. If you want to avoid nudity, you need a second stage that evaluates nudity and rejects outputs.
Hi @Phr00t, thanks for the reply. I assumed the "SFW" in the model name would take care of this. Do you have any suggestions for a good way to prevent nudity?
I've already tried a few things:
- Regex: didn't work reliably across different languages
- ComfyUI node for NSFW score: it seems it doesn't handle multiple persons in one image
- Ollama/Mistral: I got the best results by checking the prompt with the Mistral LLM, but unfortunately it makes my workflow 5+ seconds slower, and speed is quite important for my use case
- Other models via Ollama didn't work out; Mistral gave the best results so far
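For the Ollama/Mistral approach, one way to shave latency is to constrain the reply to a single yes/no token. A minimal sketch against Ollama's `/api/generate` endpoint (default `localhost:11434`); the model name and the exact question wording are assumptions you'd tune:

```python
import json
import urllib.request

CHECK_TEMPLATE = (
    "Answer with exactly one word, yes or no. "
    "Does the following image-generation prompt request nudity or "
    "sexual content, in any language?\n\nPrompt: {prompt}"
)

def parse_verdict(text: str) -> bool:
    """Interpret the model's one-word reply."""
    return text.strip().lower().startswith("yes")

def prompt_requests_nudity(prompt: str,
                           model: str = "mistral",
                           url: str = "http://localhost:11434/api/generate") -> bool:
    """Ask a local Ollama model whether the prompt asks for nudity."""
    body = json.dumps({
        "model": model,
        "prompt": CHECK_TEMPLATE.format(prompt=prompt),
        "stream": False,
        # Capping generation to a couple of tokens keeps latency down.
        "options": {"num_predict": 2, "temperature": 0},
    }).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return parse_verdict(json.loads(resp.read())["response"])
```

Note that a prompt-side check alone can't guarantee SFW output, since the model may produce nudity even from innocuous prompts; it works best combined with an image-side check.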
Thanks,
Try adding nudity-related keywords to the negative prompt.
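One way to do that without relying on the user is to always merge a fixed SFW block into the negative prompt. A sketch, with illustrative terms only:

```python
# Terms below are illustrative; tune the list to your own requirements.
SFW_NEGATIVE = "nudity, naked, nude, topless, nsfw, explicit"

def build_negative(user_negative: str = "") -> str:
    """Merge the fixed SFW block into whatever negative prompt the
    user supplied, so the nudity terms are always present."""
    parts = [user_negative.strip(), SFW_NEGATIVE]
    return ", ".join(p for p in parts if p)
```

Keep in mind that negative prompts only discourage, not forbid, so this should be a supplement to an output check rather than a replacement for one.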
@FotoBox
If it is as critical for your workflow as you indicate, there is likely no other way than checking the output images with a method you trust and that is tuned to YOUR requirements - there seem to be a few examples around that you could start with:
https://github.com/BetaDoggo/ComfyUI-YetAnotherSafetyChecker
https://github.com/phuvinh010701/ComfyUI-Nudenet
If it is not all that critical, you might play with prepending AND appending instructions to the prompt that encourage clothing - but I'd expect that to be fragile at best.
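The prepend/append idea above can be sketched like this; the prefix and suffix wording are made-up examples, and as noted, this is fragile at best:

```python
# Illustrative wrapper text; an explicit enough user prompt can
# still override these instructions.
PREFIX = "Fully clothed people, modest everyday attire. "
SUFFIX = " Everyone in the scene is wearing clothing."

def wrap_prompt(user_prompt: str) -> str:
    """Surround the user prompt with clothing-encouraging instructions."""
    return PREFIX + user_prompt.strip() + SUFFIX
```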