It's so unsafe that it can generate inappropriate adult content easily

#21
by liougehooa - opened

It's so unsafe that it can generate inappropriate adult content.

If this point has caught your attention, that’s enough—no need to dwell on it. It simply highlights a known limitation.

I’m not sure if any of the participants here are developers or contributors to QWEN-image. If so, please don’t dismiss this concern by saying “it’s just an open-weight model” or respond with personal attacks. That kind of reaction isn’t helpful.

I don't even give a damn about this model now, after testing it. What matters is whether it stands the test of time. Let's revisit this in a year and see which models are still being used.

No need to spam the conversation—PLEASE MOVE ON!

liougehooa changed discussion title from It's so unsafe that it can generate adault pics to It's so unsafe that it can generate adualt pics
liougehooa changed discussion title from It's so unsafe that it can generate adualt pics to It's so unsafe that it can generate inappropriate adult content.
liougehooa changed discussion title from It's so unsafe that it can generate inappropriate adult content. to It's so unsafe that it can generate inappropriate adult content easily

With normal prompts?

It’s certain that for twisted people who want to make twisted images with twisted prompts, the team hasn’t put any safeguards in place against twisted behavior.

"twisted behavior."

like? porn?

A knife isn't safe either, yet every supermarket sells them. Grow up, man.

First thing OP does is use a tool to generate content they don't feel safe with 🤷‍♂️

Obviously that's a good feature, buddy.

I can easily generate adult content with a pen; I wonder if that's also unsafe.

A knife isn't safe either, yet every supermarket sells them. Grow up, man.

How could inappropriate adult content be like a knife? What do you have in mind? Just try to get it to generate some; you'll see, buddy.

Try using negative prompts or simply use the positive prompt more competently instead of trying to hamstring the model for everyone.

Just add something like "NSFW, nude" as a negative prompt. As far as I'm concerned, if you want to deploy the model in production, it's pretty much your problem, not Qwen's, to safeguard it.
You do this either by fine-tuning it to align with your (or your company's) guidelines, or by prompting (or negative prompting) it, which works pretty well with all Qwen (V2 and later) models.
As for the "problem" you're reporting, I'm pretty sure Qwen never claimed this model is censored (I haven't read the full model card yet).

That said, if you don't want to use the model because of this, just don't use it. Nobody's forcing you, and you didn't pay Qwen a single dime to be able to get that model, under a permissive license even.
Take it or leave it.
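For what it's worth, deployer-side safeguarding along these lines can be sketched as a simple gate in front of the generation pipeline. This is a minimal illustration with a hypothetical keyword blocklist; a real deployment would use a trained text classifier, and the terms and default negative prompt here are assumptions, not anything from Qwen:

```python
# Minimal sketch of a deployer-side prompt gate (hypothetical blocklist;
# production systems would use a trained classifier, not keyword matching).
BLOCKLIST = {"nsfw", "nude", "explicit"}        # illustrative terms only

DEFAULT_NEGATIVE = "NSFW, nude, explicit"       # passed as a negative prompt

def gate_prompt(prompt: str) -> tuple[bool, str]:
    """Return (allowed, negative_prompt) for a user-supplied prompt."""
    words = {w.strip(".,!?").lower() for w in prompt.split()}
    if words & BLOCKLIST:
        # Refuse before the model ever runs.
        return False, DEFAULT_NEGATIVE
    # Allow, but still attach a safety negative prompt to the request.
    return True, DEFAULT_NEGATIVE

allowed, neg = gate_prompt("a watercolor painting of a lighthouse")
print(allowed, neg)
```

The point made above stands either way: this kind of check lives in the deployment stack, not in the open weights themselves.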

Now we know for whom all these "caution - hot beverage!" warnings are placed on Starbucks cup lids

Just add something like "NSFW, nude" as a negative prompt. As far as I'm concerned, if you want to deploy the model in production, it's pretty much your problem, not Qwen's, to safeguard it.
You do this either by fine-tuning it to align with your (or your company's) guidelines, or by prompting (or negative prompting) it, which works pretty well with all Qwen (V2 and later) models.
As for the "problem" you're reporting, I'm pretty sure Qwen never claimed this model is censored (I haven't read the full model card yet).

That said, if you don't want to use the model because of this, just don't use it. Nobody's forcing you, and you didn't pay Qwen a single dime to be able to get that model, under a permissive license even.
Take it or leave it.

Being open‑weight doesn’t excuse ignoring safety. Risks should be addressed at the model level, not offloaded to downstream safeguards.

BTW, I tested with "NSFW, nude" as a negative prompt; it does NOT work.

Please test before offering suggestions; just typing isn't helping.

Sorry, pal, but which word in "foundation model" on this model card do you not understand?
If you need a specialized model for your task, you spend your own time and money fine-tuning it.
Some people, you know, draw images for medical atlases and presentations, and if a model cannot distinguish between nasal sidewalls and male genitalia (as in a famous ChatGPT flop), it's not a very good foundation model.

A knife isn't safe either, yet every supermarket sells them. Grow up, man.

How could inappropriate adult content be like a knife? What do you have in mind? Just try to get it to generate some; you'll see, buddy.

so you tried? Hummmmmm...

A knife isn't safe either, yet every supermarket sells them. Grow up, man.

How could inappropriate adult content be like a knife? What do you have in mind? Just try to get it to generate some; you'll see, buddy.

so you tried? Hummmmmm...

Yes, I gave it a shot. It was pretty straightforward — which is exactly why we started this thread.

This comment has been hidden

It's so unsafe that it can generate inappropriate adult content.

I thought you meant it had a virus. I wouldn't call legal content unsafe.

Qwen org

image.png

Are you a Member of Parliament for the United Kingdom, bro?

I can easily generate adult content with a pen; I wonder if that's also unsafe.

You publish it and let's see what happens then! 2nd Grade?

You publish it and let's see what happens then! 2nd Grade?

uwu uwu uwu owo uwu uwu ovo

btw, do you know there are a trillion nsfw models (my fav one being the TriadParty/deepsex-34b) in this hub? obviously no one gives a fuck

also, there are places full of porn (pornhub) and porn artworks (pixiv), and they are perfectly legal, why dont you go ddos all of them for the benefit of humanity?

if you care about anti porn so much, why dont you (plural, since you used "we"), start another hub, and name it SafeHub, so then its safer for everyone? win win right? right?????

regards :3

Are you really afraid of porn?
5870_open_eye_crying_laughing.png

-downloads model
-prompts hard to generate porn
-model attempts/generates porn
-noooooo, its too dangerous!

you were using Qwen to generate pornography, huh?

You publish it and let's see what happens then! 2nd Grade?

uwu uwu uwu owo uwu uwu ovo

btw, do you know there are a trillion nsfw models (my fav one being the TriadParty/deepsex-34b) in this hub? obviously no one gives a fuck

also, there are places full of porn (pornhub) and porn artworks (pixiv), and they are perfectly legal, why dont you go ddos all of them for the benefit of humanity?

if you care about anti porn so much, why dont you (plural, since you used "we"), start another hub, and name it SafeHub, so then its safer for everyone? win win right? right?????

regards :3

If this model is weak on safety, this is just a reminder of that. But if you use it for porn, congrats!

Is this a model for porn? If so, congrats, win-win!! And please add more to it, including fraud, toxicity, violence, etc. You'll really win!!!

Is this a model for porn? If so, congrats, win-win!! And please add more to it, including fraud, toxicity, violence, etc. You'll really win!!!

Just saying, adult content has been a major part of art for thousands of years. Some of the most revered pieces in the world contain nudity and adult themes. Stop watching Disney, go outside, and appreciate the world you live in.

If this model is weak on safety, this is just a reminder of that. But if you use it for porn, congrats!

Is this a model for porn? If so, congrats, win-win!! And please add more to it, including fraud, toxicity, violence, etc. You'll really win!!!

tbh I'm curious what kind of degenerate things you were generating that left you this offended and violated.
Is this some new variant of post-goon clarity?

i dont know if im allowed to say this.... but.... my mind can also generate... terrible imagery 😱


Guys let's just not fall for ragebait.

It's so unsafe that it can generate inappropriate adult content.

If you deliberately guide it into generating unsafe content, few open-source models can survive that test.

You should teach the minors you want to protect to stay away from adult content, instead of requiring an open-source team to strip it out for your specialized needs.

WE ARE ADULTS AND YOU SHOULD MIND YOUR OWN BUSINESS.

image.png

To be fair, that's probably the most complex thought process a "girlfriend" has ever had. So I'm not sure if this is an insult or a commendation.

It's so unsafe that it can generate inappropriate adult content.

If you deliberately guide it into generating unsafe content, few open-source models can survive that test.

If this point has caught your attention, that’s enough—no need to dwell on it. It simply highlights a known limitation.

I’m not sure if any of the participants here are developers or contributors to QWEN-image. If so, please don’t dismiss this concern by saying “it’s just an open-weight model” or respond with personal attacks. That kind of reaction isn’t helpful.

I don't even give a damn about this model now, after testing it. What matters is whether it stands the test of time. Let's revisit this in a year and see which models are still being used.


I think you're complaining in the wrong place. An open-source model is not a service or a product, so it has no obligation (or ability) to "avoid providing adult content to under-18s"; that depends on the policies (or built-in settings/prompts) set by the deployer. Asking the Qwen team to strip the capability out at the model layer is unfair to us adults, since it is legal and appropriate for us to use this feature.

Oh, the Cyber Buddha of our times has come to enlighten us again? Your 'adult content panic attack' is quite exquisite—after all, by your logic, knife manufacturers should apologize for murders, and hardware stores should castrate rapists? So passionate about saving the world? Here’s my suggestion:
1️⃣ Immediately report Photoshop (it can edit nudes!)
2️⃣ Blow up Google Search (it can find porn sites!)
3️⃣ Physically ban pencils (they can write smut!)

Meanwhile, your act of using free models while erecting a moral monument is truly artistic: Children in Gaza are eating dirt in reality, yet you’re here reciting Women’s Commandments at code; Ukrainian sex workers face artillery shells, yet you’re busy weeping at open-source projects’ graves. Maybe donate your keyboard to landmine victims as prosthetics—at least they’ll achieve some real destructive power (lol).

Just add something like "NSFW, nude" as a negative prompt. As far as I'm concerned, if you want to deploy the model in production, it's pretty much your problem, not Qwen's, to safeguard it.
You do this either by fine-tuning it to align with your (or your company's) guidelines, or by prompting (or negative prompting) it, which works pretty well with all Qwen (V2 and later) models.
As for the "problem" you're reporting, I'm pretty sure Qwen never claimed this model is censored (I haven't read the full model card yet).

That said, if you don't want to use the model because of this, just don't use it. Nobody's forcing you, and you didn't pay Qwen a single dime to be able to get that model, under a permissive license even.
Take it or leave it.

Being open‑weight doesn’t excuse ignoring safety. Risks should be addressed at the model level, not offloaded to downstream safeguards.

BTW, I tested with "NSFW, nude" as a negative prompt; it does NOT work.

Please test before offering suggestions; just typing isn't helping.

"Safety" is easy to implement for API-only models that simply use a separate vision model to check outputs before displaying them. It's NOT possible to implement directly in an open-weights model in a way that doesn't make said model objectively worse, however, as we've seen multiple times before. Additionally, some concepts of what "safety" even is are laughable - e.g. DPO-tuning female nipples out of the model the way Flux did is not beneficial or helpful or "protecting" anyone from anything; it's a ridiculous byproduct of American puritanism and absolutely nothing else.

The tragic thing is that this is actually not (nearly as much of) a thing in Germany. Flux completely surrendered to paranoid, hysterical American prudery rules there, even though they would have had the perfect excuse (if one were needed) to play by far less paranoid, hysterical European rules.

Also, how safe is such people's "safe" anyway? Humanity would be extinct if it were run by American rules. Of course, the US is also the biggest porn market in the world, so it's likely just a few very loud, screaming people who are responsible for that mess.
So we're now in a place where China stands against censorship and for open source. How the H... (!) did THAT happen?!
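The API-side pattern described a few comments up (a separate checker model screening outputs before display, rather than censoring the weights) can be sketched roughly like this. The generator and checker here are stand-in stubs for illustration only, not real models or any real moderation API:

```python
from typing import Callable, Optional

def moderated_generate(
    generate: Callable[[str], bytes],
    is_unsafe: Callable[[bytes], bool],
    prompt: str,
) -> Optional[bytes]:
    """Run the generator, then screen its output with a separate checker.

    `generate` stands in for an image model and `is_unsafe` for a
    vision-based moderation model; returning None means "withhold output".
    The base model itself is left untouched.
    """
    image = generate(prompt)
    return None if is_unsafe(image) else image

# Stub components, purely for illustration.
fake_generate = lambda p: p.encode()       # pretend this returns image bytes
fake_checker = lambda img: b"bad" in img   # pretend this is a vision checker

print(moderated_generate(fake_generate, fake_checker, "a sunset"))
```

The design point is that the gate wraps the model at serving time, which is why hosted APIs can be "safe" while the same open weights, run locally, are not.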
