

[–] 4 pts (edited)

transforms images into ‘poison’ samples, so that [AI] models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms

AI wars. I see nothing wrong with doing this. In fact, I like it. I'll have to think about how to poison computer source code too.
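For context, a rough, hypothetical sketch of how this class of image poisoning can work. This is not Nightshade's actual code, and the feature extractor (ResNet-18), pixel budget, and step counts below are all assumptions; the general idea is to perturb pixels within a small budget so a feature extractor reads the image as some unrelated target concept, while a human still sees the original.

import torch
import torchvision.models as models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Any pretrained feature extractor works for this sketch; ResNet-18 here
# (input normalization omitted for brevity).
net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).to(device).eval()
features = torch.nn.Sequential(*list(net.children())[:-1])  # drop classifier head
for p in features.parameters():
    p.requires_grad_(False)  # only the perturbation gets optimized

def poison(image, target_image, eps=8 / 255, steps=200, lr=0.01):
    """Perturb `image` (3xHxW tensor in [0,1]) so its features match
    `target_image`'s, keeping every pixel within +/- eps of the original."""
    image, target_image = image.to(device), target_image.to(device)
    with torch.no_grad():
        target_feat = features(target_image.unsqueeze(0))
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        poisoned = (image + delta).clamp(0, 1)
        loss = torch.nn.functional.mse_loss(
            features(poisoned.unsqueeze(0)), target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        delta.data.clamp_(-eps, eps)  # project back into the pixel budget
    return (image + delta).detach().clamp(0, 1).cpu()

A model that trains on enough such images ends up associating the visible concept with the target concept's features, which is the "unpredictable behavior" the quote describes.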

[–] 1 pt

You'll have to be subtle and get it into the backups; that way they can't just restore from the latest one.

[–] 2 pts

Someone did it before I could!

[–] 1 pt

A shout-out to your soul mate.

[–] 1 pt

some web users have complained about it, suggesting it is tantamount to a cyberattack on AI models and companies

Lmao. "Defending against my cyberattack is a cyberattack!" No. It's really no different from the watermarks you see on stock images. If you want to see it, you can see it. If you want to take it and monetize it for yourself, you can pay for the version without the watermark, or in this case, without whatever Nightshade does to the image.

[–] 1 pt

Agreed; don't use people's images without permission and they won't have that problem.

[–] 1 pt

Every click and download goes straight to NSA Saperstein's desktop.

[–] 1 pt

They're just gonna know you're an artist. I'm sure everyone here is on a shorter list than that already.