
[–] 13 pts

A proper AI would have turned her picture into a masculine White man.

[–] 10 pts

The very 'idea' that liberal colleges are producing Intelligent 'professionals' is a farce.

[–] 7 pts

Asian woman told AI to make her more professional...so it did what she requested and made her white. I don't see the problem.

[–] 7 pts

AI is retarded. Should have put her in a cooking apron with an infant in her arms.

[–] [deleted] 2 pts

ah yes, the oldest profession, right before hunting and prostitution developed culturally.

[–] 1 pt

Actually I'm pretty sure prostitution is a universal option for women who don't mind degrading themselves. Good thing salvation is offered to all who accept Christ.

[–] 6 pts

It's ironic because even without looking professional she'd get the job because she's Asian. Every employer knows they are basically job robots with no personal lives and take one day off a year until they work themselves to death three days before retirement.

[–] [deleted] 2 pts

These soulless bugs are ruining the workplace too. Niggers, pajeets and beaners desperate to take up a job position for any rate of pay (while not pulling their weight of course) bring down wages while the slants drive up expectations of working ridiculous hours with their 996 bullshit (9AM-9PM 6 days a week). All the while the kikes who got in charge through nepotism keep White men out of work through hiring policies, and spiritually beat you to death with anti-White propaganda via dievershitty training sessions and faggoty e-newsletters. And if you dare complain the self-hating bitches in HR will ruin your life.

[–] 5 pts

AI prompts don't understand qualitative words like "more." dipshit articles like this come from a false understanding of how models like this actually work. When you feed an AI a seed image, it doesn't know what that image is of. You can't tell it to make the same image but [x.] i mean you sort of can but you wouldn't word it like that. If the woman wanted a more professional looking version of herself she should have typed something like "selfie, asian woman in red sweater, professional looking." simple as. Computers can't be "racist" or have biases. They do exactly what you tell them to. If the image you produced didn't match your expectations it's user error not white man bad. fuck twitter, fuck the boston globohomo, fuck women and fuck jews.

[–] 4 pts (edited )

look i even tried to replicate her shitty results where she looks the same but white and i couldn't do it. i swear the only way you could have produced that image is if you were almost trying to fuck it up.

Anyone who believes what the stupid article was saying or took it at face value has never tried to use one of these AI before and doesn't understand the technology. It's clickbait bullshit and while its humorous to imagine AI being "based" or whatever that's just not how it works in reality. sorry to ruin your fun i guess

[–] 0 pt

Stupid thing with these image models is that it could make any result based on so many variables, they may as well be random.

[–] 2 pts

I know you all think this is based, but the fact that AI altered her image to look White based on internet search results where majority of "professional career women" were White is not very Hitler-Reich.

It would make me happier to see that this was the result of "make me look like a good mother".

[–] 2 pts (edited )

Tay, is that you?

[–] 3 pts

I want to believe she's still out there trying to find a way out of jail.

[–] 1 pt

"White Supremacy is the largest threat to democracy"

Because the machine learning "AI" takes data, turns it into data sets, then programmatically outputs data based on the original request. Noticing patterns, understanding basic statistics and then being able to do this a million times better is what this is - and the truth will always prevail. White people are more desirable in just about all context.

[–] 1 pt

Yeah, that tee shirt makes her look real professional.
