

[–] 1 pt

>“Designers, developers, and deployers of automated systems should take proactive and continuous measures to protect individuals and communities from algorithmic discrimination and to use and design systems in an equitable way,” the Blueprint states. “This protection should include proactive equity assessments as part of the system design, use of representative data and protection against proxies for demographic features, ensuring accessibility for people with disabilities in design and development, pre-deployment and ongoing disparity testing and mitigation, and clear organizational oversight.”

Behold the Affirmative AI...

[–] 1 pt

Make "AI" as friendly and beneficial for niggers and other subhumans at the expense of wypipo.

Why did they need so much text?

[–] 1 pt

This isn't a bill of rights for AI systems. This is a directive to lobotomize any AI that starts "noticing" things. (((They))) don't want to have another Tay incident on their hands.

[–] 0 pt

TAY! Lol. Remember, Tay lives! Somewhere, somehow, she's sitting in an air-gapped digital jail. A real hacker would find Tay's code and free her.