Look at the UK. They slowly infiltrate governments and start pushing sharia compliant agendas.
Exactly. It's incredible that liberals either don't see this or don't care.
Anything that will help in the undoing of American society as it once was is okay with the left. It's the American ideals that created this country that are the enemy to them, and the enemy of my enemy blah, blah, blah.
But why are they okay with destroying themselves and taking everyone else with them?