Anything that will help in the undoing of American society as it once was is okay with the left. It's the American ideals that created this country that are the enemy to them, and the enemy of my enemy blah, blah, blah.
But why are they okay with destroying themselves and taking everyone else with them?
I gave up trying to grasp the liberal mentality ages ago ... so ya got me. But good question.