America has already fallen. I assume most people here realize this. America was not destroyed by an enemy from without; it was destroyed by Americans. We only really notice that the country has fallen when a crisis occurs that highlights how utterly useless our leaders are.
The only thing that will save us is Christianity. Unless Americans return to a strong, steady Christian faith, the country is doomed. It will never recover. Everything we are seeing stems from a lack of religious belief, a failure to follow a code of moral behavior that comes from God. Unless our moral laws come from God, they will be corrupt and destructive -- always. You don't need to take my word for it; just keep watching what happens to America over the next few years.