Yeah would be nice to have a country. The US hasn't been ours for decades though. Whether it was the immigration act of 1965 or later (or earlier) is debatable but our country was sold out from under us.
You won't find a single politician who will call it a white nation. Ask anyone who it belongs to and you'll usually get something meaningless. "Americans" that's a tautology. "Citizens" they give that out like candy. It means nothing.
The reality is that the US government is just a corporation. It's priorities have been profit and control, for most of our lifetimes. Now at least you have the benefit of knowing that.