Apparently some say the United States of America ended in 1871, when Congress sold the US to England, and that it is now the United States corporation.
Not sure if I can believe that.