And then he ran the war so badly that he lost, collapsing half the country into a communist hellhole for decades, and even now that they're reunited, the Germans are so hell-bent on not being Nazis that they're torching their own culture. Hitler ultimately made things worse.
And sure, you can argue whatever you want about how he could have or meant to... but he didn’t, now did he?