Thanks for the book recommendation; I'll check it out.
I believe it's simply a matter of perspective. Women today believe that they're oppressed just because media/teachers/parents told them so. In reality, women are the most privileged (how I hate using that word). Even back in the day, before our current gynocentric society, when women were supposedly treated "worse," it was men (and especially, as you pointed out, men at the bottom of society) who worked to provide for the entire society, while women raised children (and therefore the men who would go on to build society; that's the beautiful complementary relationship that men and women should have).
Men need to stop playing ball with the idea of feminism; that's the only reason it still exists. If men didn't tolerate how society was running, things would very quickly change.