> Don't women realize how completely this denigrates women? Feminists are saying to women, "You should be trying to be like men in every respect." They are saying to women, "Being like a man is the only important thing in a woman's life."
I've been making that point (or trying to) for years now.