No, Men Don't Need To Be More Like Women
Why does American culture constantly tell women to be more like men? The New York Times recently ran an op-ed titled "Enough Leaning In. Let's Tell Men to Lean Out," opposing the assertiveness movement's message that women should aspire to male standards. I found myself nodding along with its critique of "this fist-pumping restyling of feminism" until the author insisted that boys and men should be "trained" to be more like women.
As the mother of three daughters and one (still tiny) son, I can say that's most certainly not my goal. There is so much that is special about my daughters, but I have no interest in teaching my son to be an also-ran girl. I expect he'll grow up fluent in the language of women, but I want him to excel at being himself. Based on what I've already seen, that self is quite different from his sisters'.
So while I agree that our culture often devalues choices traditionally associated with women, "from the color pink to domestic labor," and it may be true that "girls are routinely given pep talks to be 'anything a boy can be,'" my family neither embraces the former attitude nor delivers the latter pep talk. I'm more likely to tell my daughters that if they work hard, they can be anything they want to be, without reference to anyone else, including boys.
I want my daughters to set their own compasses for success. They should pursue their own interests and talents, not what other people prefer. So, if STEM captures their imaginations, I’ll support that, but not because it’s trendy. If they’re more humanities-oriented like their mother, I’ll support that too. And as my children age, I’ll talk to them about the importance of finding ways to blend family and career, because I believe a full life includes more than work.