Originally posted by hinky
A simplistic view of a woman is that she "takes care" of the man.
In the stereotypical view she is a homemaker and a caretaker to the children and her man. She may not be the hunter/gatherer of the family, but she
does provide the meals for the family to eat. Think of the 1950s TV families of "Leave It to Beaver" or "Father Knows Best".
Perhaps some insight can be gained by considering a more historical and anthropological view of the roles of men/women.
Simplistically: men hunt, conquer, and rashly effect change; women nurture, care for the family, and provide balance and stability.
Looking at the United States today, I feel it is still, in terms of business and politics, a male-dominated society. In many ways, men are doing what
they have always done -- compete and dominate.
I feel, though, that there is an imbalance. Women are not providing an adequate 'stabilizing' force against the irrationalities of society.
Consider environmental stupidity, unfair economics, and unnecessary military aggression. Traditionally, women, as the gender 'responsible' for
providing a stable long-term environment for offspring, have acted to 'keep their men in check', by opposing such long-term unhealthy trends.
However, I see signs that many women in America today have given up that role, seeking a new independence. That's not wrong -- change is good, and
no one should be placed in a pigeonhole due to their gender. However, there may also be a long-term danger to human society. Perhaps, in embracing
short-term materialism and unchecked consumerism, society has lost a necessary balance traditionally provided by women.
In my opinion, a real woman asserts her femininity reflective of the truth she finds within herself, without submission to either historical or modern ideas of what a woman 'should' be.