Feminism is a social movement committed to securing equal opportunities for women. Throughout history, men have dominated social institutions, and women have been subjected to sexism, hatred, and exploitation.
In the past, the common view was that girls should be obedient to their fathers and women should be obedient to their husbands. For the most part, women did not have the education or social encouragement and standing to make their voices heard. There has been a traditional double standard for men and women, and the law has generally prescribed second-class citizenship for women. Nowadays we call that tradition sexist, and we tell ourselves that the situation is different. But can we rightfully say that men and women have an equal relationship in society?
Consider, for example, the fact that most secretaries are women and most executives are men, that most doctors are men and most nurses are women, and that most kindergarten teachers are women and most university professors are men.
Notice that in every case it is the "male" job that has the greater social prestige and the higher pay. This raises many questions. Is it that women freely prefer to be secretaries, nurses, and kindergarten teachers, and men freely prefer to be executives, doctors, and professors? Do women generally seek out jobs that have lower prestige, or is lower prestige attached to the jobs because women have traditionally done them? Are people socialized differently according to their sex?
In recent history, feminists have rallied against patriarchy and male power structures. For years, men have tried to keep women "in their place," that place being the home, but women have challenged this by winning positions of power and prestige for themselves - for example, there are a growing...