According to Jean Piaget, gender roles are established sometime during adolescence. I want to discuss the gender roles depicted in our society today, and possibly why they exist. In every society, gender identity must be established. It is at birth that an infant is assigned either a male or female identity. Once the parents have been told, society sets the examples and attitudes for that gender. "Gender includes a broad spectrum of attitudes, behaviors, and social expectations that we acquire during our lifetimes, through interactions with one another and experiences in various environments". Stereotyping of genders still occurs, even in today's society: for example, when thinking of a doctor, most people would picture a man in that occupation, whereas a secretary would usually be assumed to be a woman.
In my own life experience, I have noticed that gender roles in society have changed dramatically.
I have worked in what would be classified as typically male jobs, including joinery, bricklaying, and driving. It was blatantly obvious that these jobs were male oriented: no women were employed in these areas of work, no facilities such as female toilets were available for women, and the general conversation between employees was derogatory towards the opposite sex. I found this uncalled for, as in my experience many women could do the same tasks to the same standard as most men. According to gender socialization, men are supposed to be strong, and as a result most men go through their lives thinking that there is something wrong with them if they are not naturally strong. I speak from experience: right through school I was not (and indeed still am not) strong, and as a result...