Men (and Women) of Medicine
Popular belief holds that in the field of medicine, men are doctors and women are nurses. This perception was shaped over time by cultural beliefs, laws, and restrictions around the world. From early times, women were regarded as inferior to men, and from that view followed the belief that women were less deserving of education. As a result, women in the United States have long found it difficult to obtain high-position jobs or earn incomes equal to men's.
Ever since the professionalization of medicine began, women were largely limited to allied health occupations such as nursing and midwifery. It was not until the 1800s that women started filling important roles in the medical field. Lovisa Arberg, born in 1801, became the first female doctor and surgeon in Sweden, and Amalia Assur, born in 1803, became one of the first female dentists in Europe.
In the United States, Elizabeth Blackwell became the first woman to graduate from a medical school, earning her degree in 1849. The list goes on and on. This demonstrates that over time, women have been slowly advancing toward gender equality in the medical field. For instance, women made up 9% of medical school enrollment in 1969, but by 1976 that figure had increased to 20%.
Nevertheless, discrimination against women in the medical field still exists, as shown by the fact that men continue to outnumber women in the profession. Even a web search on popular search engines such as Google or Bing reflects this: searching the term "nurse" returns images of female nurses, while searching the term "doctor" returns images of male doctors. Men still dominate the specialized field of surgery while...