Changing Roles of American Women/Feminism

Before World War II, a woman's role was to be a wife and mother, because many jobs were reserved for men. With the outbreak of the war, the need to mobilize the population changed views on gender roles. Higher education for women was becoming accepted, but many opportunities for women in the workplace remained limited. Many organizations recruited women into government-related services such as the army. Our group examined the changes at Hartwick College before and after the war and the responses of civilians in the community.


Credits

Marriam Iqbal, Gina Grauer and Suzanne Phillips