Impacts on American Society from WWII

Before the war, American citizens had been accustomed to an ordinary way of life. The men would work and earn the wages that put food on the table, while most women were housewives. It was the typical American day. American society was forever changed by World War II. When the United States was forced into the war, the men had to leave home to fight and avenge the attack on Pearl Harbor, and in doing so they left the women, and the country itself, to fend for themselves.
While the men were away fighting, the women at home found their money and food running low. They had been so used to the men doing the work and earning the pay that many felt lost once the men had left. With no husband at home to earn the money they needed to support themselves and their children, women rose to the challenge and took up jobs of their own.
Once women began taking jobs, their minds were opened to ideas they had never considered before. Working completely changed their outlook on life. Now that they were employed, women felt more useful and more independent, and this changed many women throughout America. Not only were they earning the extra money they needed, but they also intended to keep their jobs.
Another factor that struck and changed America was the contribution of the African Americans who aided our men in the war, along with the immigrants who brought a new way of life to the country. They came a long way from what they had been to what they became, and many new developments followed from their assistance in the war.
Gender, labor, and race were the three important factors that helped change American society, through developments such as women gaining new rights and African Americans aiding...
