
Effects of WWII on American Society


After World War II, new ideas left a permanent mark on the American psyche. During the war, the American government actively worked to shape how the American people thought, because public support was crucial to the success of the war effort. Many of the ideas introduced in this period concerned new roles for particular groups in American society: women and minorities would prove themselves in the workplace, millions of citizens would face discrimination, and social barriers would be both torn down and put up. Even though World War II was fought in Europe and the Pacific, it produced lasting social changes that can still be seen in America today. Prior to the war, most factory jobs were held by white men.
