U.S. culture
U.S. culture changed radically in the immediate wake of World War Two. An enormous economic boom brought unprecedented prosperity to many Americans, though not all; U.S. families underwent radical changes; an anti-communist hysteria overtook much of the nation; and the United States embarked on a new Cold War, maintaining a massive peacetime military establishment. Of …