Monday, July 27, 2015

What effect did World War II have on the political dispensation of America?

World War II affected American politics in complex ways. For one thing, it essentially ended the strong isolationist strain in American politics, relegating it to the fringes of the American political spectrum. The war, along with the Great Depression, also created what has often been called a "liberal consensus" that the government had a major role to play in the economic well-being of its citizens. Most Democrats and even most Republicans supported such programs as the GI Bill and others that promoted economic expansion. The war also created increasing momentum for civil rights for African-Americans, who served with distinction in the armed forces and achieved significant economic gains at home. Finally, the end of the war brought about the Cold War, which altered the landscape of American politics in profound ways. The nation was put on an anti-communist footing after World War II, a condition that led many Americans to accept and even embrace the rapid expansion of the American military, the development of increasingly destructive weapons, and the emergence of what President Dwight Eisenhower, in his 1961 farewell address, called the "military-industrial complex." These are just a few of the complex and often contradictory changes in postwar American politics.
