The New Deal fundamentally changed Americans' attitudes toward the role of government. Before the New Deal, most Americans believed
that the government had no real role in maintaining the health of the economy or in
providing for people who were too old to work or who were otherwise unemployed. The
government was expected to set tax rates, tariffs, and the like, but it was
not supposed to otherwise intervene in the economy.
After
the New Deal, this all changed. Americans came to accept the idea that the government
would be responsible for the performance of the economy as well as for the well-being of
the people. This acceptance runs so deep that even budget cutters in
Congress say they will not touch Social Security or Medicare, programs that
were unthinkable before the New Deal but are seen as indispensable by many
Americans today.