The Progressive Era is generally said to have ended with World War I. So the short answer to your question is that World War I essentially ended Progressivism.
After the war, the United States entered the era generally known as "the Roaring '20s," a time when the country was said to enjoy a "return to normalcy." During the 1920s, the country moved away from the reform spirit that had motivated Progressivism. In its place there was simply a desire for political calm and for the enjoyment of everyday life. These desires are said to have brought Progressivism to an end.