Tuesday, July 29, 2014

Were the changes in women's roles due to the war long-term or short-term? Thank you!

In general, the changes in women's roles in society that
occurred during the war did not last.  After World War I, women generally returned to
the roles they had played previously.


During WWI,
many women took up jobs that had been left open by men who went off to fight.  However, once
the war ended, women did not keep those jobs.  Instead, they were generally encouraged
to return to being wives and mothers.  For example, Congress passed the Sheppard-Towner
Maternity Act in 1921 to fund instruction for women in how to care for their infant
children.


However, there were some ways in which WWI did
change women's roles.  The most important of these was that WWI was one factor
in women winning the right to vote in 1920.  Even so, there was no major and
immediate change in women's roles that could be attributed to
WWI.
