Few would argue that war has been a defining experience for people born in Europe and North America in the twentieth century. The degree to which war has been instrumental in improving women's social situation remains a vexed question, however. Conventional wisdom repeats the cliché that the Great War liberated women: recruited into men's jobs previously considered beyond their capabilities, they were able to demonstrate their fitness for equality. In fact, their patriotic enthusiasm was used against them after the war, when they were seen to have profited from the deaths of...