Germans often claim that "we have learned the lessons of our history." But what, precisely, are the lessons they have drawn from their Nazi-era past? What experiences from that time continue to hold significant meaning for Germans today, and how have those experiences shaped postwar German cultural identity? Though Germans have come to recognize the evils of Nazism, for them its primary evil lay less in the losses and suffering it inflicted on others than in the war it unleashed and in the hardships, death, and destruction that war brought upon the Germans themselves. Recent public...