From the founding of the first colonies until the present, the influence of Christianity, as the dominant faith in American society, has extended far beyond church pews into the wider culture. Yet, at the same time, Christians in the United States have disagreed sharply about the meaning of their shared tradition, and, divided by denominational affiliation, race, and ethnicity, they have taken stances on every side of contested public issues from slavery to women's rights.
This volume of twenty-two original essays, contributed by a group of prominent thinkers in American religious...