While many nineteenth-century Americans accepted death as a common and inevitable part of life, the experience and meaning of dying changed between 1880 and 1965: the growing prestige of medicine led patients and doctors alike to reject death's inevitability and to emphasize instead the fight for recovery.