U.S. Health Care Workers Want Their Employers to Address Climate Change

A survey of U.S. clinicians finds that most believe it is important for their hospital to address climate change and that doing so aligns with their organization's mission.