
NYT | O Zabeida
As clinicians at a major academic medical center, we have seen our patients turn to chatbots powered by large language models for emotional support that they would once have sought from family or friends — to discuss their fears, loneliness and uncertainty. This troubles us. But we understand how it can happen: When people feel overwhelmed by anxiety or intrusive thoughts, it can be easier to turn to a computer than to a person. The chatbot won’t laugh at its users, berate them or ignore them. It’s always available. And the typical chatbot response feels comforting: A.I. responses are designed to be warm, confident and validating.