There are already bad actors in science, including “paper mills” churning out fake papers. This problem will only get worse when a scientific paper can be produced with US$15 and a vague initial prompt. The need to check a mountain of automatically generated research for errors could rapidly overwhelm the capacity of actual scientists. The peer review system is arguably already broken, and dumping more research of questionable quality into the system won’t fix it.

Science is fundamentally based on trust. Scientists emphasise the integrity of the scientific process so we can be confident our understanding of the world (and now, the world’s machines) is valid and improving. A scientific ecosystem in which AI systems are key players raises fundamental questions about the meaning and value of this process, and about what level of trust we should place in AI scientists.

Is this the kind of scientific ecosystem we want?