
information for practice

news, new scholarship & more from around the world


  • gary.holden@nyu.edu
  • @ Info4Practice

The Hidden Human Cost of AI Moderation

Jacobin | d3sign/Getty

The artificial intelligence boom runs on more than just code and compute power — it depends on a hidden, silenced workforce. Behind every AI model promising efficiency, safety, or innovation are thousands of data labelers and content moderators who train these systems by performing repetitive, often psychologically damaging tasks. Many of these workers are based in the Global South, working eight to twelve hours a day reviewing hundreds — sometimes thousands — of images, videos, or data points, including graphic material involving rape, murder, child abuse, and suicide. They do this without adequate breaks, paid leave, or mental health support — and in some cases, for as little as $2 an hour. Bound by sweeping nondisclosure agreements (NDAs), they are prohibited from sharing their experiences.

Posted in: News on 06/30/2025


© 1993-2025 Dr. Gary Holden. All rights reserved.
