
information for practice

news, new scholarship & more from around the world



AI and the falling sky: interrogating X-Risk

Introduction

The Buddhist Jātaka tells the tale of a hare lounging under a palm tree who becomes convinced the Earth is coming to an end when a ripe bael fruit falls on its head. Soon all the hares are running; other animals join them, forming a stampede of deer, boar, elk, buffalo, wild oxen, rhinoceros, tigers and elephants, all loudly proclaiming that the Earth is ending.1 In the American retelling, the hare is ‘Chicken Little,’ and the exaggerated fear is that the sky is falling.

The story offers a cautionary tale for considering the trend towards calamity thinking in artificial intelligence (AI). A growing chorus of tech leaders has warned that AI poses existential risk (X-Risk) that could result in the extinction of the human species, the collapse of civilisation, or a colossal decline in human potential and culture. In 2014, Hawking told the Washington Post that AI ‘could spell the…

Read the full article ›

Posted in: Journal Article Abstracts on 05/21/2024

© 1993-2025 Dr. Gary Holden. All rights reserved.