🐦 Twitter Post Details

Viewing enriched Twitter post

@DrTechlash

The Rationalist's Guide to the Galaxy: Superintelligent AI and the Geeks Who Are Trying to Save Humanity's Future - By Tom Chivers.

The book opens with a meeting Chivers had in Berkeley with Paul Crowley, who told him, "I don't expect your children to die of old age."

Chivers came from the UK to Berkeley because "over the years, I became more involved with the Rationalists. I started reading their websites; I learned the jargon, all these terms like 'paperclip maximizer' and 'Pascal's mugging.'"

"The key text – the holy book, according to those who think the whole thing is a quasi-religion – is a huge series of blog posts" by Eliezer Yudkowsky, "which came to be known as the Sequences." Having read them, Chivers provides explainers of many of the rationalists' thought experiments.

The story unfolds through the Extropians mailing list, Yudkowsky's SL4 (Shock Level 4) mailing list, the launch of LessWrong in 2009, Slate Star Codex in 2013, and Bostrom's "Superintelligence" in 2014, which served as a turning point.

It then describes the idea of FOOM, the Roko's Basilisk hysteria, putting numbers on everything (even if they are estimates) and thus thinking probabilistically to make humans better Bayesians (rational Bayesian optimizers), utilitarianism ("shut up and multiply") and its effect on AI Safety (if you believe in AI existential risk), how the movement attracts men on the autistic spectrum, and the arrangements of polyamory and group houses.

Chivers briefly discusses the "dark sides": "They do share a lot of the surface features of a cult: a charismatic figurehead and other high-status inner-circle members; a key text that in-group members are supposed to have read, and which encodes the central tenets of their 'belief'; unorthodox sexual practices; a message of impending apocalypse, and a promise of eternal life; and a way to donate money to avoid that apocalypse and achieve paradise."
He ties it to the Effective Altruism movement by quoting David Gerard: "Clearly, the most cost-effective initiative possible for all of humanity is donating to fight the prospect of unfriendly artificial intelligence, and oh look, there just happens to be a charity for that precise purpose right here! WHAT ARE THE ODDS."

However, Chivers defends the movement shortly thereafter, stating that they are just "nonconformists." For further criticism of the movement, he points to a Reddit page called "/r/sneerclub." He does not recommend reading it. I am.

Throughout the chapters, Chivers gradually embraces the notion that AI will wipe out humanity. To resolve the dissonance and stress this caused, he met with Anna Salamon, president and co-founder of CFAR (Center for Applied Rationality), for her "Internal Double Crux" (debugging) session. As he finished the inner debate, he broke down in tears, realizing that, indeed, he might not see his children "die of old age."

Totally sold on Yudkowsky's claim that "AI will kill EVERYONE," Chivers became an even more ardent supporter of the movement, and he celebrates its achievement: "What they have achieved in terms of the AI debate is, I think, remarkable. They've taken the niche, practically dystopian-science-fiction idea of AI risk and made people take it seriously." It's no longer in the realm of "fringe nerds on an email list."

He ends his book with this sentence: "There is a small but non-negligible probability that, when we look back on this era in the future, we'll think that Eliezer Yudkowsky and Nick Bostrom – and the SL4 email list, and LessWrong – have saved the world."

I enjoyed reading this book as I write my upcoming book on this topic. Just, please, tell me again, how is this NOT a doomsday cult?

Media 1

📊 Media Metadata

{
  "score": 0.81,
  "scored_at": "2025-08-09T13:46:07.554657",
  "import_source": "network_archive_import",
  "media": [
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1860080727980933496/media_0.jpg?",
      "filename": "media_0.jpg"
    },
    {
      "media_url": "https://pbs.twimg.com/media/GdBTFWsa8AAoOxL.jpg",
      "type": "photo"
    }
  ],
  "reprocessed_at": "2025-08-12T15:26:48.963741",
  "reprocessed_reason": "missing_media_array",
  "original_structure": "had_both"
}

🔧 Raw API Response

{
  "user": {
    "created_at": "2020-11-05T05:27:34.000Z",
    "default_profile_image": false,
    "description": "Communication Researcher, analyzing the tech discourse. Book Author: The TECHLASH. Former Research Fellow @USC. Substack: AI Panic. Signal: DrTechlash.16",
    "fast_followers_count": 0,
    "favourites_count": 509,
    "followers_count": 4490,
    "friends_count": 210,
    "has_custom_timelines": false,
    "is_translator": false,
    "listed_count": 120,
    "location": "Cupertino, CA",
    "media_count": 611,
    "name": "Nirit Weiss-Blatt, PhD",
    "normal_followers_count": 4490,
    "possibly_sensitive": false,
    "profile_banner_url": "https://pbs.twimg.com/profile_banners/1324221717896597504/1724460263",
    "profile_image_url_https": "https://pbs.twimg.com/profile_images/1837007622807113730/s1nTVrqE_normal.jpg",
    "screen_name": "DrTechlash",
    "statuses_count": 1576,
    "translator_type": "none",
    "url": "https://t.co/GqHJmoc7ab",
    "verified": true,
    "withheld_in_countries": [],
    "id_str": "1324221717896597504"
  },
  "id": "1860080727980933496",
  "conversation_id": "1860080727980933496",
  "full_text": "The Rationalist's Guide to the Galaxy: Superintelligent AI and the Geeks Who Are Trying to Save Humanity's Future - By Tom Chivers.\n\nThe book opens with a meeting Chivers had in Berkeley with Paul Crowley, who told him, \"I don't expect your children to die of old age.\"\n\nChivers came from the UK to Berkeley because \"over the years, I became more involved with the Rationalists. I started reading their websites; I learned the jargon, all these terms like 'paperclip maximizer' and 'Pascal’s mugging.'\"\n\n\"The key text – the holy book, according to those who think the whole thing is a quasi-religion – is a huge series of blog posts\" by Eliezer Yudkowsky, \"which came to be known as the Sequences.\" \nHaving read them, Chivers provides many explainers of various rationalists' thought experiments.\n\nThe story unfolds through the Extropians mailing list, Yudkowsky's SL4 (Shock Level 4) mailing list, the launch of LessWrong in 2009, Slate Star Codex in 2013, and Bostrom's \"Superintelligence\" book in 2014 that served as a turning point.\n\nIt then describes the idea of FOOM, Roko's Basilisk hysteria, putting numbers on everything (even if they are estimates), thus thinking probabilistically and making humans better Bayesians (rational Bayesian optimizers), utilitarianism (shut up and multiply) and its effect on AI Safety (if you believe in AI existential risk), how the movement attracts man on the autistic spectrum, and the arrangement of polyamory and group homes.\n\nThe \"dark sides\" are briefly discussed by Chivers:\n\n\"They do share a lot of the surface features of a cult: a charismatic figurehead and other high-status inner-circle members; a key text that in-group members are supposed to have read, and which encodes the central tenets of their 'belief'; unorthodox sexual practices; a message of impending apocalypse, and a promise of eternal life; and a way to donate money to avoid that apocalypse and achieve paradise.\"\n\nHe ties it to 
the Effective Altruism movement by quoting David Gerard:\n\"Clearly, the most cost-effective initiative possible for all of humanity is donating to fight the prospect of unfriendly artificial intelligence, and oh look, there just happens to be a charity for that precise purpose right here! WHAT ARE THE ODDS.\"\n\nHowever, Chivers defends the movement shortly thereafter, stating that they are just \"nonconformists.\" To find further criticism of the movement, he mentions a Reddit page called \"/r/sneerclub.\" He does not recommend reading it. I am.\n\nThroughout the chapters, Chivers gradually embraces the notion that AI will wipe out humanity. To solve the dissonance and stress it caused, he met with Anna Salamon, president and co-founder of CFAR (Center for Applied Rationality). What was the goal of their meeting? Her \"Internal Double Crux\" (debugging) session.\nAs he finished the inner debate, he broke down in tears, realizing that, indeed, he might not see his children \"die of old age.\"\n\nTotally sold on Yudkowsky's claim that \"AI will kill EVERYONE,\" Chivers became an even more ardent supporter of the movement. So he celebrates its achievement: \"What they have achieved in terms of the AI debate is, I think, remarkable. They've taken the niche, practically dystopian-science-fiction idea of AI risk and made people take it seriously.\" It's no longer in the realm of \"fringe nerds on an email list.\"\n\nTo conclude, he ends his book with this sentence:\n\n\"There is a small but non-negligible probability that, when we look back on this era in the future, we'll think that Eliezer Yudkowsky and Nick Bostrom – and the SL4 email list, and LessWrong – have saved the world.\"\n\nI enjoyed reading this book as I write my upcoming book on this topic.\nJust, please, tell me again, how is this NOT a doomsday cult?",
  "reply_count": 7,
  "retweet_count": 15,
  "favorite_count": 67,
  "hashtags": [],
  "symbols": [],
  "user_mentions": [],
  "urls": [],
  "media": [
    {
      "media_url": "https://pbs.twimg.com/media/GdBTFWsa8AAoOxL.jpg",
      "type": "photo"
    }
  ],
  "url": "https://twitter.com/DrTechlash/status/1860080727980933496",
  "created_at": "2024-11-22T21:59:38.000Z",
  "#sort_index": "1860080727980933496",
  "view_count": 32311,
  "quote_count": 3,
  "is_quote_tweet": false,
  "is_retweet": false,
  "is_pinned": false,
  "is_truncated": true,
  "startUrl": "https://x.com/drtechlash/status/1860080727980933496"
}