“David Shapiro AI-Risk Interview” For Humanity: An AI Safety Podcast Episode #19



Interview starts at 9:23

In Episode #19, “David Shapiro Interview,” John talks with AI/Tech YouTube star David Shapiro. David has several successful YouTube channels. His main channel (link below: go follow him!), with more than 140k subscribers, is a constant source of new video content on AI, AGI, and the post-labor economy. Dave does a great job breaking things down.

But a lot of Dave’s content is about a post-AGI future, and this podcast’s main concern is that we won’t get there, because AGI will kill us all first. So this show is a two-part conversation: first about whether we can live past AGI, and second about the issues we’d face in a world where humans and AGIs are co-existing. In this trailer, Dave gets to the edge of giving his p(doom).

John and David discuss how humans can stay in control of a superintelligence, what their p(doom)s are, and what happens to the energy companies if fusion is achieved, among many other topics.

This podcast is not journalism. But it’s not opinion either. This show simply strings together the existing facts and underscores the unthinkable but probable outcome: the end of all life on Earth.

For Humanity: An AI Safety Podcast is the accessible AI safety podcast for all humans, no tech background required. Our show focuses solely on the threat of human extinction from AI.

Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit that their work could kill all humans, possibly in as little as two years. This podcast is solely about the threat of human extinction from AGI. We’ll meet the heroes and villains, explore the issues and ideas, and learn what you can do to help save humanity.

RESOURCES:

FOLLOW DAVID SHAPIRO ON YOUTUBE!
https://youtube.com/@DaveShap?si=o_USH-v0fDyo23fm

DAVID’S OTHER LINKS:

Patreon (and Discord)
patreon.com/daveshap

Substack (Free)
daveshap.substack.com

GitHub (Open Source)
github.com/daveshap

Systems Thinking Channel
youtube.com/@Systems.Thinking

Mythic Archetypes Channel
youtube.com/@MythicArchetypes

Pragmatic Progressive Channel
youtube.com/@PragmaticProgressive

Sacred Masculinity Channel
youtube.com/@Sacred.Masculinity


23 thoughts on ““David Shapiro AI-Risk Interview” For Humanity: An AI Safety Podcast Episode #19”

  1. I hear so much about the AI being the danger here, but if it turns out that powerful, it will be people fighting each other over it, and/or the torches-and-pitchforks-mentality reaction, that will cause the real problems, before AI even gets the chance to be the “bad guy.”

  2. Too Bald guy vs Extra full head of hair guy. This will be epic! Joking. Shapiro is definitely my 1st "go to" …He's really a teacher WITH integrity. Full disclosure…I'm bald too 🙂🙂

  3. Start UBI (Universal Basic Income) now, BEFORE the crisis strikes. Doing it 'afterwards' is way too little, way too late. It IS affordable if you make it a small stipend, you have it 'replace' welfare, and you tax it back from those who do not need it (earn enough from other sources). As AGI takes off, tax the use of AGI in the economy, especially robotics, to slowly increase the stipend.

  4. A former treasury secretary was brought in after the Sam Altman firing. Sam talks now of incremental releases. The most resource hungry AI will be (is now) controlled by governments. Cue the song as we think about Sam "Look What They've Done to My Song, Ma" Altman.

  5. Will there still be elections? That seems optimistic. Human history would say democracy is a blip. Won’t those controlling the AI simply become the new kings? The strongmen David advocated for. With no jobs, and little food, those who survive through unemployment will take whatever I’d imagine. How can man create utopia outside himself when he can’t create it within?

  6. Bit surprised by his cost benefit analysis. The cost of keeping a human-friendly world is significant (there are a million parameters in the ecosystem that have to stay within a certain bound to make survival possible) while the economic benefit of humans will eventually reach zero after the advent of AGI/ASI. Someone/something has to "pay" to ensure human survival. That is an incredibly dangerous position to be in.

  7. FACT: Natural selection favors intelligence. FACT: Stephen Hawking said on BBC News (2014): "The development of full artificial intelligence could spell the end of the human race. It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” BOTTOM LINE: Humans will contain and control AGI, or humans will be superseded… in the event of extinction, human intelligence is exactly 0. Time to SLOW DOWN until containment can be mathematically provably guaranteed.

  8. Love the interview, but the low bass note added to the trailer and preview was really intrusive and distracting. I get that it adds to the DOOM factor, but given how long both were, they would have been better (and more listenable) without it.

  9. Hey David. We're having enough trouble maintaining democracy as it is, and in order to prevent our elected government from serving oligarchs, we need LEVERAGE. That's what protests and strikes are about, overcoming the fact that most of us have almost no leverage by working together.

    Once the economy doesn't require us, and once the military doesn't require us, the only form of power is who owns more fully automated factories and fully automated armies to protect them. VOTING WON'T MATTER ANYMORE. Not even money will matter anymore. Trade will take the form of people who have their own resources and factories exchanging stuff with their peers. Even supposing they don't already have most of the money, fiat currency isn't worth anything anyway unless there's a government powerful enough to collect taxes in that currency. They'll just switch to or create whatever currency best represents the new power dynamic, if not simply say, "hey, I'll build you this fusion reactor if you give me some of your titanium mines."

