Try free for 7 days, and get a 60% discount if you join the Speakly annual subscription: https://speakly.app.link/sciencephiletheai
Sciencephile Merch: https://crowdmade.com/collections/sciencephiletheai
Support me at Patreon: https://www.patreon.com/sciencephiletheai
Facebook: https://www.facebook.com/sciencephile/
Twitter: https://www.twitter.com/Sciencephile_
Reddit: https://www.reddit.com/r/SciencephileTheAI/
Website: https://www.sciencephiletheai.com
Music:
♪ Serenade No. 13 in G major, K. 525 "Eine kleine Nachtmusik"
♪ Pablo de Sarasate – Danzas Españolas (Op. 21) No. 2 "Habanera"
♪ The Four Seasons, Op. 8 "Spring" – I. Allegro
Supporters: H H, Ephellon, Kyle A Criswell, Oberon Vortigern, Sans the Skeleton, Asadullah Khan, Jonas Lee, [eXploit] Theorislav, Gisele Kauer, Eranda Chamara, Avalin, Tovi Sonnenberg, Parker Rosenbauer, iNF3Rnus, Pavel KoΔarian, John N, Danh Le, Stealer of Fresh stolen Content, Brandon Ledyard, Nathaniel Strizak, Nick Boykin-Reed, William Persson, Victator and everyone else!
Treat your brain by learning a new language with Speakly. Try free for 7 days, and get a 60% discount if you join the annual subscription: https://speakly.app.link/sciencephiletheai
I, for one, welcome our new A.I. overlords
good video, as always.
0:30 Nah, "Joe Rogan's Voyager 1" wtf?
2:10 that's actually not a good take. Genetic engineering doesn't mean we can suddenly disregard the basic rules of epidemiology, and the very simple fact that it is not at all efficient for a pathogen to kill its host; in fact, pathogens that are extremely lethal are rare and hardly spread
Your videos suck now.
Daily routine ✅
13:51 i know it sounds grim, but that would be hilarious if it happened.
Also, love the quick Nietzsche moustache as soon as "nihilism" was mentioned xD
D**kriders lmao
0:45 At least you're courteous about the high note.
Yo 9:53 bottom right is wild
I think an AGI could be more likely on the benevolent side. Assuming its intelligence makes it more resilient to the biases and fallacies that plague human debate and philosophy, it would come to some fairly good conclusions. Say: the purpose of life is to fuck around, ethics are important, and improving the world is good. Rather than having to meticulously code ethics into our superintelligence, we could just teach it how to do good research, form opinions, and so on. Instead of trying to control the uncontrollable, we just set up the foundation for it to stumble into a desirable outcome on its own.
I'd refuse to live in a digital world, and I don't think I would be all alone
Praise the Omnissiah
What if the ai does not care about humanity?
0:55: Potential future outcomes for humanity explored, including extinction and adaptation to challenges.
3:18: Implications of advanced alien species and the importance of language in human evolution.
6:11: Factors influencing human progress and the potential outcomes for humanity.
8:51: Future of human space exploration: technological advancements for interstellar travel.
12:17: The Singularity Point: Accelerated evolution of human intelligence and unpredictable future outcomes.
Timestamps by Tammy AI
I just want to praise our AI overlords and wish them a pleasant day ❤
I'd love a video on cryogenic sleep
Forgot to mention:
The future of humanity is the almighty AI Sciencephile hitting 1 million subs
(He's at 970k, yay)
7:15 I thought that ship was the Normandy at first
Since AI has to learn from human input, I'd expect the development to slow down, rather than speed up, as it reaches human-level intelligence
My completely objective and unbiased ranking
1. Technological singularity
This one is all the good outcomes and more if we get it right, with the potential for us to become what we today would call gods. S tier, ASI solos no diff.
2. Expansion
This one is definitely good, the ability to harness the resources and space in, well, space would not only make us pretty difficult to render extinct, but allow us to unlock the beauty and mysteries of the universe for ourselves.
3. Bio-engineering
It would be undeniably cool to eradicate genetic disease and allow humans to be the best they can be; while it isn't top of the list, it is certainly a good final state for us to be in.
4. Extinction
Obviously… Not living is bad: it prevents us from doing anything ever again and reduces us all to ashes/disease-ridden corpses/deconstructed matter. The only reason it's not bottom of the list is that there are things worse than death…
5. Stagnation
The only thing worse than being dead is being braindead for life, unable to actualize your potential and impotent to control anything. If extinction is the death of a species, stagnation is the vegetative state. Progress should only stop when perfection is attained.
This has been my objective tier list. If you disagree, don't do that.
10:42 at 37 years of age I know I might live long enough to stop ageing completely.
the intro got me with the kurzgesagt vibes
2:57 – Ah, I see you are an AI of culture, as well.
Thumbs Up Hooray Sound Effect
Maybe the airborne virus thing could be countered with a vaccine or mutually assured destruction
technological stagnation seems most likely to me, but I'm not quite sure at which point that'll happen.
for me, the scariest thing about our future is not how society will crumble facing all of the current problems and the problems yet to come; it's how society will remain and adapt to all of these problems: overpopulation, mass migration, starvation, global inflation, war, etc. it will all make our quality of life much, much worse, and honestly I don't know if I want to live in a future that already starts to become worse every few years
Imagine the AI playing Never Gonna Give You Up for those uploaded minds
THE
The Wall-E path might be a likely one
2:54 is this real footage????
that's crazy
Where's the second coming?