Stuart Russell and Yoshua Bengio on Why AI Could Make us Irrelevant, then Extinct
About this title
The existential alignment problem sits at the heart of the AI revolution — and the consequences of getting it wrong could be irreversible. What happens when superintelligent systems pursue fixed objectives that don’t fully capture human values? How do we govern machines that may soon outperform us across domains? And who decides what level of risk humanity should accept?
The conversation spans AGI timelines, the concentration of economic and political power, democratic resilience, liability-based regulation, and whether governments can realistically regulate frontier AI amid an intensifying global race.
From extinction-level risk estimates cited by AI CEOs to experimental evidence of systems resisting shutdown, this is no longer speculative science fiction — it is a live governance crisis unfolding in real time.
On this episode of The Morning Brief, ET’s Swathi Moorthy sits down with AI scientists and pioneers Stuart Russell and Yoshua Bengio for a candid, high-stakes discussion on the trajectory of artificial intelligence and the choices that will shape humanity’s future.
As capabilities surge and geopolitical rivalry sharpens, one defining question remains: are we building tools to serve humanity — or systems that could ultimately outmaneuver it?
You can follow Swathi Moorthy on social media: X and LinkedIn.
Check out other episodes, including: AI Impact Summit: Amazon's Bet on India's AI Future; Anthropic’s India Play; and India AI Impact Summit: Microsoft’s Brad Smith on Sovereignty, Scale and Skills.
Catch the latest episode of ‘The Morning Brief’ on The Economic Times Online, Spotify, Apple Podcasts, JioSaavn, Amazon Music and YouTube.
See omnystudio.com/listener for privacy information.