Episode #368 from 52:14

AGI alignment

Eliezer Yudkowsky and Lex Fridman discuss AGI alignment.

Chapter: AGI alignment | Episode: Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization