Discussion about this post

Kamran:

I saw this when randomly checking my old email and unsubscribing from all the cult stuff I fell into as a mentally ill young person. "Yudkowsky was my father" hits home. His writing (and others) was there for me like my parents weren't -- but not there for me like parents need to be for a kid to grow up healthy and able to contribute to our collective well-being.

People like me are susceptible to what I might call his pompously grandiose paranoia. It feels like taking a red pill when you don't know any better. A lot of us have committed suicide because the perspective vortex is so punishing -- the promises of capital-R Rationality were so fantastical, and the reality of attempting to carry out the vision of the dojo was so banal and so predictably like any other cult (or high-demand group). The story of Leverage is a perfect example of the movement's irony.

Yud has a lot to lose by admitting his life's work is essentially a new form of Scientology. But I don't. I'm happy to admit I succumbed to a cult of personality. I was a kid who liked Harry Potter and didn't have a real father figure to guide me in utilizing my relative surplus of potential.

I survived. I have another chance. Thanks for reading.

Liam:

I'll concede that I think Eliezer's conclusions follow from his premises. What I really don't understand is his ur-premise, his epistemological premise, the conviction that he can reason this all out a priori without either formal proof or empirical evidence.

That's ~never worked before. It didn't work for Aristotle (how many teeth did your wife have, Aristotle? why didn't you check?) and I don't expect it to work this time.

What really gets me is that he thinks A(G)I will work the same way: just as he can figure out the inevitability of nanotech foom by thinking really hard, AGI will be able to build the diamondoid bacteria factories by thinking even harder.

