Tempering hardcoreness
The kids are alright
There’s a kind of young guy who itches to accept great pain and risk for some glorious quest. Depending on other features of his brain and his particular place and time and upbringing, this could manifest as enlisting to fight the Nazis in World War II, free soloing El Capitan, trying to conquer the Achaemenid empire, grinding on the next great B2B SaaS unicorn, training all day to become a UFC fighter or a concert pianist, trying to build a machine god to conquer death, etc.
One popular way to channel this drive is to become super hardcore about a totalizing ideology or cause. You could become a Franciscan monk or join ISIS or try to achieve nirvana. And of course, you could become a hardcore EA (I highly recommend!).
I myself am not that hardcore. I do try to maintain a paranoid vigilance about the big picture of my career (am I really doing the highest impact thing I could be doing?), but I’ve put most of the hardcoreness points I’ve got into that. I spend a lot more money on myself than I need to for productivity or sustainability. I’m not vegan, nor am I doing an optimized IQ-maxxing diet featuring a bunch of oily fish or whatever. I don’t streamline my morning routine or optimize my keyboard layout or cycle through various stimulant cocktails to work 13-hour days. I devote generous amounts of time and money and energy to the soft and frivolous things in life like throwing parties and getting manis and participating in weddings and snuggling babies. I aspire to have children of my own, though this will be another massive expenditure of time and money that pulls away from helping the world.
I’m surrounded by twenty-something dudes who work 80-hour weeks to save the world, commuting the five-minute walk from their spartan group apartment to our shared office at 5:30 am every day, taking short breaks for spirited shop talk over vegan lunches and dinners in the building. They make generous salaries saving the world (unlike in the old days of EA) but then turn around and donate most of it.
Like seven years ago my main reaction to these guys would have been to feel terrible about my own sloth and decadence, but at this point I’ve gotten over myself / grown old and given up (career EAs age in dog years). There is still a measure of guilt — and when I catch the right mood I try to see if I can scrounge up another hardcoreness point or two — but by now a greater measure of acceptance of my own drive. I’m left mainly with enormous affection, gratitude, and admiration for them all. I’m impatient for the day they make me obsolete.1
But as grateful as I am for these guys, I’m also grateful that the actual real-life community they find themselves in subtly de-radicalizes them. In this context, they are generally extremely productive and delightful people who I think are having a huge positive impact on the world. But it is unfortunately very easy for me to imagine many of these same people instead getting swept up in something like FTX, and having a giant negative impact and a miserable time.2
Don’t get me wrong: the scene is still a scaffold that takes intense young people as input and elicits like 3,000 hours of high-energy x-risk-reducing work out of them per year for a while. It’s not for the faint of heart, and it’s not like “de-radicalization” is the point exactly. But relative to so many other ways this community could easily have been, it has that effect.
Firstly, the established senior folks in the scene, who have an outsized impact on the culture and discourse, are truly, deeply not naive utilitarians — but at the same time they’re also not the kind of disingenuously sanctimonious scolds who hand down simplistic rules that the young and hungry and hardcore would be tempted to roll their eyes at. A lot of the people who’ve been around for a few years are like the next Pokémon evolution of the slightly crazed 22-year-old Benthamite. They have a lot of empathy for that headspace, they have thought deeply about the gnarly questions that come up when you push that worldview to its brink, and they can engage in lengthy and subtle debates about those topics.
Secondly, people are fundamentally tolerant of one another’s personal choices, and they resist engaging in purity death-spirals (partly thanks to the cultural example of the aforementioned established folks). Within every ideological scene, the internal logic and social incentives tend to push you toward embodying a more extreme version of that ideology, whether that’s competing to declare more and more banal views “problematic” in some social justice circles or competing to say the most shocking and transgressive things you can in some young right circles. In the AI x-risk scene, this force certainly exists, but it is empirically weaker than in most analogous communities. I find the social pressure is often channeled toward things like whether you have good arguments for your career choice rather than whether you’re working too little or spending too much.
This means that if you ever run out of steam being a monastic vegan who works 80 hours a week, you can pretty seamlessly transition into a reducetarian dad who works 40 hours a week. You don’t have to lose your friends or professional community, you’ll probably still have more impact year on year as your experience grows, you’ll look back with pride on how much you accomplished in a more intense season of your life — and you won’t end up like, in jail for the greatest financial crime in history or anything like that.
1. I’m thinking about 3.5 weeks before the AIs do.
2. Individuals vary, but on average, I don’t think these young utilitarians have the kind of discriminating eye that would have warned them away from Alameda and FTX in its heady early days.


Another excellent piece. From my personal experience, I worry that it’s easy for this hardcoreness to come from an internal sense of inadequacy/insecurity, and this has underappreciated epistemic costs.
If at bottom you don’t feel ‘good enough’ (very common), you’ll cling to the belief system + social group that makes you feel worthy. EA can do that well.
But relating to something from a place of inadequacy introduces all sorts of blind spots. You’ll be biased against seriously entertaining counterarguments that threaten your new source of self-worth (e.g., dismissing critics who claim your work is actually causing harm in X way). You also end up judging others harshly based on whether they ‘get it’, since you’re judging yourself harshly on the same thing.
But it’s messy! Maybe I’m overpsychologizing. Not being approval-seeking at all is very hard and, in the right circles, approval-seeking can just in fact motivate great work. But I still think this is something to be vigilant of and a case for cultivating wisdom/self-knowledge in environments with a lot of hardcoreness.
EA specifically engages in a sort of anti-purity death spiral, where it’s socially rewarded to vaguely disparage EA. Ironically, this dynamic itself is the product of the same kind of naive consequentialism it claims to critique.