14 Comments
Nathaniel B

This is an interesting perspective, although I'll admit I align more with Sarah's views on this particular issue. I suspect a lot of that comes down to personality differences (I know for sure that I would burn out if I lived this way), but I would also disagree with the distinction you draw between "trivial things" and "Big Things." From my perspective, the "trivial" things are what make life fulfilling, while the Big Things are only worth considering insofar as they allow us to enjoy the trivial things more. When I picture the kind of future I want to bring about, what I picture is a world full of people going to pubs with friends, singing, dancing, getting married, watching movies, reading fiction, etc. I want the future to go well mainly because I want humans to keep doing these things for as long as possible! Of course, in such a world, there would also exist a place for philosophy, science, research, and so on, but my own motivation for improving the world comes from a desire to make it so that more people can be happy and have good things - and for the majority of people, including myself, most of these things happen to be what one might call "trivial."

Alex K. Chen

Total capture of your PC with a screen recorder (what Richard Ngo advocates) + hard drives for it to help make sense of all your context + make connections for you is one important aspect of this.

Also, more aggressive use of Uber so you minimize the use of human compute on logistics.

Jeroen Willems

The main crux seems to be what you get enjoyment from, no? I put a ton of value on human connection; it's one of the most important things to me. So for me it makes sense to prioritize it whenever I'm not working, or even during work! There's this big weight on my shoulders: every single interaction I have with someone may be the last time we see each other. Every time I remove a song from my playlist because I got a bit tired of it, it may be the last time I'll ever listen to it. Every time I travel, I may never see that place again.

Rafael Ruiz

Yes, that's an important crux. I discussed this with Sarah in person and I had to admit that I am the psychologically weird one, swimming against the mainstream of most people. (I do appreciate friendship! It's just that many hangouts can often feel a bit trivial to me)

There is another crux, however, which is how many hours of your day you spend working, given how much you can hope to steer the future. I tend to dedicate my free time *also* to reading nonfiction books in nearby disciplines that can end up feeding back and improving the quality of my work.

Kat Woods

For what it's worth, I think there's some fiction that can be productive.

If you work in AI safety, reading Crystal Society can give you a lot of information. Same with a lot of Black Mirror.

The Golden Age by Wright is pretty good for giving you a potential positive end state to aim for.

Frankenstein is also good (also a great tool for persuasion).

I do agree though that fiction is less productive than non-fiction most of the time.

Kat Woods

Fellow audiobook enthusiast: you can totally listen at faster than 2x

I used to do 2x, then somebody told me to keep pushing myself just past the edge of what felt comfortable. Now I usually do 3.5x for everything except books with a heavy accent or really information-dense stuff.

Wiktor Wysocki

Living faster and faster will make humans anxious for sure. The question is whether we will be able to reap the rewards of our ambition when we are emotional, physical, and psychological wrecks. I highly doubt that.

Also, research suggests that ambition and achieving goals without meaningful human connections do not lead to happiness at all.

Also, this whole argument is built upon the thesis that AGI will come soon, will be hyper energy efficient, and will produce super intelligence really fast. Not many scientists agree with that.

Changeling's Crib

I can get behind a lot of this, and being more focused and urgent about enjoying your life is a good thing regardless of AGI timelines. Like, consuming informational content at higher speeds is great. But doing the same for entertainment is incomprehensible. Watching a movie at any speed other than 1x is not watching a movie. You might as well read the Wikipedia summary. That’s like saying you speed up your enjoyment of sex by switching between looking at boobs and ass really quickly over and over again, it’s anti-hedonic.

John of Orange

I really find it hard to respond to something like this without lapsing into complete snark. Sure, definitely, you're altruistic and above status; that's how you've rationally calculated that you should do a podcast.

Rafael Ruiz

You mock it, but people like Joe Rogan have a *massive* impact through podcasting, compared to his qualifications. Of course, you have to be good at it, so that you become famous, and you have to have *positive* impact (so, not Joe Rogan, probably). But getting your thoughts out there, if they're good, seems worth it.

Depending on the career path, it can be hard to have impact without the associated fame (e.g. policy making by becoming a famous politician), but you can also have impact behind the scenes, say, if you're an AI safety researcher or a niche academic, and I'm not against that.

I haven't talked much about wanting status for status's sake. I'm not against it. If wanting to get famous boosts your motivation to do work that helps improve the world, that's good, in my book.

John of Orange

Honestly, I don't mock it, and I don't mean to mock it. I am maybe gently mocking the idea that a rational calculation of altruistic expected value to the whole world, under historically unique technological circumstances that lead to counter-intuitive conclusions, is a good justification for doing it or not.

Seth Finkelstein

That’s how some (not all!) Effective Altruists calculated that the most rational thing was to give them money to fight the possibility of evil AI. It’s really very amusing from an outside view.

Rupesh N. Bhambwani

Great piece; it was quite immersive. I resonate with your words 100%: "I care about the Big Things (the 'Big Questions' in philosophy, politics, morality, physics, biology, psychology, big historical trends, technology), and I care about them on a global or even cosmic scale."

Very refreshing reading indeed.

Cyberneticist

Probably even less than 10.
