"Look at the sunset from the sun's point of view."
- Steven Wright
December 5. So I just tried the hot new apple variety, Cosmic Crisp, and they're for real. I've never been a fan of Honeycrisps. They're crunchy and sweet, but they lack the tartness and denseness of a great apple. The best eating apples are all russets, but you can't buy them in stores because their skins are not shiny. This actually goes back to Monday's subject: the bigger the crowd, the harder it is to get them to buy apples that are better on the inside than the outside.
But now, consumers don't need to be smart, because Cosmic Crisps are pretty on the outside, and dense and full-flavored on the inside. I just did a taste test against Fuji, which is no slouch, and it wasn't even close. They're also really expensive -- the other night I paid almost $10 for five of them. But in a few years, as the supply increases, prices will come down, and the Cosmic Crisp will drive a lot of varieties off the shelves.
Everyone knows the worst apple is Red Delicious, but a hundred years ago they were really good. What happened was genetic drift, reinforced by the values of industrial farming, so the apples got gradually cheaper to grow and ship, with worse flavor, but still pretty. The same thing might happen to Cosmic Crisp, and in another hundred years we'll need a new apple.
December 2. Smart blog post by a long-time reader, Idea strength, cringe, and the media environment. The basic idea is that technology has connected us so much that creative work is taking fewer risks:
When artists become more socially connected to each other and to consumers, bold choices get riskier. Every mind in a perceptual network is a vector for cringe. As the network grows more interconnected, the potential for cringe increases. An artistic risk that might have been low in a less connected environment becomes high.
This reminds me of a bit in this YouTube talk, Why Greatness Cannot Be Planned. It's by the creator of Picbreeder, a brilliant image-breeding site that no longer works because web browsers can no longer run Java. Anyway, he found out that images bred by individuals are much better than images bred by voting.
One of the comments explains it like this: "Consensus-driven frameworks prematurely optimize and miss the necessary low-fitness stepping stones needed to find creative complex solutions." In simpler language: you have to go through difficult stuff to get to good stuff, and the bigger the crowd, the harder it is to get them to go through difficult stuff. Another example is Hollywood test screenings, which polish out anything really good because someone thinks it's too weird.
But when I think about it more, there are two different things going on here. One is what I've just described, the blandifying effect of the crowd. The other is the long tail of taste: with more creative work available, there's more room to like stuff that fewer people like.
In listening to music, I've gone a long way down that rabbit hole, and the difficult thing is not in how I hear the music. It's not like I'm forcing myself to listen to stuff that sounds bad until I like it. The music always pulls me in, and the difficult thing is the loneliness of loving something that no one else understands. Or: the obstacle to exploring the long tail of taste is not perceptual but social.
November 29. A few stray links. First, an interview with the author of McMindfulness:
Corporations in the U.S. are losing approximately 300 billion dollars a year from stress-related absences and seven out of ten employees report being disengaged from their work. So there's certainly a problem... The remedy has now become mindfulness, where employees are then trained individually to learn how to cope and adjust to these toxic corporate conditions rather than launching kind of a diagnosis of the systemic causes of stress not only in corporations but in our society at large.
Rex sends a 52 minute video, The Plant Ecology of Concrete, Garbage and Urine. As described by one of the comments: "Imagine a guy with a camera walking through homeless people territory and a dump place, getting all scientific about plants." Also with lots of fun and cynical social commentary.
A new Paul Graham post, The Bus Ticket Theory of Genius, arguing that the most important factor in doing great work, is to be obsessed with something with no expectation that it's useful, like people who collect old bus tickets.
This week I got obsessed with making a video of an obscure song, using images from two classic books of sci-fi paintings. I've just uploaded it: Wireheads - Sagan.
November 26. I'm taking the week off from blogging. There are some Thanksgiving-related recipes on my old misc. page.
November 21. So I just found out that the original Blade Runner movie was set in November of 2019. I've been toying with this idea, that the best way to keep AI from doing bad stuff, is to make every AI intelligible, to humans, as a person. The idea comes from science fiction, where you might have computers or androids who don't behave quite like humans, but they're still characters. You can still get a grip on their behavior with the same mental tools that you use to understand other humans. We might need this constraint on AI's, because we know how to think about people, but we don't know how to think about whatever Google is doing.
Related: a blog post by Adam Elkus, The Varieties Of The Technological Control Problem. Near the end, he uses the movie Ex Machina to illustrate how our relationship with technology is like an abusive couple:
All abusive relationships begin between two individuals that believe they are both in love and that they can meet each other's varied needs. But over time several negative things occur. First, the parameters of the relationship are subtly changed to the disadvantage of one of the parties. Second, that party becomes less and less capable of recognizing what is happening to them and breaking free of the abuser. So perfectly independent and emotionally stable men and women can in theory become shells of their former selves after being trapped in an inescapable web of abuse that, sadly, they come to believe that they deserve. This is a good metaphor for [Langdon] Winner's own formulation of the control problem. Technology can be "autonomous" in the sense that humans may enter into relationships with technologies with the goal of improving their lives. However in reality people end up finding themselves hopelessly dependent on the technologies and deprived of any meaningful control or agency over them.
New subject. It's getting close to the end of 2019, and I'm ready to list my favorite music of the decade. I have a chronological 2010's playlist on Spotify, but two of my top three songs are on neither Spotify nor YouTube. Those two also happen to be from my two favorite albums of the decade, which you can find from the Bandcamp pages. The top three:
The Lovely Eggs - Wiggy Giggy
Wireheads - Holiday
Big Blood - A Watery Down Pt II
November 18. Today's subject: feeling bad, starting with a reddit thread cross-posted to the subreddit: Suicide hotline is bullshit. Apparently this is only a problem in the USA, where unhappiness beyond a certain level is criminalized, and if the system finds out you're suicidal, they will make your life much worse.
That's why so many depressed people pretend they're happy. You can find a lot of examples in this thread: How does your depression manifest in ways that non-depressed people wouldn't expect or understand?
Related, from a couple weeks ago, How would you explain anxiety to someone who rarely deals with it?
And a smart article on deep brain stimulation as a cure for depression. This reminds me of a bit from the BBC documentary, Pain, Pus and Poison, where the pioneers of anesthetics had to overcome the belief that it's impossible to cheat pain.
Finally, a fascinating article from the Guardian, I wish I'd never been born: the rise of the anti-natalists. There are different levels of anti-natalism, which I would define by how far people go in projecting their own unhappiness onto others. The first level is just not having kids of your own, because you think they're better off not existing. The next level is wanting all of humanity to go extinct. The next level is the belief that "all sentient beings should be spared from life."
Never mind the difficulty of defining "sentient", I'd like to hear a debate between a full-on anti-natalist and a squirrel. (It's easier for me to imagine a talking squirrel than a genuinely happy human.) I would say, isn't life full of good feelings that make the bad feelings worthwhile? The anti-natalist would say, those are two different orders of feeling, you can't compare them, and any amount of pain makes existence a bad idea. The squirrel would say the opposite: life contains a lot of pain, but it's all part of the joy of being alive.
So the real cause of anti-natalism isn't pain, but the absence of pleasure. Now we're tripping over language again, because "pleasure" implies stuff like eating ice cream and watching Netflix, which are overwhelmed by serious depression and anxiety. We don't have a word for the kind of good feeling that would overwhelm common bad feelings, because that kind of good feeling is not part of our way of life.
November 14. Why Technologists Fail to Think of Moderation as a Virtue, a smart review of a new book about artificial intelligence. There's a famous thought experiment called the paper clip maximizer, and Elon Musk tells a version about a strawberry maximizer, an AI that eventually "blankets every nook and cranny of the planet with strawberry fields and annihilates civilization in the process."
But the review cites sci-fi author Ted Chiang, who says the tech executives are projecting their own value system, every company trying to maximize growth at all costs. We're worried about computers, when corporations are already powerful and dangerous artificial beings.
I would argue it like this: an actual strawberry maximizer is almost harmless, because we understand what it's doing. It has a clear and simple motive, and the moment it destroys something we care about, in order to grow more strawberries, we'll stop it. But nobody understands what Google is doing, not even the people who work for it. It doesn't even make sense to say Google has a motive. It has behaviors, and consequences, and its behaviors are tied in with our own interests, so that it can do a lot of damage before we get serious about stopping it.
You could say the same thing about automobile traffic. A human creation has duplicated itself so successfully, that you can find it everywhere there are roads. Whole cities are made for it. It has made us completely dependent on it, while violently killing us, making us sick, isolating us socially, consuming massive resources, and throwing our planet's climate out of whack. Strangely, no one loves this radical contraption more than self-described conservatives (who have also found the one good use for it: racing).
But we don't call it an AI, because it's not intelligent in any way that we recognize. Its intelligence is hidden inside us. Our own simple motive, to go faster, has tricked us into doing all the thinking for a global takeover by an evil robot army.
By the way, our motive to understand this subject has led to a really unsatisfying discussion over on the subreddit. I think the deeper problem is, the human-built world has been so transformed by cars, that it's difficult to imagine a world where no one would miss them, especially one with other modern technologies.
But I think it's going to happen, not through utopian planning, but because maintaining roads is really expensive. In my own state, the roads are about to go to shit because of a populist revolt against car taxes. In a hundred years, populations will be falling everywhere, and places with more pavement than people will get abandoned.
November 11. Speed limits for ships can have massive benefits. This reminds me of an argument by Ivan Illich, that the world would be a lot better with a universal speed limit of 15 miles per hour.
Here's my own page of excerpts from Ivan Illich on Cars, where he calculates that Americans put so much time into their cars, including working to pay for car expenses, that their effective speed is only 5 mph. Also, the walkability of a city is a key factor in the social mobility of its residents. And why are cars killing more and more pedestrians? Probably because cities are increasingly being built for more, faster, and heavier vehicles.
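Illich's number is easy to sanity-check. A minimal Python sketch, using the round figures usually quoted from Energy and Equity; the split between hours spent driving and hours worked to pay car expenses is my own guess, not Illich's:

```python
def effective_speed(miles_per_year, hours_driving, hours_earning):
    """Illich-style effective speed: miles covered divided by ALL the
    hours the car consumes, including time worked to pay its costs."""
    return miles_per_year / (hours_driving + hours_earning)

# Round figures usually quoted from Energy and Equity: about 7,500
# miles a year over about 1,600 total car-related hours.
# The 400/1200 split between driving and earning is hypothetical.
print(round(effective_speed(7500, 400, 1200), 1))  # 4.7 mph
```

However you split the 1,600 hours, the total is what matters, and it lands right around the 5 mph in the excerpt.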
Of course, even moderate transportation reforms are politically impossible. So I might as well go full-on utopian. Instead of limiting speed, I would limit momentum: mass times velocity. Say, 2000 pound miles per hour, or roughly 400 kilogram meters per second, about the momentum of an average sized person on a bicycle. Old and sick people could still putter around on electric wheelchairs. Shrink the roads to trails, abandon the sprawl, turn the parking lots to food forests or high density housing, turn the railroad tracks to intercity trails.
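For concreteness, here's the unit conversion behind that cap, sketched in Python. The conversion factors are standard; the 90 kg rider-plus-bicycle is a made-up example, not a measurement:

```python
LB_TO_KG = 0.45359237    # pounds to kilograms
MPH_TO_MPS = 0.44704     # miles per hour to meters per second

# The proposed cap: 2000 pound-miles-per-hour, in SI units.
cap = 2000 * LB_TO_KG * MPH_TO_MPS
print(round(cap))  # 406 kg*m/s

# A hypothetical 90 kg rider-plus-bicycle at 10 mph:
cyclist = 90 * 10 * MPH_TO_MPS
print(round(cyclist))  # 402 kg*m/s -- right at the limit
```

So under the cap, a 3,500 pound car would be held to about 0.6 mph, while a bicycle rides free.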
With no heavy freight, all manufacturing would be local, and smaller scale. Every city would look different because new construction would have to be made out of local materials. Food would be local, so some cities would really need to innovate with greenhouses or algae or solar-powered protein fabricators. (This is not a low-tech utopia, only a slow one.)
Air travel would switch completely to craft that are lighter than air, and you could actually go a lot faster than 15 mph, on an airship that follows the wind. Anyone who wanted to go east would be waiting for a west wind. (I sort of already do this. There's a town eight miles east of here, with a good bike trail, and I only go when there's an east wind, so that the ride home is effortless.)
Now, let's get really crazy and put a speed limit on data. I think the best rule would be that data can only be carried on physical media by human couriers. That could be enforced by cable-cutting, signal-jamming, and shooting down drones, and big systems would have no advantage over independent agents in motivating couriers to move fast.
The social effects of slowing data to human speed are really interesting, and too big for this post. But I think both the right and left would get on board if they thought it through. And it could become politically realistic inside a thousand years, especially if there's a backlash against big data.
November 7. Some loose ends from the last post, mostly stuff I wrote over email:
I cringe a bit when I use the phrase "local communities", because when politicians use that phrase, they're always bullshitting. To actually bring back local communities, we would need politically impossible reforms. Or a tech crash.
I'm wondering if the breakdown of local communities could be explained purely by cars and telephones, which enable us to have a larger number of more far-flung and weaker human relationships. And of course the internet pushes that even farther.
If people have more money than time, they are forced to use money as the medium of social connection. Only when they have more time than money, can they connect through understanding.
Erik mentions "mental grey goo". Grey goo is the doom scenario with self-replicating tiny machines, and some mental artifacts are like that. They're very simple and they can take over our minds. So, money, internet memes, and certain religious or political ideas.
Maybe all it takes to get people interested in local politics, is to fill it with the same ingroup-outgroup drama as national politics, and then our cities and towns would become as fucked up as our country.
Some music for the weekend. My favorite album of 2019 is by a noise band called Sly and the Family Drone. Their instruments are drums, feedback, and baritone sax, and their sound is basically space jazz, ranging from ambient to heavy in the same track. Their best song, Jehovah's Wetness, has a climax that sounds like whale metal.
November 4. A few weeks ago I wrote that the thought of my own death gives me a sense of relief, because I would be free of all my responsibilities. This goes back to my favorite question lately: Why is there so little overlap between what's good for us to do, and what we feel like doing? Then I was reading Matthew Crawford's book Shop Class As Soulcraft and found a clue in this line: "We want to feel that our world is intelligible, so we can be responsible for it."
Coming at the same subject from another angle, a friend writes:
I am reading some family stories of my 92 year old neighbor, whose father and grandfather are prosperous farmers. They write a lot about the vital importance of being a prominent member of the local community. Not prominent in status or power, but prominent in the ability to help, to meet needs, in their local communities. It's their obligation, but also, their honor. They don't buy seeds from the fancy store far away for half the price; they buy seeds from the local guy, because that is what a community is all about.
Why did we stop doing that? I reject any kind of moral judgment, that people were better in the old days. People are the same as ever, we always do what seems like the best thing at the time, but our environment has changed so that abandoning local communities seems like the best move.
For the answer to both questions, I blame technological complexity. In a hunter-gatherer tribe, or a medieval village, or even the USA a hundred years ago, the human-built world was intelligible to you and your friends. You could wrap your head around the importance of whatever you were doing, and if something went wrong, you knew someone who could fix it.
Now the human-built world is so complex that you can't possibly know enough people to stay on top of it. We have to constantly deal with specialists who we might never talk to again. And the specialists, even if they're doing something useful, are doing the same thing over and over for strangers, so they're not really into it. And at the same time, we're supposed to be super-nice to each other and pretend to be happy, which means hiding our gnawing awareness of how many things could go wrong, that we have no idea how to deal with.
Lots of people have written about the costs of complexity. Joseph Tainter's book The Collapse of Complex Societies is mostly about physical stuff rather than human psychology. However you frame it, three things are certain: 1) More complexity, more problems. 2) It's easy to gradually raise complexity, and really hard to gradually lower it. 3) So when complexity falls, it tends to fall a lot.
Usually the way this happens, is that people withdraw their emotional support, and then their practical support, from old, overly complex systems, and give it to new and simple systems that make them feel better. Beyond that, I have no idea how this is going to play out, but I'm curious to see it.
October 31. The SpaceX Starship is a very big deal. It's fully reusable, it can take off and land vertically, and it's cheap.
The Starship is comparable in complexity to a 737, and so it's not unthinkable to have a construction rate of 500/year. If each Starship manages 300 flights per year, each carrying 150 T of cargo, then we are talking a yearly incremental cargo capacity growth of 22 million tonnes to orbit.
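The quoted capacity is straight multiplication. A quick check of the arithmetic, with all three inputs being the article's hypotheticals, not facts:

```python
ships_built_per_year = 500   # hypothetical construction rate
flights_per_ship = 300       # flights per ship per year
tonnes_per_flight = 150      # cargo per flight

added_capacity = ships_built_per_year * flights_per_ship * tonnes_per_flight
print(added_capacity)  # 22500000 tonnes of new yearly capacity
```

That's 22.5 million tonnes, matching the "22 million" in the quote.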
So, space factories, space hotels, satellites for all kinds of crazy shit, and I won't be surprised to see space advertisements outshining the stars. This Hacker News thread is mostly a debate about using space sunshades to fix the climate.
The article says that 90% of the cargo will be fuel for missions farther out. In sci-fi, these are human missions, but I think it will be almost entirely robots. Putting humans on Mars makes no sense economically, except as a way to take advantage of people who will do anything to go there. And when they get there, they will be so bored and homesick that they will do anything to come back. I think Philip K Dick nailed it: actual Mars colonists will stay sane by living in virtual worlds so convincing that they forget they're stuck on Mars.
More generally, a paradox: the deeper humans go into outer space, the deeper they will go into their own minds.
For Halloween, a song. The first time I heard this, even though it's by my favorite band, I thought it was terrible. Now I love it so much, that the more high I am, the more times in a row I want to hear it. And it totally sounds like ghosts and goblins: Big Blood - So. Po. Village Stone.
October 29. No big ideas this week, so I'm just going to chat. This is going to be a cold winter. We had a hard frost in early September, weeks before normal, and last night we tried to drive up to Spokane, but the roads were so icy that we turned around and ended up getting home four hours later, after lots of 20mph driving to stay on the road, and finally taking back roads to avoid a huge backup where cars had gone off. So today I paid $420 (heh) for some craigslist snow tires. They were overpriced, but they have good tread, they're well-reviewed, they're the right size, and they're on their own rims, so I don't have to go to the tire store to put them on and off.
We've been watching this German TV show, Dark, and it's got me thinking about the different ways that someone can have high or low intelligence. The show has so many characters, that it's a huge challenge for me to keep track of them. I had to rewatch the first four episodes, and every time they showed a family photo, I paused it and tried to memorize them like I was studying for a test. Still, last night I had to keep asking Leigh Ann, who is that guy? And despite not rewatching, she usually remembered.
So my brain is not good for people. When I was five years old, watching Bewitched, I noticed that Darrin was sometimes credited as Dick York and sometimes as Dick Sargent, but I couldn't tell them apart, so I assumed the actor had changed his name. Meanwhile, most five year olds could easily tell the actors apart, but couldn't read.
I was just having an email conversation about how humans have lost the ability to "tune in" to the natural world, and it occurred to me, I have a similar inability to tune into the human world. Things that other people "just know", I have to try really hard to figure out, and still might get them wrong. But the benefit is, doing things a weird way is no harder for me than doing things the normal way.
October 26. So after the last post, I got several emails about animals using drugs, like these. But the guy in the video isn't arguing that animals never use drugs, just that their motivations are different than ours. He might be wrong, but I still believe that humans have lost something that wild animals have, and good drugs can temporarily bring it back.
Now I want to take another angle, and look at cultural judgments about right and wrong reasons for using substances to change yourself. The video could be interpreted as having this subtext: animals are good because they use drugs only for practical reasons; humans are bad because we use them to feel good.
I've seen this judgment made explicitly in an anecdote about two people who did the same psychedelic, and met the same entity. The first one says to the entity, I'm seeking spiritual understanding, and gets rewarded; the second one says, I just want to get high, and gets punished.
But that's not the only way we draw the line. Consider steroids, where we think it's bad for athletes to use them for a practical upgrade, but it's okay to use them for certain medical conditions. In this case, the rule is: there's a human baseline, and it's good to use drugs to get up to that baseline, but not above it.
This is also how we think about psychiatric drugs: making yourself normal is responsible use; making yourself unusual is abuse.
Alex suggests a more interesting distinction:
These animals, as well as indigenous humans, tend to use these substances to feel more, not less. It's a running toward, not from.
Personally, I'm always trying to feel more. I mean, I avoid pain, but if it happens, I want to face the pain raw. And if I'm already feeling good, then I want to see the light behind the world, to be reminded of the potential bliss of existence. For me, this line from the Tao Te Ching is totally about drugs: "Use the bright light, but return to the dim light."
October 24. I have no ideas this week, but I just made a comment on this YouTube video that was posted to the subreddit, Why Do Humans Like to Get High? The guy argues that we are the only animal that uses drugs to feel good. I don't know if he's right, but if he is, I think it's because the way that drugs make us feel good, wild animals already feel that way, all the time. Some of the other comments say basically the same thing, like "He who makes a beast of himself gets rid of the pain of being a man."
Related, also linked from the subreddit, a dense essay about Capitalism as enchantment. The idea is, the age of reason did not snuff out the belief in "mysterious, incalculable forces that ruled and animated the cosmos," but shifted that style of thinking away from nature and religion, and toward money.
And some fun stuff for the weekend. I've been getting more into board games lately, playing three-spirit solo games of Spirit Island, and Wingspan with friends. This video, Feudum Complete Rules + Setup, is absolutely jaw-dropping in three ways: its impeccable production values, its sooooothing vibe, and the ridiculous complexity of the game.
Finally, something I've never heard before, a Led Zeppelin cover that's better than the original: Cato Salsa Experience and The Thing with Joe McPhee - Whole Lotta Love
October 21. Bunch o' links, starting with one from the subreddit, To make laziness work for you, put some effort into it. It's a rumination around the issues of laziness, idleness, and boredom, and the main idea is that it's good to be less busy and appreciate it. There's even a bit about free will, which reminds me of the time Leigh Ann and I were driving past some wind turbines, which were barely moving, and she said, "Those windmills are lazy!" Maybe we're more like wind turbines than we think: our motivation seems to come from inside us, when really it comes from our environment, and how well it fits us.
Related, Dominic sends this Hacker News comment thread about Coffee is Hard, a thoughtful essay about making an 80's style quest game about mundane daily activities. The idea is, tedious chores are a big part of life, and we should learn to enjoy them, but that enjoyment depends on not being in a hurry.
Some pessimism about technology, first from Scientific American, We Have No Reason to Believe 5G Is Safe. And another Hacker News thread, this one about plummeting computer literacy, as the user interfaces on our devices are polished down to contain less and less about how computers actually work. I remember when you would download a "program". Now they call it an "app". A program is something that lets you tell your computer what to do. An app is something that lets your computer tell you what to do.
And some optimism about food. Flour power: meet the bread heads baking a better loaf. It starts with growing wheat varieties that are not optimized for industrial efficiency. A wheat field with a wide gene pool has lower yields, but it's more robust against disease and weather, and the wheat tastes better. Then there are ways of milling the wheat and making the bread, that work the same way, trading efficiency for quality.
And this article is definitely overstating its case, but it argues that vat-grown protein is getting so good, so fast, that animal farming is going to collapse.
An article about the emotions of animals, and the growing acceptance that they're a lot like us: Can a Cat Have an Existential Crisis?
And Mark sends a link to Buddha at the Gas Pump, a YouTube channel of "conversations with spiritually awakening people."
October 18. Some feedback from the last post. First, some pessimism about the present society, a great 2018 blog post, There is Trouble in River City. The author uses two sources from the 1800's, Washington Irving's descriptions of two contrasting river towns, and Thomas Carlyle on "pig philosophy", to show how money can corrupt the human spirit, and how the thrill of material progress ends in malaise. I love this bit:
Irving had taken a steamboat up the Mississippi from New Orleans, had stopped at one of the "serene and dilapidated villages" that "border the rivers of ancient Louisiana," and had been there beguiled by the strangely joyous life of the tatterdemalion Creoles.
And a reader comment with some optimism about our species:
There are people who are trying to... evolve humanity on a spiritual level. Some call this the "5D reality" and some call it crystalline gateways, lol. Some major hoogey moogey there. But in my own meditations, it seems to resonate with the idea that we are capable of switching timelines, changing tracks, weaving in new threads entirely. I get the idea that the gods really love that shit. It feels GOOD. I think that's why we're still around.
Now that I'm re-integrated in the human world, I have a product recommendation and a sports highlight. My new favorite shoe, when I'm not wearing Vibram FiveFingers, is the Camper Peu Cami. It's not quite as low-profile as the Vivo Barefoot, but it's shaped even more like an actual human foot, and it's very stylish.
And check out this move by a player on my local soccer team, Brianna Alger making a defender fall down without even touching the ball.
October 16. Bad News for the Highly Intelligent. Like a lot of studies of supposedly intelligent people, this one just looks at members of Mensa, which is not the same thing. Still, the results are extreme: double the anxiety disorders, and triple the environmental allergies of the general population. They speculate that more intelligent people are more overexcitable. Or...
Depressed People See the World More Realistically. The evidence from studies is not conclusive, but depressive realism fits my personal experience. I used to be happier, until three things made me smarter. First, I've been in enough car crashes now that I understand that driving is extremely dangerous and we should all be terrified every minute that we're doing it. Second, I've become a lot more aware of subtext in conversation, and now the social world feels like a minefield.
Third, I've heard that psychedelic drugs cure depression, and maybe I just need to take bigger doses, but the reason I'm suddenly cynical about humanity, is that last week I took LSD and walked up the river trail out of town. Every time I do it, it's pretty much the best day I've ever had, and I understand that every blade of grass is more impressive than the combined works of humanity. And then I have to go back to the human world and default human cognition, and it's awful. This song describes it perfectly.
Don't worry, I'm not considering suicide. I believe that fate has plans for me, and if I quit, I'll have to start over. But when I think about my own death, the main thing I feel is relief. Then when I think about it more carefully, I don't actually want to die, I just want to have no responsibilities.
Don't we all -- and that's not normal. It's because our society is in full-on decline. I remember in third grade when they taught us the word "responsibility", and I was immediately suspicious. Only now can I explain why: Responsibility is a social tool to maintain the inertia of activities that at one time someone felt like doing, but now nobody does.