"Look at the sunset from the sun's point of view."
- Steven Wright
November 21. So I just found out that the original Blade Runner movie was set in November of 2019. I've been toying with this idea, that the best way to keep AI from doing bad stuff, is to make every AI intelligible, to humans, as a person. The idea comes from science fiction, where you might have computers or androids who don't behave quite like humans, but they're still characters. You can still get a grip on their behavior with the same mental tools that you use to understand other humans. We might need this constraint on AIs, because we know how to think about people, but we don't know how to think about whatever Google is doing.
Related: a blog post by Adam Elkus, The Varieties Of The Technological Control Problem. Near the end, he uses the movie Ex Machina to illustrate how our relationship with technology is like an abusive couple:
All abusive relationships begin between two individuals that believe they are both in love and that they can meet each other's varied needs. But over time several negative things occur. First, the parameters of the relationship are subtly changed to the disadvantage of one of the parties. Second, that party becomes less and less capable of recognizing what is happening to them and breaking free of the abuser. So perfectly independent and emotionally stable men and women can in theory become shells of their former selves after being trapped in an inescapable web of abuse that, sadly, they come to believe that they deserve. This is a good metaphor for [Langdon] Winner's own formulation of the control problem. Technology can be "autonomous" in the sense that humans may enter into relationships with technologies with the goal of improving their lives. However in reality people end up finding themselves hopelessly dependent on the technologies and deprived of any meaningful control or agency over them.
New subject. It's getting close to the end of 2019, and I'm ready to list my favorite music of the decade. I have a chronological 2010's playlist on Spotify, but two of my top three songs are on neither Spotify nor YouTube. Those two also happen to be from my two favorite albums of the decade, which you can find from the Bandcamp pages. The top three:
The Lovely Eggs - Wiggy Giggy
Wireheads - Holiday
Big Blood - A Watery Down Pt II
November 18. Today's subject: feeling bad, starting with a reddit thread cross-posted to the subreddit: Suicide hotline is bullshit. Apparently this is only a problem in the USA, where unhappiness beyond a certain level is criminalized, and if the system finds out you're suicidal, they will make your life much worse.
That's why so many depressed people pretend they're happy. You can find a lot of examples in this thread: How does your depression manifest in ways that non-depressed people wouldn't expect or understand?
Related, from a couple weeks ago, How would you explain anxiety to someone who rarely deals with it?
And a smart article on deep brain stimulation as a cure for depression. This reminds me of a bit from the BBC documentary, Pain, Pus and Poison, where the pioneers of anesthetics had to overcome the belief that it's impossible to cheat pain.
Finally, a fascinating article from the Guardian, I wish I'd never been born: the rise of the anti-natalists. There are different levels of anti-natalism, which I would define by how far people go in projecting their own unhappiness onto others. The first level is just not having kids of your own, because you think they're better off not existing. The next level is wanting all of humanity to go extinct. The next level is the belief that "all sentient beings should be spared from life."
Never mind the difficulty of defining "sentient"; I'd like to hear a debate between a full-on anti-natalist and a squirrel. (It's easier for me to imagine a talking squirrel than a genuinely happy human.) I would say, isn't life full of good feelings that make the bad feelings worthwhile? The anti-natalist would say, those are two different orders of feeling, you can't compare them, and any amount of pain makes existence a bad idea. The squirrel would say the opposite: life contains a lot of pain, but it's all part of the joy of being alive.
So the real cause of anti-natalism isn't pain, but the absence of pleasure. Now we're tripping over language again, because "pleasure" implies stuff like eating ice cream and watching Netflix, which are overwhelmed by serious depression and anxiety. We don't have a word for the kind of good feeling that would overwhelm common bad feelings, because that kind of good feeling is not part of our way of life.
November 14. Why Technologists Fail to Think of Moderation as a Virtue, a smart review of a new book about artificial intelligence. There's a famous thought experiment called the paper clip maximizer, and Elon Musk tells a version about a strawberry maximizer, an AI that eventually "blankets every nook and cranny of the planet with strawberry fields and annihilates civilization in the process."
But the review cites sci-fi author Ted Chiang, who says the tech executives are projecting their own value system, every company trying to maximize growth at all costs. We're worried about computers, when corporations are already powerful and dangerous artificial beings.
I would argue it like this: an actual strawberry maximizer is almost harmless, because we understand what it's doing. It has a clear and simple motive, and the moment it destroys something we care about, in order to grow more strawberries, we'll stop it. But nobody understands what Google is doing, not even the people who work for it. It doesn't even make sense to say Google has a motive. It has behaviors, and consequences, and its behaviors are tied in with our own interests, so that it can do a lot of damage before we get serious about stopping it.
You could say the same thing about automobile traffic. A human creation has duplicated itself so successfully, that you can find it everywhere there are roads. Whole cities are made for it. It has made us completely dependent on it, while violently killing us, making us sick, isolating us socially, consuming massive resources, and throwing our planet's climate out of whack. Strangely, no one loves this radical contraption more than self-described conservatives (who have also found the one good use for it: racing).
But we don't call it an AI, because it's not intelligent in any way that we recognize. Its intelligence is hidden inside us. Our own simple motive, to go faster, has tricked us into doing all the thinking for a global takeover by an evil robot army.
By the way, our motive to understand this subject has led to a really unsatisfying discussion over on the subreddit. I think the deeper problem is, the human-built world has been so transformed by cars, that it's difficult to imagine a world where no one would miss them, especially one with other modern technologies.
But I think it's going to happen, not through utopian planning, but because maintaining roads is really expensive. In my own state, the roads are about to go to shit because of a populist revolt against car taxes. In a hundred years, populations will be falling everywhere, and places with more pavement than people will get abandoned.
November 11. Speed limits for ships can have massive benefits. This reminds me of an argument by Ivan Illich, that the world would be a lot better with a universal speed limit of 15 miles per hour.
Here's my own page of excerpts from Ivan Illich on Cars, where he calculates that Americans put so much time into their cars, including working to pay for car expenses, that their effective speed is only 5 mph. Also, the walkability of a city is a key factor in the social mobility of its residents. And why are cars killing more and more pedestrians? Probably because cities are increasingly being built for more, faster, and heavier vehicles.
Of course, even moderate transportation reforms are politically impossible. So I might as well go full-on utopian. Instead of limiting speed, I would limit momentum: mass times velocity. Say, 2000 pound miles per hour, or about 400 kilogram meters per second, roughly the momentum of an average sized person on a bicycle. Old and sick people could still putter around on electric wheelchairs. Shrink the roads to trails, abandon the sprawl, turn the parking lots to food forests or high density housing, turn the railroad tracks to intercity trails.
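For anyone who wants to check the unit conversion, here's a quick sketch in Python. The cyclist's mass and speed are my own assumed round numbers, not figures from any study:

```python
# Convert the proposed momentum cap from imperial to SI units.
LB_TO_KG = 0.45359237    # pounds to kilograms
MPH_TO_MPS = 0.44704     # miles per hour to meters per second

def momentum_si(pounds, mph):
    """Momentum in kg*m/s, given mass in pounds and speed in mph."""
    return pounds * LB_TO_KG * mph * MPH_TO_MPS

# The cap: 2000 pound-miles per hour, which works out to ~405 kg*m/s.
cap = momentum_si(2000, 1)

# A rough cyclist, assuming ~90 kg of rider plus bike at 10 mph:
cyclist_momentum = 90 * 10 * MPH_TO_MPS    # ~402 kg*m/s
```

So the cap and a casual cyclist land in the same ballpark, a few hundred kilogram meters per second.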
With no heavy freight, all manufacturing would be local, and smaller scale. Every city would look different because new construction would have to be made out of local materials. Food would be local, so some cities would really need to innovate with greenhouses or algae or solar-powered protein fabricators. (This is not a low-tech utopia, only a slow one.)
Air travel would switch completely to craft that are lighter than air, and you could actually go a lot faster than 15 mph, on an airship that follows the wind. Anyone who wanted to go east would be waiting for an east wind. (I sort of already do this. There's a town eight miles east of here, with a good bike trail, and I only go when there's an east wind, so that the ride home is effortless.)
Now, let's get really crazy and put a speed limit on data. I think the best rule would be that data can only be carried on physical media by human couriers. That could be enforced by cable-cutting, signal-jamming, and shooting down drones, and big systems would have no advantage over independent agents in motivating couriers to move fast.
The social effects of slowing data to human speed are really interesting, and too big for this post. But I think both the right and left would get on board if they thought it through. And it could become politically realistic inside a thousand years, especially if there's a backlash against big data.
November 7. Some loose ends from the last post, mostly stuff I wrote over email:
I cringe a bit when I use the phrase "local communities", because when politicians use that phrase, they're always bullshitting. To actually bring back local communities, we would need politically impossible reforms. Or a tech crash.
I'm wondering if the breakdown of local communities could be explained purely by cars and telephones, which enable us to have a larger number of more far-flung and weaker human relationships. And of course the internet pushes that even farther.
If people have more money than time, they are forced to use money as the medium of social connection. Only when they have more time than money, can they connect through understanding.
Erik mentions "mental grey goo". Grey goo is the doom scenario with self-replicating tiny machines, and some mental artifacts are like that. They're very simple and they can take over our minds. So, money, internet memes, and certain religious or political ideas.
Maybe all it takes to get people interested in local politics, is to fill it with the same ingroup-outgroup drama as national politics, and then our cities and towns would become as fucked up as our country.
Some music for the weekend. My favorite album of 2019 is by a noise band called Sly and the Family Drone. Their instruments are drums, feedback, and baritone sax, and their sound is basically space jazz, ranging from ambient to heavy in the same track. Their best song, Jehovah's Wetness, has a climax that sounds like whale metal.
November 4. A few weeks ago I wrote that the thought of my own death gives me a sense of relief, because I would be free of all my responsibilities. This goes back to my favorite question lately: Why is there so little overlap between what's good for us to do, and what we feel like doing? Then I was reading Matthew Crawford's book Shop Class As Soulcraft and found a clue in this line: "We want to feel that our world is intelligible, so we can be responsible for it."
Coming at the same subject from another angle, a friend writes:
I am reading some family stories of my 92 year old neighbor, whose father and grandfather are prosperous farmers. They write a lot about the vital importance of being a prominent member of the local community. Not prominent in status or power, but prominent in the ability to help, to meet needs, in their local communities. It's their obligation, but also, their honor. They don't buy seeds from the fancy store far away for half the price; they buy seeds from the local guy, because that is what a community is all about.
Why did we stop doing that? I reject any kind of moral judgment, that people were better in the old days. People are the same as ever, we always do what seems like the best thing at the time, but our environment has changed so that abandoning local communities seems like the best move.
For the answer to both questions, I blame technological complexity. In a hunter-gatherer tribe, or a medieval village, or even the USA a hundred years ago, the human-built world was intelligible to you and your friends. You could wrap your head around the importance of whatever you were doing, and if something went wrong, you knew someone who could fix it.
Now the human-built world is so complex that you can't possibly know enough people to stay on top of it. We have to constantly deal with specialists who we might never talk to again. And the specialists, even if they're doing something useful, are doing the same thing over and over for strangers, so they're not really into it. And at the same time, we're supposed to be super-nice to each other and pretend to be happy, which means hiding our gnawing awareness of how many things could go wrong, that we have no idea how to deal with.
Lots of people have written about the costs of complexity. Joseph Tainter's book The Collapse of Complex Societies is mostly about physical stuff rather than human psychology. However you frame it, three things are certain: 1) More complexity, more problems. 2) It's easy to gradually raise complexity, and really hard to gradually lower it. 3) So when complexity falls, it tends to fall a lot.
Usually the way this happens, is that people withdraw their emotional support, and then their practical support, from old, overly complex systems, and give it to new and simple systems that make them feel better. Beyond that, I have no idea how this is going to play out, but I'm curious to see it.
October 31. The SpaceX Starship is a very big deal. It's fully reusable, it can take off and land vertically, and it's cheap.
The Starship is comparable in complexity to a 737, and so it's not unthinkable to have a construction rate of 500/year. If each Starship manages 300 flights per year, each carrying 150 T of cargo, then we are talking a yearly incremental cargo capacity growth of 22 million tonnes to orbit.
So, space factories, space hotels, satellites for all kinds of crazy shit, and I won't be surprised to see space advertisements outshining the stars. This Hacker News thread is mostly a debate about using space sunshades to fix the climate.
The article says that 90% of the cargo will be fuel for missions farther out. In sci-fi, these are human missions, but I think it will be almost entirely robots. Putting humans on Mars makes no sense economically, except as a way to take advantage of people who will do anything to go there. And when they get there, they will be so bored and homesick that they will do anything to come back. I think Philip K Dick nailed it: actual Mars colonists will stay sane by living in virtual worlds so convincing that they forget they're stuck on Mars.
More generally, a paradox: the deeper humans go into outer space, the deeper they will go into their own minds.
For Halloween, a song. The first time I heard this, even though it's by my favorite band, I thought it was terrible. Now I love it so much, that the more high I am, the more times in a row I want to hear it. And it totally sounds like ghosts and goblins: Big Blood - So. Po. Village Stone.
October 29. No big ideas this week, so I'm just going to chat. This is going to be a cold winter. We had a hard frost in early September, weeks before normal, and last night we tried to drive up to Spokane, but the roads were so icy that we turned around and ended up getting home four hours later, after lots of 20mph driving to stay on the road, and finally taking back roads to avoid a huge backup where cars had gone off. So today I paid $420 (heh) for some craigslist snow tires. They were overpriced, but they have good tread, they're well-reviewed, they're the right size, and they're on their own rims, so I don't have to go to the tire store to put them on and off.
We've been watching this German TV show, Dark, and it's got me thinking about the different ways that someone can have high or low intelligence. The show has so many characters, that it's a huge challenge for me to keep track of them. I had to rewatch the first four episodes, and every time they showed a family photo, I paused it and tried to memorize them like I was studying for a test. Still, last night I had to keep asking Leigh Ann, who is that guy? And despite not rewatching, she usually remembered.
So my brain is not good for people. When I was five years old, watching Bewitched, I noticed that Darrin was sometimes credited as Dick York and sometimes as Dick Sargent, but I couldn't tell them apart, so I assumed the actor had changed his name. Meanwhile, most five year olds could easily tell the actors apart, but couldn't read.
I was just having an email conversation about how humans have lost the ability to "tune in" to the natural world, and it occurred to me, I have a similar inability to tune into the human world. Things that other people "just know", I have to try really hard to figure out, and still might get them wrong. But the benefit is, doing things a weird way is no harder for me than doing things the normal way.
October 26. So after the last post, I got several emails about animals using drugs, like these. But the guy in the video isn't arguing that animals never use drugs, just that their motivations are different than ours. He might be wrong, but I still believe that humans have lost something that wild animals have, and good drugs can temporarily bring it back.
Now I want to take another angle, and look at cultural judgments about right and wrong reasons for using substances to change yourself. The video could be interpreted as having this subtext: animals are good because they use drugs only for practical reasons; humans are bad because we use them to feel good.
I've seen this judgment made explicitly in an anecdote about two people who did the same psychedelic, and met the same entity. The first one says to the entity, I'm seeking spiritual understanding, and gets rewarded; the second one says, I just want to get high, and gets punished.
But that's not the only way we draw the line. Consider steroids, where we think it's bad for athletes to use them for a practical upgrade, but it's okay to use them for certain medical conditions. In this case, the rule is: there's a human baseline, and it's good to use drugs to get up to that baseline, but not above it.
This is also how we think about psychiatric drugs: making yourself normal is responsible use; making yourself unusual is abuse.
Alex suggests a more interesting distinction:
These animals, as well as indigenous humans, tend to use these substances to feel more, not less. It's a running toward, not from.
Personally, I'm always trying to feel more. I mean, I avoid pain, but if it happens, I want to face the pain raw. And if I'm already feeling good, then I want to see the light behind the world, to be reminded of the potential bliss of existence. For me, this line from the Tao Te Ching is totally about drugs: "Use the bright light, but return to the dim light."
October 24. I have no ideas this week, but I just made a comment on this YouTube video that was posted to the subreddit, Why Do Humans Like to Get High? The guy argues that we are the only animal that uses drugs to feel good. I don't know if he's right, but if he is, I think it's because the way that drugs make us feel good, wild animals already feel that way, all the time. Some of the other comments say basically the same thing, like "He who makes a beast of himself gets rid of the pain of being a man."
Related, also linked from the subreddit, a dense essay about Capitalism as enchantment. The idea is, the age of reason did not snuff out the belief in "mysterious, incalculable forces that ruled and animated the cosmos," but shifted that style of thinking away from nature and religion, and toward money.
And some fun stuff for the weekend. I've been getting more into board games lately, playing three-spirit solo games of Spirit Island, and Wingspan with friends. This video, Feudum Complete Rules + Setup, is absolutely jaw-dropping in three ways: its impeccable production values, its sooooothing vibe, and the ridiculous complexity of the game.
Finally, something I've never heard before, a Led Zeppelin cover that's better than the original: Cato Salsa Experience and The Thing with Joe McPhee - Whole Lotta Love
October 21. Bunch o' links, starting with one from the subreddit, To make laziness work for you, put some effort into it. It's a rumination around the issues of laziness, idleness, and boredom, and the main idea is that it's good to be less busy and appreciate it. There's even a bit about free will, which reminds me of the time Leigh Ann and I were driving past some wind turbines, which were barely moving, and she said, "Those windmills are lazy!" Maybe we're more like wind turbines than we think: our motivation seems to come from inside us, when really it comes from our environment, and how well it fits us.
Related, Dominic sends this Hacker News comment thread about Coffee is Hard, a thoughtful essay about making an 80's style quest game about mundane daily activities. The idea is, tedious chores are a big part of life, and we should learn to enjoy them, but that enjoyment depends on not being in a hurry.
Some pessimism about technology, first from Scientific American, We Have No Reason to Believe 5G Is Safe. And another Hacker News thread, this one about plummeting computer literacy, as the user interfaces on our devices are polished down to contain less and less about how computers actually work. I remember when you would download a "program". Now they call it an "app". A program is something that lets you tell your computer what to do. An app is something that lets your computer tell you what to do.
And some optimism about food. Flour power: meet the bread heads baking a better loaf. It starts with growing wheat varieties that are not optimized for industrial efficiency. A wheat field with a wide gene pool has lower yields, but it's more robust against disease and weather, and the wheat tastes better. Then there are ways of milling the wheat and making the bread, that work the same way, trading efficiency for quality.
And this article is definitely overstating its case, but it argues that vat-grown protein is getting so good, so fast, that animal farming is going to collapse.
An article about the emotions of animals, and the growing acceptance that they're a lot like us: Can a Cat Have an Existential Crisis?
And Mark sends a link to Buddha at the Gas Pump, a YouTube channel of "conversations with spiritually awakening people."
October 18. Some feedback from the last post. First, some pessimism about the present society, a great 2018 blog post, There is Trouble in River City. The author uses two sources from the 1800's, Washington Irving's descriptions of two contrasting river towns, and Thomas Carlyle on "pig philosophy", to show how money can corrupt the human spirit, and how the thrill of material progress ends in malaise. I love this bit:
Irving had taken a steamboat up the Mississippi from New Orleans, had stopped at one of the "serene and dilapidated villages" that "border the rivers of ancient Louisiana," and had been there beguiled by the strangely joyous life of the tatterdemalion Creoles.
And a reader comment with some optimism about our species:
There are people who are trying to... evolve humanity on a spiritual level. Some call this the "5D reality" and some call it crystalline gateways, lol. Some major hoogey moogey there. But in my own meditations, it seems to resonate with the idea that we are capable of switching timelines, changing tracks, weaving in new threads entirely. I get the idea that the gods really love that shit. It feels GOOD. I think that's why we're still around.
Now that I'm re-integrated in the human world, I have a product recommendation and a sports highlight. My new favorite shoe, when I'm not wearing Vibram FiveFingers, is the Camper Peu Cami. It's not quite as low-profile as the Vivo Barefoot, but it's shaped even more like an actual human foot, and it's very stylish.
And check out this move by a player on my local soccer team, Brianna Alger making a defender fall down without even touching the ball.
October 16. Bad News for the Highly Intelligent. Like a lot of studies of supposedly intelligent people, this one just looks at members of Mensa, which is not the same thing. Still, the results are extreme: double the anxiety disorders, and triple the environmental allergies of the general population. They speculate that more intelligent people are more overexcitable. Or...
Depressed People See the World More Realistically. The evidence from studies is not conclusive, but depressive realism fits my personal experience. I used to be happier, until three things made me smarter. First, I've been in enough car crashes now that I understand that driving is extremely dangerous and we should all be terrified every minute that we're doing it. Second, I've become a lot more aware of subtext in conversation, and now the social world feels like a minefield.
Third, I've heard that psychedelic drugs cure depression, and maybe I just need to take bigger doses, but the reason I'm suddenly cynical about humanity, is that last week I took LSD and walked up the river trail out of town. Every time I do it, it's pretty much the best day I've ever had, and I understand that every blade of grass is more impressive than the combined works of humanity. And then I have to go back to the human world and default human cognition, and it's awful. This song describes it perfectly.
Don't worry, I'm not considering suicide. I believe that fate has plans for me, and if I quit, I'll have to start over. But when I think about my own death, the main thing I feel is relief. Then when I think about it more carefully, I don't actually want to die, I just want to have no responsibilities.
Don't we all -- and that's not normal. It's because our society is in full-on decline. I remember in third grade when they taught us the word "responsibility", and I was immediately suspicious. Only now can I explain why: Responsibility is a social tool to maintain the inertia of activities that at one time someone felt like doing, but now nobody does.
October 14. I don't know why, but lately I haven't been feeling smart on Mondays. So I'll just mention two bits from that Mike Snider talk. First, that his years of searching...
revealed this whole show we call life, and the universe in which it appears, to all be imagination, if looked upon as something other than myself. If I look upon it as myself then all of it is real.
That's a new idea to me. I mean, a lot of people say the self is an illusion, or the physical world is an illusion, but I've never heard anyone say these illusions are directly linked. It reminds me of an image I saw years ago, I forget where, that turned the observing eye inside out. Instead of putting the world on the outside, and your mind on the inside, looking out of your head through your eyes, this drawing put the physical world on the inside of a circle, and outside the circle is consciousness or God, and the circle has a bunch of tiny holes. Those holes seem to be you and me looking out at the world, when really it's pure consciousness looking into it, from different perspectives.
The second bit is related to the popular idea of "enlightenment", something I've been skeptical of for a while. The idea is just too pretty and tidy, and I figure it's just a simplified way of talking about a lot of different ways to improve one's emotional health or spiritual understanding, with no clear place to draw a line.
But Snider has a great metaphor for how a clear line could be crossed. He says it's like one of those magic eye images, where at first you just see a bunch of meaningless dots, and then suddenly you figure out the right way to look at it, and you see a shape. And once you know how to shift your perspective in that way, it becomes easy.
October 11. Posted to the subreddit, a good summary of Matthew Crawford's book The World Beyond Your Head.
I read the book a few years ago, and I've written about it a couple times. From 2015: "The trend in technology is to make practical things boring and idiot proof, so they don't attract our attention but without our attention they still barely work. Meanwhile, entertainment and advertisements are being skillfully engineered to demand our attention." And from 2018, "...increasingly complex gadgets that are supposed to be our magical servants, but in practice, when they malfunction, or when their unseen handlers exploit us, we are powerless, because the gadgets are too complex for us to tinker with or even understand."
Right now I'm feeling really cynical about the whole human project. From a materialist perspective, where all life is accidental, humans are just a really big accident, far too unstable to find any equilibrium with the rest of life. And from a spiritual perspective, where there's only one Consciousness playing games with itself, I think it's just about burned out on being human.
By the way, on the subject of spiritual perspectives, thanks Mark for sending this 2017 recording, The Hillbilly Sutra. It's a two hour talk by Mike Snider, better known as a banjo player than a spiritual teacher. But this is the most impressive spoken word recording I've ever heard. Usually when I listen to someone talking, I use the settings on YouTube or VLC to speed it up so I can get through the chatter to the interesting ideas. But with Snider, I actually slow it down so I can transcribe stuff like this:
Consciousness is the all. Besides it there is no other. So we are putting anything and everything under this umbrella. This is why I use the term absolute consciousness. This term refers to my beingness, and the selfsame beingness of not only myself but the singular all-encompassing and all-inclusive void beingness or intelligence of everything.
It's radiating from you right now, as you, right here in this moment. It is effulgent, visceral, radiant, and absolutely void of any objectivity or subjectivity whatsoever.
October 9. Two comments on unknown knowns. From Voidgenesis on the subreddit:
This made me recall personal experiences of learning to play piano. My conscious awareness was mostly located in my dominant right hand. As I became more skilled and the left hand got involved it was as if someone else was controlling it much of the time. That in turn reminded me of all the neurobiology research showing that the mind is not a coherent construction, but composed of many different modules competing for access to the central self aware part (or frantic confabulator depending on your perspective). If attention is a neurological illusion then it tints the whole original conceptual framework.
And from Matt over email:
Perhaps the reason no one challenged your claim that attention can home in on something without us knowing it, is that people intuitively grasp how attention is more cloud-like than laser-like.
We can be thinking about an anxiety-inducing project at work, have a song stuck in our head, briefly be annoyed at another person on the train, and have a memory surface all within the space of seconds. It's easy to fail to realize that a part of our mind began replaying a song it heard from someone's smartphone before we boarded the train. We may suddenly wonder why we're thinking about so-and-so from college only to trace the memory to the fact that we've been replaying a song internally. We may or may not know why the song entered our thoughts at all.
If there's any activity that can be said to cause the most suffering, I'd say it's this: thinking about something without clearly knowing that you're thinking about it or knowing the negative effects that's having on your body.
October 7. A few more thoughts on attention. First, I was surprised that no one challenged me on category 2, "where your attention is, and you don't know it." How is that even possible? Isn't that the definition of attention, that whatever your attention is on, you know it? Maybe it's like "Yeah, when I'm focusing on that thing, I'm aware of it, but I didn't notice I was focusing on it that much."
I'm also thinking about the metaphor of fish not being aware of water. When I first heard that, it seemed profound. Since then, in some circles, it's become a cliche, and the usual interpretation is that the aware-of-water are better than the unaware-of-water. But notice, the best known presentation of the metaphor is probably by David Foster Wallace, who killed himself.
Taking it literally, an actual fish would have no reason to be aware of water, and becoming aware of water would add an unhelpful cognitive burden. Taking another shot at what I wrote last week: maybe the recent surge in depression and anxiety is caused by the cultural trend of assuming that more awareness is always better, and now we're struggling with conscious awareness of too many things that are better handled subconsciously -- or even mishandled.
October 2. Continuing on the subject of attention, this subreddit thread has helped clarify my thinking, and now I can define four categories: 1) where your attention is, and you know it; 2) where your attention is, and you don't know it; 3) where your attention is not, but you know it could go there; 4) where your attention is not, and you don't know it can go there.
This is a lot like Donald Rumsfeld's speech about knowns and unknowns. He was talking in the context of war, and information technology has put us in the biggest attention war of all time. We are fighting for four things: to see, to not see, to be seen, and to not be seen. Turn the TV to the game, mute that ad, look at my tweet, and don't track me, Google.
There's a lot to be said about being seen and not being seen, but I want to focus on seeing and not seeing -- especially not seeing. This is the age of raising awareness, and it's gone so far that we're overwhelmed. Our ancestors could not have imagined how many demands we have on our attention, or how hard it is to choose among them.
I think this is why some people are pushing back against mindfulness. The last thing we want is even more shit we're supposed to be paying attention to. But the way I see it, the mind is like a web browser, and mindfulness is like changing your preferences. It's difficult, but it's an investment: by giving some attention to your own filter, you can learn to filter more stuff out, and free up some attention for whatever you decide is important.
Update: Mark calls this post "probably the best you've ever written." That's interesting, because almost all my other posts have been written straight to a computer screen, sober, and this one was written longhand on two puffs of good weed.