Archives

November 2012 - February 2013

home
previous archive

November 1. Earlier this week my friend Davis stayed a few nights with me. He asked me about spirituality, and I gave him a definition, something about rejecting philosophical materialism without following the centrally controlled beliefs of a religion. I asked what "spirituality" means to him, and he struggled to define it without using the word "spiritual".

So now I'm wondering, when people talk about spirituality, what exactly are they talking about? If I had more time, I'd make a survey with hundreds of questions and give it to thousands of hippies. If I wanted to make them look stupid, I would ask questions to trick them into muddling the word up with culture and politics: What's more spiritual, a Prius or a Hummer? But if I wanted to understand them, I would ask for stories about the most spiritual and anti-spiritual parts of their lives. Then I might be able to tease out some definitions, like "the feeling of engaging with a living universe" or "the yearning to be part of a larger story" or "absence of stress".

Underlying all of this, I think "spirituality" is how people refer to the gap between human nature and modern society. Our ancestors lived in close-knit extended families, embedded in the spectacular complexity of wild ecosystems, viewing reality itself as full of awareness and intention. Now we grow up in neurotic nuclear families, work in office cubes, our closest engagement with nature is mowing the lawn, and we view consciousness as an accident in a mindless billiard-ball universe.

So how do we bridge the gap? This is such a hard question that almost everyone goes astray. We correctly observe that our society is wrong, and then we get seduced by incorrect but beautiful stories about why it's wrong and what we can do about it. As industrial civilization continues to decline, I expect all kinds of movements to appear, from exploitative cults, to unrealistic utopian crusades, to patient realignments of human culture, all of them feeding off the energy of people who are economically vulnerable and desperate to feel like they're part of a story. Eventually several of these movements will stabilize into dominant belief systems, and beyond that I'm not sure.

What if technology makes it easier to change human nature to fit society, than change society to fit human nature? I don't think we can just do anything we want. I believe in a conscious universe, into which post-human nature and post-human society must ultimately fit. We're either going to change ourselves to extinction, or change ourselves to fit this universe through a new interface that we cannot yet imagine.


November 2-4. In a Reddit thread about reality glitches, I made this post about White Rabbit, and how I'm sure I used to hear "feed your head" three times at the end, but now, in every version I can find, I only hear it twice. I'm not the only one. In the replies, Velinus finds two videos where the printed lyrics have it three times, and a few readers remember the same thing. Robert has no memory either way, but reports:

After you mentioned White Rabbit I decided to look up the lyrics online. All the lyrics I found on the major lyrics sites have the line "feed your head" repeated twice. I lost interest at this point and decided to get back to my work.

Now, there's a UNIX program called fortune. It basically just displays random quotations pulled from a database when it's run. I had set up our development server so that it runs when I log into the system. So, what should appear when I log in? The lyrics to White Rabbit with the "feed your head" line repeated 3 times!

And now, this is the top postcard on this weekend's PostSecret.


November 13. I've been meaning to write about Sarah Hrdy's book Mothers and Others. Here's a decent summary: Is Babysitting the Ultimate Source of Our Ability to Understand Each Other?

Hrdy starts with the observation that humans are nicer to each other than any other large primate, even bonobos. She looks at a bunch of explanations, and eventually argues that at some point in our history (for complicated reasons I won't get into) mothers started sharing the duties of raising children with other family members, especially maternal grandmothers. This led to a higher survival rate for children who learned to read the emotions of many potential caregivers, which led to a higher genetic potential for empathy.

Hrdy's book is mostly a lot of scaffolding to support several rockets that blast off in the final chapter. One shocker goes like this: for hundreds of thousands of years, it was easier for humans to get emotional support than food. So any child that did not have a rich and healthy emotional environment, also would not have had enough food, and would have died. But in civilization, it's normal for kids to be emotionally neglected and still survive to adulthood. This could explain increasing mental illness, depression, emotional detachment, and other problems of modernity.


November 28. View From Hell post, Fungibility and the Loss of Demandingness. It's already too clear and concentrated to be summarized, but here are three sample paragraphs:

To sum up, the modern economy is primarily composed of things and services available for money, ratcheting to allow fewer and fewer non-monetary costs. When things are available for money, anyone can acquire them; this dilutes the information about the self that can be contained in the ownership. Similarly, a major trend in the labor market is toward fungible skills that anyone can supply, reducing opportunities for virtuosity and positive information about the self through work. Everything is increasingly available for money, except, I will argue, a major thing we all want to buy that gives us the feeling of meaning: our own value and specialness.
...
The mistake is to view hipsterdom as pure signaling. It invokes signaling, of course, but also the genuine, authentic search for value in genuineness and authenticity. The hipster is a person who is particularly alienated by the world of purely fungible culture. His music and books, his old "vintage" items, are more demanding, harder to find. But at the same time, he is made more interesting and valuable through what they demand from him.
...
The fungibility of work, the reduction of demand for long-developed special skills, the impossibility of virtuosity in one's limited job, has made work less and less a source of reliable, positive information about the increasing value of the self - because it has ceased to truly improve people. But people still desire to work at what they love, and to improve themselves. The market will sell them the feeling of this, but will not commonly supply them with food in exchange for pursuing virtuosity.

Going off on my own tangent, humans have two contradictory desires. We want to feel like we're valuable people living good lives, which itself is a massive and difficult subject. A good place to start is this video, The surprising truth about what motivates us. The other thing we want is for life to be easy, but there is a trade-off between a good life and an easy life.

This conflict comes into clearer focus as more work is automated. Do you want a machine where you push a button and food comes out, or do you want the challenge and personal empowerment of growing and preparing food with your own hands? This was not an issue in preindustrial civilization, when work was done by slaves and peasants. The lower classes suffered, but not from existential angst, and the elite felt important because they were ruling actual humans. Now there is a growing class of people who have no political power but are served by machines.

If the tech system can adapt to resource exhaustion, we might emerge into a high-tech utopia/dystopia, in which it's easy to be comfortable but difficult to be happy. Social class will no longer be about power or even standard of living, but about valuable activity. The upper class will hold the few important jobs that still require humans. The middle class will be hobbyists, practicing difficult skills that are not necessary for society. And the lower class will be content to consume entertainment.


December 27. Some links about wildness. From Jared Diamond, Best Practices for Raising Kids? Look to Hunter-Gatherers. There are three sections: hold them, share them, and let them run free. This reminds me of this article, How children lost the right to roam in four generations.

And How to Solve Homelessness, in which the homeless author argues that society should not try to end homelessness, but make it easier, with good free public toilets, super-cheap capsule hotels, the right to have a job without an address, and the right to sleep in public.


January 1, 2013. Ten years ago it really seemed like the whole system was about to come apart. People who saw a crash coming were seeing things that were being ignored by people who expected business as usual. Yet we were still wrong. After seeing how little daily life has changed after the 2008 financial collapse, seven years with global oil production on a plateau, and two catastrophic hurricanes, I think the big mistake of doomers was assuming that failures would have positive feedback like a house of cards. At this point, anyone still using the "house of cards" metaphor is not a serious analyst but an entertainer. It's clear that the interconnectedness of modern complex systems makes them stronger, not weaker.

This is especially true of technological systems. I no longer expect any kind of tech crash, except that resource-intensive benefits like driving and eating meat will become more expensive and less available to poor people. Economies will collapse as they adjust to decades of zero or negative growth, weaker nations and businesses will fail, but computers will continue to get stronger, and automation will adapt to resource decline by becoming more efficient and better able to compete with human workers. At the same time, no government that can possibly avoid it will allow its citizens to starve, so there will be even more subsidies for industrially produced human dog food.

Over the next few decades I see the global system passing through a bottleneck as it shifts from nonrenewable to renewable resources. We fantasize about apocalypse because we want the world to get looser, but I see it getting crappier and tighter. When we emerge from the bottleneck, life will get nicer... but are we coming out of the bottle, or going in? I think the "singularity" will match its meaning in astrophysics: the center of a black hole, with 90% of increasing computing power being used to stop the other 10% from doing anything interesting. I imagine an airtight sci-fi utopia/dystopia, where almost everything will be automated, nobody will have to do any work, everyone will be comfortable and safe, and we will have amazing powers to entertain ourselves. Other than that, we will have less power than any people in history or prehistory. The world will be lifeless and meaningless, a human museum, a suicide machine. Making the world alive again will be our next challenge.


January 10. Ten years ago, when I imagined "collapse", it was interesting: industrial collapse means there are no factories and everything new is made by hand. Infrastructure collapse means there are no electric grids and we're riding horses on the ruined freeways. Economic collapse means the banks are just gone, cash is worthless, and economies are gift and barter. Political collapse means you don't have to pay taxes, kids don't have to go to school, and there are no police.

Now it's increasingly clear that none of these things are going to happen, even slowly over 100 years. As someone known for writing about collapse, I have two career options. One is to follow the bait-and-switch: keep writing about "collapse" but redefine it as something much less interesting. The other option is to write about what has now become interesting, given the new forecast. And that is: if the tech system keeps grinding ahead, what kind of crazy stuff is it going to do?


January 16. The Edge.org 2013 question has just come out, and they ask a bunch of prominent thinkers What Should We Be Worried About? In the two that worry me the most, Eric Weinstein writes about excellence and Evgeny Morozov writes about "smart" technologies, and both argue that by seeking improvement and perfection according to a narrow cultural view, we are locking out genius and change. In a related point, Murali Doraiswamy writes that the American view of mental illness is spreading globally, so that more and more mental states that Americans don't appreciate are being declared pathological, and overmedicated.


January 22. My latest thought about collapse is that there are two ways to frame it, or a continuum between two extremes. At one extreme, the threat of collapse is purely external. All systems are assumed to be equally strong, and the question is how many bad things have to happen to a system before it breaks down. This is like a view of infectious disease that assumes all immune systems are equally strong, and seeks to minimize exposure to germs. At the other extreme the threat is purely internal. So a strong system can endure anything, and a weak system will be toppled by anything. I lean in this direction. So if you want to predict if a certain system is going to crash, you don't look at all the threats lined up against it -- you look at how it behaves when things go wrong.

Specifically, I would look at feedback, how things going wrong cause other things to go wrong. In the short term there is a continuum between zero feedback, where one thing going wrong does not make anything else go wrong, and maximum feedback, a chain reaction of bigger and bigger things going wrong, so that the slightest bump will bring total collapse. Over the longer term you can ask if a system gets hardened, where for example a hurricane makes it stronger against future hurricanes, or worn down, where every hurricane makes it weaker against future hurricanes.
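The feedback continuum above can be sketched as a toy simulation. Nothing here is from the post itself; the ring of parts, the function names, and the spread-to-neighbors rule are all my own illustrative assumptions. The point it shows is the nonlinearity: average cascade size stays tiny at low feedback and explodes as feedback approaches the critical threshold.

```python
import random

def cascade_size(n_parts, feedback, rng):
    """Toy model: part 0 fails, and each failed part independently
    knocks out each still-working neighbor with probability `feedback`.
    Parts are arranged in a ring; returns how many parts end up failed."""
    failed = {0}
    frontier = [0]
    while frontier and len(failed) < n_parts:
        part = frontier.pop()
        for neighbor in ((part + 1) % n_parts, (part - 1) % n_parts):
            if neighbor not in failed and rng.random() < feedback:
                failed.add(neighbor)
                frontier.append(neighbor)
    return len(failed)

def mean_cascade(feedback, trials=2000, n_parts=1000, seed=1):
    """Average cascade size over many runs at a given feedback level."""
    rng = random.Random(seed)
    return sum(cascade_size(n_parts, feedback, rng) for _ in range(trials)) / trials
```

With zero feedback one failure stays one failure; at feedback 0.1 the average cascade is barely more than one part, while at 0.9 it is around twenty, and at 1.0 the whole ring goes down. Low disaster feedback means the curve stays flat, which is the shape of "chain reactions of smaller and smaller things going wrong."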

I've stopped predicting collapse because I'm seeing low disaster feedback, chain reactions of smaller and smaller things going wrong, culminating in almost no change in the daily life of the average person. The technological systems seem especially strong, the economic systems weakest, and political systems somewhere in the middle. So if the European economy collapses, it will reshuffle but not totally overthrow the governments, and it will barely affect the electric grid.

The next question is, what would it take for any of these systems to change and become weaker? I see two ways it could happen. One is in complex systems theory, and Steven Strogatz raised this danger last week in his edge.org piece on too much coupling. Parts of the system are influencing other parts, and this is good up to a point, but if coupling passes some critical threshold, the basic character of the system changes, and the next small disaster could cause something like a system-wide epileptic seizure.

The other change is in human psychology. I'm thinking of Fredy Perlman's book Against His-story, Against Leviathan, where he goes through the whole history of civilization arguing that empires fell because they were emotionally hollowed-out, because their own citizens no longer believed in them.


January 28. Over the weekend I read Philip K Dick's last finished novel, The Transmigration of Timothy Archer. There's a pivotal scene where three characters go to a medium, intending to talk with the spirit of another character who died. The medium is a fraud, but there's a twist: it's not that she has no psychic powers, but that she has a different power than she claims; rather than speaking with the dead, she can read minds. So she just tells everyone whatever they are already thinking. They think they're looking out a window, when they're looking in a mirror, and this leads to their destruction.

I think this is the key to two of Dick's biggest themes: reality and madness. Reality is looking outward, unreality is looking inward, and madness is when you can't tell the difference.

This reminds me of my favorite critique of technology. In the book In the Absence of the Sacred, Jerry Mander argues that our technological progress is like inbreeding: we have been turning our attention away from the world outside humanity, and deeper into a world created by humans. Look around where you're sitting now. Can you see or hear anything that was not created by humans? How common was this for your ancestors?

I have a challenge for transhumanists. The word can mean a lot of things, but I would ask: where will future transhumans be focusing their attention? With whom will they have relationships? If they're truly transcending humanity, they will be looking outside the world created by humans. I fear that transhumanism is really ultra-humanism, where we not only turn our attention deeper inside the world we created, but change ourselves so that we can't get out.


February 12-22. Two related links: Why Parents Need to Let Their Children Fail, and Smothered by Safety.

If we're doing this to kids, are we on the path to doing it to everyone? Politically it's much easier to increase safety than to increase danger, and my fear is that technology will soon give us the option to increase safety without limit. First you won't be allowed to do anything where your failure will harm anyone else, which is a good definition of responsibility. Then you won't even be allowed to put yourself at risk, and if you think that's silly, consider seat belt laws. Eventually all power and responsibility will be held by machines, and we will live in a padded cell civilization where even if every human is a murderous psychopath, nothing bad can ever happen. We'll all sit around taking euphoria drugs and consuming entertainment, losing the will to live and fading into extinction.

My vision is at the other extreme: no authority, no restrictions, nothing is ever locked, the world is full of dangers, and yet it's rare for anything bad to happen, because humans have attained great personal virtue and awareness. Obviously this has to be done slowly, because if we increase freedom too fast there will be a backlash. But even if it takes us a million years, I think it's the only thing worth aiming for.

More generally, I could frame the conflict between these visions as a conflict between internal progress and external progress. Is it more important to improve human nature or improve the human condition? I've noticed that technological progress is usually described as if improving the human condition is the only thing that matters. So if we build machines to do everything for us, that's good, even if we lose the ability to do anything for ourselves.

There is another dimension to this issue: the possibility that our world is part of an invisible larger world that has a plan for us. You can find this in most religions, and now even scientists are talking about the possibility that we are all living in a simulation. If so, the world that is simulating us must have a reason. And if we are here for a reason, how can we reconcile this with the observation that human life is getting more insulated and predictable? Why would that be in the plan? One story is that the insulation is another challenge we have to overcome, that we have to learn to create life from inside us in a lifeless environment. A similar story came to me in a dream last night: we are being turned into a substrate, a blank slate for another level of creation. If you want to write, you don't buy paper covered with a chaos of words -- you buy clean white paper with uniform lines. Human life is being turned into clean white paper with uniform lines. What are we going to write on it?

next archive