"The bigger you build the bonfire, the more darkness is revealed."
- Terence McKenna
March 13. Continuing on Monday's subject, a reader sends this page about complexity of value and how difficult it is to encode human values into a system of rules:
Because the human brain very often fails to grasp all these difficulties involving our values, we tend to think building an awesome future is much less problematic than it really is. Fragility of value is relevant for building Friendly AI, because an AGI which does not respect human values is likely to create a world that we would consider devoid of value.
Another angle: The Best Intelligence Is Cyborg Intelligence. I think this is where we'll be for the rest of this century, because no matter how powerful computers get, it will always be easier to combine machine and human intelligence than to duplicate human intelligence with a machine. The more interesting possibility is that someone will build a self-improving AI that is not a computer.
March 11. From Aeon Magazine, a smart essay about humanity's deep future and the threat of extinction from stuff we are only now beginning to create. My favorite ideas are not from Nick Bostrom, the guy photographed at the top, but from Daniel Dewey, a specialist in artificial intelligence. This is the first time I've seen a plausible analysis of the motivations of a dangerous AI. We imagine that it will be like an evil human, but human motivations come from human nature and human culture, neither of which will motivate a machine. Dewey correctly observes that our AI will have exactly the motivations we give it, and he speculates that it will follow these motivations into consequences that our relatively low intelligence cannot predict.
'The basic problem is that the strong realisation of most motivations is incompatible with human existence,' Dewey told me. 'An AI might want to do certain things with matter in order to achieve a goal, things like building giant computers, or other large-scale engineering projects. Those things might involve intermediary steps, like tearing apart the Earth to make huge solar panels. A superintelligence might not take our interests into consideration in those situations, just like we don't take root systems or ant colonies into account when we go to construct a building.'
It is tempting to think that programming empathy into an AI would be easy, but designing a friendly machine is more difficult than it looks. You could give it a benevolent goal -- something cuddly and utilitarian, like maximising human happiness. But an AI might think that human happiness is a biochemical phenomenon. It might think that flooding your bloodstream with non-lethal doses of heroin is the best way to maximise your happiness. It might also predict that shortsighted humans will fail to see the wisdom of its interventions. It might plan out a sequence of cunning chess moves to insulate itself from resistance. Maybe it would surround itself with impenetrable defences, or maybe it would confine humans in prisons of undreamt of efficiency.
March 8. Yesterday on the subreddit a reader linked to this Ribbonfarm guest post that explores the WEIRD subject more deeply: Honesty and the Human Body.
Also, I keep forgetting to mention this. Does anyone have a Kindle 3 (a.k.a. Kindle Keyboard) that they want to donate or sell cheap? It's okay if the logic board or battery is bad because all I need is the screen.
March 7. I've just finished the latest update on my 100 things about me page. The biggest block of new stuff is items 26-30.
March 6. Three positive links. This Happy Homestead is a new blog by a guy in my area.
BookOS is a new ebook site. It's mostly pdfs, which are good for reading on the computer. If you have an e-reader, it's better to find epub or mobi files.
And a great reddit comment about the benefits of meditation, focusing on the skill of noticing and dropping unhelpful thoughts.
March 4. There's a lot of buzz about this video, Wealth Inequality in America. Here's a less slick video that goes inside the top one percent to show the wealth continuing to increase to even more absurd levels: The L-Curve: Income Distribution of the U.S.
Something most people are missing is that, past a few million dollars, money is no longer about buying stuff -- it's about political power. And this power is mostly being wasted. Only a few super-rich people are doing anything interesting: Bill Gates is trying to eradicate malaria, Elon Musk is trying to colonize Mars. But most of them are ordinary idiots, throwing their wealth/power behind two contradictory goals: to continue increasing their wealth/power, and to keep the system stable.
March 1. Today, people bash American culture and I defend it. Dmitry Orlov's new post, Monkey Trap Nation, points out many ways that Americans are short-sighted, and he's right. But relative to other countries, we are only more short-sighted in the time dimension. We are less short-sighted in the social dimension, and by that I mean political corruption. I'll define corruption as the use of a position of power to serve a narrower interest than the scope of that position. So instead of serving everyone you have power over, you harm most of them to serve yourself and people close to you. America is corrupt, but most of the world is much more corrupt. And short-sightedness toward other people might be even more harmful than short-sightedness toward the future: when disaster happens, a socially far-sighted culture can hold together, while a short-sighted one breaks into warring tribes.
Why Americans are the weirdest people in the world. This is the best article I've seen on the idea that psychological studies have been skewed by only testing educated westerners, and when you look at all of humanity, we are outliers, and Americans are the most extreme outliers. There's a fascinating argument that a particular optical illusion (see the article) is a cultural artifact of living inside square rooms. Americans are the most fooled by that one, but we're the least fooled by another one, in which a vertical line appears to be tilted by being placed in a tilted rectangle. The idea is that we're more individualistic, and cultures that see the illusion are more holistic, more focused on the context.
But wait -- there's a reason it's called an illusion. When you look at the larger context -- the paper the test is written on, your computer screen, gravity -- the line really is vertical, and Americans are seeing this context, this whole, more accurately than anyone else! So in society and politics, we are more likely to act in the interest of all of humanity, or all life everywhere, or biological nature manifesting through our feelings, than to obey family and country. Individualism has its own dangers, but collective insanity is a greater danger, and something like the Cultural Revolution in China could never happen here.
February 27. Related to bees, some good news: Sting leads to charges for illegal Chinese honey importation.
Also, I've just forwarded a reader's request for climate activism to the subreddit.
February 26. Short new post on the landblog/houseblog, linking to a big new page about a project I've been working on for months: building two Top Bar Hives so I can start beekeeping.
February 25. Unrelated links. A reader sends this inspiring article on India's rice revolution, a new growing method that increases yields without degrading the soil or feeding economic domination.
Two reddit posts by the same person about what it's like to be a heroin addict and to use it for the first time.
I also recently discovered reddit's SFW porn network. Here's SpacePorn and you can access all the others from drop-down menus at the top.
February 22. Related to the ongoing subject, a good reddit thread: Am I the only one who thinks Brave New World is a paradise rather than a dystopia? Thankfully, the OP does seem to be the only one... so far. A lot of the responses mention "what it means to be human," but as I mentioned the other day, it is possible to use technology to change human nature. I'd like us to change to become more able to handle freedom, power, responsibility, and risk; but we could just as easily change in the other direction, and become more and more tolerant of being infantilized.
There is another dimension to this issue that I haven't mentioned yet: the possibility that our world is part of an invisible larger world that has a plan for us. There's a lot of buzz about how we could be in a computer simulation, but I think the method must be as alien to us as computers are to cavemen. Anyway, the more interesting question is: Why? Personally I think it has something to do with learning, because when I try to learn, everything falls into place, but when I try to have too much fun I get slapped down. It's even possible that different people are here for different reasons.
If we entertain the idea that we're here for a purpose, can we reconcile this with the observation that human life is getting more insulated and predictable? Why would that be in the plan? One story is that the insulation is another challenge we have to overcome, that we have to learn to create life from inside us in a lifeless environment. A similar story came to me in a dream last night: we are being turned into a substrate, a blank slate for another level of creation. If you want to write, you don't buy paper covered with a chaos of words -- you buy clean white paper with uniform lines. Human life is being turned into clean white paper with uniform lines, a square blank canvas, a formatted disk drive. What are we going to put on it?
This is getting off the subject, but I also had a dream in which music of the future will expand in the dimension of variations on the same song, so that instead of listening to one version of one song, our new consciousness will enable us to hear thousands of versions at once and listen to the changes.
February 20. Last week I argued that it's better for us to have more freedom and power, which means more opportunities to harm other people and ourselves, even if it leads to more people being harmed. Obviously this has to be done in moderation, which is why I support moderate restrictions on guns and drugs and driving. Because if we increase freedom too fast, it can create so much trauma that there's a popular backlash, and a regression toward less personal power and more power monopolized by domination systems.
But there is a deeper issue: what are we working toward as a species? Suppose we end up in an all-powerful surveillance state, designed so that even if every human is a murderous psychopath, nothing bad can ever happen, and we all sit around taking euphoria drugs and consuming entertainment. Have we won, or have we failed? My utopian vision is at the other extreme: no authority, no restrictions, nothing is ever locked, the world is full of dangers, and yet it's rare for anything bad to happen, because humans have attained godlike levels of personal virtue and awareness. We may never get there, but I think that's what we should be aiming for.
More generally, I could frame the conflict between these visions as a conflict between internal progress and external progress. Is it more important to improve human nature or improve the human condition? Along another axis there is a political question: how far do you go in using central control? And which improvements work with or against central control?
I've noticed that technological progress is usually described as if improving the human condition is the only thing that matters. So if we build machines to do everything for us, that's good, even if we lose the ability to do anything for ourselves. I see tremendous potential to use technology for internal progress, but there is also the danger of improving ourselves in the wrong way.
February 16. Thanks Jane for this inspiring article on The Street Kids of San Francisco.
At the other extreme, thanks Leigh Ann for this reddit thread on the hilarious police blotter for Atherton, California, where the median house price is over four million dollars. Among reports from other cities of shootings and stabbings and robberies, in Atherton you get stuff like "A male was reported to be lying on the ground, possibly writing."
This fits with my new pessimism about the future. Rich people are not born insane -- they become insane because they're so insulated. And if the system doesn't collapse, eventually it's going to be Atherton everywhere.
Also I've just rewritten my February 12 post for clarity. Libraries will be closed Monday so I won't be online again until Tuesday.
February 14. Something I've been meaning to post for a while, the user page of Erinaceous, who as far as I know is the smartest person who regularly comments on reddit.
And something new, a project my friend Adam has been working on, The Art of Gratitude. Sometimes when I'm falling asleep at night I count things I'm grateful for... but more often I count things I would do with omnipotent power.
February 12. Two related links: Why Parents Need to Let Their Children Fail, and Smothered by Safety. I'm thinking, if we're doing this to kids, are we also on the path to doing it to everyone?
This reminds me of the current debate on gun control. There's an assumption in our culture that delaying death is an absolute good. But whatever our purpose is for being in this world, it's not to avoid leaving, since everyone leaves -- it's in what we do while we're here, which probably has something to do with holding power and responsibility, taking risks, and learning to get along with each other. I think guns are good for us because they give us the option to shoot each other and the challenge to not shoot each other. For the same reason, driving is good for us. If society is set up so that it's impossible for anyone to harm anyone else, then there is no power, no responsibility, and no need for learning.
This is my new vision of doom: politically it's much easier to increase safety than to increase danger, so we're going to see ratcheting tightness, in the name of safety, served by powerful technologies. In a hundred years only rich people and servants of the control system will be allowed to shoot or drive or otherwise put anyone else at risk. In another hundred years, you won't even be allowed to put yourself at risk, and if you think that's silly, consider seat belt laws. Eventually all power and responsibility will be held by artificial intelligence and robots, and they will exterminate us -- not with bullets, but by making life so predictable that we lose the will to live.
February 8. Three week-old links. First, a new View from Hell post, Single Income No Kids: A Supernormal Stimulus Benevolently Exploiting Ancestral Patterns. Sister Y argues that if one partner goes out and makes money, the other partner takes care of the home, and they don't have any kids, everybody wins -- except large institutions that feed on us through the money economy.
Are Placebos Really Sugar Pills? Kas Thomas accuses drug companies of using biologically active placebos, pills that do something while pretending to do nothing. Specifically, they are often designed to cause the same side effects as the drug being tested, so they can advertise that the drug doesn't cause that side effect more than a placebo. Note that "side effect" is a marketing term -- drugs just have effects.
And a reddit comment bashing high-speed rail: "it's a prestige project that costs a metric ass-ton of taxpayer money in return for a fabulous technological achievement that mostly benefits the wealthy." Personally I'm against high-speed anything. Because I am patient and know how to entertain myself, I like life to be slow and cheap. I enjoy riding the train, but even low-speed rail is usually out of my budget. If I were the impossible anarchist dictator, every freight train would be required to have a passenger car, with no attendants, so anyone could hop on anywhere and ride for free at their own risk.
February 6. Loose ends from my last post: in this subreddit thread polyparadigm comments that maybe people have trouble being themselves because they don't know themselves, and we're getting worse at knowing ourselves, because we "build identity in dialogue with one another" and more of our dialogue is with manipulative technologies instead of other people.
Also, this subreddit post argues that people are being forced to be less honest by economic desperation. Class is definitely a factor in this issue. The main reason I don't travel in poorer countries is that it makes me sick to see everyone sucking up to me just because I happen to have more money. It's almost impossible for people of different social class to be emotionally honest with each other, and American wealth inequality is greater now than ever.
I'm in Seattle for the rest of the week, so I'll have more internet time than usual.
February 4. In recent reddit threads, I've noticed something troubling: today's young people seem to think "be yourself" is bad advice. (Examples: 1, 2, 3, 4.) One possibility is that they're wrong, that being yourself is still the best move, but they have unrealistic expectations about the difficulty of life. Obviously it's not true that being yourself will make people like you. It will make them uncomfortable, it will make it harder to find friends, find a partner, find a job; but eventually you will find those things on your own terms and your life will be built on a foundation of authenticity. If you want the most people to like you the fastest, you should pretend to be what they want, or what they expect; but that choice puts you on a path to faking your whole life, being outwardly successful and secretly miserable.
Maybe people just don't get this, but the really disturbing possibility is that the world is changing, that it's getting harder and harder to get away with honesty. This would fit the trend I wrote about a week ago, of using technology to feed back our desires and expectations. If you want someone's attention, you're no longer just competing with other humans in the raw; you're competing with digital entertainment, photoshopped images, carefully managed internet profiles, and online communities self-selected for agreement. Even other face-to-face humans might be on drugs to make them more likeable.
If this is the path we're on, where does it lead? Also related, a post I made in October 2011 about real and fake emotion.
January 31. This reddit thread on Michael Dell buying back Dell Inc. has some great comments about how a company can be ruined by putting it on the stock market, and how the future of business might be privately owned companies. Is this related to the end of economic growth?
January 28. Over the weekend I read Philip K Dick's last finished novel, The Transmigration of Timothy Archer. There's a pivotal scene where three characters go to a medium, intending to talk with the spirit of another character who died. The medium is a fraud, but there's a twist: it's not that she has no psychic powers, but that she has a different power than she claims; rather than speaking with the dead, she can read minds. So she just tells everyone whatever they are already thinking. They think they're looking out a window when they are really looking in a mirror, and this leads to their destruction.
I think this is the key to two of Dick's biggest themes: reality and madness. Reality is looking outward, unreality is looking inward, and madness is when you can't tell the difference.
This reminds me of my favorite critique of technology. In the book In the Absence of the Sacred, Jerry Mander argues that our technological progress is like inbreeding: we have been turning our attention away from the world outside humanity, and deeper into a world created by humans. Look around where you're sitting now. Can you see or hear anything that was not created by humans? How common was this for your ancestors?
I have a challenge for transhumanists. The word can mean a lot of things, but I would ask: where will future transhumans be focusing their attention? With whom will they have relationships? If they're truly transcending humanity, they will be looking outside the world created by humans. I fear that transhumanism is really ultra-humanism, where we not only turn our attention deeper inside the world we created, but change ourselves so that we can't get out.
January 14. My friend Sarah will be traveling around the country this spring with her boyfriend and dog, in a Vardo tiny house on a pickup truck, staying with people and making pies. They're calling it the OccuPIE Pie It Forward Tour, and they're looking for hosts. You can email them at pieitforwardoccupie at gmail.
January 1. A reminder that I am semi-retired from blogging. Mainly that means that I will still post links but only reveal personal opinions once a week or so. When I reveal a personal opinion, you're welcome to email me with a comment, but if you have a comment on something I link to, that's between you and whoever wrote it. The thing I'm retiring from is the feeling that I'm sitting at a table with a thousand people who can talk to each other only by talking to me. Of course you can always post comments on the subreddit.