"The bigger you build the bonfire, the more darkness is revealed."
- Terence McKenna
April 3. Today, some links about emotional problems that I forget other people have. First, from Aeon magazine, While one person dabbles in drugs with few ill-effects, another will become a chronic addict. What's the difference? The author rejects the idea that addicts are morally depraved, and also that they're helpless and have no control. Instead, they're choosing to stay on their drug because the alternative seems even worse:
They are usually people who suffer not only from addiction, but also from additional psychiatric disorders; in particular, anxiety, mood and personality disorders. These disorders all involve living with intense, enduring negative emotions and moods, alongside other forms of extreme psychological distress... They are unlikely -- even if they were to overcome their addiction -- to live a happy, flourishing life, where they can feel at peace with themselves and with others.
From the confession subreddit, I'm a skydiver with over 500 jumps... What I don't tell them is that I would be relieved if I threw out my pilot chute and nothing happened.
And Procrastination Is Not Laziness, in which the author discovers that he procrastinates because he's afraid of failure.
I can't relate to any of this. I procrastinate because retiling my shower is a painful chore and playing video games is so much more fun. The closest I come to depression is when everything I do feels like walking uphill and I just want to sleep all day. When that happens, I bite the bullet, force myself to do stuff I hate in the service of my future self, and in a few days I feel good again.
But I'm wondering if some of my writing is harmful to people who are not like me. Some people have a much harder time forcing themselves to do stuff, and they dream of a magical world in which they can just do what they feel like all day and everything works out. These people need more external structure, not less. They should not try to drop out of society -- they should get a job where they are surrounded by other people to motivate them.
This is related to Monday's final link, about giving money directly to the poor. There is a movement to give everyone a guaranteed basic income, and a reader sends a new article about it: Helicopter money: Federal Reserve should print money and give it directly to households. I support this, and I expect it to happen as soon as corporations realize that they no longer need people as workers (because of automation) but still need them as consumers. Economically it's perfect, but psychologically it could be a terrible ordeal for ordinary people who are not yet able to create their own structure and meaning.
April 1. Unrelated links. The Unintended (and Deadly) Consequences of Living in the Industrialized World. It's all about how exposure to dirt makes us healthier. But there's no practical reason that industrialized kids can't be exposed to dirt -- only cultural stupidity.
Also from Smithsonian magazine, What Major World Cities Look Like at Night, Minus the Light Pollution. A photographer has worked out a technique to make modern cities look like all the lights are out and you can see astronomically accurate dense starry skies behind them.
Something Other Than Adaptation Could Be Driving Evolution. "Speciation might not only be an evolutionary consequence of fitness differences and natural selection, but a property intrinsic to evolution, just as all matter has gravity."
Want to Help People? Just Give Them Money.
Investments in common goods such as roads, schools and wells are critical in helping people out of poverty. But GiveDirectly has a new concept: What if cash transfers are used as a standard benchmark against which to measure all development aid? What if every nonprofit that focused on poverty alleviation had to prove they could do more for the poor with a dollar than the poor could do for themselves?
March 29. New post on the landblog/houseblog about building three mini/bait hives. I've also tacked it onto the bottom of the Top Bar Hive page, and taken a prettier photo of the finished hives. I pick up my bees on April 13. I haven't mentioned it in a post yet, but I've planted a patch of strawberries, some more raspberries, and I'm preparing places for lingonberries and more blueberries.
March 27. A week ago on the subreddit, this post suggested that we can think of the corporation as a form of artificial intelligence. Today Leigh Ann sends two links with the same idea: The Singularity Already Happened; We Got Corporations, and a Charlie Stross post from 2010, Invaders from Mars.
March 26. Another good article about de-extinction: Efforts to Resuscitate Extinct Species May Spawn a New Era of the Hybrid.
Loosely related: Becoming the All-Terrain Human, about the world's best endurance runner. How long until biotech is able to give these skills to everyone?
He regularly runs all day eating only wild berries and drinking only from streams. On summer mornings he will set off from his apartment door at the foot of Mont Blanc and run nearly two and a half vertical miles up to Europe's roof -- over cracked glaciers, past Gore-Tex'd climbers, into the thin air at 15,781 feet -- and back home again in less than seven hours, a trip that mountaineers can spend days completing.
March 25. Thanks GB for a $50 donation. In other personal financial news, I've started to research Obamacare rules, and it looks like they've eliminated the asset test for Medicaid. This is great news for people like me with high assets and low income, and it's a smart move overall, because the administrative cost of testing for assets exceeds the savings of excluding people. Of course, the American medical system remains the worst in the world in terms of benefit vs cost, and I still don't see any politically possible way to fix it.
March 21. Appropedia is a new green living wiki. And on a similar subject, there's always good stuff on No Tech Magazine.
This reddit comment explains why it sucks to be an opera singer, and how opera companies are becoming more short-sighted in developing talent. And by a former reporter, Why I left news. I would say, as a general rule, the more glamorous a job sounds the more likely it will destroy you, and the more boring a job sounds the more likely you'll be happy. Or, there is an inverse relation between how much you enjoy what you do, and how much you enjoy answering the question "What do you do?"
March 20. Four links about biology. I've seen a lot of articles lately about bringing back extinct species, and this is my favorite: Cloning Woolly Mammoths: It's the Ecology, Stupid:
Is one lonely calf, raised in captivity and without the context of its herd and environment, really a mammoth? ... Perhaps the best course of action is to first demonstrate that we can effectively manage living rhinos and elephants before resurrecting their woolly counterparts.
Swallows may be evolving to dodge traffic. The evidence is that fewer birds are being killed by cars, their wings are getting shorter, and birds that are killed by cars have longer wings than birds caught in nets.
Ten predictions for the future of your microbial health, speculating that coming research will show that having a high diversity of good bacteria -- on your skin, in your gut, etc. -- is better than aiming for total sterility and ending up with an environment that favors bad bacteria.
And an excellent reddit comment about how bees swarm.
March 18. Some personal stuff. Last fall I grew a beard, and this is my new look. I just shaved it off for spring, but after getting used to the bearded look, I like it better than the shaved look and I'm going to grow it back. Notice the beehives in the corner of the photo. I just updated my Top Bar Hive page with a photo of the hives in their final position with the lids painted, and added some new info in the big paragraph about wax and comb.
I did my taxes, and my income last year was $8500. I didn't have to file but filed anyway to document my low income so I can try to get Medicaid under Obamacare. I overspent my income by a couple thousand dollars, so I'm getting more serious about frugality, and one thing I'm doing is eating less meat. Adding up my cart the other day at Costco, I noticed that a pack of two small organic chickens was over $20, and a giant bag of potatoes was under $5, and I know that potatoes are almost complete nutrition, so I put the chickens back. I've also been making lots of lentil soup after a friend found a local farmer selling dirty chaff-filled lentils for ten cents a pound. The dirt rinses out, the chaff can be picked out in half an hour for a large batch, and the flavor is just as good as the premium lentils I was buying for fifteen times the price.
If anyone likes tart cherries, Costco is also selling Evans (a.k.a. Bali) cherry trees for under $12. Evans will eventually surpass Montmorency as the standard for tart cherries. I don't know what the rootstock is but I bought one anyway and stuck it probably too close to the peach tree.
I'm still going to the library for internet, and still loving it. I get a nice bike ride every day, spend a few hours online, and then the late afternoons and evenings are open to do other things... not all of them "useful". I've been playing Twilight Princess on Wii and last night I beat the mini-boss in the desert dungeon. I've also been reading Charles Stross, and I want to read the first three books of Iain Banks's Culture series, but the library doesn't have them and my Kindle remains broken, so maybe I'll finally sign up for Paperback Swap.
March 15. Ten days ago Paula made a post, On Canceling Collapse, about my developing no-crash position, arguing that I've collapse-proofed my own life to the extent that collapse is invisible to me.
I think the word "collapse" is confusing us by blurring several different issues. One of them is poverty, and my position on that has not changed: for all the usual doomer reasons, there will be a lot more desperately poor people, and we should all try to change our lives to need less money.
The issue on which I've changed my mind is the stability of big systems, and it's strange that everyone thinks I've become more optimistic, when really I've become more pessimistic. Go back and read stuff like The Coming Expansion or How to Survive the Crash and Save the Earth. I was a doomer optimist. For my entire life the world has been getting tighter and tighter, and I was hopeful that I would see it crack open into freedom and possibility. I'm too young to remember the 1960's, but I remember as a kid roaming the neighborhood unsupervised, I remember when small airports had no security at all, when surveillance cameras were rare, when there were no seat belt laws, when you could share information without having to overcome DRM, when only young people had to show ID to buy alcohol.
My fear now is that it will never again get looser, that a tech crash is almost impossible, and that technology will make the control systems more airtight while channeling our urge for freedom into artificial worlds. You could even argue that this is part of the "collapse": increasing poverty causes increasing crime causes increasing fear causes increasing popular consent for strong central control.
March 13. Continuing on Monday's subject, a reader sends this page about complexity of value and how difficult it is to encode human values into a system of rules:
Because the human brain very often fails to grasp all these difficulties involving our values, we tend to think building an awesome future is much less problematic than it really is. Fragility of value is relevant for building Friendly AI, because an AGI which does not respect human values is likely to create a world that we would consider devoid of value.
Another angle: The Best Intelligence Is Cyborg Intelligence. I think this is where we'll be for the rest of this century, because no matter how powerful computers get, it will always be easier to combine machine and human intelligence than to duplicate human intelligence with a machine. The more interesting possibility is that someone will build a self-improving AI that is not a computer.
March 11. From Aeon Magazine, a smart essay about humanity's deep future and the threat of extinction from stuff we are only now beginning to create. My favorite ideas are not from Nick Bostrom, the guy photographed at the top, but Daniel Dewey, a specialist in artificial intelligence. This is the first time I've seen a plausible analysis of the motivations of a dangerous AI. We imagine that it will be like an evil human, but human motivations come from human nature and human culture, neither of which will motivate a machine. Dewey correctly observes that our AI will have exactly the motivations we give it, and he speculates that it will follow these motivations into consequences that our relatively low intelligence cannot predict.
'The basic problem is that the strong realisation of most motivations is incompatible with human existence,' Dewey told me. 'An AI might want to do certain things with matter in order to achieve a goal, things like building giant computers, or other large-scale engineering projects. Those things might involve intermediary steps, like tearing apart the Earth to make huge solar panels. A superintelligence might not take our interests into consideration in those situations, just like we don't take root systems or ant colonies into account when we go to construct a building.'
It is tempting to think that programming empathy into an AI would be easy, but designing a friendly machine is more difficult than it looks. You could give it a benevolent goal -- something cuddly and utilitarian, like maximising human happiness. But an AI might think that human happiness is a biochemical phenomenon. It might think that flooding your bloodstream with non-lethal doses of heroin is the best way to maximise your happiness. It might also predict that shortsighted humans will fail to see the wisdom of its interventions. It might plan out a sequence of cunning chess moves to insulate itself from resistance. Maybe it would surround itself with impenetrable defences, or maybe it would confine humans in prisons of undreamt-of efficiency.
March 8. Yesterday on the subreddit a reader linked to this Ribbonfarm guest post that explores the WEIRD subject more deeply: Honesty and the Human Body.
Also, I keep forgetting to mention this. Does anyone have a Kindle 3 (a.k.a. Kindle Keyboard) that they want to donate or sell cheap? It's okay if the logic board or battery is bad because all I need is the screen.
March 7. I've just finished the latest update on my 100 things about me page. The biggest block of new stuff is from 26-30.
March 6. Three positive links. This Happy Homestead is a new blog by a guy in my area.
BookOS is a new ebook site. It's mostly pdfs, which are good for reading on the computer. If you have an e-reader it's better if you can find epub or mobi.
And a great reddit comment about the benefits of meditation, focusing on the skill of noticing and dropping unhelpful thoughts.
March 4. There's a lot of buzz about this video, Wealth Inequality in America. Here's a less slick video that goes inside the top one percent to show the wealth continuing to increase to even more absurd levels: The L-Curve: Income Distribution of the U.S.
Something most people are missing is that, past a few million dollars, money is no longer about buying stuff -- it's about political power. And this power is mostly being wasted. Only a few super-rich people are doing anything interesting -- Bill Gates is trying to eradicate malaria, Elon Musk is trying to colonize Mars -- but most of them are ordinary idiots, throwing their wealth/power behind two contradictory goals: to continue to increase their wealth/power, and to keep the system stable.
March 1. Today, people bash American culture and I defend it. Dmitry Orlov's new post, Monkey Trap Nation, points out many ways that Americans are short-sighted, and he's right. But relative to other countries, we are only more short-sighted in the time dimension. We are less short-sighted in the social dimension, and I'm talking about political corruption. I'll define corruption as the use of a position of power to serve a narrower interest than the scope of that position. So instead of serving everyone you have power over, you harm most of them to serve yourself and people close to you. America is corrupt, but most of the world is much more corrupt, and short-sightedness toward other people might be even more harmful than short-sightedness toward the future, because when disaster happens, a socially far-sighted culture can hold together while a short-sighted one breaks into warring tribes.
Why Americans are the weirdest people in the world. This is the best article I've seen on the idea that psychological studies have been skewed by only testing educated westerners, and when you look at all of humanity, we are outliers, and Americans are the most extreme outliers. There's a fascinating argument that a particular optical illusion (see the article) is a cultural artifact of living inside square rooms. Americans are the most fooled by that one, but we're the least fooled by another one, in which a vertical line appears to be tilted by being placed in a tilted rectangle. The idea is that we're more individualistic, and cultures that see the illusion are more holistic, more focused on the context.
But wait -- there's a reason it's called an illusion. When you look at larger context -- the paper the test is written on, your computer screen, gravity -- the line really is vertical, and Americans are seeing this context, this whole, more accurately than anyone else! So in society and politics, we are more likely to act in the interest of all of humanity, or all life everywhere, or biological nature manifesting through our feelings, than to obey family and country. Individualism has its own dangers, but collective insanity is a greater danger, and something like the Cultural Revolution in China could never happen here.
February 27. Related to bees, some good news: Sting leads to charges for illegal Chinese honey importation.
Also, I've just forwarded a reader request for political action on climate activism to the subreddit.
February 26. Short new post on the landblog/houseblog, linking to a big new page about a project I've been working on for months: building two Top Bar Hives so I can start beekeeping.
February 25. Unrelated links. A reader sends this inspiring article on India's rice revolution, a new growing method that increases yields without degrading the soil or feeding economic domination.
Two reddit posts by the same person about what it's like to be a heroin addict and to use it for the first time.
I also recently discovered reddit's SFW porn network. Here's SpacePorn and you can access all the others from drop-down menus at the top.
February 22. Related to the ongoing subject, a good reddit thread: Am I the only one who thinks Brave New World is a paradise rather than a dystopia? Thankfully, the OP does seem to be the only one... so far. A lot of the responses mention "what it means to be human," but as I mentioned the other day, it is possible to use technology to change human nature. I'd like us to change to become more able to handle freedom, power, responsibility, and risk; but we could just as easily change in the other direction, and become more and more tolerant of being infantilized.
There is another dimension to this issue that I haven't mentioned yet: the possibility that our world is part of an invisible larger world that has a plan for us. There's a lot of buzz about how we could be in a computer simulation, but I think the method must be as alien to us as computers are to cavemen. Anyway, the more interesting question is: Why? Personally I think it has something to do with learning, because when I try to learn, everything falls into place, but when I try to have too much fun I get slapped down. It's even possible that different people are here for different reasons.
If we entertain the idea that we're here for a purpose, can we reconcile this with the observation that human life is getting more insulated and predictable? Why would that be in the plan? One story is that the insulation is another challenge we have to overcome, that we have to learn to create life from inside us in a lifeless environment. A similar story came to me in a dream last night: we are being turned into a substrate, a blank slate for another level of creation. If you want to write, you don't buy paper covered with a chaos of words -- you buy clean white paper with uniform lines. Human life is being turned into clean white paper with uniform lines, a square blank canvas, a formatted disk drive. What are we going to put on it?
This is getting off the subject, but I also had a dream where music in the future will expand in the dimension of variations on the same song, so that instead of listening to one version of one song, our new consciousness will enable us to hear thousands of versions at once and listen to the changes.
February 20. Last week I argued that it's better for us to have more freedom and power, which means more opportunities to harm other people and ourselves, even if it leads to more people being harmed. Obviously this has to be done in moderation, which is why I support moderate restrictions on guns and drugs and driving. Because if we increase freedom too fast, it can create so much trauma that there's a popular backlash, and a regression toward less personal power and more power monopolized by domination systems.
But there is a deeper issue: what are we working toward as a species? Suppose we end up in an all-powerful surveillance state, designed so that even if every human is a murderous psychopath, nothing bad can ever happen, and we all sit around taking euphoria drugs and consuming entertainment. Have we won, or have we failed? My utopian vision is at the other extreme: no authority, no restrictions, nothing is ever locked, the world is full of dangers, and yet it's rare for anything bad to happen, because humans have attained godlike levels of personal virtue and awareness. We may never get there, but I think that's what we should be aiming for.
More generally, I could frame the conflict between these visions as a conflict between internal progress and external progress. Is it more important to improve human nature or improve the human condition? Along another axis there is a political question: how far do you go in using central control? And which improvements work with or against central control?
I've noticed that technological progress is usually described as if improving the human condition is the only thing that matters. So if we build machines to do everything for us, that's good, even if we lose the ability to do anything for ourselves. I see tremendous potential to use technology for internal progress, but there is also the danger of improving ourselves in the wrong way.
February 12. Two related links: Why Parents Need to Let Their Children Fail, and Smothered by Safety. I'm thinking, if we're doing this to kids, are we also on the path to doing it to everyone?
This reminds me of the current debate on gun control. There's an assumption in our culture that delaying death is an absolute good. But whatever our purpose is for being in this world, it's not to avoid leaving, since everyone leaves -- it's in what we do while we're here, which probably has something to do with holding power and responsibility, taking risks, and learning to get along with each other. I think guns are good for us because they give us the option to shoot each other and the challenge to not shoot each other. For the same reason, driving is good for us. If society is set up so that it's impossible for anyone to harm anyone else, then there is no power, no responsibility, and no need for learning.
This is my new vision of doom: politically it's much easier to increase safety than to increase danger, so we're going to see ratcheting tightness, in the name of safety, served by powerful technologies. In a hundred years only rich people and servants of the control system will be allowed to shoot or drive or otherwise put anyone else at risk. In another hundred years, you won't even be allowed to put yourself at risk, and if you think that's silly, consider seat belt laws. Eventually all power and responsibility will be held by artificial intelligence and robots, and they will exterminate us -- not with bullets, but by making life so predictable that we lose the will to live.