In industrial mass production, the more nails or screws you make, the cheaper each one becomes. In software, the cost goes from enormous for the first copy to negligible for every copy after. I'm trying to get at something about software (and, to a lesser degree, other technology): making things more complex also makes them cheaper. It's much easier to construct a complicated piece of software than a simple one; it turns out the cost just got transformed into complexity.
More precisely, a piece of software that is extremely flexible, powerful, and useful in many cases is also quite complicated. But the complexity does give something back: it allows a centralization of power, and there's something appealing about having one hammer that works for all nails.
A couple of people mention that when things get too complex, someone comes out with a stripped-down alternative that takes over, and then in turn gets more complex. This has happened many times with music, but I can't think of any recent examples in IT. Do we really expect a new kind of computer and operating system, as simple as 1995, that you can still use to check your bank balance and buy stuff online?
I actually think that zero-growth complexity is possible. Consider sharks. They've been the same for hundreds of millions of years, and we could do a lot with a social system, or a tech system, as complex as a shark.
But that's not going to happen this time around. And with no way to freeze complexity, or do a clean reset, it can only keep rising until there's a messy reset.
What the heck are we doing, making things more and more complex, so that fewer and fewer people know how they work?
My beloved Ubuntu Linux, formerly elegantly simple, now has 4 packaging systems. I played with 'flatpak' yesterday to see new features in a PDF viewer I use constantly. I was confident that there would be no changes to my system - that's the whole idea of flatpaks.
It bricked my system in a way that I haven't seen in 20+ years of using Linux. It took me five hours to fix, and I'm not sure exactly what happened, because I had to take big whacks at the problem (i.e., deleting entire caches).
Not long ago, these things were worse, but were at least understandable - I knew the boot process of my PC, email was plain text, and you could watch clients and servers communicate in plain text.
I think the reason things keep getting more complex is the same reason that Elvis and Michael Jackson died. Each of them had a personal doctor with only one patient, and each doctor got bored doing nothing and had to justify his existence by doing a lot of unnecessary and ultimately harmful stuff. That's what the engineers (and managers and executives) of tech companies are doing. If they don't make upgrades, they feel useless, and I guess it's really hard to upgrade something without making it bigger or more complicated.
Maybe in the future humans will be able to enforce a law that puts a hard ceiling on the size and complexity of systems. So a computer operating system is limited to X lines of code, or the laws of a nation are limited to X words, and going above that is a crime.
Until then, it's runaway complexity and collapse, over and over.
To me, the most obvious rebuttal to the tragedy of the commons is the roommate who picks up after everybody else. It sucks to be that guy. I've been that guy. But I wasn't going to wash a cup every time I wanted a cup.
The tragedy of the commons assumes no one will care about their surroundings unless they fully own them. It's a weird thing to assume.
It also speaks to a weird sick pattern of possessive people: they express care for others/things in proportion to how much they can control others/things. There's also the weird sick pattern of assigning value only to that which can be controlled.
This reminds me of a Steven Wright line: "I have the world's largest collection of seashells. I keep it on all the beaches of the world."
And some music for the weekend. The other day, on weed, I did an experiment, where I put this Seraphim Simulation into this YouTube looper, and play-tested a bunch of psychedelic music to see what made the best combo. The winner: Moon Duo - In The Trees.
It's a bit sad to think of all the high school kids turning their backs on building treehouses and sitting in class dutifully learning about Darwin or Newton to pass some exam, when the work that made Darwin and Newton famous was actually closer in spirit to building treehouses than studying for exams.
And on the subject of eugenics, I've been reminded that the definition of that word is broad enough that it can point to two things with no overlap. First, you can have laws that say people with bad genes aren't allowed to procreate -- but inevitably, the definition of "bad genes" is calculated backwards from whatever makes the people in power uncomfortable.
Second, on a completely voluntary basis, we could use technology to improve the human genome. If biotech keeps progressing, this will turn into a huge issue, maybe the biggest issue of the 22nd century. Because after we fix obvious genetic diseases, like Huntington's, we'll get into stuff where the benefit is less clear. Do we want to eliminate sickle cell anemia, which also gives resistance to malaria? Is autism something to be cured, or a valid way of being human?
I expect trends, where the vat-babies of the 2190s have green eyes and wide noses. Or, if different populations go for radically different looks, it could exacerbate tribalism. (Is there a gene for tribalism?) Humans are short-sighted, especially when we're doing something new, so we're sure to make changes that seem to make humanity better, but end up making it worse.
And it's not like we can just tweak a gene to do whatever we want. Little changes will have cascading effects that we don't expect, so that everyone with the gene for pointy ears gets kidney failure.
Personally, I think DNA is overrated, and it will turn out that a big part of who we are, is neither genetics nor environment, but something we haven't discovered yet. Maybe in the 26th century, the biggest issue will be morphic field generators, or ancestral memory wipes.
If you go out and look, land held in common tends to be managed well, and privately owned land tends to be exploited. But in 1968 a eugenicist named Garrett Hardin pulled a paper out of his ass that said exactly the opposite with no evidence, and the owning classes thought it was brilliant.
Does it matter that Hardin was a eugenicist? Yes, because it's the same kind of evil thinking. To support control of human breeding, you have to be comfortable that the people who will be doing the controlling are people like you. So you have to be confident that you are a member of a justifiably power-holding class. Hardin also wrote a paper on "lifeboat ethics," again a thought experiment with no evidence, arguing that it's bad to give money to the poor.
Note that the ecological destruction of the modern era is not an example of the "tragedy of the commons," but the tragedy of central control and private property. Related, from 2007: Iain Boal: Specters of Malthus, a smart interview arguing that population only outruns food supply when there's non-local control of resources.
I should also say, to reduce the human population, we only need two things: easy access to birth control, and some way of supporting old people who don't have kids. If people don't need to have kids for economic reasons, and if women aren't forced to be baby factories, then the birthrate drops to sub-replacement.
More negative links. Amazon Prime Is an Economy-Distorting Lie. Basically, by forcing third-party sellers to keep their prices high, and charging them massive commissions, Amazon subsidizes its free shipping. Without illegal monopoly practices, Amazon's business model falls apart.
A lot of pandemic homeschoolers are not going back. On the same subject, by Rebecca Solnit, Abolish High School.
Finally, The Age of Autonomous Killer Robots May Already Be Here, because last year in Libya a weaponized drone hunted down a human target without being told to. If we want to avoid the normalization of killer robots, we need a law that explicitly denies robots the right of self-defense. So if you try to destroy a drone, the most it can do is take your picture and use it to bring charges for vandalism.