2020-11-09 | Virus | Armchair Sand
A friend showed me this:
Certainly there is some truth to this; however, in my mind the problem is not humans being too smart. The problem lies in our commitment to analysis and our loss of the intellectual capacity to deal with systems as individuals. In my own world I have watched this fall apart as the systems became more and more overwhelmingly complicated and we ceded holistic analysis for incremental steps to somewhere. While there is some validity to the criticism of analysis paralysis, understanding where we are, where we want to go, and how to get there is fundamental to any kind of change.
But it gets worse... much worse. In the face of systems that are complicated, we are letting others do our communication and operate our systems as a species while we wrangle with these problems. We communicate on private platforms guided by AI for profit, or we just give in to a tribal approach. We identify idols to hate or love. We burn down our corner store in rage, but don't think about the months after. We are unable to put together long-form, written thought. We no longer capture knowledge. We act as cogs in service to the existing machinery that, more and more, is exactly the AI running platforms that sell us stuff, whether it is AI on a puck that sends us fruit or how we get the news.
There is/was no shortage of analysis in the fifties, sixties, and seventies that mapped out what we were doing to the planetary ecosystems. The problem is that the system gets more and more complicated as time goes on and drivers like oil get embedded. Like the problem in IT now, even in the sixties it was almost impossible to holistically address the real issues. Instead of determining what our requirements were and how to get there, we continued to punt along four weeks at a time, manipulated by feel-good party platforms. (Sound familiar?)
This is no new story. We can start at any time, though. Pick any requirement. Do you want to maintain human population levels? What about wealth? Should wealth be determined by markets or something else? What about the environment? Do we want to keep under 2C warming? How do we do that? How do we verify? How do we enforce? If you ask these hard questions, it quickly becomes quite difficult and painful, so we fall back to taking steps. Instead of broader, systemic changes, we focus on easier things that feel good. But the truth is we are destroying Eden.
What I find most fascinating is that what got us kicked out of Eden was knowledge, yet at the same time we cede the responsibility of knowledge to other forces. We cede the responsibility to the higher powers, those running the AI, those managing cloud infrastructure. Instead, we work week by week on small goals, unhooked from the broad system, because we have convinced ourselves we can't own it all ourselves anymore. The thing is, though, that AI run privately has one goal: profit. And, while this is not bad in itself, it is not addressing any of the broad systemic goals. Now, it might be a goal to make people think they are working towards the broader goal by buying their products, but it doesn't take much to realize that it is the same goal of profit, ultimately.
Here is the secret, though, what I realized a few months ago. Any focus on establishing the basic knowledge and engineering aspects of a system is worth it. We will need this at any point in collapse or recovery. While it is difficult to get the cycles before various systems collapse, it is a somewhat fertile time to bring up the questions again. Where are we now? What worked well? What didn't work? Where do we want to go? How will we get there? What is affected by this design, this plan? How can we verify progress? How will we know when we get there? The iterative aspects should be in the navigation, the constant tweaks we make to stay on course (or even to abandon a particular voyage). This is the true value of iterative workstreams; however, ceding the holistic view is what causes broad, systemic failure like the one we are (and will be) experiencing.
Just getting people to start to build knowledge in a way that facilitates this, towards goals that relate to the broad system rather than the outside goals of platforms, is a huge step forward. What is even crazier is that this is now possible in a way that can be defined with agreed-on meaning (a big change from just the written word, and one certainly being leveraged by the AIs-for-profit).
Please don't take that to mean that systems analysis is the only answer. It is certainly something involved in my answer. It is certainly the largest problem (oil-ecosystem-population, etc.). It is difficult to even imagine changing things without a systems perspective. But not everybody thinks that way. Not everybody has those skills or that interest. The author in the vid gets us to think. Changing systems takes traditional skills, not necessarily knowledge skills, particularly at the level needed for shared knowledge. Ideas and plans don't build things in themselves. (And the fantasy drone infrastructure and self-driving car system with robots is far more complicated as a system and still comes mostly from oil. We likely will need good old-fashioned concrete as part of the mix.) We will need any and all forms of communication: art, writing, poetry. Tossing out individual analysis and the ability to build knowledge, though, is not going to be the solution. And we need to own it rather than ceding it to the sky god and just leaving Eden. Again... we can do this at any stage of systemic collapse.
One other thing. It is certainly valid to say that we should refuse to eat the apple at all. From my perspective, building knowledge really starts with written language, so you would have to forbid written language to fit with the story. We could insist on pure tribal. That is, rather than intellectual models and science, rather than calculations and knowledge necessary to support large urban centers, rather than the tools and weave needed to support civilization, we go tribal, pure tribal.
But it seems to me that once knowledge seeps in, it takes over in the form of power, so either everybody agrees they will leverage knowledge in a smart way, or they just diligently destroy all tools. But that will also mean that 95% of our population dies. As it is, likely 80% will anyway. Well... as it is, like some Huxley dystopia, we will cede it all, iterate in four-week steps to nowhere, working for the oligarchy that is running the AI privately for profit, convincing us that the real problem is individual identity issues, preying on our cultish personality fetishes, and buying their safe houses in New Zealand as the global ecosystem fails for large mammals. I'm going for the knowledge-tools-for-everybody approach.
The answer is not Mars.