@@ -112,6 +112,8 @@ To recap:
1. Allocating cognitive bandwidth requires shared values, because measurement is a drain on efficiency and is not needed to the same extent when there is mutual trust.
2. Trust can be created through finding shared values; this process can be reasoned about as "resolving merge conflicts" in belief systems.
+That's it.
+
## HCIM
Here is a model of how a computer serves a user as a communication device.
@@ -121,23 +123,19 @@ On the picture there are four arrows.
We need to program these arrows.
-Programming happens in layers; there is the versioning of data, the building of programs and the running of programs.
-
-Solving programming is very difficult, we'd need to get rid of turing completeness and that may not be feasible in the short term (i.e. pre-climate crisis devastation) and we'd also want it to be approachable which is like learning to run before you know how to walk.
-
-Solving build systems is more feasible but not as high leverage for the vast majority of users as solving version control (and a build system can leverage better version control).
+The purple arrow has several layers: there is the versioning of the data that the user inputs, then the building of programs from that data, and then the running of those programs to transform further data.
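+To make those layers concrete, here is a toy sketch (the file names and contents are made up, and it assumes it runs inside an already-initialised git repo): the user's data gets versioned, a small program gets built from it, and that program runs to produce output.
+
+```python
+import json
+import pathlib
+import subprocess
+
+# Layer 1: version the data that the user inputs (hypothetical file name).
+config = pathlib.Path("greeting.json")
+config.write_text(json.dumps({"name": "world"}))
+subprocess.run(["git", "add", str(config)], check=True)
+subprocess.run(["git", "commit", "-m", "update greeting"], check=True)
+
+# Layer 2: build a program from that data.
+name = json.loads(config.read_text())["name"]
+program = pathlib.Path("greet.py")
+program.write_text(f"print('hello, {name}')\n")
+
+# Layer 3: run the program to transform some more data (here it just prints).
+subprocess.run(["python3", str(program)], check=True)
+```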
-Version control is just democracy in disguise, that is, it makes the trade-off that availability may be sacrificed for consistency and partition tolerance. It is also the first step to making the user into a "technical" user.
+Getting rid of Turing completeness and having an accurate model of the kinds of computations a computer can perform is a big goal, but it may not be feasible in the short term (i.e. pre-climate-crisis devastation). However, the fundamental step in configuration is not automation but versioning.
-I'd argue that the next step after NixOS (which tries to make configuration a faithful representation of the computer) is to escape the local optima of Git and build a "proper version control system", i.e. let's make it easy to manage personal data like secrets.
+Version control is actually just democracy in disguise: at least from the perspective of the CAP theorem, they both make the same trade-off, sacrificing availability for consistency and partition tolerance.
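+To make that trade-off concrete, here is a toy model (my own sketch, not how any real version control system is implemented): a write is simply refused whenever the writer has not seen the current head, so the system gives up availability rather than accept divergent history.
+
+```python
+class Repo:
+    """Toy consistency-first store, in the spirit of a version control remote."""
+
+    def __init__(self):
+        self.history = ["root"]
+
+    def head(self):
+        return self.history[-1]
+
+    def push(self, parent, commit):
+        # Like a non-fast-forward push in git: a stale writer is rejected
+        # (no availability) instead of being allowed to fork history
+        # (no consistency).
+        if parent != self.head():
+            raise RuntimeError("rejected: fetch and merge first")
+        self.history.append(commit)
+
+repo = Repo()
+repo.push("root", "a")   # accepted: the writer saw the latest head
+repo.push("root", "b")   # raises RuntimeError: the writer's view was stale
+```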
-We already have git, git-annex and pass so I guess we can start by wrapping these tools and making some sort of state daemon for userspace data. I'm currently applying for a grant to work on this.
+If we want to make users into "technical" users then we need them to know how to use version control. For this reason, among others, I'd like for us to escape the local optimum of Git and build a "proper version control system", but we'll see how it goes.
-but we also need to build the final layer, the window manager. However I'm not going to talk about it because there's too much to say.
+Anyway, the next step after NixOS (which tries to make a configuration file a faithful representation of the computer) is to give the user a version control system to traverse the space of known configurations, as well as to manage secrets and make backups. This will create demand for the generation of new configurations, extending the space of possibilities for the user. That is a great feedback loop, and I believe it will take us to "the year of the Linux desktop".
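+As a very rough sketch of what such a tool could feel like (the names and behaviour here are entirely hypothetical, and it just wraps plain git commands rather than any real project), traversing the space of known configurations might amount to tagging and checking out snapshots of a config directory:
+
+```python
+import subprocess
+
+def git(*args: str) -> str:
+    """Run a git command inside the user's config repo."""
+    return subprocess.run(["git", *args], check=True,
+                          capture_output=True, text=True).stdout.strip()
+
+def snapshot(label: str) -> None:
+    # Record the current configuration as a known point in the space.
+    git("add", "-A")
+    git("commit", "--allow-empty", "-m", f"config: {label}")
+    git("tag", "-f", f"config/{label}")
+
+def traverse(label: str) -> None:
+    # Jump back to a previously recorded configuration.
+    git("checkout", f"config/{label}")
+
+snapshot("laptop-2024")   # hypothetical labels
+traverse("laptop-2024")
+```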
-So lets assume we finish cleaning all the dirty state and we finally make the computer a usable device.
+The final layer, the window manager, is also very important, and in my opinion it should use the same semantics to make the traversal of configuration more intuitive. However, I'm not going to talk about it here because there's too much to say.
-Then the "year of the linux desktop" will happen naturally, the users go where they get greater agency and leverage in their daily struggle so long as the up front cost is not higher than the perceived value.
+So let's assume we finish cleaning up all the dirty state and we finally manage to make the computer a usable device.
[Flip back up]