-----
id: pragmatic-permacomputing
title: "Pragmatic Permacomputing"
draft: true
subtitle: "Considerations on building practical and resilient software"
content-type: article
timestamp: 1747484731
-----

[Permacomputing](https://permacomputing.net) should be taught at school. It should provide a _forma mentis_ for future engineers interested in building software and hardware that is meant to last, rather than doomed to be thrown away after a relatively short time.

> [Permacomputing] values maintenance and refactoring of systems to keep them efficient, instead of planned obsolescence, permacomputing practices planned longevity. It is about using computation only when it has a strengthening effect on ecosystems.

-- Devine Lu Linvega on [Permacomputing](https://wiki.xxiivv.com/site/permacomputing.html)

Interesting stuff, no doubt, but _why_? What drives this semi-underground, off-the-beaten-track movement that aims at doing more with less, recycling old hardware, and building _resilient_ software?

[Collapse OS](https://collapseos.org) and its less radical brother [Dusk OS](https://duskos.org) are two examples of software meant to be used at the [first and second stage](https://collapseos.org/why.html) of a [collapse of civilization](https://collapseos.org/civ.html) that is both _imminent_ and _inevitable_. Scary stuff. A bit over the top, if you ask me, and these two remarkable projects regularly get [criticized](https://news.ycombinator.com/item?id=43482705) on Hacker News for being excessively alarmist. Also, in a future where humanity is no longer able to produce computers — maybe due to a sudden catastrophe like a nuclear holocaust, alien invasion, zombie apocalypse, ...take your pick — people would be more concerned with survival than with programming some old computer in Forth.

I do think, however, that permacomputing can be a very practical philosophy for developing or choosing software and hardware.
You can definitely be pragmatic about it, and do something good for yourself and the planet in the process.

### Realistic motivations

There are definitely more down-to-Earth motivations to embrace permacomputing than imminent civilization collapse. Here are a few:

- **Temporary or partial infrastructure failure** — Think earthquakes, blackouts, terrorist attacks, cyber attacks, civil unrest, and the like. Nasty, but definitely plausible, as they have all happened already. Still on the alarmist side of things, but if you are 100% reliant on the Internet, would you be OK if you couldn't connect to it for a day? How about a week?
- **Lack of financial resources** — Imagine not being able to afford a new laptop or smartphone. Can you make do with older hardware?
- **Lack of free time** — You are studying at high school or uni, and building software using the latest stuff. You don't mind using hundreds of NPM packages and keeping them up-to-date every week. Fast forward a few years: you have a family and kids. Your priorities change, but you still want to run your own web site and apps, even if you don't have two hours of spare time per day, or per week, even.
- **Save money on a VPS, or self-host on a Raspberry Pi** — A VPS is a fairly cheap way to run your own server. For four bucks per month, [DigitalOcean gives you a droplet](https://www.digitalocean.com/pricing/droplets) with Linux on it, and you can install whatever you want on it and run it 24/7, as long as 512MB of RAM is enough for you. You want more? You pay more. You can get 4GB of RAM for $24/month, for example: are you OK paying that amount? Even then, your laptop has what, 16GB of RAM these days? Forget running bloated software there. The same applies if you plan to self-host on your Raspberry Pi: you are going to have to deal with more resource-constrained hardware... but that's a _good thing_, because it forces you to re-evaluate your software stack and often go for less-bloated alternatives.
- **Services shutting down (or increasing prices)** — Only tangentially related to permacomputing, but surely one of the biggest reasons to go the [self-hosting](https://github.com/awesome-selfhosted/awesome-selfhosted) route. Are you OK using a "free" service that may shut down or go premium with minimal warning? Can you host it yourself? Can you implement a program that does the same thing?

Your mileage may vary, but chances are that you have already experienced at least one of the scenarios above.

### Recycle and salvage old hardware

One of the first steps towards reducing the amount of e-waste that gets generated every year is realizing that _you may not need_ the latest and greatest laptop, or the latest iPhone. You don't need to change smartphone every year, especially because — let's face it — upgrading your phone is nowhere near as exciting as it was in the 2010s. You get what, a better camera? More GBs? Even higher processing power? Sure. But do you _really_ need it? Maybe not.

I am currently using a three-year-old iPhone 14 Pro. I used to change my iPhone every two years, and give the previous one to a family member. We have quite a few iPhones in the family, all the way back to the iPhone 6, and, guess what, they all still work perfectly. Hell, my 2nd generation iPod touch from 2009 still works perfectly! Sure, the battery may not last as long after a few years, but for the most part, Apple devices are still very well-made and durable. You should never throw away one of those devices: you should put it to use in one way or another!

Anyhow, back to my three-year-old iPhone. This thing has a six-core CPU, an Apple A16 GPU, and 6GB of RAM. Those specs are _stupidly_ high. Think back to the average VPS that you can get for $5/month, or think about your first computer... I got mine back in '98, and it had only 64MB of RAM. A few years later, I expanded it to 128MB and I was able to run — albeit sluggishly — Windows XP on it.
Fast forward 20 years, and you can still run Windows XP [in your browser](https://lrusso.github.io/VirtualXP/VirtualXP.htm)!

While a few purists may not like it, the fact that these days a lot of very powerful proprietary devices are at risk of being thrown out is a reality, and something we should act on. There are a few projects aimed at leveraging the high-end specs of these devices and giving them a second life, but there should be more. Also, I wish big companies like Apple and Google would commit to keep patching their old operating systems, or at least open source them and let volunteers do it.

### Portability: target multiple platforms and architectures

When it comes to portability, I think that pretty much nothing beats [virtual machines](https://wiki.xxiivv.com/site/virtual_machines.html) and emulators. That's why it is still possible to run old NES games like the original Super Mario Bros. on much newer hardware than originally intended. Alternatively, some popular games like GTA San Andreas have been successfully _ported_ to many architectures and systems (yes, I have been re-playing it on my iPhone just recently).

If you rely on a piece of software, or if you decide to make your own, you should make sure it runs on as many operating systems and as many architectures as possible. Implementing your software as an [αcτµαlly pδrταblε εxεcµταblε](https://justine.lol/ape.html) is probably today's best example of portability: a single executable file able to run (as-is and _without_ being recompiled!) on Linux, macOS, Windows, FreeBSD, OpenBSD, and NetBSD, on both the ARM64 and AMD64 architectures. Justine Tunney explains why this is important to her:

> One of the reasons why I love working with a lot of these old technologies is that I want any software work I'm involved in to stand the test of time with minimal toil.
> Similar to how the Super Mario Bros ROM has managed to survive all these years without needing a GitHub issue tracker.

Speaking of ROMs, [Uxn](https://wiki.xxiivv.com/site/uxn.html) is another example of portability and permacomputing at its finest. The virtual machine is devised to be simple to implement, so that it can be ported to many platforms relatively effortlessly (if you are into that sort of thing). Uxn is remarkable for other reasons as well, as we'll see later on in this article, but especially because it is based on the principle that if you can implement the virtual machine in any way you want, you will be able to run pre-compiled ROMs forever.

Of course, Justine and Devine are one-of-a-kind programming gurus, but the good news is that a lot of today's programming languages can build executables that run on different platforms without code changes: Rust, Go, and of course C (with some caveats), to name a few. In my case, I picked [Nim](https://nim-lang.org) to implement most of my very own [personal ecosystem](https://code.h3rald.com), which can run on macOS, Windows, and Linux. Maybe not the most popular choice (read on for more on _popularity_), but it is easy to use and does the job.

### Reduce dependencies on 3rd party code (within reason)

When it comes to managing dependencies in software and the current state of things, no one describes it better than [XKCD #2347](https://xkcd.com/2347/). Certain ecosystems (NodeJS and modern JavaScript, I am looking at you!) encourage the proliferation of dependencies. The more dependencies on third-party code your program relies on, the higher the chance that something will change and break things. This is the reason why I decided to implement [my own static site generator](/hastysite/) instead of using one of the millions of existing ones out there.
It was a pragmatic decision, mind you: changes in my life reduced the amount of time I had to tinker with code and stay on top of an ever-changing ecosystem (and I was using Nanoc and Ruby, nothing too crazy, even).

My recommendation here is not to go too crazy in reinventing the wheel: you _probably_ don't need to reimplement an operating system from scratch, and you _probably_ don't need to implement your own programming language, or blacklist NodeJS forever because of NPM. But you should definitely evaluate your options. I am running a bunch of web services in NodeJS, _but_ I am not relying on third-party dependencies, for example. NodeJS has been fairly considerate in its relatively short lifespan, not breaking things too often, even between major versions. It offers LTS releases, so all I have to do is keep the system patched automatically and upgrade every couple of years. For me, right now, that is manageable.

And if you decide to rely on a closed ecosystem like Windows or iOS, that may also be fine: there are programs that can still run on Windows after 20 years, and (nearly) the same can be said for some iOS apps. Other considerations apply in this case, like interoperability of file formats, but it is important not to dismiss proprietary systems just because they are proprietary: the reality is that a lot of e-waste today consists of devices running proprietary systems. Sure, you could wipe a Windows laptop and install Linux on it, if you are so inclined and comfortable with it, but if you are not, that's fine too.

### Interoperability

For years I have been dreaming about using some sort of Microsoft Excel alternative. There has got to be something out there that does the job, maybe using CSV files or similar... and there is, probably. Devine recently made [Nebu](https://wiki.xxiivv.com/site/nebu) for Uxn, and I think that's a remarkable feat of engineering. Would I tell my father to use it? Probably not. Could I use it at work?
Maybe, but as sad as it is, people would start wondering why I am suddenly using CSV files instead of XLSX, so... let's just say that would be impractical.

The important thing here is that in a lot of cases it's _fine_ to choose certain applications, proprietary or not, as long as you have a way out: a way to export your data to a common format that can then be imported into something else. We could debate the inherent evil of the XLS or XLSX format, and how horrible and inconsiderate it is in certain respects, but at the end of the day:

- It can be converted to CSV, and then you are "free" again
- It can be imported and managed by many competitors of Excel
- We have been using it for decades and it's still there; you can probably open an .xls file created twenty years ago
- It is _reasonable_ to expect that it will still work for the next twenty years

Sure, Microsoft switched from XLS to XLSX, but hey, we survived.

All this applies to text editors, spreadsheet editors, or whatever software uses a specific file format to save its state: there are thousands of Markdown editors out there, for any platform. It really doesn't matter which one you pick, as long as it uses Markdown as a format. Does your favorite editor become unmaintained? No problem, you can pick another.

Looking at my specific case, I am relying on a specific flavor of Markdown ([Discount](https://www.pell.portland.or.us/~orc/Code/discount/)) with some custom extensions for macros and snippets for my own [HastyScribe](/hastyscribe/) and [HastySite](/hastysite/) projects. That's probably a red flag: if you want to use HastyScribe as a Markdown processor (few other than me do), you have to live with the fact that:

- It does not use _standard_ Markdown
- It relies on a 3rd party dependency
- It is written in Nim

I can live with all that, but I am also the sole maintainer.

### Understandability and open source

I put understandability and open source in the same section because they go hand-in-hand 90% of the time.
The assumption here is that if you don't have access to the source of a program, chances are you will not be able to fully understand how it works. There are probably other reasons for open sourcing your software, but as far as permacomputing goes, understandability is the most important one. If a piece of software has to withstand the passage of time, it has to be understood by others.

Again, there are extremes: a basic Uxn implementation is 100 lines of C and [fits on a napkin](https://wiki.xxiivv.com/etc/uxnmin.c.txt). The idea here is that it should be easy to understand and re-implement in other programming languages and on other systems.

### Run on limited memory/CPU

### Local-first (avoid AI, cloud, Internet, containers)

### Build for resilience

### Popularity