Teasing out different kinds of permacomputing


I'm responding in part to


this post


but these are also things that have just been on my mind lately anyway, so this is going to be a slightly off-the-cuff post.


Durable computing vs time-capsule computing


Okay, so in their post jsreed coins a couple of really useful terms that I want to use as a launching point.


Durable computing is the idea of being able to keep hardware usable over very long periods. You can continually repair the device. You can update its software so it keeps interoperating with other systems. A computational device is thus durable when there is an infrastructure in place to keep it running.


On the other hand, time-capsule computing is the idea that a device should remain usable even if years have passed without updates or maintenance.


Heirloom computing


Now I want to talk about something I've seen in the literature about "digital heirlooms". The tl;dr of digital heirlooms is that they should be limited, self-contained devices that don't need updates or anything like that. Digital heirlooms are kind of like digital art installations: you basically need to freeze the technology of a particular moment into some kind of stable configuration that depends on as little in the outside world as possible. Digital heirlooms are generally only discussed in the context of grief, preserving memory, things like that.


They're rather literally supposed to be systems that serve the role of a family heirloom that can be passed down from generation to generation, preserving and building memories across time.


But I think we can generalize the concept slightly and set it against these other terms.


Heirloom computing is thus a subset of time-capsule computing. An heirloom computer is so self-contained that it can always stand on its own: it can always be pulled out of storage, turned on, and expected to fulfill its original function. This is, of course, because these heirlooms are intentionally limited computers.


Long-term computing as infrastructure


So I think what jsreed's post made me realize, particularly when thinking about the research into digital heirlooms, is that there are different degrees of reliance on surrounding infrastructure, and tradeoffs between general-purpose computing power and independence from that infrastructure.


An heirloom is something that has zero dependence on the outside world but is also very limited in what it can do.


A durable, but not time-capsulable (not sure how to adjectivize this, but why not), computer is heavily reliant on infrastructure but is in principle able to fully participate in the world of computation.


So what lies in between an heirloom and a computer that is only durable with continual use?


I think it will come down to standards: standards that have well-defined specifications, graceful fallbacks for devices that only implement older versions of the standard, &c.
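

To make "graceful fallback" a bit more concrete, here's a minimal sketch of version negotiation, in Haskell. Nothing here is a real standard; the idea is just that each side advertises the spec versions it implements, and the two settle on the newest version they share, falling back to an older one when necessary.

```haskell
-- Hypothetical sketch: two peers agree on the newest mutually
-- supported version of a spec, falling back gracefully.
import Data.List (intersect)

negotiate :: [Int] -> [Int] -> Maybe Int
negotiate ours theirs =
  case ours `intersect` theirs of
    []     -> Nothing                -- no shared version: fail cleanly
    common -> Just (maximum common)  -- otherwise, newest common version

main :: IO ()
main = do
  print (negotiate [1, 2, 3] [2, 3, 4])  -- Just 3: fall back to v3
  print (negotiate [1] [4])              -- Nothing: no graceful path
```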


And I don't just mean what we normally think of as standards, like the RFCs we're all familiar with; I also mean things like Hundred Rabbits working on Uxn as a portable computing platform: abstraction layers for computing that give a kind of contract that can be upheld no matter what kind of device underlies the layer.


So imagine a future where we have a PermaOS that defines a virtual machine, and we start writing a permacomputing infrastructure on top of it. What if we had graceful versioning of PermaOS and formal translations for compiling programs written for one version of PermaOS into older versions? This latter idea isn't actually crazy. There's already work in the applied category theory world on formally specified migrations between versions of a database, with proofs that the migrations are specified correctly and preserve the same invariants on the data. What if we had formal semantics for PermaOS instruction sets and features, and a translator capable of taking a program in a new version and recompiling it for an older version, with proofs that the semantics were the same modulo performance?
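

As a toy illustration of what such a translation could look like (PermaOS is imaginary, and both instruction sets below are invented for this sketch, not drawn from any real VM): version 2 of a little stack machine adds a fused multiply-add instruction, and a lowering pass expands it into version-1 primitives so that both evaluators agree on every program.

```haskell
-- Toy sketch: compiling programs for a newer VM version down to an
-- older one. Everything here is hypothetical.

-- Version 1 instruction set: push, add, multiply.
data V1 = Push1 Int | Add1 | Mul1

-- Version 2 adds MulAdd: pop a, b, c; push c + b * a.
data V2 = Push2 Int | Add2 | Mul2 | MulAdd2

-- Evaluator for version 1 (head of the list is the top of the stack).
evalV1 :: [V1] -> [Int] -> [Int]
evalV1 []             s           = s
evalV1 (Push1 n : is) s           = evalV1 is (n : s)
evalV1 (Add1    : is) (a : b : s) = evalV1 is (b + a : s)
evalV1 (Mul1    : is) (a : b : s) = evalV1 is (b * a : s)
evalV1 _              _           = error "stack underflow"

-- Evaluator for version 2.
evalV2 :: [V2] -> [Int] -> [Int]
evalV2 []               s               = s
evalV2 (Push2 n  : is)  s               = evalV2 is (n : s)
evalV2 (Add2     : is)  (a : b : s)     = evalV2 is (b + a : s)
evalV2 (Mul2     : is)  (a : b : s)     = evalV2 is (b * a : s)
evalV2 (MulAdd2  : is)  (a : b : c : s) = evalV2 is (c + b * a : s)
evalV2 _                _               = error "stack underflow"

-- The "formal translation": expand each v2 instruction into v1 code.
-- The proof obligation would be:
--   evalV2 p s == evalV1 (concatMap lower p) s, for all p and s.
lower :: V2 -> [V1]
lower (Push2 n) = [Push1 n]
lower Add2      = [Add1]
lower Mul2      = [Mul1]
lower MulAdd2   = [Mul1, Add1]  -- semantics-preserving expansion

main :: IO ()
main = do
  let prog = [Push2 2, Push2 3, Push2 4, MulAdd2]  -- computes 2 + 3 * 4
  print (evalV2 prog [])                    -- [14]
  print (evalV1 (concatMap lower prog) [])  -- [14], identical result
```

In a real system the lowering and its equivalence proof would live alongside the spec itself, which is exactly the role the database-migration work mentioned above plays for schemas.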


This is the kind of infrastructure I think we'd need to build very intentionally in order to create our time-capsule computers: machines that depend on the wider outside world, but with well-defined translations from newer systems to older ones.


What do you think? As always feel free to email me


At my left_adjoint@rawtext.club email address
