When Did Personal Computers Stop Improving?


kelbot wrote this really nice post about using an old PDA (remember those?) for many tasks.


It got me thinking. I wrote my bachelor's thesis on an Asus EEE PC. The one with the 9 inch screen, but the Intel Celeron M processor (the Atom processor came with the next model). If I recall correctly it ran at about 900 MHz. It had a 12 GB SSD and 1 GB of RAM. What did I ever use it for?


Web browsing.

Watching movies and TV series (downloaded .avi files, back then most movies were around 700 MB I think).

Writing, both LaTeX and OpenOffice.

Playing games. It ran StarCraft through Wine just fine.


I'm not sure OpenOffice (or LibreOffice, which I use these days) would be a good experience on that hardware today. And I'm pretty sure web browsing would be nerve-wracking. Watching movies? Uhm. I doubt it could handle the Netflix or Disney+ websites, and as for pirated content I'm not sure a single movie in Blu-ray quality would fit on the drive alongside a Linux installation. I haven't really pirated anything in more than a decade, but I suspect that finding 2010-quality movie files is hard these days. Which is weird, because we watch a whole lot on tiny screens now but for some reason want everything in HD anyway.


The bullet points above are pretty much exactly what I use a computer for nowadays too, though. I'd add vector drawing and image editing, but I did that on my last laptop, which had the same amount of RAM, and it was only really unpleasant that one time when I somehow thought I had to make a 300 dpi image for a book cover...


I guess the point I'm trying to convey is that I do the same things with my computers today that I did one or even two decades ago. The hardware has become monumentally more powerful, but my user experience hasn't really changed a bit. And most of all, my level of productivity and the enjoyment I get from my devices stay the same.


That Palm device that kelbot tried has a paltry 33 MHz CPU. Can you even imagine a modern smartphone app trying to get anything done on that? I doubt even the Notes app on the iPhone would work.


Do we keep making hardware more and more powerful only to be able to continue doing the things we do just fine already? I keep circling back to earlier thoughts on obsolescence and bloat, and I'm genuinely puzzled.


What in God's name are we doing? Why do we keep doing it?


Links


kelbot's post.

Privacy vs Obsolescence.

Website Bloat

Thoughts on the Internet and Climate.


-- CC0 ew0k, 2021-09-25