This is scrawlspace. I scrawled elsewhere and then moved it into this space. Do not expect coherence or permanence here, either.
Gemini text style isn’t HTML style. In Gemini text, you can’t have inline links to explain what you’re talking about.
Sometimes, it seems to make more sense to pre-load the reader with links to explanations.
Sometimes, it seems to make more sense to provide links afterwards if the reader is interested in learning more about what was just discussed.
Eventually, I’m going to write a post here where some things are best pre-loaded and other things are best supplied afterwards. Calling them both “background” will seem more than a little bit weird, but I don’t have two good, separate names for this sort of thing.
Today, my Internet pipe was upgraded from 15/15 Mbps (≈1.8 MB/s) to 100/100 Mbps (≈11 MB/s).
As you might guess, Geminispace doesn’t seem to be any faster.
One of my favorite video-game albums is the one for The Secret of Monkey Island. Not only does the Red Book audio have some decent tunes, it also has some decent background tracks. Track 24 is called “Fight Ambience”, and it’s the ambience used for sword fights on Mêlée Island — owls hooting, birds flying off, and that sort of thing. No music whatsoever.
This is nice background noise, but it wasn’t quite enough. I ended up opening Headspace and piping Ocean Time, an 8-hour track of beach-waves noise, to my Mac so I could get the sound of waves coming out through decent speakers. _This_ ended up being just the right amount of steady-state noise combined with _just_ the right amount of variation.
Oddly enough, the new iOS-wallpaper artwork reminded me of The Secret of Monkey Island and got me started with this setup. The dark-mode version of the wallpaper I use is deep in the purples and blues, and Mêlée Island at night has lots of blues and occasionally purples.
I have a 4.2 GB folder full of what is mostly tiny React projects. I don’t want its contents showing up in Alfred ever, so I decided to squish it into an archive.
My first instinct was to right-click on it and select “Compress ‘Units’”. This gave me a 2.5 GB zip file.
Because Node-based anything has loads of tiny files, I figured tar-and-xz might be the winning play here. I figured I could tar up the contents of the directory once, and then try different compression schemes on the tarball.
(DISCLAIMER: the xz manpage explicitly recommends against using -9 (aka --best) because it can require massive amounts of RAM on decode. RAM your computer might not have. That said, all of my computers have more than 65 MB of RAM free for xz.)
> time xz --best --keep units.tar
________________________________________________________
Executed in   37.62 mins    fish         external
   usr time   37.32 mins  121.00 micros   37.32 mins
   sys time    0.23 mins  711.00 micros    0.23 mins
This gave me a 796 MB file. Way better than the 2.5 GB-monster that zip gave me!
What if I tried using all the cores?
> time xz --best --keep -T 0 units.tar
________________________________________________________
Executed in  302.56 secs    fish         external
   usr time   57.89 mins   27.28 millis   57.89 mins
   sys time    1.45 mins    6.74 millis    1.45 mins
-T 0 gave me a 17-thread process that took up 1,300% – 1,400% of my CPU (no, that’s not a typo) and made my fans spin up. After a bit, the CPU bite dwindled down to ≈375% and then the process completed.
This process took up only five minutes of wall-clock time (way faster than the half an hour for the single-threaded way), but about 150% of the single-threaded version’s usr time.
Well, what if I used only two threads?
> time xz --best --keep -T 2 units.tar
________________________________________________________
Executed in   18.72 mins    fish         external
   usr time   37.08 mins   92.00 micros   37.08 mins
   sys time    0.12 mins  552.00 micros    0.12 mins
The fans didn’t spin up. That was kind of nice.
18 minutes and 43 seconds of wall-clock time (just under half of the single-threaded run); 37 minutes of usr time. Not bad.
Since the xz manpage said that the work would be split up between multiple cores, I wondered if the splitting up would reduce compression ratios any. Turns out, I was onto something:
> ls -l units*
.rw-r--r--  820M  units-t0.tar.xz
.rw-r--r--  796M  units-t1.tar.xz
.rw-r--r--  820M  units-t2.tar.xz
.rw-r--r--  7.8G  units.tar
The 1-thread file is the smallest at 796 MB. Both multithreaded files weigh in at 820 MB. In fact, both multithreaded files are 820,078,888 bytes. Exactly. I thought the multithreaded files had a good chance of being larger than the single-threaded file, but I didn’t expect them to have the exact same size.
Oh, and they both have the same SHA-256 checksums. Didn’t expect that, either.
So if you want your xz files to be as small as possible, stick to one thread per file.
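The whole experiment boils down to a few commands. Here’s a self-contained sketch using small throwaway data instead of the real 7.8 GB tarball (note that on an input this tiny, multithreaded xz never actually splits the stream into blocks, so the -T 0 file won’t show the ratio penalty the real run did):

```shell
# Tar once, then try different thread counts on the same tarball.
# All file names here are throwaway stand-ins for the real units.tar.
mkdir -p units-demo
head -c 2000000 /dev/zero > units-demo/sample.bin   # highly compressible sample
tar -cf units.tar units-demo

xz -9 -T 1 -c units.tar > units-t1.tar.xz   # one thread: best compression ratio
xz -9 -T 0 -c units.tar > units-t0.tar.xz   # all cores: faster on big inputs,
                                            # but block-splitting costs some ratio
ls -l units-t1.tar.xz units-t0.tar.xz
```

With -c, xz writes to stdout and leaves the input file alone, so --keep isn’t needed here.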
Installed Big Sur on my laptop this past weekend. Did my usual clean install, which was complicated by Apple’s CDN falling over for a couple days. Even after I got everything set up again, it still has this air of unfinished mediocrity. Mail, for example, will still eat my mail, so I can’t use it to handle my e-mail at all. Launchpad also dumped a bunch of lesser-used icons like Font Book, Chess, Stickies, and Time Machine into the root folder and now I’m going to have to check with my Mojave machine to see where they’re supposed to go.
Apple needs 20% more software engineers.
There’s an Apple event coming this Tuesday. They will almost certainly announce Apple Silicon — erm, ARM-based Macs. But:
I don’t need, or particularly want, a new Mac. The ones I have are excellent.
Big Sur is probably going to be a buggy pile of poo, on par with Catalina. I’m going to miss having a functional, safe Mail.app.
So why am I excited about this upcoming Tuesday?
Sometimes, when I’m watching old things and hear a dollar figure, I get off my duff, metaphorically speaking, to find out how much money that was back then.
Of course, I don’t generally _want_ to stop my old TV show just so I can look at a black-on-white† site and punch in some numbers, so it’s nice to have rough estimates of how much money that is. Here’re some numbers, comparing September of whatever year back then to September of 2020:
$100 in 1950 is ≈$1,000 now.
$100 in 1974 is ≈$500 now.
$100 in 1978 is ≈$400 now.
$100 in 1980 is ≈$300 now.
$100 in 1990 is ≈$200 now.
† I wish the site, like every other site, had a `prefers-color-scheme`-based dark mode.
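Those rough conversions are just ratios of CPI index values: then-dollars × (CPI now ÷ CPI then). A sketch in shell and awk; the CPI-U figures below are my approximations, not official numbers, so check the BLS data before relying on them:

```shell
# Adjust a historical dollar amount into September-2020 dollars.
# cpi_now ≈ 260.3 (CPI-U, Sept 2020); the per-year values are approximate.
adjust() {
  awk -v amt="$1" -v cpi_then="$2" -v cpi_now=260.3 \
    'BEGIN { printf "%.0f\n", amt * cpi_now / cpi_then }'
}

adjust 100 24.1   # 1950: prints 1080, i.e. roughly $1,000
adjust 100 82.4   # 1980: prints 316, i.e. roughly $300
```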
I “needed to clear my head”, or so I thought, so I decided to take a walk.
It was a decently long one — 6 miles (10 kilometers). I paused at the starting point after walking about half that and decided to make another loop around. I kind of thought I could use it.
I’m glad I went for a walk, and I’m glad I went for the full two loops.
What I did _not_ expect afterwards is to better describe the walk as “restoring my executive function”.
In response to:
I don’t have a gemlog (yet), but if I wanted to write posts that would automatically get picked up and published I’d probably just put the draft in a directory off to the side that doesn’t get uploaded or otherwise monitored. Then, when it’s done, just move it into wherever it should be so it can get picked up by sftp/rsync/the index-file generator.
I use Git for my capsule as well, but I can’t imagine using more than one branch unless I wanted to radically restructure everything and break loads of links.
On second thought, I imagined it. Multiple branches might be a useful tool for managing the deployment and rescinding of a great April Fool’s joke.
iPad gaming is better than Switch gaming because the screenshots are all in PNG (not JPEG) and you don’t have to dig them out of a teeny SD card thing in the back of the device after shutting it off.
It’s Saturday. Do you know if your capsule’s TLS certificate is expired?
If you’re not automatically getting your TLS certificates renewed like Let’s Encrypt has you do, do you have a manual to-do set to pester you a month before your 1- or 3- or 5-year certificate expires?
I ask because I was checking out the usual CAPCOM and Spacewalk feeds and, like, three of the links I checked out had TLS-type errors in amfora. Unlike a lot of Gemini clients, amfora’s pretty strict about not letting TLS funny-business pass.
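If you want to check your own cert before amfora complains, openssl can tell you the expiry date. A sketch, assuming openssl is installed; the demo generates a throwaway self-signed cert rather than fetching a real one:

```shell
# For a live capsule you'd grab the cert over the network first, e.g.
# (host is just an example):
#   echo | openssl s_client -connect gemini.circumlunar.space:1965 \
#     -servername gemini.circumlunar.space 2>/dev/null > cert.pem
# Here we generate a throwaway 3-year self-signed cert to inspect instead.
openssl req -x509 -newkey rsa:2048 -nodes -days 1095 \
  -keyout key.pem -out cert.pem -subj /CN=demo 2>/dev/null

openssl x509 -noout -enddate -in cert.pem        # prints notAfter=...
# -checkend exits nonzero if the cert expires within the given seconds:
openssl x509 -noout -checkend 2592000 -in cert.pem \
  && echo "good for another 30 days" \
  || echo "expires within 30 days (or already has)"
```

Wiring `-checkend` into a cron job is one way to get that month-ahead pestering automatically.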
A while ago I wanted to make a structured document. More structured than what I could get out of a giant Markdown file (I wanted a deeply nested <section>/<h1>-style, and I can’t get that with Markdown).
My favorite editor for many things is Visual Studio Code. It has Emmet support, which will let you type a pretty-much-CSS bit of text and get nicely-structured HTML out of it. VS Code, with the right plugin or three, also has decent Prettier support, which will happily re-indent your HTML nicely (so you don’t have to). I don’t do reformat-on-save for Prettier, but I hit ⌥⌘F to run Prettier fairly frequently when I’m editing anything that can be reformatted by it.
VS Code won’t auto-curl your quotes like BBEdit will, but there’s nothing keeping you from typing in BBEdit and occasionally handling the re-formatting of your elements in VS Code. (Tip: in BBEdit, hold down `Control` and press your (double) quote key to get the opposite behavior, whether your quote marks are curly-by-default or straight-by-default.)
So, yeah. Kinda not awful if you have loads of modern tools close at hand. Then again, if you’re writing a big file like this in Markdown, you probably have somewhat different tools, like a live-preview that reloads your changes every time you save the file, so you’re probably also only getting by with a handful of helpful annoyances no matter what you’re writing in.
Initial time estimate: 4 hours, give or take.
Actual time taken: 11 minutes.
Maple bacon onion jam
Broken user query tests
Somehow, I ended up with mdcat(1) on my laptop and glow(1) on my desktop.
Not sure if I should have both, one, or neither on my computers. bat(1) already seems to do what `mdcat` does, and probably better. Typing `glow` isn’t part of my muscle memory, though. What’s more, I have trouble remembering the name of the darn thing for the times when I _do_ want pretty-printed Markdown in a Terminal window.
Update: I actually had mdcat(1) on both my machines. Out the airlock it went, because bat(1) is better.
(was: Re: I am the preëminent server of WebP in Geminispace)
(also, scroll down to “I am the preëminent server of WebP in Geminispace”)
I used lossy WebP on some The Legend of Zelda: Breath of the Wild screenshots (originally JPEG, much to my dismay) and they looked OK. The space savings was impressive.
Then, many months later, I looked again, more closely, flipping back and forth between the original and the lossy-WebP-recompressed version, and the WebP was clearly inferior. I remember someone else on the Internet said “well, waddya expect if you use a pile of video compression techniques on a still image?” and that sounded like a reasonable take. (Lossy WebP is built on VP8’s intra-frame compression techniques.)
At any rate, I’m a teeny bit surprised someone would use imagemagick(1) to compress WebP, but that’s probably a handy Swiss Army knife to a lot of people. I wonder what kind of lossless compression ratios nytpu (linked above) would get if `cwebp -z 9` were run on all the PNGs.
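One way to run that experiment over a directory of PNGs, assuming the cwebp encoder that ships with libwebp (this sketch is mine, not nytpu’s actual setup):

```shell
# Losslessly recompress every PNG in the current directory with cwebp.
# -z 9 is the slowest, strongest lossless preset.
# Guarded so this is a no-op when cwebp or PNGs are absent.
if command -v cwebp >/dev/null; then
  for f in *.png; do
    [ -e "$f" ] || continue              # unmatched glob: nothing to do
    cwebp -z 9 "$f" -o "${f%.png}.webp"
  done
else
  echo "cwebp not installed; skipping"
fi
```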
I can’t say I’m a huge fan of WebP, mainly because my desktop browser of choice and OS of choice won’t support it until I can be convinced that macOS Big Sur won’t be a raging dumpster fire. <picture>/<source> makes it usable in HTML, but there’s no widely-available fallback mechanism for CSS yet and “background hero images where nobody really needs to care how good they look” is where WebP really shines.
wget -np -w 1 --mirror [some site with a bunch of tiny files]
`-np` means “don’t ascend up into a parent directory”, `-w 1` means “wait 1 second in between requests”, and `--mirror` turns on a bunch of options that’re good to have on if you want to mirror a directory tree.
Total wall clock time: 2d 7h 5m 47s
Downloaded: 173919 files, 2.6G in 1h 25m 17s (539 KB/s)
Two days of wall-clock time, but only an hour and a half of actual downloading. Hence the title of this post.
Now I need to learn and memorize (or write down) the magic incantation that will allow me to skip downloading all of Apache’s sorted-every-which-way file-listing pages. You know, the ones with URLs like “index.html?C=D;O=D”.
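My best guess at that incantation is wget’s --reject-regex option. The pattern below is mine, not something from the wget docs, so verify it against your target site before committing to another two-day run; the character classes are my guess at Apache’s sort parameters (C=N/M/S/D for column, O=A/D for order):

```shell
# Reject Apache mod_autoindex sort links like index.html?C=D;O=D while mirroring.
reject='\?C=[NMSD];O=[AD]'

# The actual mirror command (URL is a placeholder, so it's commented out here):
# wget -np -w 1 --mirror --reject-regex "$reject" https://example.com/files/

# Sanity-check that the pattern matches a sort link but not a normal file:
echo 'index.html?C=D;O=D' | grep -qE "$reject" && echo match
echo 'index.html'         | grep -qE "$reject" || echo 'no match'
```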
I’d almost want to re-download all those files again, if only to see how much faster it’d be.
I went to gus.guru and had a look at the stats there. It had this line in it:
7 - image/webp
I happily serve up WebP-encoded images. I wonder if I’m responsible for all of them.
> ag webp …
Hm, I can’t count. Computers are better at that than I am, anyway.
> ag webp | wc -l
11
OK, the unpublished templates, the upload script, and the colophon verbiage account for all of the difference between 7 and 11.
I am literally the sole server of WebP in Geminispace.
If you’re looking to be Old on the Internet™ one day, you can say “I remember when all the WebP on Geminispace was lossless”.
I’ve been wanting to play around with API Blueprint for quite a while. This past Friday night (yes, yes, I know) I finally got a chance to mock up a simple API with it.
It’s my kind of thing, but the tooling around it is mediocre (at least for the tools I’ve tried). Aglio seems like one of the most popular to-HTML converters for API Blueprint, but for some reason it’s not outputting variable bits in URLs (things in braces). Snowboard is also a thing, but it’s in a Docker container which…I’m still not _happy_ about, even if Docker is a totally reasonable, fine choice for this sort of thing.
My lukewarm attitude towards API Blueprint reminds me of my attitude toward DocBook. Sure, DocBook can provide you with a fantastically rich documentation source that can be converted into a bunch of different formats, but if you don’t like the default DSSSL and XSL transformation stylesheets, you’re not going to like what you get out of DocBook. And I didn’t like the stylesheets’ output, and I didn’t have enough manpower to turn any of them into something I’d like (unlike, say, the GNOME Project).
Finally got a chance to try Commitizen. My evaluation:
git rm .cz.toml
brew uninstall commitizen
My hot take: Given current auto-update technology, a personal server in every home is a widespread security nightmare. Only a tiny handful of nerds vet the devices they allow on their home network, and most IoT devices are running some flavor of Linux or whatever that’s vulnerable to heaven alone knows what sort of exploit.
Also, “send some company some money every month to host your stuff” is a pretty decent idea, whether it’s a web host or VPS or Heroku. That way, someone else handles the patching and upgrades and hardware faults.
Oh, and basically none of these homebrew solutions seem to have cracked the “I want to share photos of my kids with just their grandparents and a few of my friends” problem. Facebook, Google, and Apple have all mostly solved it. Anyone can share anything with everybody pretty easily, but sharing with only a handful of people on the planet is much, much more difficult to make right, easy, and not subsidized by ads.
I kind of resent blogs.
One of the things I’ve noticed is that it’s difficult to get your work noticed in Gemspace if you don’t have some kind of Atom feed. In order to have one of those, you’re probably going to have to have some kind of automatically-generated feed. What’s the best way to have an automatically-generated feed of anything? Having a blog.
I don’t want to have to set up a blog or a blog workflow or a static-site generator. I burned a couple days on generating an Atom feed from a YAML-formatted JSON Feed feed and I want those days of my life back. On the other hand, having a single-file blog was kind of a neat idea and I might actually use it in the future.
The weird part? I used to be a big fan of Atom. Now its insistence on titles grates. I think I’d rather patch Spacewalk to handle JSON Feeds in addition to Atom feeds. Unlike CAPCOM†, I don’t think I need to do anything weird to have it handle nonexistent titles…because it doesn’t seem to display titles anyway.
†: Boy, that was a confusing name when I first saw it in Geminispace.
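For the record, item titles really are optional in JSON Feed; only the feed-level title is required, and an item just needs an id plus content_text or content_html. A minimal, illustrative feed (the URLs and names here are made up):

```json
{
  "version": "https://jsonfeed.org/version/1.1",
  "title": "scrawlspace",
  "items": [
    {
      "id": "gemini://example.com/2020-11-07",
      "content_text": "Untitled posts are fine: item titles are optional in JSON Feed."
    }
  ]
}
```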
> I can’t reach out and tell you that I’ve responded to your post, so I guess we’ll see if you notice this. this gemlog doesn’t have a feed of any kind, so if you see it, I guess content *can* be noticed without an atom feed.
Guess I should’ve read the Spacewalk README/source code a little bit more slowly and thoroughly, then. Thanks!