i like running servers


i run a few websites in addition to this gemini service. they're all tiny, mostly single-page sites for friends and family. i'm not too into the web design part of it, but i am very into the aesthetics of deploying them.

my job is the opposite of tiny: i hold a pager for thousands of customers, and i work on projects that make such large-scale maintenance tractable.

at home, though, i'm interested in exploring the minimum requirements for a functional service or program. here's the deployment script for a website i run.


#!/bin/ksh

set -e
set -u
set -o pipefail

echo "deploy to $domain at $destination"

deploy="$(TZ=UTC date +%FT%TZ)"

current="/var/www/sites/${domain}/current"

if ssh "$destination" -- test -L "$current"
then
	old="$(ssh "$destination" -- readlink "$current")"
	echo "old deploy is $old"
else
	echo "no old deploy"
	# we rm this later, and we don't want to rm -f if we don't have to
	ssh "$destination" -- touch "$current"
fi

echo "new deploy is $deploy"
rsync -a ./dist/ "${destination}:/var/www/sites/${domain}/${deploy}"

ssh "$destination" -- "cd /var/www/sites/${domain} && rm current && ln -s ${deploy} current"

# TODO: roll back automatically
curl --fail --silent "https://$domain/" >/dev/null || echo "curl failed"

echo done!
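
the TODO about rolling back is still open. a rough sketch of what it might look like, replacing the curl line above and reusing $old when there was a previous deploy to return to:

if ! curl --fail --silent "https://$domain/" >/dev/null
then
	if [ -n "${old:-}" ]
	then
		echo "health check failed, rolling back to $old"
		ssh "$destination" -- "cd /var/www/sites/${domain} && rm current && ln -s ${old} current"
	fi
	exit 1
fi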

i serve several virtual hosts on this particular VM, so the domain name and destination hostname are different. this site is not "highly available." if i lose this server, i need to start over. that's ok, though; it's just the landing page for someone's small business. it gets maybe five thousand requests per month. i don't analyze the logs, but i'm sure most of those are crawlers and scanners.
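
deploying that landing page looks something like this; the script as written expects $domain and $destination to already be set, and the names here are placeholders matching the examples above:

domain=www.example.com destination=vm1.example.net ./bin/deploy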

my only monitoring is the fact that i regularly log in for fun and to maintain other sites. if httpd(8) is down, i'll notice within a day or so.

if it does go down, though, i want some automation. i ran a "disaster recovery test" and came up with these shell scripts.


#!/bin/ksh

set -x
set -e
set -u
set -o pipefail

doas pkg_add -u \
	checkrestart \
	kakoune \
	rsync-- \
;

if ! grep -q 'server "www.example.com"' /etc/httpd.conf
then
	doas tee -a /etc/httpd.conf <<EOF
server "www.example.com" {
	alias "example.com"
	listen on * tls port 443
	tls {
		certificate "/etc/ssl/www.example.com.fullchain.pem"
		key "/etc/ssl/private/www.example.com.key"
	}
	location "/.well-known/acme-challenge/*" {
		root "/acme"
		request strip 2
	}
	location "*" {
		root "/sites/redacted.example.com/current"
	}
}
EOF
fi

if ! test -d /var/www/sites/www.example.com
then
	doas mkdir -p /var/www/sites/www.example.com
	doas chown -R $(whoami):$(whoami) /var/www/sites
fi

i run this script on a fresh OpenBSD installation. it installs a few packages, writes the httpd(8) config, and creates the site directory. as you might tell from the config, i use acme-client(1) to generate TLS material, and daily(8) to periodically renew the certificate. i've omitted that configuration, but it's basically what's written in the manual page. when recovering from a failure, i copy the existing TLS material if i can.
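
for reference, that omitted piece is basically the example from acme-client.conf(5), adjusted for this domain, plus one line in /etc/daily.local so daily(8) handles renewal. roughly:

# /etc/acme-client.conf
authority letsencrypt {
	api url "https://acme-v02.api.letsencrypt.org/directory"
	account key "/etc/acme/letsencrypt-privkey.pem"
}

domain www.example.com {
	alternative names { example.com }
	domain key "/etc/ssl/private/www.example.com.key"
	domain certificate "/etc/ssl/www.example.com.crt"
	domain full chain certificate "/etc/ssl/www.example.com.fullchain.pem"
	sign with letsencrypt
}

# /etc/daily.local: renew, and reload httpd(8) only if the certificate changed
acme-client www.example.com && rcctl reload httpd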


#!/bin/ksh

set -x
set -e
set -u
set -o pipefail

echo "copy tls material for $domain from $src to $dst"

tmp=$(ssh $dst -- mktemp -d)
trap "ssh $dst -- rm -r $tmp" EXIT

scp $src:/etc/ssl/$domain.crt $dst:$tmp/
scp $src:/etc/ssl/$domain.fullchain.pem $dst:$tmp/
ssh $src -- doas cat /etc/ssl/private/$domain.key | ssh $dst -- tee $tmp/$domain.key >/dev/null

ssh $dst -- doas install -o root -g wheel -m 0444 $tmp/$domain.crt /etc/ssl/$domain.crt
ssh $dst -- doas install -o root -g wheel -m 0444 $tmp/$domain.fullchain.pem /etc/ssl/$domain.fullchain.pem
ssh $dst -- doas install -o root -g wheel -m 0400 $tmp/$domain.key /etc/ssl/private/$domain.key

ssh $src -- doas cksum -a sha256 \
	/etc/ssl/$domain.crt \
	/etc/ssl/$domain.fullchain.pem \
	/etc/ssl/private/$domain.key \
| ssh $dst -- doas cksum -c

i run this script from my local machine. it copies everything generated by acme-client(1) from the old host to the new one, then verifies the installed files against checksums taken on the source.
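
like the deploy script, it expects its parameters to already be set; an invocation might look like this, with placeholder hostnames:

domain=www.example.com src=oldvm.example.net dst=newvm.example.net ./bin/copytls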


you might ask what i use to generate the html i deploy. i use a text editor, of course! there are no "source" files in this site, only "dist".


.
|-- bin
|   |-- copytls
|   |-- deploy
|   `-- setuphttpd
`-- dist
    |-- assets
    |   `-- forms
    |       |-- Very Important Form.pdf
    |       `-- Yes We Have MS Word.doc
    |-- images
    |   |-- headshot.png
    |   `-- trees.jpg
    |-- index.html
    |-- robots.txt
    `-- styles
        `-- main.css

i've been maintaining this site for five years. it started out with npm and polyfill packages out the wazoo. all the text was templated from a json file through some gulp task. one day i broke my npm installation, and after several frustrating hours trying to fix it, i decided to check in the generated html. eventually i just deleted the source, because it wasn't up to date anyway.


i like to run my own servers on rented VMs, and i like to understand what exactly is required to perform a given task. i make it a point to not rent VMs from a hyperscaler like AWS or GCP. instead, i rent from a friendly group of people at prgmr.


prgmr homepage
