This is an unstructured rant about anti-patterns I've found on the web that are ironically dubbed "best practices". What makes them anti-patterns? Some combination of cargo culting, having no actual basis for being superior, or in general just being crap dressed up as gold.
npm belongs to a family of programming-language package managers that all suffer from a variety of issues. npm in particular consistently hits the news because of malicious packages or dependencies causing cascading failures:
While this topic has been discussed to death, I'm going to focus on a Linux-based perspective since I've seen little to no discussion of it. Programming-language package managers like npm, pip and cargo share one common goal: move fast and break things. There is essentially no quality control. Companies often use private registries, lock the versions of their dependencies or run their own infrastructure to prevent all of the issues with npm. I find this ironic for reasons I'll discuss shortly. Due to similar issues, one Python developer has suggested curating pip packages. It's also taken pip over a decade to implement a dependency resolver (which doesn't fully solve the problem, but that's a different story).
Comparatively, Linux packages are superior for a few reasons including but not limited to:
Quality control: while all distributions do things differently, the general procedure is having a QA team, process and infrastructure in place to automatically build, test and allow users to determine whether a package is of sufficient quality before deploying. This contrasts with the wild west of programming-language package managers.
Compatibility: due to the ad-hoc nature of programming-language package managers, there is basically no guarantee of any sort of compatibility. You're on your own. It is impossible for any developer to audit the thousands of dependencies that a single dependency might pull in, and as we've seen in practice, it can break things. Distro packages at least try to make sure that everything works together, up to and including not releasing a new version of the distro until there are no major bugs or incompatibilities.
Security: it is very difficult for malware to enter a Linux repository. In Fedora, for example, package maintainers must use their real identities and be sponsored (otherwise, they use COPR). Malware would be caught quickly at the QA stage, and packages are GPG signed to ensure authenticity.
Does this sound familiar? It's very similar to what I mentioned above that companies do to get away from the limitations of npm, which seems to negate all of the advantages programming-language package managers had in the first place...
There is a data breach nearly every day, some disclosed, some not, but the root causes boil down to basically two things: management not seeing the value of IT/cybersecurity, and a lack of security education among software engineers. It's simply not taught in school, and developers see security as an inconvenience to be tacked onto applications rather than something to be addressed from the beginning.
For a static website, these are pretty standard suggestions:
A real-life example of a web trend that's directly at odds with security is "extracting critical CSS": the practice inlines styles into the page, which forces a weaker Content-Security-Policy (allowing 'unsafe-inline' for style-src):
Much like security, accessibility is something developers are either not taught or don't really think about. Some popular frameworks like Bootstrap actively defy accessibility for some unknown reason:
> Some combinations of colors that currently make up Bootstrap’s default palette—used throughout the framework for things such as button variations, alert variations, form validation indicators—may lead to insufficient color contrast (below the recommended WCAG 2.1 text color contrast ratio of 4.5:1 and the WCAG 2.1 non-text color contrast ratio of 3:1), particularly when used against a light background. Authors are encouraged to test their specific uses of color and, where necessary, manually modify/extend these default colors to ensure adequate color contrast ratios.
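The contrast ratios the quote refers to can be computed from WCAG 2.1's relative-luminance formula. A minimal sketch; the grey hex value at the end is just an illustrative example, not one of Bootstrap's colors:

```python
def _linearize(c):
    # Convert an sRGB channel (0-255) to linear light, per WCAG 2.1
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(r, g, b):
    # WCAG 2.1 relative luminance of an sRGB color
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)

def contrast_ratio(fg, bg):
    # WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)
    l1, l2 = relative_luminance(*fg), relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Grey text (#767676) on white just clears the 4.5:1 AA threshold
print(round(contrast_ratio((0x76, 0x76, 0x76), (255, 255, 255)), 2))  # → 4.54
```

Black on white comes out at the maximum 21:1, which is why lighter greys so easily slip below the 4.5:1 line.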
In general, though, there are some trends like hamburger menus that seem to be cargo culted simply because frameworks made it popular rather than it actually being good design. See these discussions:
Other examples in no particular order include:
Being blinded by pages that pair a white background with poor-contrast text (like grey on white)
Not taking hotkeys or keyboard-based navigation into account
An excessive number of DOM elements, which can confuse screen readers
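As a rough way to audit that last point, DOM size can be estimated with Python's standard-library HTML parser. A minimal sketch; counting start tags is only a proxy for the real rendered tree:

```python
from html.parser import HTMLParser

class ElementCounter(HTMLParser):
    """Counts start tags as a rough proxy for DOM size."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        self.count += 1

def count_elements(html):
    counter = ElementCounter()
    counter.feed(html)
    return counter.count
```

Accessibility guidance (and tools like Lighthouse) flag pages with very large element counts, since deeply nested markup is slow to traverse for both browsers and assistive technology.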
Which leads me to...
In my opinion, a lot of web design boils down to (subjective) choices that aren't backed by scientific rigor, or to outright cargo culting. Sometimes it's literally what the designer thinks looks good on that particular day. Some firms like the nngroup provide actually good UX advice that I consider to be required reading.
A common violation of standard UX advice I see is not following Fitts's law. To summarize from Wikipedia:
> Fitts's law (often cited as Fitt's law) is a predictive model of human movement primarily used in human–computer interaction and ergonomics. This scientific law predicts that the time required to rapidly move to a target area is a function of the ratio between the distance to the target and the width of the target. Fitts's law is used to model the act of pointing, either by physically touching an object with a hand or finger, or virtually, by pointing to an object on a computer monitor using a pointing device.
In other words, if I'm constantly having to move my mouse across long distances or zig-zag between small targets, and there's no keyboard-based equivalent, a lot of time and patience is going to be wasted.
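The relationship the quote describes can be written down directly. A sketch of the common Shannon formulation of Fitts's law; the constants a and b are device-dependent, and the values below are purely illustrative:

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    # Shannon formulation: T = a + b * log2(1 + D/W)
    # distance: how far the pointer must travel to the target
    # width: size of the target along the axis of motion
    # a, b: empirically fitted constants (illustrative values here)
    return a + b * math.log2(1 + distance / width)
```

The model makes the design consequences concrete: farther targets cost more time, and bigger targets cost less, which is why tiny buttons in far corners are the worst of both worlds.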
Another example is completely unnecessary animations and delays. People like speedy websites and navigation, but it seems like almost all media I encounter (single-page applications, games, mobile OSes) inject delays, whether needed or not. I often find myself enabling "reduced animations" whenever possible because I'm highly impatient. To cherry-pick a quote from nngroup:
> The basic rules of human perception of time provide a framework for understanding the effects of webpage delays: people can detect delays as short as 1/10th of a second, so anything that takes longer doesn’t feel ‘instant.’ Delays of just 1 second are enough to interrupt a person’s conscious thought process, changing the experience into one of waiting for the system to catch up, rather than feeling as though you are directly controlling the interface. This delay reduces conversion.
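The perception thresholds in the quote can be expressed as a trivial classifier. A sketch using only the two cut-offs the quote gives; the bucket names are my own:

```python
def perceived_delay(seconds):
    # Buckets based on the perception thresholds quoted above:
    # <= 0.1 s feels instant; <= 1 s is noticeable but tolerable;
    # beyond that, the delay interrupts conscious thought.
    if seconds <= 0.1:
        return "instant"
    if seconds <= 1.0:
        return "perceptible"
    return "flow-breaking"
```

By these numbers, even a 300 ms decorative animation on every navigation has already left "instant" territory.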
I can understand delays that are added to mask loading times or similar, but if it's done purely for aesthetic purposes, I consider that an anti-pattern.