● 09.07.23


●● Removing Input Method Editors from Debian 12: Memory Bloat and How IBM Fedora Is (Not) Coping. More Yuzu Emulator Observations.


Posted in GNU/Linux, IBM, Red Hat at 6:43 pm by Guest Editorial Team


Reprinted with permission from Ryan Farmer.


↺ Ryan Farmer


Debian 12 KDE has been a pretty good system, so far.


However, one thing I noticed was that it included a bunch of input method software for languages that don’t use the Latin alphabet, even though I got the KDE desktop image that’s in English (US).


Since I don’t have any non-English keyboards and wouldn’t know how to use them if I did, it seemed a little bit ridiculous to have uim, mozc, and ibus installed.


Looking at the KDE System Monitor, these were split into a few services under X11 and one really big one under the Wayland session, and together they seemed to require ~100 MB of RAM, all so that they could tell me that the only option was “English”, which works without them.
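

If you want to double-check numbers like that outside the KDE System Monitor, one rough way is to add up the resident set size of anything that looks like an input method daemon. This is just a sketch; the process-name pattern is an assumption, so check ps aux for what ibus, mozc, and uim actually run as on your system, and remember that RSS double-counts shared pages, so treat the total as an upper bound.

```
# Sum the resident set size (RSS, in KB) of processes whose names
# match the usual input method daemons, and print the total in MB.
ps -eo rss=,comm= | awk '/ibus|mozc|uim/ { kb += $1 } END { printf "%.1f MB\n", kb/1024 }'
```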


I get that Debian is trying to be a “universal operating system” and that there are a billion-plus Chinese speakers in the world alone, let alone people who might want to use Japanese and Korean keyboards (among others). People speaking English, or at least some language where an input method editor like this isn’t needed, are probably half the world or more, but there’s still a huge userbase that does need this stuff.


Is it justifiable, considering that the GUI for this live image starts out in English, and that anyone who actually needs these input methods probably can’t use an English-only system well enough to figure out how to switch it to something else?


If you ask me, it’s just kind of sloppy that there’s so much internationalization on a disk image primarily intended for English speakers. Someone should probably be splitting these packages up better, or having the installer remove everything not related to the localization settings chosen by the user.


Otherwise you end up with this, where packages that are basically useless if you only speak English get dumped on the system to waste 100 MB of RAM.


Since I do not use a swap partition and instead use swap on zram, even on a 16 GB RAM system I should not be dealing with something using 100 MB of it unnecessarily.
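

For reference, swap on zram doesn’t take much to set up. A minimal sketch, assuming the systemd zram-generator package (Debian also ships zram-tools, which is configured differently):

```
# /etc/systemd/zram-generator.conf
# Create a compressed swap device in RAM instead of using a disk partition.
[zram0]
zram-size = ram / 2
compression-algorithm = zstd
```

After a reboot, swapon --show and zramctl should both list the new zram device.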


The Yuzu Nintendo Switch emulator is one of the hungriest applications I’ve ever used. Short of trying to compile Chromium or Android, it may be the biggest memory hog you’ll encounter, frequently using 5 GB or more all by itself.


So, uninstalling the input method editors, using apt autoremove to clean up the leftovers, and rebooting seems to have gotten rid of that mess. (I think more than one package matched mozc, which seemed to have something to do with the Japanese language.)
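

For anyone wanting to do the same, the sequence amounts to something like the following. The package names below are examples rather than a definitive list; what is actually installed varies between images, so list them first.

```
# See which input method packages are actually installed.
dpkg -l | grep -Ei 'ibus|mozc|uim'

# Purge the ones you don't want (names here are examples), then
# remove the dependencies that are now orphaned.
sudo apt purge ibus uim mozc-data mozc-server
sudo apt autoremove --purge
```

A reboot, or at least logging out and back in, makes sure the session isn’t still holding the old daemons in memory.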


I was actually impressed, given that Debian has historically not done such a hot job sorting the actual dependencies of meta-packages, that removing them didn’t make apt propose taking anything else out along with them.


25 years ago, you didn’t need half of 100 MB to run Windows 98! The entire OS!


If you gutted Windows 98 using RoM II to purge Internet Explorer, Trident, Outlook Express, and the Web desktop garbage (and fed it the Windows 95 OSR 2 Shell files), you could get away with running the entire OS in just 7 MB of RAM (of course you’d need more for applications).


I sort of use this as my benchmark for how far we’ve gotten from developers and OS distributors actually caring about resource usage.


Including shipping modules that use 100 MB of RAM and don’t even do anything on computers in half the world!


When Alan Pope talked about “Sleeping Tabs”, a Microsoft-ism for a feature that’s now in all major Chromium browsers and that Google actually wrote (Brave calls it Memory Saver), it made me stop and laugh.


After decades of giving Web “developers” more garbage to fill up your main memory with, there’s now a slider in the Settings area that basically says “Hey, if I walk away and leave some tabs in the background, just chuck everything but leave the tab in case I come back.”


This is a mess with a band-aid.


We see that far too often these days, where even a 16 GB RAM system seems crunched.


I was amusing myself last night while researching whether I should attempt to set up systemd-oomd, which steps in ahead of the kernel’s out-of-memory killer and has systemd try to figure it out instead.


The discussions (mostly on Reddit, of course, sigh) were full of people saying things like they had an expensive developer laptop with 32 GB of main memory, they were using Fedora, and systemd-oomd, in the default configuration, was “murdering” tabs in Chrome and shutting down their IDE (development environment for programmers) when they hit 12 GB of used RAM.


So like, that’s really really funny.


It starts panicking and killing things because Fedora configured it with percentage thresholds, instead of looking at how much RAM you actually still have (in at least one case I read about, a full 20 GB sitting there empty!) and realizing that it doesn’t need to take action quite yet.
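

Those percentages live in a small config file, so if you do want systemd-oomd but with a less hair-trigger attitude, you can loosen them instead of turning it off. A sketch using the documented oomd.conf options; the values are just an illustration, and your distribution’s defaults and drop-ins may differ.

```
# /etc/systemd/oomd.conf
[OOM]
# Only act when swap is nearly exhausted (the upstream default is 90%).
SwapUsedLimit=95%
# Require higher, more sustained memory pressure before killing anything
# (upstream defaults are 60% for 30 seconds).
DefaultMemoryPressureLimit=80%
DefaultMemoryPressureDurationSec=60s
```

Restart systemd-oomd afterwards, and oomctl will dump what it is currently monitoring and which thresholds apply.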


The main justification for “take something Linux could already do and do it worse”, this time, appears to be that “the kernel oom-killer is slow to respond”.


(But I’m not completely surprised by anything IBM Red Hat systemd does anymore. Lennart Poettering has always been a jerk and now he officially works at Microsoft. So why should its oom-killer work any better than the init part, where once every 10 shutdowns you have to hold in the power button, ever since they replaced sysvinit in 2014?)


“Slow to respond” in the oom-killer means: “If the user has no swap configured at all, not even zram swap, they’re really playing with fire here, and the system could just go ahead and lock up because there’s no time to go ‘Oh shit, oh shit, hurry, kill something!’”


As far as picking Chrome tabs, well, that makes sense at least.


Not that the modern Web is so bloated and full of useless junk that the user would ever end up here.


That’s not even funny anymore because it’s something that was foisted on us by uncaring slobs and big corporations.


It makes sense because if you’re out of RAM and have Web browser tabs sitting there taking 1.2 GB to run an instance of ‘Microsoft Word for the Clown’, when just using LibreOffice Writer would take a tad over 400 MB, then that goes first. And if you still need more, and the user is running “Clown IRC” in a Web app that takes 600 MB of RAM when HexChat would only want 50, or worse, Discord in a “Desktop App”, which is Electron and takes nearly 1 GB because it loads the entirety of Chromium all over again just to have one tab, well…


> You’re next, bubbles!
>
> -Peter Venkman, Ghostbusters II


On the occasions when I have opened a “Web app” and top at the same time, I’ve said, “What’s wrong with this picture?”


Poor computing practices that users really shouldn’t even be doing are giving them a rather rude introduction to their new best friend, the oom-killer.


Most of the $50,000 hammers and $25,000 toilet seats are “something something Web app”.


Mozilla really does f**k all about Firefox’s voracious appetite for RAM, to the point that Firefox makes Brave look positively nimble.


I can’t imagine trying to use a computer these days that has less than 16 GB of RAM if you’re going to do even one really heavy task.


I’m already wondering if I should just spring for 32 GB myself next time I get a laptop (which could be System76 now that they’ve clawed out the really buggy and flaky UEFI PC firmware).


↺ could be System76

↺ really buggy and flaky UEFI PC firmware


Part of the Nintendo Switch emulation bloat is actually the fault of graphics chip designers.


Apparently, pre-Tiger Lake Intel graphics were the only PC chipsets that could support the kind of texture compression the Nintendo Switch uses, and since you can’t justify a second code path just for pre-2020 hardware that’s no longer being sold, the emulator has to decompress the textures before throwing them at the graphics card, even on the small number of systems that do have that feature.


It can re-compress the textures into a form PC graphics cards can handle, but in my testing it didn’t save enough RAM to be worth the compute cycles.


I did try the OpenGL renderer to see how different the performance would be from Vulkan.


Red Dead Redemption fell from 22-30 fps (playable, depending on what’s going on) under Vulkan on my Tiger Lake Iris Xe to 11-14 fps under OpenGL, so I quickly went back to Vulkan. 🙂


Yuzu got a version bump yesterday. No performance improvement that I’ve noticed.


One strange thing I did find was that after I purged the foreign-language input method programs from Debian and restarted, the framerate in Yuzu has been higher than it was before, with less choppiness.


I suppose it could be unrelated, but the two events happened back to back. █




----------

Techrights

➮ Sharing is caring. Content is available under CC-BY-SA.
