Wirth’s Law and the story of Bloatware


While many people have heard of Moore’s Law, which I’ve discussed in a previous article, fewer might know about the potentially even more important Wirth’s Law.


Wirth’s Law states that software is getting slower more rapidly than hardware is becoming faster. A real-world example of this is illustrated below:


In a 2008 article in InfoWorld, Randall C. Kennedy, formerly of Intel, introduces this term using successive versions of Microsoft Office between 2000 and 2007 as his premise. Despite the gains in computational performance during this time period according to Moore’s Law, Office 2007 performed the same task at half the speed on a prototypical 2007 computer as compared to Office 2000 on a 2000 computer.


Kennedy, Randall C. (2008-04-14). “Fat, fatter, fattest: Microsoft’s kings of bloat”. InfoWorld.
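

To put a rough number on that example (with an assumption that is mine, not Kennedy’s: that hardware throughput roughly doubled every two years from 2000 to 2007), a machine that is around 11x faster finishing the same task at half the speed implies the software became on the order of 20x less efficient. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope reading of the Office 2000 vs. Office 2007 example.
# Assumption (mine, not the article's): hardware throughput roughly doubles
# every two years, so 2000 -> 2007 is about 3.5 doublings.
hardware_speedup = 2 ** (7 / 2)     # ~11x faster machine by 2007

observed_speed = 0.5                # Office 2007 ran the task at half the speed

# An ~11x faster machine finishing the same task at half the speed implies the
# software is doing roughly hardware_speedup / 0.5, i.e. 20-odd times the work.
efficiency_gap = hardware_speedup / observed_speed
print(f"Assumed hardware speedup: ~{hardware_speedup:.0f}x")
print(f"Implied software bloat:   ~{efficiency_gap:.0f}x more work for the same task")
```

The exact multiplier depends entirely on the assumed hardware curve; the point is only that “half the speed on a much faster machine” compounds into a far larger efficiency gap.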


This is one of the reasons the RAM that got humanity to the Moon couldn’t even load a single tab in Chrome. Software development is more complicated than a direct comparison can capture, and some go as far as to call modern software ‘fatware.’ Have you ever stopped to think how much of the program in front of you, whether visible in the interface or hidden in the code, is actually needed to do the job you are asking it to do? Wirth pointed to both, the visible features and the hidden code that goes unused, as contributors to software that grows without a significant increase in function. Did the example above account for any significant feature changes between those two versions of Office? One point that should be mentioned, of course, is that some of those additions make the software accessible to a greater number and diversity of users. That means more people are able to access the benefits of a computer, and, in a colder sense, more consumers for your product as a software developer.


Consider a basic computing task: word processing. The very first version of Microsoft Word came on a single 3.5″ or 5.25″ diskette. Microsoft Word 6.0 came on seven diskettes, and Word 95, 97 and 2000 on a CD. A modern Microsoft Office 365 install (admittedly containing Word, Excel and PowerPoint) is 4GB. That is a significant growth in the space required by an application for typing words into a computer. It isn’t quite that simple, of course, since the modern word processor does a few more things and has more features, but it is hard to imagine that the application truly puts all of the space it demands to its fullest use. As an aside, OpenOffice is a 143.3MB install, and LibreOffice, which carries its torch, is 332MB, which really makes you wonder what is going on under the hood of these products for the differences to be so vast. I doubt SmartArt support makes up the difference. Part of the gap likely goes towards Microsoft’s efforts to make its software as easy to use for as many different people as possible; that functionality has to come at a cost in resources.
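

For a ballpark of that growth, and assuming Word 6.0’s seven diskettes were standard 1.44MB high-density floppies (the exact media isn’t specified above), the install footprint works out to something like a 400x increase:

```python
# Rough growth in word-processor install size, using the figures above.
# Assumption: Word 6.0's seven diskettes were 1.44MB high-density floppies.
word6_mb = 7 * 1.44          # ~10 MB
office365_mb = 4 * 1024      # 4 GB expressed in MB (Word + Excel + PowerPoint)

print(f"Word 6.0 install:   ~{word6_mb:.0f} MB")
print(f"Office 365 install:  {office365_mb} MB")
print(f"Growth factor:      ~{office365_mb / word6_mb:.0f}x")
```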


Let’s examine another oddity: the modern web browser. Tom’s Guide did a great little comparison back in 2021 with the following results:


                               Google Chrome   Microsoft Edge   Mozilla Firefox
10 tabs                        952 MB          873 MB           995 MB
20 tabs                        1.8 GB          1.4 GB           1.6 GB
60 tabs                        3.7 GB          2.9 GB           3.9 GB
2 instances / 20 tabs apiece   2.8 GB          2.5 GB           3.0 GB

Compare that to Netscape Navigator 1.0, which required 4MB of RAM in 1994. Jumping ahead to 2000, Netscape 6.0 required 64MB of RAM. Internet Explorer 1 required 8MB of RAM in 1995; Internet Explorer 6 in 2001 required 16MB. This jumped significantly in 2006, when Internet Explorer 7 required 64MB. Another significant jump came with Internet Explorer 8, which wanted 512MB on Vista, and again with Internet Explorer 10 demanding 2GB.
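

Here is a small sketch pulling those requirement figures together, plus a rough per-tab estimate derived from the Tom’s Guide table above; the per-tab number is my own simplification, since real browsers share a great deal of memory between tabs:

```python
# Minimum RAM requirements quoted above, and how quickly they grew.
netscape_1_mb, netscape_6_mb = 4, 64      # 1994 vs. 2000
ie1_mb, ie10_mb = 8, 2048                 # IE 1 (1995) vs. IE 10

print(f"Netscape 1.0 -> 6.0: {netscape_6_mb // netscape_1_mb}x the RAM requirement")
print(f"IE 1 -> IE 10:       {ie10_mb // ie1_mb}x the RAM requirement")

# Rough marginal memory per extra tab, taken from the Chrome column of the
# Tom's Guide table above. It ignores memory shared between tabs and the
# browser process itself, so treat it as a loose average only.
chrome_10_tabs_mb = 952
chrome_60_tabs_mb = 3.7 * 1024
per_tab_mb = (chrome_60_tabs_mb - chrome_10_tabs_mb) / (60 - 10)
print(f"Chrome, extra RAM per additional tab: ~{per_tab_mb:.0f} MB")
```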


Why is this? The short, oversimplified answer is that the internet, and the code that runs it, is more complicated. In 1997, HTML 4 arrived alongside CSS, and it was downhill from there, with modern web browsers having to support streaming video, WebGL, XML documents and several other standards. In other words, we made the internet do more, so it needs more resources to run. Building all of this functionality in made browsers generally easier to use and more capable, but that, of course, again comes at the cost of resources.


So how does this all stack up historically? Are we really using that many more resources? Well, the answer wasn’t as clear as I originally thought.


To examine this, I picked a laptop from each time period and calculated rough percentages for the demands the software placed on the system.



Image credit: Wikström, Johan, CC BY 3.0 <https://creativecommons.org/licenses/by/3.0>, via Wikimedia Commons

IBM ThinkPad 360:

Released in 1994.

Max RAM: 20MB

Max HDD: 540MB


Resources used by Word 6.0: 4MB RAM, 25MB Disk Space, or 20% of the RAM capacity and roughly 5% of the HDD

Resources used by Netscape Navigator 1.0: 4MB RAM, 5MB Disk Space or 20% of the RAM capacity and 1% of the HDD




Lenovo ThinkPad X1 Nano:

Released 2021.

Max RAM: 16GB

Max SSD: 1TB


Resources used by Office 365: 4GB RAM, 4GB Disk Space, or 25% of the RAM capacity and 0.39% of the SSD

Resources used by Google Chrome: ~128MB RAM per tab (averaged), 100MB Disk Space, or 0.78% of the RAM capacity per tab* and 0.01% of the SSD

*It is not common, however, for a user to have just a single tab open in a modern web browser, so this percentage is often considerably higher. Even the worst-case scenario from the chart above (3.9GB across 60 tabs), though, is still only about a quarter of the RAM on a higher-end system. It would be more significant on a mid-to-low-end system.
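

For transparency, the percentages above are nothing more than straight division against each machine’s maximum specs, using the rough usage figures quoted earlier in this article rather than fresh measurements; a minimal sketch of the arithmetic:

```python
# Resource usage as a share of each laptop's maximum specs.
def share(used_mb: float, capacity_mb: float) -> str:
    return f"{100 * used_mb / capacity_mb:.2f}%"

GB = 1024  # MB per GB

# IBM ThinkPad 360 (1994): 20MB RAM, 540MB HDD
print("Word 6.0:       ", share(4, 20), "of RAM,", share(25, 540), "of HDD")
print("Netscape 1.0:   ", share(4, 20), "of RAM,", share(5, 540), "of HDD")

# Lenovo ThinkPad X1 Nano (2021): 16GB RAM, 1TB SSD
print("Office 365:     ", share(4 * GB, 16 * GB), "of RAM,", share(4 * GB, 1024 * GB), "of SSD")
print("Chrome, per tab:", share(128, 16 * GB), "of RAM,", share(100, 1024 * GB), "of SSD")
```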


What conclusions can we draw from this easily? Not many, as these statistics simplify away a lot of factors. It would appear, however, that our programs respect our advancement in storage media more than our RAM; or rather, our advancements in storage technology have outpaced our advancements in RAM. Perhaps an argument could be made that the computer that will show its age fastest is the one with the least RAM, since there are limits on how much can be paired with each chipset. Another point to consider is how much software the typical user actually has in active use at any given time. Granted, there are those of us with 40+ tabs, virtual machines, and various document and project editors open, but we are not the majority.


Wirth’s Law might not always hold, but there is merit to the underlying reasons it was proposed in the first place. We are asking our software to do more than it has ever done before, and computing tasks are growing more complex as end-users demand more of what is possible while, at the same time, the bar of entry, in terms of the knowledge required to do those tasks, keeps getting lower. The big question, of course, is: will it be worth it? Are the trade-offs worth the cost in performance? With the possibility that our CPUs won’t grow much more complex beyond 2025 as Moore’s Law winds down, is there going to be a renewed need for software optimization? Feel free to reach out to me on Twitter; I’d love to hear what you think.
