While many people have heard of Moore’s Law, which I’ve discussed in a previous article, fewer might know about the potentially even more important Wirth’s Law.

Wirth’s Law states that software is getting slower more rapidly than hardware is becoming faster. A real-world example of this is illustrated below:

In a 2008 article in InfoWorld, Randall C. Kennedy, formerly of Intel, introduces the term using successive versions of Microsoft Office between 2000 and 2007 as his premise. Despite the gains in computational performance predicted by Moore’s Law during that period, Office 2007 performed the same task at half the speed on a prototypical 2007 computer as Office 2000 did on a year-2000 machine.

Kennedy, Randall C. (2008-04-14). “Fat, fatter, fattest: Microsoft’s kings of bloat”. InfoWorld.

This is one of the reasons that the RAM that got humanity to the Moon couldn’t even load a single tab in Chrome. Of course, software development is more complex than such a direct comparison can capture, and some go as far as to call modern software ‘fatware.’ Have you ever stopped to think how much of the program in front of you, whether visible on screen or hidden within the code, is actually needed to do the job you are asking it to do? Wirth pointed to both of these, visible features and hidden machinery, as contributing to the expansion of software without a corresponding increase in function. Did the example above take into account any significant feature changes between those two versions of Office? One point worth mentioning, of course, is that some of those additional systems make the software accessible to a greater number and diversity of users. That means more people are able to access the benefits of a computer and, in a colder sense, more consumers for your product as a software developer.

Consider a basic computing task: word processing. The very first version of Microsoft Word came on a single 3.5″ or 5.25″ diskette. Microsoft Word 6.0 came on seven diskettes, and Word 95, 97, and 2000 each came on a CD. A modern Microsoft Office 365 install (admittedly containing Word, Excel, and PowerPoint) is 4GB. That is a significant evolution in the space required for an application that types words onto a computer. Of course it isn’t quite that simple, since the modern word processor does a few more things and has more features, but it is hard to imagine that the application truly utilizes all of the space it requires to its fullest potential. As an aside, OpenOffice is a 143.3MB install and LibreOffice, which carries its torch, is 332MB, which really makes you wonder what is going on under the hood of both products that the differences are so vast. I doubt SmartArt support makes up the difference. Part of that is likely going towards Microsoft’s efforts to make its software as easy to use for as many different people as possible; that functionality has to come at a cost in resources.
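To put that growth in perspective, here is a minimal Python sketch of the arithmetic; the diskette-era sizes are rough assumptions based on common media capacities of the time, not official Microsoft figures:

```python
# Rough install sizes, in megabytes. The diskette-era numbers assume
# ~1.44MB per 3.5" diskette and are ballpark estimates, not official figures.
install_sizes_mb = {
    "Word 1.x (1 diskette, est.)": 1.44,
    "Word 6.0 (7 diskettes, est.)": 7 * 1.44,
    "Office 365 (Word/Excel/PowerPoint)": 4 * 1024,
}

baseline = install_sizes_mb["Word 1.x (1 diskette, est.)"]
for name, size in install_sizes_mb.items():
    print(f"{name}: {size:,.1f} MB ({size / baseline:,.0f}x the original)")
```

Even with generous rounding, that is roughly a three-thousand-fold increase in footprint for the core task of putting words on a screen.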

Let’s examine another oddity, the modern web browser. Tom’s Guide did a great little comparison back in 2021 with the following results:

| | Google Chrome | Microsoft Edge | Mozilla Firefox |
|---|---|---|---|
| 10 tabs | 952 MB | 873 MB | 995 MB |
| 20 tabs | 1.8 GB | 1.4 GB | 1.6 GB |
| 60 tabs | 3.7 GB | 2.9 GB | 3.9 GB |
| 2 instances / 20 tabs apiece | 2.8 GB | 2.5 GB | 3.0 GB |

Compare that to Netscape Navigator 1.0, which required 4MB of RAM in 1994. Jumping ahead to 2000, Netscape 6.0 required 64MB. Internet Explorer 1 required 8MB of RAM in 1995, and Internet Explorer 6 in 2001 required 16MB. That jumped significantly in 2006, when Internet Explorer 7 required 64MB; it jumped again with Internet Explorer 8 requiring 512MB on Vista, and again with Internet Explorer 10 demanding 2GB.

Why is this? The short, oversimplified answer is that the internet, and the code that runs it, is more complicated. In 1997, HTML 4 arrived alongside CSS, and it was all downhill from there: modern web browsers must now support streaming video, WebGL, XML documents, and several other standards. In other words, we made the internet do more, so it needs more resources to run. Building in all of this functionality made the web generally easier to use and more capable, but that again comes at the cost of resources.

So how does this all stack up historically? Are we really using that many more resources? Well, the answer wasn’t as clear as I originally thought.

To examine this, I picked a laptop from each time period and calculated rough percentages for the demands the software placed on the system.

Wikström, Johan, CC BY 3.0 <https://creativecommons.org/licenses/by/3.0>, via Wikimedia Commons

IBM ThinkPad 360:
Released in 1994.
Max RAM: 20MB
Max HDD: 540MB

Resources used by Word 6.0: 4MB RAM, 25MB Disk Space or 20% of the RAM capacity and 4% of the HDD
Resources used by Netscape Navigator 1.0: 4MB RAM, 5MB Disk Space or 20% of the RAM capacity and 1% of the HDD

Lenovo ThinkPad X1 Nano:
Released 2021.
Max RAM: 16GB
Max SSD: 1TB

Resources used by Office 365: 4GB RAM, 4GB Disk Space or 25% of the RAM capacity and 0.39% of the SSD
Resources used by Google Chrome: ~128MB RAM per tab (averaged), 100MB Disk Space or 0.78% of the RAM capacity per tab* and 0.010% of the SSD
*It is not common, however, for a user to have just a single tab open in a modern web browser, so this percentage is often considerably higher. Even in the worst-case scenario from the chart above (60 tabs at 3.9GB), Chrome consumes roughly a quarter of the RAM on a higher-end system; it would claim a far larger share on a mid-to-low-end machine.
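For transparency, the percentages above are simple division; here is a minimal Python sketch of the same arithmetic, using the rough figures quoted in this article:

```python
def resource_share(app_mb: float, capacity_mb: float) -> float:
    """Return an application's footprint as a percentage of a machine's capacity."""
    return app_mb / capacity_mb * 100

# IBM ThinkPad 360 (1994): 20MB RAM, 540MB HDD
print(f"Word 6.0 RAM:    {resource_share(4, 20):.0f}%")                  # 20%
print(f"Word 6.0 disk:   {resource_share(25, 540):.1f}%")                # ~4.6%

# Lenovo ThinkPad X1 Nano (2021): 16GB RAM, 1TB SSD
print(f"Office 365 RAM:  {resource_share(4 * 1024, 16 * 1024):.0f}%")    # 25%
print(f"Office 365 disk: {resource_share(4 * 1024, 1024 * 1024):.2f}%")  # ~0.39%
print(f"Chrome per tab:  {resource_share(128, 16 * 1024):.2f}%")         # ~0.78%
```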

What conclusions can we draw from this easily? Not many, as these statistics simplify away a lot of factors. It would appear, however, that our programs respect our advancements in storage media more than our RAM; or rather, our advancements in storage technology have outpaced our advancements in RAM. Perhaps an argument could be made that the computer that will show its age fastest is the one with the least RAM, as there are limits on how much can be paired with each chipset. Another point to consider is how much software the typical user actively uses at any given time. Granted, there are those of us with 40+ tabs, virtual machines, and various document and project editors open, but we are not the majority.

Wirth’s Law might not always hold true, but there is merit to the underlying reasons it was proposed in the first place. We are asking our software to do more than it has ever done before, and computing tasks are growing more complex as end-users demand more of what is possible while, at the same time, the bar of entry in terms of the knowledge required keeps dropping. The big question, of course, is: are the tradeoffs worth the cost in performance? And with our CPUs possibly not getting much more complex beyond 2025 as Moore’s Law reaches its predicted end, is there going to be a renewed need for software optimization? Feel free to reach out to me on Twitter; I’d love to hear what you think.

For those unfamiliar, Moore’s Law is an observation that the number of transistors in an integrated circuit doubles approximately every two years.

By Max Roser, Hannah Ritchie – https://ourworldindata.org/uploads/2020/11/Transistor-Count-over-time.png, CC BY 4.0, https://commons.wikimedia.org/w/index.php?curid=98219918

In 2005, Moore stated that this projection could not be sustained indefinitely, and in 2016 the International Technology Roadmap for Semiconductors moved away from this style of road mapping. Moore further said that the law he helped develop would likely end around 2025.
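As a back-of-the-envelope illustration of what “doubling approximately every two years” implies, here is a minimal Python sketch projecting transistor counts forward from the Intel 4004’s roughly 2,300 transistors in 1971; treat the output as a projection of the law itself, not a count of actual shipping chips:

```python
def moores_law(year: int, base_year: int = 1971, base_count: int = 2_300) -> float:
    """Project a transistor count, assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / 2)

# From ~2,300 transistors in 1971 to hundreds of billions by 2025,
# if the doubling had held perfectly the entire time.
for year in (1971, 1985, 2000, 2016, 2025):
    print(f"{year}: ~{moores_law(year):,.0f} transistors")
```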

So what does this have to do with laptops and computers?

Simple. It shows that, at least for the majority of users, purchasing a brand-new machine based on CPU performance alone is unnecessary. One thing that has been made clear is that, outside of certain chip requirements like TPM 2.0 for Windows 11, some laptops over 12 years old are still fully capable of doing the tasks their owners require of them. That is before you introduce Linux into the equation, which further extends the usefulness of older hardware.

Even if you do require Windows 11 and need a TPM 2.0 chip to ensure it is officially supported, you are still left with five generations of CPUs that meet those requirements.

In recent years, one of the best things about CPU advancement has been power efficiency and the battery technology to support it. This is one of the reasons that laptops with 50Whr batteries can outlast predecessors that had 99Whr batteries. But how much better are our CPUs at handling modern tasks? I would suggest that, outside of a very small group of people, the majority do not benefit directly and immediately from the incremental chipset updates currently taking place, with the exception of video rendering technology (graphics cards), and even those advancements are debatable. We also have multiple cores within a single CPU socket which, if the software is built to take advantage of them, can lead to further performance gains, though not usually at the scale we’d expect of two cores doing double what a single core would. That is a topic for another day.
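For the curious, one common way to reason about why two cores rarely mean double the performance is Amdahl’s Law, which caps the overall speedup by the fraction of a program that must still run serially. A minimal sketch, assuming (arbitrarily, for illustration) that 75% of the work can be parallelized:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's Law: overall speedup when only part of a program parallelizes."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# With 75% of the work parallelizable, 2 cores give only a 1.6x speedup,
# and even 16 cores stay under 4x, because the serial quarter never shrinks.
for cores in (1, 2, 4, 8, 16):
    print(f"{cores:>2} cores: {amdahl_speedup(0.75, cores):.2f}x")
```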

Therefore it isn’t much of a stretch to say that buying a used computer or laptop is actually viable. This was further exemplified at the start of the COVID-19 pandemic and the resulting chip shortages: used laptops increased in value not only because the supply of parts to assemble new ones was depleted, but because older laptops were still capable of fulfilling their required role for many users. Again, there will always be exceptions for those who need one of the new features coming in the latest Intel or AMD chipset, but for people who need a reliable computer for email, coding, document production, and other tasks that older CPUs are more than capable of handling, used hardware makes computing more accessible to a wider audience and can help individuals take their next steps. I’ve had the pleasure over the years of reading many comments on my channel from people doing exactly this: buying, finding, or being gifted a cheap laptop to do the work they needed to do and move forward.

All this taken together, the final message is that the majority of people don’t NEED a newer computer; they might WANT one, though. That want could be based on a real or imagined need the new piece of technology solves, but having that choice at all is a privilege that consumers shouldn’t take lightly. I’ve been using my used ThinkPad X220 around the house since 2018, running Linux for a variety of tasks, and it continues to perform admirably. To see my journey of upgrades and mods, see the playlist below.

If you want to see how far your dollars can stretch entering the world of used, quality hardware, I suggest this fantastic ThinkPad Price Guide to get you started.

ThinkPad Price Guide V7