The untold law in software


In memory of Niklaus Wirth, who passed away last year, I want to shed light on an adage named after him: “Wirth’s law”, derived from his 1995 article “A Plea for Lean Software”. Although it is not a law in any strict sense, it captures an observation that still holds true today: software is getting slower more rapidly than hardware is getting faster.

It does not take long to scavenge through modern software and uncover the sheer number of resource-greedy, slow applications relative to the seemingly trivial objectives they try to fulfill. You have probably run into this already: in your text editor, a word processor, a web site, a chat program, or even games, which are supposedly renowned for taking optimization seriously. This growing prevalence of software pessimization arose from a culture of denial, ignorance, and blind adoption of new technologies. The usual coping mechanism is to invoke Moore’s law, even though Moore’s law never promised that software developers would turn increasingly powerful microprocessors into better-performing software.

Strangely enough, people of all backgrounds turn awfully defensive the second you raise performance issues, and splutter all kinds of justifications. A considerable number of people in the programming space discourage these discussions altogether, preferring mostly subjective topics such as code readability, or project-dependent ones like code architecture. Surprisingly, that same group cannot stop nagging about scalability, yet in their minds performance problems apparently do not scale. There is little push-back against bad development practices that hurt software performance. Casey Muratori is the only one who comes to mind for his invaluable efforts to spread awareness of the potential of modern computers while debunking those excuses. Unfortunately, several misinterpretations and speculative conclusions emerged in response to Casey. One such misinterpretation is accusing him of suggesting that we write more assembly and fine-tune it, when in reality Casey only advises a change of habits to get a noticeable improvement in performance, all before doing any real optimization work.

There is an example I like to share of a poorly implemented feature that could have been performant had the authors put some thought into it first. Uncle Bob and Casey Muratori were debating “clean code” in a GitHub repo, taking turns editing a file, until Casey noticed a slowdown in typing speed. The culprit was the emoji picker, which performed a backward search for a : on each keystroke and only aborted when it reached the beginning of the line. Emoji names are whole words with no spaces in between, so the picker could simply check whether the word at the cursor starts with a colon and match it against an emoji table; it takes a minute to implement. GitHub took a different approach entirely and stripped the emoji picker from the file editor (it is still present in the PR and issue text boxes). It is embarrassing to witness a product owned by a trillion-dollar company manage to drastically slow down a text box and then fail to fix it. Text editing is a solved problem; it was solved decades ago and has only received enhancements since. There is absolutely no room to tolerate bad performance here.
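
To illustrate, here is a minimal sketch of the fix described above, written in C (GitHub's editor is of course JavaScript, so this is only the idea, not their code), with a tiny stand-in emoji table; the names emoji_lookup and emoji_at_cursor are mine, not GitHub's. The point is that the per-keystroke scan stops at the nearest whitespace instead of walking back to the start of the line, so its cost is bounded by the length of one word:

    #include <ctype.h>
    #include <stddef.h>
    #include <stdio.h>
    #include <string.h>

    /* Tiny stand-in for the real emoji table (the real one is far larger). */
    static const struct { const char *name; const char *glyph; } emoji_table[] = {
        { "smile",  "😄" },
        { "rocket", "🚀" },
    };

    static const char *emoji_lookup(const char *name, size_t len)
    {
        for (size_t i = 0; i < sizeof emoji_table / sizeof emoji_table[0]; i++)
            if (strlen(emoji_table[i].name) == len &&
                memcmp(emoji_table[i].name, name, len) == 0)
                return emoji_table[i].glyph;
        return NULL;
    }

    /* Called on every keystroke. `line` is the current line and `cursor` is the
     * index just after the character that was typed. Because emoji names contain
     * no spaces, we scan back only to the nearest whitespace, never all the way
     * to the start of the line. */
    static const char *emoji_at_cursor(const char *line, size_t cursor)
    {
        size_t start = cursor;
        while (start > 0 && !isspace((unsigned char)line[start - 1]))
            start--;

        if (line[start] != ':' || cursor - start < 2)  /* word must begin with ':' */
            return NULL;

        return emoji_lookup(line + start + 1, cursor - start - 1);
    }

    int main(void)
    {
        const char *line = "clean code :rocket";
        const char *hit = emoji_at_cursor(line, strlen(line));
        printf("%s\n", hit ? hit : "(no match)");
        return 0;
    }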

In that same exchange between Uncle Bob and Casey, Uncle Bob admitted something worth noting:

Bob: In my work I don’t care about nanoseconds. I almost never care about microseconds. I sometimes care about milliseconds. Therefore I make the software engineering tradeoff towards programmer convenience, and long term readability and maintainability. This means that I don’t want to think about the hardware. I don’t want to know about Ln caches, or pipelining, or SIMD, or even how many cores there are in my processor. I want all that abstracted away from me, and I am willing to spend billions of computer cycles to attain that abstraction and separation. My concern is programmer cycles not machine cycles.

Uncle Bob displayed his disinterest in hardware; he wants nothing to do with it. Many engineers share this mentality of distancing themselves from how computers operate in the name of good practice (sigh). Viewing hardware as an obstacle only evolves into crippling fear, and it limits your abilities as a developer. Software runs on computers, no matter how much you want to bury that reality under the rug of abstractions.

Due to the lack of performance education, loads of misinformation circulate on the internet. Here are a few claims I wish to address and disprove:

  • Compilers produce the same result regardless of how the code is written

    Firstly, which compiler? Secondly, compilers optimize behavior, not intent; their output is a reflection of what you feed them. Sometimes you have to outsmart the compiler to avoid cases where it would otherwise generate sub-optimal machine code (see the sketch after this list).

  • Premature optimization is the root of all evil

    Premature judgement about “premature optimization” is a rotten mindset. I can safely say that anyone who confronts you with this card heard it from a clueless person and echoed it back at you. Real premature optimization is aimlessly trying to improve the performance of code without solid proof that doing so helps at all.

  • Warm-up phase

    There is a commonly accepted idea that programs take time to launch. Python and NodeJS developers in particular have accepted this as a fact of life. To them, it is normal for a program to take hundreds of milliseconds, or even seconds, before it starts doing anything useful. For reference, Gentoo's Portage tools are written in Python, and running emerge --version takes over 4 seconds; the Firebase CLI is even worse, to the point that curling the API by hand would be faster. We let this happen by using scripting languages to write complex systems.

  • But network speed is our bottleneck

    I doubt it. Home internet is fast now and latency is lower than ever, so this bottleneck is most likely artificial. I blame client-side rendering (CSR), a plague in the web world with fictional gains. Instead of retrieving a web page's content in one go, some web genius thought it was a good idea to bombard the server with dozens of requests and let a frontend framework reconstruct the page:

    • GitHub sends 140+ requests to display the home page.
    • LinkedIn sends 400+ requests to display my feed, and it takes forever.

    If you live in CSR delusions and use Reddit for whatever reason, please consider old.reddit.com over reddit.com every time you visit the site. I also recommend listening to what Casey has to say about I/O-bound issues.

  • Think from a business perspective

    No thank you, I will think from the perspective of a user with technical knowledge. I really do not care about your business, especially when shipping something respectful of my hardware takes minimal effort. Deadlines are not an excuse for bad performance. Am I supposed to accept that Discord runs poorly because someone had to meet the deadline for the “super reactions” feature? Most software solutions are reskins, solving already-solved problems inadequately, on repeat. Businesses simply do not prioritize or dedicate resources to optimization or performance analysis because they do not understand how much it impacts their users' experience.
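
Coming back to the compiler point from the first bullet, here is a minimal C sketch of my own (not from any of the sources above) showing how the same intent, written two ways, constrains the compiler differently. It illustrates the general idea, not the behavior of any particular compiler:

    #include <stddef.h>

    /* The compiler must assume dst and src may alias: a store to dst[i] could
     * change src[0], so src[0] is typically reloaded from memory on every
     * iteration. */
    void scale_reload(float *dst, const float *src, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            dst[i] *= src[0];
    }

    /* Hoisting src[0] into a local spells out the intent: the factor never
     * changes. The compiler can keep it in a register and vectorize freely. */
    void scale_hoisted(float *dst, const float *src, size_t n)
    {
        float factor = src[0];
        for (size_t i = 0; i < n; i++)
            dst[i] *= factor;
    }

Whether the first version actually runs slower depends on the compiler and flags (some optimizers insert a runtime overlap check before vectorizing), which is precisely why the claim that code shape does not matter falls apart.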

Software development is regressing. It has become harder to justify PC or phone upgrades when the available software does not even attempt to take advantage of the hardware in the first place. Performance is a budget, mostly traded away for laziness or wasted through incompetence.
