The Worst Enemies Of Progress (1/5)

Photo by Louis Reed on Unsplash

Most of us grew up with the notion that the world is constantly moving forward. Developing. Progressing. Out of the dark ages into the age of enlightenment. Progress is good. New developments are good. Good for business, good for our living standard. Right?

Software, too, is progressing. Every few months there are new development models, new programming languages, new frameworks, new components. Software is built out of those. Other software is built out of that software. And then rebuilt, using newer components, because newer is better. And then rebuilt again, using still newer components. And then…

Oh wait. Are we still talking about progress?

Perfectionism

I don't know about other software developers, but from time to time I have reached the point where I had built a large and complex piece of software with the resources and tools available to me at the time, and it took me so long to develop that by the time I was done, it was technically obsolete, because by then there were newer tools and resources I could have used.

The software was done, and in theory I could sell it, and reap the benefits. Or, because I am a perfectionist, I could re-build it using newer technologies — nicer, better, shinier. But by the time I would be done rebuilding it, there would be even newer tools and resources… Should I sell then, or should I re-build it using those? And when should I stop?

Or rather…
- will I ever be done with my product?
And the corollary:
- does perfect and up-to-date software really exist, and if so, who the heck makes it?

Is, then, perfectionism the worst enemy of progress, in our rapidly shifting software-internet-5g-turbocharged world? How do we get out of the perfectionism trap? Or, how does the industry ever finish any projects?

The Pareto Principle

Enter the 80/20 rule, also known as the Pareto Principle.

The interpretation of the rule differs from one company to another. Traditionally, in large companies (some of which develop ubiquitous operating systems) the interpretation in question is, "You can achieve 80% of the intended result by investing 20% of the effort".

The corollary of that interpretation is, to achieve the remaining 20% of the intended result, you need to invest the other 80% of effort. And “business don’t play that.”

Which often enough results in released products ~80–90% ready, because it simply doesn’t pay off to invest more effort to make the product “more ready”.

Unfortunately, the final 80% of the effort is normally testing and debugging, which should give you an idea about the quality of many software products out there.

Since most software out there relies on previously created software libraries, or tries to stay compatible (often even bug-compatible) with already existing software, the application of the 80/20 rule results in "progress" being made at the cost of quality: the bugs of every product built on top of an existing buggy product compound, and the end product becomes progressively worse as time goes on. (Unless, of course, you throw everything away and start from scratch. See Perfectionism…)

With this in mind, the Pareto principle can be easily converted into a weaker form of Sturgeon’s Law — at least 80% of everything we produce in the name of Progress is crap.

This has a few neat side effects for business, though: It is easier to sell something new by claiming that it is better. And it is the truth, too — after all, there is still 80% of effort left to perfection! You just need your marketing department to package that message nicely enough for the consumer.

Is, then, the Pareto Principle the worst enemy of real progress?

Well, look at it from the programmers' perspective: what point is there in going beyond 80% if you are going to be rewriting your product in the newest, bestest computer language half an hour from now?

Fashion

What has fashion to do with computers? Well, that's easy. When my grandfather was a programmer ( ; ), COBOL and LISP were "in". The day before yesterday, C/C++ was hip. Yesterday, Java was cool. Five minutes ago, Dot-HET was "dah coolest evah", and "java suxxorz". Five minutes from now, it will be something else. Scala? (Who even remembers Scala?) Clojure? TypoScript? Kotlin? Julia? Maybe finally one of those cool new functional programming languages that have been just around the corner for the last twenty-five years or so.

“But wait”, you’ll say, “this is progress!”

Erhm… is it?

Let's look at Java for a moment (or C#, if you prefer; it doesn't matter). Its main advantages: cross-platform, ubiquitous computing, byte code, just-in-time compilation, a deep, mature code base…

Did you know there is another language that has all those features, and some more? A language that was made to program AI in, partially because it can easily parse and write itself: its code and its data have the same syntax. It happens to be both easily human-readable and easily machine-readable, and in some people's opinion it does a better job at both than XML.

It must be a pretty recent invention, right? After all, AI is all the rage now. What is that new wonder language’s name?

LISP. It was first specified in 1958. That's right, that LISP that your grandma customized her favorite text and code editor in, with code completion and syntax highlighting. I am talking about EMACS — an editor built in 1976 that quite a few people still use to write code in, because — thanks to Emacs Lisp — it has all the features you would expect from a modern IDE, like Eclipse or Visual Studio. (It was just about as slow back then, too.) Oh, and it had darkmode. (Every editor had darkmode back then. And only darkmode. I miss my amber monochrome.)
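That "code and data have the same syntax" point is easy to make concrete. Here is a minimal sketch in Python rather than Lisp itself (the nested-list notation and the eval_expr() helper are invented for this illustration): the program is literally a list, so the very same structure can be walked as data or evaluated as code.

```python
# A minimal sketch of Lisp's "code is data" idea, written in Python.
# The expression below is just a nested list: ordinary data you can inspect,
# transform or generate, yet eval_expr() can also run it as a program.
# (eval_expr() and the operator set are made up for this illustration.)

def eval_expr(expr):
    """Evaluate a tiny Lisp-like expression given as nested Python lists."""
    if isinstance(expr, (int, float)):      # a number evaluates to itself
        return expr
    op, *args = expr                        # ["+", 1, 2] -> op "+", args [1, 2]
    values = [eval_expr(a) for a in args]   # evaluate sub-expressions first
    if op == "+":
        return sum(values)
    if op == "*":
        product = 1
        for v in values:
            product *= v
        return product
    raise ValueError(f"unknown operator: {op}")

program = ["+", 1, ["*", 2, 3]]             # (+ 1 (* 2 3)) written as a list

print(eval_expr(program))                   # prints: 7   (the list, run as code)
print(len(program), program[0])             # prints: 3 + (the same list, read as data)
```

In Lisp the parenthesized list is the language's native syntax, which is exactly what makes it trivial for programs, and for editors like Emacs, to read, transform and generate code.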

For more recent examples: in C/C++ you can do everything you can do in Java, with less text. Dot-nyet's C# is a weird mix of Pascal (another flashback from the 1970s), C++ and Java, with some "novel" concepts, most of which you can easily replicate using C++.

Oh, and don't even get me started on graphics. It took less code to open a dialog window in assembler on the AMIGA in 1992 than it takes to open the same pop-up in most Microsoft languages or Java today. "High-level"… This is why we are delegating all our GUI needs to browsers nowadays.

If you sit down and compare the features, you will find out that the capabilities of most high-level and/or object-enabled languages, old and new, are very similar, not to say almost the same. Which is why many people still use LISP, and C/C++, and, well, COBOL. (There are courses for new COBOL programmers now, because somebody has got to maintain that 50-year-old financial code running at your bank. Because banks mostly don't do 80/20. Because for them, that way of saving money turns out to be too expensive.)

The same is true not just for programming languages, but also for other technologies. Just look at how popular technology buzzwords change over time. For example, if mentioning AI tripped you up about the language above… guess what, this is not the first time AI has been a buzzword:

A graph of word usage over time for the acronym AI and the phrase Artificial Intelligence, peaking in the 1970s, the 1990s, and today

For those of you who still have a wetware long-term memory though, just remember the technology buzzwords from ten, five and one year ago. How many of them are still there?

“But what’s so bad about this?” you might ask, “isn’t fashion what’s keeping the industry alive? People can sell new things!”

Sell things they do. But we are looking at progress now. As in, what brings our knowledge and our technology forward. And "new things" that are created this way don't: remember, software is built on other software, just like in science, where theories are built on other theories. If people don't invest in projects or buy products unless those are based on the most current, most hip library or language, it doesn't really matter how many projects you finish: you will never sell. Or, if you sell, it is going to be an unfinished product.

It is like trying to build a house where, every time you want to build the second floor, somebody says "but we have a better material for the first floor now, let's tear it down and build it out of that": the second floor will never be built. You are just doing the same thing over and over again. That's not progress, that's marching in place.

But why do people keep doing that? Why do we switch programming languages, libraries, technologies? Why does this fashion-thing exist in the software world — is it a matter of tastes, or is there something else in play?

Well, we did just talk about sales, didn’t we?

Marketing

Fashions come and go. It seems so natural: there are dress fashions, health fashions, diet fashions, so why not software fashions too?

Well, we know why not: it would be nice to actually build your house past the first floor. (Unless you are leaving that rebar sticking out for tax purposes, like they do in parts of Spain.) A better question is, are fashions “natural”, and how do they start?

Well, I can't say how a dress fashion starts. None of my outfits are concerned with fashion in the least; they are concerned with placing the things I need to carry with me in places that don't annoy me while walking or working.

I can tell you, though, how fashion in software starts. It starts with hype.
Now, of course, the hype doesn't come from nowhere.

Remember Java? It was all the rage when I was just starting to discover the internet. “The rage” was started not by a community of developers, or researchers at a university, or anything like that. It was started by Sun Microsystems — back then, an enormous hardware, software and service company. (That was before Oracle bought Sun for their hardware and then quietly dissolved almost all of it, but this is a story for another time). Just as, about ten years later, Microsoft tried to jump on the same bandwagon with .NET. Both of those “hypes” were generated and pushed by the Sales/Marketing divisions of those companies.

Another fashion from around the same time was XML. This one was a very different kind of fashion, since it, from all I can see, started out from the W3C creating XML as a standardized replacement for HTML. It was a free text markup language, backwards-compatible with HTML and extensible, too. It is quite good in that particular role, but it turns out the idea itself was a bit too idealistic: lay users making websites don't really care about the semantic cleanness of their pages, they care about writing content and putting it into some shape. Which is why HTML5 dropped the XHTML idea again, and people so often ignore the whole HTML shebang altogether and use a content management system à la WordPress, with a visual editor or simply Markdown for their text instead. Those trends were obvious all the way back in 2000, but people hyped up on XML wouldn't listen. (This, too, seems to be a recurrent theme, recently seen with "crypto".)

Those same people hyped up on XML also started to use it for pure data — and I can't quite figure out who started this, or why. It wasn't at all an obvious choice, since better formats had existed since 1985: if libraries and tools for ASN.1 had been as widely available as those for XML, its perceived complexity and binary nature would not have been an issue, as Google's Protocol Buffers later demonstrated. Nor is it difficult to design a better format for that purpose, text-based or binary — which the slow switch to JSON for almost everything except configuration files (which are switching to YAML) quite clearly demonstrates. As for the alleged equal readability by machines and humans — XML is equally bad at both. They could have just used the then 30-year-old LISP instead.
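To make the "pure data" point a bit more concrete, here is a small Python sketch (the record and its field names are invented for the example) pushing the same data through the standard library's JSON and XML modules: JSON round-trips numbers, strings and lists with a single call, while the XML version has to be assembled tag by tag and later re-interpreted by hand.

```python
# The same record as JSON (one call) and as hand-built XML, using only the
# Python standard library. The record and its field names are invented.
import json
import xml.etree.ElementTree as ET

record = {"name": "Ada", "id": 42, "languages": ["Lisp", "C++"]}

# JSON: numbers, strings and lists survive the round trip unchanged.
as_json = json.dumps(record)
assert json.loads(as_json) == record

# XML: everything becomes tagged text; whoever reads it back has to rebuild
# the structure and the types themselves.
root = ET.Element("person")
ET.SubElement(root, "name").text = record["name"]
ET.SubElement(root, "id").text = str(record["id"])
languages = ET.SubElement(root, "languages")
for lang in record["languages"]:
    ET.SubElement(languages, "language").text = lang
as_xml = ET.tostring(root, encoding="unicode")

print(as_json)   # {"name": "Ada", "id": 42, "languages": ["Lisp", "C++"]}
print(as_xml)    # <person><name>Ada</name><id>42</id>...</person>
```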

Somehow, this less-than-ideal idea snowballed, and one of the sentences heard most often at the beginning of the XML hype was "if all applications can speak XML, they can all exchange data with each other!". Which, as you probably know by now, is neither true, nor does it make any sense, since there is no way in XML itself to assign any intrinsic "meaning" to a tag: an application doesn't know whether a tag "p" stands for paragraph, person, parity or pound, and neither is there a way to declare such meaning in the underlying schema documents (though this might eventually evolve, but so it could with any other hierarchical(-ish) extensible language). Which makes XML no different from a binary-only format like Protocol Buffers.
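The "no intrinsic meaning" point fits in a few lines of Python as well (both snippets are invented examples): two documents use the tag p for entirely different things, and all the parser can hand an application is the tag name and some text. Any meaning has to be agreed on out of band, exactly as it would be with a binary format.

```python
# Two XML documents use the same tag <p> for completely different things.
# The parser reports only the tag name and the text; the "meaning" of <p>
# is not in the document at all. (Both snippets are invented examples.)
import xml.etree.ElementTree as ET

paragraph = ET.fromstring("<p>It was a dark and stormy night.</p>")
person = ET.fromstring("<p>Grace Hopper</p>")

for element in (paragraph, person):
    print(element.tag, "->", element.text)
# p -> It was a dark and stormy night.
# p -> Grace Hopper
```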

The sentence, though, was marketing. Not to the consumer, mind you — back then, a consumer was unlikely to encounter XML anywhere but as XHTML on a web page. But enough people who invest in things don't actually understand them, as you could nicely see during the bursting of various bubbles (for example, the DotCom bubble). And they bought shares in new and promising buzzword startups, thinking, "Yey, now we can invest in companies making programs that can understand everything!". Some of those companies and investors actually survived and made money. But so do, in general, the top levels of any pyramid scheme (which, as you probably know, generates no additional value whatsoever, working purely by making the last people to buy in the suckers).

In all three cases there were prior, superior alternatives with existing infrastructure to invest in, and extend. You know, actual progress. And the only reason that didn't happen was marketing.

Which lets us conclude: marketing is not a friend of progress either.

But the marketing division doesn’t just go out there and start promoting things because they want to, either. Neither is it a mere coincidence that marketing buzzwords are parts of startups’ business plans. What is the reason for marketing? Is there maybe an even bigger enemy of progress lurking in the background?

Let’s find out … in the next article.
