Raison d'être



It's a French term whose literal translation is "reason to be": the claimed reason for the existence of something, its ultimate purpose. While humans are born without one (don't argue with me, argue with Sartre), this does not hold for technology.

Why Does Technology Exist? #

Different technologies, projects, and more are made with a purpose in mind. Yet an important question arises: does that original purpose matter? Roland Barthes explored this question for literature, and his answer applies here too. The moment the work departs the mind of the author, it gains new meaning: just like the author of a book must (metaphorically) die in order to liberate the text, so too must an author of software (or hardware, or a whitepaper...) (metaphorically) die to liberate the produced technology. To do any less is a disservice to everyone who interacts with it.

Wait, stop. If I write a thing, why should I not have absolute control over it? After all, no one else has put their time, sweat, and blood into this thing. If they wanted to, they could always simply fork it, mold it to their desires, and be as well-off as me.

Any personal project will inevitably be abandoned. The likelihood of a single-user project being worthwhile in terms of the time spent on it (let's ignore, for now, the version where it was done just for fun) is very low. So much so that when a single-user project does pay off, it tends to suggest the technology is half-baked. One easy way to ethically justify the energy put into the work is to increase its usefulness to a multitude of people. Hundreds of hours spent by one person to save that same person ten minutes is a poor trade; a few hundred extra hours spent by one person to save millions of people days of their time? That's a different matter.

Once the work is available, however, there is practically no point in rewarding the author further. Future developments may never come, but the cost-benefit threshold has already been met.

These are all contradictory opinions, different worldviews. The point is that the interests of the users, the author, and the work itself are in direct competition with one another.

The users want their use-case to be met, and to be met as pleasantly as possible. This includes regular users, but also corporations. Because their use-case needs to be communicated and implemented, a different way to frame their desire is through the word "control". Users want control over what the technology is and does, and as a side-effect, control over the time and effort provided by the author. This even applies to contributors and forks: contributing is a way to exercise perfect control over the way something is implemented, a prerequisite to ensure your desire is not misunderstood.

The technology "wanting" something is of course a separate concern: it is not a subject, but a being-in-itself. That said, we can analyze it as a creative work, hence the earlier mention of the death of the author. Under such a condition, the work wants to be free from its author's intent. To be interpreted and used in unfamiliar and novel ways. In short: freedom, emancipation, the removal of limiting factors. The work wants to become a platonic object. From the perspective of a "finished work", this may be possible, but is a technology ever truly finished?

Technology is a reflection of the world, and the world changes around it. In 1987, a never-before-seen language was developed by a linguist. It allowed one to run dynamic scripts as in a shell, while integrating the most common shell task (text processing) directly into the language itself rather than delegating it to external tools. The powerful regular expression support, the ability to easily run the code, the parsing capabilities, and the intention to be a unified UNIX scripting language quickly made it rise to the top. By 1998, it had earned the moniker "the duct tape that holds the internet together". Perl, created by Larry Wall, was the language to know. The problems it set out to solve (integrated text processing, a more complete universal shell and scripting language for UNIX, high-quality CGI) remain unsolved. And yet, Perl is "out"; the world has changed, and has therefore moved on from Perl.

No matter how perfect a piece of technology is, it will need changing as the world around it changes. And so the platonic ideal for a given piece of work will change over time. Consequently, what the work truly wants is freedom from any specific author. To be a constantly evolving, yet perfect, codebase. This can't happen, because the numerous humans who would need to work on it are fundamentally imperfect. As such, there are two ways to try to do justice to a technology: either finish it at the highest quality possible, accepting its eventual death, or maintain an iron grip on quality while still having hundreds of volunteers continue to work on it. In short, a work wants control over its own destiny, without being able to seize it.

Speaking of authorship, what does the author want? People make things for all sorts of reasons. Some want to get bought out, some do it for the portfolio, some do it for the love of what they do, while yet others are experimenting with new ideas. There is no universal desire of the author. The one thing an author almost never wants is to lose control. Someone doing it for their portfolio only loses if it disappears. If it's not you that gets bought out, you've gained nothing. Even publishing software (the most ephemeral technology) anonymously is itself essentially a bid for control (one that always wins). After all, you decided on the rules, and no one can change that, or at least not easily. Satoshi Nakamoto certainly held a tight grip on his situation.

This is all a roundabout way of explaining that the desires of the three parties are contradictory precisely because control over the technology and its future is what they all want, each for different ends. So what, then, is the "reason to be" of a given piece of technology? Is it what the author wanted for it? Is it what the technology could be, or become? Perhaps it is how the users could find use in it? Let's talk about Python.

Python #

In the mid-1980s, a language was created to replace BASIC. Built to be a teaching language, easy for beginners, it included a powerful collection of five data types, declaration-free strong typing, no limitations besides how much RAM you had, and nesting by indentation. Guido van Rossum worked on it for a few years! No, no, it's not Python; it's called ABC, and it came out of academia.

Van Rossum was inspired by ABC, but didn't like some of its parts. For example, it was monolithic, and so could not gain support for things like… files, UI, and networking. Did you know that Python, to this day, bundles a separate programming language? Indeed, Python includes tkinter, its de-facto GUI. This GUI is implemented by wrapping Tk, a windowing toolkit. That toolkit is driven through the Tcl language, and therefore, to wrap around it, Python also bundles Tcl itself. Python was, from the start, built to be highly extensible via modules, no doubt because of this formative experience. Python is often seen as a "batteries included" language precisely because of the many modules it bundles, and how useful they are. Put a pin in that for now.
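Don't believe me about the bundled language? Here's a minimal sketch you can try in any stock Python install: the stdlib's tkinter will hand any string you give it straight to the embedded Tcl interpreter, no window required.

```python
import tkinter

# tkinter.Tcl() starts the bundled Tcl interpreter without opening a window.
tcl = tkinter.Tcl()

# Hand raw Tcl source straight to that interpreter, bypassing Python entirely.
print(tcl.eval("info patchlevel"))  # the bundled Tcl version, e.g. "8.6.13"
print(tcl.eval("expr {6 * 7}"))     # "42", computed by Tcl, not Python
```

Every tkinter widget call is, under the hood, a Tcl command sent across this same bridge.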

This gives us a general idea as to what Python is to its creator. What do the users of Python look like, though?

Python is the most used language in the world, and it's still growing, at least per the TIOBE index. Perhaps, then, it is of little surprise that it's also the most studied. Its ABC heritage, rich standard library (keep holding that pin), and rich scientific ecosystem (numpy, symbolic processing, ML, cryptography…) ensure its continued popularity in those circles. Once a language becomes popular enough in academia, and becomes the easiest tool for doing applied science (especially AI), it explodes in business. This is what happened to Java before, and it's the position Python finds itself in now.

So the users of Python are students, researchers, security specialists, and businesses. What do they like about it? The speed at which they can prototype, the usefulness of the built-in behaviors, the rich ecosystem.

Due to the sheer momentum that Python has, the second option is available to it: to change as the world changes, and adapt so as to always be the platonic ideal of itself. This, of course, goes hand in hand with the people who change that world through their views, the users themselves, by maintaining the spirit of what's important to them.

In this case, the desires of the users and those of the technology, while not perfectly aligned, are also not that far off from one another. To double down on the rich standard library, to improve the ecosystem further (making sure it doesn't become legacy cruft, or at least isn't perceived as such, as Perl often is nowadays), to keep including ever better batteries. What do the authors of Python want? Well, Guido departed the project a while ago. We could look at what his actions were prior to then, but it's been a while, so let's not bother and instead look at what's been happening lately, since 3.8.

So we get a new PEG parser, some performance improvements, dictionary merging (x | y), timezone information in the stdlib. Pattern matching, union types (also x | y), the removal of distutils in favor of third-party tools!? TOML in the standard library, a bunch of new typing features, removal of crypt, removal of telnetlib, removal of cgi (too inefficient), removal of smtpd (use the aio version, more on this later). A new JIT?!
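For the flavor of it, here's a quick tour of a few of those additions in one sketch (needs Python 3.11+ for tomllib; the rest landed between 3.9 and 3.10):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib timezone data, added in 3.9
import tomllib                 # stdlib TOML parser, added in 3.11

# Dictionary merging with |, added in 3.9.
defaults = {"retries": 3, "verbose": False}
overrides = {"verbose": True}
print(defaults | overrides)  # {'retries': 3, 'verbose': True}

# Union types written with | instead of typing.Union, added in 3.10.
def describe(value: int | str) -> str:
    # Structural pattern matching, added in 3.10.
    match value:
        case int(n) if n < 0:
            return "a negative number"
        case int():
            return "a number"
        case str(s):
            return f"a string of length {len(s)}"
        case _:
            return "something else"

print(describe(-5))       # a negative number
print(describe("hello"))  # a string of length 5

# Timezone-aware datetimes from the stdlib, no third-party pytz needed.
print(datetime(2024, 1, 1, tzinfo=ZoneInfo("Europe/Amsterdam")))

# TOML parsing from the stdlib.
print(tomllib.loads("answer = 42")["answer"])  # 42
```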

We're getting many language features that push Python away from being a beginner language, signs of giving up on the ecosystem, and plenty of removals from the standard library, sometimes for odd reasons (so what if it's inefficient?). The direction of Python is directly opposite to that of most of its users. Remember that pin? They're taking the batteries out! (Ok, not all of them, but you see the point, right?)

Now, when I say "most of its users", there is a very specific set of users that is very happy with what's going on. These users are the new class of business users. If your AI product is written in Python (and it borderline has to be), every CPU cycle counts. So now Python has a new JIT. How does this happen?

Well, it's actually quite simple. When Guido left (even before then, but especially after), Python became a community project. "Oh, well that's good news, isn't it?" You fool! You absolute buffoon! When a project becomes a community project without firm stewardship, what actually happens is that those with the most resources get to make all the decisions. Coincidentally, "those with the most resources" will generally mean businesses. Many businesses. All the businesses. All of their requirements incur maintenance burden. Burden that can't be reduced by removing what they just put in, even if they did contribute the code (rare). Burden that will instead be reduced by removing those abovementioned batteries, bit by bit.

However, once a module is outside of Python core, it no longer has to abide by the standards of the language itself. There's nothing stopping, say, the only serious cryptography library (since that's not in the standard library) from switching over to a different language. Of course, this is purely hypothetical. It definitely doesn't require a version of that language that often can't be found in the distribution's package manager.

Now, I have nothing against any (well, most) of the stakeholders here. I fully understand why cryptography switched to Rust; they've made their position quite clear. The point is that this is a case where the desires of the authors win out, since the authors ultimately hold the control, but where the desires of the authors are essentially dictated by those with the resources to exert said control. Of all the users that got Python to where it is today, none were businesses, yet businesses are the ones making the decisions. And the needs and desires of those users suffer. The Python ecosystem is not getting better; it remains a trash fire. The prototyping usefulness is decreasing as the standard library shrinks. The ease of teaching the language is decreasing as it gets progressively more complex. All of these are tradeoffs, but it's always important to look at what is being traded for what, who benefits, and who gets the short end of the stick.

SQLite #

Have you heard about this little database system called SQLite? Yeah, it's on virtually every computer. Released in 2000, it is widely used and generally well-liked to this day. It has changed with the times, gaining new features that people immediately started using, like JSON support. What's different?
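That JSON support is a one-liner away from any Python install. A minimal sketch using the stdlib sqlite3 bindings (assuming your linked SQLite includes the JSON functions, which ship in default builds since SQLite 3.38; the table and data here are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
con.execute(
    "INSERT INTO events (payload) VALUES (?)",
    ('{"user": "alice", "action": "login"}',),
)

# json_extract digs into the stored document; no parsing step in Python needed.
row = con.execute(
    "SELECT json_extract(payload, '$.user') FROM events"
).fetchone()
print(row[0])  # alice
```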

SQLite is Open Source, but not Open Contribution. The authors maintain an iron grip on the sources, and do what they want with it. As a consequence, users who want something new out of it basically have to ask and hope. Oh, they can fork, of course, and they do. Turso forked SQLite into "libSQL". Dqlite is an "extension" of SQLite by Canonical. Neither is particularly used outside of its niche, and the motivations are fairly transparent. Ultimately, SQLite remains what's used, and the SQLite team decides what's in.

This of course means that the project's health is directly determined by how good and benevolent the project leaders are. Not all users are going to get served equally. But as long as it's done well enough, the standards can be kept up to par, and the project stays in as healthy a state as it can be.

Other technologies keep switching to SQLite, even! For instance, Podman used to use a Go-native key-value database called BoltDB; it has since switched to SQLite.

In a world of fickle abandonment of prior goals, SQLite has held steady. It's definitely not the solution to the problem I outlined at the start, but it is at least a solution.

Anyway, back to why tech is shit. Something something Boeing go crash.

The Hype Train #

While SQLite has held steady, Python exploded and continues to explode. This is because of the hype train. Let me outline the cycle for you.

A technology becomes hyped. Hyped investors tend to invest more in companies that use it. Companies using it now have statistically more cash, and therefore more resources. With those resources, they can influence existing technologies. They also hire more, meaning that more people will end up learning to use the technology. People want to get hired by companies that are doing well, and therefore want to manufacture experience with the hyped technology. To do this, they introduce it into their workplace, so they can put "experience with [hyped technology]" on their CV. Half the industry now uses the technology in question, and half the potential users are now familiar with it. You can't afford not to know it.

I lied to you before. I said Python became as popular as it is because it was taught at schools. This is certainly true, and a large source of its popularity. However, if you look at the numbers, they truly exploded once Machine Learning became hyped. How many cryptocurrency companies got big in the wake of NFTs? Thank fuck that died before it got this bad.

The true reason technology sucks is the hype train cycle (and the capitalist hell that empowers it). Opinions differ as to why some things become hyped and some don't. I think I've started enough fires today, so I'm going to abstain from that one… for now! 😇

Because if it weren't for this, we'd at least have gentler failure modes.


Thank you for reading my blog post. It was intended to be a less serious blog than Bunker Labs, and it is! Still, I started off a little philosophical, and ended… uh. It's 11pm, ok? Until next time!