Programming culture in the late aughts
Monday, November 28, 2022 :: Tagged under: engineering. ⏰ 8 minutes.
🎵 The song for this post is an accordion and guitar cover of Bella Ciao, by Tobias Kemerich and Betto Malheiros. 🎵
Sedat Kapanoglu did a write-up trying to answer the question "how is programming different than it was 20 years ago?", and I thought I'd take a crack at it as of ~12 years ago, when I entered the workforce as a professional programmer. Some of this bleeds into trends earlier than 2010, since I was still studying/eating as much programming material as I could in my last few years in school. I agree with Sedat in a few places; you can see the lobste.rs discussion here.

My best friend and roommate Saurya and me when we graduated, in 2010. I don't know what other photos to post for this, so I'll be indulgent and post retro photos from The Time.
Multicore still doesn't matter, while "async" somehow does. 12 years ago everyone was wondering how we'd program on multicore CPUs, since the Moore's Law days of automatic "transistor count" increases were coming to an end. There was a fair bit of academic ink spilled over how we were going to keep systems performant; people said languages without a great multicore story, like Python, Ruby, PHP, and OCaml, had limited lifetimes vs. those with better stories for it (Java was a player, though Erlang was originally revived for solving this problem rather than for its high-uptime or distribution stories).
Big evidence of this war was the disagreement over how to achieve concurrency "scalably": the rough consensus was that "locks don't scale," so everyone was looking for ways around locks. Bryan Cantrill had a great rebuttal to some of these techniques that I don't fully agree with, but that article has a ton of pointers to what other people were pitching.
"Few problems are like raytracing" which was considered "embarassingly parallel." At the time the worry was on using multiple cores on individual workloads. You can Google "lock free data structures" to see the kind of thing people were writing whitepapers on.
Oddly, while parallelism didn't turn out to matter much, concurrency did: datacenters, cloud compute, the explosion of clients via smartphones, and the realities of cell phone networks meant that we still had to turn our computing models inside out, but for the different use case of minimizing latency waits from blocking IO. That said, the biggest use case for non-mobile computing turned out to be something like raytracing after all: handling multiple, independent requests. That doesn't require parallelism, strictly speaking: Node and nginx were built on the wonders of a single-threaded event loop, since requests don't have to share state.
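As a sketch of that single-threaded model (generic Node-style code; hedged, this is not how nginx is implemented internally), here's a server where the slow IO is awaited rather than blocked on, so one thread can juggle many independent requests:

```typescript
// One event loop, no shared mutable state between requests, no parallelism.
// While one request awaits its "slow backend," the loop serves the others.
import * as http from "http";

const server = http.createServer(async (req, res) => {
  // Stand-in for a slow database or upstream call (hypothetical 100ms delay).
  const data = await new Promise<string>((resolve) =>
    setTimeout(() => resolve("hello from a slow backend\n"), 100)
  );
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(data);
});

server.listen(8080); // port chosen arbitrarily for the example
```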
"Full-stack" now includes a lot more pancakes. We've added layers and layers of abstractions, tools, and professionalism into every stage of "hosting a site." This is in part due to real factors related to adding hundreds of millions of people to the internet, giving them always-on supercomputers they check at several dozen times a day, common infrastructure and cloud platforms providing solutions for minutiae that used to gate out amateur players, and the explosion of VC/zero-interest capital in the last decade meaning a ton of companies were playing the "eat the world or die" game.
It's also, in my opinion, a result of the incentives on developers to brand themselves as expensive professionals by creating or becoming an authority on the Next Hot Thing; the predilection of people generally to believe simple narratives without measuring ("X is not web scale! Someone loudly said it on the internet!"); programmers naturally liking problems for their own sake and solving them even when they didn't have the problem in the first place; and companies acting irrationally based on aspirations that don't reflect their current reality (FAANG envy; think of how many Americans buy giant cars who don't need them).
12 years ago, even without AWS, you could get a Linode, host an app on its naked IP, and not be made to feel like you weren't "serious" about hosting your app. These days you almost always need to know a fair bit of Docker and Docker Compose, a lot of people want to Kubernetes, ELBs got replaced by ALBs + NLBs, which you gotta manage in VPCs, which you gotta manage through Security Groups, and your traffic gets routed through CDNs. Logs are structured and passed to a SaaS with a custom search syntax that takes tens of seconds to search them poorly, and they'll get lost in the noise of all those components. You'll spend a non-trivial amount of time devising systems for tracing requests through all these components.
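For what "structured logs" and "tracing requests" actually look like, here's a hedged sketch; the field names (requestId, route) and the idea of threading an id through every hop are illustrative, not any particular vendor's schema:

```typescript
// Structured logging: every line is JSON carrying a request id, so you can
// follow one request across the load balancer, the app, and the log SaaS.
import { randomUUID } from "crypto";

function log(fields: Record<string, unknown>): void {
  console.log(JSON.stringify({ timestamp: new Date().toISOString(), ...fields }));
}

function handleRequest(route: string): void {
  // In a real setup you'd read this from an upstream X-Request-Id header
  // so every component in the chain reports the same id.
  const requestId = randomUUID();
  log({ level: "info", requestId, route, msg: "request received" });
  // ... do the work, passing requestId to every downstream call ...
  log({ level: "info", requestId, route, msg: "request finished", durationMs: 42 });
}

handleRequest("/checkout"); // hypothetical route
```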
Rich client experiences became their own tech industry. A similar thing to the ops explosion above happened in the client world. Before, JavaScript wasn't as mature, so jQuery was cutting edge and rich-experience web clients were only possible after a lot of custom work. Examples were Google's "rich Gmail" and Google Wave. We all knew it was conceptually possible to make projects that didn't follow the "webpage -> link to another webpage" paradigm and acted more like "apps," but the amount of JavaScript required was unseemly, and server-side HTML was deeply ingrained in our brains.
I attribute this to a few factors, many in common with the ops explosion I listed above (developer nature and incentives, the explosion of clients, VC plays), but I'll give special mention to the iPhone, the growing influence of Apple and its product/design culture, and the subsequent smartphone proliferation.

Five years before that, freshman year in 2005. I was probably 3 lectures into CS15, which challenged me harder than any class I'd ever taken before. I thought "maybe… I could study computers!" I was reeling from disappointment that I probably couldn't take any real theatre classes that year, due to the weirdness of Brown's program.
Death of the protocol, then the API. Protocols were a lot more popular: yes, we had chat products like AIM and MSN and ICQ, but even with them it was way more possible to use Adium or Pidgin to speak to all of them. Remember: email is still, somehow, a protocol that anyone can speak! When I was at Google in 2012, one of the TGIF topics was the loss of support for CalDAV. Generally speaking, making your app speak via a protocol was a mark of good computer citizenship, since it was understood that custom clients and giving data to users were a path forward.
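To show how low the bar for "speaking a protocol" was, here's a hedged sketch of a toy IRC client: a raw TCP socket and a handful of text commands, which is exactly why anyone could write a client. The server, nick, and channel below are placeholders.

```typescript
// IRC is lines of text over TCP; a conforming client fits in a few lines.
import * as net from "net";

const socket = net.createConnection({ host: "irc.example.net", port: 6667 }, () => {
  socket.write("NICK retrobot\r\n");
  socket.write("USER retrobot 0 * :A tiny homemade client\r\n");
  socket.write("JOIN #late-aughts\r\n");
  socket.write("PRIVMSG #late-aughts :protocols were nice\r\n");
});

socket.on("data", (chunk) => {
  const line = chunk.toString();
  // Servers PING clients to check they're alive; any client must answer.
  // (A real client would buffer and split on CRLF; this is a sketch.)
  if (line.startsWith("PING")) socket.write("PONG" + line.slice(4));
  process.stdout.write(line);
});
```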
Eventually this got replaced by companies and proprietary APIs. Part of this was that we were going beyond the "computer basics" world of chat and documents, so there were fewer precedents for protocols (what was the interoperable protocol for maps? tweets? photo albums?); proprietary APIs filled that gap, which led to an explosion of mashups and possibility. Protocols allowed for custom clients and servers; APIs kept the servers and data extremely proprietary, but you could at least have custom clients. Anil Dash wrote a lot about this time period with The Web We Lost.
Over time, APIs waned in power and popularity too. I suspect the main reason is that, as tech companies became less about customer empowerment and more about being financial instruments, it simply didn't make sense to have an API unless you had to: it was expensive to run and maintain, it exposed your data, more clients limited your control (Twitter clients, for example, didn't serve ads), and it increased your liability (Facebook and Cambridge Analytica).
Protocols also suffered extra death because they have trouble keeping up with innovation: if Slack wants to add a feature, they can just add it. If you want to add the same feature to IRC or XMPP, you have everyone else complaining about how you did it and not updating their clients to support it, which locks out your users.
OOP vs. FP; Language semantics vs. language ecosystems. A bit of an extension of the "concurrency, parallelism, vs. async" point above: FP vs. OOP felt more like a battle that was being actively worked out, and people tended to argue over language features and semantics instead of ecosystems and hiring markets. Beating the Averages still had a ton of people believing That One Great Language could be the game-changer, and there hadn't yet been enough high-profile tech startups succeeding or failing to cement a story around language choice or paradigm.
The story that solidified is: Use Boring Technology. Use Java (or Python or Ruby or, if you really must, JavaScript. I feel like Golang has now ascended here). At the time, there was a lot more belief that A Different Language might be good to use commercially; today that's largely viewed as a sign of poor engineering leadership.
(Cards on the table: I think this narrative, while conventional, is bollocks. ITA was written in Common Lisp. WhatsApp and Discord are written in Erlang and Elixir, respectively. Twitter did great with Scala. And Ruby's weird semantics are credited for how one person was able to use it to make a world-changing framework; to call Ruby a "boring" choice now is a testament to how successful the right weird tech can be! "Boring" has a place, but I don't think that's everywhere. I write more about this here.)
What changed this was a whole lot of things, IMO:
- Many "weird tech" shops died. As did many non-weird-tech shops; it's startups, after all. But when weird-tech shops died, there was an easy scapegoat. Mundane tech or unambitious tech strategy never gets pointed to when those startups die.
- Professionalization, for lack of a better word: tech workers became a flatter workforce rather than a couple thousand creative weirdos. Imagine if all music went from high school garage bands (each with a name, a personality, and a "sound") to a flatter set of interchangeable studio musicians who graduated from music schools with a structured curriculum. The latter probably produces more predictable, very useful music, but its members are certainly less interested in expressing themselves and will take fewer risks.
- Anecdotal, and I say this without judgement: a larger percentage of us (because of the point above) just don't care to think about computers much. It's a bit ungenerous, but it's my observation: for a lot of professionals in computing, if you have to think about or understand something, it's Worse than the thing you can just intuit, which will be right 90% of the time. I hate when people say "Worse is Better" as if it's a law of nature and not a choice you're actively making; but whatever underlies that principle is in play here too, at least regarding language choice.
This last point dovetails well into…

Sophomore year. I quit my data structures and algos class, and figured CS was completely behind me. I picked it back up again in 2008, and picked it up hard. idk why I was allergic to looking normal.
Stack Overflow, Experts-Exchange, and more available community knowledge. Finding answers to your tech problems was way, way harder before Stack Overflow. The previous name in the game was Experts-Exchange (that hyphen was very important for the domain name), which had way fewer "experts" answering questions and, critically, was a pay site. To see the answers or ask questions of your own, you had to sign up. In college I learned to skip its results in Google because they were never actually useful, even if they were the most promising and numerous.
You'd have to trawl forum posts and mailing lists, and read man pages and manuals. There was a lot more experimentation, trial-and-error, and downloading/reading source. This sounds Cool and Hardcore, but it was usually pretty inconvenient.
Of all the products that shaped how we program computers, it's hard to overstate the impact of SO.
There are a few others, but I like where this list is!
Thanks for the read! Disagreed? Violent agreement!? Feel free to join my mailing list, drop me a line at , or leave a comment below! I'd love to hear from you!