I think there’s often a misalignment of incentives when annual perf reviews are judged on feature work delivered, not quality. Engineers who spend any time polishing or finding and fixing bugs wind up rated mid, while folks who quickly crank out mediocre or bad code that does something new are on a fast track for promotion. This creates a current in the whole org where PMs, engineers, managers, etc., are all focused on new things all the time. Any quality work has to piggyback on a new feature proposal to get traction, and the quality items in that project will never get cross-functional support like the new feature work does.
This resource allocation strategy seems rational though. We could consume all available resources endlessly polishing things and never get anything new shipped.
Honestly it seems like another typical example of the “cost center” vs “revenue center” problem. How much should we spend on quality? It’s hard to tell up front. You don’t want to spend any more than the minimum needed to prevent whatever negative outcomes you think poor quality can cause. Is there any actual $ increase from building software to a higher quality than “acceptable”?
> I've been tracking software quality metrics for 3 years
I don’t think you can draw conclusions from that short a period.
As a counterpoint: in the ’80s and early ’90s, my brain was almost hardwired to hit the hotkey for “Save” every few seconds while working, even though that could mean applications became unresponsive for seconds, because I didn’t trust the application to not crash while idle.
Yes, part of that is because applications nowadays rarely run out of memory, and likely don’t have code that tries to keep things running in low-memory conditions, but that’s not all of it. A significant part was that applications were buggy. (Indirect) evidence for that is that they also were riddled with security holes.
Where I’m at, needless complexity is forced upon us. At the same time, we are constantly pushed to deliver new capabilities on timelines that are dictated to us, devoid of any grounding in reality. There is no room to even have the conversation about proper design or solving the right problems. It’s all about hitting arbitrary dates with “features” no one really cares about, while ignoring the foundation it all has to sit on.
The more loudly someone speaks up, the faster they are shown the door. As a result, most people keep their head down, pick their battles carefully, and try to keep their head above water so they can pay the rent.
Complexity, state, and lack of thoughtfulness/design. When things are rushed they are lower quality, and nowadays almost everything is rushed. Also, time spent "pondering" outside of writing code can be robbed by constant mobile-device distractions way more than in decades past.
I'd challenge your assumption that quality is collapsing.
I remember the good old days when nobody unit tested, there were no linters or any focus on quality tools in IDEs, and the Gang of Four patterns we take for granted were considered esoteric gold plating.
Sure, memory usage is high, but hardware is cheap.
Shoulda posted your link as a link instead of hiding it in text where we can’t click on it. If your blog post doesn’t stand on its own without an explanation you should rewrite it.
My view is fairly simple - demand for technology is always increasing at a rate which far outstrips supply of _good_ engineers (by a significant factor). The lure of a well paid career tempts many to the world of software engineering even if they're not very good at it.
Look at the construction industry. Many buildings on this planet were built hundreds, sometimes a thousand or more years ago. They still stand today because their build quality was excellent.
A house built today of cheap materials (i.e., poor-quality software engineers) as quickly as possible (i.e., urgent business timelines) will fall apart in 30 years, while older properties will continue to stand tall long after the "modern" house has crumbled.
These days software is often about being first to market, with quality (and, cough, security) a distant second priority.
However, occasionally software does emerge as high quality and becomes a foundation for further software. Take Linux, FreeBSD, and curl as examples of this. Their quality control is a very high priority, and time has proven this to be beneficial - for every user.
I believe it's a mix of three factors: (a) lack of transfer of institutional knowledge, (b) weaker incentives for people to get better at fundamental skills, and (c) a rise in hotfixes as we deal with timescales that operate much faster, burn faster, and want to expand faster.
All of the above is multiplied 1.3x-1.5x by accelerating ways to get up to speed through iterative indexing of knowledge with LLMs. I believe we are reliant on those early engineers whose software took a while to build (like a marathon), not the short-sprinted, recyclable software we keep shipping on top of it. The difference is that not a lot of people want to be in those shoes (responsibility/comp tradeoffs).
> Big Tech is spending $364B on infrastructure instead of fixing the code
You mean CrowdStrike still crashes? Spotlight still writes 26TB every night? (Which only happened in a beta, AFAIK...) Of course they are fixing the code. Conflating infrastructure spending with neglecting the code is not helpful.
The bitter truth is that complex software will always contain some bugs; it's close to impossible to ship completely, mathematically perfect software. It's how we react to bugs, and the report/fix/update pipeline, that truly matters.
One possible factor is the proliferation of LLMs to write code for us. I noticed that a few versions after Jetbrains implemented LLM integration, the bugs in their products skyrocketed. Also, in my own job, it's often tempting to use LLMs where I really shouldn't, by which I mean where I can't easily check the code to ensure that it's fully and subtly correct. For example, if I'm working with an unfamiliar library I might ask an LLM what the incantation is to do something, and then check it by reading the docs of each thing that it references. I regularly find issues when doing so, but a different developer who didn't check these docs may miss this and ship a subtle bug.
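A hypothetical illustration of the kind of thing this catches (made up, not from any real incident):

```python
import requests

# The "incantation" an LLM will happily suggest for hitting an API:
resp = requests.get("https://example.com/api/items")  # hypothetical URL
resp.raise_for_status()

# Reading the requests docs reveals there is no default timeout, so a
# stalled server hangs this call forever. The doc-checked version:
resp = requests.get("https://example.com/api/items", timeout=10)
resp.raise_for_status()
```

Both versions pass a quick manual test; only one fails gracefully in production.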
You're seeing apps crashing, but apps have always crashed. What makes you think that software quality is collapsing? How do you know it's getting worse, not just staying the same?
It's a simple, timeless, inescapable law of the universe that failures, while potentially damaging, are acceptable risks. The Pareto principle suggests that addressing only the most critical 20% of issues yields a disproportionate 80% of the benefits, while chasing the rest yields diminishing marginal returns.
We're seeing bugs in bigger slices because technology is, overall, a bigger pie. Full of bugs. The bigger the pie, the easier it is to eat around them.
Another principle at play might be "induced demand," most notoriously illustrated by widening highways, but it might just as well apply to the widening of RAM.
Are we profligate consumers of our rarefied, finite computing substrate? Perhaps, but the Maximum Power Transfer Theorem suggests that anything less than 50% waste heat would slow us down. What's the rush? That's above my pay grade.
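For the curious, the 50% figure is just the standard circuit-theory result, loosely transplanted to computing here:

```latex
% Source V with internal resistance R_s driving a load R_L:
P_L = \frac{V^2 R_L}{(R_s + R_L)^2}
% Maximizing over R_L gives R_L = R_s, at which point the efficiency is
\eta = \left.\frac{R_L}{R_s + R_L}\right|_{R_L = R_s} = \frac{1}{2}
% i.e., at maximum power transfer, exactly half the power is waste heat.
```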
I guess what I'm saying is that I don't see any sort of moral, procedural, or ideological decay at fault.
In my circles, QA is still very much a thing, only "shifted left" for tighter integration into CI/CD.
Edit: It's also worth reflecting on "The Mess We're In."[0] Approaches that avoid or mitigate the pitfalls common to writing software must be taught or rediscovered in every generation, or else wallow in the obscure quadrant of unknown-unknowns.
The same way that nothing can be built anymore in America.
Every framework is like a regulation: something which solves an ostensible problem but builds up an invisible rot of inefficiency. The more frameworks, layers upon layers, needed to make an application, the slower it becomes; small errors are ignored, and abstractions obfuscate actual functionality.
Some of it is the fault of AI and the belief that software is super easy to create now and that maintenance won't be an issue in the future, a belief held mainly by folks with very little to no experience writing software who attempt it via the many AI-assisted ways available these days.
Then there's the shiny-object syndrome of humanity in general. Even if we just look at websites, they went through so many different cycles: plain HTML, Flash, everything built with bootstrap.css, then came the frameworks, then back to SSR/SSG, etc., etc.
Both of those are just symptoms of a larger disease, namely that enthusiasm in general has fallen. A lot of it has to do with how demanding day-to-day software jobs have gotten, or how financially unstable the younger generations feel, so they rarely set aside any time for creative endeavors and passion projects.
There are far, far too many sloppy devs who slip through the cracks. They never get the mentoring they need, nor are they shown the door.
All sense of teamwork was murdered about a decade ago by people with clipboards and other dead weight staff who don't give a rat's ass about anything.
Most devs under 30 don't have the same enthusiasm previous generations did because the opportunity being proposed just isn't the same. The room for creativity isn't there, and neither is the financial reward. Do more with less and these problems tend to go away.
I agree there is this problem, which is why I try to write software that is actually good (and use software that is actually good, if it is available). It is also one reason why I still use some DOS programs (even with emulation, which is slow, it is still better than the bad quality they have in newer computers).
I do not use any of the software mentioned in that article, and I also do not have that much RAM in my computer.
Short answer: because lock-in disables competition, and cloud-based business models enable almost perfect lock-in.
Software quality only matters when users can switch.
Well, Agile said that we don’t need testers (because everybody owns quality and slogans are magic). DevOps said we don’t have time for testers (because we reaaaally feel like shipping now). AI people said AI will do all the testing (because of course they did).
Nobody likes thinking critically and admitting that they haven’t achieved a responsible standard of care. If they aren’t forced to do it, why bother?
Speaking for myself, about my own software that I write alone: building new things is more exciting. Even setting all financial incentives aside, I just like building new things more than I like polishing old ones.
I could improve the quality infrastructure, write more tests and clean up the code, but the work is not as fulfilling.
I might be going further than most, but my personal take is that it happened when Woz stopped being involved: https://lists.sr.ht/~vdupras/duskos-discuss/%3CZ4p_GHsw5arWG...
The rest is just a downhill trend.
And not only this one: JetBrains consumes tons of RAM, Chrome too, and so on. When did we decide that resources are free?
Quality is not immediately economically useful in a way an average MBA would understand and be able to communicate to shareholders. Also, we have spent many decades wrapping layer after layer of complexity on top of the CPU. That is starting to show; nobody really understands what’s going on any more.
To me, a big reason is that people who don't understand how it works have a say in how it should be done.
It's not the case in traditional engineering fields: when you build a dam, the manager cannot say "hmm just use half as much concrete here, it will be faster and nobody will realise". Because people can go to jail for that. The engineers know that they need to make it safe, and the managers know that if the engineers say "it has to be like that for safety", then the manager just accepts it.
In software it's different: nobody is responsible for bad software. Millions of people need to buy a new smartphone because software needs twice as much RAM for no reason? Who cares? So the engineers will be pushed to make what is more profitable: often that's bad software, because users don't have a clue either.
Normal people understand the risks if we talk about a bridge or a dam collapsing. But privacy, security, efficiency, not having to buy a new smartphone every 2 years to load Slack? They have no clue about that. They just want to use what the others use, and they don't want to pay for software.
And when it's not that, it's downright enshittification: users don't have a choice anymore.
In the JavaScript world, the name of the game was always hiring and firing. Application quality, security, performance, accessibility, and other things are largely irrelevant until someone gets sued.
What’s interesting is that the result is immature developers. This becomes evident in that although the goal is rapid hiring/firing, which is completely hostile to the developer, the impacted developer is somehow convinced such hostility is their primary vector of empowerment. For example, if an employer mandates use of a tool to lower the barrier to entry for less qualified candidates, those candidates are likely to believe the tool is there primarily to benefit them. That makes sense if the given candidate is otherwise completely unqualified, but it’s nonetheless shortsighted and narcissistic.
As a result software quality degrades as the quality of people doing the work degrades while business requirements simultaneously increase in complexity/urgency to compensate.
Spotify and YouTube Music on desktop are so bad (especially on CPU) that I vigilantly shut them down the second I'm not listening to music, which isn't something I've done with regard to computer resources since I had a 10" Atom netbook with 2GB of RAM fifteen years ago.
I'm sure they're no better on my iPhone but I don't even have the appropriate tools to gauge it. Except that sometimes when I use them, another app I'm using closes and I lose my state.
There's no pressure to care. Most users can't tell that it's your app that's the lemon. The only reason I know anything about my MacBook is because I paid for iStatMenus to put CPU/RAM usage in the global menu bar, where it can quickly show me the top 5 usage apps.
This basic info should be built into every computer and phone.
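If you want the raw numbers yourself, they're only a few lines away; here's a rough sketch using the third-party psutil library (top 5 by CPU over a one-second window):

```python
import time
import psutil  # pip install psutil

# Prime the per-process CPU counters; the first cpu_percent() call
# always returns 0.0.
procs = list(psutil.process_iter(['name']))
for p in procs:
    try:
        p.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(1.0)  # let usage accumulate over a short window

usage = []
for p in procs:
    try:
        usage.append((p.cpu_percent(interval=None),
                      p.memory_info().rss / 1024**2,
                      p.info['name']))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

usage.sort(key=lambda t: t[0], reverse=True)
for cpu, rss_mb, name in usage[:5]:
    print(f"{name or '?':<30} {cpu:5.1f}% CPU  {rss_mb:8.1f} MiB")
```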
> I've been tracking software quality metrics for 3 years as an engineering manager
What metrics specifically?
Old software also sucked.
I think one big issue is that companies got massive. Apple's revenue, for instance, would make it the 40th largest country in the world by GDP - larger than Portugal, Hungary, Greece, and the overwhelming majority of countries in existence. And it seems to be essentially a rule that as companies reach a certain threshold of size, the quality of what they produce begins to trend downward until they're eventually replaced by an upstart, who then starts their own trek toward that same threshold of failure.
But in modern political and economic times, where number must always go up, too big to fail is a thing and antitrust enforcement isn't (to say nothing of the FTC mostly just ¯\_(ツ)_/¯ with regard to basically any merger/acquisition in big tech), so the current batch of companies just keeps growing and growing instead of being naturally replaced. Add to that the fact that a lot of startup culture now sees being acquired as the endgame, rather than even dreaming of competing against these monstrosities.
I have a hot take on this. Concentration. The ability to unify one's mind around a topic.
This is correlated with "joy" and happiness, or contentment, which brings about patience. It is anti-correlated with pain and stress, which bring about restlessness.
In short, good things take time. You cannot hurry the seasons. The competition is too fierce to worry about leaks.
We need to feel safe before we get creative; barring that, we hurry there.
The 'upper' end of 'society' isn't holding itself to account. Who is left to look up to?
It went down due to speed and cost constraints, then went down farther with AI slop. That's what I've seen in my career. Testing has been an afterthought for a while and AI has resulted in "passing" but mostly worthless tests.
Dude, go back in time and try to use Bluetooth in 2003 and tell me things are worse. Try figuring out how to deploy code to multiple servers and see how smooth that process was...
Not really answering your question, but: one completely (imo) unnecessary category of sloppy software is Electron apps. It's totally ridiculous how few resources are put into alternatives like Tauri given how most desktop apps run on Electron, and we know how bad it is.
Oh dear, starting a conversation about software quality on HN.
Sadly, it won't fare well. You'll get a mix of flags and downvotes, along with "There's no problem! This is Fine!".
I feel that software has become vastly more complex, which increases what I call "trouble nodes." These are places where a branch, an API junction, an abstraction, etc., gives space for bugs.
The vast complexity means that software does a lot more, but it also means that it is chock-full of trouble nodes, and that it needs to be tested a lot more rigorously than in the past.
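A back-of-the-envelope way to see why (a toy model assuming independent branches):

```python
# Each independent branch point doubles the number of execution paths,
# so the surface that needs testing grows exponentially with trouble nodes.
def paths_to_test(branch_points: int) -> int:
    return 2 ** branch_points

for n in (3, 10, 20):
    print(f"{n:2d} branch points -> {paths_to_test(n):>9,} paths")
# 3 -> 8, 10 -> 1,024, 20 -> 1,048,576
```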
Another huge problem is dependence on dependencies. Abstracting trouble nodes does not make them go away. It simply puts them into an area where we can't properly test and fix them.
we keep buying the stuff
It's not. Software quality is better than ever; today's software is far more sophisticated than the simple programs of the past.
Human beings are ephemeral. They're born, they die.
Everything human beings create is ephemeral. That restaurant you love will gradually drop standards and decay. That inspiring startup will take new sources of funding and chase new customers and leave you behind, on its own trajectory of eventual oblivion.
When I frame things this way, I conclude that it's not that "software quality" is collapsing, but the quality of specific programs and companies. Success breeds failure. Apple is almost 50 years old. Seems fair to stipulate that some entropy has entered it. Pressure is increasing for some creative destruction. Whose job is it to figure out what should replace your Apple Calculator or Spotify? I'll put it to you that it's your job, along with everyone else's. If a program doesn't work, go find a better program. Create one. Share what works better. Vote with your attention and your dollars and your actual votes for more accountability for big companies. And expect every team, org, company, country to decay in its own time.
Shameless plug: https://akkartik.name/freewheeling-apps