I think what you want is for software developers not to write bloated code, instead of computers not getting faster. The bloated code is a result of undisciplined programming and not paying attention to users' devices.
If our computers never got faster, we would never get faster computers (obviously...) to run efficient code even faster. 3D rendering and physics simulation come to mind.
I have noticed what you mention over longer timescales (e.g. a decade). But it's mostly "flashy" software - games, trendy things... which sadly also includes many websites: the minimum RAM usage for a mainstream website tab these days seems to be around 200MB.
Anecdata: My 12 year old desktop still runs Ubuntu+latest Firefox fine (granted, it probably wouldn't be happy with Windows, and laptops are generally weaker). Counter-anecdata: A friend's Mac Pro from many years ago can't run latest Safari and many other apps, so is quite useless.
> I think what you want is for software developers not to write bloated code, instead of computers not getting faster. The bloated code is a result of undisciplined programming and not paying attention to users' devices.
I am so fed up with hearing this. I would love to optimise my code, but management will always prioritise features over optimisations, because that is what drives sales. This happens at almost every company I've worked at.
Also, more often than not, I have a huge problem even getting things working while wrangling co-workers who cannot do basic tasks, do not write tests, and in some cases, I've found, don't even run the code before submitting PRs. That code then gets merged because "it looks good", despite obvious problems that I can sometimes spot from literally the other side of the room.
> If our computers never got faster, we would never get faster computers (obviously...) to run efficient code even faster. 3D rendering and physics simulation come to mind.
The solution to that is a few decades old: plug in a 3D graphics card. (Of course there's the whole system bus issue, but that's largely solved by a wider bus rather than a faster CPU and more system memory. 3D programs requiring more CPU/memory is largely software bloat.)
A few decades ago there was a lot of research into system-level parallel processing. The idea was to just add more machines to scale up processing power as needed. But because individual machines kept getting faster, there was less need for it, so the research was mostly abandoned. We would all be using distributed OSes today if it weren't for faster machines.