I ran Motif from a terminal, and used command lines to bring up windows. Windows 95 felt like a toy in comparison, not to mention PC performance was pretty sad when compared to a high-end unix workstation. To each their own I guess.
It's not really fair to compare a bottom-of-the-barrel PC to a high-end unix workstation though. The high-end Windows boxes were running Windows NT 3.51, and later NT 4, and there just weren't many of them. NT 4 wasn't quite there yet, but it had a lot of what was good about the Windows 95 interface on a real, enterprise-grade OS.
It's almost a shame Microsoft clung to DOS compatibility for so long; that probably kept a lot of power users from seeing what Windows could do. But on the other hand, it's probably a good thing, because it kept Unix popular and gave Linux and BSD room to grow.
In terms of plain wattage, it cannot be dangerous. Unless, of course, you were to stand with your eye up against the sensor and maybe stare at it for a few minutes.
In my experience (as limited as it might be), burnout is a very personal thing, usually driven internally by the employee with an out-of-kilter sense of balance between self-commitment and job performance. Common drivers are broken, centralized processes (e.g. stack ranking) rather than individual managers. Staffing doesn't really help; it just raises the bar, because this is a matter of competition.
In the software world, the sheer focus on compensation is not helpful, especially when some of the larger tech firms promote levels of compensation that nearly all "ordinary" developers could never hope to achieve.
I wouldn't blame this on MBAs. The fault lies in the culture of the Board Room. There used to be a time when the board cared about the welfare of employees and the good of society as a whole. I know this is hard to believe in contemporary times.
I struggle to find things in the modern business world that cannot be blamed on the culture of the MBA. What you are talking about — boards not caring about the welfare of employees — is a fundamental result of the culture of the MBA, which has suffused through all business thought in a way that casually depersonalises and humiliates.
I used to work for a small business and I decided I would have to quit one day when my boss said, on the phone to a client, "yes, I've got a resource for that".
This one trips me up. Why are we sensitive about the word "resource"?
Literally nothing about the word "resource" has negative connotations for me. Resources are finite and precious. They are protected and important.
Sometimes they are exploited and undervalued, sure. What isn't? Certainly not humans or employees.
Every project requires resources. Some of them are human. It's just a category.
Would you be less bothered if he said "I've got a human for that"? Or "I've got a worker for that"? "The staff to handle that need is available"?
I don't use the word, and the first time I heard it, I thought it was a little impersonal. But then I thought about it more, and I just don't understand the strength of reaction.
It might help that, in general, my goal is not to be seen as a living human being with real human complexity and needs and desires, at work.
> Would you be less bothered if he said "I've got a human for that"? Or "I've got a worker for that"? "The staff to handle that need is available"?
"I've got a worker" is still somewhat dehumanising. "I have the staff for that" is somewhat less dehumanising.
But, for example, "yes we have someone here that can work on this with you" is so obviously less dehumanising.
I find it surprising that people would ever be confused about this. Perhaps it is because I am British and that sort of language is impolite, rude and arrogant. Or perhaps it is rejection-sensitive dysphoria (a real problem for me) making me sensitive to descriptions of myself and people I care about that reduce us to interchangeable allocatable units.
But again, the basic thing here is: there were four of us. Only one of us was ever going to do that job, because there were four of us and we had four different jobs. So why ever lurch towards the language of interchangeability, in earshot?
Four people in a small business cannot really ever be a "category". And you should never use a word for a person that can also be used for a photocopier or a dictionary. A person can be resourceful; they are never a resource.
How about something like "Yes, we have the resources to handle that project"?
> Four people in a small business cannot really ever be a "category".
Sure they can -- they are all employees, for example.
I agree that "resource" is an impersonal word when used for "staff" (largely because it can apply to non-human things). I just don't feel the need to be considered more than a resource at work.
I bring special skills and knowledge, I have no concern that I am an interchangeable cog in the wheel of industry -- and yet at the same time, I have no illusions that I cannot be replaced (on some possibly-inconvenient timescale for business operations, although certainly that has varied over time in my employment history).
Actually that raises an interesting question, I think. When I was in high school, I worked a few summer temp jobs as unskilled labor. If anyone had called me a "resource" then, it would have felt patronizingly euphemistic to the point of absurdity. I was just a body. So in that case "resource" would be a silly upgrade.
So I guess it comes down to context. I can see where a four-person company, especially if you've been there a while, has a much higher expectation of personal relationships.
You mentioned that your boss was on the phone. The other party to the conversation might have been further removed (org chart-wise) from their staff. They might think only in resource allocation and not know any names or capacities at the productive level in their own org, never mind yours. Since they are a client, your boss may have mirrored their language, even though he was speaking about a full human, and within earshot of that human.
I don't know, maybe your boss was just a jerk in general, and this word was enough to make you feel like it was a summary of how he thought about you.
But maybe it was just a word. Neither incorrect, nor intentionally offensive.
Obviously, words can be triggers. I'm in the camp that believes they should not be, for all sorts of logical reasons, but I'm not an absolutist. Some words are intended to be triggering, for example, and although I think it's a mistake to give them that power, I understand it's not that simple and that I speak from a position of privilege.
However, I don't think that "resource" has reached the point of social awareness that it is actually offensive to some people. I think that most people who use the word intend no offense, and are not thinking in a way that, if fully explained, would be offensive.
It's simple dehumanization. It's not outlandish or anything, it's just really easy to notice. And the sophistry to try to make them equivalent terms is also easy to notice.
For a business to need resources it means a category of stuff that can include people, tools, raw materials, etc... Using the name of a category to mean one thing inside it instead of explicitly naming that one thing is concealment. Just like how I might say "fertilizer" instead of "cow shit."
The better question is why we started concealing it. Why are we so sensitive about the words person, employee, or personnel?
Because starting from the 1980s, corporate organisation was focused on managing resources, of which humans were a part that had to be dehumanized to fit with the rest of the theory. There was a brief phase where it was called HCM - human capital management - but that never caught on widely; so HRM it is, with a focus on managing as opposed to organising and supporting. https://www.linkedin.com/pulse/evolution-hr-terminology-why-...
> when my boss said, on the phone to a client, "yes, I've got a resource for that".
Hahaha, I got hit with that, too, also working for a small company. Luckily it was the client who called me "a resource", not someone from my company, but good lord what a way that is to talk about human beings.
It's generally true, isn't it? Otherwise we'd have groundbreaking discoveries every day about some new, faster way to do X.
The way I see it, mathematicians have been trying (and somewhat succeeding, every ~5 years) to prove faster ways to do matrix multiplication since the 1970s. But this is only in theory.
If you want to implement the theory, you suddenly have many variables to take care of, such as memory speed, CPU instructions, bit precision, etc. So in practice, an actual implementation of some theory likely has more room to improve. It is also likely that LLMs can help figure out how to write a more optimal implementation.
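To make that concrete, here's a rough sketch (purely illustrative; it assumes numpy is available and the matrix size is arbitrary). Both versions do exactly the same O(n^3) arithmetic, and the only difference is how well the implementation uses the cache and SIMD units:

    # Same O(n^3) algorithm, very different implementations.
    # Assumes numpy is installed; n = 256 is an arbitrary illustrative size.
    import time
    import numpy as np

    n = 256
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    def naive_matmul(a, b):
        # Textbook triple loop: correct, but cache- and SIMD-unfriendly.
        n = a.shape[0]
        c = np.zeros((n, n))
        for i in range(n):
            for k in range(n):
                aik = a[i, k]
                for j in range(n):
                    c[i, j] += aik * b[k, j]
        return c

    t0 = time.perf_counter()
    c1 = naive_matmul(a, b)
    t1 = time.perf_counter()
    c2 = a @ b  # same math, but a tuned BLAS kernel underneath
    t2 = time.perf_counter()

    print(f"naive loops: {t1 - t0:.3f}s   BLAS: {t2 - t1:.6f}s")
    print("results match:", np.allclose(c1, c2))

On a typical machine the BLAS call wins by several orders of magnitude without any change to the asymptotic complexity, and that implementation gap is exactly the room an optimizer (human or LLM) has to play with.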
I thought the same until I calculated that newer hardware consumes a few times less energy and for something running 24x7 that adds up quite a bit (I live in Europe, energy is quite expensive).
So my homelab equipment is just 5 years old and it will get replaced in 2-3 years with something even more power efficient.
Asking because I just did a quick comparison and it seems to depend. For comparison, I have a really old AMD Athlon "e" processor (it came out in September 2009 according to a quick Google search, though I probably bought it a few months later than that) that runs at ~45W TDP. In idle conditions, it typically consumes around 10 to 15 watts (internet wisdom, not Kill A Watt wisdom).
Some napkin math says it would take about 40 years of amortization to justify replacing this system at my current power rates. And even with some EU countries' power rates we seem to be at 5-10 years of amortization for a replacement. I've been running this motherboard, CPU + RAM combo for ~15 years now, replacing only the hard drives every ~3 years. And the tower it's in is about 25 years old.
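If anyone wants to redo that kind of napkin math with their own numbers, the shape of it is roughly this (every figure below is made up for illustration, not my actual rates or prices):

    # Illustrative payback calculation; all numbers here are assumptions,
    # not measured values or real prices.
    hours_per_year = 24 * 365

    old_idle_w = 12         # assumed idle draw of the old box, in watts
    new_idle_w = 5          # assumed idle draw of a newer low-power box
    price_per_kwh = 0.30    # assumed electricity rate, EUR/kWh
    replacement_cost = 250  # assumed cost of the replacement hardware, EUR

    saved_kwh = (old_idle_w - new_idle_w) / 1000 * hours_per_year
    saved_per_year = saved_kwh * price_per_kwh
    print(f"energy saved: {saved_kwh:.0f} kWh/year, about {saved_per_year:.0f} EUR/year")
    print(f"payback time: {replacement_cost / saved_per_year:.1f} years")

With those made-up numbers you land around 13-14 years; a bigger wattage delta or pricier electricity pushes it into the 5-10 year range, and cheaper power or a smaller delta pushes it out toward the multi-decade figures I'm seeing.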
Oh I forgot, I think I had to buy two new CR2032 batteries during those years (CMOS battery).
Now granted, this processor can basically do "nothing" in comparison to a current system I might buy. But I also don't need more for what it does.
That is definitely true and why I compared idle watts. That Athlon uses the same idle watts as modern mobile CPUs, so there's no reason to replace it for the mostly idle times. Spot on. I can't have this system off during idle time, as it wouldn't come up fast enough to fulfill its purpose when needed, and it would be a pain to trigger that anyway (I mean, really, port knocking to start up the system type of thing). Otherwise I would. I do do that with the HTPC, which has a more modern Intel Core i3.
The "nothing" here was meant more for the times when it does have to do something. But even then, at 45W TDP, as long as it's able to do what it needs to, the newer CPUs have no real edge. What they gain in multi-core performance doesn't matter when single-core performance is essentially equivalent for what that machine does: HTPC file serving, email server, etc.
Spinning rust and fans are the outliers when it comes to longevity in compute hardware. I’ve had to replace a disk or two in my rack at home, but at the end of the day the CPUs, RAM, NICs, etc. all continue to tick along just fine.
When it comes to enterprise deployments, the lifecycle always revolves around price/performance. Why pay for old gear that sucks up power and runs 30% slower than the new hotness, after all!
But, here we are, hitting limits of transistor density. There’s a reason I still can’t get 13th or 14th gen poweredge boxes for the price I paid for my 12th gen ones years ago.
There’s no marginal tax impact of discarding it or not after 5 years - if it was still net useful to keep it powered, they would keep it. Depreciation doesn’t demand you dispose of or sell the item to see the tax benefit.
No, but it tips the scales. If the new hardware is a little more efficient, but perhaps not so much so that you would necessarily replace it, the ability to depreciate the new stuff, but not the old stuff, might tip your decision.