(2) At the office, it's harder to pretend that work took longer than it actually took. If you're done with the design doc in 2h, with colleagues sitting all around you, what can you do to pretend that it took you longer? Mess around with Confluence, pretending to work? You might as well move on to the next design doc, or find something useful to do meanwhile.
At home, you can just pick up the Xbox controller and say it took you 4 hours.
I like the comfort of WFH, and there are plenty of people (myself included) who are responsible enough to handle it, but the problem is that the average person isn't. And we all pay for it.
That's a very confident statement to make. My company is very much happy with remote working. 50% of my team aren't even in the country. Most other teams are the same. I don't feel like I'm paying any price for that.
Happy to hear that it's working out well for you, but there is always a cost. The less common successful remote work becomes across the industry, the more companies like yours stand out. And remote work goes back to being a premium, highly competitive perk, like it used to be in the past.
And I agree that remote work is perfectly possible with a good culture and responsible employees, but most companies don't really have the culture or hiring practices that optimize for that.
> The less common successful remote work becomes across the industry
You're presupposing your conclusion. I don't believe remote work is less successful. I think this is driven by factors such as:
* Managers who have made their career by visibility, struggling to adapt to a new normal
* A way to get rid of staff without paying severance
* Pressure from certain types of politician who have made up their mind (regardless of any evidence) that remote working is harmful
And others
It really is a shame that for once it felt like labour had something going their way, and somehow the leaders of capital have convinced us to fight each other on this, while they take it away again.
I agree with your sentiment; this incremental evolution is getting difficult to feel when working with code, especially with large enterprise codebases. I would say that for the vast majority of tasks there is a much bigger gap in tooling than in foundational model capability.
Also came to say the same thing. When Gemini 3 came out several people asked me "Is it better than Opus 4.1?" but I could no longer answer it. It's too hard to evaluate consistently across a range of tasks.
How do you break the window glass, how do you get into the display cases, how and when do you do it so that you're not stopped by the security, how do you carry the items away, how do you get away from the crime scene without being caught... And I'm sure I am forgetting other details. So I can see how it would take some work to make everything look so simple.
I recommend using GitHub's registry, ghcr.io, with GitHub Actions.
I invested just 20 minutes to set up a .yaml workflow that builds and pushes an image to my private registry on ghcr.io, and 5 minutes to allow my server to pull images from it.
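For anyone curious, a minimal sketch of such a workflow might look like the following. This is an assumption about the setup, not the commenter's actual file; the trigger, tag, and action versions are placeholders to adjust.

```yaml
# .github/workflows/docker.yml — minimal sketch; image name and triggers are placeholders
name: build-and-push
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write   # lets the built-in GITHUB_TOKEN push to ghcr.io
    steps:
      - uses: actions/checkout@v4
      - name: Log in to ghcr.io
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: ghcr.io/${{ github.repository }}:latest
```

The server side then only needs a `docker login ghcr.io` with a token that has `read:packages` scope before it can pull the image.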
Luckily there’s an easy fix, enabling JavaScript like a regular human being :p Honestly curious though, do you generally browse the web without JavaScript?
Yes, I do. The day uMatrix finally stops functioning will be an awful day. Excluding cute PoC sites and demos, sites that function without JavaScript are objectively superior in every way to sites that can't function without JS. Though I admit that's correlative more than causative.
Also, the point isn't to be a regular human being, it's to be a hacker, or engineer, or, [other]. Why be boring (regular) when you can be good at something instead?
Interesting, and completely agree with both your remarks, but how does it relate to disabling JavaScript? What are you getting out of it, other than making the web less usable for yourself?
You mean other than the reduced risk of compromise from the latest js engine exploit? Or that it also prevents some xss injection? Or that often many sites will still function with most external scripts disabled, i.e. it disables spyware that many sites don't need to install, but still do?
Besides all of that, it makes the web more usable in most cases. Not more functional, more usable. I don't want your site to hijack my browser scroll, nor do I want your modal popups to interrupt me. Plus, I like knowing the level of competency of the site's developers. If it doesn't function at all without enabling a half dozen external scripts/sites, then even if I still want to use your site (which is then unlikely), I know to lower my expectations about how much I can trust you or your site.
How about loading the table row you just added instead of reloading the entire page, table and all?
I'm against bloated apps and ads, and in favor of using forms and HTML's strengths, but JS is the tool and not the cause of poor design. A web developer that doesn't use it is offloading their identity-based allergy onto the user's bandwidth.
> How about loading the table row you just added instead of reloading the entire page, table and all?
Show me a single site that uses js in this way. But, and here's the trick... it has to *actually* use less network bandwidth. My issue isn't with js as a web development tool. My issue is with js when it's the wrong tool.
> A web developer that doesn't use it is offloading their identity based allergy onto the users bandwidth.
That's an interesting hypothetical, but I'd be willing to bet more user bandwidth is consumed by needless JS scripts than by all the engineers that you'd call "allergic" to JS.
> Plus you can make modals without JS
Show me one that triggers when I scroll too far or move my mouse outside the window, without using JS.
All of your claims lack evidence... there are plenty of things that sound great in theory, but have been toxic and user-hostile every time someone has actually tried to apply said theory.
> How about loading the table row you just added instead of reloading the entire page, table and all?
Lots of sites use JS that way. This was the default way of doing it with jQuery. Yeah, an object with a couple of fields is going to use less bandwidth than loading the whole page, in the case of a row.
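The pattern being argued about can be sketched in a few lines of vanilla JS. This is a hypothetical example, not any particular site's code; the `/rows` endpoint and the `{id, name}` shape are assumptions, and HTML escaping is omitted for brevity.

```javascript
// Sketch: append one row from a small JSON response instead of reloading the page.
// buildRow is a pure helper; addRow assumes a hypothetical /rows endpoint.
function buildRow(item) {
  // Turn {id, name} into a <tr> string (real code should escape the values).
  return `<tr><td>${item.id}</td><td>${item.name}</td></tr>`;
}

async function addRow(table, payload) {
  const res = await fetch("/rows", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  const item = await res.json(); // a tiny object, not a full HTML page
  table.querySelector("tbody").insertAdjacentHTML("beforeend", buildRow(item));
}
```

The round trip here carries one small JSON object instead of the whole document, which is exactly the bandwidth argument: used this way, JS transfers strictly less than a full-page reload.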
I agree that you can absolutely find people overusing JS, but your original position was that the web is better off without JS. In fact, you implied developers using it weren't competent, but now you've changed it to "actually it's okay if it's used right," which is what I said.
> All of your claims lack evidence.
Nothing I said lacks evidence; in fact, they're pretty basic things you can verify. It seems like you want me to prove the entire Internet is using JS how you like. I'm sorry, bad sites are always gonna suck, with or without JS.
I'm asking for a single example. I can find a few; they're the sites that complain about the JS ecosystem. But my point was never that JS the theoretical feature is bad. My point was, and is: JS the thing that exists is bad, because in the overwhelming majority of cases I've only seen JS misused.
You gave an example that I agree with, that would be a useful use of JS. Does that exist, or is that just a theoretical example? Theoretically, a lot of stuff is good. Theoretically, amphetamines could be over the counter... everyone just needs to use them responsibly!
I also browse with disabled JS by default, enabling it on selected sites for selected JS sources. It has several advantages:
1) Web is much faster.
2) Often JS causes continuous CPU load, spinning the CPU fan up to noisy levels.
3) Sometimes JS is used for animations; I hate animations on web pages.
4) Sometimes JS is used to auto-play videos (although recent Firefox with the proper settings can block that in most cases, even with JS enabled); I hate auto-play videos. That was my primary reason for disabling JS in the first place.
5) Often cookie and other pop-ups are implemented with JS and do not show when JS is disabled (while the web still works).
Ads. Tracking scripts. Interface hijacks. Dynamics that change the page as I read it. A myriad of other poorly executed ideas that someone who considers themselves "very clever" thought were good at the time but only make my experience worse.
For the same reason you shouldn't rent, buy, or use a backhoe when the job is a shovelful of dirt. For the same reason you shouldn't stand up multiple 8U rack-mount servers to run your Home Assistant instance that would be happy on a Raspberry Pi Zero.
I would love it if web devs would stop externalizing the costs of their bloated apps onto their users. Make web devs care about resource utilization. How much bandwidth, electricity, and time is wasted on poorly written applications?
More like you shouldn't use a backhoe, charge passing people for its fuel, and park your backhoe in the middle of the walkway. It brings negative value to the end user. If you don't care for your end user, fine, but then you can't expect them to care for you or your product either.