
> But "at work" you generally have these decisions made for you.

The idea that most employers make terrible decisions now and made amazing decisions back in the day is plainly false. The author vividly recollects working at a decent Java shop. Even there I strongly doubt everything was as amazing as they describe, but it does sound decent. But plenty of businesses at the time used C++ for no better reason than inertia, usually in Windows-only, Visual C++ 6-specific dialects. Their "build server" ran overnight, you'd check in your code in the late afternoon and get your compile errors back in the morning. The "source control" worked with company-wide file locks, and you'd get your ass phoned back to the office if you forgot to check in a file before leaving. Meanwhile, half the web was written in epic spaghetti plates of Perl. PHP was a joy to deploy, as it is now, but it was also a pain to debug.

If you care deeply about this stuff, find an employer who cares too. They existed back then and they exist now.



> Their "build server" ran overnight, you'd check in your code in the late afternoon and get your compile errors back in the morning.

This. Let's keep things in perspective when people complain about long Rust compile cycles. And even that's a whole lot better than filling in paper-based FORTRAN or COBOL coding forms to be punched into cards in the computing room and getting back line-printed program output (or a compiler error) the next week.


Rust's notorious compile times stick out like a sore thumb partly because other systems languages can run laps around it before your Rust build is done. And also because everyone and their grandma swears Rust is blazing fast.

Until you have to compile the program without a prior build cache, or kick off a build in a CI pipeline.


This is like economic growth: first bad, then on an upward trajectory, and now in free fall.

You're describing exactly that: things in programming were bad, then everything was on the upside, UNTIL it started coming down.

That isn't a refutation of the problem. It's picking a moment when both were bad and, like all discussions about tech, MASSIVELY ignoring that a MASSIVE internet with MASSIVE money and MASSIVE backing is worse than it was before.

It's like complaining that pistols in the Wild West killed people just as nuclear weapons do, while ignoring the massive difference in scale and blast damage.


Amen.

We love the idea that we don't have any agency in this field and that we're constantly being pushed around by the mean baddies at the top.


Not everyone has the privilege of living in a part of the world where there's a wealth of job opportunities to choose from.


Did people really only compile once a night in the days of Visual Studio 6? There were Pentium IIs and IIIs back then.


In large C++ codebases of mediocre quality (the example I'm referring to is a manufacturer of large complex machines), yes.

People would compile their local unit locally, of course (a "unit" being a bunch of files grouped together in some hopefully-logical way). But they wouldn't be 100% sure it compiled correctly when integrated with the larger codebase until the nightly build ran. So if you didn't change the .h files you were pretty sure to be in the clear, but if you did, you had to be careful, and in the worst-case scenario you'd do a 1-day-per-step edit-compile-test loop for a week or so. I'm not entirely sure how they managed to keep these compile failures from hurting other teams, but they didn't always (I think they had some sort of layered build server setup, not too dissimilar from how GH Actions can do nightlies of a "what if this PR were merged with main now").

Visual Studio 6 itself was pretty OK, actually. The UI was very limited (but therefore also fast enough), and compiling smallish projects went fine. In fact it was known to be a pretty fast compiler; I didn't mean to suggest that VC++6 implied overnight builds. They just coincided. Better-structured big-ish C++ projects (pimpl pattern, anyone?) could probably recompile pretty quickly on the computers of the day.
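
For anyone who hasn't run into it, here's a minimal sketch of the pimpl idiom, with made-up class and member names. The point is that the header other units include stays stable, so implementation edits rebuild only one .cpp file instead of everything that includes the header:

    // widget.h -- the header everyone else includes (names are illustrative).
    // It exposes no data members, only an opaque pointer to the implementation,
    // so changes to the implementation don't force includers to recompile.
    #include <memory>

    class Widget {
    public:
        Widget();
        ~Widget();                    // defined in widget.cpp, where Impl is complete
        void frobnicate();
    private:
        struct Impl;                  // forward-declared only
        std::unique_ptr<Impl> impl_;
    };

    // widget.cpp -- the private details; editing this rebuilds one translation unit.
    struct Widget::Impl {
        int counter = 0;
    };

    Widget::Widget() : impl_(std::make_unique<Impl>()) {}
    Widget::~Widget() = default;
    void Widget::frobnicate() { ++impl_->counter; }

The flip side, of course, is that any change that does touch a widely-included header can still ripple through the whole codebase, which is exactly the kind of breakage those nightly builds were there to catch.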


It was definitely on the order of hours for large codebases. The Microsoft Excel team passed out punishment “suckers” to those who broke the build, since breaking it meant 100+ people didn't have a new working build to look at and test.

Linux kernel compiles in the 1990s were measured in hours, and that codebase was tiny compared to many. So, yep, builds were slow, slow enough to have an entire xkcd comic written about them.


Entire builds being slow isn't the main point, though; it's iteration time from changes that matters. I have a hard time believing people were working with a single compile a day, rebuilding an entire huge program on every iteration. That's the whole point of compilation units.
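
To spell out what compilation units buy you, here's a toy layout (file names hypothetical). Editing only parser.cpp means recompiling just that one object file before relinking; touching the shared header forces every includer to rebuild, which is the case the nightly builds above were really guarding against:

    // parser.h -- shared interface; changing this forces every includer to rebuild
    int parse_line(const char* line);

    // parser.cpp -- one translation unit; an edit here recompiles only parser.o
    #include "parser.h"
    int parse_line(const char* line) { return (line != nullptr && line[0] != '#') ? 1 : 0; }

    // main.cpp -- a separate translation unit; its object file is reused
    // untouched as long as parser.h stays the same
    #include <cstdio>
    #include "parser.h"
    int main() { std::printf("%d\n", parse_line("x = 1")); }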


The "terrible decisions" of yore hold no comparison to today's "terrible decisions". It's not the same ballpark, it's not the same sport.



