Hacker News

While I do appreciate you taking the time to write that, I am somewhat at a loss. How does this justify the antipathy towards notions of a first-party build system and package manager? That's how we got into this argument with each other: I was calling out C/C++ cultists who cling to the ugly patchwork of hacky tooling that is C/C++'s so-called build systems and decry any notion of a first-party build system (or even a package manager to boot) as being destined to become just like npm.

C/C++ developers clearly want a build system and package manager, hence all this fragmentation, but I can't for the life of me understand why that fragmentation is preferable. For all the concern about supply-chain attacks on npm, why is it preferable that people trust random third-party package managers and their random third-party repackages of libraries (eg: SQLite on conan and vcpkg)? And why is global installation preferable? Have we learnt nothing? There's a reason why Python has venv now, why Maven and Gradle have wrappers, etc. Projects being able to build themselves to a specification, without requiring the host machine to reconfigure itself to suit the needs of this one project, is a bonus, not a drawback. Devcontainers should not need to be a thing.

If anything, this just reads like Sunk Cost Fallacy: that "it just works" therefore we needn't be too critical, and anyone who is or who calls for change just needs to git gud. It reminds me of the never-ending war over memory safety: use third-party tools if you must but otherwise just git gud. It's this kind of mindset that has people believing that C/C++'s so-called build systems are just adhering to "there should be some artificial friction when using dependencies to discourage over-use of dependencies", instead of being a Jenga tower of random tools with nothing but gravity holding it all together.

If it were up to me, C/C++ would get a more fleshed-out version of Zig's build system and package manager, ie, something unified, simple, with no central repository, project-local, exact, and explicit. You want SQLite? Just refer to the SQLite git repository at a specific commit and the build system will sort it out for you. Granted, it doesn't have an official build.zig so you'll need to write your own, or trust a premade one... but that would also be true if you installed SQLite through conan or vcpkg.
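For the curious, the mechanism described above looks roughly like this in a Zig `build.zig.zon` manifest (a sketch only: exact fields vary between Zig versions, and the URL and hash below are placeholders rather than a real SQLite package):

```zig
// build.zig.zon -- illustrative sketch, not a real manifest.
.{
    .name = "myapp",
    .version = "0.1.0",
    .dependencies = .{
        .sqlite = .{
            // Pin a git repository at an exact commit (placeholder URL).
            .url = "git+https://github.com/some-org/sqlite-zig#<commit-sha>",
            // Content hash, normally filled in by `zig fetch --save`.
            .hash = "<package-hash>",
        },
    },
    .paths = .{""},
}
```

The build system then fetches, verifies, and builds the dependency locally to the project, with no global installation and no central registry involved.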



> How does this justify the antipathy towards notions of a first-party build system and package manager?

I don't feel particularly antipathetic towards the notion of a first-party build system and package manager. I find it undeniably better to have a first-party build system instead of the fragmentation that exists in C/C++. On the other hand, I don't feel like asking a 20-year-old project to leave autotools just because I asked for it. Or forcing people to install Python because I think Meson is cool.

As for the package manager, one issue is security: is it (even partly) curated or not? I could imagine npm offering a curated repo and a non-curated repo. But there is also a cultural thing there: it is considered normal to have zero control over the dependencies (by this I mean that if the developer has not heard of the dependencies they are pulling, then they're not under control). Admittedly it is not a tooling problem, it's a culture problem. Though the tooling allows this culture to be the norm.

When I add a C/C++ dependency to my project, I do my shopping: I go check the projects, I check how mature they are, I look into the codebase, I check who has control over it. Sometimes I will depend on the project, sometimes I will choose to fork it in order to have more control. And of course, if I can get it from the curated list offered by my distro, that's even better.

> C/C++ developers clearly want a build system and package manager, hence all this fragmentation

One thing is legacy: it did not exist before, many tools were created, and now they exist. The fact that the ecosystem had the flexibility to test different things (which surely influenced the modern languages) is great. In a way, having a first-party tool makes it harder to get that. And then there are examples like Swift, which slowly converged towards SwiftPM. But at the time CocoaPods and Carthage were invented, SwiftPM was not a thing.

Also devs want a build system and package manager, but they don't necessarily all want the same one :-). I don't use third-party package managers for instance, instead I build my dependencies manually. Which I find gives me more control, also for cross-compiling. Sometimes I have specific requirements, e.g. when building a Linux distribution (think e.g. Yocto or buildroot). And I don't usually want to depend on Python just for the sake of it, and Conan is a Python tool.

> why is it preferable that people trust random third-party package managers and their random third-party repackages of libraries (eg: SQLite on conan and vcpkg)?

It's not. Trusting a third-party package manager is actually exactly the same as trusting npm. It's more convenient, but less secure. However it's better when you can rely on a curated repository (like what Linux distributions generally provide). Not everything can be curated, but there is a core. Think OpenSSL for instance.

> And why is global installation preferable?

For those dependencies that can be curated, there is a question of security. If all the programs on your system link the same system OpenSSL, then it's super easy to update this OpenSSL when there is a security issue. And in situations where what you ship is a Linux system, then there is no point in not doing it. So there are situations where it is preferable. If everything is statically linked and you have a critical fix for a common library, you need to rebuild everything.

> If it were up to me

Sure, if we were to rebuild everything from scratch... well we wouldn't do it in C/C++ in the first place, I'm pretty sure. But my Linux distribution exists, has a lot of merits, and I don't find it very nice when people try to enforce their preferences. I am fine if people want to use Flatpak, cargo, pip, nix, their system package manager, something else, or a mix of all that. But I like being able to install packages on my Gentoo system the way I like, potentially modifying them with a user patch. I like being able to choose whether I link statically or dynamically (on my Linux, I like to link at least some libraries like OpenSSL dynamically; if I build an Android apk, I like to statically link the dependencies).

And I feel like I am not forcing anyone into doing what I like to do. I actually think that most people should not use Gentoo. I don't prevent anyone from using Flatpak or pulling half the Internet with docker containers for everything. But if they come telling me that my way is crap, I will defend it :-).

> I am somewhat at a loss.

I guess I was not trying to say "C/C++ is great, there is nothing to change". I just think it's not all crap, and I see where it all comes from and why we can't just throw everything away. There are many things to criticise, but many times I feel like criticisms are uninformed and just relying on the fact that everybody does that. Everybody spits on CMake, so it's easy to do it as well. But more often than not, if I start talking to someone who said that they cannot imagine how someone could design something as bad as CMake, they themselves write terrible CMakeLists. Those who can actually use CMake are generally a lot more nuanced.


Even though I understand why you prefer that, I feel like you're painting too rosy of an image. To quote Tom Delalande: "There are some projects where if it was 10% harder to write the code, the project would fail." I believe this deeply and that this is also true for the build system: your build config should not be rivalling your source code in terms of length. That's hyperbole in most cases, sure, and may well indicate badly written build configs, but writing build configs should not be a skill issue. I am willing to bet that Rust has risen so much in popularity not just because of its memory safety, but also because of its build system. I don't like CMake, but I also don't envy its position.


> but writing build configs should not be a skill issue

I think it shouldn't be a skill issue because a true professional should learn how to do it :-).

My build configs are systematically shorter than the bad ones.

Also I feel like many people really try to have CMake do everything, and as soon as you add custom functions in CMake, IMO you're doing it wrong. I have seen this pattern many times where people wrap CMake behind a Makefile, presumably because they hate having to run two commands (configure/build) instead of one (make). And then instead of having to deal with a terrible CMakeLists, they have to deal with a terrible CMakeLists and a terrible Makefile.

It's okay for the build instructions to say: "first you build the dependencies (or use a package manager for that), second you run this command to generate the protobuf files, and third you build the project". IMO if a developer cannot run 3 commands instead of one, they have to reflect on their own skills instead of blaming the tools :-).
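To illustrate the "just run two or three commands" workflow: a CMakeLists that leans on a system package can stay short. The project and file names here are illustrative; `find_package(SQLite3)` and the `SQLite::SQLite3` target ship with CMake 3.14+.

```cmake
cmake_minimum_required(VERSION 3.16)
project(myapp C)

# Rely on the system-installed SQLite rather than vendoring it.
find_package(SQLite3 REQUIRED)

add_executable(myapp src/main.c)
target_link_libraries(myapp PRIVATE SQLite::SQLite3)
```

The whole dance is then `cmake -B build` to configure followed by `cmake --build build` to build — the two commands in question, with no wrapper Makefile needed.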


> I think it shouldn't be a skill issue because a true professional should learn how to do it :-)

Therein lies the issue, in my opinion: I do not believe that someone should have to be a "true professional" to be able to use a language or its tooling. This is just "git gud" mentality, which as we all [should] know [by now] cannot be relied upon. It's like that "So you're telling me I have to get experience before I get experience?" meme about entry-level jobs: if you need to "git gud" before you can use C/C++ and its tooling properly, all that means is that they'll be writing appalling code and build configs in the meantime. That's bad. Take something like AzerothCore: I'd wager that most of its mods were made by enthusiasts and amateurs. I think that's fine, or at least should be, but I'm keenly aware that C/C++ and its tooling do not cater to, nor even really accommodate, amateurs (jokey eg: https://www.youtube.com/watch?v=oTEiQx88B2U). That's bad. Obviously, this is heading into the realm of "what software are you trusting unwisely", but with languages like Rust, the trust issue doesn't often include incompetence, more so just malice: I do not tend to fear that some Rust program has RCE-causing memory issues because someone strlen'd something they shouldn't.


> It's like that "So you're telling me I have to get experience before I get experience?"

Not at all. I'm not saying that one should be an architect on day one. I'm saying that one should learn the basics on day one.

Learning how to install a package on a system and understanding that it means that a few files were copied in a few folders is basic. Anyone who cannot understand that does not deserve to be called a "software engineer". It has nothing to do with experience.


> I'm saying that one should learn the basics on day one.

Except that C/C++ have entirely incongruous sets of basics compared to modern languages, which people coming to C/C++ for the first time are likely to have a passing familiarity with (unless it's their first language, of course). Yes, cmake configs can be pretty concise when only dealing with system packages, but this assumes that developers will want to do that, rather than replicate the project-localness ideal, which complicates cmake configs. We're approaching this from entirely different places, which reminds me of the diametrically-opposed comments on this post (https://news.ycombinator.com/item?id=45328247) about READMEs.
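For what it's worth, the project-local flavour in CMake typically goes through FetchContent, which does make the config noticeably longer than the system-package version. The repository URL, tag, and target names below are placeholders, and this only works when the dependency itself builds with CMake:

```cmake
include(FetchContent)

# Placeholder coordinates; a real dependency would pin an exact tag or commit.
FetchContent_Declare(
  somelib
  GIT_REPOSITORY https://example.com/somelib.git
  GIT_TAG        v1.2.3
)
FetchContent_MakeAvailable(somelib)

target_link_libraries(myapp PRIVATE somelib)
```

The dependency is then fetched and built inside the project's own build tree, with nothing installed globally on the host.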



