testdelacc1's comments

Worth mentioning that this is the documentation for 3.15 alpha 3. I feel like we’re better off waiting for a release candidate or the final version before posting this page, in case there are any changes. Most people reading this are going to assume it’s final.

> You and Greg are sorely mistaken.

Greg Kroah-Hartman has been a Linux kernel developer for 25 years, responsible for large parts of the kernel.

You’ve been a hacker news commenter for 1 day.

Could you pipe down with these wild claims that you know better than him?

Also, please don’t complain about downvotes. It’s tedious to read.


This is a valid appeal to genuine expertise, not authority.

Saying "appeal to authority" doesn't refute the point made above. Expertise is real. Someone with 25 years of experience with the Linux kernel will know a lot more about Linux and C than the average HN commenter. Almost certainly more than me.

It's possible that you might be right about whatever point you're trying to make. But if you are, I can't tell that from your comments. I can't even find a clear claim in your comments, let alone any substantive argument in support of that claim.

I'm unmoved and unimpressed.


No one is doing any kind of serious computing on 30 year old CPUs. But the point of the hobby isn’t turning on the computer and doing nothing with it. The hobby is putting together all the pieces you need to turn it on, turning it on and then doing nothing with it.

There’s an asymmetry between what the retro computing enthusiasts are asking for and the amount of effort they’re willing to put in. This niche hobby benefits from the free labour of open source maintainers who keep support for old architectures alive. When the maintainers propose dropping support because of the cost of maintenance, the hobbyists rarely step up. Instead they make it seem like the maintainers are the bad guys doing a reprehensible thing.

You propose they get their hands dirty and cherry-pick changes from newer kernels. But they don’t want to put in effort like that. And they might just feel happier knowing they’re using the “real” latest kernel.


Greg Kroah-Hartman has been in charge of CVEs in the Linux kernel for a decade.

Your account is 1 day old.

I’m in a real dilemma here about whose word to take on the seriousness of this CVE.


You can say this but the people who need to hear it won’t listen. They lack the perspective that comes from actually reading history and understanding the scale at which people were killed before the world police era.

But even reading history isn’t enough. I think we’re fundamentally not equipped to understand what a large number of deaths actually looks and feels like. 10 deaths happening in our vicinity is an unbearable tragedy. 1 million deaths is just a number. So folks are struck by a nostalgia for a time when humans killed each other by the millions.

In some ways they remind me of the people who long for the days before vaccines eliminated a bunch of diseases.


Really weird bringing religion into this discussion. Don’t be weird.

It’s uncharitable to assume they’re lying. And if they aren’t, it’s perfectly ethical and legal to reimplement an existing program.

The runtime overhead is 2-4x.

Isn't this putting it in Java territory?

Yeah, pretty much. If someone is ok with a 3-4x slower program with higher memory consumption, that’s great if it saves them development time. But I can’t see someone starting a new project in Fil-C when more performant and ergonomic options exist: Java, C#, Go, Swift, Rust. Even JavaScript.

This isn’t a recent decision, as the title implies. The rewrite started in 2020, and they released Arti 1.0 in 2022. Check out the release post (https://blog.torproject.org/arti_100_released/) where they explain their rationale for the rewrite. They were unhappy with the state of the C codebase and couldn’t see a way to slowly refactor it. Their experience with Rust was positive for all the commonly cited reasons: if it compiles, it works; a good ecosystem leading to development velocity; better portability across operating systems; and attracting more contributors. They did say they weren’t happy at the time with binary sizes.

The change log in the arti repo (https://gitlab.torproject.org/tpo/core/arti/-/blob/main/CHAN...) shows a lot of recent development too: versions 1.6, 1.7 and 1.8 were released in the last 3 months, and they talk about setting the foundations for larger features to come. All in all it seems like the decision worked out for the team.


Arti is also designed to be embedded as a library in other apps, so messaging clients (for example) will be able to leverage the network without needing a correctly configured Tor daemon on the host.

The extra safety in the code base is nice, but this seems like a bigger deal.
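To make that concrete, here's roughly what embedding looks like with the arti-client crate. This is a sketch based on the crate's documented example, so treat the exact API names and the tokio/futures glue (plus the anyhow dependency) as assumptions that may differ between versions:

    use arti_client::{TorClient, TorClientConfig};
    use futures::io::{AsyncReadExt, AsyncWriteExt};

    #[tokio::main]
    async fn main() -> anyhow::Result<()> {
        // Bootstrap a Tor client in-process: no system tor daemon needed.
        let config = TorClientConfig::default();
        let client = TorClient::create_bootstrapped(config).await?;

        // Open an anonymized stream through the Tor network.
        let mut stream = client.connect(("example.com", 80)).await?;
        stream
            .write_all(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
            .await?;
        stream.flush().await?;

        let mut response = Vec::new();
        stream.read_to_end(&mut response).await?;
        println!("received {} bytes over Tor", response.len());
        Ok(())
    }

That's the kind of thing a messaging client could do internally, instead of relying on a separately installed and configured daemon on the host.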


Yes, this is a complete exaggeration of a headline and should be flagged for that alone.

This has been a long-running project, and the Tor team clearly took their time making the switch, as opposed to it being a spur-of-the-moment change.


You're reading way too much into the title. "[...] is switching to [...]" does not have any implication of being a "spur-of-the-moment" thing.

>better portability across operating systems

Does Rust have better portability than C?


As the link I posted says:

> Portability has been far easier than C, though sometimes we're forced to deal with differences between operating systems. (For example, when we've had to get into the fine details of filesystem permissions, we've found that most everything we do takes different handling on Windows.)

Remember, we’re not talking about the language. We’re talking about the stdlib and the surrounding ecosystem. The Rust stdlib and third-party libraries generally handle Windows well, so the Tor developers don’t need to special-case Windows.
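To illustrate the kind of thing they mean (my own sketch, not the Tor developers' code): the standard library covers the portable parts, and only the genuinely platform-specific details, like Unix mode bits, need a cfg-gated branch.

    use std::fs;
    use std::io;

    // Print what we can about a file's permissions, handling the
    // platform differences that std can't paper over.
    fn describe_permissions(path: &str) -> io::Result<()> {
        let perms = fs::metadata(path)?.permissions();

        // Portable: the read-only flag works the same everywhere.
        println!("read-only: {}", perms.readonly());

        // Unix-only: mode bits have no Windows equivalent, so gate them.
        #[cfg(unix)]
        {
            use std::os::unix::fs::PermissionsExt;
            println!("mode: {:o}", perms.mode());
        }

        #[cfg(windows)]
        {
            // Windows uses ACLs instead of mode bits; a real program
            // would reach for a Windows-specific crate here.
            println!("mode bits not available on Windows");
        }

        Ok(())
    }

    fn main() -> io::Result<()> {
        describe_permissions("Cargo.toml")
    }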


It might have better abstractions over the current popular operating systems (Windows, Mac, Linux). Obviously not more portable in general.

Depends on what C code you are talking about.

Doubtful. I can't even get Rust to work here on my slightly older Mac system. So with Tor switching away from a well-supported language like C, it's simply another project lost to me (it's unlikely you can stick with an older version for long in this case, as they regularly break backwards compatibility in their network).

> I can't even get Rust to work here on my slightly older Mac system.

Could you elaborate on that? macOS Sierra (released in 2016) on Intel Macs is supported[1][2], which should allow Macs from late 2009 onward to work. The Intel Mac build is no longer Tier 1 because the project no longer has access to CI machines for them, and 32-bit cross-building is hampered by Xcode 14 not shipping the corresponding SDK[3].

1: https://github.com/rust-lang/compiler-team/issues/556

2: https://doc.rust-lang.org/nightly/rustc/platform-support/app...

3: https://github.com/rust-lang/rust/pull/118083


I'm on a slightly older version than 10.12, so I'm outside this limited support range. The problem is that as soon as Apple forcibly cuts off older systems, all of the Mac developers run in circles like lemmings and cut their projects off from the older versions too, feeding right into Apple's policy of deprecating older systems for their own greed.

For comparison… I have zero problems bootstrapping a recent gcc with all the bells and whistles on my setup. I could probably even still bootstrap it on my even older G4 Mac with 10.6.

But so-called "modern" language compilers like Rust or Go… no luck. There's a slight hope for the gcc Rust support someday, I guess.


> as soon as Apple forcibly cuts off older systems, all of the Mac developers run in circles like lemmings

You can make this point without showing contempt for people who are trying their best to support their users. It’s very hard to support Apple platforms if the OS isn’t supported or if Xcode drops support for it. There are no workarounds if no one offers CI hardware running that OS/arch.

Nor is there much incentive to try. Only a tiny minority are on the same OS as you, and with good reason. It was released 9 years ago and marked End-of-Life 6 years ago.

It costs time and money to support users like you, and open source volunteer driven projects sometimes don’t have that. Rust and other projects have added support for smaller platforms if passionate volunteers drive it forward. You’re welcome to do that, but it sounds like you’re happy with GCC.


In my experience the small projects with actual volunteers have no trouble helping out here. They are happy to accept a patch too, f.ex.

It's the big "open source" projects with company backing (or other larger organisational structures) behind them, which supposedly have a lot of "volunteers" (probably some paid folks, and IMHO lots of people who just slave away for free for someone else's gain), that often have trouble keeping 3 or 4 lines of code around for backwards compatibility.

I wrote hundreds of GitHub issues, sent in patches too, etc., and most of the discrimination you get is from the bigger projects (not saying this is the case for Rust here; I simply haven't managed to get Rust going at all, so I couldn't even report an issue or send in a patch to fix something). I do not use the term "discrimination" lightly either. Not everyone can afford to buy new hardware when Apple decides to abandon specific machines, often without a true technical need (see OpenCore, which proves this point). So this is essentially discrimination against a poor minority.

For "infrastructure" projects like Rust and Go that a lot of other projects depend upon, I would generally prefer a more conservative approach here, which doesn't seem to happen for some reason or another… "for the sake of 'progress'", I guess.


Thanks! This is a much better link than the OP, and it better preempts the usual round of "why not my other pet language X instead?" questions. Clearly this choice and strategy have been in the works for a long time, and the team has carefully thought out all the options.

For the work phone.
