
Windows signing is a ripoff: at $500/year [1] you're getting nothing. Your certificate is not trusted. You have to "build reputation" for it before Windows Defender stops giving users warnings. Also, renewing a certificate is not a thing: every time you have to get a new one, with the same "reputation" story all over again.

[1] https://www.digicert.com/order/order-1.php



Funny thing about trust: I trust a developer who drops some $$$ on a code-signing certificate more than I trust a developer who doesn't. Even if it's just $20.

Also, the validation requirements to obtain a code-signing certificate, while certainly not bulletproof, are not nothing: you need to send in articles of incorporation and your business needs a listing with a physical address and phone number in a public directory (e.g., bbb.org), and someone representing your business needs to pick up that phone when the cert validator calls it.

Your business name and physical address are injected into the certificate. Basically code-signing certificates make it easier for people to find you and sue you if they truly want to. I suspect that's the whole point.

The problem here is that the Notepad++ developer wants his certificate to say CN=Notepad++, but he won't be able to obtain that until he has some kind of business or organization registered in his jurisdiction with that name. Whereas CN=FIRSTNAME LASTNAME he could probably obtain immediately (just send in his driver's license during validation).


> Funny thing about trust: I trust a developer who drops some $$$ on a code-signing certificate more than I trust a developer who doesn't. Even if it's just $20.

Why? If I expect to make four figures on spreading malware/adware, and I can assuage the nerves of people like you by spending two or three figures on a certificate, I'm going to buy the certificate and make it look all nice and pretty and take your money.

> Your business name and physical address are injected into the certificate. Basically code-signing certificates make it easier for people to find you and sue you if they truly want to. I suspect that's the whole point.

So I have to incorporate in Delaware, make up a fake address, and rent a burner phone for a while. I'm not seeing the downside.

https://www.bizfilings.com/toolkit/research-topics/incorpora...

> Delaware does not require director names and addresses to be listed in the Certificate of Incorporation.


They didn't say they trust the developer who spends the money absolutely, they said they trust the developer who spends the money more than they trust one who doesn't. Which is fair -- as you note, not every scammer will be scared off by the need to spend some money to pull the scam off; but some will, so the ratio of legitimate developers to illegitimate ones will be higher in markets where there's some cost to entry.


This is fallacious reasoning. A well-intentioned open source developer who earns no money from a labor of love has no incentive to spend further money to sign an app he's giving away for free anyway. On the flip side, a malicious actor who expects to earn money through a scam has every incentive to spend some money making the app look legit, especially if there is no risk involved.


The signed app can be globally disabled.


This could be easily achieved by Microsoft running a free signing service. Lowering the cost of signing to zero would significantly increase the proportion of signed apps.


The question was 'is someone who spends money for code signing more trustworthy than someone who doesn't', and it was being treated as if the trust, or at least the increased comfort, somehow comes merely from the act of spending money. It's an opt-in to a service that mitigates the impact of malicious code.


The parent statement was that having signed apps makes them easy to disable. If all apps had to be signed, everything would have a reputation hook and also be easily disabled. It's the hang-up of using the for-profit 'verified' code signing ecosystem that makes signing ineffective.

Of course, MSFT/Apple etc will abuse it to kill apps they/govt don't like.


I don't really understand how any of this makes signing ineffective.


If the only way to play is to go through entrenched gatekeepers, who watches the watchers, hmmm? If anything this should be seen as a power grab by entrenched interests: a cryptographic lever to pull that pre-emptively shuts people out of what should be a decision left to the user's discretion. Walled gardening at its finest.

Code signing is a bit like gun control. It really doesn't solve the problem at all. It just pushes it up a level, and makes things more difficult for legitimate users.

It also lines up incentives such that the preferred model of software distribution shifts in the grand scheme of things toward for profit code.

While code signing is a neat technical solution, it's still a technical solution parading about as a solution to a social problem. And the social problem it is a solution to (that of untrustworthy folks existing) is not in any way mitigated by the act of signing as mentioned previously.


I don't really understand how any of this makes signing ineffective.


Apps submitted to the Microsoft Store are signed by Microsoft (only), iirc.

Although it costs $19 or $99 to sign up, one time.


That is partly because only those apps can run on Windows S (also available in 1703: https://www.howtogeek.com/302352/how-to-allow-only-apps-from...)

They could enable this for win32 apps but they want to push things towards the walled garden.


> The signed app can be globally disabled.

Why can't the unsigned app be globally disabled? Is this not the basic premise behind Windows Defender and every other antivirus?


Well sure, a known-malicious app will be detected by Windows Defender, provided it has updates making it aware of the app. But a known-malicious signed app will also fail the code signature verification, in addition to the virus scan, if its certificate has been revoked.


In what way is this not two separate things uselessly duplicating the same functionality? If you can get a CRL you can get a definition update, and they both effectively do the same thing.


They are very much not the same thing. A signed app can be distributed from anywhere with the assurance it's the same app - it can't be maliciousified, and if it was malicious from the start, it can be disabled. The non-signed app can have zillions of malicious variants which something like Defender may or may not catch. It also gets a shot at circumventing (or even exploiting) AV.


Also, the "disable" in the signed case is much more powerful, since it disables all apps signed by the same key.


> A signed app can be distributed from anywhere with the assurance it's the same app - it can't be maliciousified

This is only true if there is some trust in what is signing them. If anyone can get one then anyone can sign the malicious version of the app with their own key, or one they stole from someone else. The user doesn't know who is supposed to be signing the app -- and if they did then you could be using TOFU or importing the expected source's key from a trusted channel without having to pay fees to anyone.

> and if it was malicious from the start, it can be disabled.

In the same way that Defender can block it. Then the attacker makes a new version signed with a different key.

The problem with CA-based signing is that it's a garbage trade off. If you make it easy to get a signing key, the attacker can easily get more and it does nothing. If you make it hard, you're kicking small developers in the teeth.

> The non-signed up can have zillions of malicious variants which something like Defender may or may not catch.

Which is still possible with code signing. The attacker gets their own key, uses it to infect many users, then some of those users are developers with their own signing keys and the attacker can use each of those keys to infect even more people and get even more keys.

Using keys as a rate limiter doesn't really work when one key can get you many more.

> It also gets a shot of circumventing (or even exploiting) AV.

As opposed to a shot at exploiting the signature verification method and the AV.

There is a better version of this that doesn't require expensive code signing certificates. Have the developer host their code signing key(s) on their website, served over HTTPS. Then the name displayed in the "do you trust them" box is the name of the website -- which is what the user is likely more familiar with anyway. If the program is signed by a key served on the website, and the user trusts the website, then you're done.

The application itself can still be obtained from another source, only the key has to be from the developer's website. Then future versions of the software signed with the same key can be trusted, but compromised keys can be revoked (and then replacements obtained from the website again).
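The scheme described above amounts to trust-on-first-use pinned per domain. A toy sketch (everything here is hypothetical: the domain, the key bytes, and `check` standing in for whatever the OS would do at install time):

```python
import hashlib

class KeyPinStore:
    """Trust-on-first-use store: pin a developer's signing key per domain.

    Illustrative sketch only; in the real scheme the key would be fetched
    from the developer's website over HTTPS before being pinned.
    """

    def __init__(self):
        self.pins = {}  # domain -> fingerprint of the pinned key

    @staticmethod
    def fingerprint(pubkey_bytes):
        return hashlib.sha256(pubkey_bytes).hexdigest()

    def check(self, domain, pubkey_bytes):
        fp = self.fingerprint(pubkey_bytes)
        pinned = self.pins.get(domain)
        if pinned is None:
            # First use: pin the key and ask the user whether they trust
            # the *domain* -- the name they actually recognize.
            self.pins[domain] = fp
            return "pinned (ask user: do you trust %s?)" % domain
        if pinned == fp:
            return "ok"
        # Key changed: either revoked-and-replaced or an attack;
        # re-verify against the developer's website before trusting.
        return "mismatch"

store = KeyPinStore()
key_v1 = b"developer-public-key-bytes"
print(store.check("notepad-plus-plus.org", key_v1))       # first use -> pinned
print(store.check("notepad-plus-plus.org", key_v1))       # same key -> ok
print(store.check("notepad-plus-plus.org", b"other-key")) # -> mismatch
```

The point of keying on the domain is that a mismatch is loud: future versions signed with the pinned key sail through, anything else forces a trip back to the trusted channel.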

This is better in every way than paying for EV certificates. It doesn't cost the developer anything, because they already have a domain (and if not they're very inexpensive and independently useful). But the attacker can't just register thousands of garbage domains because they're displayed to the user and nobody is going to trust "jdyfihjasdfhjkas.ru" or in principle anything other than the known developer's actual website, which the user is more likely to actually be familiar with than the legal name of the developer or their company.


I think if you don't like code signing for ideological/process reasons, you can argue that, preferably in reply to someone who wants to argue about it. But trying to work backwards from there to technical arguments that show how signing is the same thing as AV is futile, it just makes you type up longer versions of obviously technically inaccurate things.


There are good ideological reasons to not like code signing. But people present technical arguments in favor of it, which then need to be addressed so that people don't erroneously find them convincing.

And the technical arguments in favor of code signing are weak. They started off claiming a major benefit -- globally disable malicious code. Except that AV can do that too. The argument in favor of having code signing on top of that then becomes weaker -- AV can stop identified malicious code but it can't stop other malicious code from the same malware author. Except that code signing can't do that either since the malware author can sign other versions with different keys. So then the argument becomes, well, at least it rate limits how many different versions there are. Except that is only meaningful to the extent that getting a new key is arduous and not a lot of people have them, otherwise the attacker can get arbitrarily many more by either just applying for more under false identities or by compromising a moderate number of machines to capture more keys from the large number of people who have them. Moreover, using domain validation would already capture the case where you want to get the incremental benefit achievable from a minimal imposition on the developer.

Meanwhile the process of obtaining a code signing key has to be sufficiently easy and non-exclusive that even individual developers can reasonably do it, so making it purposely more arduous than that is a directly conflicting requirement.

The explanation is long because the details are relevant, not because anything "obviously technically inaccurate" is there.


Revoking a certificate invalidates the signature on the malicious executable and on any future executables signed with the same key.

Blocking a specific executable blocks that one. Depending on the AV used, simply rebuilding may get you through (different hash); some trivial modification will do.
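The asymmetry can be shown with a toy blocklist (the builds and the publisher name are made up):

```python
import hashlib

# Two builds of the same malware family: trivially different bytes.
build_a = b"evil payload v1"
build_b = b"evil payload v1 "  # rebuilt with a one-byte change

signer = "Bobs Software Emporium, Delaware"  # hypothetical publisher identity

# AV-style blocklist of known-bad file hashes: catches only the exact build.
bad_hashes = {hashlib.sha256(build_a).hexdigest()}
print(hashlib.sha256(build_a).hexdigest() in bad_hashes)  # True: blocked
print(hashlib.sha256(build_b).hexdigest() in bad_hashes)  # False: slips through

# Revoking the signer's certificate covers every build it ever signed.
revoked_signers = {signer}
for build in (build_a, build_b):
    print(signer in revoked_signers)  # True for both builds
```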


> Which is fair -- as you note, not every scammer will be scared off by the need to spend some money to pull the scam off; but some will, so the ratio of legitimate developers to illegitimate ones will be higher in markets where there's some cost to entry.

The big scammers, the ones who are the most likely to have an actual business plan for selling my information, aren't.

Note that I know Adobe is "trusted" in the relevant sense.

Note that I don't think Adobe is trustworthy in any real sense.


The good scammers absolutely will not be scared off by the need to pay a penny to steal a dollar. You have to buy a cheap watch/violin/purse if you want to pass it off as an expensive one. You have to pay off in the back of the operation if you want to keep cash coming in through the front. Indeed, one of the easy ways to short-circuit human trust defenses is to make a show of trust first, such as by placing personal assets at risk. "Here, I'll trust you to hold my wallet full of $500 cash, while I drive your expensive late-model car--that's worth even more when shipped to mainland China--to go get help. You know I'm coming back, because $500 is a lot of money."

The scheme shifts the need to trust from Random Q. Hacker to the certificate-issuing authority, and that only helps if the authority is more trustworthy than the individual. If they don't put forth an effort to really dig in to those applying for certificates, they're just selling costumes for the security theater.

I trust Microsoft more than someone I have never heard of, but I don't inherently trust them more than the informal assembly of Notepad++ contributors and lead FOSS developer Don Ho. If Microsoft's code-signing certificate validation process is not capable of recognizing organizations that are not formally incorporated, and allowing them to use the name of their brand, rather than the names of their lead developers or maintainers, they are leaving a huge fraction of my installs hanging in the wind.


I don't disagree with your reasoning. But: I posit there are fewer "good scammers" than "scammers." Added friction probably reduces the total number of active scammers.


Doesn't that just clear out the low-quality, low-effort competition for the better scammers? And create a stronger presumption that any given person is not a scammer, because otherwise Apple/Microsoft/Google/Amazon/whomever would have kicked them out?

People might forget that "caveat emptor" still applies, even in a walled garden.


The huge number of fraudulent and malware-ish apps for android vs iOS does suggest that costs reduce the number of low quality attackers. I guess that is good for protecting the naive user. But I'm more concerned about protecting against the threat that will take your whole digital identity.


> not every scammer will be scared off by the need to spend some money to pull the scam off; but some will

The same goes for developers, as you can see in this article.


It is not fair. The same cost in money does not translate to the same cost in effort to earn trust. This is structural discrimination.


You are not necessarily in disagreement with the poster you responded to. They meant it was fair in the sense that the user is using a fair heuristic to determine how likely they are to be scammed. You're saying it's unfair to the person who made the software, as they are experiencing structural discrimination. Those two points are not mutually exclusive.


So the person who spends the most amount of money is the one you trust the most? That's some interesting reasoning. Does that mean if I spend $21 you trust me more than the guy who spends $20?


I would imagine barrier to entry plays a large role in the sense of comfort here. If you have to incorporate in Delaware and pay $500 and jump through hoops, you’re more likely to turn to a simpler and easier alternative.


Exactly! I trust "you must do some things to obtain this" more than I trust "you don't have to do anything".


The part you're missing is that your delaware company burner phone certificate investments can be invalidated in two seconds by the certificate's trust getting revoked.


The part you're missing is that they won't be in practice. There's malware on virustotal with (still) valid code-signing certs.


I mean, if someone is going to commit a crime, forcing them to leave a paper trail is going to scare at least some of them off. And if I'm installing Adobe Photoshop, and the cert comes from Bob's Software Emporium, Delaware, it raises questions.


Is that actually true? How many people, when installing Photoshop, actually look at who issued the cert?


In Windows, the name of the publisher in the cert shows up on the UAC prompt whenever the program asks for elevated privileges. That's the point of this whole thread -- the author isn't paying for a cert because he can't make the UAC prompt say Notepad++ instead of his real name (which he could, and I have no idea why he thinks it's so complicated, but there it is).


That doesn't have anything to do with the question I asked. See also, TLS exceptions.


> I trust a developer who drops some $$$ on a code-signing certificate more than I trust a developer who doesn't.

This is a misapplication of Bayes theorem.

  P(bad) is the probability any app is bad
  P(signed) is the probability any app is signed

  P(bad if signed) = P(signed if bad) * P(bad) / P(signed)
Essentially your trust model requires that "the fraction of bad apps that are signed is small", i.e. P(signed if bad) approaches 0. But signed malware is out there - famously Stuxnet, but others before and since: http://users.umiacs.umd.edu/~tdumitra/papers/CCS-2017.pdf

Malware authors have incentive to make their apps appear legitimate, either by stealing keys, impersonating companies, or other mechanisms. Signing also helps get past automated checks (per the paper above).

Further, those probabilities assume random distribution, but I'd suggest that really expensive/dangerous malware has greater incentives to appear safe, so it is even more likely to be signed, even if most malware is not. Stuxnet would be a case in point - high value, sophisticated malware, signed.

  P(really bad if signed) = P(signed if really bad) * P(really bad) / P(signed)

  P(really bad) is lower, 
  but P(signed if really bad) approaches 1, 
  so P(really bad if signed) approaches P(really bad) / P(signed)
Meaning the worse the malware, the less the signature tells you.
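To make the formulas above concrete, here they are with assumed (purely illustrative, not measured) numbers plugged in:

```python
# Ordinary malware: signing lowers the posterior, per Bayes.
p_bad = 0.05            # P(bad): assume 5% of apps are malicious
p_signed = 0.40         # P(signed): assume 40% of all apps are signed
p_signed_if_bad = 0.10  # P(signed if bad): assume 10% of malware is signed

p_bad_if_signed = p_signed_if_bad * p_bad / p_signed
print(round(p_bad_if_signed, 4))  # 0.0125, down from the 0.05 prior

# The "really bad" case: sophisticated malware is almost always signed.
p_really_bad = 0.001
p_signed_if_really_bad = 1.0

p_really_bad_if_signed = p_signed_if_really_bad * p_really_bad / p_signed
print(round(p_really_bad_if_signed, 4))  # 0.0025, *above* the 0.001 prior
```

With these numbers the signature more than halves the odds of garden-variety malware, but for the sophisticated stuff it is no reassurance at all.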


All the GP is saying is that he believes that P(signed if bad) < P(signed).

It's a fair belief because paying for something leaves a paper trail. MS certificates are a farce at $500/year, but Google's $20 once is a very reasonable thing.

Your point that really bad malware has higher odds of being signed is a good one, but really bad malware is much less likely than simply "malware".


I believe he's saying P(bad if signed) < P(bad if not signed) - "I trust signed apps more than unsigned apps". I'm saying, maybe, but I'm not sure, and Cost * P(bad if signed) may be much worse than Cost * P(bad if not signed).

When the Transmission bittorrent client site was hacked to distribute ransomware, it was signed using an unrelated certificate that was likely stolen. This happened twice within a year, with different valid (stolen) certificates:

https://blog.malwarebytes.com/threat-analysis/2016/09/transm...

Stuxnet certificates were also stolen.

This disproves the GGP's premise that a signed app implies the developer paid for it, as well as your assumption that the paper trail for legally acquiring a certificate is an impediment to signing malware.

You're not only trusting the developer who purchased the certificate and the CA that granted the certificate, but also trusting the ongoing security of everybody else who has purchased a trusted certificate. That's a pretty open circle of trust.

Certificate revocation can limit the time of exposure once malware is distributed, but it isn't always implemented.

https://arstechnica.com/information-technology/2017/11/evasi...

"they found 189 malware samples bearing valid digital signatures that were created using compromised certificates issued by recognized certificate authorities and used to sign legitimate software. In total, 109 of those abused certificates remain valid."


Well, I don't care whether the developer paid for the certificate, and I don't see why someone who develops FOSS should pay money for something that doesn't bring any of that money back to him. At least for open source software, certificates should be offered for free, in my opinion.

Also, the fact that you are required to have a corporation - why? If I develop some open source software, I need to register a corporation just to deploy it on Windows the correct way? That is total bullshit to me; just let me sign my software like it's done on other platforms, for nothing or a very small fee, and be done.


Certum offers cheap code signing certificates for open source developers. I don't know how "good" they are, though, just stumbled across them a while ago because MPC-BE is using one.


When we were getting EV certs from Digicert, somebody called up the office to confirm the name of the CEO. He answered the phone, and the person told him he couldn't validate his own identity. So he passed the phone to the person sitting next to him, she said "Oh yeah, this is totally him sitting next to me", and we got our EV cert within the hour.


> Even if it's just $20.

$20 doesn't get you an EV certificate anywhere.

We're talking about non-trivial hundreds of dollars per year, which is completely unsustainable for an open source driver for example.


My point was I would trust someone who drops $20 more than I would trust someone who drops nothing.

$61/yr (USD) will get you an OV cert - there are 10% discount codes that are easy to find for these guys, and their list price is $67/yr:

https://codesigning.ksoftware.net/

But the Notepad++ guy will need a business registered with that name before he can obtain CN=Notepad++, no matter how much he's willing to pay.


This certificate doesn't help bypass UAC, and the "unsecure" prompt is still shown to the user.


That's incorrect. Any Authenticode code-signing certificate (trusted by Microsoft) turns the UAC prompt from yellow to blue. EV certificates are probably for auto-trust with SmartScreen.


> EV certificates are probably for auto-trust with SmartScreen.

Yes, that's correct. EV just skips the reputation building phase.

Source: https://blogs.msdn.microsoft.com/ie/2012/08/14/microsoft-sma...

> Programs signed by an EV code signing certificate can immediately establish reputation with SmartScreen reputation services even if no prior reputation exists for that file or publisher.


Only after you pointed out that the UAC color is blue instead of yellow did I start to see the difference.

No user will understand this, though.


Up until reading this thread I wasn't even aware that there were different UAC colours.


Your trust is misplaced; a developer who drops $$$ on a certificate could be a dyed-in-the-wool criminal. Just because code is signed and certified doesn't mean it doesn't do anything bad.

Signing and certificates revolve around trust/mistrust in the delivery channel not in the purveyor.

That problem can be solved with other tools, like PGP. You don't have to be blackmailed by a platform's certificate racket.


> That problem can be solved with other tools, like PGP. You don't have to be blackmailed by a platform's certificate racket.

It kind of works that way in Linux world where artifacts are PGP signed and to get your key into distro store one has to have "reputation". With the caveat that different distros have different schemes.

X.509 as used by Windows has two nice properties that PGP doesn't: certificate attestation (MS can be sure your private key is on a hardware token) and timestamping (even if the cert expires, a signature that carries a timestamp remains valid).
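The timestamping property boils down to a simple rule, sketched here as a toy model (this is the idea, not the actual Authenticode verification algorithm):

```python
from datetime import datetime

def signature_valid(signing_time, cert_not_after, has_timestamp, now):
    """Toy model: with a trusted RFC 3161 timestamp, validity is judged
    at signing time, so the signature survives certificate expiry;
    without one, it dies with the cert."""
    if has_timestamp:
        return signing_time <= cert_not_after
    return now <= cert_not_after

expiry = datetime(2020, 1, 1)
signed = datetime(2019, 6, 1)
today = datetime(2021, 6, 1)  # the cert has long since expired

print(signature_valid(signed, expiry, has_timestamp=True, now=today))   # True
print(signature_valid(signed, expiry, has_timestamp=False, now=today))  # False
```

This is why a one-year certificate can sign binaries that stay installable for a decade.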


> It kind of works that way in Linux world where artifacts are PGP signed and to get your key into distro store one has to have "reputation". With the caveat that different distros have different schemes.

... none of them financial.

I'm not saying that financial incentives are bad, necessarily, but I am saying that being able/forced to buy your way in privileges the most organized scammers, the ones who have a cogent business plan to make money from their chicanery and some seed capital, over programmers who don't have money, have no expectation of making money, and are only motivated by getting their code out there and used.

Debian has a Social Contract. Microsoft has a pricetag. I know which of them Adobe is more comfortable with.


> I am saying that being able/forced to buy your way in privileges the most organized scammers

This works both ways because legitimate software developers also don't have easy ways of pushing their signed software to end users. Usually step 1 in installing software from external developer is "get my PGP key imported" [0].

[0]: https://www.sublimemerge.com/docs/linux_repositories

I don't mean Linux distro's model is worse or that Windows model is better. What I mean is that none of them is significantly better than the other. Just different with different trade-offs.


> Usually step 1 in installing software from external developer is "get my PGP key imported" [0].

Even #%@! Oracle does it:

https://www.virtualbox.org/wiki/Linux_Downloads

I wonder what’s the point of the PGP key then.


> I wonder what’s the point of the PGP key then.

Trust on First Use. Once the key is imported it stays the same.

People working for that organization can sign the key to attest it's real (Web of Trust), though I wonder how they would check it. Organization (non-individual) keys are weird because ultimately it's just an individual behind them.


Kinda. You can use mimikatz to override the checks that the private key is isolated; you can even override the 'no export' flag. Timestamping relies on external trusted timestamp providers implementing RFC 3161. There are many out there; maybe you could get a false timestamp out of one of them. I agree it could be stronger than PGP, but it suffers from a design flaw in that it considers the geometry of the PE file, whereas PGP signs the whole blob. CVE-2017-0215 is an example of a bypass by copying a previously signed header. It is more fragile and has been bypassed historically.


> You can use mimikatz to override the checks that the private key is isolated, you can even override 'no export' flag.

"No export" flag is not the same. What I'm talking about is keys stored in hardware modules (TPM, Yubikey) so that the private key is never disclosed, you can only ask the hardware to perform actions using that key.

See for example Yubikey docs: https://developers.yubico.com/PIV/Introduction/PIV_attestati...

> There are many out there, maybe you could get a false timestamp out of them.

Maybe? That's how the CA model works; they are trusted third parties. Code signing CAs are required to operate timestamping services, so if getting a cert from them is not a security issue, timestamping should also be fine.

PGP on the other hand if used in a Web of Trust model makes every valid key a CA. Not to mention that PGP doesn't have extended key usage flags so signing software is the same as signing e-mail (you cannot specify that you want to have this key be used for code signing exclusively).


> I trust a developer who drops some $$$ on a code-signing certificate more than I trust a developer who doesn't.

Why?

> you need to send in articles of incorporation and your business needs a listing with a physical address and phone number in a public directory (e.g., bbb.org), and someone representing your business needs to pick up that phone when the cert validator calls it.

So you only trust code that comes from businesses?


> Funny thing about trust: I trust a developer who drops some $$$ on a code-signing certificate more than I trust a developer who doesn't. Even if it's just $20.

Ever notice how most con men wear nice suits? You're advocating for the digital equivalent.


Ever notice how a con man in a nice suit is much more convincing than one in a track suit?


To me it's rather the opposite. Especially for open source software. Spend your time, money and energy on improving the actual product instead of wasting it on smoke and mirrors.


You can file a "doing business as" (DBA) certificate online for under $10, at least in Texas.


>Funny thing about trust: I trust a developer who drops some $$$ on a code-signing certificate more than I trust a developer who doesn't.

I don't trust poor people either. Bigger chance that they're scamming... because they need the money.


The flip side of this reasoning is that successful scammers aren't poor, but they are very likely to still be scammers.


Wow.


> but he won't be able to obtain that until he has some kind of business or organization registered in his jurisdiction with that name.

Even in good old bureaucratic Germany this will take you less than an hour and cost you about 30€. Can't believe it can be much worse anywhere else.


You know our whole civilization is based on $$$ == trust. Just wait until you lend money to someone who is going to remind you every month he will pay you back because he is a good person, but you never see the money.


There's a backdoor that lets you bypass the SmartScreen reputation requirement: pay more money for an EV cert [1]. I don't agree with this industry practice.

Reputation requirements either shouldn't have backdoors or shouldn't exist in the first place.

1. https://twitter.com/JosephRyanRies/status/951643158118567937


The Reputation requirement exists simply because there are CAs in the Windows certificate store that aren't super trustworthy, and frankly because malware could seek to get a code signing certificate.

Arguably the Reputation requirement is more helpful than the information held in the certificate, since Reputation is hard to fake, whereas that information is provided by the requestor and its validation depends on the CA's processes (which, as I said, vary wildly).

It is one of those "greater good" things. It does suck for FOSS however.


I'm not arguing against reputation requirements, I'm arguing for consistency.

EV certificates are literally a reputation requirement backdoor.

If EV-signed apps had to deal with the same SmartScreen reputation requirements as non-EV-signed apps, Microsoft might actually have to address this issue brought up in the parent comment:

> Every time you have to get a new one, with same story of "reputation" again.


I can't stick an EV token into my cloud VPS build server.


EV code signing certs cost more because they require the private key to be stored exclusively on the hardware token so it's harder to misuse.


I have to say I agree here. If Notepad++ provides hashes for the downloaded EXEs, it is completely in the right not to want to pay middlemen for a fancy "this is OK" screen on installation. That seems ridiculous and greedy.
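Checking a published hash is straightforward; here's a sketch (file name and expected value are placeholders, and the expected hash must of course come from a trusted channel such as the project's HTTPS site):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a downloaded file in chunks so large installers don't fill RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# expected = "..."  # value published on the project's website
# assert sha256_of("npp.installer.exe") == expected, "corrupted or tampered"
```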


Especially for a program that caters to developers. People will understand.


Corporate IT security departments may not, though. Certificates can be used as a secure way of managing AppLocker exemptions.


Wait, are you saying that the Apple Developer program at $99/year is actually quite a good deal in comparison?

I will definitely pull this thread out next time someone complains that Apple is too expensive and that they are milking the poor developers...


For macOS, for $99/year you're getting a nice green pass from Gatekeeper.

For $400/year you're getting nothing. I was told here that I could go lower, but this is what I've been paying for the last decade.


> I will definitely pull this thread out next time someone complain that Apple is too expensive and that they are milking the poor developers...

Other companies also milking their developers does not invalidate this argument.


They're both too expensive?

(The only acceptable price is $0, IMO.)


You can deliver unsigned Windows apps.


You can deliver unsigned Mac apps too, just not on the MAS.


Running an unsigned app requires a separate option that's reasonably well-hidden, however [0], so it's difficult to avoid the signing requirement if you're going to distribute your app to a wide audience.

[0] https://www.macworld.com/article/3140183/how-to-install-an-a...


They are blocked by default on the latest two or three versions of macOS. You'll have to explain to your users that they need to right-click on the app icon and click Open to launch your app.


You can buy a Comodo Code Sign cert for $95 if you buy from a reseller rather than direct.


According to juliusmusseau's comment, you can even get it down to $61/year. (I've used KSoftware for signing certs before and would recommend them too.)

https://news.ycombinator.com/item?id=19330504


At least Windows lets you install pretty much whatever you want. None of this nonsense is mandatory.


Not in my experience. Many Windows users have antivirus software installed, and it flags unsigned software. I'm speaking from experience: it is cheaper to buy a pricey certificate than to lose big chunks of customers to Windows Defender warnings. This is especially a problem if you have software with a rapid development cycle, like I do - we have new builds going up every couple of days or so.

Also, Windows' "trust system" seemingly works on unsigned binaries too. If you don't disable a bunch of security settings, you get blocked, similar to self-signed certificates on HTTPS. You can get around it by clicking a tiny button. If enough users do it, the software gets flagged as OK.

A certificate means this only has to happen per certificate, not per binary.


Shhh! Don't give them ideas! :)


You can get a certificate far cheaper than that - K-Software offer them for $85/year.

I've used them for years and can recommend them.


K-Software does not sell EV certificates for $85/yr. They start at $349/yr.

The parent comment's issue is that EV certificates are essentially required due to the poorly-designed SmartScreen reputation filter. The $85/yr certificate you're mentioning doesn't help solve this.


I didn't know about the EV workaround and the OP didn't mention it.

I've never used an EV cert before for code signing. When we first started, I think Smartscreen was a nuisance for about 2 weeks, but years on, I've never had to think about it again - even when we've renewed the cert.


Thanks for taking the time to describe your experience.

When you renewed the cert, did you use the same key pair? (I'm wondering how Microsoft correlates reputation.)


No, it was a new key pair each time. I'm also rather interested to know how it decides reputation though!


Probably based on the Subject name. It'd require a lot of money to verify that, though (e.g. buying a cert from another CA with the same name).


IME Windows SmartScreen still gives a scare-warning for software that's signed with a valid certificate, unless some magic "reputation threshold" is reached - and who knows what factors into this.

The current code-signing-certificate model is pointless, regardless of price.


I didn't realise that. We released desktop software several years ago, and customers/trialers did report that Smartscreen was flagging it for a couple of weeks. Never had an issue since then, though.

Having said that, Smartscreen is opaque, and a nuisance.


I recently got an $85 certificate from K-Software, which is actually Comodo, now Sectigo. It was a nightmare. Took two months and fifty emails.


> Also, renewing certificate is not a thing.

Oh, no. We just kept renewing our EV certs with them for the past several years... if only we'd known that we can't. Damn. Such an amateur shop, this Digicert. Unacceptable.


You are getting a new EV certificate every time.



