Folks, now that there's a bit more code in Wine that "verifies" file signatures, I wanted to make sure everyone understands its current limitations.
1. It's only implemented for PE files and .cab files. Windows supports more formats, of course, notably MSI files (see bug 11759, http://bugs.winehq.org/show_bug.cgi?id=11759 )
2. Wine doesn't actually verify that the signature in the file matches the file being checked. Any valid certificate could be put into a file, and Wine would accept it.
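For reference, this is the kind of check a Windows application requests through the WinVerifyTrust API; a minimal sketch of the caller's side (just an illustration, not Wine's internal code) might look like this:

    /* Minimal sketch of the caller's side of an Authenticode check; this is
     * not Wine's implementation, only the API an application would use.
     * Link against wintrust. */
    #include <windows.h>
    #include <wintrust.h>
    #include <softpub.h>

    LONG verify_file_signature(const WCHAR *path)
    {
        GUID action = WINTRUST_ACTION_GENERIC_VERIFY_V2;
        WINTRUST_FILE_INFO file = { sizeof(file) };
        WINTRUST_DATA data = { sizeof(data) };
        LONG result;

        file.pcwszFilePath = path;

        data.dwUIChoice = WTD_UI_NONE;              /* no trust dialogs */
        data.fdwRevocationChecks = WTD_REVOKE_NONE;
        data.dwUnionChoice = WTD_CHOICE_FILE;
        data.pFile = &file;
        data.dwStateAction = WTD_STATEACTION_VERIFY;

        /* ERROR_SUCCESS means "trusted".  A complete implementation also has
         * to recompute the file's hash and compare it with the hash stored in
         * the signature; that comparison is exactly the step point 2 says
         * Wine currently skips. */
        result = WinVerifyTrust(NULL, &action, &data);

        data.dwStateAction = WTD_STATEACTION_CLOSE; /* release provider state */
        WinVerifyTrust(NULL, &action, &data);
        return result;
    }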
I don't consider this a serious security flaw, because I think the concept of a signature validating anything useful about a binary is flawed. Hence I'm not terribly motivated to fix it.
Flame away, --Juan
[Juan]
- Wine doesn't actually verify that the signature in the file matches the file being checked. Any valid certificate could be put into a file, and Wine would accept it.
I don't consider this a serious security flaw
I assume you don't ship signed software. If you did, you might see things differently. Unless I've misunderstood, you've made this possible:
1. I release my software with my digital signature attached
2. A malware author downloads my software, extracts my certificate, and applies it to his malware
3. His software infects a user's machine and damages it. The user discovers the infection, looks at the signature, **Wine says that the certificate is valid**, and the user blames me.
Please, either tell me I'm wrong, or make Wine honest about what it's telling the user.
I assume you don't ship signed software. If you did, you might see things differently. Unless I've misunderstood, you've made this possible:
1. I release my software with my digital signature attached
2. A malware author downloads my software, extracts my certificate, and applies it to his malware
3. His software infects a user's machine and damages it. The user discovers the infection, looks at the signature, **Wine says that the certificate is valid**, and the user blames me.
Please, either tell me I'm wrong, or make Wine honest about what it's telling the user.
No, you're not wrong, and this email was my attempt at being honest.
I'll point out that there are other avenues of attack that can lead Wine to "mislead" the user about who signed an executable. However, in my professional opinion, a signature on a binary isn't worth the bits it's encoded in. Any software, signed or not, can contain vulnerabilities. With the size and complexities of today's software, and with signatures only being affixed to the largest and most complex software, I'll state that in my opinion it's the signed software which is more at risk than the unsigned software. If you believe a piece of software signed by Microsoft (or Apple, or...) is any more trustworthy than some random piece of code, you needn't look far to disabuse yourself of that notion.
Even so, an exploit is far more likely to target Windows, and perhaps to fail on Wine, than it is to target Wine. I'm not attempting to hide behind a security through obscurity defense. I'm pointing out that even if digital signatures meant anything--and I maintain that they don't--the probability of their being attacked in Wine is very low. Therefore, from a risk management point of view, there's no compelling reason to fix it. I may fix it someday, but as I said before, that wouldn't remove all code signing vulnerabilities from Wine, it would only remove this particular one.
If you disagree, patches are welcome. --Juan
Hi Juan,
On Friday 25 July 2008 16:49:34 Juan Lang wrote: [...]
Please, either tell me I'm wrong, or make Wine honest about what it's telling the user.
No, you're not wrong, and this email was my attempt at being honest.
... and your honesty is appreciated!
I'll point out that there are other avenues of attack that can lead Wine to "mislead" the user about who signed an executable.
Security often involves providing many barriers. There's a tacit assumption that none are going to be perfect. A common mantra is "security in depth".
[...] With the size and complexities of today's software, and with signatures only being affixed to the largest and most complex software, I'll state that in my opinion it's the signed software which is more at risk than the unsigned software.
As an aside: this looks to me like a logical fallacy. If I may rephrase your argument:
1. Most signed software is from a large code-base (probably true)
2. Large code-bases are more likely to have vulnerabilities (probably true)
3. Therefore, signed software is more likely to have vulnerabilities (wrong: not deducible)
See: http://en.wikipedia.org/wiki/Fallacy#Logical_fallacy
If you believe a piece of software signed by Microsoft (or Apple, or...) is any more trustworthy than some random piece of code, you needn't look far to disabuse yourself of that notion.
Whether the code operates correctly (e.g., doesn't crash, taking out your filesystem and burning down your house) isn't under scrutiny. You are quite right in saying that a piece of software having a valid digital signature from Microsoft makes no statement about whether the software will work. Maybe it does, maybe it doesn't. But this isn't what digital signatures are about.
Instead, signed software attempts to prevent silent injection of Trojan software. The question is rather: did this software which claims to come from Microsoft really do so? Has it been hacked to include some "additional functionality"?
A digital signature makes one precise statement: that someone (or some agent) with access to the corresponding private key decided to sign the software.
If you also trust that:
1. you have the correct public key for the certification authority (CA) that issued the code-signing certificate,
2. the CA is doing its job correctly, and
3. the recipient of the code-signing certificate tries to ensure that the key is only used to sign their own software (e.g., that it hasn't been silently stolen),
then you can be pretty certain that, if some software was signed with a valid code-signing certificate, the software is the genuine article.
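To make that concrete, here is a rough sketch of the chain-building side of such a check, using the crypt32 chain APIs. It's illustrative only: it assumes the signer certificate has already been extracted from the signed file, and it applies the base chain policy rather than the full Authenticode policy.

    /* Illustrative only: given a signer certificate already pulled out of a
     * signed file, ask crypt32 to build and check the chain that assumptions
     * 1 and 2 above describe.  Link against crypt32. */
    #include <windows.h>
    #include <wincrypt.h>

    BOOL signer_chain_is_trusted(PCCERT_CONTEXT signer)
    {
        CERT_CHAIN_PARA chain_para = { sizeof(chain_para) };
        CERT_CHAIN_POLICY_PARA policy_para = { sizeof(policy_para) };
        CERT_CHAIN_POLICY_STATUS status = { sizeof(status) };
        PCCERT_CHAIN_CONTEXT chain = NULL;
        BOOL ok = FALSE;

        /* Assumptions 1 and 2: build a chain up to a root we already trust. */
        if (!CertGetCertificateChain(NULL, signer, NULL, NULL, &chain_para,
                                     0, NULL, &chain))
            return FALSE;

        /* Check the built chain against the base policy (untrusted root,
         * expiry, and so on).  Assumption 3, a silently stolen private key,
         * is something no software check can detect. */
        if (CertVerifyCertificateChainPolicy(CERT_CHAIN_POLICY_BASE, chain,
                                             &policy_para, &status))
            ok = (status.dwError == ERROR_SUCCESS);

        CertFreeCertificateChain(chain);
        return ok;
    }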
Even so, an exploit is far more likely to target Windows, and perhaps to fail on Wine, than it is to target Wine.
Well, maybe, but (without being too paranoid) it is possible to target attacks against individuals through on-the-fly rewriting of packages, or through DNS poisoning, or transparent web-proxies that parse the HTTP User-Agent string, or ...
I'm not attempting to hide behind a security through obscurity defense. I'm pointing out that even if digital signatures meant anything--and I maintain that they don't--the probability of their being attacked in Wine is very low. Therefore, from a risk management point of view, there's no compelling reason to fix it.
The risk-analysis processes I've been involved with combine two metrics, likelihood and impact, to provide a combined risk metric (e.g., by multiplying the two numbers).
So, if I may counter your argument: I believe your analysis fails to take into account that the vulnerability may be targeted and that automated updates may make use of digital signatures (greatly increasing the likelihood), and that the effect is the end user running an arbitrary, targeted payload (so the impact is pretty high).
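As a made-up illustration of that scoring (the 1-to-5 scales and the numbers are my own assumption, not anything measured):

    #include <stdio.h>

    /* Made-up numbers, purely to illustrate the likelihood-times-impact
     * scoring described above; the 1-to-5 scales are an assumption. */
    int main(void)
    {
        int likelihood = 2;  /* a targeted attack: unlikely, but not negligible */
        int impact = 5;      /* an arbitrary payload runs with the user's rights */

        printf("risk score: %d out of 25\n", likelihood * impact);
        return 0;
    }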
I may fix it someday, but as I said before, that wouldn't remove all code signing vulnerabilities from Wine, it would only remove this particular one.
OK, but generally speaking bugs are fixed one at a time. I may have missed your point here, but (in general) the fact that other bugs exist doesn't preclude fixing a specific one.
If you disagree, patches are welcome.
Sure ... and this is the acid test!
If no one spends the time to fix the problem then it isn't so important. (I know I fall foul here from having far too little spare time ;-)
But, to be honest, I'm a little surprised this made it into wine. The policy used to be something like "no hacks to support a specific application". I haven't seen the patch(es) but from how you've described it, this doesn't seem to pass the sniff test.
I think one should at least warn the user that the code has not been checked thoroughly. The alert dialogue box could include a "don't show this message again" check-box, but I feel we should allow Wine users to care about this.
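As a sketch of the kind of warning I have in mind (the wording and the call site are hypothetical, not existing Wine code; a "don't show this message again" option would need a custom dialog rather than a plain message box):

    /* Hypothetical sketch of the warning suggested above; the wording and
     * where it would be triggered from are assumptions, not existing Wine
     * code. */
    #include <windows.h>

    void warn_signature_not_fully_checked(HWND parent)
    {
        MessageBoxW(parent,
                    L"Wine accepted this file's digital signature, but it has not "
                    L"verified that the signature matches the file's contents.",
                    L"Signature not fully verified",
                    MB_OK | MB_ICONWARNING);
    }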
Friendly,
Paul.
Security often involves providing many barriers. There's a tacit assumption that none are going to be perfect. A common mantra is "security in depth".
Sure. It's just my professional opinion that a signature on an application provides no security. Zip, nada. It does give you some assurance of who it came from, that's all.
As an aside: this looks to me like a logical fallacy. If I may rephrase your argument:
- Most signed software is from a large code-base (probably true)
- Large code-bases are more likely to have vulnerabilities (probably true)
- Therefore, signed software is more likely to have vulnerabilities (wrong: not deducible)
That isn't my argument. My argument is that signatures provide no guarantees that the code behaves as intended:
1. Software may either contain deliberately malicious code (improbable but not impossible) or vulnerabilities (generally true)
2. Signed software contains no malicious code (probably true)
3. Signed software can't behave maliciously (wrong).
This isn't true because signed software may contain vulnerabilities, and these, when exploited, will cause the application to behave maliciously whether or not the code was signed. My statement about the size of signed applications was to emphasize that they may contain vulnerabilities.
Well, maybe, but (without being too paranoid) it is possible to target attacks against individuals through on-the-fly rewriting of packages, or through DNS poisoning, or transparent web-proxies that parse the HTTP User-Agent string, or ...
This doesn't pass the likelihood metric of risk analysis. As you said, you prioritize in terms of probability x impact. If probability is vanishingly small--which I argue it is--it doesn't matter what the impact is. If you argue that it isn't as small as I claim it is, well, we can disagree, but now on to the impact: the impact is you've installed a piece of software into Wine that claims to be from someone other than who it's really from. So what? What additional rights and privileges does it gain? In Wine, none--it always has precisely the same rights that you do, barring a Wine-specific privilege escalation attack, which I'll argue is exceedingly unlikely indeed.
That's just the thing--the privilege models in Unix and Windows are different. In Windows, just about everybody runs as an administrator anyway, so at least Microsoft tries to counteract that by providing signatures on code so users will be more careful about installing something. I don't think it's really a security property they're assuring. Instead, I think it gives the users a second chance to decide whether they want to run that cool lemming app someone emailed them. In Windows, the risks of running it are very high. In Linux, the risks are somewhat lower.
Signatures of automatic software updates are probably useful, but that's not what we're talking about here. I continue to maintain that signatures of random apps people are trying to run in Wine are not. To be clear, there are three applications I've worked on that use signature checking:
1. iTunes
2. AIM
3. The DirectX9 runtime
The first two are not trustworthy, no matter what the signatures say, but people want to run them anyway. Wine's raison d'etre is to let people run those applications if they wish. The DirectX9 runtime only provides a few usable DLLs that Wine doesn't provide, and that's in order to get some game to run (I'm guessing, I didn't need it myself.) The games are also not trustworthy *cough* rootkit *cough*.
But, to be honest, I'm a little surprised this made it into wine. The policy used to be something like "no hacks to support a specific application". I haven't seen the patch(es) but from how you've described it, this doesn't seem to pass the sniff test.
This isn't a hack. The code is more correct now than it was, and it gets a previously running application to run again. It's just not as correct as it could be.
Look, I'm not against fixing it at all. I'm just trying to be clear about what the code does and doesn't do right now. I have a limited time to work on this sort of thing, and I feel that keeping applications from working in the interest of some misguided notion of security is not in anyone's best interest.
As usual, patches are welcome. --Juan
Hi,
Paul Millar wrote:
As an aside: this looks to me like a logical fallacy. If I may rephrase your argument:
- Most signed software is from a large code-base (probably true)
- Large code-bases are more likely to have vulnerabilities (probably true)
- Therefore, signed software is more likely to have vulnerabilities (wrong: not deducible)
See: http://en.wikipedia.org/wiki/Fallacy#Logical_fallacy
A digital signature is intended to certify that the software was really published by its claimed vendor.
It does not protect against bugs, vulnerabilities, intentional malware or anything else.
But it protects you from hosting sites that modify installers to drop malware, for example. Or it may save you from viruses pretending to be Microsoft software.
Kornél