The effect of disclosure
Once again, I’ve not posted in a while, so I’ll start off by apologising for that. Today, I’m gonna talk about the reality of computer security. When I say reality, I immediately put my hands up and admit that I’m really just gonna spew a lot of my own opinion in regards to computer security. So, here goes…
Person A buys a knife. Person B buys a bat. In most places, person A and person B wouldn’t be accused of anything until they actually did something with it. Person C releases information on a vulnerability. In some cases, person C would be prosecuted just for doing that. What is the distinction, though? Sure, context is needed in all of these situations, but let’s take it for what it is. All of the people (A, B and C) possess a weapon and, ultimately, a choice. However, in the case of C, the choice can be very limited.
Person C is sitting here with a vulnerability in a well-known piece of software. He is immediately given two black and white options: disclose or don’t. Going for disclosure opens various other routes, but doing absolutely nothing leads to nothing. Which is bad!
Let’s say person C opts for disclosure and contacts the software provider. He has done the right thing: let the company know of its mistake and given it the chance to prepare a fix. Three months later, there are no updates and no replies from the company. Concerned that the vulnerability could be used for bad, he opts for public disclosure.
Public disclosure then puts pressure on the company to produce a patch. It also gives the bad guys access to the bug, so suddenly it’s a race. This is bad! (but necessary).
However, this poor researcher could find himself in legal crossfire if the company doesn’t understand the difference between the ethics of a good hacker and a bad one. This is very bad!
From my perspective, these legal fights can easily make a researcher think twice about disclosing a vulnerability to the public and instead keep it quiet. As I said earlier, keeping a vulnerability quiet can be even worse than disclosing it: if the company producing the software doesn’t know about it, they can’t fix it! And since person C found the vulnerability, somebody else (perhaps with more malicious intent) can also find it… and could do far worse things with it.
Perhaps the change that is needed is an educational one. Large software producers such as Mozilla and Google offer bounties for bugs and vulnerabilities (as in, they actively encourage disclosure to them!). However, I feel other companies don’t have the education needed to distinguish between the baddies and the goodies of the hacker world. Their views are probably twisted and rotted by modern media’s depiction of a hacker (as in a hacker related to security), but perhaps the first step towards improving security is to appreciate the effect of disclosure, rather than tirelessly trying to bury shit in concrete.
Short post, but I think it is enough.
Questions? Post below or ask me on Twitter!
Next time, I’ll be talking about the double-edged world of computer tools.