Back when I covered Linux more often, there was no story I hated to see more than the released-on-a-Friday "vulnerability analysis." You've probably seen them, too: A researcher sits down with a bunch of operating systems, pores over vulnerability disclosures, puts everything in a spreadsheet, averages some figures, and bang, he figures out once and for all which operating system is the most secure.
There are all sorts of reasons to hate this sort of report, but let's start with the big one:
They don't help anyone but people with a horse in the "My operating system is the bestest!" race. That can include people who work for companies that make operating systems, developers who work on operating systems, or (and these are the people who always make the very most noise over the very most worthless information) garden-variety fanboys. All these folks get a lot of mileage out of such reports, either as fuel for bragging rights or as yet another source of outrage over just how harmful their hated OS adversary truly is.
Maybe you're saying, "But, Mike ... knowing how vulnerable an operating system is surely helps people make a more informed decision! That's how our free market flourishes! With information!"
And you'd have gotten off on the wrong track right at the start. Conflating a vulnerability count with security guarantees you're asking the wrong questions, which means your information isn't doing you much good.
The number of ways even simple things in an operating system can be compromised to wreak incredible havoc is staggering. Consider, for instance, "the PNG of death," a funny little bug in the way some Microsoft messaging products dealt with PNG images that could allow an attacker to "take complete control of an affected system." Ouch. All that from a SpongeBob buddy icon? Spend some time looking around and you'll see plenty more like it.
In fact, there's a steady stream of potential vulnerabilities that are routinely announced, patched and forgotten. They're found through code audits that ferret out buffer overflows and the like, or from someone poking around and wondering what might happen if they were to feed an application some odd data. We should all be grateful a good guy found them first, but few exploits grow out of all these vulnerabilities.
"Exploits" is the key word there. Vulnerabilities are a dime a dozen. An actual exploit a method that uses a vulnerability to compromise security on a computer is a more serious matter. Exploits are the DNA of the assorted worms and viruses about which your security software very conscientiously maintains your state of amplified paranoia and terror.
So perhaps, as a recent Microsoft report suggested, Windows Vista "appears to have a lower vulnerability fix and disclosure rate than the other products analyzed, including [...] reduced Linux installations." Curiously, XP had a similarly low rate. That hasn't worked out very well for XP users over the past few years. At least, not compared to Mac OS and Linux users.
Microsoft defenders might well shoot back that there are many more Windows installations than there are Linux or OS X ones. Which is true for desktops, perhaps, but decidedly untrue when it comes to Web servers, where ... once again ... Microsoft admins enjoy an "advantage" when it comes to the amount of malware they have to be on guard against.
But once we're into all those qualifications, we're back where we started: Counting vulnerabilities does little to tell us how secure from harm or compromise our systems will actually be with a given operating system installed.
But here's an interesting note from the author of that Microsoft report:
[...] the security researcher industry is maturing, growing and becoming more proficient at finding and disclosing software vulnerabilities. In recent years, tools have improved significantly, several professional code scanning tools have been released as products and newer techniques such as Fuzz testing have been developed and expanded to further stress the boundaries of software security.
How much more scrutiny does a new operating system face today compared with the year 2001? I can't easily put a number on it, but in my opinion, it does seem like there are more researchers, better trained, and with better tools and techniques than ever before creating an ecosystem better able to find and disclose security vulnerabilities.
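The fuzz testing the report mentions is conceptually simple: take a known-good input, randomly corrupt it, feed it to the code under test, and see whether the program fails safely or misbehaves. Here's a minimal sketch of that idea; the `parse_length_prefixed` function is a hypothetical stand-in for real application code (a PNG decoder, say), not any actual tool or API.

```python
import random

def parse_length_prefixed(data: bytes) -> bytes:
    """Toy parser standing in for application code under test.
    Expects a 1-byte length prefix followed by that many payload bytes."""
    if not data:
        raise ValueError("empty input")
    length = data[0]
    payload = data[1:1 + length]
    if len(payload) != length:
        # A careless C parser might read past the buffer here
        # instead of failing cleanly -- that's the bug fuzzing hunts for.
        raise ValueError("declared length exceeds actual data")
    return payload

def mutate(seed: bytes, rng: random.Random) -> bytes:
    """Flip a few random bytes of a known-good input."""
    data = bytearray(seed)
    for _ in range(rng.randint(1, 3)):
        data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

def fuzz(seed: bytes, trials: int = 1000) -> dict:
    """Run the parser on mutated inputs and tally the outcomes."""
    rng = random.Random(42)  # fixed seed so runs are repeatable
    outcomes = {"ok": 0, "rejected": 0}
    for _ in range(trials):
        try:
            parse_length_prefixed(mutate(seed, rng))
            outcomes["ok"] += 1
        except ValueError:
            outcomes["rejected"] += 1  # parser failed safely
    return outcomes

seed = bytes([4]) + b"abcd"  # valid input: length 4, payload "abcd"
print(fuzz(seed))
```

Real fuzzers are far more sophisticated about generating inputs and detecting crashes, but the principle is the same: throw odd data at a program until something gives.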
And there's certainly no dearth of security software out there to protect us from what does slip through each vendor or project's security net.
And yet, with all that improvement and protection, we're still contending with spiraling malware rates, increasingly sophisticated and dangerous botnets, and threats to our financial security in the form of subtle exploits that don't eat all our information; they just compromise our identities.
So maybe the real problem lies less with the operating systems and more with the people using them.
Two years ago, for instance, McAfee reported that at least 50 percent of the computers operating on the Internet aren't up to date on the latest patches. And as that Ars Technica article on the Microsoft study I linked to above says, "none of the listed operating systems, even when fully patched, can prevent the deliberate installation of malware disguised as useful software by an end user, which is the attack vector most commonly used by attackers today."
In all the reading I did about this latest vulnerability comparison, that's the most useful thing anyone had to offer.