In recent years, Consumer Reports (CR) decided to take on reviewing security software in addition to the scores of energy drinks, vacuum cleaners and toasters they put through their paces annually. Consumer Reports has faced some challenges in this new endeavor; you might remember they generated a fair bit of industry discussion based on their security testing practices. Well-known organizations like McAfee, Sunbelt and others voiced concerns about test methodology that fell well outside the industry norm.
Last year, we were – to put it mildly – shocked by the information and results published in the Consumer Reports security software evaluation. We immediately requested their test methodology to understand how they’d arrived at their conclusions. There’s admittedly still a lot of variability in security test practices, so we try to give the benefit of the doubt when we see something that just doesn’t add up. We repeatedly attempted to engage Consumer Reports to provide our feedback and open a dialogue with their reviewers. In spite of our efforts, these discussions never took place. While we don’t always agree on the methodology reviewers use, there’s almost always a conversation to determine where the product didn’t get the job done or, alternatively, where the test methodology fell short. We were pretty disappointed when this didn’t happen with such an influential publication as CR.
Fast forward to one year later and the publishing of this year’s State of the Net and security software review. As I read through the report and the review, it was pretty clear that little had changed from last year’s misadventures in testing. While we’d love to speak with Consumer Reports directly about their latest test, based on last year’s absence of meaningful dialogue and what’s looking like another year of stonewalling, we figured we’d share our thoughts in a public forum. So here’s a short list of the areas of the report that we felt sounded a little funny or were well off the mark.
Automobiles, Security Suites & Timing

First off, what would you think of a publication that released a review for this year’s line-up of luxury sedans a month before next year’s line-up was released? You arrive at the dealership only to realize that the review you consulted and relied upon to make your purchasing decision is no longer that relevant. You’d probably be pretty disappointed in the magazine and its ill-timed assessment of last year’s product. Consumer security software undergoes major revisions every year, and most vendors release the new versions of their products between August and October. It’s been this way for a long time now. Which raises the question: why would CR release its review of last year’s products in August, right before all the new versions are released? Are they paying attention?
AntiSpyware Strangeness

AntiSpyware is an area where security vendors vary considerably in terms of their detection policies and approaches. While we generally agree on the basics, there’s a lot of room for interpretation and widely varying implementation of features. You can witness significant differences in how a product detects and displays its findings, even for something as mundane as tracking cookie detection or adware like Zango. Couple this with the differing effectiveness of detection and removal engines, and the AntiSpyware category usually shows pretty significant differences across products in tests just like this one.

So when ten vendors in a test rate exactly the same in AntiSpyware, something doesn’t smell right. Consumer Reports’ response to this testing anomaly: Time for a tie-breaker! CR went on to select a really strange mix of functionality in order to determine their rankings in this section. I can see why they included “Protects Browser Settings” as one of the features (though I think a better approach is browser exploit blocking, which prevents threats from ever having the chance to change your browser settings in the first place), but the typical consumer and CR reader probably doesn’t care much about sending “threat info to manufacturer” or perusing the “threat details”. The primary goal is protection, not reporting back to the manufacturer or reading about what’s going on behind the scenes. Consumer Reports should instead be testing product aspects like AntiRootkit functionality, or checking for the inclusion of a system recovery tool for scrubbing a deeply infected PC. Maybe it’s just me, but I’d much rather purchase a product for its effectiveness at protecting my system from rootkits and solving deep infections than for its ability to tell me about the spyware that snuck onto my system, or to send information ex post facto to the security vendor.
The Internet "Off" Button

As a global company we see all types of product reviews. Some are based mostly on the user interface, support and how the product’s out-of-the-box settings perform; others drill down deep into customization and cover a full range of features. The German publications are well known for their detailed analysis of security products, whereas a gamer magazine would not test at that level, opting instead to measure the impact on network latency and how much the product inhibits them during a WoW raid.
What we are not used to seeing is a publication that modifies malware to perform their own testing and then rates products by the inclusion of an Internet “off” button which “cuts Internet access if a malicious file is suspected” as CR has done in their latest review. Let’s say you think that’s a reasonable idea (you can take the support calls from my mother when she accidentally kills her connection!) and that CR is rating security suites for a less-advanced user who might like to have a “big red off button”. Why then would they rate privacy filters which require users to configure “rules” to prevent data leakage as an essential feature? Or for that matter, why rate suites on having a pop-up blocker that is now standard in every modern Web browser?
(As a side note, at least one of the privacy features from the vendors rated here can be given the slip simply by capitalizing the letters of the string that is supposed to be blocked. I imagine it was beyond the scope of their testing, but it did not escape our summer intern. He’s pretty clever, but it’s not as if he was presenting at DefCon last week on the latest firmware hack.)
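To illustrate the flaw our intern found (with a hypothetical filter and made-up data, not any vendor’s actual code), a privacy filter built on a case-sensitive substring match can be sidestepped with nothing more than the shift key:

```python
# Hypothetical sketch of a data-leakage filter. A case-sensitive substring
# match blocks the protected string only when it appears verbatim, so simply
# capitalizing it slips past the filter. Normalizing case before matching
# closes that particular hole.

def naive_filter(outgoing: str, protected: str) -> bool:
    """Block the transmission (True) only on an exact, case-sensitive match."""
    return protected in outgoing

def case_insensitive_filter(outgoing: str, protected: str) -> bool:
    """Block the transmission regardless of letter case."""
    return protected.lower() in outgoing.lower()

protected = "jane.doe@example.com"  # a string the user asked the suite to block

naive_filter("sending jane.doe@example.com", protected)             # blocked: True
naive_filter("sending JANE.DOE@EXAMPLE.COM", protected)             # bypassed: False
case_insensitive_filter("sending JANE.DOE@EXAMPLE.COM", protected)  # blocked: True
```

Of course, a real filter would also need to handle character-encoding tricks and other obfuscations, but case normalization is the bare minimum you’d expect from a shipping product.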
Missing What Consumers Care About

We conduct a steady schedule of primary consumer research, pay others to do it on our behalf, and purchase research reports from other organizations. I’m betting this doesn’t surprise you. One of the obvious questions we ask when people buy or switch security brands is why they did it. The results from NPD research we commissioned last year were obvious yet instructive: the main reason people switch security brands (44%) is the impact on performance, both real and perceived. This ranked even higher than basic protection effectiveness (e.g., detecting and removing threats), which came in second place at 28% of the reasons for switching. Pretty straightforward, right? “Keep bad stuff off my PC and don’t make the cure worse than the illness itself.” At least CR acknowledges performance as a key criterion but, when you rate a suite’s consumption of system resources with seemingly equal weighting to a pop-up blocker and an Internet “off” button, something has gone awry. Firefox blocks my pop-ups quite well, and I can yank the cat-5 from my Linksys router when all hell breaks loose, but there’s not much I can do to keep greedy security software from pegging my CPU other than uninstall it and try something else.
Sometimes You Need It, Sometimes You Don’t?

Even though CR themselves cite “downloading free software” as the #4 top “online blunder,” they go on to use “easy temporary disable to pause protection when you’re installing software” as one of four criteria by which they rate Antivirus protection. Translation: Accidentally installing applications that contain security threats is the #4 top online blunder, but make sure you hit that “easy disable” button on your security product so you can install malware. Your security software should be checking the applications that you attempt to install. Period. Pausing AV protection to install software is dubious advice to consumers, especially given the rise in threats like the Storm Trojan, which use social engineering tactics to convince users to install malware. Not to nitpick too much, but CR names Storm as a “recent” threat, though its inception dates back to January 2007. Constantly changing? Absolutely. Recent? Consider that in January 2007 the word “Obama” still meant a little-known city in Japan to anyone but the politically savvy.
Fun with Freeware

We know that some people simply do not want to pay for security. Got it. I refuse to purchase a GPS for my car, preferring instead to map my route beforehand and use Google Maps on my Nokia e61i. I know it’s not as good as a fancy new TomTom or Garmin device, but I’ve just learned to live without it and I’d rather use the money for something else.

So in light of this, I understand why CR would compare a freeware alternative to paid security suites. Nonetheless, one thing caught my eye: they did not measure the annoyance factor of using freeware, notably the pop-up and other advertising nags you suffer if you use a combination of freeware products as suggested in the article. Maybe they found a magic combination, but a recent test conducted by Security Catalyst resulted in 42 different pop-ups in one week. Web browsers now have built-in pop-up blockers that will prevent most of the ads once handled by the security suites that still insist on carrying that feature, but what will block the pop-ups your freeware security products are assaulting you with?

There are other factors to consider beyond annoyance, such as the time needed to install, software conflicts, and effectiveness of protection. In fairness, CR touches on effectiveness somewhat, but concludes that freeware will probably suffice if you practice careful computing. That’s just not the case anymore. All of us in the security field know that “staying in safe neighborhoods” simply isn’t viable advice, because a large percentage of today’s attacks come from legitimate sites that have been hacked. Check the Washington Post & Websense, ScanSafe, or any of the other recent threat landscape reports out there from reputable sources. StopBadware, of which CR is a member, also had an article on this topic. If you visit a legit site where the ad network let an infected ad slip through, the advice to stay in the good neighborhoods of the Internet is cold comfort.
There’s more that could be mentioned, but if you’re still reading this post you get the idea—security software is complex and requires deeper consideration than what has been provided in the Consumer Reports review. As someone who has always considered CR a solid source of information, I find this disappointing; as the product manager for a security product included in this review, I find it downright frustrating. Chime in and let me know what you think. And CR, if you’re reading this, while I won’t be consulting your magazine to buy my next vacuum cleaner, I’m still happy to discuss how to improve next year’s security review with you. The door is open.
Update 8/18: Larry Seltzer published an interesting column on the Consumer Reports security issue entitled: Security Software Reviews Done Wrong. Check it out if you're interested in another person's take on their "State of the Net" and security suite review.
Update 8/19: The Tech Herald’s Steve Ragan posted a piece today entitled: Consumer Reports – Reviews Gone Wild. Another in-depth and interesting perspective on the Consumer Reports’ security review issue.