Beyond the Headlines: Don’t be fooled by misleading security tests

by on 04-15-2013 10:35 AM - last edited on 04-23-2013 10:18 AM by Administrator

By Alejandro Borgia, senior director, product management, Symantec Corporation

 

For customers looking for the best security technology for their needs, it’s critical to have access to reliable and meaningful information so they can make smart decisions to stay protected in today’s complex threat landscape.  As an industry, we face a challenge in that testing security products is a highly technical and time-consuming process, and there are shortcomings in the approach of some tests that result in data that is misleading at best. 

 

Last week, PC Magazine published an article titled “Microsoft Outperforms Symantec in Antivirus Test,” which shared the results of a recent on-demand file-scanning test performed by AV-Comparatives. Although the article explains a shortcoming of the test related to the treatment of false positives, a much bigger issue with the test, and others like it, is that the cited detection rates are misleading and not representative of real-world product efficacy. These types of file scanning tests are run in artificial environments that cripple all modern protection features. The latest Norton security products (and some other security products) employ multiple, complementary protection technologies in order to block threats. Classic fingerprint-based protection, network intrusion prevention, behavior-based protection, and Insight reputation-based security are four distinct and highly complex security layers in Norton products which all work together to detect and block malicious attacks before they can reach an end user.
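As a rough illustration of how complementary layers combine, here is a minimal Python sketch of a layered verdict pipeline. The layer names mirror the ones above, but every data structure, threshold, and hash in it is hypothetical, invented for the example; this is not Symantec's actual implementation.

```python
# Hypothetical sketch: a sample is allowed only if every layer clears it.
# All hashes, thresholds, and behavior names below are illustrative.
from dataclasses import dataclass, field

@dataclass
class Sample:
    sha256: str
    reputation: float               # 0.0 (unknown/bad) .. 1.0 (widely trusted)
    behaviors: set = field(default_factory=set)

KNOWN_BAD_HASHES = {"deadbeef"}                      # classic fingerprint layer
SUSPICIOUS_BEHAVIORS = {"keylogging", "self_replication"}

def fingerprint_layer(s: Sample) -> bool:
    # Exact-match blocklist: only catches files seen before.
    return s.sha256 not in KNOWN_BAD_HASHES

def reputation_layer(s: Sample) -> bool:
    # Low-prevalence, low-trust files are blocked (Insight-style idea).
    return s.reputation >= 0.3

def behavior_layer(s: Sample) -> bool:
    # Block anything observed doing something on the suspicious list.
    return not (s.behaviors & SUSPICIOUS_BEHAVIORS)

LAYERS = [fingerprint_layer, reputation_layer, behavior_layer]

def is_allowed(s: Sample) -> bool:
    # A single failing layer is enough to block the sample.
    return all(layer(s) for layer in LAYERS)
```

Disabling every layer but the fingerprint check, which is effectively what an on-demand file scan measures, lets a never-seen-before sample through even though the reputation and behavior layers would have blocked it.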

 

Think about this analogy – these layers of protection are like the multiple layers of protection found in modern automobiles. Today’s cars don’t just have lap seat belts—they have dozens of features and systems that work together to maximize passenger safety, including shoulder harnesses, head restraints, multiple airbags, crumple zones and more. Auto safety tests attempt to simulate real-life crashes as closely as possible, and measure the impact on crash test dummies. In contrast, imagine a test that evaluated a car’s safety by first disabling every safety feature except for the car’s lap seatbelts — the tester removed shoulder harnesses, disabled all the airbags, etc.  Such a test may conclude that the car was extremely dangerous to drive. Yet such a conclusion would be entirely flawed since it evaluated only a single safety feature, explicitly ignoring all of the protection afforded by the car’s overall safety system.

 

Similarly, testing security products with just a file scan, as seen in the recent AV-Comparatives report and similar types of tests, is misleading. Such a testing approach does not accurately represent the real-world threat conditions seen today – and also does not accurately represent the level of protection provided by a security product. Putting forward conclusions from such flawed testing creates confusion and does a disservice to consumers.

 

Symantec has long been an advocate of independent “whole product” or “real-world” tests that most closely represent the threat environment and utilize all of the proactive technologies provided with a product.  Symantec consistently ranks at the top of real-world tests performed by independent testers. The Norton products have received recognition for industry-leading protection in real-world, unsponsored tests conducted by independent testing labs such as AV-TEST and Dennis Technology Labs.  These labs evaluate each product in realistic infection scenarios that typical users might experience.

 

We look forward to the day when all published tests are real-world tests. In the meantime, readers need to beware of artificial tests that show misleading product comparisons.

 

[edit: changed links to point to the company websites]

Comments
by Norton Fighter on 04-15-2013 03:32 PM

So AV-Test is not the same as AV-Comparatives ?

 

Confusing ....

by on 04-17-2013 11:18 AM

Yes, they are two separate testing organizations.

by 04-18-2013 02:42 AM - edited 04-18-2013 02:47 AM

I know that file-detection tests aren't good tests for now, but why is Symantec still using detection algorithms that rely on hash signatures as the main file-detection feature? Why doesn't Symantec take advantage of different types of signatures, which can detect more than one file per signature? I just can't understand that.
Why are generics so rare (I don't mean Trojan.Gen/Trojan.Gen.2 signatures; they use the same hash algorithms)?

by on 04-21-2013 03:39 AM

I agree with Nerimash, and want to know the answer to his question.

by on 04-23-2013 11:08 AM

Symantec uses a combination of detection technologies that includes generic signatures and heuristics, and these do detect many files per signature. Within file-based protection, we go beyond simple pattern matching and apply generic and heuristic techniques when looking for threats. But in addition to file-based detection, we have several other layers of protection that work together to combat threats. You can find out more about our layered approach here.
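To illustrate the distinction being asked about, here is a hedged Python sketch contrasting an exact hash signature (one signature matches exactly one file) with a generic byte-pattern signature (one signature covers a whole family of variants). The byte strings and patterns are invented for the example and do not reflect any vendor's real signature format.

```python
# Illustrative only: real AV signatures are far more sophisticated than
# plain substring matches, but the one-vs-many trade-off is the same.
import hashlib

def hash_signature_match(data: bytes, bad_hashes: set) -> bool:
    # Exact fingerprint: any single byte change in the file evades it.
    return hashlib.sha256(data).hexdigest() in bad_hashes

def generic_signature_match(data: bytes, patterns: list) -> bool:
    # One byte pattern can match every variant in a malware family.
    return any(p in data for p in patterns)
```

A trivially modified variant evades the hash signature but is still caught by the generic pattern, which is why generic and heuristic detections matter even within the file-scanning layer.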

by BorisVasilev22 on 04-25-2013 10:57 AM

In av-test.org Symantec scores 5.5 of 6, while MSE scores 2.5. If you personally test Norton, you'll find for yourself that it protects you like no other solution. AV-Comparatives' file-detection test is not realistic, because we no longer rely only on file detection; we now have behavioural blocking, cloud-based detection, Intrusion Prevention, Norton Safe Web, etc...

by generalp2 on 05-06-2013 07:42 AM

I can't speak to this test of Norton, but the latest issue of Consumer Reports also tested Norton against other antivirus software, and it didn't think highly of either Microsoft or Norton. I was very disappointed to see Norton rank 12th out of 14 programs tested. It did score a 50 compared to Microsoft's 43, but far below G Data, ESET and others in the high 60s.

 

I have been a fan of Norton since its beginning, but I think Norton is getting lax (a symptom that I attribute to the influence of the corporate culture of Semantics, more so than the original Norton team). Semantics is not known, from my personal experience, for having software that is thoroughly tested and functionally complete. I am disappointed in the Norton rankings and sincerely hope that someone at Semantics is taking this wake-up call to put more resources into the Norton product -- give up some short-term profits for the long-term returns of having the best product in its class, which it no longer has.

 

by BorisVasilev22 on 05-17-2013 04:18 AM

Man, you don't even know the company's name; how can you presume to speak about them? Norton is one of the best in detection, one of the best in performance, and one of the best in additional extras. Take care.
And next time at least learn the company's name before making deep insights about it ;-)

by iggy64 on 05-26-2013 05:45 AM

Looks like a speeel checker changed "Symantec" to "Semantics"; happened to me a number of times with email. I hate when that happens. :-)

 
by Super Trojan Terminator 06-28-2013 08:30 PM - edited 06-28-2013 08:30 PM

Maybe that's why there is a feature which allows the speeel checker to learn how to sppel new words

 

[this post has not been spell checked by either of the available features]

by on 07-09-2013 05:32 AM

You can't tell me that Symantec uses only thoroughly tested, unbiased, "real world" bench tests in the promotion of its consumer products. The majority of AV suites on the market these days use all kinds of propaganda to promote their products. My opinion comes from someone who works in the support field, not as a paid employee of any AV vendor.