Poor results in latest AV-Comparatives test

Gary, I'll tell you how I know I have no infections. I do backup scans with other utilities.....


SendOfJive wrote:

PhoneMan wrote:
There is a note (note 2) on page 4 of that report stating that Symantec did not apply to be tested but was included because it was highly demanded by readers and the press.

It is interesting that AV-Comparatives decided to include Norton in a test from which Symantec had withdrawn over concerns that the testing methodology produced misleading results, but did not choose to include Norton in the Real World Test in which the product had always scored exceedingly well.  Are the public and press not also interested in how well Norton actually prevents malware infections in the first place?


Are there any other vendors that have withdrawn from this particular AV-Comparatives File Detection test because they support Symantec's assertion that the testing methodology produces misleading results? At face value, all of the other major vendors' antivirus products appear to have been successfully tested without a testing integrity challenge being raised by any of the other vendors in that report. It appears that Norton has been deliberately included in this particular test to answer the question that's weighing on the minds of millions of Norton users: "If my Norton 2013 product was included in this test, how would it fare?".

 

The results are in and they're not pretty. If Symantec wants to prove how well Norton 2013 would perform in an AV-Comparatives Real World Test, then they need to pay for it, just like all the other vendors do:

 

http://www.av-comparatives.org/about-us/funding-en

 

 


elsewhere wrote:
Are there any other vendors that have withdrawn from this particular AV-Comparatives File Detection test because they support Symantec's assertion that the testing methodology produces misleading results?

Yes.  As security companies develop new technologies to prevent systems from becoming infected in the first place, the relevance of file detection tests to spot malware on already-compromised machines is being called into question from many quarters.  Webroot recently withdrew from AV-Comparatives testing for reasons similar to Symantec's - the current testing methodology does not reflect the effectiveness of these newer approaches to malware prevention.

 

http://community.webroot.com/t5/Security-Industry-News/Joint-message-from-AV-Comparatives-and-Webroot/td-p/17708

 

There's a new blog up on testing

 

http://community.norton.com/t5/Norton-Protection-Blog/Beyond-the-Headlines-Don-t-be-fooled-by-misleading-security/bc-p/944005#M708 

Which is worse, a bad showing by AV-Comparatives, or the 100% across-the-board score by the paid-for Dennis Labs? Me, I prefer to take multiple tests from various sites. I don't believe the 100% score of everything by Dennis Labs, but nor do I think Norton is as bad as the AV-Comparatives test suggests. I still say it's in the top 5.

But Norton hasn't scored 100% across the board. They have even been beaten in protection at DTL.

Going by the article that Norton released, these results from AV-C do seem misleading. I have played around with Norton and it does well. AV-Test likes Norton, and in the YouTube videos it usually blocks most to all of the links the testers throw at it, usually achieving detection rates above 85%. I agree with Myrdhinn; I would not base opinions on one site. I would check out multiple test sites and watch a couple of YouTube videos first, then download the trial to see if I like how the program behaves.


huwyngr wrote:

There's a new blog up on testing

 

http://community.norton.com/t5/Norton-Protection-Blog/Beyond-the-Headlines-Don-t-be-fooled-by-misleading-security/bc-p/944005#M708 


PC Magazine has responded to the blog post above:

 

http://securitywatch.pcmag.com/security-software/310393-symantec-declares-on-demand-antivirus-tests-misleading

 

More on that later...

 

When Microsoft found itself in a similar position after the AV-Test December 2012 round of tests, they published the following article:

 

Key lessons learned from the latest test results

 

Microsoft raised some interesting points with regards to the malware samples tested and the number of their customers who actually encountered those malware samples.

 

Why is this relevant? As incredible as this sounds, Norton version 20 (2013) products don’t actually install a complete set of virus definitions by default. The products install a cut-down definition set known as Smart Definitions that only “contain the most important virus definitions that are required for latest security threats as viewed by Symantec”. If Symantec’s view of the ‘latest security threats’ doesn't align with AV-Comparatives’ ‘recent/prevalent’ sample set, then, as Microsoft noted, there are bound to be detection discrepancies.

 

Given the above, and expanding on the car analogy used earlier, it should be noted that, by default, Symantec’s Norton car doesn’t come with any seatbelts installed for passengers seated in the rear seats of this car (Smart Definitions). This may prove to be a unique safety exclusion when comparing this car with other cars reviewed in this AV Comparatives Car Seatbelt safety test. The Norton car owner’s manual indicates that Symantec feels that the added weight of having rear passenger seatbelts installed in the car by default may have an unnecessary impact on the performance/fuel economy of the Norton car. For those customers who’d prefer to have the added safety reassurance that having rear seat passenger seatbelts installed provides, these customers simply need to visit their nearest Symantec/Norton LiveUpdate Service Centre and have these additional rear seatbelts installed, at no extra cost, to the Norton car owner.

 

It should be noted that if the three crash test dummies that were originally seated in the rear seats of this Norton car during this crash test could talk, then they would most likely express their extreme dissatisfaction with Symantec’s decision to exclude installing rear seatbelts in the Norton car by default...

 

That said, some questions for Symantec:

 

  • Did AV Comparatives test the Norton AntiVirus (NAV) v20 product using Smart Definitions or did they test NAV v20 with the ‘Complete Set’ enabled?

  • If the test was actually performed using Smart Definitions, then have you requested a re-test of the original sample malware set using the NAV v20’s Complete definition set and, if so, could you please advise us of the outcome?

Thanks

 

(Edit: fixed broken Norton Help file link)

Thanks elsewhere, your post was an eye-opener to me.

 

Users have the ability to choose Smart Definitions (which protects against the most prevalent and current threats), or the full set which is much more comprehensive.  A few months ago, Twixt posted an excellent piece on the topic:

 

 

 

http://community.norton.com/t5/Norton-Internet-Security-Norton/smart-definitions-on-or-off/m-p/804554#M217004

 

Regards,

Kelly

I couldn't attempt to comment on any of this except to say that the PCMag article is by Neil Rubenking, for whom I have the greatest respect.

 

I've known his work for many years, from the early days of CompuServe, but I would query the correctness, technically speaking, of his remark near the end:

 

<< Consider this. If you purchase antivirus software for a system that never had protection, you'll expect it to clean up any and all malware, without griping that it wasn't given a chance to use its network intrusion prevention.  >>

 

We know and see here frequently that there can be better tools to clean up than those designed to protect an active system.

Norton gives its users better protection, keeping your data safe from virus and malware attacks. They also support their users through the Norton help service.

 



Bombastus wrote:

Norton has more layers of protection than most competing products and doesn't rely solely on signature detection for protection.


But Symantec has decided not to participate in AV-C's whole-product tests, which are eclectic.  Additionally, I don't agree with your statement, since "most" competing products do not rely solely on signature detection.  Bottom line: Norton refuses to be compared with competing products, and Norton bombed the one test AV-C was able to perform.  "Most competing products" don't seem to have this issue.

 

Another issue is that IT professionals work for someone else (bosses or clients), and sometimes it's necessary to have supporting facts to CYA.  If I recommend Norton to a client and they still get infected, chances are they will blame Norton, and me for recommending/installing Norton, and I won't have anything tangible and objective to justify my decision.  If I can point to AV-C's site and show that Norton was consistently one of the top three products for the most recent year, I can quell the emotional flare-up/scapegoating and assure the client that no security software is 100% (still pointing at AV-C's tests).

 

Hi everyone

 


elsewhere wrote:

That said, some questions for Symantec:

 

  • Did AV Comparatives test the Norton AntiVirus (NAV) v20 product using Smart Definitions or did they test NAV v20 with the ‘Complete Set’ enabled?

  • If the test was actually performed using Smart Definitions, then have you requested a re-test of the original sample malware set using the NAV v20’s Complete definition set and, if so, could you please advise us of the outcome?

Thanks

 


I'm sure there are a lot of Norton users who are waiting for Symantec to answer those questions!


Just my opinion: in any case, do not turn off Smart Definitions; the level of security will be higher.

 


I conducted two tests:

1) NIS 20.3.1.22 with Smart Definitions

2) NIS 20.3.1.22 with Full Definitions


The test archive contains 2559 samples (all sorted by Kaspersky Antivirus).

 

20-04-2013 1-48-27.png

 

 

 

Test 1: 57 samples not detected

 

20-04-2013 2-47-31.png

 


Test 2: 79 samples not detected

 

20-04-2013 1-33-23.png
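For context, here is a quick sketch of the detection rates these numbers imply, assuming the 2559-sample total stated above (the function name is my own, just for illustration):

```python
# Detection rates implied by the reported numbers (2559 samples total).
TOTAL_SAMPLES = 2559

def detection_rate(total: int, missed: int) -> float:
    """Return the percentage of samples detected."""
    return (total - missed) / total * 100

smart = detection_rate(TOTAL_SAMPLES, 57)  # Test 1: Smart Definitions
full = detection_rate(TOTAL_SAMPLES, 79)   # Test 2: Full Definitions

print(f"Smart Definitions: {smart:.2f}% detected")
print(f"Full Definitions:  {full:.2f}% detected")
```

So, going by these two informal runs alone, the cut-down Smart Definitions set actually missed fewer samples than the full set did.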


huwyngr wrote:

 

I couldn't attempt to comment on any of this except to say that the PCMag article is by Neil Rubenking, for whom I have the greatest respect.

 

I've known his work for many years, from the early days of CompuServe, but I would query the correctness, technically speaking, of his remark near the end:

 

<< Consider this. If you purchase antivirus software for a system that never had protection, you'll expect it to clean up any and all malware, without griping that it wasn't given a chance to use its network intrusion prevention.  >>

 

We know and see here frequently that there can be better tools to clean up than those designed to protect an active system.


Neil Rubenking’s comment above is correct. From what he is saying, it appears that Symantec’s interpretation of the term ‘real-world’ is conditional, as highlighted below.

 

From the link provided earlier:

 

“It's true that AV-Comparatives made sure the test systems had Internet access, thereby giving the Symantec installation access to the powerful cloud-based Norton Insight reputation system. When I asked my Symantec contacts about this, they explained that for full power Norton Insight relies on full information, "how the file was obtained, when it was obtained, or from where it was obtained (e.g. URL and IP address)." An on-demand file scanner test on files whose arrival Symantec's antivirus did not observe is not the same as when the user actually downloads files. That's true, but it is the same as when a user installs antivirus to clean up an existing malware problem.

 

The network intrusion prevention components also got no chance to help out, since the file samples were downloaded before installation of antivirus software. Once again, you'd be in a similar situation when installing antivirus for the first time on an infested system.”

 

So, what exactly does the term ‘real-world’ cover? Is there anyone out there in the real world who hasn't encountered at least one of the following scenarios?

 

These are just some of the real world scenarios where Norton Insight won’t have full information about the files. The onus will then be placed on Norton’s File Detection ability or SONAR to provide protection for the end user. This is why the File Detection tests are still important.

 

Not being content with the AV-Comparatives results, forum member PRIOR and I decided to test NIS 2013’s File Detection ability for ourselves.

 

Our first question to Symantec, AV-Comparatives, or the Norton Community is for someone to clarify the term ‘Missed Samples’ used in the graph on page 6 of the AV-Comparatives ‘File Detection Test March 2013’ report. If ‘Missed Samples’ represents the number of files remaining after all the sample files have been scanned (expressed as a percentage), then this calculation can’t be applied to Norton’s scan results. For example, if the malware sample folder contains 100 files and, after scanning, 9 files remain in the folder, then the calculated ‘Missed Samples’ value would be 9%. Again, if this is the way the AV-Comparatives results are calculated, then it can’t be applied to NIS 2013’s scan results, as we will show below.
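If ‘Missed Samples’ really is just the fraction of files still on disk after a scan, the metric would amount to something like the following minimal sketch (the function names and the folder-counting helper are my own, for illustration only):

```python
from pathlib import Path

def missed_samples_pct(remaining: int, total: int) -> float:
    """Naive 'Missed Samples' metric: files still present after the
    scan, as a percentage of the original sample set. Note that this
    says nothing about whether the remaining files were repaired."""
    return remaining / total * 100

def count_remaining(sample_dir: Path) -> int:
    """Count the files left in a (hypothetical) sample folder post-scan."""
    return sum(1 for p in sample_dir.iterdir() if p.is_file())

# The example from the post: 100 samples, 9 files remain after scanning.
print(missed_samples_pct(9, 100))  # 9.0
```

The weakness of such a metric is exactly the point made here: a file that remains on disk because it was repaired in place would still be counted as "missed".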

 

To illustrate the issue, 14 sample files containing the ‘W32.Sality.AE’ virus were scanned by NIS 2013’s Auto-Protect and Virus Scanner features. At first, the scan results were unsettling as Norton Auto-Protect advised that the 14 files had been ‘Removed’ yet Windows Explorer clearly showed that the files were still present:

 

http://www.youtube.com/watch?v=BtvltVA1DUQ

(Video of the test conducted by PRIOR)

 

What actually happened was that the 14 infected files were ‘Repaired’ as shown when the files were scanned with the Norton Auto-Protect feature disabled:

 

http://www.youtube.com/watch?v=oCbIpyYEiNI

(Video of the test conducted by PRIOR)

 

This then begs the question: did AV-Comparatives do a before-and-after comparison of the hash values of the ‘Missed Samples’ against the hash values of the original sample files, to see whether any of the ‘Missed Samples’ had been modified by NIS 2013? If the ‘Missed Samples’ calculation described above doesn't take into account the number of remaining files that were ‘Repaired’ by NIS 2013, then obviously this calculation is going to be incorrect. NIS 2013 would have completely failed the test on these 14 files: Missed Samples 100%.
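A before-and-after hash comparison of the kind suggested above could be sketched like this (a minimal illustration; the directory layout and the classification labels are my own, not AV-Comparatives'):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def snapshot(sample_dir: Path) -> dict[str, str]:
    """Map each sample file name to its hash, taken before the scan."""
    return {p.name: sha256_of(p) for p in sample_dir.iterdir() if p.is_file()}

def classify_remaining(sample_dir: Path, before: dict[str, str]) -> dict[str, str]:
    """After the scan, label each original sample as 'removed',
    'untouched' (a genuine miss), or 'modified' (possibly repaired
    in place, as with the 14 Sality samples discussed above)."""
    result = {}
    for name, old_hash in before.items():
        path = sample_dir / name
        if not path.exists():
            result[name] = "removed"
        elif sha256_of(path) == old_hash:
            result[name] = "untouched"
        else:
            result[name] = "modified"
    return result
```

Only the files labelled "untouched" would be genuine misses; a "modified" file may have been repaired in place, which a simple count of remaining files cannot distinguish.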

 

At this stage, it would be good if a Symantec Software Engineer could get involved in this thread. The videos above highlight a number of product defects and a repaired file outcome doesn’t necessarily mean that the file in question is actually malware free...

 

Hopefully, we'll get a better response from Symantec this time around than the response that we got to the questions that I asked Symantec earlier in this thread.

 

To be continued...

As I said I don't have the competence to go into the tests or results.

 

<<  Neil Rubenking’s comment above is correct.  >>

 

My query was about a good protector being automatically expected to do a good cleaning up job after the event of an infection getting through.

 

It's horses for courses is all I'm saying especially since so much malware gets in by invitation of the unwitting, inexperienced user.

 

I'm not commenting on the nature of the tests or on the results but only on what I (and others) consider a reasonable expectation.

<< Consider this. If you purchase antivirus software for a system that never had protection, you'll expect it to clean up any and all malware, without griping that it wasn't given a chance to use its network intrusion prevention.  >>


 I do not agree, I would never install an AV on an infected system and I would not expect it to clean up all the malware if I did. I install the AV on a clean system and I rely on the AV and common sense to keep it that way. An old adage comes to mind: An Ounce Of Prevention is Worth a Pound of Cure.

 

 


Turbo wrote:

<< Consider this. If you purchase antivirus software for a system that never had protection, you'll expect it to clean up any and all malware, without griping that it wasn't given a chance to use its network intrusion prevention.  >>


 I do not agree, I would never install an AV on an infected system and I would not expect it to clean up all the malware if I did. I install the AV on a clean system and I rely on the AV and common sense to keep it that way. An old adage comes to mind: An Ounce Of Prevention is Worth a Pound of Cure.

 

 


I agree, although I was trying to jump beyond that: since nothing is 100% perfect at stopping infection, especially when you factor in the human element, other software may do a better job at cleaning up. They really are different functions, and it's normal to use specialized tools for special tasks.

I rely on my roof to keep out the rain but if it does leak then I don't expect it to clean up afterwards -- I get the specialists with powerful vacuum cleaners, air blowers etc! ..... Ghostbusters ....


huwyngr wrote:

I agree, although I was trying to jump beyond that: since nothing is 100% perfect at stopping infection, especially when you factor in the human element, other software may do a better job at cleaning up. They really are different functions, and it's normal to use specialized tools for special tasks.

I rely on my roof to keep out the rain but if it does leak then I don't expect it to clean up afterwards -- I get the specialists with powerful vacuum cleaners, air blowers etc! ..... Ghostbusters ....


Hey Hugh

My comments were not aimed at you, I was only trying to comment on something Rubenking said that I do not agree with.

Speaking of Rubenking, here is something he recently wrote that might be relevant to this thread:  http://www.pcmag.com/article2/0,2817,2417806,00.asp


Turbo wrote:

Hey Hugh

My comments were not aimed at you, I was only trying to comment on something Rubenking said that I do not agree with.

Speaking of Rubenking, here is something he recently wrote that might be relevant to this thread:  http://www.pcmag.com/article2/0,2817,2417806,00.asp


Thanks for the link -- definitely one to be known here.

 

I note Rubenking repeats his cleanup statement:

 

<< An on-demand file scanner test on files whose arrival Symantec's antivirus did not observe is not the same as when the user actually downloads files. That's true, but it is the same as when a user installs antivirus to clean up an existing malware problem.  >>

 

and 

 

<< If you purchase antivirus software for a system that never had protection, you'll expect it to clean up any and all malware ...   >>

 

I'll think about posting on this in his comments -- although it would be better if someone from here who actually knew what they were talking about did it <g>

 

As for whether the test method is valid or not .... it may be a valid test but the conclusions drawn from it may not be!