Poor results in latest AV-Comparatives test

I'll think about posting on this in his comments -- although it would be better if someone from here who actually knew what they were talking about did it <g>


 

Hey Hugh

 

I may have come on a bit too strong, but I posted a comment.

 

AV Cleanup

Well ... perhaps ...

The new Dennis Technology Labs test:

 

http://www.dennistechnologylabs.com/reports/s/a-m/2013/

 

This one was an unsponsored test.

> The _new_ Dennis Technology Labs test

 

The test isn't exactly new.  The "Home" AV Protection report is dated April 2.  That's more than a month ago.

 

The next line (after the one you referenced) says that the tests for this report began back in January.

 

> This one was an unsponsored test.

 

For some variable definition of "unsponsored," apparently.

 

From the Dennis Labs website:

"Dennis Technology Labs offers a range of testing services that are focussed on performance benchmarking hardware and software products.  Clients who _commission_ bespoke tests can be assured of accurate, repeatable results delivered reliably with full reports and supporting data."

 

Clients, e.g., Symantec, _pay_ for Dennis Labs services.

 

Perhaps DL was trying to say that for clients who have previously subscribed to their services, DL ran this test without charging them extra.

 

With regard to DL's assertion that "Clients who commission bespoke tests can be assured of _accurate, repeatable results_," from their report on page 20:

 

"This is a statement from Avast!:

'We were not able to reproduce the behavior in our test lab. We will continue to work with Dennis Technology Labs to better understand the cause of the(ir) problem.' "


joen wrote:

> The _new_ Dennis Technology Labs test

 

The test isn't exactly new.  The "Home" AV Protection report is dated April 2.  That's more than a month ago.

 

The next line (after the one you referenced) says that the tests for this report began back in January.

 

> This one was an unsponsored test.

 

For some variable definition of "unsponsored," apparently.


In what way do these points not apply as well to AV-Comparatives and its test that is under discussion?


joen wrote:

The test isn't exactly new.  The "Home" AV Protection report is dated April 2.  That's more than a month ago.

 



It was published yesterday... Results from an AV test aren't published the day the testing is finished. It's the same with AV-Test and AV-Comparatives. The results have to be analyzed, compiled, and made presentable.

 


huwyngr wrote:

Turbo wrote

Hey Hugh

My comments were not aimed at you, I was only trying to comment on something Rubenking said that I do not agree with.

Speaking of Rubenking, here is something he recently wrote that might be relevant to this thread:  http://www.pcmag.com/article2/0,2817,2417806,00.asp


Thanks for the link -- definitely one to be known here.

 

I note Rubenking repeats his cleanup statement:

 

<< An on-demand file scanner test on files whose arrival Symantec's antivirus did not observe is not the same as when the user actually downloads files. That's true, but it is the same as when a user installs antivirus to clean up an existing malware problem.  >>

 

and 

 

<< If you purchase antivirus software for a system that never had protection, you'll expect it to clean up any and all malware ...   >>

 

I'll think about posting on this in his comments -- although it would be better if someone from here who actually knew what they were talking about did it <g>

 

As for whether the test method is valid or not .... it may be a valid test but the conclusions drawn from it may not be!


Rubenking isn't repeating anything; the link above references the same article that I provided earlier.

 

Please keep in mind that you are actually quoting out of context:

 

<< Consider this. If you purchase antivirus software for a system that never had protection, you'll expect it to clean up any and all malware, without griping that it wasn't given a chance to use its network intrusion prevention.  >>

 

The intent of the sentence above isn't the example; it’s the phrase “without griping”. If you read that sentence again within the context of the surrounding paragraphs, then this should immediately become apparent.

 

From the link provided earlier:

 

Multiple Tests Have Value

Symantec's blog post concludes, "We look forward to the day when all published tests are real-world tests. In the meantime, readers need to beware of artificial tests that show misleading product comparisons." I, too, would be thrilled to see more tests that match a user's real-world experience, but I don't think we can discard file-detection tests.

 

Consider this. If you purchase antivirus software for a system that never had protection, you'll expect it to clean up any and all malware, without griping that it wasn't given a chance to use its network intrusion prevention. In a case like that you'll probably look for high scores in a test like the AV-Comparatives on-demand test, a test that fairly closely matches your situation.

 

For ongoing protection, yes, you'll want a product that earns top scores in real world tests also. So choose a product that scores high in both areas, and in tests from multiple labs. That way you'll get protection that can take care of any problems existing at installation and also fend off future malware attacks.

 

Symantec, themselves, actually consider Rubenking’s example above to be real world. From the Symantec STAR Malware Protection Technologies page:

 

Remediation

 

Although our goal is never to allow a threat to reach a machine, in the real world there are still situations where a user's system can get infected. Such circumstances likely include:

 

  • Users who previously had no installed security product.
  • Users whose product subscription expired.
  • Users attacked by a new zero-day threat.

Symantec remediation technologies address these situations by providing capabilities to clean up already infected machines. The core set of these technologies is built into all our malware security products.

 

More recently we made available a set of standalone tools to assist with remediating more aggressive infections. These tools include Norton Power Eraser and Symantec Power Eraser (included in the Symantec Endpoint Protection Support Tool). Features of these remediation tools include: [...]

 

This Remediation capability is also highlighted in Symantec’s marketing of the Norton Version 20 products. For example:

 

N360 V20 Product Advertising.png

 

Note the statements highlighted in blue above:

 

  • Powerful threat-removal layer targets and eliminates even the hardest to remove infections.
  • Deep clean your PC – our powerful Threat-Removal Layer targets and scrubs out aggressive, hard-to-remove infections that less sophisticated products often miss.
  • NEW Threat-removal Layer – Targets and eliminates hard-to-remove threats less sophisticated products often miss.

With the product capabilities described above, I’d expect Norton’s Version 20 products to have the same or better results than the number one product in this AV-Comparatives test. The number one product missed just 137 malware samples in this test, whereas Norton missed 12,022 malware samples.
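To put those two miss counts side by side, here is a small sketch. The 137 and 12,022 figures come from the post above; the total sample-set size is not quoted in this thread, so `TOTAL_SAMPLES` below is a purely hypothetical placeholder used only to illustrate how a detection rate would be derived from a miss count.

```python
# Miss counts quoted from the AV-Comparatives File Detection test discussion above.
top_missed = 137
norton_missed = 12_022

# Relative comparison: how many times more samples Norton missed.
ratio = norton_missed / top_missed
print(f"Norton missed roughly {ratio:.0f}x as many samples as the top product")

# A detection rate also needs the total corpus size, which this thread
# does not quote -- the value below is an assumption for illustration only.
TOTAL_SAMPLES = 100_000  # hypothetical corpus size, NOT from the report
for name, missed in [("top product", top_missed), ("Norton", norton_missed)]:
    rate = 100 * (1 - missed / TOTAL_SAMPLES)
    print(f"{name}: {rate:.2f}% detection (assuming {TOTAL_SAMPLES:,} samples)")
```

The ratio works out to roughly 88x, which is why the gap looks so stark regardless of what the real corpus size was.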

 

That’s why I’m looking forward to some answers from Symantec to the questions that I posed earlier.

 

  

elsewhere,

 

I said I was not going to argue -- I've stated my viewpoint on one specific issue and although his premise is common it is in fact not necessarily correct.

soj:

> In what way do these points not apply as well to AV-Comparatives and its test that is under discussion?

 

You missed the point.  I was only responding to bombastus' post.

 

You might discuss his choice of words ("new, unsponsored") with him.

on 5/7 bombastus wrote:

> It was published yesterday

 

It doesn't matter when it was published.

It matters when the research was done.

 

Headline in your local newspaper _tomorrow_:

"Ravens beat 49ers 34-31 in Superbowl XLVII"

(held on Feb 3, 2013)

I'll take the 49ers and 4 points, lol.

 

Research that started back in January has limited validity a third of a year later.

 

How many of the products that they started testing back then have been updated/modified by May 3?

All of them, I hope.

 

 

On their home page:

"May 3: reports on home, small business, and enterprise A-M software"

 

In their report, top of page one:

"January - March 2013"

 

In their report, bottom of page one:

"[signed by their representative], April 2"

 

In their report, page 19:

"The test rounds were conducted between January 24 2013 and March 28 2013"

 

So they have mentioned January, March, April, and May.


joen wrote:

soj:

> In what way do these points not apply as well to AV-Comparatives and its test that is under discussion?

 

You missed the point.  I was only responding to bombastus' post.

 

You might discuss his choice of words ("new, unsponsored") with him.


You seemed to be raising questions about Dennis Technology Labs and its test over issues that are not fundamentally different from the AV-Comparatives test regarding dates and funding.  The Dennis Technology test was unsponsored in the same way as the AV-Comparatives test (both organizations receive funding from AV product vendors, which is not the same thing as individual test sponsorship), and both covered the same time period, so I guess I do miss your point.

I notice MSE has lost its AV-TEST cert. I wonder if they did the work to get it back.

 

Quads


Quads wrote:

 

I notice MSE has lost its AV-TEST cert. I wonder if they did the work to get it back.

 

Quads


It appears that Microsoft is now considered the baseline in these tests and that their security software will now no longer receive any official certifications.

 

The next AV-Comparatives File Detection test is due in August/September 2013 and we still haven’t received any direct answers in this thread from Symantec or an AV-Comparatives representative with regards to the questions that we raised earlier here. So, what's going to happen if the Norton products perform worse than Microsoft again in the August/September 2013 AV-Comparatives File Detection test?

 

Symantec: please follow this up with AV-Comparatives as a matter of urgency and update this thread accordingly. Ideally, we would like to see some direct replies/posts from AV-Comparatives in this thread in response to this request.

 

Our next question for AV-Comparatives, given the number of False Positives detected in the last test, is:

 

Given that Symantec's participation in your tests is involuntary, are you testing the Norton product with 'out-of-the-box' settings or are you testing the Norton product with heuristics set to Aggressive (assuming that if Symantec did choose to participate in this test, that they would have chosen this setting based on the one that they chose way back in the August 2011 File Detection test)?

 

Please advise.

 

Thanks.

Very poor indeed...

 

http://www.av-comparatives.org/images/docs/avc_fdt_201303_en.pdf

I would like an official explanation from Symantec for these horrible results ASAP; otherwise I will have to uninstall my NAV and never renew again.

I have used Norton products since the DOS days with good results, until now. And it is not the 2013 version but the 2012 version of NIS. About three weeks ago I had a rootkit virus that NIS couldn't remove. Tried NPE, but it said it couldn't remove it either. Left that system offline for about a week or so, then decided to go back to Norton Support again, which wasn't helpful the first time around. Connected, and Norton LiveUpdate downloaded; then I ran a scan and it said it removed the virus. I haven't been able to find out if that specific virus was actually targeted in that update, since it was one that had supposedly been around a while.

 

So now I am not as assured that my system is protected as I used to be. I'm just concerned that the virus got through in the first place, and I would like to know what was in the update that came down May 30th.


elsewhere wrote:

Quads wrote:

 

I notice MSE has lost its AV-TEST cert. I wonder if they did the work to get it back.

 

Quads


It appears that Microsoft is now considered the baseline in these tests and that their security software will now no longer receive any official certifications.


AV-TEST certifies that products meet its standards in various performance categories, and does not run specific product comparisons.  I am not sure what you mean about Microsoft being a baseline.  MSE either qualifies for certification, or it doesn't.

 

soj

> AV-TEST certifies that products meet its standards in various performance categories,

> and _does not run specific product comparisons_

 

Wrong.

 

They also have test comparisons of Windows AV products.

av-test.org:

"AV-TEST: Products Tested on Windows XP"

 

Following that link also gets you to Vista, Win 7, and Win 8 test comparisons.

 

They even have comparisons of Android AV products.

 

> I am not sure what you [elsewhere] mean about Microsoft being a baseline.

 

His words were "It appears that Microsoft is now _considered_ the baseline."

 

A baseline is a _basic standard_. 

It's something that everyone thinks of when some subject is discussed.

 

MS Security Essentials is virtually ubiquitous.

Almost all users know about it.

It's a "basic standard."

Don’t expect miracles

There’s little doubt that security software can and does block a lot of the bad stuff that comes in from the Internet, but the big question is, how much of it will it catch? The truth that many vendors don’t want to admit is that it’s a race. It’s always been a race, between the security vendors and the malware authors, but while many would try to claim it’s almost won, the fact is the opposite. The reason why antivirus software is often on the losing end of this race is that malware is no longer the domain of sophisticated hackers.


joen wrote:

soj

> AV-TEST certifies that products meet its standards in various performance categories,

> and _does not run specific product comparisons_

 

Wrong.


The testing for certification determines if a product meets the standards that AV-TEST sets - it is not a comparative test, although products can obviously be ranked, based on how well they scored on the various test components.  Any product that meets the standards will be certified.  If MSE lost its certification, it is because it failed to meet those standards, not because other products outperformed it in direct comparisons.  The AV-TEST website states "Home-user products must achieve at least 10 of the 18 points available and at least 1 point in each category in order to earn an "AV-TEST CERTIFIED" seal of approval." - that is the baseline, not MSE.
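The certification rule quoted above (at least 10 of 18 points overall, and at least 1 point in each category) can be sketched as a simple check. This is only an illustration of the quoted threshold; the three category names and the 0–6 scale per category are assumptions based on AV-TEST's usual Protection/Repair/Usability breakdown, not something stated in this thread.

```python
def avtest_certified(scores):
    """Return True if a home-user product would earn the AV-TEST seal.

    Implements only the rule quoted above: at least 10 of the 18
    available points overall, and at least 1 point in every category.
    Category names and the 0-6 per-category scale are assumptions.
    """
    return sum(scores.values()) >= 10 and all(s >= 1 for s in scores.values())

# The threshold -- not any rival product -- is the pass/fail line, so two
# products with different totals can both be certified:
print(avtest_certified({"protection": 5.5, "repair": 4.0, "usability": 5.0}))  # True
print(avtest_certified({"protection": 6.0, "repair": 0.5, "usability": 6.0}))  # False (under 1 point in a category)
```

Note how the second product fails despite a higher total than some certified products would have, because one category falls below the 1-point floor; that is the sense in which the standard itself, rather than MSE, is the baseline.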