9th-round antispyware comparison report has been published!


Guest

Ninth Round (August 7, 2006) Antispyware Comparison Report:

Cleanup Success Rate (Entry-Based Viewpoint):

‧Trend Micro Anti-Spyware: 74.02%
‧Webroot Spy Sweeper: 58.27%
‧McAfee AntiSpyware: 56.69%
‧Sunbelt CounterSpy: 53.54%
‧PC Tools Spyware Doctor: 50.39%
‧Lavasoft Ad-Aware: 48.82%
‧F-Secure Internet Security: 46.47%
‧ZoneAlarm Anti-spyware: 43.31%
‧Panda Platinum Internet Security: 40.16%
‧Norton Internet Security: 38.58%
‧Microsoft Windows Defender: 38.58%
‧ewido anti-malware: 37.80%
‧Agnitum Outpost Firewall Pro: 35.43%
‧SUPERAntispyware: 35.43%
‧Kaspersky Internet Security: 25.98%
‧Aluria Anti-Spyware: 25.20%
‧Computer Associates Anti-Spyware: 22.83%
‧NOD32: 17.32%
‧Spybot S&D: 11.81%

For detailed information, please see
http://www.malware-test.com/test_reports.html.

Note that because the cleanup success rate depends on the spyware samples used, we
hope you can help collect a top-100 spyware list and post it to the forum
(http://malware-test.com/smf/index.php?topic=1495.0). Thanks.
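
In case it helps readers interpret the scores: the "entry-based" figure is presumably
just the share of tracked spyware entries (files, registry keys, and similar traces)
that a scanner removes, expressed as a percentage. A rough sketch of that arithmetic
in Python; the entry counts below are only an illustration, not the report's actual totals.

# Minimal sketch of an entry-based cleanup score, assuming the metric is
# simply "entries removed / entries tracked". The counts are made up for
# illustration and are not taken from the malware-test.com report.
def cleanup_success_rate(entries_removed, entries_total):
    """Return the percentage of tracked spyware entries a scanner removed."""
    if entries_total <= 0:
        raise ValueError("entries_total must be positive")
    return 100.0 * entries_removed / entries_total

# Example: a scanner that cleans 94 of 127 tracked entries scores 74.02%.
print("%.2f%%" % cleanup_success_rate(94, 127))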
 

Guest

**********
The data is interesting; however, I do not fully understand the basis for the
comparison. ESET's NOD32 is primarily a tool for defeating virus infections (both
documented and in the wild). I don't think NOD32 should be included, but that
is up to the folks who are doing the testing.

**********
 

Anonymous Bob

Samplas said:
Ninth Round (August 7, 2006) Antispyware Comparison Report:
For detailed information, please see
http://www.malware-test.com/test_reports.html.

Note that because the cleanup success rate depends on the spyware samples used, we
hope you can help collect a top-100 spyware list and post it to the forum
(http://malware-test.com/smf/index.php?topic=1495.0). Thanks.

Eric L. Howes has posted a good critique of their tests here:

http://www.dslreports.com/forum/remark,16527721?hilite=malware-test

Bob Vanderveen
 

Guest

I am always skeptical of most software testing and reviews. At best, most
are VERY subjective opinions, confusing the casual reader.

As Eric Howes pointed out in the cited post, these tests are superior to most for
the simple reason that the methodology and test environment are clearly
disclosed. That is something MOST technical publications neglect to do
(often because it would reveal the author's outright bias or utter
incompetence).

That said, I take issue with the comparison report for most of the same flaws
that Eric identifies.
 

Guest

Though I agree in general with the discussion thus far, I have a concern
unrelated to the test results themselves. Defender is designed primarily to
intercept spyware rather than to scan for and remove it, and tests such as these
ignore that role completely. As in this case, the testing is usually performed on
an exact duplicate of an already-infected OS, which assumes the malware has
already been successfully installed.

Though in theory you could claim the results will be 'the same' for a
scan/clean as for a real-time detection/block, this ignores the potential
interaction of the user. Purists will claim that if the item isn't detected
as 'bad' automatically it wouldn't be blocked anyway, but for me that test
result is worthless, since I feel I would generally know when to direct
Defender to 'block' a rogue installation. And since some malware can either
'hide' or protect itself from removal once installed, the only effective
method may be to block the initial installation.

Unfortunately this is also much more difficult to test, since there may be a
time-critical component, and the question of testing while connected to the
Internet rather than offline comes into play, adding even more variables. For
this reason I understand why no one attempts such testing, but for my uses it
removes any real value from the results.

Unlike static file scanning, which is the nature of a basic antivirus
scanner, spyware detection/removal is generally a much more dynamic
situation, especially when an Internet connection exists. Until someone can
create a consistently repeatable testing environment that simulates the
dynamic nature of the Internet, I don't feel any antispyware test result can
be considered valid, no matter how well intentioned.

Bitman
 

Guest

Exactly! Comparing proactive and reactive detection methodologies is a bit
like comparing apples to rutabagas.

There are far too many variables and unknowns to make a clear "comparison" of
anti-threat tools, especially when you must consider the greatest variable of
all ... the Internet.

It is safe to say that I could develop a benchmark that could make virtually
*ANY* anti-threat tool appear superior to all others on the market. One
simply needs to assess the strengths and weaknesses of each product, and
devise a test that pits the benefits of one against the failings of other
products. Conversely, one might just as easily reverse the process to defame
a particular product.

As always, there will be lies, damned lies, and benchmarks. ;-)
 
