Testing AV: Why VB Tests are still relevant

The latest Virus Bulletin Anti-Malware product test, the largest ever of its type (a mammoth 60-product test), demonstrates several things: that testing Anti-Virus products never gets any easier; that discussing (or dissing) the tests never gets any less popular; and that the results of testing are never less than controversial.

Virus Bulletin has been in the testing game a very long time: their comparative testing and the VB100 Award have been around since early 1998, and before that VB had been reviewing AV products since its inception in 1989. Their test methodology is well known, and is based on a combination of WildList testing, tests for ‘zoo’ viruses (that is, known malware not on the WildList), and False Positive (FP) testing. The full current methodology can be found here.
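
To make that pass/fail logic concrete, here is a minimal sketch in Python of a VB100-style verdict. The `scan` callable and both sample sets are hypothetical stand-ins for illustration, not anything taken from VB’s actual harness.

```python
def vb100_style_verdict(scan, wildlist_samples, clean_files):
    """Apply a VB100-style pass criterion to a product's scanner.

    scan(path) -> True if the product flags the file as malicious.
    Passing requires detecting every WildList sample AND raising
    zero false positives on the known-clean set.
    """
    missed = [s for s in wildlist_samples if not scan(s)]
    false_positives = [f for f in clean_files if scan(f)]
    return not missed and not false_positives
```

The point is the conjunction: full WildList detection alone is not enough, because the clean set has equal veto power.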

Despite the large number of people decrying this sort of WildList-based testing, and despite some vendors withdrawing entirely from ‘static’ tests (i.e. those based on scanning predetermined files, rather than live incoming threats), the fact that 60 products participated in a test like this shows that there is still life, and worth, in this type of testing.

The surprising thing is that, while many criticize WildList-based tests for being limited in scope (the WildList is certainly not a comprehensive list of malware), so many products fail to pass them. This, perhaps more than anything, highlights their usefulness as a baseline: if a product isn’t reasonably consistent in achieving the VB100 Award, perhaps you should think about a different one. Often the problem is not detection so much as false detection, which makes the FP part of the test very important. Any product could detect 100% of all viruses very easily; it is much more difficult to detect ONLY viruses, and nothing else.
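
To see why the FP half of the test carries so much weight, consider a deliberately degenerate scanner (again a hypothetical sketch): it “detects” everything, so its detection rate is a perfect 100%, and it would still fail on the very first clean file.

```python
def flag_everything(path):
    # A useless "scanner" that calls every file malicious.
    return True

# Hypothetical sample paths, purely for illustration:
wildlist = ["wl_sample_1.exe", "wl_sample_2.exe"]
clean = ["notepad.exe", "report.doc"]

detection_rate = sum(flag_everything(s) for s in wildlist) / len(wildlist)
fp_rate = sum(flag_everything(f) for f in clean) / len(clean)

print(f"Detection: {detection_rate:.0%}")       # 100% -- a "perfect" score
print(f"False positives: {fp_rate:.0%}")        # 100% -- an instant fail
```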

The other aspect of the testing, one that is perhaps not clear from the bare results but is highlighted in the short review written of each product, is the tester’s experience of actually installing, using, and testing the product.

John Leyden, writing in The Register, points out that 20 of the 60 products (a third, for those of you who still remember how fractions work) failed to achieve certification. He also quotes John Hawes (VB’s tireless tester) as saying “It was pretty shocking how many crashes, freezes, hangs and errors we encountered in this test” – damning words indeed, considering that the test was run on Windows XP, a mature platform that has been a standard for many years.

So, while attaining VB100 Awards is not the be-all and end-all of testing Anti-Malware products, it’s still a good place to start looking. Congratulations to all those whose products did pass, from someone who knows only too well just how high that particular bar is set.