Category Archives: Anti-malware Testing

Anti-malware testing resource

Testing security software has been part of my life for almost as long as I’ve been involved with computing: not only in terms of evaluating the efficacy of products and technologies for the organizations I worked for, but as an independent tester (especially of Mac AV) way back in the 90s. I stopped testing when I began to foresee a time when I simply wouldn’t have the time or resources to do justice to what even then was a difficult job. There was a time around 2006 when I was discussing roles on both sides of the vendor/tester divide, but for better or worse, I went over to the dark side and focused on supplying consultancy services to the AV industry (primarily ESET). However, I didn’t escape the testing controversy, being involved almost from the beginning in the Anti-Malware Testing Standards Organization (AMTSO) and even serving for nearly three years on its Board of Directors.

While I’m still in sympathy with the ultimate aims of AMTSO, when the organization decided that the blog I set up on behalf of the Board no longer met its needs, I found myself needing a platform where I could continue to provide independent commentary on testing issues. Hence, the Anti-Malware Testing blog. While most of the material there right now consists of articles I originally posted to the AMTSO blog (as an independent commentator, not on behalf of AMTSO) that are no longer available elsewhere, it’s primarily intended for new articles. (I am, however, currently working on a resource page similar to the one on the extinct amtso.wordpress.com blogsite, with links to useful articles, papers and other testing-related resources.)

Right now there are three new articles there:

  • Explaining the Anti-Malware Testing Blog is what the title suggests it is.
  • Imperva-ious to Criticism looks at Imperva’s continued defence of its flawed quasi-test methodology, which inappropriately tried to use VirusTotal as a measure of the detection abilities of anti-virus/anti-malware products.
  • A Little Light Relief is a little lighter in tone. Literally. It points to an entertaining article by Robert Slade. After all, if I had to take testing seriously all the time, I’d get very depressed.

Compliments of the season to all our readers, and very best wishes for the New Year.

David Harley CITP FBCS CISSP
Small Blue-Green World/Mac Virus
ESET Senior Research Fellow

More AMTSO stuff

They say there’s no such thing as bad publicity, though quite who ‘they’ are, and why ‘they’ would make such a clearly daft statement, is beyond me. It seems that AMTSO has had its fair share of bad publicity recently – a further example is the piece by Ed Moyle over on his blog at http://www.securitycurve.com/wordpress/archives/1773. It’s a long article, but it does show that Ed clearly doesn’t understand (or doesn’t want to accept) what AMTSO is trying to do – maybe that just means that AMTSO needs better PR representation. Anyway, once again Kurt Wismer (or perhaps I should adopt his anti-capitalist rendering and use kurt wismer) has provided some excellent analysis of Ed’s piece over on his blog at http://anti-virus-rants.blogspot.com/2010/07/i-see-standards-organization.html

There’s little more that really needs to be said from my perspective. For the record, I personally agree with Kurt (I just can’t seem to get my head around the ‘kurt’ thing) in his analysis of AMTSO’s review of the NSS test – which seems to be at the root of this whole anti-AMTSO campaign. The central point is that NSS did a good job, and came very close to the ideal (if you haven’t read the review, then it’s here). It’s unfortunate that saying they did not fully meet the ideal standard set by AMTSO has been taken as a negative thing, or a slight against them – the test was still far better than many others, and I have every hope that people are sensible enough to recognise that. It’s hard for me to see quite how Ed jumps from that report to an accusation that AMTSO is ‘Slapping the labs’ – an argument even harder to sustain when a lab like Dennis Technology Labs (whose methodology is very similar to NSS’s) voluntarily submitted their own test for the AMTSO review process (see the report here).

If there’s one thing we can learn from this, it’s that there does seem to be a double standard here – testers can criticise AV vendors with impunity in their reviews and tests of AV products, but when someone tries to apply that same process and rigour to the tests done by those testers, it is somehow anathema. Personally, I think that’s shoddy thinking, and I have no doubt that AMTSO will continue to strive, as it has done from inception, to provide the public with an insight into tests and to support good testing practice (and, incidentally, to point out less than ideal practice where needed).

Andrew Lee
AVIEN CEO / CTO K7 Computing

Virus Researchers are community outcasts

Lately I’ve been reading a lot of blogs and articles attacking and defending AMTSO and its attempt at establishing standards for the testing of counter-malware products. Unfortunately I think BOTH sides are missing the larger picture here. AMTSO was formed to address some critical shortcomings in the testing of counter-malware products: some tests were arguably unethical, most were unscientific, and some were just poor from the word go. So where does the dissent come from? It comes from the very people who conducted or supported those poor, non-science-based tests. Yet it goes beyond that. The people who are condemning AMTSO and its efforts are in some cases well respected in the general security arena, and are very knowledgeable, and this is the rub. These people, along with most people in academia and in management, do not recognize malware research and prevention as a specialty niche. They attempt to apply the same rule-set to fighting a malware outbreak as they do to a simple intrusion, and see nothing wrong with that approach.

A majority of people not engaged in the malware field as a profession still feel that the average security professional has the same knowledge and skill sets as counter-malware professionals. Unfortunately, nothing could be further from the truth. It goes beyond the abilities and skills needed for reverse engineering, programming, and identifying abnormal network traffic. This argument goes back to at least the early 1990s, when, in a panel discussion, a firewalls specialist attempted to answer a question about a virus. Also on that panel was Wolfgang Stiller, creator of Integrity Master Anti-Virus, who interrupted him with words to the effect of “look, I’m here for the virus questions; I would never presume to speak with authority or experience on firewall issues, but you presume to have the same experience and expertise with viruses that I do, and that is mistaken”. Similar exchanges have happened on other panels with people such as Robert Vibert and Rob Rosenberger, among others. These are also the same people who demand that anti-malware products protect against threats that are not viruses, nor even specifically malware, but “potentially unwanted programs”. So this is not a new phenomenon. The question in my mind is why it still exists.

Anti-virus ‘experts’ helped establish the disaster recovery field, and were among the very first to teach classes in that subject. It was anti-virus researchers who developed the field of computer forensics; in both cases it was the anti-virus field that had the necessary expertise and skill set to fill the holes and expand the career field. So now that disaster recovery and computer forensics are recognized as specialty fields and given a high degree of respect by schools and management, what happened to the anti-virus researcher? Their mindset is not of an operational nature: they bore easily, and some may even say they have attention deficit disorder (ADD), yet they are anal about doing things the same way every time. They dwell on minutiae, arguing to the point of splitting hairs. I sometimes think some of my colleagues can SEE the traffic on the wire in their mind’s eye. Yet with all this contribution to the computer security community they are still (almost purposely) maligned and misunderstood. At a Virus Bulletin conference, I stated that we as a community must take action or go from the ranks of professionals to the ranks of tradesmen. I still don’t know what that action is, or how to go about it, but AMTSO is a good step in that direction, and the naysayers need to start looking outside their comfort zone and realize that, at this point, they know enough to be dangerous and not enough to be helpful.

Ken Bechtel
Team Anti-Virus
Virus Researcher and Security pontificator

The edge of reason(ableness): AV Testing and the new creation scientists

First, let me start out by saying that I am in a bad mood. I probably shouldn’t write when I’m in this mood, because I’m in danger of just ranting, but I’m going to anyway. I’m in a bad mood because I am pretty fed up that some people are so deliberately trying to destroy something I’ve personally (along with many others) worked very hard to build in the last couple of years.

I’m in a bad mood because writing this is distracting me from the many other things that I need to do, and get paid to do.

I’m in a bad mood because I’m fed up with hearing that I, and others like me, have no right to comment on things that fall directly within my realm of expertise (and goodness knows, that’s a narrow enough realm) – and that if I do, it’s simply self-interested nonsense.

Secondly, let me also point out that although I’m now going to reveal that, yes, I’m talking about Anti-Malware Testing, and may mention AMTSO, I’m not speaking on behalf of AMTSO, nor my employer, nor anyone else, but me, myself and I (oh, that there were so many of us).

So, “What’s the rumpus?*” Well, in what has become an almost unbelievable farce, the last few weeks have seen mounting attacks on the AMTSO group and what it does.

For some background – those who are interested can read these articles.

http://kevtownsend.wordpress.com/2010/06/27/anti-malware-testing-standards-organization-a-dissenting-view/

http://krebsonsecurity.com/2010/06/anti-virus-is-a-poor-substitute-for-common-sense/

There are some very good points in the second (Krebs) article, although cantankerous is not something that I would say characterizes AMTSO all that well – as Lysa Myers has pointed out, ‘AMTSO is made of people’ – and I think the generally negative tone employed is a shame. The first (Townsend) article is far more problematic; there’s just so much wrong with Mr Townsend’s thinking that I don’t really know where to start. Fortunately, Kurt Wismer has already done a great job of responding here, and David Harley an equally competent job here.

So why my response? Well, probably because I certainly am cantankerous.

I’m also, almost uniquely in this industry (David Harley is another), formerly one of those “users” that Mr Townsend is so adamant should be controlling the process of AMTSO’s output – indeed, the whole of AVIEN was set up in the year 2000 as an organisation of interested, non-vendor-employed users, albeit users who knew something about anti-malware issues. We were users responsible for protecting large enterprises, who wanted to be able to share breaking anti-virus information without the interference of vendors or the noise of such cesspools as alt.comp.virus. We wanted good, reliable information.

I, like David Harley, later joined the industry on the vendor side, but I still understand what it is to be a user, and that was also a huge consideration in the setting up of AMTSO – as so many have said before, and I want to reiterate here, bad testing of anti-virus products hurts everyone, the user most especially.

However, this debate is much more than just one on which we can ‘agree to differ’ – as whether Germany or Spain has the better football team might be – it’s much more fundamental than that.

Indeed, the only real analogy that comes close is that of the battle currently raging between the so-called faith-based ‘science’ of creationists (let’s not prevaricate: Intelligent Design is just a euphemism for Creationism) and the research-based science of evolutionary biologists and so on.

On the one hand, you have anti-malware researchers, professional testers and so on; people who study malware every day, who constantly deal with the realities of malware exploiting users, and who understand better than anyone the challenges that we face in tackling malware – if you like, the “Richard Dawkinses of anti-malware” (though I certainly would not claim to match his eloquence nor intelligence) –  and on the other hand, we have those outside the industry who say that we’re all wrong, that we’re just a “self-perpetuating cesspool populated by charlatans” (yet none the less, a cesspool at which the media feeds most voraciously), that nobody needs AV, and that everything the AV community does or says is bunk.

What I find so extraordinary (in both cases) is that those who are most in a position to provide trusted commentary on the subject are ignored in favour of those with shrill but ill-informed voices. Why is it that information from a tester – who may have just woken up one morning and decided to ‘test’ antivirus products – is taken on faith as being correct and true, and yet, when a group of professional people give up their time voluntarily and work together to produce documentation that sets out the ways in which anti-malware products can be tested effectively and reliably (and, no, that has nothing in particular to do with the WildList), it is so violently decried as self-interested nonsense? It’s a terrible shame that science is so deliberately ignored in the face of popular opinion. Unfortunately, millions of people CAN be wrong, and often are.

AMTSO is not about dictating truth, but rather pointing out ways in which truth can be reliably found (and importantly, where it cannot).

I refuse to lie down and take it when someone tries to tell me that I’ve no right to point out the truth – and I’m not talking about truth based on some millennia-old scripture, but real, hard, repeatable, scientifically verifiable, researched fact. If that makes me as unpopular as Richard Dawkins is to a creationist, then so be it.

If you’re interested in understanding why anti-virus testing is so important (and why so many professional testers participate in AMTSO) then, please, do have a read of the AMTSO scriptures er… documents, here.

Andrew Lee – AVIEN CEO, Cantankerous AV researcher.

* If you’ve not seen the excellent movie “Miller’s Crossing” you won’t know where that quote comes from.

(Thanks to Graham Cluley for pointing out that the first link didn’t go to the correct page.)

Brief hiatus

Our reader may note that it’s been quiet around here for a few weeks. Far from this being due to a lack of news, it’s rather that there have been a huge number of other things demanding time and attention. Not least of these is my trying to submit my master’s thesis on time; that, along with a few conferences, papers and other matters, means that we’re a little understaffed at AVIEN right now. Normal intermittent service should be resumed shortly.

Andrew Lee
AVIEN CEO / CTO K7 Computing

Testing AV: Why VB Tests are still relevant

The latest Virus Bulletin anti-malware product test, the largest ever of its type (a mammoth 60-product test), demonstrates several things: that testing anti-virus products never gets any easier; that discussing (or dissing) the tests never gets any less popular; and that the results of testing are never less than controversial.

Virus Bulletin has been in the testing game a very long time; its comparative testing and the VB 100 Award have been around since early 1998, and before that VB had been reviewing AV products since its inception in 1989. The test methodology is well known, and is based on a combination of WildList testing, tests for ‘zoo’ viruses (that is, known malware not on the WildList) and false positive (FP) testing. The full current methodology can be found here.

Despite a large number of people decrying this sort of WildList-based testing, and indeed some vendors withdrawing entirely from any sort of ‘static’ test (i.e. one based on scanning predetermined files rather than live incoming threats), the fact that 60 products participated in a test like this shows that there is still life, and worth, in this type of testing.

The surprising thing is that, while many criticize WildList-based tests for being limited in scope (the WildList is certainly not a comprehensive list of malware), so many products still fail to pass them. This, perhaps more than anything, highlights their usefulness as a baseline: if your product isn’t reasonably consistent in achieving the VB 100 Award, perhaps you should think about a different one. Often the problem is not detection so much as false detection, which makes the FP part of the test very important. Any product could detect 100% of all viruses very easily; it’s much more difficult to detect ONLY viruses, and nothing else.
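To put some entirely invented numbers on that last point – the following is a purely illustrative Python sketch, not a description of any real test, product or corpus – consider a ‘scanner’ that simply flags every file it sees. It achieves a perfect detection rate, yet its false positive rate makes it unusable, which is exactly why the FP component of the test matters:

    # Illustrative only: detection rate means little without a false positive rate.
    # All figures below are invented for the example.

    def rates(flagged_malware, total_malware, flagged_clean, total_clean):
        """Return (detection rate %, false positive rate %)."""
        return (100.0 * flagged_malware / total_malware,
                100.0 * flagged_clean / total_clean)

    MALWARE, CLEAN = 1000, 100000   # hypothetical test set sizes

    # A 'scanner' that flags everything: perfect detection, unusable in practice.
    det, fp = rates(MALWARE, MALWARE, CLEAN, CLEAN)
    print(f"Flag-everything scanner: {det:.1f}% detection, {fp:.1f}% false positives")

    # A plausible real product: misses a few samples, rarely flags clean files.
    det, fp = rates(995, MALWARE, 3, CLEAN)
    print(f"Realistic scanner: {det:.1f}% detection, {fp:.3f}% false positives")

The first ‘product’ scores 100% detection and 100% false positives; the second scores slightly lower detection but flags almost nothing clean, and is the only one of the two anybody could actually use.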

The other aspect of the testing, which perhaps is not clear from the results but is highlighted in the short review written of each product, is the tester’s experience in installing, testing and using the product.

John Leyden, writing in The Register, points out that 20 out of the 60 products (1/3, for those of you who still remember how fractions work) failed to achieve the certification. He also quotes John Hawes (VB’s tireless tester) as saying “It was pretty shocking how many crashes, freezes, hangs and errors we encountered in this test” – damning words indeed, considering that the test was on Windows XP, a mature platform that has been a standard for many years.

So, while attaining VB 100 Awards is not the be all and end all of testing Anti-Malware products, it’s still a good place to start looking. Congratulations to all those whose products did pass, from someone who knows only too well just how high that particular bar is set.

SRI iBotnet analysis

I’m not a huge fan of SRI, mainly because of its misconceived and inept use of VirusTotal as a measure of anti-malware effectiveness. (Unfortunately, SRI is not the only organization to misuse what is actually a useful and well-designed service by Hispasec as a sort of poor man’s comparative testing, even though Hispasec/VirusTotal themselves have been at pains to disassociate themselves from this inappropriate use of the facility: see http://blog.hispasec.com/virustotal/22.)

So it pains me slightly to report that they have actually produced a reasonable analysis of the botnet associated with the iPhone malware sometimes known as Ikee.B or Duh (sigh…). But they have, and it’s at http://mtc.sri.com/iPhone/.

I wish I could say that some of their other web content is of the same standard. Disclaimer: the company for which I currently work does indeed consistently appear at a very low position in SRI’s rankings, so you’d expect me to dislike the way they get their results. I do… But I dislike even more the way that they’ve ignored all my attempts to engage them on the topic. OK, rant over. The Ikee analysis is still well worth a look.

David Harley FBCS CITP CISSP
Chief Operations Officer, AVIEN
Director of Malware Intelligence, ESET

Also blogging at:
http://www.eset.com/threat-center/blog
http://dharley.wordpress.com/
http://blogs.securiteam.com
http://blog.isc2.org/

AMTSO – The herding of the cats continues

I’ve spent the last couple of days in Prague (never a real hardship) at the AMTSO (Anti-Malware Testing Standards Organization) conference. The subject of testing is one that I, and many others in the industry, have been interested in for a long time. Indeed, my main contribution to the AVIEN Malware Defense Guide was a chapter discussing testing. The whole reason AMTSO formed was to try to create some clarity around the increasingly complex issues of testing. It may seem to some – particularly those who may never have attended an event involving large numbers of people with (slightly or wildly) differing opinions – that the wheels of AMTSO grind very slowly. However, this is not the case: these are complex issues, and the important thing is to ensure that any document that is published meets the aims and principles of the organisation. To that end all documents must be fully discussed and formally voted upon by the membership. The meetings are a productive time when final adjustments can be made to the documents that have been put together over the past months, and those documents voted upon.

There are already signs that AMTSO is having a positive effect: many testers have joined in the effort – clearly, bad testing also has a negative effect on their reputations – and the group has been mentioned many times in the press and in security circles. I hope that the increased awareness will encourage people to get involved, and that the progress will continue. The conference was interesting for all, with some good discussion on controversial topics. Keep an eye out for a press release over the next couple of weeks, and the appearance of some news on the AMTSO site.

Anti-malware testing is something that really does affect anyone who has a computer, so it’s great to know that there is a group dedicated to promoting ethical practice and laying out guidelines for good testing that can showcase the abilities of modern products.

As a member of AMTSO (but not an official representative of it), I’m happy to say that I fully support the effort, and while it may seem slow, and progress often involves a level of complexity akin to herding cats, it’s a worthwhile undertaking, and it is to be hoped that it will continue to go from strength to strength.

Andrew Lee CISSP
AVIEN CEO

Blog reviews

On the subject of testing (or at least of reviews), Tom Kelchner in the Sunbelt blog pointed out upcoming FTC rules that make (some) bloggers who review products more accountable by requiring them to declare any pay and perks they receive for those reviews. That’s products in general, of course, but there are obvious implications for this industry: the Untangled tests, for instance, were largely publicised through their blog (and through secondary sources such as other bloggers and other media, of course).

Sunbelt: New FTC rules: bloggers must reveal pay and perks they get for reviews http://bit.ly/Qy26L

MSNBC story: http://www.msnbc.msn.com/id/33177160/ns/technology_and_science-tech_and_gadgets/

FTC:

Testing, testing

OK, we’ve used that as a title before. However, it seems quite apposite, as this is my first published blog post here, and it’s related to anti-malware testing. (See what I did there? :-D)

This is actually a retread of my heavily re-edited blog at securiteam. But since it concerns (obliquely, for legal reasons) an issue that some of us discussed at VB 2009, I’m quite happy to repurpose some of it here.

Principle 3 of the AMTSO (Anti-Malware Testing Standards Organization) guidelines document (http://www.amtso.org/amtso—download—amtso-fundamental-principles-of-testing.html) states that “Testing should be reasonably open and transparent.”

The document goes on to explain what information on the test and the test methodology it’s reasonable to ask for.

So my first question is this: is it open and transparent for an anti-malware tester who claims that his tests are compliant with AMTSO guidelines to decline to answer a vendor’s questions, or to give any information about the reported performance of the vendor’s product, unless the vendor buys a copy of the report or pays the tester a consultancy fee?

Secondly, there is, of course, nothing to stop an anti-malware tester soliciting payment from the vendors whose products have been tested both in advance of the test and in response to requests for further information. But is he then entitled to claim to be independent and working without vendor funding? In what respect is this substantially different to the way in which certification testing organizations work, for example?

AMTSO will be considering those questions at its next meeting (in Prague, next week). But there are a lot of people inside and outside AVIEN who are seriously concerned with testing standards – as an aid to evaluating products for use in their own organizations, or because they have a vocational interest in making or supporting products that are affected by fair/unfair or good/bad testing – and I’d be more than a little interested in hearing your views.

David Harley CISSP FBCS CITP
Chief Operations Officer, AVIEN