
Is anti-virus software really woefully ineffective?
That's what a report released in late November by Redwood Shores, Calif., digital-security firm Imperva seems to suggest.
It claimed that 40 top anti-virus products initially detected less than 5 percent of newly discovered malware samples, and implied that anti-virus software customers were wasting their money.
"We believe that the majority of anti-virus products on the market can't keep up with the rate of virus propagation on the Internet," the study stated. "[What] enterprises and consumers spend on anti-virus is not proportional to its effectiveness."
Imperva's study was widely criticized at the time for poor methodology, a small sample size and unrealistic testing scenarios. Because of those doubts, TechNewsDaily did not report on the study when it was first released.
Six weeks later, the story is enjoying a second wind, with write-ups Tuesday in the influential tech blog The Register and in the business section of The New York Times.
Neither the Register nor the Times questioned Imperva's methods. Yet the questions remain.
"Not only is Imperva's sample size minutely small, but their test has been based upon an utterly flawed methodology," Graham Cluley, senior technology consultant at Sophos, an anti-virus software maker near Oxford, England, told TechNewsDaily Wednesday.
"This 'study' and its conclusions are deeply flawed, wholly unreliable and massively biased," tweeted Rik Ferguson of Japanese anti-virus firm Trend Micro yesterday, addressing the author of the New York Times piece.
"Kaspersky Lab believes it is necessary to draw attention to a significant drawback in Imperva's testing methodology, which makes it impossible to take these test results seriously," a representative of the Russian anti-virus software maker told TechNewsDaily.
Imperva did not directly respond to a request for comment, but pointed us to a blog posting from mid-December that addressed many of its study's critics.
"While our report acknowledged the limitations of our methodology, we believe that, fundamentally, the model for antivirus, and not our methodology, is flawed," the posting reads.
Unorthodox approach
Imperva's methods could be characterized as somewhat odd. It searched Google, underground hacker forums and its own "honey pots" (deliberately unprotected servers designed to attract malware) for new or nearly unknown pieces of malware. It ended up with 82 pieces.
Then the Imperva researchers accessed VirusTotal, a publicly accessible database of malware signatures: long alphanumeric strings, or "hashes," unique to each piece of code. (Google recently bought VirusTotal.)
Anyone can submit a hash to VirusTotal to see which, if any, of the 40 or so anti-virus companies that submit their own hashes to VirusTotal had already picked it up.
Imperva did exactly that. It generated its own hashes for each of the 82 malware samples it had gathered and ran them against VirusTotal's database. It found very few matches.
"The initial detection rate of new viruses is nearly zero," the Imperva researchers concluded. "Though we don't recommend removing anti-virus altogether, a bigger portion of the security focus should leverage technologies that detect abnormal behavior such as unusually fast access speeds or large volume of downloads."
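The hash-lookup step described above can be sketched in a few lines. This is an illustrative Python sketch, not Imperva's actual code; the function name is invented here. It computes the kind of cryptographic fingerprint (e.g., SHA-256) that services such as VirusTotal accept as a lookup key.

```python
import hashlib

def file_hash(path: str) -> str:
    """Compute the SHA-256 hash of a file, the kind of unique
    'signature' that can be looked up in a hash database such as
    VirusTotal. (Illustrative sketch only.)"""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Note that identical files always produce identical hashes, but changing even a single byte produces a completely different hash, which is why exact-match signature lookups can miss repacked or slightly modified malware.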
Jump to a conclusion?
"The test pitted various anti-virus products against a tiny collection of 82 malware samples, and still got VirusTotal to do the hard work for them," Cluley said. "And yet, VirusTotal's own About page clearly says that using their service for scanner testing is a 'BAD IDEA' (their capitals, not mine) and has frequently debunked testers that try to compare products by using the site."
"VirusTotal representatives clearly state that the service was not designed as a tool to perform comparative anti-virus analyses," the Kaspersky Lab representative said. "Unfortunately, this was ignored by Imperva's so-called experts, whose incorrect testing methodology resulted in incorrect conclusions."
Furthermore, modern anti-virus software already does what Imperva suggested it doesn't. Besides matching signatures, it also analyzes unknown pieces of code for behavior patterns, methods of entry, resemblance to known malware and other "heuristic," or experience-based, clues.
To make a real-world comparison, imagine that a team of security guards only compared strangers' faces to a file of photographs of known criminals. Such methods would work, but only up to a point; yet that's essentially what Imperva tested.
Better-trained guards would also analyze what the strangers wore, how they acted, how they communicated and how they got to the building. That's akin to what most anti-virus products do nowadays.
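The guard analogy maps onto a simple behavior-based scorer: instead of requiring an exact signature match, suspicion accumulates from multiple clues. The sketch below is a toy illustration; the behavior names, weights and threshold are invented for this example and do not reflect any vendor's actual engine.

```python
# Toy heuristic scorer: a sample is flagged when enough
# behavior-based clues accumulate, rather than requiring an
# exact signature match. All names and weights are invented
# for illustration only.
SUSPICION_WEIGHTS = {
    "writes_to_startup_folder": 3,
    "disables_security_tools": 4,
    "packed_executable": 2,
    "contacts_unknown_server": 2,
}

def heuristic_score(observed_behaviors) -> int:
    """Sum the weights of every recognized suspicious behavior."""
    return sum(SUSPICION_WEIGHTS.get(b, 0) for b in observed_behaviors)

def is_suspicious(observed_behaviors, threshold: int = 5) -> bool:
    """Flag the sample once the combined score crosses the threshold."""
    return heuristic_score(observed_behaviors) >= threshold
```

A never-before-seen file that disables security tools and installs itself into the startup folder would be flagged here even though its hash matches nothing, which is the point the anti-virus vendors make against Imperva's signature-only test.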
"Simply scanning a collection of files, no matter how large or how well-sourced, misses the point of security software entirely," Ferguson told the website of Britain's ITPro magazine today. "They were not exposing the products to threats in the way they would be in the wild."
There are other companies that do rigorous, constant testing of anti-virus products as a major part of their business models.
"Respected testing bodies like AV-Test.org, West Coast Labs, Virus Bulletin, etc., test anti-virus products against millions of samples," Cluley said, "and do it properly by verifying that the samples they test against are truly malicious, and avoid the many challenges and problems that anti-virus testers have to handle."
In their response to critics, Imperva's Rob Rachwald and Tal Be'ery defended the company's methods and its conclusions. They cited a study by AV-Test that also, in their words, "reveal[ed] a worrisome security gap."
Yet the AV-Test screen grab that Rachwald and Be'ery posted demonstrated the opposite of Imperva's conclusions. It showed that in real-world testing of protection against zero-day malware attacks, the anti-virus industry average was 87 percent.