Free Antivirus Software for Home Use v2.0
by Robert Spotswood
Summary for the impatient
If you're using Windows, you'd be foolish not to have anti-virus software. At work you have an IT department to take care of it and pay for it. But home users are on their own. However, you don't need to spend lots of money to get good anti-virus software. In fact, you don't have to spend any money at all.
There are at least 10 anti-virus programs that are free for home use. Three of them are free for business use as well. If you don't like those, your Internet Service Provider may provide you with commercial antivirus software at no charge. Regardless of the price or source, which one do you choose?
I used to believe the answer was simple: just pick one and keep it updated. I was wrong. There are many factors to consider. Three of the most important are cost, how much of a PITA the software is, and, last but hardly least, how well it works. As one antivirus tester put it (http://blog.untangle.com/?p=95), “Open source antivirus hasn’t gotten a fair shake, and all the meanwhile some commercial vendors escape with selling products so poor it should be considered a scam to sell them...”
I set out to prove him right. I wanted to show that two open source antivirus products, Moon Secure and Clam, are just as good as any commercial antivirus. And since I was at it, why not test some other free products that would be of interest to HAL-PC members too? In the end, things didn't go quite like I expected.
As a note, I will use the terms virus and malware pretty much interchangeably. The antivirus products I'm aware of don't just detect viruses any more. They detect viruses, trojans, worms, spyware, and rootkits, collectively known as malware. Or at least they try to. A 2007 research study by Panda Labs found that about 23% of infected machines had active and up-to-date antivirus software (http://cli.gs/930J3L).
Which products to test? That was the question I faced. Considering this is all self-funded, the free products jumped right to the front of the line. I started searching the net for a complete list of all the antivirus products out there. I didn't expect the list to be that big. In the end, I found 55, and I'm still not sure I found them all. There is no single, complete list that I could find.
Out of those 55, how many are free for at least home use? Most HAL-PC members have home computers to protect, and, except for us Linux SIG members, the vast majority of those computers are running Windows. So I limited my search to products that are at least free for home use and run on Windows, which gave me 10 candidates.
The next step was to check out the license terms for each of the 10 finalists. It's not unknown for software manufacturers to stick rather onerous terms in licensing agreements, including prohibitions on publishing benchmarks without their say-so.
While it's true that such terms might very well be declared unconscionable by almost every judge in America, it's just not worth the money to fight it out in court. Better to learn about them early and pass that product by. With that in mind, I started reading the End User License Agreements (EULAs). When I woke from my legalese-induced coma, two contenders had eliminated themselves. No testing was done on these products.
Be warned that nasty terms and tricks are not limited to just these two. For instance, in my research I turned up this quote:
Gathering the malware
Now that the field was down to 8, it was time to find some malware to throw against them and see how the various products did.
In my line of work, I'm sometimes called upon to clean infected computers. This gave me a perfect opportunity to gather real, in-the-wild samples. I have friends and associates who also do cleaning from time to time, and they gave me additional samples.
I should note that homes with teenagers were great sources of malware samples. Parents, don't let your teenagers on your computer if you have information on there that you wouldn't want a criminal to have. The teenagers aren't quite as web-savvy as they would like to believe.
As soon as I got the malware, I would run it against the remaining antivirus products, being sure to update them first, and record which ones missed it. In several cases, the samples were false positives - the antivirus said it was malware, but it was not - and I recorded those as misses too.
After I ran my tests, I submitted the samples to the vendors that missed them, in hopes of preventing someone else from getting infected. In the process, I learned two disheartening facts about the antivirus world.
I suspect the slow update times were a combination of several factors. First, none of my samples got much press coverage. I'm certain mainstream media talking about a particular piece of malware would speed up the definitions. It is also reasonable to assume the number of submissions of a particular piece of malware has an impact on how quickly a signature is added.
Another part of the problem is one of sheer numbers. There are many, many more malware writers than antivirus signature writers. The defenders are simply overwhelmed. Each sample has to be checked to be sure it is malware, then a signature has to be created, then checked to make sure said signature doesn't create a big false positive problem. All this takes time.
None of the remaining contenders had a perfect score. None. Every one of them missed something, which is more than a little disheartening. In fact, I had two samples that appeared to be zero-day malware that over 40 different scanners missed. So much for the heuristics engines many vendors tout.
Still, some caught most of the samples, while others did poorly. Remember that 23% statistic from Panda Labs? It's quite believable based on my results. It's not enough to have active and up-to-date antivirus. You have to have active and up-to-date GOOD antivirus.
Unfortunately, GOOD is hard to determine. All test results, not just mine, are snapshots of history. Today's top performer could easily become tomorrow's dog, and vice versa, and even top performers can have bad periods. It is important to regularly evaluate your antivirus solution, regardless of what it is. Don't just rely on fancy marketing or a recommendation from someone who knows a lot about computers. This especially includes salesmen at the big electronics stores.
My testing came down to eight products that I would put in three different categories. The finalists are: Avast, AVG, Avira, ClamWin, Comodo, Moon Secure, PC Tools Antivirus, and Spyware Terminator. The three categories are: Standard, Host Intrusion, and Manual Scanning.
Most antivirus programs fall into the Standard category. They have a database of malware signatures that they check against, both manually and with an on-access real-time scanner. Many also have a heuristics engine that will attempt (key word: attempt) to figure out if a program has the characteristics of malware even without a signature.
The heuristics engines are something of a mixed blessing. While they give you a chance to block malware there is no signature for, they also tend to flag perfectly harmless, and sometimes important, programs and files as malware. Heuristics engines have resulted in critical system files being removed, disabling the computer. It happened before and will likely happen again.
Overall, programs in the standard category require very little user interaction. They run on auto-pilot 95+% of the time. Of the final seven, Avast, AVG, Avira, and PC Tools Antivirus are in the standard category. Moon Secure is also in the standard category, but as explained below, was excluded very early in the testing.
The Host Intrusion programs are Standard category programs with an additional twist. They monitor access to certain critical files, folders, and registry keys. Any attempt by a program to access these is blocked and a pop-up message is issued. You, the user, can then choose to allow the action or continue to block it. Obviously, the user can exempt some programs from the auto-blocking.
Host Intrusion programs can warn you about attempted stealth installs even if the heuristics engine and the signature database fail. In my tests so far, none of the seven programs has a perfect score for the heuristics and signature detection, so the Host Intrusion programs can offer more security than those in the standard category.
However, this extra security comes at a price. While the Standard programs require almost no user interaction, the Host Intrusion programs can require a great deal of interaction. If you are the type of user who hates that sort of thing, go with one of the standard programs. If you are the user who has no idea how to determine what to block and what to allow, go with one of the standard programs.
Now don't get me wrong. The pop-ups do not occur every few minutes. All the pop-ups I've seen, with only a few exceptions, only occur when installing a new program or sometimes an update to a program. If you get a pop-up out of the blue and aren't installing something, odds are very good it's malware trying to infect you. The real difficulty usually occurs when some website tells you that you need a program they are offering you. Is it a legitimate program or malware? Unless it's Java, Flash, Shockwave, Silverlight, or Adobe Reader, the odds are good it's malware.
The only pop-ups I found that were not caused by program installations were caused by the anti-piracy protection on some games. Teracopy also triggered a pop-up when I used it to change the default copy handler, but that was a one-time occurrence.
Two of the seven finalists are included in the Host Intrusion category: Comodo and Spyware Terminator.
There is only one entry in the Manual Scanning category: ClamWin. A Manual Scanner does not offer any real-time protection and will only scan files when you explicitly ask it to. As such, it should not be relied on for everyday desktop use.
ClamWin was included for three reasons. First, its database is used in other products, namely Spyware Terminator (optional) and Moon Secure. Second, the Linux version of Clam is often used on mail servers as a cheap, and supposedly effective, way to prevent the spread of malware via email. It is also used to check files in Windows shares stored on Linux file servers. The database used by the Windows version of Clam (ClamWin) is the exact same one used in the Linux version. Finally, I could find little in the way of tests done on Clam. The little evidence I found was all over the board, from perfect to poor.
I used the Windows version of Clam rather than the Linux version to keep the playing field as level as possible. All the other programs tested were run under Windows, so I needed to run Clam under Windows as well.
I actually got into testing because I wanted to do an article recommending Moon Secure, and by extension Clam. I hoped to give Clam a glowing recommendation. I wanted to give Clam a glowing recommendation. Sadly, I cannot.
Clam came in dead last in my signature tests. Further, it did poorly in the speed tests, with only AVG coming in worse. I found it interesting that Spyware Terminator, which can and did use the Clam database as well as its own database during the speed tests, did better than ClamWin by itself. As for Clam's performance on Linux, my experience using it has led to an estimate of 8-10 minutes per gigabyte of data. Not exactly a speed demon there either.
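To put that figure in perspective, here is a minimal back-of-the-envelope sketch. The 8-10 minutes-per-gigabyte rate is my own rough estimate from experience, not a vendor benchmark, and the 100 GB file-share size is a hypothetical example:

```python
# Back-of-the-envelope scan-time estimate for Clam on Linux, using the
# rough 8-10 minutes-per-gigabyte rate observed in my own experience.

def estimated_scan_minutes(data_gb, low_rate=8, high_rate=10):
    """Return the (low, high) estimated scan time in minutes."""
    return (data_gb * low_rate, data_gb * high_rate)

# A modest, hypothetical 100 GB file share:
low, high = estimated_scan_minutes(100)
print(f"Scanning 100 GB: roughly {low:.0f} to {high:.0f} minutes")
```

At that rate, scanning even a modest 100 GB share ties the server up for well over 13 hours, which is why the speed matters as much as the detection rate.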
Therefore, I must put Clam in the Not Recommended category.
The tests were all done in a virtual machine running XP SP3 with 512 MB of RAM. A test folder of .exe files was assembled. Some of the .exe files are installer programs and as such contain other files. Windows Explorer lists the directory as 711 MB, containing 910 files. All times are from the program in question reporting how long it took. Each program was run three times and the results averaged (mean).
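The averaging step above can be sketched as follows. The per-run times shown are hypothetical, for illustration only, not actual results from any of the products tested:

```python
# Minimal sketch of the timing methodology: each scanner's self-reported
# run time was recorded for three runs and the mean of the runs taken.

def mean_scan_time(times_secs):
    """Return the mean of the per-run scan times, in seconds."""
    return sum(times_secs) / len(times_secs)

# Hypothetical per-run times for one product, in seconds:
runs = [340, 352, 346]
print(f"Average scan time: {mean_scan_time(runs):.0f} secs")
```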
The programs are sorted from fastest to slowest. The reported object counts are larger than 910 in many cases, as some programs automatically scan memory when called, while others count every file in an archive to bump up their numbers. All programs were run with their default settings.
One suggestion I've received about why AVG was so slow was that the free version is crippled - the “price of free.” I re-ran the tests on the same hardware, against the same files, using the commercial version, which was slightly newer than the free version. It had an average time of 348 secs, with 5594 objects reported, so I can definitely state that the free version is not crippled.
Just having active and up-to-date antivirus is not enough. According to one December 2007 article (http://cli.gs/DN9na8), no product tested by the VB100 test has a perfect score. The best, NOD32, a commercial product, achieved only 94%, and I've seen complaints about its database not updating due to an "undocumented serious error" and about the software nagging users to send suspicious files to the makers of NOD32.
You have to do your research when picking an antivirus product. Hopefully, I've opened your eyes to just how bad it is out there. Just one miss can cost thousands of dollars (http://cli.gs/Xz5U2v). Be careful, it's a jungle out there.
Now if you'll excuse me, I have to don my fire retardant suit, as there's a lot of flames headed my way.
Charles W. Evans is a HAL-PC member and the Magazine’s Reviews Editor who can be contacted at firstname.lastname@example.org