============================================================
File 0XECSUM.TXT: "EXECUTIVE SUMMARY"
AntiVirus/AntiMalware Product Test "2002-12"
Virus Test Center (VTC), University of Hamburg
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

**********************************************************************
Content of this file:
**********************************************************************
 0. Editor's Foreword
 1. Background of this test; Malware Threats
    Table ES0: Development of viral/malware threats
 2. VTC Testbeds used in VTC test 2002-12
    Table ES1: Content of VTC test databases in test 2002-12
 3. Products participating in test 2002-12
    Table ES2: List of AV products in test 2002-12
 4. Serious problems: many products crash, some don't find all files.
    Flaw in Microsoft FindFirst/FindNext routine?
 5. Results of AV products under DOS:
    Table DOS-A1: Development of file virus/malware detection
                  of DOS scanners from 1997-02 to 2002-12
    Table DOS-A2: Development of macro/script virus/malware detection
                  of DOS scanners from 1997-02 to 2002-12
    Findings: DOS.1 - DOS.7
    Grading DOS products according to their detection performance
 6. Concerning Windows-NT: for the last test under Windows NT 4.0,
    see VTC test 2001-10
 7. Results of on-demand detection under Windows-98:
    Table W98-A1: Development of file virus/malware detection
                  of W98 scanners from 1998-10 to 2002-12
    Table W98-A2: Development of macro/script virus/malware detection
                  of W98 scanners from 1998-10 to 2002-12
    Findings W98.1 - W98.7
    Grading W98 products according to their detection performance
 8. Results of on-demand detection under Windows-2000:
    Table W2k-A: Development of W2k scanners for file, macro and
                 script viruses/malware from 2000-08 to 2002-12
    Findings W2k.1 - W2k.7
    Grading W2k products according to their detection performance
 9. Comparison of detection behaviour for W32 platforms
    Grading AV products concerning W32-harmonical behaviour
10. Results of on-demand detection under Linux (SuSE)
    Table Lin-A: Development of Linux scanners from 2001-04 to 2002-12
    Findings LIN.1 - LIN.7
    Grading Linux products according to their detection performance
11. Conclusion: In Search of the "Perfect AV/AM product"
12. Availability of full test results
13. Copyright, License, and Disclaimer

***********************************************************************

0. Editor's Foreword:
=====================

VTC test "2002-12" was started in January 2002, after submission of
products (before Christmas 2001). In this test, the testbeds have
grown significantly. With this growth we experienced many more
problems, as products behaved "abnormally" (see 8problms.txt).
Moreover, some products (and our test crew) suffered from a known but
uncorrected flaw in the Microsoft FindFirst/FindNext routines, which
required postscans for almost ALL products on almost ALL testbeds
(see 4).

Evidently, the time of DOS-based AntiVirus/AntiMalware products -
formerly the reference products for measuring detection rates on W32
platforms - is passing. Not only is the number of DOS products
decreasing, but worse: detection rates tend to fall. Consequently,
DOS products can no longer be regarded as reference products. As DOS
becomes less relevant (if at all), AV producers also do NOT invest
into improving the related products. Moreover, the category of
viruses which works essentially only under DOS, namely BOOT viruses,
has become almost irrelevant.
In this situation, the VTC team has decided that this test will be
the last one addressing DOS-related scanners. After more than 10
years of testing - starting in 1991 with the first tests, then
performed by Dr. Vesselin Bontchev - we will concentrate on testing
for W32 platforms and Linux.

With the deployment of new W32 platforms (including W-XP), customers
going from one W32 platform to another will assume that the related
AV/AM products behave with IDENTICAL detection rates. We tested this
assumption (which we call "W32-harmonical behaviour", see 9; a small
illustrative sketch follows at the end of this foreword) and found
that this assumption is justified for many (though not all) products
for macro/script viruses, but NOT for W32 virus detection.

   **  As so many W32 products behave "W32-harmonically", it will  **
   **  be sufficient in future tests to concentrate on fewer W32   **
   **  platforms (including W-98, W-2000 and W-XP).                **

One serious concern arising from our results is that AV producers
concentrate more on the detection of In-The-Wild (ITW) viruses than
on zoo viruses. Indeed, one AV company - TrendMicro - informed us
that they do not wish to participate in our test as they concentrate
on ITW detection and are aware that their products would produce
"unfavourable results" on our zoo testbeds (see 3). For many other AV
products, detection rates for ITW viruses are perfect (100%) or
excellent (>99%), but detection of zoo viruses is often significantly
lower. Evidently, AV producers focusing on ITW detection forget that
any ITW virus has been a zoo virus before going In-The-Wild. It can
hardly surprise that customers of such products experience painfully
how the neglect of zoo virus detection affects their IT services when
a hitherto unknown zoo virus is deployed broadly (the author of this
test report had to advise several victims of such ill-advised
"ITW-mindedness", aka "zoo-blindness"). And the first victims - often
large companies - gain nothing from even the fastest exchange of
newly "wildering" code: for some, it always comes too late!

This test - as all previous ones - has been performed by students of
Hamburg University's Faculty for Informatics with a special interest
in IT Security (see our 4-semester curriculum, started in 1988, on
our homepage). Unlike other tests, where submitters of products have
to pay a fee to be admitted, VTC tests are "FREE OF FEE". This
implies that students who have to complete their examinations and
usually also work to earn their income are only "partially available"
for tests. Moreover, our hardware, which is essentially funded by
Faculty support (sometimes also by donation of new machines, usually
more powerful than those which we can buy from university money),
canNOT compete with the technical equipment of other test labs. We
regret that these circumstances cause delays in performing and
publishing our regular test reports, but instead of hurrying to meet
dates and expectations, we insist that the assessed QUALITY of our
test results shall have - also in the future - highest priority.

Most of the work in VTC tests rests on the shoulders of our test
crew, and the editor wishes to thank them all for their devotion and
hard work. (See the VTC test team at the end of this report.)
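As an illustration of the "W32-harmonical behaviour" criterion
mentioned above, here is a minimal sketch in C; the structure, the
comparison of rates rounded to one decimal and the example values are
assumptions made for this illustration only and are not part of the
test procedure:

    #include <math.h>
    #include <stdio.h>

    /* Illustrative sketch: a product is treated as "W32-harmonical"
       for a given testbed when its detection rates on all tested W32
       platforms (here W-98 and W-2000) agree, compared as rounded
       percentages the way the result tables report them.            */
    struct w32_result {
        const char *product;  /* 3-letter code, e.g. "XYZ" (example)  */
        double rate_w98;      /* zoo detection rate under W-98, in %  */
        double rate_w2k;      /* zoo detection rate under W-2000, %   */
    };

    static int is_w32_harmonical(const struct w32_result *r)
    {
        /* equal after rounding to one decimal place */
        return fabs(round(r->rate_w98 * 10.0) -
                    round(r->rate_w2k * 10.0)) < 0.5;
    }

    int main(void)
    {
        /* hypothetical numbers, not taken from the result tables */
        struct w32_result demo = { "XYZ", 99.8, 99.8 };
        printf("%s is %sW32-harmonical on this testbed\n",
               demo.product, is_w32_harmonical(&demo) ? "" : "NOT ");
        return 0;
    }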
1. Background of this test: Malware Threats:
============================================

Malicious software (malware), including viruses (= self-replicating
malware), Trojan horses (= pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users, esp. when connected to
intranets and the Internet. The development of malicious software can
well be studied in view of the growth of the VTC (zoo and
In-The-Wild) testbeds.

The following table summarizes, for previous and current VTC tests
(indicated by their year and month of publication), the size of the
virus and malware (full = "zoo") databases, giving in each case the
number of different viruses or malware specimens and the number of
instantiations (infected objects), and keeping in mind that some
revisions of the testbeds were made:

Table ES0: Development of threats as present in VTC test databases:
===================================================================
           == File Viruses/Malware == I = Boot Viruses = I = Macro Viruses/Malware = I = Script Viruses/Malware =
Test#       Number Infected   Number I  Number Infected I  Number Infected   Number I  Number Infected   Number
           viruses  objects  malware I viruses  objects I viruses  objects  malware I viruses  objects  malware
-----------+-------------------------+------------------+---------------------------+---------------------------
1997-07:    12,826   83,910      213 I     938    3,387 I     617    2,036       72 I     ---      ---      ---
1998-03:    14,596  106,470      323 I   1,071    4,464 I   1,548    4,436      459 I     ---      ---      ---
1998-10:    13,993  112,038    3,300 I     881    4,804 I   2,159    9,033      191 I     ---      ---      ---
1999-03:    17,148  128,534    3,853 I   1,197    4,746 I   2,875    7,765      200 I     ---      ---      ---
 VKIT/Poly:     +5  146,640          I                  I                           I
1999-09:    17,561  132,576    6,217 I   1,237    5,286 I   3,546    9,731      329 I     ---      ---      ---
 VKIT/Poly:     +7  166,640          I                  I                           I
2000-04:    18,359  135,907    6,639 I   1,237    5,379 I   4,525   12,918      260 I     ---      ---      ---
 VKIT/Poly:     +7  166,640          I                  I                           I
2000-08:       ---      ---      --- I     ---      --- I   5,418   15,720      500 I     306      527      ---
2001-04:    20,564  140,703   12,160 I   1,311    5,723 I   6,233   19,387      627 I     477      904      ---
 VKIT/Poly:     +7  166,640          I                  I                           I
2001-07H1:     ---      ---      --- I     ---      --- I    +544   +2,035     +102 I    +206     +386      ---
2001-10:       ---      ---      --- I     ---      --- I   6,762   21,677      683 I     481    1,079       30
2002-02H2:     ---      ---      --- I     ---      --- I    +675   +3,245     +720 I    +854   +1,645     +270
-----------+-------------------------+------------------+---------------------------+---------------------------
TESTBED OF THIS TEST:
2002-12:    21,790  158,747   18,277 I  ITW:11      149 I   7,306   25,231      747 I     823    1,574      202
-----------+-------------------------+------------------+---------------------------+---------------------------

Remark #1: Before test 1998-10, an ad-hoc cleaning operation was
           applied to remove samples whose virality could not be
           proved easily. Since test 1999-03, separate tests are
           performed to evaluate detection rates for VKIT-generated
           and selected polymorphic file viruses.
Remark #2: Heureka tests (marked "H#") use only those samples
           (viruses and malware) which were newly found (marked "+")
           in a given period (for details, see the related test
           reports); Heureka tests include macro and script
           viruses/malware.

With an annual deployment of more than 5,000 viruses and about 1,000
Trojan horses, many of which are available from the Internet, and in
the absence of inherent protection against such dysfunctional
software, users must rely on AntiMalware and esp. AntiVirus software
to detect and eradicate - where possible - such malicious software.
Hence, the detection quality of AntiMalware and esp. AntiVirus
products becomes an essential prerequisite for protecting customer
productivity and data.

The Virus Test Center (VTC) at Hamburg University's Faculty for
Informatics performs regular tests of AntiMalware and esp. AntiVirus
software. VTC recently tested current versions of on-demand scanners
for their ability to identify PC viruses.
Tests were performed on VTC's malware databases, which were frozen at
their status of *** October 2001 *** to give AV/AM producers a fair
chance to submit updates within the 6-week submission period (product
submission date: December 17, 2001).

The main test goal was to determine detection rates, reliability
(= consistency) of malware identification and reliability of
detection rates for submitted or publicly available scanners; this
test determined the detection rates for boot (DOS only), file, macro
and script viruses. It was also tested whether viruses packed with 5
popular compression methods (PKZIP, ARJ, LHA, RAR and CAB) would be
detected (and to what degree) by the scanners. Moreover, avoidance of
False Positive alarms on "clean" (= non-viral and non-malicious)
objects was also determined.

Remark: Result tables also include WinRAR-packed objects. As in past
tests, we used WinRAR 2.0 to pack macro and script viruses. When
packing ITW file viruses for the respective test, WinRAR crashed, so
we decided to pack these samples with WinRAR 2.9, which had at that
time been officially available for more than 3 months. Upon
evaluation, we were very surprised to observe that just one AV
product could unpack WinRAR 2.9 at all, and even that one highly
unreliably. Consequently, we did not include WinRAR in our
evaluation. As a consequence, there will be a special test of the
ability of AV products to properly recognize packed infected objects
(covering more packers).

Finally, a set of selected non-viral file, macro and script malware
(droppers, Trojan horses, intended viruses etc.) was used to
determine whether and to what degree AntiVirus products may be used
for protecting customers against Trojan horses and other forms of
malware.

VTC maintains, in close and secure cooperation with AV experts
worldwide, collections of boot, file, macro and script viruses as
well as related malware ("zoo") which have been reported to VTC or to
AV labs. Moreover, following the list of "In-The-Wild viruses"
(published on a regular basis by Wildlist.org), a collection of
viruses reported to be broadly visible is maintained to allow for
comparison with other tests; presently, this list does not report ITW
malware.

2. VTC Testbeds used in VTC test "2002-12":
===========================================

The current sizes of the different VTC testbeds (developed from
previous testbeds through inclusion of new viruses and malware and
some revision) are given in the following table (for detailed indices
of the VTC testbeds, see file "a3testbed.zip"):

Table ES1: Content of VTC test databases in test "2002-12":
==========================================================================
 "Full Zoo":   21,790 File Viruses             in 158,747 infected files
                8,001 different File Malware   in  18,277 files
                  664 Clean file objects for False Positive test
                7,306 Macro Viruses            in  25,231 infected documents
                  450 different Macro Malware  in     747 macro objects
                  329 Clean macro objects for False Positive test
                  823 different Script Viruses in   1,574 infected objects
                  117 different Script Malware in     202 script objects
 -----------------------------------------------------------------
 "ITW Zoo":        11 Boot Viruses             in     149 infected images/sectors
                   50 File Viruses             in     443 infected files
                  124 Macro Viruses            in   1,337 infected documents
                   20 Script Viruses           in     122 infected objects
============================================================================

For a survey of platforms, see A4tstdir.txt; for the content of the
respective testbeds, see A3TSTBED.zip (available for download).
Concerning the quality of viral testbeds, it is sometimes difficult
to assess the "virality" (= the ability of a given sample to
replicate at least twice under given constraints) of large "viral
zoo" databases, esp. as some viruses work only under very specific
conditions. We are glad to report that Dr. Vesselin Bontchev, Eugene
Kaspersky, Dr. Igor Muttik and other experts, esp. from the
"non"-organisation of cooperating AV experts (Computer Antivirus
Research Organisation, CARO), helped us significantly with critical
and constructive comments to establish viral testbeds whose residual
non-viral part should be very small.

Post-analysis and Quality Measure of VTC testbeds:
--------------------------------------------------

Those samples which some product did not properly detect are usually
sent to a trustworthy expert of that company for post-analysis. In
almost all cases, newer versions of that product were able to detect
the previously missed samples. We were esp. happy to receive comments
from Vesselin Bontchev, Eugene Kaspersky and Igor Muttik about some
specimens which should not have been included in one testbed (though
possibly belonging to another testbed); indeed, some samples
contained remainders of an improperly cleaned virus. After an
analysis of those comments, we can report that only a VERY SMALL
number of entries in a few testbeds does NOT belong there:

                       Number of          Number of objects   Inaccuracy
                       improper samples   in testbed          ratio
 ---------------------+------------------+------------------+----------
 Zoo File testbed:            25          I     158,747      I  0.0006%
 Zoo Macro testbed:            5          I      25,231      I  0.02%
 Zoo Script testbed:           3          I       1,337      I  0.2%
 ---------------------+------------------+------------------+----------

We also wish to thank the WildList Organization for supporting us
with their set of In-The-Wild viruses; the related results may help
users to compare VTC tests with other ITW-only tests.

3. Products participating in test "2002-12":
============================================

For test "2002-12", the following *** 20 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) were
tested under DOS, Windows-98, Windows-2000 and Linux, in 58 different
versions:

Table ES2: List of AV products in test "2002-12"
================================================
 Abbreviation/Product/Version              Tested under Platform
 ------------------------------------------------------------------
 AVA = AVAST 7.70/Lguard 32                DOS   W-98   W2k
 AVG = Grisoft AntiVirus 6.0               DOS   W-98   W2k
 AVK = GData AntivirenKit 3.0 (133)              W-98   W2k   Linux
 AVP = Kaspersky AntiVirus 3.0 (135)       DOS   W-98   W2k
 BDF (AVX) = Bit Defender                        W-98   W2k
 CMD = Command Software 4.62.4             DOS   W-98   W2k   Linux
 DRW = DrWeb 4.26                          DOS   W-98   W2k   Linux
 FPR = FProt 3.11b                         DOS   W-98   W2k   Linux
 FPW = FProt FP-WIN 3.11b                        W-98   W2k
 FSE = FSecure AntiVirus 1.00.1251               W-98   W2k   Linux
 IKA = Ikarus AntiVirus 5.01                            W2k
 INO = Inoculan 6.0                        DOS   W-98   W2k
 MR2 = MR2S 1.20                           DOS   W-98   W2k
 NAV = Norton AntiVirus (Corp. Ed.)        DOS   W-98   W2k
 NVC = Norman Virus Control 5.30.02        DOS   W-98   W2k
 OAV = Open AntiVirus                                         Linux
 PRO = Protector 7.1                             W-98   W2k
 RAV = Rumanian AntiVirus 8.02.001         DOS   W-98   W2k   Linux
 SCN = NAI VirusScan 4.16.0                DOS   W-98   W2k   Linux
 VSP = VirScan Plus 12.34.1                DOS   W-98   W2k
 ------------------------------------------------------------------
 Products tested:                           13  + 18  + 19  +  8 = 58
 ------------------------------------------------------------------

For details of the AV products, including the options used to
determine optimum detection rates, see A3SCNLS.TXT. For scanners
where results are missing, see 8problms.txt.
In general, AV products were either submitted or, where test versions
were available on the Internet, downloaded from the respective
ftp/http sites. A few scanners were not available, either in general
(e.g. TNT) or for this test; some of them were announced for
participation in a future test. Finally, very few AV producers
answered VTC's requests for submitting scanners with electronic
silence.

Concerning frequently asked questions, some AV producers deliberately
do NOT submit their products and even FORBID VTC to test them:

TrendMicro: TrendMicro Germany has again recently informed the author
of this report that they are NOT interested in VTC test
participation, as their scanner is deliberately trimmed to on-access
scanning and detection of In-The-Wild viruses. As VTC also emphasizes
the detection of zoo viruses, where their products would produce
"unfavourable results", there is still no intent to submit their
products. When 2 experts from TrendMicro asked us to test their
product but NOT to publish the results - reporting them only to the
TrendMicro lab - VTC, as a university lab devoted to public
information, had to refuse that "offer". Consequently, VTC refrains
from inviting TrendMicro to future test participation.

Panda: Panda has permitted tests to any institution and university
*except VTC*. Background: after the first participation of a Panda
product (with less than positive results), the Panda CEO requested
that VTC transfer the whole testbed to Panda labs as a condition for
future test participation. VTC's policy is to send missed samples to
participating companies, but sending the whole testbed is
inconsistent with VTC's security policy (see the VTC Code of
Conduct).

Sophos: Regrettably, we had to ask Sophos, producer of Sophos
AntiVirus (aka Sweep), to refrain from submitting their products, as
VTC does not wish to support an enterprise which deliberately
advocates and practices virus eXchange (vX) (whether rapid or not),
which according to VTC's "Code of Conduct" is an attitude practiced
by malevolent collectors and authors of malevolent software. Sophos
followed VTC's suggestion but prefers to call this request an
"exclusion". So far, there is no indication that Sophos has changed
its attitude concerning "vX". Some time ago, a lawyer of Sophos
(Germany) informed the author of this test report that they intend to
take legal action against him to force him to delete the above
statement from the Web.

The following paragraphs survey the essential findings in comparison
with previous VTC tests (performance over time), as well as some
relative "grading" of scanners for the detection of boot, file, macro
and script viruses, both in the full "zoo" and the "In-The-Wild"
testbeds, and of macro and script malware, as well as the detection
of ITW file and macro viruses in objects packed with ARJ, LHA, ZIP,
RAR, WinRAR and CAB. Finally, the ability of AV products to avoid
False Positive alarms is also analysed.

Detailed results, including precision and reliability of virus and
malware identification (and the grids used to assign a performance
level to each product), are presented in the following
(platform-specific) files:

   for DOS:                    6bdos.txt
   for W32:                    6fw98.txt, 6hw2k.txt
   comparison of W32 results:  6mcmp32.txt
   for Linux:                  6xlin.txt

In a rather detailed analysis, detection rates are presented for each
platform (operating system), and product behaviour is graded in
comparison with all products tested on the respective platform:

   evaluation/grading for DOS products:    7evaldos.txt
   evaluation/grading for W-98 products:   7evalw98.txt
   evaluation/grading for W-2k products:   7evalw2k.txt
   evaluation/grading for W32 products:    7evalcmp.txt
   evaluation/grading for LINUX products:  7evallin.txt

Under the scope of VTC's grading system, a "Perfect AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
  1) will detect ALL viral samples "In-The-Wild" AND at least 99.9%
     of zoo samples, in ALL categories (file, boot, macro and
     script-based viruses), always with the same high precision of
     identification and in every infected sample,
  2) will detect ALL ITW viral samples in compressed objects for all
     (now: 5) popular packers, and
  3) will NEVER issue a False Positive alarm on any sample which is
     not viral.
  Remark: detection of "exotic viruses" is presently NOT rated.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
  1) will be a "Perfect AntiVirus product", that is:
         100% ITW detection AND >99% zoo detection
     AND high precision of identification
     AND high precision of detection
     AND 100% detection of ITW viruses in compressed objects
     AND a 0% False Positive rate,
  2) AND it will also detect essential forms of malicious software,
     at least in unpacked forms, reliably and at high rates (>90%).
  Remark: detection of "exotic malware" is presently NOT rated.
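To make these two definitions concrete, the following minimal sketch
(in C) shows how the thresholds quoted above translate into a grading
decision; the structure and field names are assumptions made for this
illustration, not the actual grid used in the evaluation files:

    #include <stdbool.h>

    /* Illustrative only: thresholds taken from Definitions (1)/(2).
       Field names are assumptions made for this sketch.            */
    struct product_result {
        double itw_rate;        /* ITW detection over all categories, %       */
        double zoo_rate_min;    /* worst zoo rate (file/boot/macro/script), % */
        double packed_itw_rate; /* ITW detection inside ZIP/ARJ/LHA/RAR/CAB, % */
        int    false_positives; /* alarms raised on the clean testbeds        */
        double malware_rate;    /* non-viral malware detection, %             */
    };

    static bool is_perfect_av(const struct product_result *p)
    {
        return p->itw_rate        == 100.0   /* 1) all ITW samples        */
            && p->zoo_rate_min    >=  99.9   /* 1) at least 99.9% of zoo  */
            && p->packed_itw_rate == 100.0   /* 2) all packed ITW samples */
            && p->false_positives ==  0;     /* 3) no false alarms        */
    }

    static bool is_perfect_am(const struct product_result *p)
    {
        /* Definition (2): a perfect AV product that additionally
           detects essential malware reliably at high rates (>90%). */
        return is_perfect_av(p) && p->malware_rate > 90.0;
    }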
4. A serious problem: Flaw in Microsoft FindFirst/FindNext routine?
===================================================================

Since VTC tests started (including the work of Vesselin Bontchev in
the early 1990s), we have experienced many problems. Products were
often difficult to install and manage (see 8problms.txt). With the
growing size and diversity of the testbeds, it was to be expected
that problems would grow again. But there is one problem which does
not only affect testers with large viral databases (which is hardly
the situation which customers experience).

More often than ever before, we found after completion of a test run
that an AV product had NOT TOUCHED all parts of the directory tree,
for no obvious reason and always without any diagnosis (no exception
etc.). In such cases, we determined those parts of the testbed which
had not been processed and restarted the product on them
("postscan"). When some part still remained untouched after that, we
started a 2nd postscan for the remainder.

The most probable reason for this behaviour of SEVERAL products which
otherwise behave "smoothly" is: the methods offered by Microsoft to
traverse a directory, esp. the routines FindFirst and FindNext, DO
NOT WORK RELIABLY on large directories. This effect was first
reported by Eugene Kaspersky, but we have seen NO IMPROVEMENT OR
CORRECTION. Evidently, the problem seems to be related to the
invocation of those routines (FF/FN), and this may be affected by the
compiler or assembler used. Only through extensive inspection of the
resulting test logs ("test quality assurance") could we reduce the
impact of this MS flaw on our test results: but it is "NOT NATURAL"
that anyone must start an AV product more than once to be sure that
ALL potentially malicious objects have been checked!

This problem may not only show its dirty face for large virus/malware
testbeds. With growing sizes of customer directories, the likelihood
grows that NOT ALL OBJECTS are touched by any method using FF/FN. As
this is a problem also for many companies with large directories, WE
STRONGLY REQUEST THAT MICROSOFT CURES THIS FLAW.
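For readers unfamiliar with the routines in question, here is a
minimal sketch (in C, for illustration only; it is not taken from any
tested product) of the Win32 enumeration pattern an on-demand scanner
typically relies on. The observation of this section is that, on very
large directories, this loop sometimes delivered an incomplete file
list without signalling any error, so the number of visited files has
to be checked independently and a postscan started for whatever was
left out:

    #include <windows.h>
    #include <stdio.h>

    /* Count the plain files that FindFirstFile/FindNextFile actually
       deliver for one directory; "pattern" would be something like
       "C:\\testbed\\*" (an example path, not one used in the test).
       A scanner would open and scan fd.cFileName where we only count. */
    static long enumerate_files(const char *pattern)
    {
        WIN32_FIND_DATAA fd;
        long count = 0;
        HANDLE h = FindFirstFileA(pattern, &fd);

        if (h == INVALID_HANDLE_VALUE)
            return -1;                             /* nothing found, or error */

        do {
            if (!(fd.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY))
                count++;                           /* scanning would happen here */
        } while (FindNextFileA(h, &fd));

        if (GetLastError() != ERROR_NO_MORE_FILES) /* loop ended abnormally */
            fprintf(stderr, "enumeration stopped early\n");

        FindClose(h);
        return count;  /* compare against an independently obtained file count */
    }

A count returned by such a loop that is smaller than the number of
files known to be in the directory is exactly the symptom which
forced the postscans described above.

5.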
Results of AV products under DOS: ==================================== This is a summary of the essential findings for AV/AM products under DOS. For details see 7evaldos.txt. Meant as a perspective of product results, the following tables (DOS-A1/A2) list all results of DOS scanners for zoo detection of file, macro and script viruses, in last 10 VTC tests. Moreover, differences ("delta") in detection rates for those products which participated in last 2 tests are also given, and mean values are calculated. Table DOS-A1: File Virus Detection Rate in 10 VTC tests under DOS: ================================================================== ----------------- File (ZOO) Virus Detection ------------------------ SCAN 9702 9707 9802 9810 9903 9909 0004 0104 0212 Delta NER % % % % % % % % % % ------------------------------------------------------------------------- ALE 98.8 94.1 89.4 - - - - - - - ANT 73.4 80.6 84.6 75.7 - - 92.8 - - - AVA 98.9 97.4 97.4 97.9 97.6 97.4 97.5 95.2 96.9 +1.7 AVG 79.2 85.3 84.9 87.6 87.1 86.6 - 81.9 (*) - AVK - - - 90.0 75.0 - - 99.7 - - AVP 98.5 98.4 99.3 99.7 99.7 99.8 99.6 99.8 100~ +0.2 CMD - - - - - - 99.5 - 98.5 - DRW 93.2 93.8 92.8 93.1 98.2 98.3 - - (*) - DSE 99.7 99.6 99.9 99.9 99.8 - - - - - FMA - - - - - - - - - - FPR 90.7 89.0 96.0 95.5 98.7 99.2 99.6 97.8 98.8 +1.0 FSE - - 99.4 99.7 97.6 99.3 99.9 - - - FWN - - - - - - - - - - HMV - - - - - - - - - - IBM 93.6 95.2 96.5 - - - - - - - INO - - 92.0 93.5 98.1 94.7 94.6 91.0 93.8 +2.8 IRS - 81.4 74.2 - 51.6 - - - - - ITM - 81.0 81.2 65.8 64.2 - - - - - IVB 8.3 - - - 96.9 - - - - - MR2 - - - - - 65.4 - - (*) - NAV 66.9 67.1 97.1 98.1 77.2 96.0 93.3 90.8 98.4 +7.6 NOD - - - 96.9 - 96.9 98.3 - - - NVC 87.4 89.7 94.1 93.8 97.6 - 99.1 - (*) - PAN - - 67.8 - - - - - - - PAV - 96.6 98.8 - 73.7 98.8 98.7 99.9 - - PCC - - - - - - - - - - PCV 67.9 - - - - - - - - - PRO - - - - 35.5 - - - - - RAV - - - 71.0 - - - - 96.7 - SCN 83.9 93.5 90.7 87.8 99.8 97.1 99.9 99.8 99.8 0.0 SWP 95.9 94.5 96.8 98.4 - 99.0 98.4 - - - TBA 95.5 93.7 92.1 93.2 - - - - - - TSC - - 50.4 56.1 39.5 51.6 - - - - TNT 58.0 - - - - - - - - - VDS - 44.0 37.1 - - - - - - - UKV - - - - - - - - - - VET - 64.9 - - 65.3 - - - - - VIT - - - - - - 7.6 - - - VRX - - - - - - - - - - VBS 43.1 56.6 - 35.5 - - - - - - VHU 19.3 - - - - - - - - - VSA - - 56.9 - - - - - - - VSP - - - 76.1 71.7 79.6 - - 61.6 - VSW - - 56.9 - - - - - - - VTR 45.5 - - - - - - - - - XSC 59.5 - - - - - - - - - ------------------------------------------------------------------------- Mean 74.2 84.8 84.4 85.4 81.2 90.6 98.3 95.1 93.5% +2.2% ------------------------------------------------------------------------- Remarks:concerning rounding, "100~" denotates that the result is 100% AFTER rounding whereas "100.0%" denotates that the result is 100.0% EXACTLY (rated "perfect"). For abbreviations of products (code names), see appendix A5CodNam.txt. For test problems, see 8problm.txt. 
Table DOS-A2: Macro/Script Virus Detection Rate in last 9 VTC tests under DOS: ============================================================================== ------------------ Macro Virus Detection ------------------ + - ScriptVirus Detection - SCAN 9702 9707 9802 9810 9903 9909 0004 0008 0104 0110 0212 DeltaI 0008 0104 0110 0212 Delta NER % % % % % % % % % % % % I % % % % % -----------------------------------------------------------------+-------------------------- ALE 96.5 66.0 49.8 - - - - - - - - - I - - - - - ANT 58.0 68.6 80.4 56.6 - - 85.9 93.3 - 97.0 - - I 55.2 - 81.8 - - AVA 99.3 98.2 80.4 97.2 95.9 94.6 93.7 - 92.0 93.0 92.8 -0.2 I - 30.0 33.7 31.5 -2.2 AVG 25.2 71.0 27.1 81.6 82.5 96.6 - - 98.3 98.4 98.1 -0.3 I - 57.9 62.9 63.9 +1.0 AVK - - - 99.7 99.6 - - 100~ 100~ 100% - - I 91.5 99.4 100% - - AVP 99.3 99.0 99.9 100% 99.8 100% 99.9 - 100~ 100% 100~ 0.0 I - 99.8 100% 98.9 CMD - - - - - 99.5 100% 100~ - 100~ 99.9 -0.1 I 93.5 - 93.9 98.1 DRW 90.2 98.1 94.3 99.3 98.3 - 98.4 97.6 98.0 99.5 99.4 -0.1 I 60.8 95.6 95.4 94.7 DSE 97.9 98.9 100% 100% 100% - - - - - - - I - - - - - FMA 98.6 98.2 99.9 - - - - - - - - - I - - - - - FPR 43.4 36.1 99.9 99.8 99.8 99.7 100% 100~ 100% 100~ 100~ 0.0 I 90.5 96.9 94.6 88.7 -5.9 FSE - - 99.9 90.1 99.6 97.6 99.9 - - - - - I - - - - - FWN 97.2 96.4 91.0 85.7 - - - - - - - - I - - - - - HMV - - 98.2 99.0 99.5 - - - - - - - I - - - - - IBM 65.0 88.8 99.6 - - - - - - - - - I - - - - - INO - - 90.3 95.2 99.8 99.5 99.7 99.7 99.3 - 99.9 - I 77.8 66.0 - 94.7 - IRS - 69.5 48.2 - 89.1 - - - - - - - I - - - - - ITM - 81.8 58.2 68.6 76.3 - - - - - - - I - - - - - IVB - - - - - - - - - - - - I - - - - - MR2 - - - - - 69.6 - - 44.2 40.8 37.9 -2.9 I - 85.1 83.3 81.0 -2.3 NAV 80.7 86.4 98.7 99.8 99.7 98.6 97.4 97.0 93.8 99.5 99.8 +0.3 I 24.8 31.2 94.2 97.0 +2.8 NOD - - - - 99.8 100% 99.4 - - - - - I - - - - - NVC 13.3 96.6 99.2 90.8 - 99.6 99.9 99.9 99.8 - 99.8 - I 83.7 88.5 - 87.6 - PAN - - 73.0 - - - - - - - - - I - - - - - PAV - - 93.7 100% 99.5 98.8 99.9 - 100~ 100% - - I - 99.8 100% - - PCC - 67.6 - - - - - - - - - - I - - - - - PCV - - - - - - - - - - - - I - - - - - PRO - - - - 81.5 - - - - - - - I - - - - - RAV - - - 99.5 99.2 - - - - 99.5 99.9 +0.4 I - - 82.5 96.1+13.6 SCN 95.1 97.6 99.0 98.6 100% 100% 100% 100~ 100% 100% 100% 0.0 I 85.6 100% 99.8 99.6 -0.2 SWP 87.4 89.1 98.4 98.6 - 98.4 98.4 - - - - - I - - - - - TBA 72.0 96.1 99.5 98.7 - - - - - - - - I - - - - - TSC - - 81.9 76.5 59.5 69.6 - - - - - - I - - - - - TNT - - - - - - - - - - - - I - - - - - VDS 16.1 9.9 8.7 - - - - - - - - - I - - - - - UKV - - - - - - - 0.0 - - - - I 0.0 - - - - VET - 94.0 97.3 97.5 97.6 - - - - - - - I - - - - - VIT - - - - - - - - - - - - I - - - - - VRX - - - - - - - - - - - - I - - - - - VBS - - - - - - - - - - - - I - - - - - VHU - - - - - - - - - - - - I - - - - - VSA - - 80.6 - - - - - - - - - I - - - - - VSP - - - - - - - - 0.0 0.0 0.0 0.0 I - 85.3 84.0 81.2 -2.8 VSW - - 83.0 - - - - - - - - - I - - - - - XSC - - - - - - - - - - - - I - - - - - -----------------------------------------------------------------+--------------------------- Mean 69.6 80.9 83.8 89.6 93.6 88.2 98.0 98.6 93.8 87.7 86.7 -0.3%I 66.4 79.7 86.2 84.9% 0.4% Without extreme values: 94.4 94.0% - I 84.9% - -----------------------------------------------------------------+--------------------------- General observation: the continuing decrease of detection rates indicates, that AV companies dont further invest into maintaining scanners on this platform on the required quality level. 
This is probably influenced by the fact that DOS is no longer in
broad use, with the number of users steadily decreasing.

   ******************************************************
   In order to support users with valid information about
   the quality of products on RELEVANT PLATFORMS, the VTC
   team has decided to stop testing DOS scanners with
   test 2002-12.
   ******************************************************

After having started our first DOS test in 1992 (then performed by
Vesselin Bontchev/VTC), DOS was a stable platform for about 10 years
(we do not expect that Windows 32 platforms will survive equally
long!).

Findings DOS.1: General development of zoo virus detection rates:
-----------------------------------------------------------------
Evidently, AV producers do not invest into DOS products. For ALL
virus classes, detection rates are declining, with mean detection
rates
      for file zoo viruses    down to 93.5%,
      for macro zoo viruses   down to 86.7%, and
      for script zoo viruses  down to 84.9%.
Such detection rates are unacceptably low.
----------------------------------------------------
Concerning detection of file zoo viruses, NO product detects ALL
viruses and is "perfect".
But 2 products detect almost all (>99%) viruses
and are "excellent":                                AVP,SCN
5 more products are "very good":                    AVA,CMD,FPR,NAV,RAV
----------------------------------------------------
Concerning detection of macro zoo viruses,
1 product detects ALL viruses and is "perfect":     SCN
8 more products detect >99% of macro zoo viruses
and are rated "excellent":              AVP,FPR;CMD,INO,RAV;NAV,NVC;DRW
----------------------------------------------------
Concerning detection of script zoo viruses, NO product detects ALL
viruses and is "perfect".
1 product detects >99% of viruses and is "excellent":  SCN
And 3 more products detect >95% of script viruses
and are "very good":                                   AVP,NAV,RAV
----------------------------------------------------
Overall, NO product detects ALL file, macro and script viruses.
But 1 product detects >99% of all file, macro and script viruses and
is rated "overall excellent":                       SCN

Findings DOS.2: Development of ITW virus detection rates:
---------------------------------------------------------
3 AV products (out of 13) detect ALL In-The-Wild boot, file, macro
and script viruses in ALL instantiations (files)
and are rated "perfect":                            AVP,NAV,SCN
And 2 AV products detect ALL ITW viruses on all platforms but miss a
few (<1%) files and are rated "excellent":          RAV,INO
--------------------------------------------
8 AV products detect ALL ITW boot viruses in ALL samples
and are rated "perfect":                AVP,CMD,DRW,FPR,NAV,NVC,SCN,VSP
--------------------------------------------
3 AV products detect ALL ITW file viruses in all infected objects
and are rated "perfect":                            AVP,NAV,SCN
2 AV products detect ALL ITW file viruses on all platforms but miss a
few (<1%) files and are rated "excellent":          INO,RAV
--------------------------------------------
5 products can be rated "perfect" concerning detection
of ITW macro viruses:                               AVP,DRW,INO,NAV,SCN
5 AV products detect ALL ITW macro viruses but miss a few (<1%) files
and are rated "excellent":                          AVG,CMD,FPR,NVC,RAV
--------------------------------------------
Concerning detection of ITW script viruses, 9 products are rated
"perfect" as they detect ALL viruses
in ALL samples:                     AVG,AVP,CMD,DRW,FPR,NAV,NVC,RAV,SCN
And 1 product detects all ITW viruses but misses one sample
and is "excellent":                                 INO

Findings DOS.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
NO "perfect" overall scanner:                       ---
1 "excellent" overall scanner:                      SCN
3 "very good" overall scanners:                     AVP,NAV,RAV

Findings DOS.4: Performance of DOS scanners by virus classes:
------------------------------------------------------------
Perfect scanners for file zoo+ITW:        ---
Excellent scanners for file zoo+ITW:      AVP,SCN
Very Good scanners for file zoo+ITW:      FPR,CMD,NAV,AVA,RAV
Perfect scanners for macro zoo+ITW:       SCN
Excellent scanners for macro zoo+ITW:     AVP,FPR,CMD,INO,RAV,NAV,NVC,DRW
Very Good scanners for macro zoo+ITW:     AVG
Perfect scanners for script zoo+ITW:      ---
Excellent scanners for script zoo+ITW:    SCN
Very Good scanners for script zoo+ITW:    AVP,NAV,RAV

Findings DOS.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Detection of packed viral objects needs significant improvement:
Perfect packed ITW file/macro virus detectors:      AVP,SCN
Excellent packed ITW file/macro virus detector:     DRW
Very Good packed ITW file/macro virus detector:     RAV
--------------------------------------------------------------
"Perfect" packed file virus detectors:              AVP,SCN
"Excellent" packed file virus detector:             DRW
"Very Good" packed file virus detector:             RAV
--------------------------------------------------------------
"Perfect" packed macro virus detectors:             AVP,CMD,FPR,SCN
"Excellent" packed macro virus detector:            DRW
"Very Good" packed macro virus detectors:           AVG,RAV

Findings DOS.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False Positive alarms is improving, though it must still
be regarded as insufficient.
Generally, for both file and macro objects, the FP-avoiding "perfect"
DOS scanners are:                        AVA,INO,NAV,SCN,VSP
--------------------------------------------------
"Perfect" file-FP avoiding DOS scanners:
                         AVA,AVP,CMD,FPR,INO,NAV,RAV,SCN,VSP
"Perfect" macro-FP avoiding DOS scanners:
                         AVA,AVG,INO,NAV,SCN,VSP

Findings DOS.7: Detection rates for file/macro malware:
-------------------------------------------------------
File, macro and script malware detection under DOS is only slowly
improving. Mean detection rates are significantly lower than for
virus detection:
      Mean file malware detection:      73.7%
      Mean macro malware detection:     81.9%
      Mean script malware detection:    49.7%
---------------------------------------------------
NO product detects ALL file, macro and script malware samples
and is "perfect":                        ---
2 products are rated "excellent":        AVP,SCN
1 product is rated "very good":          RAV
---------------------------------------------------
Concerning file malware detection:
NO product is rated "perfect":           ---
4 products are rated "excellent":        AVP,SCN,FPR,CMD
1 product is rated "very good":          RAV
---------------------------------------------------
Concerning macro malware detection:
2 products detect ALL macro malware samples
and are rated "perfect":                 AVP,SCN
7 products are rated "excellent":        CMD,FPR,RAV,NVC,INO,NAV,DRW
2 products are rated "very good":        AVA,AVG
---------------------------------------------------
Concerning script malware detection:
NO product detects ALL script malware samples
and can be rated "perfect":              ---
3 products are rated "excellent":        AVP,NAV,SCN
1 product is rated "very good":          RAV

Grading DOS products according to their detection performance:
==============================================================

Under the scope of VTC's grading system (see 4), we summarize our
results for DOS-related scanners:

 ********************************************************************
 In VTC test "2002-12", we found  **** NO perfect DOS AV product ****
                     AND we found **** NO perfect DOS AM product ****
 ********************************************************************

But several products seem to approach our definition on a rather high
level (taking into account the highest grade "perfect", defined at
the 100% level, and "excellent", defined as 99% for virus detection
and 90% for malware detection):

 Test category:           "Perfect"                  "Excellent"
 -----------------------------------------------------------------
 DOS boot ITW test:       AVP,CMD,DRW,FPR,
                          NAV,NVC,SCN,VSP
 DOS file ITW test:       AVP,NAV,SCN                INO,RAV
 DOS macro ITW test:      AVP,DRW,INO,NAV,SCN        AVG,CMD,FPR,NVC,RAV
 DOS script ITW test:     AVG,AVP,CMD,DRW,FPR,       INO
                          NAV,NVC,RAV,SCN
 -----------------------------------------------------------------
 DOS file zoo test:       ---                        AVP,SCN
 DOS macro zoo test:      SCN                        AVP,FPR,CMD,INO,
                                                     NAV,NVC,DRW,RAV
 DOS script zoo test:     ---                        SCN
 -----------------------------------------------------------------
 DOS file pack test:      AVP,SCN                    DRW
 DOS macro pack test:     AVP,CMD,FPR,SCN            DRW
 DOS file FP avoidance:   AVA,AVP,CMD,FPR,INO,       ---
                          NAV,RAV,SCN,VSP
 DOS macro FP avoidance:  AVA,AVG,INO,NAV,SCN,VSP    ---
 ------------------------------------------------------------------
 DOS file malware test:   ---                        AVP,SCN,FPR,CMD
 DOS macro malware test:  AVP,SCN                    CMD,FPR,RAV,
                                                     NVC,INO,NAV,DRW
 DOS script malware test: ---                        AVP,NAV,SCN
 ------------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this DOS test with a simple
algorithm, by counting the majority of
places (weighing "perfect" twice and "excellent" once): ************************************************************ "Perfect" DOS AntiVirus product: === (22 points) "Excellent" DOS AV products: 1st place: SCN (20 points) 2nd place: AVP (16 points) 3rd place: NAV (13 points) 4th place: CMD,FPR (10 points) 6th place: DRW,INO ( 9 points) 8th place: RAV ( 7 points) 8th place: NVC,VSP ( 6 points) 11th place: AVG ( 5 points) 12th place: AVA ( 4 points) ************************************************************ "Perfect" DOS AntiMalware product: =NONE= (28 points) "Excellent" DOS AntiMalware product: 1st place: SCN (24 points) 2nd place: AVP (20 points) 3rd place: NAV (15 points) 4th place: CMD,FPR (12 points) 6th place: DRW,INO (10 points) 8th place: RAV ( 8 points) 9th place: NVC ( 7 points) ************************************************************ 7. Results of on-demand detection under Windows-98 (W98): ========================================================= This is a summary of the essential findings for AV/AM products under W98. For details see 7evalw98.txt. Meant as a perspective of product results, the following tables (W98-A1/A2) list all results of W98 scanners for zoo detection of file, macro and script viruses, in last 7 VTC tests. Moreover, differences ("delta") in resp. detection rates for those products which participated in last 2 tests are also given, and mean values are calculated. Table W98-A1: File Virus Detection Rate in last 7 VTC tests under W-98: ======================================================================= Scan --------- File Virus Detection ---------- ner 9810 9903 9909 0004 0104 0212 DELTA -------------------------------------------------- ACU - - - - - - - ADO - - - - 99.9 - - AN5 - - 87.2 - - - - ANT 91.3 - 86.5 92.8 - - - ANY - - - - - - - ATR - - - - - - - AVA/3 96.6 97.6 97.2 97.5 95.2 96.2 +1.0 AVG - 87.3 87.0 85.4 81.9 80.6 -1.3 AVK 99.6 90.8 99.8 99.7 99.8 99.9 +0.1 AVP 99.9 99.9 99.8 99.9 99.9 100~ +0.~ BDF=AVX - 74.2 75.7 77.4 - 82.9 - CLE - - - - 0.1 - - CMD - - 98.4 99.6 97.8 98.5 +0.7 DSS/DSE 99.9 99.9 * 99.8 99.9 - - DRW/DWW - 89.5 98.3 96.7 98.5 98.3 -0.2 ESA - - - 58.0 - - - FPR/FMA - 93.9 99.4 99.7 97.8 98.8 +1.0 FPW - - 99.2 99.6 97.8 98.8 +1.0 FSE 99.8 100.0 99.9 100.0 99.7 100~ +0.3 FWN - - - - - - HMV - - - - - - IBM 92.8 * * * * * * INO 93.5 98.1 97.1 98.7 97.9 98.7 +0.8 IRS 96.7 97.6 - - - - - ITM - 64.2 - - - - - IVB - - - - - - - MKS - - - - - - - MR2 - - 65.9 - 50.1 1.3 -48.8% (!) NAV - 96.8 97.6 96.8 93.9 11.6 -82.3% (!) NOD - 97.6 98.3 98.3 - - - NV5 - - 99.0 - - - - NVC 93.6 97.0 99.0 99.1 98.1 97.8 -0.3 PAV 98.4 99.9 99.6 100.0 99.7 - - PCC - 81.2 - - - - - PER - - - - - - - PRO - 37.3 39.8 44.6 69.9 69.5 -0.4 QHL - - - - - - - RAV 84.9 - 86.9 86.5 93.6 96.7 +3.1 SCN 86.6 99.8 99.7 100.0 99.9 99.8 -0.1 SWP 98.4 - 99.0 99.6 - - - TBA 92.6 * * * * * * TSC - 55.3 53.8 - - - - VBS - - - - - - - VBW - 26.5 - - - - - VET - 66.3 * * * * * VSP - 86.4 79.7 78.1 64.9 4.9 -60.0% (!) -------------------------------------------------------------- Mean 95.0 84.2 89.7 91.6 87.4 79.7% -0.0% (without !) Mean (rate>10%]: 89.3% - -------------------------------------------------------------- (!) Results of 3 products - MR2,NAV and VSP - are influenced by the fact that these products crashed 3 times after having scanned only a minor part of the file testbed. (*) Products no longer available. 
Table W98-A2: Comparison: Macro/Script Virus Detection Rate in last 7 aVTC tests under W-98: =========================================================================================== Scan ------------- Macro Virus Detection ---------------- + --- ScriptVirusDetection --- ner 9810 9903 9909 0004 0008 0104 0110 0212 DELTAI 0008 0104 0110 0212 DELTA ----------------------------------------------------------+----------------------------- ACU - 97.6 - - - - - - - I - - - - - ADO - - - - - 99.9 - - - I - 99.8 - - - AN5 - - 89.3 - - - - - - I - - - - - ANT 84.3 - 89.5 90.2 96.4 - 97.4 - - I 55.2 - 81.8 - - ANY 70.7 - - - - - - - - I - - - - - ATR - - - - - - - - - I - 2.7 - - - AVA/3 96.7 95.9 93.9 94.3 94.1 95.7 97.7 97.8 +0.1 I 15.0 30.0 33.7 31.5 -2.2 AVG - 82.5 96.6 97.5 97.9 98.3 98.4 98.1 -0.3 I - 57.9 62.9 63.9 +1.0 AVK 99.6 99.6 100.0 99.9 100~ 100~ 100% 100~ -0.~ I 91.2 99.8 100% 99.0 -1.0 AVP 100.0 99.2 100.0 99.9 100~ 100~ 100~ 100~ 0.0 I 88.2 99.8 100% 98.9 -1.1 BDF=AVX - - 98.7 94.5 99.0 - 99.1 99.0 -0.1 I 61.4 - 70.1 72.4 +2.3 CLE - - - - - 0.0 - - - I 4.2 6.3 - - - CMD - - 99.6 100.0 100% 100% 100~ 99.9 -0.~ I 93.5 96.9 93.9 89.1 -4.8 DRW - 98.3 98.8 98.4 - 98.0 99.5 99.4 -0.1 I - 95.6 95.4 94.7 -0.7 DSE 100.0 100.0 * 100.0 100% 99.9 97.8 - - I 95.8 100% 73.0 - - ESA - - - 88.9 - - - - - I - - - - - FPR 92.4 99.8 99.7 100.0 - 100% 100~ 100~ 0.0 I - 96.9 94.6 88.7 -5.9 FPW - - 99.9 100.0 100% 100% 100~ 100~ 0.0 I 90.8 96.9 94.6 88.7 -5.9 FSE 100.0 100.0 100.0 100.0 100% 100% 100% 100~ 0.0 I 96.7 100% 100% 99.5 -0.5 FWN 99.6 99.7 99.9 99.8 - - - - - I - - - - - HMV - 99.5 - - - - - - - I - - - - - IBM 94.5 * * * * * * * * I * * * * * INO 88.1 99.8 98.1 99.7 99.8 99.7 99.9 99.9 0.0 I 78.1 92.7 95.1 94.7 -0.4 IRS 99.0 99.5 - - - - - - - I - - - - - ITM - - - - - - - - - I - - - - - IVB 92.8 95.0 - - - - - - - I - - - - - MKS - - - 97.1 - 44.2 - - - I - - - - - MR2 - - 64.9 - - - 40.8 37.9 -2.9 I - 85.1 83.3 81.0 -2.3 NAV 95.3 99.7 98.7 98.0 97.7 97.0 99.5 99.8 +0.3 I 36.6 65.5 94.2 97.0 +2.8 NOD - 99.8 100.0 99.4 - - - - - I - - - - - NV5 - - 99.6 - - - - - - I - - - - - NVC - 99.1 99.6 99.9 99.9 99.8 99.8 99.8 0.0 I 83.7 88.5 91.3 87.6 -3.7 PAV 99.5 99.5 86.7 99.9 100~ 99.5 100% - - I 90.2 99.8 100% - - PCC - 98.0 - - - - - - - I - - - - - PER - - - 53.7 67.2 68.5 - - - I 18.0 22.0 - - - PRO - 58.0 61.9 67.4 69.1 67.1 - 72.7 - I 12.1 40.7 - 59.8 - QHL - - - 0.0 - 0.0 0.0 - - I 6.9 - - - - RAV 92.2 - 98.1 97.9 96.9 99.6 99.5 99.9 +0.4 I 47.1 84.9 82.5 96.1+13.6 SCN 97.7 100.0 99.8 100.0 100% 100% 100% 100% 0.0 I 95.8 100% 99.8 99.6 -0.2 SWP 98.6 - 98.5 98.6 - - - - - I - - - - - TBA 98.7 * * * * * * * * I - - - - - TSC - 76.5 64.9 - - - - - - I - - - - - VBS 41.5 - - - - - - - - I - - - - - VBW 93.4 - - - - - - - - I - - - - - VET - 97.6 * * * * * * * I - - - - - VSP - 0.4 0.3 - - 0.0 0.0 0.~ 0.~ I - 85.3 84.0 81.2 -2.8 ----------------------------------------------------------+----------------------------- Mean 92.1 90.3 93.5 95.0 95.6 84.7 87.1 89.1% -0.~%I 61.0 76.0 86.5 84.6% -0.9% Without extreme low detectors: 96.3 94.4% - I 84.6% - ----------------------------------------------------------+----------------------------- Remark: for abbreviations of products (code names), see appendix A5CodNam.txt. Findings W98.1: General development of macro/script zoo detection rates: ------------------------------------------------------------------------ Mean Detection rates for file and script viruses have decreased while detection rates for macro show a slight improvement. 
All detection rates are at an unacceptably low level. Mean detection
rates remain unacceptably low:
      mean file zoo virus detection rate:    79.7%
      mean macro virus detection rate:       89.1%
      mean script virus detection rate:      84.6%
------------------------------------------------
Concerning file virus detection:
NO product detects ALL file viruses.
4 products detect ALMOST all viruses (>99%)
and are rated "excellent":               AVP,FSE,AVK,SCN
------------------------------------------------
Concerning macro virus detection only:
1 product detects ALL macro zoo viruses
and is rated "perfect":                  SCN
12 products detect >99% of macro zoo viruses
and are rated "excellent":               AVK,AVP,FPR,FPW,FSE,CMD,INO,
                                         RAV,NVC,NAV,DRW,BDF
-------------------------------------------------
Concerning script virus detection:
NO product detects ALL script viruses.
3 products detect ALMOST all viruses (>99%)
and are rated "excellent":               SCN,AVK,FSE

Findings W98.2: Development of ITW file/macro/script virus detection rates:
---------------------------------------------------------------------------
6 AV products (out of 19) detect ALL ITW file, macro and script
viruses in ALL samples and are rated "perfect":
                                         AVK,AVP,DRW,FSE,NAV,SCN
2 more products detect ALL ITW viruses in ALMOST all samples (>99%)
and are rated "excellent":               INO,RAV
---------------------------------------------
Concerning ITW file virus detection only, 6 products are "perfect" as
they detect ALL viruses in ALL samples:  AVK,AVP,DRW,FSE,NAV,SCN
2 products detect ALL ITW viruses in ALMOST all samples (>99%)
and are rated "excellent":               INO,RAV
---------------------------------------------
Concerning ITW macro virus detection only, 7 products detect ALL
viruses in ALL files:                    AVK,AVP,DRW,FSE,INO,NAV,SCN
9 products detect ALL ITW viruses in ALMOST all samples (>99%)
and are rated "excellent":          AVG,BDF,CMD,FPR,FPW,NVC,RAV,AVA,PRO
---------------------------------------------
Concerning ITW script virus detection only, 12 products detect ALL
viruses in ALL files:               AVG,AVK,AVP,CMD,DRW,
                                    FPR,FPW,FSE,NAV,NVC,RAV,SCN
1 product detects ALL ITW viruses in ALMOST all samples (>99%)
and is rated "excellent":                INO

Findings W98.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
NO W98 product is rated overall "perfect":   ---
3 products are rated "excellent":            SCN,FSE,AVK
2 products are rated "very good":            AVP,RAV

Findings W98.4: Performance of W98 scanners by virus classes:
-------------------------------------------------------------
Perfect scanners for file zoo:      ---
Excellent scanners for file zoo:    AVP,FSE,AVK,SCN
Very Good scanners for file zoo:    FPR,FPW,INO,CMD,DRW,NVC,RAV,AVA
Perfect scanners for macro zoo:     SCN
Excellent scanners for macro zoo:   AVK,AVP,FPR,FPW,FSE,CMD,INO,RAV,
                                    NAV,NVC,DRW,BDF
Very Good scanners for macro zoo:   AVG,AVA
Perfect scanners for script zoo:    ---
Excellent scanners for script zoo:  SCN,FSE,AVK
Very Good scanners for script zoo:  AVP,NAV,RAV

Findings W98.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Concerning OVERALL detection of packed file AND macro viruses,
4 products are "perfect":                AVK,AVP,BDF,SCN
And 1 product is "excellent":            DRW
No product is "very good":               ---
-------------------------------------------------------
Concerning detection of packed FILE viruses:
4 products are "perfect":                AVK,AVP,BDF,SCN
2 products are "excellent":              DRW,RAV
-------------------------------------------------------
Concerning detection of packed MACRO viruses:
7 products are "perfect":                AVK,AVP,BDF,CMD,FPR,FPW,SCN
1 product is "excellent":                DRW
1 product is "very good":                FSE

Findings W98.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False Positive alarms is rather well developed, at least
for file-FP avoidance.
8 overall FP-avoiding "perfect" W98 scanners:
                                    AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
---------------------------------------------------
Concerning file-FP avoidance, 17 (of 18) products are "perfect":
                                    AVA,AVG,AVK,AVP,BDF,CMD,FPR,FPW,
                                    FSE,INO,MR2,NAV,NVC,PRO,RAV,SCN,VSP
And 1 more product is "excellent":  DRW
---------------------------------------------------
Concerning macro-FP avoidance, 8 products are "perfect":
                                    AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
And 2 more products are "excellent": AVK,RAV

Findings W98.7: Detection rates for file/macro malware:
-------------------------------------------------------
Generally, detection of malware is insufficient, as is indicated by
the mean detection rates:
      for file malware:      75.0%
      for macro malware:     84.5%
      for script malware:    51.4%
---------------------------------------------------
Concerning overall malware detection (including file, macro AND
script malware) under W98:
0 products are "perfect":                ---
4 products are "excellent":              FSE,AVK,AVP,SCN
1 product is "very good":                RAV
---------------------------------------------------
Concerning only file malware detection,
0 products are "perfect":                ---
7 products are "excellent":              FSE,AVK,AVP,SCN,FPR,FPW,CMD
2 products are "very good":              INO,RAV
---------------------------------------------------
Concerning only macro malware detection,
4 products are "perfect":                AVK,AVP,FSE,SCN
10 products are "excellent":             CMD,FPR,FPW,RAV,NVC,INO,NAV,
                                         BDF,DRW,AVA
1 product is rated "very good":          AVG
---------------------------------------------------
Concerning only script malware detection,
0 products are "perfect":                ---
5 products are "excellent":              FSE,SCN,AVK,AVP,NAV
1 product is rated "very good":          RAV

Grading W98 products according to their detection performance:
==============================================================

Under the scope of VTC's grading system (see 4), we summarize our
results for W98-related scanners:

 *******************************************************************
 In VTC test "2002-12", we found *** NO perfect W-98 AV product ***
                    and we found *** NO perfect W-98 AM product ***
 *******************************************************************

But several products seem to approach our definition on a rather high
level (taking into account the highest grade "perfect", defined at
the 100% level, and "excellent", defined as 99% for virus detection
and 90% for malware detection):

 Test category:           "Perfect"                   "Excellent"
 -----------------------------------------------------------------
 W98 file ITW test:       AVK,AVP,DRW,FSE,            INO,RAV
                          NAV,SCN
 W98 macro ITW test:      AVK,AVP,DRW,FSE,            AVG,BDF,CMD,FPR,FPW,
                          INO,NAV,SCN                 NVC,RAV,AVA,PRO
 W98 script ITW test:     AVG,AVK,AVP,CMD,DRW,FPR,    INO
                          FPW,FSE,NAV,NVC,RAV,SCN
 -----------------------------------------------------------------
 W98 file zoo test:       ---                         AVP,FSE,AVK,SCN
 W98 macro zoo test:      SCN                         AVK,AVP,FPR,FPW,FSE,CMD,
                                                      INO,RAV,NAV,NVC,DRW,BDF
 W98 script zoo test:     ---                         SCN,FSE,AVK
 -----------------------------------------------------------------
 W98 file pack test:      AVK,AVP,BDF,SCN             DRW,RAV
 W98 macro pack test:     AVK,AVP,BDF,CMD,            DRW
                          FPR,FPW,SCN
 W98 file FP avoidance:   AVA,AVG,AVK,AVP,BDF,        DRW
                          CMD,FPR,FPW,FSE,INO,MR2,
                          NAV,NVC,PRO,RAV,SCN,VSP
 W98 macro FP avoidance:  AVA,AVG,BDF,INO,NAV,        AVK,RAV
                          PRO,SCN,VSP
 -----------------------------------------------------------------
 W98 file malware test:   ---                         FSE,AVK,AVP,SCN,
                                                      FPR,FPW,CMD
 W98 macro malware test:  AVK,AVP,FSE,SCN             CMD,FPR,FPW,RAV,NVC,
                                                      INO,NAV,BDF,DRW,AVA
 W98 script malware test: ---                         FSE,SCN,AVK,AVP,NAV
 -----------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this W-98 test with a simple
algorithm, by counting the majority of places (weighing "perfect"
twice and "excellent" once):

 ************************************************************
 "Perfect" W-98 AntiVirus product:    =NONE=     (20 points)

 "Excellent" W-98 AntiVirus products:
     1st place:  SCN           (18 points)
     2nd place:  AVK           (16 points)
     3rd place:  AVP           (14 points)
     4th place:  FSE,NAV       (11 points)
     6th place:  BDF,DRW       (10 points)
     8th place:  INO,RAV       ( 9 points)
    10th place:  CMD,FPR,FPW   ( 8 points)
    13th place:  AVG           ( 7 points)
    14th place:  NVC           ( 6 points)
    15th place:  AVA,PRO       ( 5 points)
    17th place:  VSP           ( 4 points)
    18th place:  MR2           ( 2 points)
 ************************************************************
 "Perfect" W-98 AntiMalware product:  =NONE=     (26 points)

 "Excellent" W-98 AntiMalware products:
     1st place:  SCN                   (22 points)
     2nd place:  AVK                   (20 points)
     3rd place:  AVP                   (18 points)
     4th place:  FSE                   (15 points)
     5th place:  NAV                   (13 points)
     6th place:  BDF,DRW               (11 points)
     8th place:  CMD,FPR,FPW,INO,RAV   (10 points)
    13th place:  NVC                   ( 7 points)
    14th place:  AVA                   ( 6 points)
 ************************************************************

8. Results of on-demand detection under Windows-2000 (W2k):
===========================================================

This is a summary of the essential findings for AV/AM products under
W-2k. For details see 7evalw2k.txt.

Meant as a perspective of product results, the following table
(W2k-A) lists all results of W2k scanners for zoo detection of file,
macro and script viruses in the last 4 VTC tests. Moreover,
differences ("delta") in the respective detection rates for those
products which participated in the last 2 tests are also given, and
mean values are calculated.
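As a reading aid for the "Delta" columns and the "Mean" rows in the
result tables (here and in the preceding sections), this is a minimal
sketch of the arithmetic in C; the function names and the handling of
the 10% cut-off are illustrative assumptions, not the evaluation code
itself:

    #include <stddef.h>

    /* Delta = change of a detection rate against the previous test. */
    static double delta(double rate_now, double rate_previous)
    {
        return rate_now - rate_previous;
    }

    /* Mean over the products above a threshold: with threshold 0 this
       approximates the plain "Mean" row (all tested products), while a
       threshold of 10 corresponds to the "Mean (rate>10%)" rows, which
       exclude extreme low detectors.                                   */
    static double mean_above(const double *rates, size_t n, double threshold)
    {
        double sum = 0.0;
        size_t used = 0;
        for (size_t i = 0; i < n; i++) {
            if (rates[i] > threshold) {   /* skip extreme low detectors */
                sum += rates[i];
                used++;
            }
        }
        return used ? sum / used : 0.0;   /* mean over remaining products */
    }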
Table W2k-A: Comparison: File/Macro/Script Virus Detection Rate: ================================================================ Scan I == File Virus == + ======= Macro Virus ======== + ======= Script Virus ======== ner I Detection I Detection I Detection -----+------------------+------------------------------+------------------------------ Test I 0104 0212 Delta I 0008 0104 0110 0212 Delta I 0008 0104 0110 0212 Delta -----+------------------+------------------------------+------------------------------ ANT I - - - I 93.3 - - - - I 53.9 - - - - AVA I 95.0 96.2 +1.2 I 94.1 95.7 97.7 97.8 +0.1 I 15.0 29.1 29.6 31.5 +1.9 AVG I 81.9 80.6 -1.3 I 97.9 98.3 98.4 98.1 -0.3 I 45.8 57.9 62.9 63.9 +1.0 AVK I 99.8 99.9 +0.1 I 100~ 100~ 100% 100~ -0.0 I 91.5 99.8 100% 99.0 -1.0 AVP I 99.9 100~ +0.1 I 100~ 100~ 100~ 100~ 0.0 I 88.2 99.8 100% 98.9 -1.1 AVX=BDF - 82.9 - I 99.0 - - 99.0 - I 61.4 - - 72.4 - CLE I - - - I - - - - - I 4.2 - - - - CMD I 97.8 98.5 +0.7 I 100% 100% 100~ 99.9 -0.1 I 93.5 96.9 93.2 89.1 -4.1 DRW I - 98.3 - I 97.5 - 99.5 99.4 -0.1 I 59.8 - 95.4 94.7 -0.7 FPR I 97.8 98.8 +1.0 I - 100% 100~ 100~ 0.0 I - 96.9 94.6 88.7 -5.9 FPW I 97.8 98.8 +1.0 I 100% 100% 100~ 100~ 0.0 I 90.8 96.9 94.6 88.7 -5.9 FSE I - 100~ - I 100% 100% 100% 100~ -0.0 I 96.7 100% 100% 99.5 -0.5 IKA I - 89.2 - I - - - 96.2 - I - - - 81.2 - INO I 97.9 98.7 +0.8 I 99.8 99.7 99.9 99.9 0.0 I 78.1 93.1 93.9 94.7 +0.8 MCV I - - - I - - 88.5 - - I - - 27.7 - - MR2 I - 9.5 - I - - 0.7 10.4 +9.7 I - - 83.3 81.0 -2.3 NAV I 93.9 98.3 +4.4 I 97.7 97.0 99.5 99.6 +0.1 I 36.6 54.5 94.2 96.8 +2.6 NVC I 98.1 97.8 -0.3 I 99.9 99.8 99.8 99.8 0.0 I 83.7 88.5 91.3 87.6 -3.7 PAV I 97.5 - - I 100~ 99.4 100% - - I 90.2 98.5 100% - - PER I - - - I 85.0 68.2 - - - I 0.0 22.0 - - - PRO I 70.6 70.4 -0.2 I 69.1 67.1 - 72.7 - I 12.1 40.7 - 59.8 - QHL I - - - I 0.0 - - - - I 6.9 - - - - RAV I 93.5 94.7 +1.2 I 96.9 99.6 99.5 99.9 +0.4 I 47.1 84.9 82.5 96.1 +13.6 SCN I 89.0 99.8 +10.9 I 100% 100% 100% 100% 0.0 I 95.8 100% 99.8 99.6 -0.2 VSP I - 14.0 - I - 0.0 0.~ 0.~ 0.0 I - 85.3 84.0 81.2 -2.8 -----+------------------+------------------------------+------------------------------ Mean : 97.6 85.6% +0.9 I 99.9 89.7 88.0 88.0% +2.4%I 57.6 79.4 84.8 84.4% -0.5% Mean >10%: 89.8% - I 98.9 92.9% I 91.9 84.4% - -----+------------------+------------------------------+------------------------------ Remark: for abbreviations of products (code names), see appendix A5CodNam.txt. Concerning macro viruses, "mean" detection rate is slightly reduced "in the mean" to a still inacceptably low level (<90%) though there is some slight improvement (+0.3%) for those products which participated also in last test; when one does not count 2 products with extreme low detection arte (<30%), mean results are acceptable if not very good (98.9%). Now, 4 scanners detect ALL MACRO zoo viruses, and 4 more detect almost all. Concerning script viruses which is presently the fastest growing sector, detection rate is improving though still low (84.8% mean) but those (15) products which also participated in last VTC test have improved their detection rates; but the impressing figure (+2.5%) is essentially influenced by one product which upgraded its detection rate by 39.9% (NAV) to now reach 94.2%. Now, 4 products detect ALL script zoo viruses. Findings W2k.1: General development of zoo virus detection rates: ----------------------------------------------------------------- For W-2000, file, macro and script zoo virus detection rates, no improvement can be reported. 
Here, significant work is needed. Mean detection rates remain
unacceptably low:
      mean file zoo virus detection rate:    85.6%
      mean macro virus detection rate:       88.0%
      mean script virus detection rate:      84.4%
   ------------------------------------------------
   Concerning file zoo viruses:
      NO product detects ALL viruses ("perfect");
      4 products detect more than 99% and are rated
      "excellent":   AVP,FSE;AVK,SCN
   ------------------------------------------------
   Concerning macro zoo viruses:
      1 product detects ALL macro zoo viruses in all files
      and is rated "perfect":   SCN
      12 products detect almost all macro viruses in almost
      all files and are rated "excellent":
         AVK,AVP,FPR,FPW,FSE;CMD,INO,RAV;NAV,NVC;DRW;BDF
   ------------------------------------------------
   Concerning script zoo viruses:
      NO product detects ALL viruses ("perfect");
      3 products detect almost all script viruses in almost
      all files and are rated "excellent":   SCN,FSE,AVK

Findings W2k.2: Development of ITW file/macro/script virus detection rates:
---------------------------------------------------------------------------
6 AV products (out of 19) detect ALL In-The-Wild file, macro and
script viruses in ALL instantiations (files) and are rated
"perfect":   AVK,AVP,DRW,FSE,NAV,SCN
NO scanner is rated "excellent":   ---
   *************************************************
   Concerning detection of ITW file viruses:
       1 "perfect" scanner:    NAV
       7 "excellent" scanners: AVK,AVP,DRW,FSE,RAV,SCN,INO
   Concerning detection of ITW macro viruses:
       7 "perfect" scanners:   AVK,AVP,DRW,FSE,INO,NAV,SCN
      10 "excellent" scanners: AVG,BDF,CMD,FPR,FPW,NVC,RAV,AVA,IKA,PRO
   Concerning detection of ITW script viruses:
      12 "perfect" scanners:   AVG,AVK,AVP,CMD,DRW,FPR,FPW,FSE,NAV,NVC,RAV,SCN
       2 "excellent" scanners: IKA,INO

Findings W2k.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
Now, NO W2k product is overall rated "perfect" (in the last test:
3 products!):   ---
   3 "excellent" overall scanners:  SCN,FSE,AVK
   2 "very good" overall scanners:  AVP,NAV

Findings W2k.4: Performance of W2k scanners by virus classes:
------------------------------------------------------------
   Perfect scanners for file zoo:    ---
   Excellent scanners for file zoo:  AVP,FSE,AVK,SCN
   Very Good scanners for file zoo:  INO,CMD,DRW,NAV,NVC,AVA,FPR,FPW
   ------------------------------------------------------------
   Perfect scanners for macro zoo:   SCN
   Excellent scanners for macro zoo: AVK,AVP,FPR,FPW,FSE,CMD,INO,RAV,
                                     NVC,NAV,DRW,BDF
   Very Good scanners for macro zoo: AVG,AVA,IKA
   ------------------------------------------------------------
   Perfect scanners for script zoo:   ---
   Excellent scanners for script zoo: SCN,FSE,AVK
   Very Good scanners for script zoo: AVP,NAV,RAV

Findings W2k.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Concerning OVERALL detection of packed file AND macro viruses,
   4 products are "perfect":     AVK,AVP,BDF,SCN
   2 products are "excellent":   DRW,RAV
   One more product is "very good": FSE
   *******************************************************
   Concerning detection of packed FILE viruses:
      5 products are "perfect":   AVK,AVP,BDF,FSE,SCN
      2 products are "excellent": DRW,RAV
   *****************************************************
   Concerning detection of packed MACRO viruses:
      7 products are "perfect":   AVK,AVP,BDF,CMD,FPR,FPW,SCN
      2 products are "excellent": DRW,RAV
      3 products are "very good": AVG,FSE,NAV

Findings W2k.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False-Positive Alarms is rather
well developed, at least for file-FP avoidance.
8 overall FP-avoiding "perfect" W2k scanners:
   AVA,AVG,BDF,INO,PRO,NAV,SCN,VSP
   ***************************************************
   Concerning file-FP avoidance, 17 products are "perfect":
      AVA,AVG,AVK,AVP,BDF,CMD,FPR,FPW,FSE,IKA,INO,MR2,NAV,PRO,RAV,SCN,VSP
   And 1 more product is "excellent": DRW
   ***************************************************
   Concerning macro-FP avoidance, these products are "perfect":
      AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
   And 2 more products are "excellent": AVK,RAV

Findings W2k.7: Detection rates for file/macro/script malware:
--------------------------------------------------------------
File/macro/script malware detection under W2k is less developed
compared to the last test:
   0 products are "perfect":   ---
   4 products are "excellent": FSE,AVK,AVP,SCN
   1 product is "very good":   RAV
   ***************************************************
   Concerning only file malware detection,
      0 products are "perfect":   ---
      7 products are "excellent": FSE,AVK,AVP,SCN,FPR,FPW,CMD
      2 products are rated "very good": INO,RAV
   ***************************************************
   Concerning only macro malware detection,
      4 products are "perfect":   AVK,AVP,FSE,SCN
     11 products are "excellent": CMD,FPR,FPW,RAV,NVC,INO,NAV,BDF,IKA,DRW,AVA
      1 product is rated "very good": AVG
   ***************************************************
   Concerning only script malware detection,
      0 products are "perfect":   ---
      5 products are "excellent": SCN,FSE,AVK,AVP,NAV
      1 product is rated "very good": RAV

Grading W2k products according to their detection performance:
==============================================================

Under the scope of VTC's grading system (see 4), we summarize our
results for W2k-related scanners:

*****************************************************************
 In VTC test "2002-12", we found  *** NO perfect W2k AV product ***
              and we found        *** NO perfect W2k AM product ***
*****************************************************************

But several products approach our definitions on a rather high
level ("perfect" being defined at the 100% level, "excellent" at
99% for virus detection and at 90% for malware detection):

Test category:            "Perfect"               "Excellent"
-----------------------------------------------------------------
W2k file ITW test:        NAV                     AVK,AVP,DRW,FSE,SCN,
                                                  INO,RAV
W2k macro ITW test:       AVK,AVP,DRW,FSE,        AVG,BDF,CMD,FPR,FPW,
                          INO,NAV,SCN             NVC,RAV,AVA,IKA,PRO
W2k script ITW test:      AVG,AVK,AVP,CMD,DRW,    IKA,INO
                          FPR,FPW,FSE,NAV,NVC,
                          RAV,SCN
-----------------------------------------------------------------
W2k file zoo test:        ---                     AVP,FSE,AVK,SCN
W2k macro zoo test:       SCN                     AVK,AVP,FPR,FPW,FSE,CMD,
                                                  INO,RAV,NVC,NAV,DRW
W2k script zoo test:      ---                     SCN,FSE,AVK
-----------------------------------------------------------------
W2k file pack test:       AVK,AVP,BDF,FSE,SCN     DRW,RAV
W2k macro pack test:      AVK,AVP,BDF,CMD,FPR,    DRW,RAV
                          FPW,SCN
W2k file FP avoidance:    AVA,AVG,AVK,AVP,BDF,    DRW
                          CMD,FPR,FPW,FSE,IKA,
                          INO,MR2,NAV,NVC,PRO,
                          RAV,SCN,VSP
W2k macro FP avoidance:   AVA,AVG,BDF,INO,        AVK,RAV
                          PRO,NAV,SCN,VSP
-----------------------------------------------------------------
W2k file malware test:    ---                     FSE,AVK,AVP,SCN,
                                                  FPR,FPW,CMD
W2k macro malware test:   AVK,AVP,FSE,SCN         CMD,FPR,FPW,RAV,NVC,
                                                  INO,NAV,BDF,IKA,DRW,AVA
W2k script malware test:  ---                     SCN,FSE,AVK,AVP,NAV
-----------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this W2k test with a simple
algorithm, by counting the majority of places (weighing "perfect"
twice and "excellent" once), for the first places:

************************************************************
 "Perfect" W-2000 AntiVirus product:    =NONE=   (20 points)
 "Excellent" W-2000 AV products:
      1st place: SCN                (18 points)
      2nd place: AVK                (16 points)
      3rd place: AVP                (14 points)
      4th place: FSE                (13 points)
      5th place: NAV                (11 points)
      6th place: DRW                (10 points)
      7th place: BDF,INO,RAV        ( 9 points)
     10th place: FPR                ( 8 points)
     11th place: AVG                ( 7 points)
     12th place: CMD,FPW            ( 6 points)
     14th place: NVC,AVA            ( 5 points)
     16th place: IKA,MR2,PRO,VSP    ( 4 points)
************************************************************
 "Perfect" W-2000 AntiMalware product:  =NONE=   (26 points)
 "Excellent" W-2000 AntiMalware products:
      1st place: SCN                (22 points)
      2nd place: AVK                (20 points)
      3rd place: AVP                (18 points)
      4th place: FSE                (17 points)
      5th place: NAV                (13 points)
      6th place: DRW                (11 points)
      7th place: BDF,FPR,INO,RAV    (10 points)
     11th place: CMD,FPW            ( 8 points)
     13th place: NVC                ( 6 points)
     14th place: IKA                ( 5 points)
************************************************************

9. Comparison of detection results under Windows-32 platforms:
==============================================================

This is a summary of the comparison of AV/AM products under
different W32 platforms (W-98 and W-2k). For details see
7evalw32.txt.

With the fast deployment of new versions of Microsoft Windows-32
(in the past 5 years from W-NT to W-95, W-98, W-2000 and soon
W-XP), both customers needing protection and producers of
security-enhancing software (esp. AntiVirus and AntiMalware) can
only cope with the pace when they essentially re-use engines
prepared for previous W32 platforms and simply "adapt" them to the
intrinsics of the new platforms. Otherwise, "rewriting" the resp.
software would consume too much time and effort, and customers
would receive "adapted" products only with some delay.

AV/AM testers cannot determine the characteristics of the
algorithms in scanning engines, partly for legal reasons (most
copyright laws prohibit reverse-engineering of proprietary code,
except for specific purposes such as collecting evidence for a
court case or teaching related techniques, as in the Hamburg
university IT Security curriculum), and partly because of the
sheer complexity of the related code (and, in many cases, because
of insufficient professional knowledge of testers). It is
therefore worthwhile to analyse whether those AV/AM products whose
versions are available for all W32 platforms behave EQUALLY
concerning detection and identification of viral and malicious
code.

Test Hypothesis: "W32-harmonical" behaviour of W32 products:
============================================================

We assume that those products which participate in this test for
all W32 platforms (W98 and W2k) and for ALL categories shall yield
IDENTICAL results (the argument for this assumption being the
likelihood that the same engine is reused on all W32 platforms).
We call product behaviour following this hypothesis
"W32-harmonical".

Finding W32.1: Equality of results for all W32 platforms:
---------------------------------------------------------
In comparison with the last VTC test, not much progress can be
reported.
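Before the detailed comparison below, a minimal sketch of the
identity check behind this hypothesis may be helpful; it is an
illustration only (not VTC's evaluation code), and all product
codes and rates in it are hypothetical placeholders:

    # Minimal sketch (hypothetical data): a product is "W32-harmonical"
    # for a category if its detection results are IDENTICAL on all
    # tested W32 platforms (here W98 and W2k).

    results = {   # product -> platform -> category -> rate in %
        "XX1": {"W98": {"macro zoo": 100.0, "script zoo": 99.0},
                "W2k": {"macro zoo": 100.0, "script zoo": 99.0}},
        "XX2": {"W98": {"macro zoo": 99.8,  "script zoo": 94.0},
                "W2k": {"macro zoo": 99.6,  "script zoo": 94.0}},
    }

    def harmonical(per_platform, category):
        """True if the rate for this category is identical on
        every platform on which the product was tested."""
        rates = {p[category] for p in per_platform.values() if category in p}
        return len(rates) == 1

    for code, per_platform in results.items():
        for cat in ("macro zoo", "script zoo"):
            verdict = "harmonical" if harmonical(per_platform, cat) \
                      else "NOT harmonical"
            print(code, cat + ":", verdict)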
Equal detection                  this test    last test
-----------------------------+--------------+-----------
of zoo file viruses:            9 (of 18)      -----
of zoo infected files:          7 (of 18)      -----
of ITW file viruses:           17 (of 18)      -----
of ITW infected files:         17 (of 18)      -----
of zoo file malware:           13 (of 18)      -----

Equal detection                  this test    last test
-----------------------------+--------------+-----------
of zoo macro viruses:          16 (of 18)    16 (of 18)
of zoo infected macro objects: 16 (of 18)    16 (of 18)
of ITW macro viruses:          ALL (of 18)   ALL (of 18)
of ITW infected macro files:   ALL (of 18)   ALL (of 18)
of zoo macro malware:          15 (of 18)    15 (of 18)

Equal detection                  this test    last test
-----------------------------+--------------+-----------
of zoo script viruses:         16 (of 18)    12 (of 18)
of zoo script viral objects:   15 (of 18)    10 (of 18)
of ITW script viruses:         ALL (of 18)   17 (of 18)
of ITW script viral objects:   ALL (of 18)   15 (of 18)
of ITW script malware:         16 (of 18)    17 (of 18)
--------------------------------------------------------------

Concerning detection of FILE viruses, a MINORITY of 7 (out of 18)
products behave "W32-harmonically" in all categories:
   AVK,AVP,BDF,CMD,INO,FPW,SCN
And concerning file malware detection, only 6 (of 18) products
behave in W32-harmonical form:
   AVK,AVP,BDF,CMD,FPW,SCN
--------------------------------------------------------------
Concerning detection of MACRO viruses, a MAJORITY of 14 (out of
18) products behave "W32-harmonically" in all categories:
   AVA,AVG,AVK,AVP,BDF,CMD,DRW,FPR,FPW,FSE,INO,PRO,SCN,VSP
And concerning macro malware detection, 14 (of 18) products behave
in W32-harmonical form:
   AVA,AVG,AVK,AVP,BDF,CMD,DRW,FPR,FPW,FSE,INO,PRO,SCN,VSP
--------------------------------------------------------------
Concerning detection of SCRIPT viruses, a MAJORITY of 15 (out of
18) products behave "W32-harmonically" in all categories:
   AVA,AVG,AVK,AVP,CMD,DRW,FPR,FPW,INO,MR2,NVC,PRO,RAV,SCN,VSP
And concerning script malware detection, 15 (of 18) products
behave in W32-harmonical form:
   AVA,AVG,AVK,AVP,CMD,DRW,FPR,FPW,INO,MR2,NVC,PRO,RAV,SCN,VSP

The following grid is used to grade W32 products concerning their
ability for IDENTICAL detection for ALL categories on ALL W32
platforms:

A "perfect" W32-harmonical AV product will yield IDENTICAL results
for all categories (macro and script viruses). (Assigned value: 5.)

A "perfect" W32-harmonical AM product will be a perfect AV product
and yield IDENTICAL results for all categories (macro and script
malware). (Assigned value: 2.)

Grading W32-harmonical AntiVirus products:
===========================================================
 Grade: "Perfect" W32-harmonical detection:
    AVK,AVP,BDF,CMD,INO,FPW,SCN
===========================================================

Grading W32-harmonical AntiMalware products:
===========================================================
 Grade: "Perfect" W32-harmonical detection:
    AVK,AVP,BDF,CMD,FPW,SCN
===========================================================

************************************************************
 "Perfect" W32-harmonical AntiVirus products:
      1st place: AVK,AVP,BDF,CMD,INO,FPW,SCN   (5 points)
************************************************************
 "Perfect" W32-harmonical AntiMalware products:
      1st place: AVK,AVP,BDF,CMD,FPW,SCN       (7 points)
************************************************************

10. Results of on-demand detection under Linux (SuSE):
======================================================

This is a summary of the essential findings for AV/AM products
under Linux. For details see 7evallin.txt.
To give a perspective of product results, the following table
(Lin-A) lists all results of Linux scanners for zoo detection of
file, macro and script viruses in the last 3 VTC tests. Moreover,
differences ("delta") in the resp. detection rates are given for
those products which participated in the last 2 tests, and mean
values are calculated.

Table Lin-A: Performance of LINUX scanners in Test 2001-04 until 2002-12:
=========================================================================

Scan I == File Virus ==  + ===== Macro Virus =====  + === Script Virus ====
 ner I    Detection      I       Detection          I      Detection
-----+-------------------+--------------------------+------------------------
Test I 0104  0212  Delta I 0104  0110  0212  Delta  I 0104  0110  0212  Delta
-----+-------------------+--------------------------+------------------------
 ANT I   -     -     -   I   -   97.1    -     -    I   -   81.8    -     -
 AVK I   -   99.9    -   I   -   100%  100~    0.0  I   -   100%  99.1   -0.9
 AVP I 99.9    -     -   I 100~  100%    -     -    I 99.8  100%    -     -
 CMD I 97.8  99.1   +1.3 I 100%  100%  100~   ~0.0  I 96.9  94.2  89.4   -4.8
 DRW I   -   98.3    -   I   -   99.5  99.4   -0.1  I   -   95.4  94.7   -0.7
 FPR I   -   98.9    -   I   -     -   100~    -    I   -     -   88.7    -
 FSE I 97.1  98.0   +0.9 I 100~  100%  100~   ~0.0  I 96.9  92.3  88.1   -4.2
 MCV I   -     -     -   I   -    9.1    -     -    I   -   27.6    -     -
 OAV I   -    9.1    -   I   -     -    0.1    -    I   -     -   13.9    -
 RAV I 93.5  96.7   +3.2 I 99.6  99.5  99.9   +0.4  I 84.9  82.5  96.1  +13.6
 SCN I 99.7  99.8   +0.1 I 100%  100%  100%    0.0  I 99.8  99.8  99.5   -0.3
-----+-------------------+--------------------------+------------------------
Mean I 97.6% 87.5% +1.4% I 99.9% 89.5% 74.9%  +0.1  I 95.7% 86.0% 83.7% +0.5%
>10% I 97.6% 98.7%   -   I 99.9% 99.5% 99.8%    -   I 95.7% 86.0% 83.7%   -
-----+-------------------+--------------------------+------------------------

Remark: for abbreviations of products (code names), see appendix A5CodNam.txt.

While the majority of macro virus detectors work on a rather high
level (the reduced mean detection rate is essentially influenced
by one product which is new in this test), detection rates for
script detectors are significantly less developed and need further
improvement.

Findings LIN.1: General development of zoo virus detection rates:
------------------------------------------------------------------
Concerning zoo virus detection, LINUX products are less developed
than products for other platforms. Detection rates for file
viruses (in zoo: 98.7%) are significantly better than for macro
viruses (in zoo: 89.3%) and for script viruses (in zoo: 83.7%).
   -------------------------------------------------
   NO product detects ALL zoo file viruses, but
   3 products detect >99% (grade: "excellent"):
      AVK (99.9%), SCN (99.8%), CMD (99.1%)
   -------------------------------------------------
   ONE product detects ALL zoo macro viruses:
      SCN (100.0%)                  (grade: "perfect"),
   and 5 products detect almost 100%:
      AVK, CMD, FPR, FSE (all: 100~), RAV (99.9%)
                                      (grade: "excellent")
   -------------------------------------------------
   NO product detects ALL zoo script viruses, but
   2 products detect >99% (grade: "excellent"):
      SCN (99.9%), AVK (99.1%)

Findings LIN.2: Development of ITW virus detection rates:
---------------------------------------------------------
3 AV products detect "perfectly" all ITW file, macro and script
viruses in all files:   AVK,DRW,SCN
   ----------------------------------------------------
   Concerning detection of ITW file viruses:
      3 "perfect" scanners:   AVK,DRW,SCN
      1 "excellent" scanner:  RAV
   ----------------------------------------------------
   Concerning detection of ITW macro viruses:
      3 "perfect" scanners:   AVK,DRW,SCN
      4 "excellent" scanners: CMD,FPR,FSE,RAV
   ----------------------------------------------------
   Concerning detection of ITW script viruses:
      7 "perfect" scanners:   AVK,CMD,DRW,FPR,FSE,RAV,SCN

Findings LIN.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
   2 LINUX products are overall rated "excellent": AVK,SCN
   -----------------------------------------------------
   1 "useless" overall LINUX scanner:              OAV

Findings LIN.4: Performance of LINUX scanners by virus classes:
---------------------------------------------------------------
   Perfect scanners for file zoo:    ---
   Excellent scanners for file zoo:  AVK,SCN,CMD
   Very Good scanners for file zoo:  FPR,DRW,FSE,RAV
   ---------------------------------------------------------
   Perfect scanners for macro zoo:   SCN
   Excellent scanners for macro zoo: AVK,CMD,FPR,FSE,RAV,DRW
   Very Good scanners for macro zoo: ---
   ---------------------------------------------------------
   Perfect scanners for script zoo:   ---
   Excellent scanners for script zoo: SCN,AVK
   Very Good scanners for script zoo: RAV

Findings LIN.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Detection of packed viral objects needs improvement:
   Perfect packed ITW file/macro virus LINUX detectors:   AVK,SCN
   Excellent packed ITW file/macro virus LINUX detectors: DRW,RAV
   Very Good packed ITW file/macro virus LINUX detectors: ---

Findings LIN.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False-Positive Alarms is insufficient and needs
improvement.
   FP-avoiding perfect LINUX scanner: SCN
   ---------------------------------------------------
   Concerning file-FP avoidance, these products are "perfect":
      AVK,FPR,FSE,RAV,SCN
   The following products are "excellent": DRW,CMD
   ---------------------------------------------------
   And concerning macro-FP avoidance, this product is "perfect":
      SCN
   The following product is "excellent": RAV

Findings LIN.7: Detection rates for file/macro malware:
-------------------------------------------------------
NO LINUX product can be rated "perfect" in detecting ALL file,
macro and script malware specimens:
   2 products are rated "excellent": AVK,SCN
   1 product is rated "very good":   RAV
   ----------------------------------------------------
   Concerning single classes of malware:
   A) "perfect" file malware detectors:     ---
      "excellent" file malware detectors:   AVK,CMD,SCN,FPR
      "very good" file malware detectors:   FSE,RAV
   ---------------------------------------------------
   B) "perfect" macro malware detectors:    AVK
      "excellent" macro malware detectors:  SCN,CMD,DRW,FPR,FSE,RAV
      "very good" macro malware detectors:  ---
   ---------------------------------------------------
   C) "perfect" script malware detectors:   ---
      "excellent" script malware detectors: SCN,AVK
      "very good" script malware detectors: RAV

Grading LIN products according to their detection performance:
==============================================================

Under the scope of VTC's grading system (see 4), we summarize our
results for LINUX-related scanners:

*******************************************************************
 In VTC test "2002-12", we found  *** NO perfect LINUX AV product ***
              and we found        *** NO perfect LINUX AM product ***
*******************************************************************

But several products approach our definitions on a rather high
level ("perfect" being defined at the 100% level, "excellent" at
99% for virus detection and at 90% for malware detection):

Test category:              "Perfect"               "Excellent"
------------------------------------------------------------------
LINUX file ITW test:        AVK,DRW,SCN              RAV
LINUX macro ITW test:       AVK,DRW,SCN              CMD,FPR,FSE,RAV
LINUX script ITW test:      AVK,CMD,DRW,             ---
                            FPR,FSE,RAV,SCN
------------------------------------------------------------------
LINUX file zoo test:        ---                      AVK,SCN,CMD
LINUX macro zoo test:       SCN                      AVK,CMD,FPR,FSE,RAV,DRW
LINUX script zoo test:      ---                      SCN,AVK
------------------------------------------------------------------
LINUX file pack test:       AVK,SCN                  DRW,RAV
LINUX macro pack test:      AVK,SCN,CMD,FPR          DRW,RAV
LINUX file FP avoidance:    AVK,FPR,FSE,RAV,SCN      CMD,DRW
LINUX macro FP avoidance:   SCN                      RAV
------------------------------------------------------------------
LINUX file malware test:    ---                      AVK,CMD,SCN,FPR
LINUX macro malware test:   AVK                      SCN,CMD,FPR,RAV,DRW,FSE
LINUX script malware test:  ---                      SCN,AVK
------------------------------------------------------------------

As before, to support the race for better customer protection, we
evaluate the order of performance in this LINUX test with a simple
algorithm, by counting the majority of places (weighing "perfect"
twice and "excellent" once):

************************************************************
 "Perfect" LINUX AntiVirus product:    ===NONE===  (20 points)
 "Excellent" LINUX AV products:
      1st place: SCN          (18 points)
      2nd place: AVK          (15 points)
      3rd place: DRW,RAV      (10 points)
      5th place: CMD,FPR      ( 8 points)
      7th place: FSE          ( 6 points)
************************************************************
 "Perfect" LINUX AntiMalware product:  ===NONE===  (24 points)
 "Excellent" LINUX AM products:
      1st place: SCN          (21 points)
      2nd place: AVK          (19 points)
      3rd place: DRW,RAV      (11 points)
      5th place: CMD,FPR      (10 points)
      7th place: FSE          ( 7 points)
************************************************************

Under LINUX, we also tested the then available version of a
product which is advertised as "Open AntiVirus" (OAV). Evidently
aimed at supporting friends of the "Open Software" idea, this
product requires JAVA in order to run. Apart from the fact that
such a requirement introduces another kind of risk, adding yet
another level of complexity to already overly complex systems
contradicts the security principle "Keep it as simple as possible"
(known as the KIS principle). Moreover, it contradicts VTC's test
conditions (see 4testcon.txt); consequently, VTC canNOT test this
product on Windows platforms. But we were interested to learn
whether a group of self-declared experts without sufficient daily
experience in virus analysis is able to support their community
with a somewhat qualified product. The results are - at best -
disappointing:
   ------------------------------------------------------
   Detection rates of OAV under LINUX are UNACCEPTABLE!
   OAV must be graded into the "useless" class :-)
   ======================================================

11. Final remark: In search of the "Perfect AV/AM product":
===========================================================

This test includes 3 platforms for which engines are hardly
comparable, namely DOS (16-bit engines), W98/W2k (32-bit engines,
comparable) and LINUX. Moreover, several manufacturers submitted
products only for specific platforms.

**********************************************************
 In general, there is NO AntiVirus and NO AntiMalware
 product which can be rated "PERFECT" on ALL platforms in
 ALL categories (esp. file, macro AND script).
-----------------------------------------------------------
 But for SINGLE categories (file, macro OR script viruses),
 there are SEVERAL products which can be rated "perfect"
 or "excellent".
************************************************************

Instead of calculating an overall value (e.g. the sum of points
divided by the number of products in test for a given platform),
the following TABLES list product suites by their places, sorted
by assigned points:

Table SUM-AV: Survey of Results for AntiVirus Products:
-------------------------------------------------------

 ==================== AntiVirus Products ======================
    DOS (22)    W98 (20)    W2k (20)    W32 ( 5)    LINUX (20)
 --------------------------------------------------------------
Place 1:
    SCN (20)    SCN (18)    SCN (18)    AVK ( 5)    SCN (18)
 ------------------------------------+           +------------
    AVP (16)    AVK (16)    AVK (16)   =AVP ( 5)    AVK (15)
    NAV (13)    AVP (14)    AVP (14)   =BDF ( 5)    DRW (10)
    DRW (11)    FSE (12)    FSE (13)   =CMD ( 5)   =RAV (10)
    CMD (10)    NAV (11)    NAV (11)   =INO ( 5)    CMD ( 8)
    FPR (10)    BDF (10)    DRW (10)   =FPW ( 5)   =FPR ( 8)
    INO ( 9)    DRW ( 9)    BDF ( 9)   =SCN ( 5)    FSE ( 6)
    RAV ( 8)    INO ( 9)   =INO ( 9)   +--------+
    NVC ( 6)    RAV ( 9)   =RAV ( 9)
    VSP ( 6)    CMD ( 8)    FPR ( 8)
    AVG ( 5)    FPR ( 8)    AVG ( 7)
    AVA ( 4)    FPW ( 8)    CMD ( 6)
                AVG ( 7)   =FPW ( 6)
                NVC ( 6)    NVC ( 5)
                AVA ( 5)   =AVA ( 5)
                PRO ( 5)    IKA ( 4)
                VSP ( 4)   =MR2 ( 4)
                MR2 ( 2)   =PRO ( 4)
                           =VSP ( 4)
 --------------------------------------------------------------
 Remark: Numbers for platforms indicate the number of products in
         test. Numbers for products indicate the points assigned
         for that platform. "=" indicates that the product's place
         equals that of the previous product.
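The points in Table SUM-AV above and Table SUM-AM below follow the
place-counting scheme used throughout this report (a "perfect"
rating in a test category is weighed twice, an "excellent" rating
once). A minimal sketch of this scheme, with purely hypothetical
product codes and grades, may clarify how such points accumulate:

    # Minimal sketch (hypothetical data): accumulate points per product
    # by weighing a "perfect" rating twice and an "excellent" rating
    # once, then rank products by their point totals.

    WEIGHT = {"perfect": 2, "excellent": 1}   # other grades earn no points

    grades = {   # product code -> grade per test category
        "XX1": ["perfect", "perfect", "excellent", "excellent"],
        "XX2": ["excellent", "excellent", "very good", "perfect"],
        "XX3": ["very good", "excellent", "excellent", "excellent"],
    }

    points = {code: sum(WEIGHT.get(g, 0) for g in gs)
              for code, gs in grades.items()}

    # Print the ranking, best first.
    for code, pts in sorted(points.items(), key=lambda item: -item[1]):
        print(f"{code}: {pts} points")

In the actual tables, products with equal points share a place,
marked with "=".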
Table SUM-AM: Survey of Results for AntiMalware Products:
---------------------------------------------------------

 =================== AntiMalware Products =====================
    DOS (28)    W98 (26)    W2k (26)    W32 ( 7)    LINUX (26)
 --------------------------------------------------------------
Place 1:
    SCN (26)    SCN (22)    SCN (22)    AVK ( 7)    SCN (21)
 ------------------------------------+  =AVP ( 7) +------------
    AVP (20)    AVK (20)    AVK (20)   =BDF ( 7)    AVK (19)
    NAV (14)    AVP (18)    AVP (18)   =CMD ( 7)    DRW (11)
    DRW (13)    FSE (16)    FSE (17)   =INO ( 7)   =RAV (11)
    CMD (12)    NAV (13)    NAV (13)   =FPW ( 7)    CMD (10)
   =FPR (12)    BDF (11)    DRW (11)   =SCN ( 7)   =FPR (10)
    INO (10)    CMD (10)    BDF (10)   +--------+   FSE ( 7)
    RAV ( 8)   =DRW (10)   =FPR (10)
    NVC ( 7)   =FPR (10)   =INO (10)
                            CMD ( 8)
               =RAV (10)   =FPW ( 8)
                NVC ( 7)    NVC ( 6)
                AVA ( 6)    IKA ( 5)
 --------------------------------------------------------------

Generally, we hope that these rather detailed results help AV
producers to adapt their products to growing threats and thus to
protect their customers.

12. Availability of full test results:
======================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's HomePage (VTC is part of Working
Group AGN):
      ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comment and critical remark which helps VTC to improve its
test methods will be warmly welcomed.

Further to this test, we follow suggestions of AV producers to
test the heuristic detection ability of scanners for those viruses
which were first detected only after product submission. In this
"pro-active test", we will test the heuristic detection quality of
products submitted for W-NT, for macro and script viruses detected
between April 2001 and June 2001 as well as between July 2001 and
October 2001.

The next comparative test will evaluate file, boot, macro
(VBA/VBA5) and script virus and malware detection. This test is
planned for January - March 2002, with viral databases frozen on
October 31, 2001. Any AV producer wishing to participate in the
forthcoming test is invited to submit related products.

The VTC Test Team:
------------------
   VTC supervisor:         Jan Seedorf
   Coordination:           Klaus Brunnstein, Michel Messerschmidt,
                           Jan Seedorf
   Testbeds:               Klaus Brunnstein, Martin Kittel
   Hardware (C/S):         Martin Kittel
   System Software:        Martin Kittel, Jan Seedorf
   Managing Products:      Klaus Brunnstein
   DOS-Scanners:           Heiko Fangmeier
   W-98/W-2k Scanners:     Ulrike Siekierski, Ulf Deichmann,
                           Stefan Heimann, Eugen Hofmann,
                           Benjamin Hoherz, Silvio Krueger,
                           Maxim Mariach, Jan Menne,
                           Fabian Mueller, Jan Seedorf
   Linux Scanners:         Bodo Eggert
   Evaluation, QA:         Michel Messerschmidt, Thomas Buck
   Graphical Presentation: Thomas Buck, Jan Seedorf
   Test report:            Klaus Brunnstein (supported by all)

On behalf of the VTC test team:
   Dr. Klaus Brunnstein (January 29, 2003)

13. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 2001 by Klaus Brunnstein and the
Virus Test Center (VTC) at University of Hamburg, Germany.

Permission (Copy-Left) is granted to everybody to distribute
copies of this information in electronic form, provided that this
is done for free, that the contents of the information are not
changed in any way, and that the origin of this information is
explicitly mentioned. It is esp.
permitted to store and distribute this set of text files at
university or other public mirror sites where security/safety
related information is stored for unrestricted public access for
free. Any other use, esp. including distribution of these text
files on CD-ROMs or any publication as a whole or in parts, is
ONLY permitted after contact with the supervisor, Prof. Dr. Klaus
Brunnstein, or authorized members of the Virus Test Center at
Hamburg University, and such an agreement must be given explicitly
in writing, prior to any publication.

No responsibility is assumed by the author(s) for any injury
and/or damage to persons or property as a matter of products
liability, negligence or otherwise, or from any use or operation
of any methods, products, instructions or ideas contained in the
material herein.

   Dr. Klaus Brunnstein
   Professor for Applications of Informatics
   University of Hamburg, Germany
   (January 29, 2003)