============================================================
File 0XECSUM.TXT: "EXECUTIVE SUMMARY"
AntiVirus/AntiMalware Product Test "2001-10"
Virus Test Center (VTC), University of Hamburg
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

**********************************************************************
Content of this file:
**********************************************************************
 0. Editor's Foreword
 1. Background of this test; Malware Threats
       Table ES0: Development of viral/malware threats
 2. VTC Testbeds used in VTC test 2001-10
       Table ES1: Content of VTC test databases in test 2001-10
 3. Products participating in test 2001-10
       Table ES2: List of AV products in test 2001-10
 4. A serious problem: Flaw in Microsoft FindFirst/FindNext routine?
 5. Results of AV products under DOS
       Table DOS-A2: Development of DOS scanners from 1997-02 to 2001-10
       Findings DOS.1 - DOS.7
       Grading DOS products according to their detection performance
 6. Results of on-demand detection under Windows-NT
       Table WNT-A2: Development of W-NT scanners from 1997-07 to 2001-10
       Findings WNT.1 - WNT.7
       Grading WNT products according to their detection performance
 7. Results of on-demand detection under Windows-98
       Table W98-A2: Development of W-98 scanners from 1998-10 to 2001-10
       Findings W98.1 - W98.7
       Grading W98 products according to their detection performance
 8. Results of on-demand detection under Windows-2000
       Table W2k-A: Development of W-2k scanners from 2000-08 to 2001-10
       Findings W2k.1 - W2k.7
       Grading W2k products according to their detection performance
 9. Comparison of detection behaviour for W32 platforms
       Grading AV products concerning W32-harmonical behaviour
10. Results of on-demand detection under Linux(SuSe)
       Table Lin-A: Development of Linux scanners from 2001-04 to 2001-10
       Findings LIN.1 - LIN.7
       Grading LINUX products according to their detection performance
11. Conclusion: In Search of the "Perfect AV/AM product"
12. Availability of full test results
13. Copyright, License, and Disclaimer

***********************************************************************

0. Editor's Foreword:
=====================

VTC test "2001-10" was started end-June 2001. In this test, we had
more scanners than ever before, and the testbeds have grown
significantly. With the growth of our testbeds, we also experienced
many more problems, as products behaved "abnormally" (see
8problms.txt). Moreover, some products (and our test crew) suffered
from a known but uncorrected flaw in the Microsoft FindFirst/FindNext
routines which required several postscans (see 4).

Evidently, the time of DOS-based AntiVirus/AntiMalware products -
formerly the reference products for measuring detection rates on W32
platforms - is passing. Not only is the number of DOS products
decreasing; worse, their detection rates tend to fall. Consequently,
DOS products can no longer be regarded as reference products. This
forced the editor to develop a general framework for the evaluation of
each single product (see 7EVAL texts).

With the deployment of new W32 platforms (including W-XP), customers
moving from one W32 platform to another will assume that the related
AV/AM products behave with IDENTICAL detection rates. We tested this
assumption (which we call "W32-harmonical behaviour", see 9) and found
that it is justified for many (though not all) products.
 ** As so many W32 products behave "W32-harmonically", it will  **
 ** be sufficient in future tests to concentrate on fewer W32   **
 ** platforms (including W-98, W-2000 and W-XP).                **

One serious concern arising from our results is that AV producers
concentrate more on the detection of In-The-Wild (ITW) viruses than
on zoo viruses. Indeed, one AV company informed us that they don't
wish to participate in our test as they concentrate on ITW detection
and are aware that their products will produce "unfavourable results"
for our zoo testbeds (see 3). For many other AV products, detection
rates of ITW viruses are perfect (100%) or excellent (>99%), but
detection of zoo viruses is often significantly lower. Evidently, AV
producers focusing on ITW detection forget that any ITW virus has
been a zoo virus before going In-the-Wild. It cannot surprise that
customers of such products experience painfully how neglect of zoo
virus detection affects their IT services when a hitherto unknown zoo
virus is deployed broadly (the author of this test report had to
advise several victims of such ill-advised "ITW-mindedness", aka
"zoo-blindness"). And the first victims - often large companies -
cannot be helped by any fast exchange of newly "wildering" code: it
is always too late for some!

This test - as all previous ones - has been performed by students of
Hamburg University's Faculty for Informatics with special interest in
IT Security (see our 4-semester curriculum, started in 1988, on our
homepage). Different from other tests, where submitters of products
have to pay a fee for being admitted, VTC tests are "free of fee".
This implies that students, who have to complete their examinations
and usually also work to earn their living, are only "partially
available" for tests. Moreover, our hardware, which is essentially
funded by Faculty support (sometimes also by donation of new
machines, usually more powerful than those we can buy from university
money), canNOT compete with the technical equipment in other test
labs. We regret that these circumstances cause delays in performing
and publishing our regular test reports, but instead of hurrying to
meet dates and expectations, we insist that the assessed quality of
our test results shall have - also in the future - highest priority.

Most of the work in VTC tests rests on the shoulders of our test
crew, and the editor wishes to thank them all for their devotion and
hard work.

1. Background of this test: Malware Threats:
============================================

Malicious software (malware), including viruses (=self-replicating
malware), trojan horses (=pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users, esp. when connected to
intranets and the Internet. The development of malicious software can
well be studied in view of the growth of VTC (zoo and In-The-Wild)
testbeds.
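As an illustration of how the two kinds of counts in the following
table relate (the "Number viruses" columns count distinct viruses,
the "Infected objects" columns count every instantiation), a minimal
Python sketch follows; the index entries and virus names in it are
hypothetical placeholders, not taken from the VTC testbeds.

   from collections import defaultdict

   # hypothetical index entries: (category, virus name, infected object)
   index = [
       ("macro",  "W97M/Example.A", "macro/example_a/doc01.doc"),
       ("macro",  "W97M/Example.A", "macro/example_a/doc02.doc"),
       ("script", "VBS/Example.B",  "script/example_b/sample.vbs"),
   ]

   names   = defaultdict(set)   # distinct viruses  -> "Number viruses"
   objects = defaultdict(int)   # instantiations    -> "Infected objects"
   for category, name, path in index:
       names[category].add(name)
       objects[category] += 1

   for category in sorted(names):
       print(f"{category}: {len(names[category])} viruses "
             f"in {objects[category]} infected objects")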
The following table summarizes, for previous and current VTC tests
(indicated by their year and month of publication), the sizes of the
virus and malware (full = "zoo") databases, giving in each case the
number of different viruses and the number of instantiations of a
virus or malware, and keeping in mind that some revisions of the
testbeds were made:

Table ES0: Development of threats as present in VTC test databases:
===================================================================
          = File viruses =  = Boot viruses =  == Macro viruses/malware ==  ScriptViruses/Malware
Test#     Number  Infected  Number  Infected  Number  Infected Number Malware  Number  Number  Malware
          viruses objects   viruses objects   viruses objects  file   macro    viruses objects script
-----------------------------------------------------------------------------------------------------
1997-07:  12,826   83,910      938    3,387      617    2,036     213      72     ---     ---     ---
1998-03:  14,596  106,470    1,071    4,464    1,548    4,436     323     459     ---     ---     ---
1998-10:  13,993  112,038      881    4,804    2,159    9,033   3,300     191     ---     ---     ---
1999-03:  17,148  128,534    1,197    4,746    2,875    7,765   3,853     200     ---     ---     ---
          +    5  146,640  (VKIT+4*Poly)
1999-09:  17,561  132,576    1,237    5,286    3,546    9,731   6,217     329     ---     ---     ---
          +    7  166,640  (VKit+6*Poly)
2000-04:  18,359  135,907    1,237    5,379    4,525   12,918   6,639     260     ---     ---     ---
          +    7  166,640  (VKit+6*Poly)
2000-08:     ---      ---      ---      ---    5,418   15,720     ---     500     306     527     ---
2001-04:  20,564  140,703    1,311    5,723    6,233   19,387  12,160     627     477     904     ---
          +    7  166,640  (Vkit+6*Poly)
2001-07H:    ---      ---      ---      ---  (+ 544) (+2,035)     --- (+ 102)  (+206) (+ 386)     ---
2001-10:     ---      ---      ---      ---    6,762   21,677     ---     683     481   1,079      30
-----------------------------------------------------------------------------------------------------
Remark: Before test 1998-10, an ad-hoc cleaning operation was applied
to remove samples where virality could not be proved easily. Since
test 1999-03, separate tests are performed to evaluate detection
rates of VKIT-generated and selected polymorphic file viruses.

With an annual deployment of more than 5,000 viruses and about 1,000
Trojan horses, many of which are available from the Internet, and in
the absence of inherent protection against such dysfunctional
software, users must rely on AntiMalware and esp. AntiVirus software
to detect and eradicate - where possible - such malicious software.
Hence, the detection quality of AntiMalware and esp. AntiVirus
products becomes an essential prerequisite of protecting customer
productivity and data.

The Virus Test Center (VTC) at Hamburg University's Faculty for
Informatics performs regular tests of AntiMalware and esp. AntiVirus
software. VTC recently tested current versions of on-demand scanners
for their ability to identify PC viruses. Tests were performed on
VTC's malware databases, which were frozen at their status as of
*** April 2001 *** to give AV/AM producers a fair chance to support
updates within the 8-week submission period (product submission
date: June 25, 2001).

The main test goal was to determine detection rates, the reliability
(=consistency) of malware identification and the reliability of
detection rates for submitted or publicly available scanners; this
test determined the detection rates for macro and script viruses. It
was also tested whether (and to what degree) viruses packed with 6
popular compression methods (PKZIP, ARJ, LHA, RAR, WinRAR and CAB)
would be detected by scanners. Moreover, the avoidance of False
Positive alarms on "clean" (=non-viral and non-malicious) objects was
also determined.
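The rates referred to above are simple ratios; the following minimal
Python sketch (with made-up figures, NOT actual results) shows how a
detection rate and a false-positive rate are computed from such
counts.

   def detection_rate(detected: int, total_infected: int) -> float:
       # share of infected samples the scanner flagged, in percent
       return 100.0 * detected / total_infected

   def false_positive_rate(false_alarms: int, total_clean: int) -> float:
       # share of clean samples the scanner (wrongly) flagged, in percent
       return 100.0 * false_alarms / total_clean

   # made-up example counts; 6,762 zoo macro viruses and 329 clean macro
   # objects are the testbed sizes quoted in this report
   print(f"zoo macro detection: {detection_rate(6750, 6762):.1f}%")   # -> 99.8%
   print(f"false positives:     {false_positive_rate(0, 329):.1f}%")  # -> 0.0%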
Finally, a set of selected non-viral macro and script malware
(droppers, Trojan horses, intended viruses etc.) was used to determine
whether and to what degree AntiVirus products may be used for
protecting customers against Trojan horses and other forms of malware.

VTC maintains, in close and secure connection with AV experts
worldwide, collections of boot, file, macro and script viruses as
well as related malware ("zoo") which have been reported to VTC or AV
labs. Moreover, following the list of "In-The-Wild Viruses"
(published on a regular basis by Wildlist.org), a collection of
viruses reported to be broadly visible is maintained to allow for
comparison with other tests; presently, this list does not report
ITW malware.

2. VTC Testbeds used in VTC test "2001-10":
===========================================

The current sizes of the different VTC testbeds (developed from
previous testbeds through inclusion of new viruses and malware and
some revision) are given in the following table (for detailed indices
of VTC testbeds, see file "a3testbed.zip"):

Table ES1: Content of VTC test databases:
==========================================================================
"Full Zoo":  6,762 Macro (VBA) Viruses in 21,667 infected documents
               426 different Macro Malware in 683 macro objects
               329 Clean macro objects (in 26 directories) used for
                   False Positive test (macro ITW viruses)
               588 different script (VBS etc) viruses in 1,079 infected objects
                22 different script malware in 30 script objects
      -----------------------------------------------------------------
"ITW Zoo":     143 Macro Viruses in 1,308 infected documents
                19 Script Viruses in 110 infected objects
============================================================================

For a survey of platforms, see A4tstdir.txt, and for the content of
the resp. testbeds see A3TSTBED.zip (available for download).

Concerning the quality of viral testbeds, it is sometimes difficult
to assess the "virality" (=ability of a given sample to replicate at
least twice under given constraints) of large "viral zoo" databases,
esp. as some viruses work only under very specific conditions. We are
glad to report that colleagues such as Dr. Vesselin Bontchev, Eugene
Kaspersky, Fridrik Skulason, Igor Muttik and Righard Zwienenberg (to
name only some) helped us with critical and constructive comments to
establish viral testbeds, the residual non-viral part of which should
be very small. We also wish to thank the "WildList Organisation" for
supporting us with their set of In-The-Wild viruses; the related
results may support users in comparing VTC tests with other ITW-only
tests.

3. Products participating in Test "2001-10":
============================================

For test "2001-10", the following *** 24 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) were
tested under DOS, Windows-98, Windows-NT, Windows-2000 and LINUX, in
84 different versions:

Table ES2: List of AV products in test "2001-10"
================================================
Abbreviation/Product/Version              Tested under Platform
----------------------------------------------------------------
ANT = H&B-EDV AntiVir 6.8.0.56            DOS  W-NT  W-98        Linux
AVA = AVAST 7.70-53                       DOS  W-NT  W-98  W2k
AVG = Grisoft AntiVirus 6.0.263           DOS  W-NT  W-98  W2k
AVK = GData AntivirenKit 3.0 (133)        DOS  W-NT  W-98  W2k   Linux
AVP = Kaspersky AntiVirus 3.0 (135)       DOS  W-NT  W-98  W2k   Linux
AVX = AntiVirus eXpert 6.1                     W-NT  W-98
CMD = Command Software 4.61.5             DOS  W-NT  W-98  W2k   Linux
DRW = DrWeb 4.25                          DOS  W-NT  W-98  W2k   Linux
DSE = Dr Solomon Emergency AV                        W-98
FPR = FProt 3.09d                         DOS  W-NT  W-98  W2k
FPW = FProt FP-WIN 3.09d                       W-NT  W-98  W2k
FSE = FSecure AntiVirus 1.00.1251              W-NT  W-98  W2k   Linux
IKA = Ikarus AntiVirus 5.01                    W-NT
INO = Inoculan 6.0.85                          W-NT  W-98  W2k
MCV = Main Channel Tech ViruScan                           W2k   Linux
MR2 = MR2S 1.17                           DOS  W-NT  W-98  W2k
NAV = Norton AntiVirus 4.1.0.6            DOS  W-NT  W-98  W2k
NVC = Norman Virus Control 5.00.25             W-NT  W-98  W2k
PAV = Power AntiVirus 3.0 (131)           DOS  W-NT  W-98  W2k
QHL = QuickHeal 6.02                           W-NT  W-98
RAV = Rumanian AntiVirus 8.02.001         DOS  W-NT  W-98  W2k   Linux
SCN = NAI VirusScan 4.14.0 (4144)         DOS  W-NT  W-98  W2k   Linux
VSP = VirScan Plus 12.22.1                DOS  W-NT  W-98  W2k
-----------------------------------------------------------------------
Products tested:                           14 + 22 + 21 + 18 + 9 = 84
-----------------------------------------------------------------------

For details of the AV products, including the options used to
determine optimum detection rates, see A3SCNLS.TXT. For scanners
where results are missing, see 8problms.txt.

In general, AV products were either submitted or, when test versions
were available on the Internet, downloaded from the respective
ftp/http sites. A few scanners were not available, either in general
(e.g. TNT) or for this test; some of them were announced to
participate in some future test. Finally, a very few AV producers
answered VTC's requests for submitting scanners with electronic
silence.

Concerning often-asked questions, some AV producers deliberately
don't submit their products:

TrendMicro Germany has again recently informed the author of this
report that they are NOT interested in VTC test participation, as
their scanner is deliberately trimmed to on-access scanning and
detection of In-The-Wild viruses. As VTC also emphasizes the
detection of zoo viruses, where their products would produce
"unfavourable results", there is still no intent to submit their
products. Following this clarification, VTC refrains from inviting
TrendMicro for future test participation.

Panda has permitted tests to any institution and university
*except VTC*.

Sophos: Regrettably, we had to ask Sophos, producer of Sophos
AntiVirus (aka Sweep), to refrain from submitting their products, as
VTC does not wish to support an enterprise which deliberately
advocates and practices virus eXchange (vX) (whether rapid or not),
which according to VTC's Code of Conduct is an attitude practiced by
malevolent collectors and authors of malevolent software. Sophos
followed VTC's suggestion but prefers calling this request
"exclusion". So far, there is no indication that Sophos has changed
their attitude concerning "vX".
Recently, Sophos (Germany) has informed the author of this test
report that they intend to take legal action against him to force him
to delete the above statement from the Web.

The following paragraphs survey essential findings in comparison with
the last VTC tests (performance over time), as well as some relative
"grading" of scanners for detection of macro and script viruses, both
in full "zoo" and "In-The-Wild" testbeds, and of macro and script
malware, as well as detection of ITW file and macro viruses in
objects packed with ARJ, LHA, ZIP, RAR, WinRAR and CAB. Finally, the
ability of AV products to avoid False Positive alarms is also
analysed.

Detailed results including precision and reliability of virus and
malware identification (including the grids used to assign a
performance level to each product) are presented in the following
(platform-specific) files:
   for DOS:                    6bdos.txt
   for W32:                    6gwnt.txt, 6fw98.txt, 6hw2k.txt
   comparison of W32 results:  6mcmp32.txt
   for Linux:                  6xlin.txt

In a rather detailed analysis, detection rates are presented for each
platform (operating system), and product behaviour is graded in
comparison with all products tested on the resp. platform:
   evaluation/grading for DOS products:    7evaldos.txt
                      for W-NT products:   7evalwnt.txt
                      for W-98 products:   7evalw98.txt
                      for W-2k products:   7evalw2k.txt
                      for W32 products:    7evalcmp.txt
                      for LINUX products:  7evallin.txt

Under the scope of VTC's grading system, a "Perfect AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99.9% of
    zoo samples, in ALL categories (file, boot and script-based
    viruses), always with the same high precision of identification
    and in every infected sample,
 2) Will detect ALL ITW viral samples in compressed objects for all
    (6) popular packers, and
 3) Will NEVER issue a False Positive alarm on any sample which is
    not viral.
 Remark: detection of "exotic viruses" is presently NOT rated.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) Will be a "Perfect AntiVirus product", that is:
    100% ITW detection AND >99% zoo detection
    AND high precision of identification
    AND high precision of detection
    AND 100% detection of ITW viruses in compressed objects,
    AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software, at
    least in unpacked forms, reliably at high rates (>90%).
 Remark: detection of "exotic malware" is presently NOT rated.

4. A serious problem: Flaw in Microsoft FindFirst/FindNext routine?
===================================================================

Since VTC tests started (including the work of Vesselin Bontchev in
the early 1990s), we have experienced many problems. Products were
often difficult to install and manage (see 8problms.txt). With the
growing size and diversity of testbeds, it was expected that problems
would again grow. But there is one problem which affects not only
testers with large viral databases (a situation which customers will
hardly experience). More often than ever before, we found after
completion of a test run that AV products had NOT TOUCHED all parts
of the directory, for no obvious reason and always without any
diagnosis (no exception etc.). In such cases, we determined those
parts of the testbed which had not been processed and restarted the
product ("postscan").
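The following minimal Python sketch illustrates the "postscan"
procedure described above: it determines which testbed files a
scanner's report never mentions and re-invokes the scanner on that
remainder. The report format and the scanner command line are
placeholders (every product logs differently); this is not part of
any actual VTC tooling.

   import pathlib
   import subprocess

   def untouched_files(testbed_dir: str, report_file: str) -> list[str]:
       """Return testbed files whose path never appears in the report."""
       all_files = {str(p) for p in pathlib.Path(testbed_dir).rglob("*")
                    if p.is_file()}
       report = pathlib.Path(report_file).read_text(encoding="latin-1",
                                                    errors="replace")
       # crude heuristic: a file counts as "touched" if its path occurs
       # somewhere in the scanner report
       return sorted(f for f in all_files if f not in report)

   def postscan(scanner_cmd: list[str], testbed_dir: str,
                report_file: str) -> None:
       """Re-invoke the scanner (placeholder command) on the remainder."""
       remainder = untouched_files(testbed_dir, report_file)
       if remainder:
           subprocess.run(scanner_cmd + remainder, check=False)

   # hypothetical usage (command and log names are placeholders):
   # postscan(["scan.exe", "/report=post.log"], r"D:\testbed\macro",
   #          "first_run.log")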
When, after completion, some part of the remainder was still
untouched, we started a 2nd postscan for it. The most probable reason
for such behaviour of SEVERAL products which otherwise behave
"smoothly" is that the methods offered by Microsoft to traverse a
directory, esp. the routines FindFirst and FindNext, DON'T WORK
RELIABLY on large directories. This effect was first reported by
Eugene Kaspersky, but we have seen NO IMPROVEMENT OR CORRECTION.
Evidently, this problem seems to be related to the invocation of
those routines (FF/FN), and this may be affected by some compiler or
assembler. Only through extensive inspection of the resulting test
logs ("test quality assurance") could we reduce the impact of this MS
flaw on our test results: but it is "NOT NATURAL" that anyone must
start an AV product more than once to be sure that ALL potentially
malicious objects have been checked!

This problem may not only show its dirty face for large virus/malware
testbeds. With growing sizes of customer directories, the likelihood
grows that NOT ALL OBJECTS are touched by any method using FF/FN. As
this is a problem also for many companies with large directories, WE
STRONGLY REQUEST THAT MICROSOFT CURE THIS FLAW.

5. Results of AV products under DOS:
====================================

This is a summary of the essential findings for AV/AM products under
DOS. For details see 7evaldos.txt.

Meant as a perspective of product results, the following table
(DOS-A2) lists all results of DOS scanners for zoo detection of macro
and script viruses in the last 10 VTC tests. Moreover, differences
("delta") in detection rates for those products which participated in
the last 2 tests are also given, and mean values are calculated.

Table DOS-A2: Macro/Script Virus Detection Rate in last 10 VTC tests under DOS:
===============================================================================
     ------------------ Macro Virus Detection --------------- + ScriptVirus Detection
SCAN 9702 9707 9802 9810 9903 9909 0004 0008 0104 0110 Delta I 0008 0104 0110 Delta
NER    %    %    %    %    %    %    %    %    %    %    %   I   %    %    %    %
---------------------------------------------------------------+----------------------
ALE  96.5 66.0 49.8   -    -    -    -    -    -    -     -   I   -    -    -     -
ANT  58.0 68.6 80.4 56.6   -    -  85.9 93.3   -  97.0    -   I 55.2   -  81.8    -
AVA  99.3 98.2 80.4 97.2 95.9 94.6 93.7   -  92.0 93.0  +1.0  I   -  30.0 33.7  +3.7
AVG  25.2 71.0 27.1 81.6 82.5 96.6   -    -  98.3 98.4  +0.1  I   -  57.9 62.9  +5.0
AVK    -    -    -  99.7 99.6   -    -  100~ 100~ 100%   0.0  I 91.5 99.4 100%  +7.9
AVP  99.3 99.0 99.9 100% 99.8 100% 99.9   -  100~ 100%   0.0  I   -  99.8 100%  +0.2
CMD    -    -    -    -    -  99.5 100% 100~   -  100~    -   I 93.5   -  93.9    -
DRW  90.2 98.1 94.3 99.3 98.3   -  98.4 97.6 98.0 99.5  +1.5  I 60.8 95.6 95.4  -0.2
DSE  97.9 98.9 100% 100% 100%   -    -    -    -    -     -   I   -    -    -     -
FMA  98.6 98.2 99.9   -    -    -    -    -    -    -     -   I   -    -    -     -
FPR  43.4 36.1 99.9 99.8 99.8 99.7 100% 100~ 100% 100~   0.0  I 90.5 96.9 94.6  -2.3
FSE    -    -  99.9 90.1 99.6 97.6 99.9   -    -    -     -   I   -    -    -     -
FWN  97.2 96.4 91.0 85.7   -    -    -    -    -    -     -   I   -    -    -     -
HMV    -    -  98.2 99.0 99.5   -    -    -    -    -     -   I   -    -    -     -
IBM  65.0 88.8 99.6   -    -    -    -    -    -    -     -   I   -    -    -     -
INO    -    -  90.3 95.2 99.8 99.5 99.7 99.7 99.3   -     -   I 77.8 66.0   -     -
IRS    -  69.5 48.2   -  89.1   -    -    -    -    -     -   I   -    -    -     -
ITM    -  81.8 58.2 68.6 76.3   -    -    -    -    -     -   I   -    -    -     -
IVB    -    -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
MR2    -    -    -    -    -  69.6   -    -  44.2 40.8  -3.4  I   -  85.1 83.3  -1.8
NAV  80.7 86.4 98.7 99.8 99.7 98.6 97.4 97.0 93.8 99.5  +5.7  I 24.8 31.2 94.2 +63.0
NOD    -    -    -    -  99.8 100% 99.4   -    -    -     -   I   -    -    -     -
NVC  13.3 96.6 99.2 90.8   -  99.6 99.9 99.9 99.8   -     -   I 83.7 88.5   -     -
PAN    -    -  73.0   -    -    -    -    -    -    -     -   I   -    -    -     -
PAV    -    -  93.7 100% 99.5 98.8 99.9   -  100~ 100%   0.0  I   -  99.8 100%  +0.2
PCC    -  67.6   -    -    -    -    -    -    -    -     -   I   -    -    -     -
PCV    -    -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
PRO    -    -    -    -  81.5   -    -    -    -    -     -   I   -    -    -     -
RAV    -    -    -  99.5 99.2   -    -    -    -  99.5    -   I   -    -  82.5    -
SCN  95.1 97.6 99.0 98.6 100% 100% 100% 100~ 100% 100%   0.0  I 85.6 100% 99.8  -0.2
SWP  87.4 89.1 98.4 98.6   -  98.4 98.4   -    -    -     -   I   -    -    -     -
TBA  72.0 96.1 99.5 98.7   -    -    -    -    -    -     -   I   -    -    -     -
TSC    -    -  81.9 76.5 59.5 69.6   -    -    -    -     -   I   -    -    -     -
TNT    -    -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
VDS  16.1  9.9  8.7   -    -    -    -    -    -    -     -   I   -    -    -     -
UKV    -    -    -    -    -    -    -   0.0   -    -     -   I  0.0   -    -     -
VET    -  94.0 97.3 97.5 97.6   -    -    -    -    -     -   I   -    -    -     -
VIT    -    -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
VRX    -    -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
VBS    -    -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
VHU    -    -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
VSA    -    -  80.6   -    -    -    -    -    -    -     -   I   -    -    -     -
VSP    -    -    -    -    -    -    -    -   0.0  0.0   0.0  I   -  85.3 84.0  -1.3
VSW    -    -  83.0   -    -    -    -    -    -    -     -   I   -    -    -     -
VTR   6.3   -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
XSC    -    -    -    -    -    -    -    -    -    -     -   I   -    -    -     -
---------------------------------------------------------------+----------------------
Mean 69.6 80.9 83.8 89.6 93.6 88.2 98.0 98.6 93.8 87.7% +0.4%  I 66.4 79.7 86.2% +6.7%
Without extreme values:                            (94.4%)     I               (+1.1%)
---------------------------------------------------------------+----------------------
Remark: Concerning rounding, "100~" denotes that the result is 100%
only AFTER rounding, whereas "100.0%" denotes that the result is
100.0% EXACTLY (rated "perfect").

Concerning macro viruses, the "mean" detection rate is further
declining (down to 87.7% from 93.8%). Even if one does not count
scanners with extremely low detection rates (<30%), the mean
detection rate of 94.4% is unacceptably low. On the other hand,
products which participated in the last VTC tests succeeded in
raising their detection rates (+0.4% in the mean).

Concerning script viruses, presently the fastest growing sector, the
detection rate is improving (now 86.2% in the mean) but is still
regarded as insufficient. One scanner improved its detection rate by
an impressive 63%; the other (10) scanners which participated in the
last test succeeded in improving their detection rate by 0.4% "in the
mean".

Findings DOS.1: General development of macro/script zoo detection rates:
------------------------------------------------------------------------
For DOS, macro zoo virus detection rates in the mean need improvement,
but several products detect ALL or almost all macro zoo viruses.
The situation is worse for script virus detection.
Overall, 3 products detect ALL macro AND script zoo viruses and are "Overall Perfect": AVK,AVP,PAV And 1 product is rated "excellent" as it detects >99% of all zoo viruses: SCN Findings DOS.2: Development of ITW macro/script virus detection rates: ---------------------------------------------------------------------- 6 AV products (out of 14) detect ALL In-The-Wild macro and script viruses in ALL instantiations (files): AVK,AVP,DRW,NAV,PAV,SCN 8 products can be rated "perfect" concerning detection of ITW macro viruses: AVG,AVK,AVP,CMD,DRW,FPR,NAV,PAV,SCN And 6 products are rated "perfect" as they detect ALL ITW script viruses: AVK,AVP,DRW,NAV,PAV,SCN Findings DOS.3: Assessment of overall (ITW/zoo) detection rates: ---------------------------------------------------------------- 3 "perfect" overall scanners: AVK,AVP,PAV 1 "excellent" overall scanner: SCN Finding DOS.4: Performance of DOS scanners by virus classes: ------------------------------------------------------------ Perfect scanners for macro zoo+ITW: AVK,AVP,PAV,SCN Excellent scanners for macro zoo+ITW: CMD,FPR,DRW,NAV,RAV Perfect scanners for script zoo+ITW: AVK,AVP,PAV Excellent scanners for script zoo+ITW: SCN Findings DOS.5: Detection of packed viral (ITW) objects: -------------------------------------------------------- Detection of packed viral objects needs improvement Perfect packed macro virus DOS detectors: AVK,AVP,CMD,FPR,PAV,SCN Excellent packed macro virus DOS detector: --- Findings DOS.6: Avoidance of False Alarms: ------------------------------------------ Avoidance of False-Positive Alarms is improving though still regarded insufficient. FP-avoiding perfect DOS scanners: ANT,AVA,AVG,AVK,PAV,SCN Findings DOS.7: Detection rates for file/macro malware: ------------------------------------------------------- Macro and Script Malware detection under DOS is impressively improving: 2 products detect ALL macro and script malware samples and are "perfect": AVP,SCN 2 products are rated "excellent": AVK,PAV Concerning macro malware detection: 2 products detect ALL macro malware samples and are rated "perfect": AVP,SCN 6 products are rated "excellent": AVK,PAV,FPR,CMD,RAV,DRW Concerning script malware detection: 4 products detect ALL script malware samples and are rated "perfect": AVP,AVK,PAV,SCN 0 product is rated "excellent": --- Grading DOS products according to their detection performance: ============================================================== Test category: "Perfect" "Excellent" ----------------------------------------------------------------- DOS zoo macro test: AVK,AVP,PAV,SCN CMD,FPR,DRW,NAV,RAV DOS zoo script test: AVK,AVP,PAV SCN DOS ITW tests: AVK,AVP,DRW,NAV,PAV,SCN --- DOS pack-tests: AVK,AVP,CMD,FPR,PAV,SCN --- DOS FP avoidance: ANT,AVA,AVG,AVK,PAV,SCN --- ----------------------------------------------------------------- DOS Macro Malware Test: AVP,SCN AVK,PAV,FPR,CMD,RAV,DRW DOS Script Malware Test: AVP,AVK,PAV,SCN --- ----------------------------------------------------------------- ********************************************************************* In VTC test "2001-10", we found ** 2 perfect DOS AV products: AVK,PAV ********************************************** but we found **** No perfect DOS AM product **** ********************************************************************* In order to support the race for more customer protection, we evaluate the order of performance in this DOS test with a simple algorithm, by counting the majority of places (weighing "perfect" twice and "excellent" once), for 
the first places: ************************************************************ "Perfect" DOS AntiVirus product: AVK,PAV (10 points) ************************************************************ "Excellent" DOS AV products: 3rd place: SCN ( 9 points) 4th place: AVP ( 8 points) 5th place: CMD,DRW,FPR,NAV ( 3 points) 9th place: ANT,AVA,AVG ( 2 points) 12th place: RAV ( 1 point ) ************************************************************ "Perfect" DOS AntiMalware product: =NONE= (14 points) "Excellent" DOS AntiMalware product: 1st place: AVK,PAV,SCN (13 points) 4th place: AVP (12 points) 5th place: DRW ( 5 points) 6th place: FPR,CMD ( 4 points) 8th place: RAV ( 2 points) ************************************************************ 6. Results of on-demand detection under Windows-NT: =================================================== This is a summary of the essential findings for AV/AM products under W-NT. For details see 7evalwnt.txt. Meant as a perspective of product results, the following table (WNT-A2) lists all results of WNT scanners for zoo detection of file, macro and script viruses, in last 9 VTC tests. Moreover, differences ("delta") in resp. detection rates for those products which participated in last 2 tests are also given, and mean values are calculated. Table WNT-A2: Macro&Script Virus Detection Rate in last 9 VTC tests under W-NT: =============================================================================== Scan =========== Macro Virus Detection ================ + =Script Virus Detection= ner 9707 9802 9810 9903 9909 0004 0008 0104 0110 Delta I 0008 0104 0110 Delta ---------------------------------------------------------+------------------------- ANT 92.2 - 85.7 - 89.3 90.2 96.4 - 97.1 - I 55.2 - 81.8 - ANY - - 70.5 - - - - - - - I - - - - ATD - - - - - 99.9 - - - - I - - - - AVA - 91.9 97.2 95.2 93.3 94.3 94.1 95.7 97.7 +2.0 I 15.0 29.1 29.6 +0.5 AVG - - - 82.5 96.6 97.5 97.9 98.3 98.4 +0.1 I 45.8 57.9 62.9 +5.0 AVK - - 99.6 99.6 100% 99.9 100~ 100~ 100% 0.0 I 91.8 99.8 100% +0.2 AVP - - 100% 99.2 100% 99.9 100~ 100~ 100~ 0.0 I 88.2 99.8 100% +0.2 AVX - - - - - - 99.0 - 99.1 - I 61.4 - 70.1 - AW - 61.0 - - - - - - - - I - - - - CLE - - - - - - - - - - I 4.2 - - - CMD - - - - - 100% 100% 100% 100~ 0.0 I 93.5 96.9 93.9 -3.0 DRW/DWW - - - 98.3 98.8 98.4 97.5 - 99.5 - I 59.8 - 95.4 - DSS/E 99.0 100% 100% 100% - - - - - - I - - - - ESA - - - - - 88.9 - - - - I - - - - FPR/FMA - 99.9 99.8 99.8 99.7 - - 100% 100~ 0.0 I - 97.1 94.9 -2.2 FPW - - - - 99.7 100% 100% 100% 100~ 0.0 I 90.8 96.9 94.6 -2.3 FSE - - 99.9 100% 100% 100% 100% 100% 100% 0.0 I 96.7 100% 100% 0.0 FWN - - 99.6 99.7 - 99.9 - - - - I - - - - HMV - - 99.0 99.5 - - - - - - I - - - - IBM 92.9 92.6 98.6 * * * * * * * I * * * * IKA - - - - - - - - 95.4 - I - - 77.7 - INO - 89.7 - 99.8 99.7 99.7 99.8 99.7 99.9 +0.2 I 78.1 92.7 95.1 +2.4 IRS - 99.1 - 99.5 - - - - - - I - - - - IVB - - 92.8 95.0 - - - - - - I - - - - MKS - - - - - 97.1 - - - - I - - - - MR2 - - - - 69.6 - - - 0.7 - I - - 83.3 - NAV 95.6 98.7 99.9 99.7 98.7 98.0 97.7 97.0 99.5 +2.5 I 36.6 54.5 94.2 +39.7 NOD - - - 99.8 100% 99.4 - - - - I - - - - NVC 96.6 99.2 - 98.9 98.9 99.9 99.9 99.8 99.8 0.0 I 83.7 88.5 91.3 +2.8 NVN - - - - 99.5 - - - - - I - - - - PAV 93.5 98.8 99.5 99.4 99.7 99.9 100~ 100~ 100% 0.0 I 90.2 99.8 100% +0.2 PCC - 94.8 - - - - - - - - I - - - - PER - 91.0 - - - - 85.0 68.2 - - I 0.0 22.0 - - PRO - - - 58.0 61.9 67.4 69.1 67.1 - - I 13.1 35.8 - - QHL - - - - - 0.0 - 0.0 - I 6.9 - 0.2 - RAD - - - - - - - - 99.5 - I - - 82.5 - RAV - 98.9 
     99.5 99.2   -  97.9 96.9 99.6 99.5  -0.1  I 47.1 84.9  0.2    -
RA7    -    -    -  99.2   -    -    -    -    -    -    I   -    -    -    -
SCN  97.6 99.1 97.7 100% 100% 100% 100% 100% 100%  0.0   I 95.8 100% 99.8 -0.2
SWP  89.1 98.4 97.5   -  98.4 98.6   -    -    -    -    I   -    -    -    -
TBA  96.1   -  98.7   *    *    *    *    *    *    *    I   *    *    *    *
TNT    -    -  44.4   *    *    *    *    *    *    *    I   *    *    *    *
VET    -  94.0   -  94.9   *    *    *    *    *    *    I   *    *    *    *
VSA    -  84.4   -    -    -    -    -    -    -    -    I   -    -    -    -
VSP    -    -    -  86.7  0.3  0.0   -   0.0  0.0  0.0   I   -  85.3 84.0 -1.3
---------------------------------------------------------+--------------------------
Mean 94.7 95.9 91.6 95.3 95.1 96.5 96.3 95.3 85.8 +0.3%   I 57.7 78.9 78.7 +3.3%
Without extreme results:                          (99.2)  I           (86.6) (+0.3)
---------------------------------------------------------+--------------------------
Remark: for abbreviations of products (code names), see appendix A5CodNam.txt.

Concerning macro viruses, the "mean" detection rate is significantly
reduced (by almost 10%), essentially due to some (new) products with
extremely low detection rates (<1%); when these (3) products are left
out, the mean detection rate is "excellent" (99.2%). Those (15)
products which also participated in the last test succeeded in
improving their detection rates slightly (+0.3%).

Concerning script viruses, presently the fastest growing sector, the
detection rate is stable at a very low level (78.7% mean). Again,
this result is influenced by some "newcomers" with extremely low
detection rates, whose insufficient detection rates cannot be
balanced by the surprisingly strong improvement of NAV (+35%
detection, to now 94.2%!). Without counting those extremely bad
detectors, the mean detection rate would be 86.6%. Those (14)
scanners which participated in the last test succeeded in improving
their detection rates again (+3.3%), but if one does not count the
unusually large improvement of NAV, the mean improvement is "normal"
(+0.3%).

Findings WNT.1: General development of macro/script zoo detection rates:
------------------------------------------------------------------------
For W-NT, macro and script zoo virus detection is further improving,
but significant work must be invested into script virus detection.
Now, 3 WNT products detect ALL zoo macro and script viruses
"perfectly":                                     AVK,FSE,PAV
1 WNT product detects >99% of zoo macro and script viruses
"excellently":                                   SCN

Findings WNT.2: Development of ITW macro/script virus detection rates:
----------------------------------------------------------------------
9 AV products (out of 22) detect ALL In-The-Wild macro and script
viruses in >99.9% of files and are rated "perfect":
             AVK,AVP,AVX,DRW,FSE,INO,NAV,PAV,SCN
13 products can be rated "perfect" concerning detection of ITW macro
viruses:     AVG,AVK,AVP,AVX,CMD,DRW,FPR,FPW,FSE,INO,NAV,PAV,SCN
10 products can be rated "perfect" concerning detection of ITW script
viruses:     AVK,AVP,AVX,DRW,FSE,INO,NAV,NVC,PAV,SCN

Findings WNT.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
3 WNT products are overall rated "perfect":      AVK,FSE,PAV
2 "excellent" overall scanners:                  AVP,SCN
2 "very good" overall scanners:                  DRW,INO

Findings WNT.4: Performance of WNT scanners by virus classes:
-------------------------------------------------------------
4 "Perfect" scanners for macro zoo:    AVK,FSE,PAV,SCN
7 "Excellent" scanners for macro zoo:  AVP,CMD,FPR,FPW,INO,NAV,AVX
3 "Perfect" scanners for script zoo:   AVK,AVP,PAV
2 "Excellent" scanners for script zoo: FSE,SCN

Findings WNT.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Detection of packed viral objects is improving.
8 "Perfect" packed macro virus WNT detectors:   AVK,AVP,AVX,CMD,FPR,FPW,PAV,SCN
0 "Excellent" packed macro virus WNT detectors: ---
2 "Very Good" packed macro virus WNT detectors: AVG,DRW

Findings WNT.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False-Positive Alarms is improving though still regarded
insufficient.
7 FP-avoiding "perfect" W-NT scanners:  ANT,AVA,AVG,AVK,INO,RAD,SCN

Findings WNT.7: Detection rates for macro & script malware:
-----------------------------------------------------------
Macro & script malware detection under WNT is slowly improving. Macro
malware detection is significantly better developed than script
malware detection.
Now, 2 macro & script malware detectors are "perfect":  PAV,SCN
And 9 products are rated "excellent" (>90%):
             AVK,AVP,CMD,FSE,FPR,FPW,NVC,RAD,AVX
Concerning macro malware detection only, 2 products are rated
"perfect":   PAV,SCN
And 10 more products are rated "excellent":
             AVK,AVP,CMD,FSE,FPR,FPW,NVC,RAD,INO,AVX
Concerning script malware detection only, 5 products are rated
"perfect":   AVK,AVP,FSE,PAV,SCN
And NO product is rated "excellent" (>90% detection).
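The per-platform rankings below (for W-NT here, and analogously for
W-98 and W-2k) reuse the simple point-count scheme introduced in the
DOS section, where "perfect" is weighed twice and "excellent" once
per test category. A minimal Python sketch of that basic weighting
follows; the published point totals may reflect further details not
spelled out in this summary.

   WEIGHT = {"perfect": 2, "excellent": 1}  # "perfect" counts twice

   def points(ratings: dict[str, str]) -> int:
       """ratings maps a test category (e.g. 'zoo macro', 'ITW', 'pack',
       'FP avoidance', ...) to 'perfect', 'excellent' or '-'."""
       return sum(WEIGHT.get(r, 0) for r in ratings.values())

   # hypothetical product rated in five categories:
   print(points({"zoo macro": "perfect", "zoo script": "excellent",
                 "ITW": "perfect", "pack": "-",
                 "FP avoidance": "perfect"}))   # -> 7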
Grading WNT products according to their detection performance:
===============================================================

Test category:            "Perfect"                "Excellent"
-----------------------------------------------------------------
WNT zoo macro test:       AVK,FSE,PAV,SCN          AVP,AVX,CMD,
                                                   FPR,FPW,INO,NAV
WNT zoo script test:      AVK,AVP,PAV              FSE,SCN
WNT ITW tests:            AVK,AVP,AVX,DRW,         ---
                          FSE,INO,NAV,PAV,SCN
WNT pack-tests:           AVK,AVP,AVX,CMD,FPR,     ---
                          FPW,PAV,SCN
WNT FP avoidance:         ANT,AVA,AVG,AVK,INO,     ---
                          RAD,SCN
-----------------------------------------------------------------
WNT Macro Malware Test:   PAV,SCN                  AVK,AVP,CMD,FSE,FPR,
                                                   FPW,NVC,RAD,INO,AVX
WNT Script Malware Test:  AVK,AVP,FSE,PAV,SCN      ---
-----------------------------------------------------------------

*********************************************************************
 In VTC test "2001-10", we found
      ** 1 perfect WNT AV product:  AVK
 **********************************************
 but we found
      **** No perfect WNT AM product ****
*********************************************************************

************************************************************
"Perfect" WNT AV product:      AVK                (10 points)
************************************************************
"Excellent" WNT AV products:
   2nd place:  SCN                                ( 9 points)
   3rd place:  PAV                                ( 8 points)
   4th place:  AVP                                ( 7 points)
   5th place:  AVX,FSE,INO                        ( 5 points)
   8th place:  CMD,FPR,FPW,DRW,NAV                ( 3 points)
  13th place:  ANT,AVA,AVG                        ( 2 points)
  16th place:  RAD/RAV                            ( 1 point )
************************************************************
"Perfect" WNT AntiMalware product:   =NONE=       (14 points)
"Excellent" WNT AntiMalware products:
   1st place:  AVK,SCN                            (13 points)
   3rd place:  PAV                                (12 points)
   4th place:  AVP                                (11 points)
   5th place:  FSE                                ( 8 points)
   6th place:  AVX,INO                            ( 6 points)
   8th place:  CMD,FPR,FPW                        ( 4 points)
  11th place:  RAD/RAV                            ( 2 points)
************************************************************

7. Results of on-demand detection under Windows-98:
===================================================

This is a summary of the essential findings for AV/AM products under
W-98. For details see 7evalw98.txt.

Meant as a perspective of product results, the following table
(W98-A2) lists all results of W98 scanners for zoo detection of macro
and script viruses in the last 7 VTC tests. Moreover, differences
("delta") in the resp. detection rates for those products which
participated in the last 2 tests are also given, and mean values are
calculated.
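The "Delta" and "Mean" rows in the following tables are plain
arithmetic: Delta is the change against the previous test for
products present in both tests, and Mean averages the current column,
optionally leaving out extremely low detectors (thresholds such as
<30% are used above). A minimal Python sketch with hypothetical
figures:

   def delta(current: dict[str, float],
             previous: dict[str, float]) -> dict[str, float]:
       # change against the previous test, for products present in both
       return {p: round(current[p] - previous[p], 1)
               for p in current if p in previous}

   def mean(column: dict[str, float], drop_below=None) -> float:
       # column average, optionally leaving out extremely low detectors
       rates = [r for r in column.values()
                if drop_below is None or r >= drop_below]
       return round(sum(rates) / len(rates), 1)

   # hypothetical figures for three products over two tests:
   prev = {"AAA": 96.9, "BBB": 99.8}
   curr = {"AAA": 94.6, "BBB": 100.0, "CCC": 0.7}
   print(delta(curr, prev))            # -> {'AAA': -2.3, 'BBB': 0.2}
   print(mean(curr), mean(curr, 30))   # -> 65.1 97.3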
Table W98-A2: Comparison: Macro/Script Virus Detection Rate in last 7 VTC tests under W-98: =========================================================================================== Scan ------------- Macro Virus Detection ------------ + --ScriptVirusDetection-- ner 98/10 99/03 99/09 00/04 00/08 01/04 01/10 DELTA I 00/08 01/04 01/10 DELTA -------------------------------------------------------+------------------------- ACU - 97.6 - - - - - - I - - - - ADO - - - - - 99.9 - - I - 99.8 - - AN5 - - 89.3 - - - - - I - - - - ANT 84.3 - 89.5 90.2 96.4 - 97.4 - I 55.2 - 99.1 - ANY 70.7 - - - - - - - I - - - - ATR - - - - - - - - I - 2.7 - - AVA/3 96.7 95.9 93.9 94.3 94.1 95.7 97.7 +2.0 I 15.0 30.0 33.7 +3.7 AVG - 82.5 96.6 97.5 97.9 98.3 98.4 +0.1 I - 57.9 62.9 +5.0 AVK 99.6 99.6 100.0 99.9 100~ 100~ 100% 0.0 I 91.2 99.8 100% +0.2 AVP 100.0 99.2 100.0 99.9 100~ 100~ 100~ 0.0 I 88.2 99.8 100% +0.2 AVX - - 98.7 94.5 99.0 - 99.1 - I 61.4 - 70.1 - CLE - - - - - 0.0 - - I 4.2 6.3 - - CMD - - 99.6 100.0 100% 100% 100~ 0.0 I 93.5 96.9 93.9 -3.0 DRW - 98.3 98.8 98.4 - 98.0 99.5 +1.5 I - 95.6 95.4 - DSE 100.0 100.0 * 100.0 100% 99.9 97.8 -2.1 I 95.8 100% 73.0 -27.0 ESA - - - 88.9 - - - - I - - - - FPR 92.4 99.8 99.7 100.0 - 100% 100~ 0.0 I - 96.9 94.6 -2.3 FPW - - 99.9 100.0 100% 100% 100~ 0.0 I 90.8 96.9 94.6 -2.3 FSE 100.0 100.0 100.0 100.0 100% 100% 100% 0.0 I 96.7 100% 100% 0.0 FWN 99.6 99.7 99.9 99.8 - - - - I - - - - HMV - 99.5 - - - - - - I - - - - IBM 94.5 * * * * * * * I * * * * INO 88.1 99.8 98.1 99.7 99.8 99.7 99.9 +0.2 I 78.1 92.7 95.1 +2.4 IRS 99.0 99.5 - - - - - - I - - - - ITM - - - - - - - - I - - - - IVB 92.8 95.0 - - - - - - I - - - - MKS - - - 97.1 - 44.2 - - I - - - - MR2 - - 64.9 - - - 40.8 - I - 85.1 83.3 -1.8 NAV 95.3 99.7 98.7 98.0 97.7 97.0 99.5 +2.5 I 36.6 65.5 94.2 +29.7 NOD - 99.8 100.0 99.4 - - - - I - - - - NV5 - - 99.6 - - - - - I - - - - NVC - 99.1 99.6 99.9 99.9 99.8 99.8 0.0 I 83.7 88.5 91.3 +2.8 PAV 99.5 99.5 86.7 99.9 100~ 99.5 100% +0.5 I 90.2 99.8 100% +0.2 PCC - 98.0 - - - - - - I - - - - PER - - - 53.7 67.2 68.5 - - I 18.0 22.0 - - PRO - 58.0 61.9 67.4 69.1 67.1 - - I 12.1 40.7 - - QHL - - - 0.0 - 0.0 0.0 0.0 I 6.9 - - - RAV 92.2 - 98.1 97.9 96.9 99.6 99.5 -0.1 I 47.1 84.9 82.5 -2.4 SCN 97.7 100.0 99.8 100.0 100% 100% 100% 0.0 I 95.8 100% 99.8 -0.2 SWP 98.6 - 98.5 98.6 - - - - I - - - - TBA 98.7 * * * * * * * I - - - - TSC - 76.5 64.9 - - - - - I - - - - VBS 41.5 - - - - - - - I - - - - VBW 93.4 - - - - - - - I - - - - VET - 97.6 * * * * - - I - - - - VSP - 0.4 0.3 - - 0.0 0.0 0.0 I - 85.3 84.0 -1.3 -------------------------------------------------------+------------------------- Mean 92.1 90.3 93.5 95.0 95.6 84.7 87.1% +0.3% I 61.0 76.0 86.5% +0.2% Without extreme low detectors: (96.3%) I -------------------------------------------------------+------------------------- Remark: for abbreviations of products (code names), see appendix A5CodNam.txt. The number of scanners in test has grown to 22. But as some new products reached very low detection rates, mean detection rates for macro and script virus detection are significantly reduced. Concerning macro viruses, "mean" detection rate is significantly reduced (esp. due to several products with very insufficient detection rates), but those products which participated in last test on a still acceptable level (96.3%) further improved their detection rates slightly (+0.3%). Concerning script viruses which is presently the fastest growing sector, the detection rate is significantly improved though still low (86.5% mean). 
Those scanners which participated in the last test improved their
detection rates further by 0.2% "in the mean".

Findings W98.1: General development of macro/script zoo detection rates:
------------------------------------------------------------------------
For W-98, macro and script zoo virus detection rates are increasing
in the mean but still insufficient; all rated products MUST detect
ITW viruses "perfectly" (100%).
Now, 3 W-98 products detect ALL zoo macro and script viruses
"perfectly" (100% detection rate):               AVK,FSE,PAV
2 products detect >99% of macro & script zoo viruses and are rated
"excellent":                                     AVP,SCN

Findings W98.2: Development of ITW file/macro/script virus detection rates:
---------------------------------------------------------------------------
9 AV products (out of 21) detect ALL In-The-Wild macro & script
viruses in ALL files and are rated "perfect ITW scanners":
             AVK,AVP,AVX,DRW,FSE,INO,NAV,PAV,SCN
Concerning ITW macro virus detection, 13 products detect ALL viruses
in ALL files:
             AVG,AVK,AVP,AVX,CMD,DRW,FPR,FPW,FSE,INO,NAV,PAV,SCN
Concerning ITW script virus detection, 10 products detect ALL viruses
in ALL files:
             AVK,AVP,AVX,DRW,FSE,INO,NAV,NVC,PAV,SCN

Findings W98.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
Now, 3 W98 products are overall rated "perfect":  AVK,FSE,PAV
2 products are rated "excellent":                 AVP,SCN
2 products are rated "very good":                 DRW,INO

Findings W98.4: Performance of W98 scanners by virus classes:
-------------------------------------------------------------
Concerning overall macro virus detection:
  4 products are rated "perfect":    AVK,FSE,PAV,SCN
  8 products are rated "excellent":  AVP,CMD,FPR,FPW,INO,DRW,NAV,AVX
  1 product is rated "very good":    AVA
Concerning overall script virus detection:
  2 products are rated "perfect":    AVK,AVP
  3 products are rated "excellent":  FSE,PAV,SCN
  2 products are rated "very good":  INO,DRW

Findings W98.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Progress in detection of packed viral objects:
  7 "Perfect" packed macro virus W98 detectors:   AVK,AVP,CMD,FPR,FPW,PAV,SCN
  0 "Excellent" packed macro virus W98 detectors: ---
  3 "Very Good" packed macro virus W98 detectors: AVG,DRW,INO

Findings W98.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False-Positive Alarms needs further work, as many
scanners have significant alarm rates.
7 FP-avoiding perfect W-98 scanners:  AVA,AVG,AVK,DSE,INO,RAV,SCN

Findings W98.7: Detection rates for file/macro malware:
-------------------------------------------------------
Macro/script malware detection under W-98 is slowly improving.
2 macro/script malware detectors are rated "perfect":    PAV,SCN
3 macro/script malware detectors are rated "excellent":  FSE,AVK,AVP
2 macro/script malware detectors are rated "very good":  RAV,DSE
2 macro malware detectors are rated "perfect":     PAV,SCN
11 macro malware detectors are rated "excellent":  AVK,AVP,CMD,FPR,FPW,
                                                   FSE,NVC,RAV,INO,AVX,DRW
5 macro malware detectors are rated "very good":   ANT,AVA,NAV,DSE,AVG
Script malware detection is underdeveloped:
5 script malware detectors are rated "perfect":    AVK,AVP,FSE,PAV,SCN
0 script malware detectors are rated "excellent":  ---
2 script malware detectors are rated "very good":  DSE,RAV

Grading W98 products according to their detection performance:
==============================================================

Test category:            "Perfect"                    "Excellent"
--------------------------------------------------------------------
W98 zoo macro test:       AVK,FSE,PAV,SCN              AVP,AVX,CMD,DRW,
                                                       FPR,FPW,INO,NAV
W98 zoo script test:      AVK,AVP                      FSE,PAV,SCN
W98 ITW tests:            AVK,AVP,AVX,DRW,             -----
                          FSE,INO,NAV,PAV,SCN
W98 pack-tests:           AVK,AVP,CMD,FPR,FPW,PAV,SCN  -----
W98 FP avoidance:         AVA,AVG,AVK,DSE,INO,RAV,SCN  -----
-----------------------------------------------------------------
W98 Macro Malware Test:   PAV,SCN                      FSE,AVK,AVP,CMD,FPR,
                                                       FPW,NVC,RAV,INO,AVX,DRW
W98 Script Malware Test:  AVK,AVP,FSE,PAV,SCN          -----
-----------------------------------------------------------------

*********************************************************************
 In VTC test "2001-10", we found
      ** 1 perfect W98 AV product:  AVK
 **********************************************
 but we found
      **** No perfect W98 AM product ****
*********************************************************************

************************************************************
"Perfect" W-98 AntiVirus product:
   1st place:  AVK                                (10 points)
************************************************************
"Excellent" W-98 AntiVirus products:
   2nd place:  SCN                                ( 9 points)
   3rd place:  AVP,PAV                            ( 7 points)
   5th place:  FSE,INO                            ( 5 points)
   7th place:  AVX,CMD,DRW,FPR,FPW,NAV            ( 3 points)
  13th place:  AVA,AVG,DSE,RAV                    ( 2 points)
************************************************************
"Perfect" W-98 AntiMalware product:  =NONE=       (14 points)
"Excellent" W-98 AntiMalware products:
   1st place:  AVK,SCN                            (13 points)
   3rd place:  PAV                                (11 points)
   4th place:  AVP                                (10 points)
   5th place:  FSE                                ( 8 points)
   6th place:  INO                                ( 6 points)
   7th place:  AVX,CMD,DRW,FPR,FPW                ( 4 points)
  12th place:  RAV                                ( 3 points)
************************************************************

8. Results of on-demand detection under Windows-2000 (W2k):
===========================================================

This is a summary of the essential findings for AV/AM products under
W-2k. For details see 7evalw2k.txt.

Meant as a perspective of product results, the following table
(W2k-A) lists all results of W2k scanners for zoo detection of
(file), macro and script viruses in the last 3 VTC tests. Moreover,
differences ("delta") in the resp. detection rates for those products
which participated in the last 2 tests are also given, and mean
values are calculated.
Table W2k-A: Comparison: File/Macro/Script Virus Detection Rate:
================================================================
Scan I = File Virus =  +  ======= Macro Virus =======  +  ===== Script Virus ======
ner  I   Detection     I          Detection            I          Detection
-----+-----------------+-------------------------------+----------------------------
Test I  0104   Delta   I  0008    0104    0110   Delta I  0008   0104   0110   Delta
-----+-----------------+-------------------------------+----------------------------
ANT  I    -      -     I  93.3      -       -      -   I  53.9     -      -      -
AVA  I  95.0     -     I  94.1    95.7    97.7   +2.0  I  15.0   29.1   29.6   +0.5
AVG  I  81.9     -     I  97.9    98.3    98.4   +0.1  I  45.8   57.9   62.9   +5.0
AVK  I  99.8     -     I 100.0~  100.0~  100.0%   0.0  I  91.5   99.8  100.0%  +0.2
AVP  I  99.9     -     I 100.0~  100.0~  100.0~   0.0  I  88.2   99.8  100.0%  +0.2
AVX  I    -      -     I  99.0      -       -      -   I  61.4     -      -      -
CLE  I    -      -     I    -       -       -      -   I   4.2     -      -      -
CMD  I  97.8     -     I 100.0%  100.0%  100.0~   0.0  I  93.5   96.9   93.2   -3.7
DRW  I    -      -     I  97.5      -     99.5     -   I  59.8     -    95.4     -
FPR  I  97.8     -     I    -    100.0%  100.0~    -   I    -    96.9   94.6   -2.3
FPW  I  97.8     -     I 100.0%  100.0%  100.0~   0.0  I  90.8   96.9   94.6   -2.3
FSE  I    -      -     I 100.0%  100.0%  100.0%   0.0  I  96.7  100%   100.0%   0.0
INO  I  97.9     -     I  99.8    99.7    99.9   +0.2  I  78.1   93.1   93.9   +0.8
MCV  I    -      -     I    -       -     88.5     -   I    -      -    27.7     -
MR2  I    -      -     I    -       -      0.7     -   I    -      -    83.3     -
NAV  I  93.9     -     I  97.7    97.0    99.5   +2.5  I  36.6   54.5   94.2  +39.9
NVC  I  98.1     -     I  99.9    99.8    99.8    0.0  I  83.7   88.5   91.3   +2.8
PAV  I  97.5     -     I 100.0~   99.4   100.0%  -0.6  I  90.2   98.5  100.0%  +1.5
PER  I    -      -     I  85.0    68.2      -      -   I   0.0   22.0     -      -
PRO  I  70.6     -     I  69.1    67.1      -      -   I  12.1   40.7     -      -
QHL  I    -      -     I   0.0      -       -      -   I   6.9     -      -      -
RAV  I  93.5     -     I  96.9    99.6    99.5   -0.1  I  47.1   84.9   82.5   -2.4
SCN  I  89.0     -     I 100.0%  100.0%  100.0%   0.0  I  95.8  100%    99.8   -0.2
VSP  I    -      -     I    -      0.0     0.~    0.0  I    -    85.3   84.0   -1.3
-----+-----------------+-------------------------------+----------------------------
Mean I  97.6%    -     I  99.9%   89.7    88.0%  +0.3% I  57.6   79.4   84.8%  +2.5%
Without extreme results:                  (98.9%)      I               (91.9%)(-0.2%)
-----+-----------------+-------------------------------+----------------------------
Remark: for abbreviations of products (code names), see appendix A5CodNam.txt.

Concerning macro viruses, the detection rate is slightly reduced "in
the mean" to a still unacceptably low level (<90%), though there is
some slight improvement (+0.3%) for those products which also
participated in the last test; when one does not count 2 products
with extremely low detection rates (<30%), the mean result is
acceptable if not very good (98.9%). Now, 4 scanners detect ALL MACRO
zoo viruses, and 4 more detect almost all.

Concerning script viruses, presently the fastest growing sector, the
detection rate is improving though still low (84.8% mean), and those
(15) products which also participated in the last VTC test have
improved their detection rates; but the impressive figure (+2.5%) is
essentially influenced by one product (NAV) which upgraded its
detection rate by 39.9% to now reach 94.2%. Now, 4 products detect
ALL script zoo viruses.

Findings W2k.1: General development of macro/script zoo detection rates:
------------------------------------------------------------------------
For W-2000, macro and script zoo virus detection rates need further
work.
Now, 3 W2k products detect ALL zoo macro and script viruses
"perfectly" (100% detection rate):               AVK,FSE,PAV
1 W2k product detects >99% of zoo macro and script viruses
"excellently":                                   SCN

Findings W2k.2: Development of ITW file/macro/script virus detection rates:
---------------------------------------------------------------------------
Now 9 AV products (out of 18) detect ALL In-The-Wild macro and script
viruses in ALL instantiations (files):
             AVK,AVP,DRW,FSE,INO,NAV,NVC,PAV,SCN
And 3 products can be rated "perfect" concerning detection of ITW
macro and script viruses, but they still fail to detect all script
viral files (objects):  FPR,FPW,AVG

Findings W2k.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
Now, 3 W2k products are overall rated "perfect" (in the last test:
no product):                                     AVK,FSE,PAV
2 "excellent" overall scanners:                  AVP,SCN
1 "very good" overall scanner:                   DRW

Findings W2k.4: Performance of W2k scanners by virus classes:
-------------------------------------------------------------
Perfect scanners for macro zoo:     AVK,FSE,PAV,SCN
Excellent scanners for macro zoo:   AVP,CMD,FPR,FPW,INO,NVC,DRW,NAV,RAV
Perfect scanners for script zoo:    AVK,AVP,FSE,PAV
Excellent scanners for script zoo:  SCN

Findings W2k.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Detection of packed viral objects shows significant improvement, as
now 7 products detect ALL packed (ITW) viruses "perfectly" (in the
last test: 4).
Perfect packed macro virus detectors:    AVK,AVP,CMD,FPR,FPW,PAV,SCN
Excellent packed macro virus detectors:  INO,RAV

Findings W2k.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False-Positive Alarms is improving though still regarded
insufficient.
FP-avoiding perfect W-2k scanners:  AVA,AVG,AVK,INO,PAV,SCN

Findings W2k.7: Detection rates for file/macro malware:
-------------------------------------------------------
Macro/script malware detection under W2k is slowly improving, but
still more effort is needed:
2 products are rated "perfect":    PAV,SCN
3 products are rated "excellent":  AVK,AVP,FSE
1 product is rated "very good":    RAV
Concerning macro malware detection only, 2 products are rated
"perfect":   PAV,SCN
And 10 more products are rated "excellent":
             AVK,AVP,FSE,CMD,FPR,FPW,NVC,RAV,INO,DRW
Concerning script malware detection only, 5 products are rated
"perfect":   AVK,AVP,FSE,PAV,SCN

Grading W2k products according to their detection performance:
==============================================================

Test category:            "Perfect"                    "Excellent"
------------------------------------------------------------------
W2k zoo macro test:       AVK,FSE,PAV,SCN              AVP,CMD,FPR,FPW,
                                                       INO,NVC,DRW,NAV,RAV
W2k zoo script test:      AVK,AVP,FSE,PAV              SCN
W2k ITW tests:            AVK,AVP,DRW,FSE,             FPR,FPW,AVG
                          INO,NAV,NVC,PAV,SCN
W2k pack-tests:           AVK,AVP,CMD,FPR,FPW,PAV,SCN  -----
W2k FP avoidance:         AVA,AVG,AVK,INO,PAV,SCN      -----
-----------------------------------------------------------------
W2k Macro Malware Test:   PAV,SCN                      AVK,AVP,FSE,CMD,FPR,
                                                       FPW,NVC,RAV,INO,DRW
W2k Script Malware Test:  AVK,AVP,FSE,PAV,SCN          -----
-----------------------------------------------------------------

*********************************************************************
 In VTC test "2001-10", we found
      ** 2 perfect W2k AV products:  AVK,PAV
 **********************************************
 but we found
      **** No perfect W2k AM product ****
*********************************************************************

************************************************************
"Perfect" W-2000 AntiVirus products:  AVK,PAV     (10 points)
************************************************************
"Excellent" W-2000 AV products:
   3rd place:  SCN                                ( 9 points)
   4th place:  AVP                                ( 7 points)
   5th place:  FSE                                ( 6 points)
   6th place:  INO                                ( 5 points)
   7th place:  FPR,FPW                            ( 4 points)
   9th place:  CMD,DRW,NAV,NVC                    ( 3 points)
  13th place:  AVA,AVG                            ( 2 points)
  15th place:  RAV                                ( 1 point )
************************************************************
"Perfect" W-2000 AntiMalware product:  PAV        (14 points)
************************************************************
"Excellent" W-2000 AntiMalware products:
   2nd place:  AVK,SCN                            (13 points)
   4th place:  AVP                                (10 points)
   5th place:  FSE                                ( 7 points)
   6th place:  INO,FPR,FPW                        ( 5 points)
   9th place:  CMD,DRW,NVC                        ( 4 points)
  12th place:  RAV                                ( 2 points)
************************************************************

9. Comparison of detection results under Windows-32 platforms:
==============================================================

This is a summary of the comparison of AV/AM products under different
W32 platforms (W-NT, W-98, W-2k). For details see 7evalw32.txt.

With the fast deployment of new versions of Microsoft Windows-32 (in
the past 5 years from W-NT to W-95, W-98, W-2000 and soon W-XP), both
customers needing protection and producers of security-enhancing
software (esp. AntiVirus and AntiMalware) can only cope with the pace
if they essentially re-use engines prepared for previous W32
platforms and simply "adapt" them to the intrinsics of the new
platforms. Otherwise, "rewriting" the resp. software would consume
too much time and effort, and customers would receive "adapted"
products only with some delay.
AV/AM testers cannot determine the characteristics of the algorithms
in scanning engines, whether for legal reasons (most copyright laws
prohibit reverse-engineering of proprietary code, except for specific
purposes such as collecting evidence for a court case or teaching
related techniques, as in Hamburg university's IT Security
curriculum), for the sheer complexity of the related code, or, in many
cases, for insufficient professional knowledge of testers. It is
therefore worthwhile to analyse whether those AV/AM products whose
versions are available for all W32 platforms behave EQUALLY
concerning detection and identification of viral and malicious code.

Test Hypothesis: "W32-harmonical" behaviour of W32 products:
============================================================

We assume that those products which participate for all W32 platforms
(WNT, W98 and W2k) shall yield IDENTICAL results for ALL categories.
We call product behaviour following this hypothesis "W32-harmonical".

Finding W32.1: Equality of results for all W32 platforms:
---------------------------------------------------------
Almost ALL W-32 scanners perform equally on W-NT/W-98/W-2k in ALL
categories and can be called "W32-harmonical". ALL products are
W32-harmonical in the detection of In-The-Wild macro viruses, but
detection of ITW script viruses is slightly less developed. When
looking at specific categories only, W32-harmonical behaviour is
better developed for macro than for script virus detection.

--------------------------------------------------------------
For ALL categories, the following *7* W32 scanners (of 17) yield
identical results on ALL platforms:
                              AVG,AVK,AVP,AVX,FPW,NVC,SCN

The following *14* W32 scanners yield identical results for all
macro (zoo,ITW) viruses:      AVG,AVK,AVP,AVX,CMD,DRW,FPR,
                              FPW,FSE,INO,NVC,PAV,RAD/RAV,SCN

The following *16* W32 scanners yield identical results for all
macro malware:                AVA,AVG,AVK,AVP,AVX,CMD,DRW,FPR,
                              FPW,FSE,INO,NAV,NVC,PAV,RAD/RAV,SCN

The following *9* products yield identical results for all
script (zoo,ITW) viruses:     AVG,AVK,AVX,DRW,FPW,NAV,NVC,SCN,VSP

The following *16* W32 scanners yield identical results for all
script malware:               ANT,AVG,AVK,AVX,CMD,DRW,FPR,FPW,
                              FSE,INO,NAV,NVC,PAV,RAD/RAV,SCN,VSP
--------------------------------------------------------------

The following grid is used to grade W32 products concerning their
ability to detect IDENTICALLY for ALL categories on ALL W32 platforms:

A "perfect" W32-harmonical AV product will yield IDENTICAL results
for all categories (macro and script viruses).   (Assigned value: 5)

A "perfect" W32-harmonical AM product will be a perfect AV product
and yield IDENTICAL results for all categories (macro and script
malware).                                        (Assigned value: 2)
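To make this grading grid concrete, the following short sketch shows
how per-platform detection results could be checked for identity and
how the assigned values (5 for AV, plus 2 for AM) would be added up.
This is an illustrative sketch only, not part of the VTC tooling: the
product code "XYZ", the category names, the data layout and the
helpers is_harmonical/w32_points are assumptions made for the example.

# Illustrative sketch: checks whether a product yields IDENTICAL
# detection results on all W32 platforms ("W32-harmonical" behaviour)
# and assigns the values used above (5 points AV, +2 points AM).
# Product data and figures are invented for demonstration purposes.

PLATFORMS = ("WNT", "W98", "W2k")
AV_CATEGORIES = ("macro_zoo", "macro_itw", "script_zoo", "script_itw")
AM_CATEGORIES = ("macro_malware", "script_malware")

# results[product][platform][category] = detection rate in percent
results = {
    "XYZ": {  # hypothetical product code
        "WNT": {"macro_zoo": 99.9, "macro_itw": 100.0, "script_zoo": 97.2,
                "script_itw": 100.0, "macro_malware": 98.5, "script_malware": 96.0},
        "W98": {"macro_zoo": 99.9, "macro_itw": 100.0, "script_zoo": 97.2,
                "script_itw": 100.0, "macro_malware": 98.5, "script_malware": 96.0},
        "W2k": {"macro_zoo": 99.9, "macro_itw": 100.0, "script_zoo": 97.2,
                "script_itw": 100.0, "macro_malware": 98.5, "script_malware": 96.0},
    },
}

def is_harmonical(per_platform, categories):
    """True if every category shows the same rate on all platforms."""
    return all(
        len({per_platform[p][c] for p in PLATFORMS}) == 1
        for c in categories
    )

def w32_points(per_platform):
    points = 0
    if is_harmonical(per_platform, AV_CATEGORIES):
        points += 5                      # "perfect" W32-harmonical AV
        if is_harmonical(per_platform, AM_CATEGORIES):
            points += 2                  # additionally a perfect AM product
    return points

for product, per_platform in results.items():
    print(product, w32_points(per_platform))   # -> XYZ 7

Under this scheme a product scoring 7 points would appear in both the
AntiVirus and the AntiMalware grading below.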
Grading W32-harmonical AntiVirus products:
===========================================================
 Grade: "Perfect" W32-harmonical detection:
                       AVG,AVK,AVX,DRW,FPW,NVC,SCN
===========================================================

Grading W32-harmonical AntiMalware products:
===========================================================
 Grade: "Perfect" W32-harmonical detection:
                       AVG,AVK,AVX,CMD,DRW,FPR,FPW,
                       FSE,INO,NAV,NVC,PAV,RAD/RAV,SCN
===========================================================

************************************************************
 "Perfect" W32-harmonical AntiVirus products:
     1st place:  AVG,AVK,AVX,DRW,FPW,NVC,SCN     (5 points)
************************************************************
 "Perfect" W32-harmonical AntiMalware products:
     1st place:  AVG,AVK,AVX,DRW,FPW,NVC,SCN     (7 points)
************************************************************

10. Results of on-demand detection under Linux(SuSe):
=====================================================

This is a summary of the essential findings for AV/AM products under
Linux. For details see 7evallin.txt.

Meant as a perspective on product results, the following table (LIN-A)
lists all results of Linux scanners for zoo detection of (file), macro
and script viruses in the last 2 VTC tests. Moreover, differences
("delta") in the resp. detection rates are given for those products
which participated in the last 2 tests, and mean values are calculated.
(A small computational sketch showing how the delta and mean values
can be reproduced is given after Finding LIN.1 below.)

Table Lin-A: Performance of LINUX scanners in Test 2001-04 and 2001-10:
=======================================================================
Scan I = File Virus =  + ==== Macro Virus ===== + === Script Virus ====
ner  I   Detection     I      Detection         I      Detection
-----+-----------------+------------------------+----------------------
Test I 0104     Delta  I 0104    0110    Delta  I 0104   0110    Delta
-----+-----------------+------------------------+----------------------
ANT  I   -        -    I   -     97.1      -    I   -    81.8      -
AVK  I   -        -    I   -    100%       -    I   -   100%       -
AVP  I 99.9       -    I 100~   100%     +0.0   I 99.8  100%     +0.2
CMD  I 97.8       -    I 100%   100~      0.0   I 96.9   94.2    -2.7
DRW  I   -        -    I   -     99.5      -    I   -    95.4      -
FSE  I 97.1       -    I 100~   100~     +0.0   I 96.9   92.3    -4.6
MCV  I   -        -    I   -      9.1      -    I   -    27.6      -
RAV  I 93.5       -    I 99.6    99.5    -0.1   I 84.9   82.5    -2.4
SCN  I 99.7       -    I 100%   100%      0.0   I 99.8   99.8     0.0
-----+-----------------+------------------------+----------------------
Mean : 97.6%       -   I 99.9%   89.5%   -0.0%  I 95.7%  86.0%   -1.9%
Without MCV:           I        (99.5%)         I       (93.3%)
-----+-----------------+------------------------+----------------------

Remark: for abbreviations of products (code names), see appendix
        A5CodNam.txt.

While the majority of macro virus detectors work at a rather high
level (the reduced mean detection rate is essentially influenced by
one product which is new in this test), detection rates of script
detectors are significantly less developed and need further
improvement.

Findings LIN.1: General development of macro/script zoo detection rates:
------------------------------------------------------------------------
Different from the first test (where NO product detected all macro and
script viruses), now *2* products:                 AVK,AVP
detect ALL macro and script viruses in the zoo test and are rated
"perfect".
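The delta and mean values in Table Lin-A follow directly from the
per-product detection rates. The sketch below is illustrative only:
the rate lists are transcribed from the table above, values printed
as "100~" are treated as 100, and it recomputes the macro virus
columns, including the mean without the new product MCV.

# Recomputes the macro virus columns of Table Lin-A (illustration).
# Rates are percentages as printed in the table; "100~" taken as 100.

macro_0104 = {"AVP": 100.0, "CMD": 100.0, "FSE": 100.0,
              "RAV": 99.6, "SCN": 100.0}
macro_0110 = {"ANT": 97.1, "AVK": 100.0, "AVP": 100.0, "CMD": 100.0,
              "DRW": 99.5, "FSE": 100.0, "MCV": 9.1, "RAV": 99.5,
              "SCN": 100.0}

# Delta is defined only for products tested in both 2001-04 and 2001-10.
deltas = {p: round(macro_0110[p] - macro_0104[p], 1) for p in macro_0104}
print(deltas)        # {'AVP': 0.0, 'CMD': 0.0, 'FSE': 0.0, 'RAV': -0.1, 'SCN': 0.0}

mean_0110 = sum(macro_0110.values()) / len(macro_0110)
print(round(mean_0110, 1))                              # 89.5 (all 9 products)

without_mcv = [v for p, v in macro_0110.items() if p != "MCV"]
print(round(sum(without_mcv) / len(without_mcv), 1))    # 99.5 (without MCV)

mean_delta = sum(deltas.values()) / len(deltas)
print(round(mean_delta, 1))                             # -0.0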
Findings LIN.2: Development of ITW macro/script virus detection rates:
----------------------------------------------------------------------
Now, 4 AV products detect "perfectly" all ITW macro/script viruses
     in all files:                                 AVK,AVP,DRW,SCN
     1 scanner is rated "excellent" concerning
     ITW virus detection:                          CMD

Findings LIN.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
     2 "Perfect" overall scanners:                 AVK,AVP
     1 "Excellent" overall scanner:                SCN

Findings LIN.4: Performance of LINUX scanners by virus classes:
---------------------------------------------------------------
     3 Perfect   scanners for macro zoo:           AVK,AVP,SCN
     3 Excellent scanners for macro zoo:           CMD,FSE,DRW
     2 Perfect   scanners for script zoo:          AVK,AVP
     1 Excellent scanner  for script zoo:          SCN

Findings LIN.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Detection of packed viral objects needs improvement.
     4 Perfect   packed ITW macro virus LINUX detectors: AVK,AVP,CMD,SCN
     1 Excellent packed ITW macro virus LINUX detector:  RAV

Findings LIN.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False-Positive alarms is insufficient and needs
improvement.
     4 FP-avoiding perfect LINUX scanners:         ANT,RAV,SCN,MCV

Findings LIN.7: Detection rates for file/macro malware:
-------------------------------------------------------
For the first time, 2 LINUX products are "perfect" in detecting ALL
macro & script malware specimens:                  AVP,SCN
Moreover, one product is rated "excellent":        AVK
And one product is rated "very good":              RAV
Detection of macro malware is improving, with
     3 products rated "perfect":                   AVP,CMD,SCN
     4 products rated "excellent":                 AVK,FSE,RAV,DRW
But detection of script malware is insufficient and needs improvement:
     3 products rated "perfect":                   AVP,AVK,SCN
     1 product  rated "excellent":                 RAV

Grading LIN products according to their detection performance:
==============================================================

 Test category:              "Perfect"             "Excellent"
 --------------------------------------------------------------
 LINUX zoo macro test:       AVK,AVP,SCN           CMD,FSE,DRW
 LINUX zoo script test:      AVK,AVP               SCN
 LINUX ITW tests:            AVK,AVP,DRW,SCN       CMD
 LINUX pack-test:            AVK,CMD,SCN           ---
 LINUX FP avoidance:         ANT,RAV,SCN,MCV       ---
 ---------------------------------------------------------------
 LINUX Macro Malware Test:   AVP,CMD,SCN           AVK,FSE,RAV,DRW
 LINUX Script Malware Test:  AVP,AVK,SCN           RAV
 ---------------------------------------------------------------

*******************************************************************
 In VTC test "2001-10", we found
           ***  No perfect LINUX AV product  ***
 and we found
           ***  No perfect LINUX AM product  ***
*******************************************************************

************************************************************
 "Perfect" LINUX AntiVirus product:    =NONE=    (10 points)

 "Excellent" LINUX AV products:
     1st place:  SCN                   ( 9 points)
     2nd place:  AVK                   ( 8 points)
     3rd place:  AVP                   ( 7 points)
     4th place:  CMD                   ( 4 points)
     5th place:  DRW                   ( 3 points)
     6th place:  MCV,ANT,RAV           ( 2 points)
     9th place:  FSE                   ( 1 point )
************************************************************
 "Perfect" LINUX AntiMalware product:  =NONE=    (14 points)

 "Excellent" LINUX AM products:
     1st place:  SCN                   (13 points)
     2nd place:  AVP                   (11 points)
     3rd place:  AVK                   (10 points)
     4th place:  CMD                   ( 6 points)
     5th place:  DRW                   ( 4 points)
     6th place:  RAV                   ( 3 points)
     7th place:  FSE                   ( 2 points)
************************************************************
11. Final remark: In search of the "Perfect AV/AM product":
===========================================================

This test includes 3 platforms for which engines are hardly
comparable, namely DOS (16-bit engines), WNT/W98/W2k (32-bit engines,
comparable among each other) and LINUX. Moreover, several
manufacturers submitted products only for selected platforms.

   *********************************************************
    In general, there is NO AntiVirus or AntiMalware product
    which can be rated "PERFECT" in all categories for all
    different testbeds.
   *********************************************************
   ***********************************************************
    But when differentiating for categories, there are several
    products which can be rated either "perfect" or
    "excellent".
   ***********************************************************

Instead of calculating an overall value (e.g. the sum of points
yielded divided by the number of products in test for a given
platform), the following TABLES list the products by their places,
sorted by assigned points:

Table SUM-AV: Survey of Results for AntiVirus Products:
-------------------------------------------------------
================== AntiVirus Products ======================
 DOS (14)  WNT (22)   W98 (21)  W2k (18)  W32 (19)  LINUX(9)
------------------------------------------------------------
 AVK (10)  AVK (10)   AVK (10)  AVK (10)  AVG ( 5)  SCN ( 9)
 PAV (10)  SCN ( 9)   SCN ( 9)  PAV (10)  AVK ( 5)  AVK ( 8)
 SCN ( 9)  PAV ( 8)   AVP ( 7)  SCN ( 9)  AVP ( 5)  AVP ( 7)
 AVP ( 8)  AVP ( 7)   PAV ( 7)  AVP ( 7)  AVX ( 5)  CMD ( 4)
 CMD ( 3)  AVX ( 5)   FSE ( 5)  FSE ( 6)  DRW ( 5)  DRW ( 3)
 DRW ( 3)  FSE ( 5)   INO ( 5)  INO ( 5)  FPW ( 5)  MCV ( 2)
 FPR ( 3)  INO ( 5)   AVX ( 3)  FPR ( 4)  NVC ( 5)  ANT ( 2)
 NAV ( 3)  CMD ( 3)   CMD ( 3)  FPW ( 4)  SCN ( 5)  RAV ( 2)
 ANT ( 2)  DRW ( 3)   DRW ( 3)  CMD ( 3)            FSE ( 1)
 AVA ( 2)  FPR ( 3)   FPR ( 3)  DRW ( 3)
 AVG ( 2)  FPW ( 3)   FPW ( 3)  NAV ( 3)
 RAV ( 1)  NAV ( 3)   NAV ( 3)  NVC ( 3)
           ANT ( 2)   AVA ( 2)  AVA ( 2)
           AVA ( 2)   AVG ( 2)  AVG ( 2)
           AVG ( 2)   DSE ( 2)  RAV ( 1)
           RAD/RAV(1) RAV ( 2)
------------------------------------------------------------
Remark: Numbers in the platform headers indicate the number of
        products in test; numbers behind products indicate the
        points assigned for that platform.

Table SUM-AM: Survey of Results for AntiMalware Products:
---------------------------------------------------------
================ AntiMalware Products =====================
 DOS (14)  WNT (22)   W98 (21)  W2k (18)  W32 (19)  LINUX(9)
------------------------------------------------------------
 AVK (13)  AVK (13)   AVK (13)  PAV (14)  AVG ( 7)  SCN (13)
 PAV (13)  SCN (13)   SCN (13)  AVK (13)  AVK ( 7)  AVP (11)
 SCN (13)  PAV (12)   PAV (11)  SCN (13)  AVP ( 7)  AVK (10)
 AVP (12)  AVP (11)   AVP (10)  AVP (10)  AVX ( 7)  CMD ( 6)
 DRW ( 5)  FSE ( 8)   FSE ( 8)  FSE ( 7)  DRW ( 7)  DRW ( 4)
 FPR ( 4)  AVX ( 6)   INO ( 6)  INO ( 5)  FPW ( 7)  RAV ( 3)
 CMD ( 4)  INO ( 6)   AVX ( 4)  FPR ( 5)  NVC ( 7)  FSE ( 2)
 RAV ( 2)  CMD ( 4)   CMD ( 4)  FPW ( 5)  SCN ( 7)
           FPR ( 4)   FPR ( 4)  CMD ( 4)
           FPW ( 4)   FPW ( 4)  DRW ( 4)
           RAV/RAD(2) RAV ( 2)  NVC ( 4)
                                RAV ( 2)
------------------------------------------------------------
Remark: Numbers in the platform headers indicate the number of
        products in test; numbers behind products indicate the
        points assigned for that platform.

Generally, we hope that these rather detailed results help AV
producers to adapt their products to growing threats and thus to
protect their customers.
12. Availability of full test results:
======================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's homepage (VTC is part of Working
Group AGN):

        ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comments and critical remarks which help VTC to improve its test
methods are warmly welcomed.

Further to this test, we follow suggestions of AV producers to test
the heuristic detection ability of scanners for those viruses which
were first detected only after product submission. In this
"pro-active test", we will test the heuristic detection quality of
the products submitted for W-NT, for macro and script viruses
detected between April 2001 and June 2001 as well as between July
2001 and October 2001.

The next comparative test will evaluate file, boot, macro (VBA/VBA5)
and script virus and malware detection. This test is planned for
January - March 2002, with viral databases frozen on October 31,
2001. Any AV producer wishing to participate in the forthcoming test
is invited to submit related products.

On behalf of the VTC Test Crew:
Dr. Klaus Brunnstein (November 30, 2001)

13. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 2001 by Klaus Brunnstein and the
Virus Test Center (VTC) at the University of Hamburg, Germany.

Permission (Copy-Left) is granted to everybody to distribute copies
of this information in electronic form, provided that this is done
for free, that the contents of the information are not changed in
any way, and that the origin of this information is explicitly
mentioned. It is esp. permitted to store and distribute this set of
text files at university or other public mirror sites where
security/safety related information is stored for unrestricted
public access for free.

Any other use, esp. including distribution of these text files on
CD-ROMs or any publication as a whole or in parts, is ONLY permitted
after contact with the supervisor, Prof. Dr. Klaus Brunnstein, or
authorized members of the Virus Test Center at Hamburg University,
and this agreement must be in explicit writing, prior to any
publication.

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any
methods, products, instructions or ideas contained in the material
herein.

Prof. Dr. Klaus Brunnstein
University of Hamburg, Germany (November 30, 2001)