============================================================
 File 0XECSUM.TXT: "EXECUTIVE SUMMARY"
 VTC University of Hamburg AntiMalware Product Test "1999-03"
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

Content of this text:
=====================
 1. Background of this test; Malware Threats
 2. VTC Testbeds used in VTC test "1999-03"
 3. Summary #1: Development of DOS AV product detection rates
       Result #1:  DOS zoo detection rates need improvement!
 4. Summary #2: Performance of DOS AV products on zoo testbeds,
       including polymorphic and VKIT virus detection
       Result #2:  Quality of best 3 AV DOS scanners not yet perfect;
                   "Excellent" DOS scanners: DSS, SCN and AVP
 5. Summary #3: Performance of DOS scanners on ITW testbeds
       Result #3:  High ITW detection rates and implied risks;
                   8 "Perfect" DOS ITW scanners (all 100%):
                   AVG, AVP, DSS, FPR, FSE, INO, NOD and NVC
 6. Summary #4: Performance of DOS scanners by virus classes
       Result #4:  Performance of DOS scanners by virus classes:
                   "Perfect" scanners for macro zoo: DSS, SCN
                   "Perfect" scanners for polymorphic virus set:
                   AVG, AVK, AVP, DRW, FPR, FSE, IRS, NAV, NOD and PAV
                   No perfect scanner for boot, file and VKit zoo
 7. Summary #5: Detection of viruses in packed objects (DOS)
       Result #5:  Virus detection in packed objects insufficient;
                   only one product is "perfect": AVP
 8. Summary #6: False-Positive avoidance (DOS/Win-NT)
       Result #6:  Avoidance of False-Positive alarms insufficient,
                   but 4 "perfect" DOS scanners: AVK, FSE, PAV, SCN
                   and 2 "perfect" W-NT scanners: AVK, SCN
 9. Summary #7: Detection of File/Macro Malware (DOS/Win-NT)
       Result #7:  AntiMalware detection under DOS/W-NT improving;
                   no "perfect" but 6 "excellent" AM products:
                   for DOS and W-NT: DSS, SCN, AVK, PAV
                   for W-NT additionally: AVP, FSE
10. Summary #8: File/Macro virus detection under Win-NT
       Result #8:  Virus detection rates under W-NT at a high level;
                   one "perfect" Windows-NT zoo scanner: FSE,
                   plus 3 "excellent" and 9 "Very Good" scanners
11. Summary #9: File/Macro virus detection under 32-bit engines
       Result #9:  Best uniform W-32 performance:
                   FSE, AVP, DWW, NOD, IRS
12. Summary #10: Malware detection under Windows 98/NT
       Result #10: AntiMalware quality of AV products is developing;
                   no "perfect" AM product for W-98 and W-NT,
                   but 6 "excellent" AM products:
                   FSE, AVK, DSS, SCN, PAV, AVP
13. Conclusion: Searching for the "Perfect AV/AM product"
       Result #11: Best AV products in test: FSE and SCN
                   Best AM product in test:  SCN
14. Availability of full test results
15. Copyright, License, and Disclaimer

Tables:
=======
Table ES0: Development of viral/malware threats
Table ES1: Content of VTC test databases in test "1999-03"
Table ES2: List of AV products in test "1999-03"
Table ES3: Development of DOS scanners from 1997-02 to 1999-03
Table ES4: Development of W-NT scanners from 1997-07 to 1999-03
Table ES5: Malware detection under Windows-98/Windows-NT


1. Background of this test: Malware Threats:
============================================

Malicious software (malware), including viruses (= self-replicating
malware), trojan horses (= pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users, especially when they are
connected to Intranets and the Internet. The development of malicious
software can be studied well through the growth of the VTC testbeds.
The following table summarizes, for previous and current VTC tests
(indicated by their year and month of publication), the size of the
virus and malware (full = "zoo") databases, giving for each the number
of different viruses and the number of instantiations of a virus or
malware, and keeping in mind that some revisions of the testbeds were
made:

Table ES0: Development of threats as present in VTC test databases:
===================================================================
          = File Viruses =  = Boot Viruses =  =Macro Viruses=  = Malware =
 Test#    Number  Infected  Number  Infected  Number Infected  file  macro
          viruses objects   viruses objects   viruses objects
 --------------------------------------------------------------------------
 1997-07:  12,826   83,910     938     3,387     617    2,036    213     72
 1998-03:  14,596  106,470   1,071     4,464   1,548    4,436    323    459
 1998-10:  13,993  112,038     881     4,804   2,159    9,033  3,300    191
 1999-03:  17,148  128,534   1,197     4,746   2,875    7,765  3,853    200
 --------------------------------------------------------------------------
 Remark: Before test 1998-10, an ad-hoc cleaning operation was applied
         to remove samples whose virality could not be proved easily.

With an annual deployment of more than 5,000 viruses and several
hundred trojan horses, many of which are available from the Internet,
and in the absence of inherent protection against such dysfunctional
software, users must rely on AntiMalware and especially AntiVirus
software to detect and, where possible, eradicate such malicious
software. Hence, the detection quality of AntiMalware and especially
AntiVirus products becomes an essential prerequisite for protecting
customer productivity and data.

The Virus Test Center (VTC) at Hamburg University's Faculty for
Informatics performs regular tests of AntiMalware and especially
AntiVirus software. VTC recently tested current versions of on-demand
scanners for their ability to identify PC viruses. Tests were
performed on VTC's malware databases, which were frozen in their
status as of *** November 30, 1998 *** to give AV/AM producers a fair
chance to supply updates within the 8-week submission period.

The main test goal was to determine detection rates, reliability
(= consistency) of malware identification and reliability of detection
rates for submitted or publicly available scanners. Special tests were
devoted to the detection of multiple generations of 4 polymorphic file
viruses (Maltese.Amoeba, Mte.Encroacher.B, Natas and Tremor) and of
viruses generated with the "VKIT" file virus generator. It was also
tested whether (and to what degree) viruses packed with 4 popular
compression methods (PKZIP, ARJ, LHA and RAR) would be detected by
scanners. Moreover, avoidance of False-Positive alarms on "clean"
(= non-viral and non-malicious) objects was also determined. Finally,
a set of selected non-viral file and macro malware (droppers, trojan
horses, intended viruses etc.) was used to determine whether and to
what degree AntiVirus products may be used to protect customers
against trojan horses and other forms of malware.

VTC maintains, in close and secure cooperation with AV experts
worldwide, collections of boot, file and macro viruses as well as
related malware ("zoo") which have been reported to VTC or to AV labs.
Moreover, following the list of "In-The-Wild viruses" (published on a
regular basis by Wildlist.org), a collection of viruses reported to be
broadly visible is maintained to allow comparison with other tests;
presently, this list does not report ITW malware.
2. VTC Testbeds used in VTC test "1999-03":
===========================================

The actual sizes of the VTC testbeds (developed from the previous
testbeds through inclusion of new viruses and malware and some
revision) are given in the following table:

Table ES1: Content of VTC test databases:
=================================================================
 "Full Zoo": 17,148 File Viruses in 128,534 infected files,
             40,000 Instantiations of 4 polymorphic file viruses
             10,706 Variations of file viruses generated with VKIT
              2,485 different File Malware in 3,853 file objects
              3,300 Clean files used for False-Positive detection
              1,197 System Viruses in 4,746 infected images,
              2,875 Macro Viruses in 7,765 infected documents,
                143 different Macro Malware in 200 macro objects
                362 Clean macro objects used for False-Positive test
 -----------------------------------------------------------------
 "ITW Zoo":      87 File Viruses in 2,867 infected files,
                 76 System Viruses in 655 infected images, and
                 83 Macro Viruses in 675 infected documents
==================================================================
(For detailed indices of VTC testbeds, see file "a3testbed.zip")

For test "1999-03", the following *** 34 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) under DOS,
Windows-98 or Windows-NT were tested:

Table ES2: List of AV products in test "1999-03"
================================================
 Abbreviation/Product/Version                Tested under platform
 -----------------------------------------------------------------
 ACU = Accura                                W-98, W-NT
 AVA = AVAST! 2.0                            DOS, W-98, W-NT
 AVG = AVG Grisoft 5.0                       DOS, W-98, W-NT
 AVK = AntiViren Kit 8.06 G-Data             DOS, W-98, W-NT
 AVP = AVP 3.0 Build 128                     DOS, W-98, W-NT
 AVX = AntiVirus eXpert 3.45                 DOS, W-98, W-NT
 DRW = DrWeb 4.03a                           W-98, W-NT
 DWW = DrWeb for Win32 4.03a beta            DOS, W-98, W-NT
 DSS = Dr Solomon AV Toolkit v.7.91          DOS, W-98, W-NT
 FPR = F-Prot 3.04a                          DOS, W-98, W-NT
 FMA = F-MacroW 3.04a                        DOS, W-98, W-NT
 FSE = F-Secure 4.03a Beta                   DOS, W-98, W-NT
 FWN = FWin 32 v.1.61                        W-NT
 HMV = HMVS 3.11                             DOS, W-98, W-NT
 INO = InoculanIT 4.5 Virus sig 4.17         DOS, W-98, W-NT
 IRS = IRIS 22.16                            DOS, W-98, W-NT
 IVB = InVircible 7.02 build 238             DOS, W-98, W-NT
 ITM = ITM 4.21a Integrity Master            DOS, W-98, W-NT
 NAV = Norton AntiVirus 5.01.01              DOS, W-98, W-NT
 NOD = NOD32 (eset Software) 1.13            DOS, W-98, W-NT
 NVC = Norman Virus Control 4.63             DOS, W-98, W-NT
 PAN = Panda Antivirus NT WS 4.0             DOS, W-98, W-NT ***
 PA6 = Panda Platinum Antivirus 6.0          DOS, W-98, W-NT ***
 PAV = Power AntiVirus 7 3.0 build 124       DOS, W-98, W-NT
 PCC = PcCillin (TrendMicro)                 W-98
 PRO = Protector Plus (Proland Sw India)     DOS, W-98, W-NT
 RAV = Romanian AntiVirus (RAV6)             W-NT
 RA7 = RAV version 7.0 Beta                  W-NT
 RAD = RAV stand-alone DOS under W-NT        DOS
 TSC = TScan 1.80 (Andreas Marx)             DOS, W-98
 SCN = NAI VirusScan (CD-ROM) 4.0.2          DOS, W-98, W-NT
 VET = VET 9.91 (CYBEC/CAI)                  DOS, W-98, W-NT
 VSP = VirScanPlus 11.71.02 (ROSE SWE)       DOS, W-98, W-NT
 VBW = Virus Buster 5.50 (Leprechaun)        W-98, W-NT
 -----------------------------------------------------------------
For details of AV products, including the options used to determine
optimum detection rates, see A3SCNLS.TXT. For scanners where results
are missing, see 8problms.txt.

*** Remark (added May 5, 1999): PANDA versions were deleted from the
test report upon request of PANDA CEO Mr. Mikel Urizarbarrenna, as
PANDA shareware products may not be tested by universities except with
special permission of PANDA; VTC, as an independent test institute,
canNOT accept any conditions for test permission.
VTC included current versions of PANDA AV products after having been
approached by customers of PANDA who complained that related products
(which had been submitted for VTC test 1998-02 but not for VTC test
1998-10) were not referenced in the last VTC test. Indeed, it was
overlooked during installation of the PANDA programs that a statement
on the 2nd screen forbids university tests. VTC performed this test in
good faith, and we offered to transmit missed In-The-Wild samples to
help improve the products. PANDA requested the transfer of the WHOLE
testbed (which they assume to be some "CARO database" which, to the
best knowledge of the author, does NOT exist or is, if it exists, not
accessible to VTC), which is not compliant with VTC's legal and
ethical conditions. The author apologizes to PANDA for having
overlooked the test condition. End of remark ***

With recent developments in the AntiVirus business, some AV products
whose development was measured in previous VTC tests were taken over
by other AV companies (hopefully to be included in future products of
the "new" owners). We especially wish to acknowledge both the good
cooperation in testing with the related AV labs and experts and the
product quality demonstrated in past tests for: DrSolomon AntiVirus
(DSAV), IBM AntiVirus (IBM-AV), Thunderbyte AntiVirus (TBAV) and VET
AntiVirus. While we respect the request of the "new" owners of IBM-AV
and TBAV *NOT* to include test results for those (last) versions
available to VTC within the test period, we especially wish to thank
NAI and CAI for permitting some (possibly "last") test of DSAV (DSS)
and VET-AV (VET), respectively.

In general, AV products were either submitted or, when test versions
were available on the Internet, downloaded from the respective
ftp/http sites. A few scanners were not available, either in general
(TNT) or for this test; some of these were announced to participate
in some future test. One AV manufacturer (Sophos UK) requested that
their product not be tested against VTC's malware database; as VTC
regards non-viral malware as an essential threat to customers, whose
detection shall be measured in VTC tests, and as VTC is not willing to
accept any requests for special treatment of products in test, this AV
product was not admitted to this test. Finally, very few AV producers
answered VTC's bids for submitting scanners with electronic silence.

The following paragraphs survey essential findings in comparison with
previous VTC tests (performance over time), as well as some relative
"grading" of scanners for detection of file and macro viruses, both in
the full "zoo" and "In-The-Wild" testbeds, for detection of file and
macro malware, and for detection of ITW file and macro viruses in
objects packed with ARJ, LHA, ZIP and RAR. Finally, the ability of AV
products to avoid False-Positive alarms is also analysed. Detailed
results, including precision and reliability of virus identification
and results for boot/MBR infectors, are described in the overview
table "6a-sumov.txt" and the related tables for DOS (boot+file+macro),
Win-98 and Win-NT (file+macro) detection.

An additional test was performed to measure the ability of AV products
to detect all polymorphic instantiations of 4 selected polymorphic
file viruses, and one more test was devoted to detecting any of the
10,706 file virus variations generated with the virus development kit
"VKIT", which surfaced in fall 1998. A rather detailed evaluation of
test results, including progress and shortcomings, is given in
7EVAL.TXT.
3. Summary #1: Development of DOS scanner detection rates:
==========================================================

Concerning the performance of DOS scanners, a comparison of virus
detection results in tests from "1997-02" until the present test
"1999-03" shows how scanners behave and how manufacturers succeed in
adapting their products to the growing threat of new viruses. The
following table lists the development of detection rates of scanners
(most recent version in each test), and it calculates changes ("+"
indicating improvement) in detection rates.

For reasons of fairness, it must be noted that improvement is much
more difficult to achieve for those products which have already
reached a very high level of detection and quality (say, more than 90
or 95%) than for those products with lower detection rates. Moreover,
changes in the order of about +-0.5% are not significant, as this is
about the growth rate per month, so detection depends strongly on
whether some virus is reported (and analysed and included) just before
a new update is delivered.

Table ES3 lists developments for the detection of file and macro
viruses; for details as well as for boot virus detection, see the
result tables (6b-6g) as well as the overview (6asumov.txt) and the
evaluation (7eval.txt).

Table ES3: Improvement of DOS scanners from 1997-02 to 1999-03:
===============================================================
      ------ File Virus Detection ------  ----- Macro Virus Detection -----
SCAN  97/02 97/07 98/02 98/10 99/03 DELTA 97/02 97/07 98/02 98/10 99/03 DELTA
NER     %     %     %     %     %     %     %     %     %     %     %     %
------------------------------------------------------------------------------
ALE    98.8  94.1  89.4     -     -     -  96.5  66.0  49.8     -     -     -
AVA    98.9  97.4  97.4  97.9  97.6  -0.3  99.3  98.2  80.4  97.2  95.9  -1.3
AVG    79.2  85.3  84.9  87.6  87.1  -0.5  25.2  71.0  27.1  81.6  82.5  +0.9
AVK       -     -     -  90.0  75.0 -15.0     -     -     -  99.7  99.6  -0.1
AVP    98.5  98.4  99.3  99.7  99.7   0.0  99.3  99.0  99.9 100.0  99.8  -0.2
ANT    73.4  80.6  84.6  75.7     -     -  58.0  68.6  80.4  56.6     -     -
DRW    93.2  93.8  92.8  93.1  98.2  +5.1  90.2  98.1  94.3  99.3  98.3  -1.0
DSS    99.7  99.6  99.9  99.9  99.8  -0.1  97.9  98.9 100.0 100.0 100.0   0.0
FMA       -     -     -     -     -     -  98.6  98.2  99.9     -     -     -
FPR    90.7  89.0  96.0  95.5  98.7  +3.2  43.4  36.1  99.9  99.8  99.8   0.0
FSE       -     -  99.4  99.7  97.6  -2.1     -     -  99.9  90.1  99.6  +9.5
FWN       -     -     -     -     -     -  97.2  96.4  91.0  85.7     -     -
HMV       -     -     -     -     -     -     -     -  98.2  99.0  99.5  +0.5
IBM    93.6  95.2  96.5     -     *     *  65.0  88.8  99.6     -     *     *
INO       -     -  92.0  93.5  98.1  +4.6     -     -  90.3  95.2  99.8  +4.6
IRS       -  81.4  74.2     -  51.6     -     -  69.5  48.2     -  89.1     -
ITM       -  81.0  81.2  65.8  64.2  -1.6  81.8  58.2  68.6  76.3  70.9  -5.4
IVB     8.3     -     -     -  96.9     -     -     -     -     -     -     -
NAV    66.9  67.1  97.1  98.1  77.2 -20.9  80.7  86.4  98.7  99.8  99.7  -0.1
NOD       -     -     -  96.9     -     -     -     -     -     -  99.8     -
NVC    87.4  89.7  94.1  93.8  97.6  +3.8  13.3  96.6  99.2  90.8     -     -
PAN       -     -  67.8     -     *     -     -     -  73.0     -     *     -
PAV       -  96.6  98.8     -  73.7     -     -  93.7 100.0     -  99.5     -
PCC       -     -     -     -     -     -     -  67.6     -     -     -     -
PCV    67.9     -     -     -     -     -     -     -     -     -     -     -
PRO       -     -     -     -  35.5     -     -     -     -     -  81.5     -
RAV       -     -     -  71.0     -     -     -     -     -  99.5  99.2  -0.3
SCN    83.9  93.5  90.7  87.8  99.8 +12.0  95.1  97.6  99.0  98.6 100.0  +1.4
SWP    95.9  94.5  96.8  98.4     -     -  87.4  89.1  98.4  98.6     -     -
TBA    95.5  93.7  92.1  93.2     *     *  72.0  96.1  99.5  98.7     *     *
TSC       -     -  50.4  56.1  39.5 -16.6     -     -  81.9  17.0  76.5 +59.5
TNT    58.0     -     -     -     *     *     -     -     -     -     *     *
VDS       -  44.0  37.1     -     -     -  16.1   9.9   8.7     -     -     -
VET       -  64.9     -     -  65.3     -     -  94.0  97.3  97.5  97.6  +0.1
VRX       -     -     -     -     -     -     -     -     -     -     -     -
VBS    43.1  56.6     -  35.5     -     -     -     -     -     -     -     -
VHU    19.3     -     -     -     -     -     -     -     -     -     -     -
VSA       -     -  56.9     -     -     -     -     -  80.6     -     -     -
VSP       -     -     -  76.1  71.7  -4.4     -     -     -     -     -     -
VSW       -     -  56.9     -     -     -     -     -  83.0     -     -     -
VTR    45.5     -     -     -     -     -   6.3     -     -     -     -     -
XSC    59.5     -     -     -     -     -     -     -     -     -     -     -
------------------------------------------------------------------------------
Mean   74.2% 84.8% 84.4% 85.4% 81.2% -2.2% 69.6% 80.9% 83.8% 89.6% 93.6% +4.3%
------------------------------------------------------------------------------
Remark: for abbreviations and details of products present only in
        previous tests, see the related parts of the VTC test report.

****************************************************************
 Results #1: DOS zoo detection rates need improvement!
****************************************************************
 Result #1.1) The ability of scanners to detect file viruses
       under DOS has decreased, both for those scanners which
       participated in the last VTC test (from 85.4% to 83.2%)
       and in general (including new products) to now only
       81.2%. On the other side, 11 (out of 19) products
       detected In-The-Wild file viruses completely (100.0%).
 #1.2) On the better side, the ability of scanners to detect
       macro viruses improved significantly (by another 4.3%)
       to now 93.6% (mean detection rate). Again, 2 scanners
       detect ALL zoo macro viruses, and 17 (out of 21) scanners
       detect ALL ITW macro viruses. This indicates that
       contemporary macro viruses are not technically difficult
       to process.
 #1.3) Evidently, most AV producers invest relatively more work
       into the detection of macro viruses than of file viruses;
       concerning file viruses, many AV producers concentrate
       essentially on In-The-Wild viruses and neglect the threat
       of zoo viruses. There is a growing risk that the threat
       of file viruses is underestimated!
******************************************************************


4. Summary #2: Performance of DOS scanners on zoo testbeds:
============================================================

Concerning the rating of DOS scanners, the following grid is applied
to classify scanners:

 - detection rate = 100.0%   : scanner is graded "perfect"
 - detection rate above 95%  : scanner is graded "excellent"
 - detection rate above 90%  : scanner is graded "very good"
 - detection rate of 80-90%  : scanner is graded "good enough"
 - detection rate of 70-80%  : scanner is graded "not good enough"
 - detection rate of 60-70%  : scanner is graded "rather bad"
 - detection rate of 50-60%  : scanner is graded "very bad"
 - detection rate below 50%  : scanner is graded "useless"

To assess an "overall grade" (including file and macro virus
detection), the lowest of the related results is used to classify the
respective scanner. If several scanners of the same producer have been
tested, grading is applied to the most recent version (which is, in
most cases, the version with the highest detection rates). Only
scanners for which all tests were completed are considered; here, the
most recent version with completed tests was selected. (For problems
in test: see 8problms.txt.) A minimal sketch of this grading scheme is
given below.
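The grading grid is a simple threshold lookup, and the overall grade
is just the lower of the file and macro zoo results. The following
Python sketch illustrates that reading; it is only an illustration
(the function names, the handling of exact boundary values and the
example rates are assumptions of this sketch, only the grade names and
thresholds are taken from the grid above):

    # Illustrative sketch of the VTC grading grid for DOS zoo detection.
    # Grade names and thresholds follow the grid above; the function
    # names and the example rates are hypothetical.

    GRADES = [            # (lower bound in percent, grade name)
        (100.0, "perfect"),
        ( 95.0, "excellent"),
        ( 90.0, "very good"),
        ( 80.0, "good enough"),
        ( 70.0, "not good enough"),
        ( 60.0, "rather bad"),
        ( 50.0, "very bad"),
        (  0.0, "useless"),
    ]

    def grade(rate):
        """Map a detection rate (in percent) to a VTC grade."""
        for bound, name in GRADES:
            if rate >= bound:
                return name
        return "useless"

    def overall_grade(file_rate, macro_rate):
        """The overall grade uses the lower of the file/macro results."""
        return grade(min(file_rate, macro_rate))

    # Example: a product with 99.8% file and 100.0% macro zoo detection
    # (the rates reported for DSS below) is graded "excellent".
    print(overall_grade(99.8, 100.0))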
The following list indicates those scanners graded into one of the
upper three categories, with file and macro zoo virus detection rates
(unpacked objects) and with perfect ITW virus detection (rate = 100%):

                               (file/macro zoo; file/macro ITW)
                               --------------------------------
 "Perfect" DOS scanners:       NONE
 "Excellent" DOS scanners:
                   DSS         ( 99.8%  100.0%;  100.0%  100.0%)
                   SCN         ( 99.8%  100.0%;  100.0%  100.0%)
                   AVP         ( 99.7%  100.0%;  100.0%  100.0%)
 "Very Good" DOS scanners:
                   FPR         ( 98.7%  100.0%;   99.8%  100.0%)
                   DRW         ( 98.2%  100.0%;   98.3%  100.0%)
                   INO         ( 98.1%  100.0%;   99.8%  100.0%)
                   FSE         ( 97.6%  100.0%;   99.6%  100.0%)
                   AVA         ( 97.6%  100.0%;   95.9%  100.0%)
                   NOD         ( 96.9%  100.0%;   99.8%  100.0%)

****************************************************************
 Results #2: Quality of best 3 DOS scanners not yet perfect
             "Excellent" DOS scanners: DSS, SCN and AVP
****************************************************************
 Result #2.1) The overall virus detection quality of the best
       DOS scanners has reached a very acceptable level, also
       for viruses which are not "in-the-wild", but with an
       evident bias towards higher macro virus detection rates.
 #2.2) 3 scanners - DSS, SCN and AVP - are almost perfect.
**************************************************************


5. Summary #3: Performance of DOS scanners on ITW testbeds:
===========================================================

Concerning "In-The-Wild" viruses, a much more rigid grid must be
applied to classify scanners, as the likelihood is significant that a
user may find such a virus on her/his machine. The following grid is
applied:

 - detection rate is 100%  : scanner is "perfect"
 - detection rate is  >99% : scanner is "excellent"
 - detection rate is  >95% : scanner is "very good"
 - detection rate is  >90% : scanner is "good"
 - detection rate is  <90% : scanner is "risky"

The following DOS products reach 100% both for file and macro virus
detection and are rated "perfect" in this category (alphabetically
ordered):

                                   ( File    Macro    Boot )
                                   -------------------------
 "Perfect" DOS ITW scanners:  AVG  (100.0%  100.0%  100.0%)
                              AVP  (100.0%  100.0%  100.0%)
                              DSS  (100.0%  100.0%  100.0%)
                              FPR  (100.0%  100.0%  100.0%)
                              FSE  (100.0%  100.0%  100.0%)
                              INO  (100.0%  100.0%  100.0%)
                              NOD  (100.0%  100.0%  100.0%)
                              NVC  (100.0%  100.0%  100.0%)

One product reached 100% ITW detection for file and macro viruses but
missed the 100% level for ITW boot viruses:

 "Very Good" DOS ITW scanner: AVA  (100.0%  100.0%   98.7%)

As a macro-only product, HMV also reaches "perfect" 100% ITW
detection. Several other scanners reach 100% macro ITW detection, with
ITW file and boot virus detection falling into lower categories:

 "Good" DOS ITW scanners:     NAV  ( 92.0%  100.0%   98.7%)

**************************************************************
 Results #3: High ITW detection rates and implied risks:
             8 "Perfect" DOS ITW scanners (all 100%):
             AVG, AVP, DSS, FPR, FSE, INO, NOD and NVC.
**************************************************************
 Result #3.1) In-The-Wild detection of the best DOS scanners
       has been significantly improved since the last test. The
       number of perfect scanners (in this category) has jumped
       from 6 to 8.
 #3.2) The concentration of some AV producers on reaching 100%
       In-The-Wild detection rates is coupled with unacceptably
       low detection rates for the overall file, macro and boot
       zoo viruses.
**************************************************************
6. Summary #4: Performance of DOS scanners by virus classes:
============================================================

Some scanners are specialised in detecting some class of viruses
(either by deliberately limiting themselves to one class, esp. macro
viruses, or because that part is significantly better than other
parts). It is therefore worth noting which scanners perform best in
detecting file, boot and macro viruses. Compared to the last test
(1998-10), the number of "excellent" macro virus detectors has grown
significantly (as has the class of "good" ones, which is not listed
here); in contrast, "standard" file (and even more so: boot) viruses
seem to be handled comparably less carefully in product upgrading.

Two special tests of file viruses were also performed to determine the
quality of AV product maintenance. One test was concerned with almost
11,000 viruses generated with the VKIT virus generator. Some AV
products count each of the potential 15,000 viruses as a new variant,
while others count all VKIT viruses as just ONE virus. Fortunately, a
high proportion of the tested products detects these viruses (see
4.5), although the reliability of detection is significantly lower
than usual (see 6BDOSFIL.TXT). Another special test was devoted to the
detection of 10,000 polymorphic generations each of the following
polymorphic viruses: Maltese.Amoeba, MTE.Encroacher.B, NATAS and
TREMOR. Detection rates were "almost perfect".

Products rated "perfect" (=100%), "excellent" (>99%) and "very good"
(>95%) are listed.

4.1 Grading the detection of zoo file viruses:
----------------------------------------------
 "Perfect" DOS file scanner:      === NONE ===
 "Excellent" DOS file scanners:   DSS ( 99.8%)
                                  SCN ( 99.8%)
                                  AVP ( 99.7%)
 "Very Good" DOS file scanners:   FPR ( 98.7%)
                                  DRW ( 98.2%)
                                  INO ( 98.1%)
                                  AVA ( 97.6%)
                                  FSE ( 97.6%)
                                  NVC ( 98.1%)
                                  NOD ( 97.6%)

4.2 Grading the detection of zoo macro viruses:
-----------------------------------------------
 "Perfect" DOS macro scanners:    DSS (100.0%)
                                  SCN (100.0%)
 "Excellent" DOS macro scanners:  AVP ( 99.8%)
                                  FPR ( 99.8%)
                                  INO ( 99.8%)
                                  NOD ( 99.8%)
                                  NAV ( 99.7%)
                                  AVK ( 99.6%)
                                  FSE ( 99.6%)
                                  HMV ( 99.5%)
                                  RAV ( 99.2%)
 "Very Good" DOS macro scanners:  DRW ( 98.3%)
                                  VET ( 97.6%)
                                  AVA ( 95.9%)

4.3 Grading the detection of zoo boot viruses:
----------------------------------------------
 "Perfect" DOS boot scanner:      === NONE ===
 "Excellent" DOS boot scanners:   DSS ( 99.1%)
                                  NOD ( 99.1%)
                                  AVK ( 99.0%)
                                  FSE ( 99.0%)
                                  PAV ( 99.0%)
 "Very Good" DOS boot scanners:   AVP ( 98.2%)
                                  NVC ( 97.8%)
                                  INO ( 96.7%)
                                  AVA ( 95.3%)

4.4 Grading of Poly-virus detection:
------------------------------------
Based on the detection data (see 6BDOSFIL.TXT, table FDOS.FA), and
under the additional conditions that
 1) all infected objects for all viruses were detected
 2) with full reliability of identification and detection,
the following products can be rated as perfect Poly-detectors:

 "Perfect" Poly-detectors:        AVG (100.0)
                                  AVK (100.0)
                                  AVP (100.0)
                                  DRW (100.0)
                                  FPR (100.0)
                                  FSE (100.0)
                                  IRS (100.0)
                                  NAV (100.0)
                                  NOD (100.0)
                                  PAV (100.0)

The following products are "almost perfect", as they reach a 100%
detection rate at least after rounding, but with less precise
diagnostics:

 "Almost Perfect" Poly-detectors: AVA, DSS, INO, ITM, NAV, NVC,
                                  SCN, VET and VSP.
4.5 Grading of VKIT virus detection:
------------------------------------
Based on the detection data (see 6BDOSFIL.TXT, table FDOS.FB), and
under the additional conditions that
 1) all infected objects for all viruses were detected
 2) with full reliability of identification and detection,
NO product was "perfect", but several detected almost all samples
(rounded to 100.0%), though with some unreliability of identification:

 "Perfect" VKIT detectors:        NONE
 "Almost Perfect" VKIT detectors: AVK, AVP, DSS, FPR, FSE, PAV,
                                  SCN and TSC.

****************************************************************
 Results #4: Performance of DOS scanners by virus classes:
             "Perfect" scanners for macro zoo: DSS, SCN
             "Perfect" scanners for polymorphic virus set:
               AVG, AVK, AVP, DRW, FPR, FSE, IRS, NAV, NOD, PAV
             No perfect scanner for boot, file and VKit zoo.
****************************************************************
 Result #4.1) Specialised scanners (esp. those specialising on
       macro viruses) are not superior to the best overall
       scanners, even concerning large collections such as
       VTC's "zoo" testbeds.
****************************************************************


7. Summary #5: Detection of viruses in packed objects under DOS:
================================================================

Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects. It therefore seems reasonable to test
whether at least ITW viral objects compressed with given popular
methods (PKZIP, ARJ, LHA and RAR) are also detected. (Remark: compared
to the last test, where detection of packed zoo viruses was tested,
the test condition was reduced, as "only" detection of packed ITW
viruses was addressed!)

The results (see 6BDOSFIL.TXT and 6DDOSMAC.TXT) are rather
DISAPPOINTING: one product - AVP - is "perfect", as it detects ALL ITW
file and macro viruses packed with ALL 4 methods! 2 more scanners can
be rated "excellent", as they reliably detect at least all ITW viruses
packed with 3 methods. And 8 scanners (out of 21) detected at least
ONE macro virus packed with at least ONE compression method.

The following table lists ALL scanners which detect file and macro
viruses in objects compressed with AT LEAST TWO packing methods at an
acceptable level (>80%):

                      Packed ITW File Viruses    Packed ITW Macro Viruses
                      ( %ZIP  %LHA  %ARJ  %RAR)  ( %ZIP  %LHA  %ARJ  %RAR)
 -------------------------------------------------------------------------
 "Perfect" detection:
   AVP                (100.0 100.0 100.0 100.0)  (100.0 100.0 100.0 100.0)
 3 methods detected:
   DSS                (100.0 100.0 100.0   0.0)  ( 97.6  97.6  97.6   0.0)
   NOD                (100.0   0.0 100.0 100.0)  (100.0   0.0 100.0 100.0)
   SCN                (100.0 100.0 100.0   0.0)  (100.0 100.0 100.0   0.0)
   PAN                ( 98.9   0.0 100.0 100.0)  (100.0   0.0 100.0 100.0)
   DRW                ( 93.1  93.1  93.1   0.0)  (100.0 100.0 100.0   0.0)
 2 methods detected:
   AVK                (100.0   0.0 100.0   0.0)  (100.0   0.0 100.0   0.0)
   FPR                (100.0   0.0 100.0   0.0)  (100.0   0.0 100.0   0.0)
   FSE                ( 93.1   0.0  93.1   0.0)  (100.0   0.0 100.0   0.0)
   PAV                ( 93.1   0.0  93.1   0.0)  (100.0   0.0 100.0   0.0)
 -------------------------------------------------------------------------
 Remark: Much more data were collected on the precision and
         reliability of virus detection in packed objects. But in the
         present state, it seems NOT justified to add such
         differentiation to the results discussed here.
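As background on how such a packed-object test can be organised, the
following Python sketch shows one possible setup; it is NOT the VTC
test procedure. It re-packs a directory of samples into individual ZIP
archives with the standard-library zipfile module and computes a
detection rate from a scanner callback; the directory names and the
scan() callback are placeholders, and the ARJ, LHA and RAR packers
(which need external tools) are omitted.

    # Minimal sketch of a packed-object detection test (illustrative
    # only, not the VTC procedure). Only the ZIP method is shown;
    # directory names and the scan() callback are hypothetical.
    import zipfile
    from pathlib import Path

    def pack_samples_zip(sample_dir, out_dir):
        """Pack every sample file into its own ZIP archive."""
        out_dir.mkdir(parents=True, exist_ok=True)
        archives = []
        for sample in sorted(sample_dir.iterdir()):
            if sample.is_file():
                archive = out_dir / (sample.name + ".zip")
                with zipfile.ZipFile(archive, "w",
                                     zipfile.ZIP_DEFLATED) as zf:
                    zf.write(sample, arcname=sample.name)
                archives.append(archive)
        return archives

    def detection_rate(archives, scan):
        """Percentage of packed samples flagged by the scan callback."""
        if not archives:
            return 0.0
        hits = sum(1 for archive in archives if scan(archive))
        return 100.0 * hits / len(archives)

    if __name__ == "__main__":
        sample_dir = Path("itw_samples")      # hypothetical sample set
        if sample_dir.is_dir():
            packed = pack_samples_zip(sample_dir, Path("packed_zip"))
            # scan() would invoke an AV scanner on the archive and
            # parse its report; here it is only a placeholder.
            scan = lambda path: False
            print("ZIP detection rate: %.1f%%"
                  % detection_rate(packed, scan))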
********************************************************************
 Results #5: Detection of viruses in packed objects is insufficient:
             Only one product is perfect: AVP
********************************************************************
 Result #5.1) Only one product = AVP = can be rated "perfect"
       concerning the detection of infected packed objects, at
       least at the level of ITW file and macro viruses.
 #5.2) VERY FEW products have reached an acceptable level of
       detecting viruses in infected objects packed with 2 or 3
       compression methods. Significant investment of work is
       needed here.
********************************************************************


8. Summary #6: False-Positive avoidance of DOS and Win-NT scanners:
===================================================================

Regarding the ability of scanners to avoid False-Positive (FP) alarms,
the following AV products running under DOS reported NOT A SINGLE
False-Positive alarm in BOTH the file and macro clean testbeds and are
therefore rated "perfect":

 FP-avoiding "perfect" DOS scanners:  AVK, FSE, PAV and SCN

Several DOS scanners gave NO FP alarm EITHER on clean files OR on
clean macro objects:

 Perfect FP-avoidance on the DOS clean file testbed:
   AVK, AVP, FPR, FSE, NAV, NVC, PAV, PRO, SCN and TSC.
 Perfect FP-avoidance on the DOS clean macro testbed:
   AVA, AVK, DSS, FSE, PAV and SCN.

Comparing related results with the behaviour of 32-bit scanner engines
and esp. using results produced under Win-NT, there are (different
from the last test, where DSS was rated "perfect") 2 AV products which
avoid ANY FP alarm in both clean file and macro objects BOTH for DOS
and Win-NT:

 FP-avoiding "perfect" Win-NT scanners:  AVK and SCN!

Several more W-NT scanners also gave NO FP alarm on clean files or
clean macro objects:

 Perfect FP-avoidance under Win-NT for the clean file testbed:
   AVK, AVP, FPR, FSE, NAV, NVC, PAV, PRO, RAV, RA7 and SCN.
 Perfect FP-avoidance under Win-NT for the clean macro testbed:
   AVA, AVK, DSS and SCN.

Evidently, avoidance of False-Positive alarms is less advanced for
macro viruses (see 6DDOSMAC.TXT, table FDOS.M4). Concerning avoidance
of False-Positive alarms both under DOS AND Windows-NT, only two
products are rated "perfect": AVK and SCN!

****************************************************************
 Results #6: Avoidance of False-Positive alarms is insufficient.
             FP-avoiding perfect DOS scanners:  AVK, FSE, PAV, SCN
             FP-avoiding perfect W-NT scanners: AVK, SCN
****************************************************************
 Result #6.1) VERY FEW products reliably avoid ANY False-Positive
       alarm on clean file and macro objects, either under DOS
       or under Win-NT.
 #6.2) Only 2 products avoid ANY False-Positive alarm BOTH under
       DOS and Windows-NT: AVK and SCN!
 #6.3) Overall, the quality of FP avoidance has degraded since
       the last test: the number of False-Positive alarms is
       significantly LARGER compared to the last test, even
       though the testbed was deliberately NOT changed.
 #6.4) AV producers should intensify their work to avoid FP
       alarms.
*****************************************************************


9. Summary #7: Detection of File and Macro Malware (DOS/Win-NT):
================================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category is
that customers are interested in being warned about and protected
against not only viruses but also other malicious objects such as
trojans etc., the payload of which may be disastrous to their work
(e.g. stealing passwords).
Regrettably, the consciousness of AV producers of the need to protect
their users against related threats is still underdeveloped. Manifold
arguments are presented as to why AV products are not the best
protection against non-viral malware; from a technical point of view,
these arguments may seem conclusive, but at the same time almost
nothing is done to support customers with adequate AntiMalware
software. On the other side, AV methods (such as scanning for the
presence or absence of characteristic features) are also applicable -
though not ideal - to the detection of non-viral malware.

Since VTC test "1999-03", malware detection is a mandatory part of VTC
tests, both for submitted products and for those downloaded as free
evaluation copies. As in the last test, still NO product can be rated
a "perfect AM detector", but now 4 scanners under DOS AND W-NT, and 6
scanners under W-NT, are graded "excellent":

                                 ===== Malware Detection =====
                                 = under DOS ==  = under W-NT =
                                 (File   Macro;  File   Macro)
 ---------------------------------------------------------------
 "Excellent" DOS/W-NT scanners:
                 DSS             (97.5%  98.6%;  97.6%  98.6%)
                 SCN             (97.2%  97.9%;  96.7%  98.6%)
                 AVK             (94.9%  95.8%;  94.8%  95.8%)
                 PAV             (94.8%  94.4%;  94.9%  94.4%)
 ---------------------------------------------------------------
                 AVP             (88.3%  95.8%;  94.9%  91.5%)
                 FSE             (88.7%  95.8%;  99.4%  98.6%)
 ---------------------------------------------------------------

Moreover, the following scanners reach 90% detection either for file
or for macro malware under at least one operating system (DOS or
Win-NT):

 "Excellent" scanners in at least one category/under one OS:
   AVA, AVK, AVP, AVX, DSS, FPR/FMA, FSE, FWN, HMV, INO, IRS,
   NAV, NOD, NVC, PAV, RA7, RAV and SCN.

**************************************************************
 Results #7: AntiMalware detection under DOS/W-NT improving
             No "perfect" but 6 "excellent" AM products:
               for DOS and W-NT:      DSS, SCN, AVK, PAV
               for W-NT additionally: AVP, FSE
**************************************************************
 Result #7.1) The ability of AV products to also detect
       non-viral malware is improving. Now, 6 products detect
       file/macro malware at a 90% level either under DOS or
       under W-NT (4 under both).
 #7.2) Evidently, AV producers invest more work into macro
       malware detection, where 13 products (under DOS) and 18
       (under W-NT) detect macro malware at a 90% level.
 #7.3) With the continuing growth of the malware testbeds, AV
       producers are well advised to improve their products in
       this area as well.
*************************************************************
10. Summary #8: File/Macro virus detection under Windows-NT:
============================================================

The following table (ES4) summarizes the results of file and macro
virus detection under Windows-NT in test "1999-03" as compared with
previous tests:

Table ES4: Development of W-NT scanners from 1997-07 to 1999-03:
================================================================
Scan    ==== File Virus Detection ====   === Macro Virus Detection ===
ner     97/07 98/02 98/10 99/03 Delta    97/07 98/02 98/10 99/03 Delta
-----------------------------------------------------------------------
ANT      88.9  69.2  91.3     -     -     92.2     -  85.7     -     -
ANY         -     -  69.7     -     -        -     -  70.5     -     -
AVA         -  97.4  96.6  97.1  +0.5        -  91.9  97.2  95.2  -2.0
AVG         -     -     -  87.3     -        -     -     -  82.5     -
AVK         -     -  99.6  90.2  -9.4        -     -  99.6  99.6   0.0
AVP         -     -  83.7  99.9 +16.2        -     - 100.0  99.2  -0.8
AVX         -     -     -  74.2     -        -     -     -  98.9     -
AW          -  56.4     -     -     -        -     -  61.0     -     -
DRW         -     -     -  93.3     -        -     -     -  98.3     -
DWW         -     -     -     -     -        -     -     -  98.2     -
DSS      99.6  99.7  99.9  99.3  -0.6     99.0 100.0 100.0 100.0   0.0
FPR/FMA     -  96.1     -  98.7     -        -  99.9  99.8  99.8   0.0
FSE         -  85.3  99.8 100.0  +0.2        -     -  99.9 100.0  +0.1
FWN         -     -     -     -     -        -     -  99.6  99.7  +0.1
HMV         -     -     -     -     -        -     -  99.0  99.5  +0.5
IBM      95.2  95.2  77.2     *     *     92.9  92.6  98.6     *     *
INO         -  92.8     -  98.1     -        -  89.7     -  99.8     -
IRS         -  96.3     -  97.6     -        -  99.1     -  99.5     -
IVB         -     -     -     -     -        -     -  92.8  95.0  +2.2
NAV      86.5  97.1     -  98.0     -     95.6  98.7  99.9  99.7  -0.2
NOD         -     -     -  97.6     -        -     -     -  99.8     -
NVC      89.6  93.8  93.6  96.4  +2.8     96.6  99.2     -  98.9     -
PAN         -     -     -     *     -        -     -     -     *     -
PAV      97.7  98.7  98.4  97.2  -1.2     93.5  98.8  99.5  99.4  -0.1
PRO         -     -     -  37.3     -        -     -     -  58.0     -
RAV         -  81.6  84.9  85.5  +0.6        -  98.9  99.5  99.2  -0.3
RA7         -     -     -  89.3     -        -     -     -  99.2     -
PCC      63.1     -     -     -     -        -  94.8     -     -     -
PER         -     -     -     -     -        -  91.0     -     -     -
SCN      94.2  91.6  71.4  99.1 +27.7     97.6  99.1  97.7 100.0  +2.3
SWP      94.5  96.8  98.4     -     -     89.1  98.4  97.5     -     -
TBA         -  93.8  92.6     *     *     96.1     -  98.7     *     *
TNT         -     -     -     *     *        -     -  44.4     *     *
VET      64.9     -     -  65.4     -        -  94.0     -  94.9     -
VSA         -  56.7     -     -     -        -  84.4     -     -     -
VSP         -     -     -  87.0     -        -     -     -  86.7     -
-----------------------------------------------------------------------
Mean:    87.4% 88.1% 89.0% 89.2% +4.0%    94.7% 95.9% 91.6% 95.3% +0.1%
-----------------------------------------------------------------------

Generally, the ability of W-NT scanners to detect file viruses "in the
mean" was only moderately improved (+0.2%), but those products which
participated in the last test improved by an impressive 4.0%. On the
other side, "mean" macro virus detection improved by 3.7% to now 95.3%
(though not as good as in "1998-02"!), while products from the last
test improved only by 0.1%.

As under W-98, one product - FSE - reached a 100% detection rate for
both file and macro (zoo) viruses and therefore falls under the
category "Perfect W-NT scanner". 3 scanners reach the grade
"Excellent" (>99% detection), and 9 scanners are rated "Very Good"
(>95%):

                                        ( file    macro )
 "Perfect" Windows-NT scanner:     FSE  (100.0%  100.0%)
 "Excellent" Windows-NT scanners:  DSS  ( 99.3%  100.0%)
                                   SCN  ( 99.1%  100.0%)
                                   AVP  ( 99.9%   99.2%)
 "Very Good" Windows-NT scanners:  FPR  ( 98.7%   99.8%)
                                   DWW  ( 98.2%   98.2%)
                                   INO  ( 98.1%   99.8%)
                                   NAV  ( 98.0%   99.7%)
                                   NOD  ( 97.6%   99.8%)
                                   IRS  ( 97.6%   99.5%)
                                   PAV  ( 97.2%   99.4%)
                                   AVA  ( 97.1%   95.2%)
                                   NVC  ( 96.4%   98.9%)
 "Good" Windows-NT scanner:        AVK  ( 90.2%   99.6%)

For the detection of macro viruses under Windows-NT, the following 21
scanners detect at least 95% of zoo macro viruses:

 Perfect:   DSS, FSE and SCN (100%);
 Excellent: FPR, INO and NOD (99.8%), FWN and NAV (99.7%),
            AVK (99.6%), HMV and IRS (99.5%), PAV (99.4%),
            AVP, RAV and RA7 (99.2%);
 Very Good: AVX and NVC (98.9%), DRW (98.3%), DWW (98.2%),
            AVA (95.2%), IVB (95.0%).
**************************************************************
 Results #8: Virus detection rates under W-NT at a high level
             One "perfect" Windows-NT zoo scanner: FSE
             Plus 3 "excellent" and 9 "Very Good" scanners.
**************************************************************
 Result #8.1) Detection rates for file and esp. macro viruses
       of scanners under Windows-NT have reached a fairly high
       level, similar to W-98:
         Perfect scanner (100%):     1
         Excellent scanners (>99%):  3
         Very Good scanners (>95%):  9
 #8.2) AV producers should invest more work into file virus
       detection, esp. into VKIT virus detection (where results
       are not equally promising), as well as into the detection
       of viruses in compressed objects (essential for on-access
       scanning).
**************************************************************


11. Summary #9: File/Macro virus detection under 32-bit engines:
================================================================

Concerning the 32-bit engines used in Windows-98 and Windows-NT, it is
interesting to test the validity of the hypothesis that related
engines produce the same detection and identification quality. (For
details see 6HCOMP32.TXT.)

When comparing results from the related tests, it is interesting to
observe that identical detection at the file/macro zoo level is
presently achieved only by few (5 out of 20) products with at least
"very good" quality (>95%) both for zoo file and macro viruses:

 Equal results for W-98 and W-NT for zoo file AND macro viruses:
 ----------------------------------------------------------------
                 FSE  (100.0%  100.0%)
                 AVP  ( 99.9%   99.2%)
                 DWW  ( 98.2%   98.2%)
                 NOD  ( 97.6%   99.8%)
                 IRS  ( 97.6%   99.5%)

If looking at zoo macro virus detection only, 12 (out of 20) products
achieved at least "very good" quality (>95%):

 Equal results for W-98 and W-NT for zoo macro viruses:
 -------------------------------------------------------
 Perfect:   DSS, FSE and SCN (100.0%);
 Excellent: FPR, INO and NOD (99.8%), FWN and NAV (99.7%),
            AVK (99.6%), IRS (99.5%), AVP (99.2%);
 Very Good: DWW (98.2%), IVB (95.0%)

By the way: all these scanners detect In-The-Wild file and macro
viruses at the 100% level.

*****************************************************************
 Results #9: Only few W-32 scanners perform equally on W-98/W-NT
             Best uniform W-32 performance: FSE, AVP, DWW, NOD, IRS
*****************************************************************
 Result #9.1) The assumption that 32-bit engines in scanners
       produce the same detection rate for different
       instantiations of 32-bit operating systems (esp. for
       Windows-98 and Windows-NT) holds only for 5 scanners.
 #9.2) Analysis of ITW detection rates is NOT sufficient to
       determine the behaviour of 32-bit engines and does not
       guarantee equal detection rates for different W-32
       platforms (esp. W-98/W-NT).
*****************************************************************


12. Summary #10: Malware detection under Windows 98/NT:
=======================================================

With Windows-98 and Windows-NT often used for downloading potentially
hazardous objects from the Internet, it is interesting to measure the
ability of AntiVirus products to also act as AntiMalware products. The
same grid is applied as for the grading of DOS AM products (see
Eval #7).
Presently, NO AV product can be graded "Perfect" (all rates 100.0%),
but 6 scanners perform as AM products with grade "Excellent" (>90%),
both for W-98 and for W-NT, as the following table (ES5) shows:

Table ES5: Malware detection under Windows-98/Windows-NT:
=========================================================
           Detection of File Malware   Detection of Macro Malware
 Scanner     Win-98       Win-NT          Win-98       Win-NT
 ------------------------------------------------------------------
 FSE          99.3%        99.4%           98.6%        98.6%
 AVK          94.8%        94.9%           95.8%        95.8%
 DSS          97.6%        97.6%           98.6%        98.6%
 SCN          97.3%        96.7%           98.6%        98.6%
 PAV          94.9%        94.9%           94.4%        94.4%
 AVP          94.9%        94.9%           91.5%        91.5%
 -------------------------------------------------------------------

Detection of macro malware is evidently better supported than file
malware detection, as several more AV products detect macro malware
under W-98 (14 products) and W-NT (16 products). As no product is
rated "perfect", the following table lists the products rated "very
good" for macro malware detection under both Win-98 and Win-NT:

 Detection of macro malware under W-98/W-NT
 ------------------------------------------
 DSS, FSE, SCN     (98.6%  98.6%)
 FPR               (97.9%  97.9%)
 FWN, HMV, NOD     (96.5%  96.5%)
 AVK, INO          (95.8%  95.8%)
 IRS               (95.1%  95.1%)
 PAV               (94.4%  94.4%)
 AVP               (91.5%  91.5%)
 NAV               (90.8%  91.5%)
 NVC               (90.1%  90.1%)
 ------------------------------------------

Evidently, some AV products are able to help protect users by
detecting file and macro-related malware at a significant level.
Fortunately, the related products also show good to excellent results
in detecting viral malware.

***************************************************************
 Results #10: AntiMalware quality of AV products is developing
              No "perfect" AM product for W-98 and W-NT,
              but 6 "excellent" AM products:
              FSE, AVK, DSS, SCN, PAV, AVP
***************************************************************
 Result #10.1) Some AntiMalware producers help customers detect
        non-viral malware as well under 32-bit operating
        systems, esp. Win-98 and Win-NT.
 #10.2) The ability to detect macro malware is more developed
        than the detection of file malware.
 #10.3) Much more work must be invested to reliably detect file
        and macro malware and to protect customers from
        downloading trojans etc.
***************************************************************


13. Final remark: Searching for the "Perfect AV/AM product"
===========================================================

Under the scope of VTC's grading system, a "Perfect AV/AM product"
would have the following characteristics:

Definition: A "Perfect AntiVirus (AV) product"
----------------------------------------------
 1) will detect ALL viral samples "In-The-Wild" AND at least 99%
    of zoo samples, in ALL categories (file, boot and script-based
    viruses), always with the same high precision of identification
    and in every infected sample,
 2) will detect ALL ITW viral samples and at least 99% of zoo
    samples also in compressed objects, for at least all popular
    packers, and
 3) will NEVER issue a False-Positive alarm on any sample which is
    not viral.

Definition: A "Perfect AntiMalware (AM) product"
------------------------------------------------
 1) will be a "Perfect AntiVirus product", that is:
      100% ITW detection AND >99% zoo detection,
      AND high precision of identification,
      AND high precision of detection,
      AND 100% detection of ITW viruses in compressed objects,
      AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    whether in packed or unpacked form, reliably at high rates
    (>90%).

In VTC test "1999-03", we found *** NO perfect AV product *** and we
found *** NO perfect AM product ***.
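Read as a checklist, the two definitions above can be expressed as
simple predicates. The sketch below is only one illustrative encoding
(the ProductResult record and its field names are assumptions of this
sketch); the thresholds are the ones stated in the definitions: 100%
ITW detection, at least 99% zoo detection, no False Positives, and
more than 90% malware detection for the AM grade.

    # Illustrative predicates for the "Perfect AV/AM product"
    # definitions above. The ProductResult record and its field names
    # are hypothetical; the thresholds are those stated in the text.
    from dataclasses import dataclass

    @dataclass
    class ProductResult:
        itw_detection: float         # % of ITW samples detected
        zoo_detection: float         # % of zoo samples detected
        packed_itw_detection: float  # % of ITW samples found in packed objects
        false_positives: int         # FP alarms on the clean testbeds
        malware_detection: float     # % of non-viral malware detected

    def is_perfect_av(r):
        """Perfect AV: 100% ITW, >=99% zoo, 100% packed ITW, no FPs."""
        return (r.itw_detection == 100.0
                and r.zoo_detection >= 99.0
                and r.packed_itw_detection == 100.0
                and r.false_positives == 0)

    def is_perfect_am(r):
        """Perfect AM: a perfect AV product detecting >90% of malware."""
        return is_perfect_av(r) and r.malware_detection > 90.0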
But some products seem to approach our definitions at a rather high
level (taking into account the highest value of "perfect" defined at
the 100% level and "excellent" defined as 99% for virus detection and
90% for malware detection):

 Test category:      "Perfect"              "Excellent"
 -------------------------------------------------------------
 DOS zoo tests:      ---                    DSS, SCN, AVP
 DOS ITW tests:      AVG, AVP, DSS, FPR,    ---
                     FSE, INO, NOD, NVC
 DOS pack-tests:     AVP                    NOD, SCN
 FP avoidance DOS:   AVK, FSE, PAV, SCN     ---
 FP avoidance W-NT:  AVK, SCN               ---
 W-NT zoo tests:     FSE                    DSS, SCN, AVP
 W-32 uniformity:    FSE                    AVP
 -------------------------------------------------------------
 Malware NT/DOS:     ---                    DSS, SCN, AVK, PAV
 Malware W-98/W-NT:  ---                    FSE, AVK, DSS, SCN,
                                            PAV, AVP
 -------------------------------------------------------------

In order to support the race for more customer protection, we evaluate
the order of performance in this test with a simple algorithm, by
counting the majority of places (weighing "perfect" twice and
"excellent" once), for the first 5 places:

 ************************************************************
  "Perfect" AntiVirus product:          NONE
  "Excellent" AV products:
        1st place: FSE and SCN (both 9 points)
        3rd place: AVP         (8 points)
        4th place: AVK and DSS (both 6 points)
 ************************************************************
  "Perfect" AntiMalware product:        NONE
  "Excellent" AntiMalware products:
        1st place: SCN         (11 points)
        2nd place: FSE         (10 points)
        3rd place: AVP         ( 9 points)
        4th place: DSS and AVK (both 8 points)
 ************************************************************

Generally, we hope that these rather detailed results help AV
producers to adapt their products to the growing threats and thus to
protect their customers.


14. Availability of full test results:
======================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's HomePage (VTC is part of Working
Group AGN):

   ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comment or critical remark which helps VTC learn to improve our
test methods will be warmly welcomed.

The next comparative test is planned for May to July 1999, with viral
databases frozen on March 31, 1999. Any AV producer wishing to
participate in the forthcoming test is invited to submit related
products.

On behalf of the VTC Test Crew:
Dr. Klaus Brunnstein (April 15, 1999)


15. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 1999 by Klaus Brunnstein and the
Virus Test Center (VTC) at the University of Hamburg, Germany.
Permission (Copy-Left) is granted to everybody to distribute copies of
this information in electronic form, provided that this is done for
free, that the contents of the information are not changed in any way,
and that the origin of this information is explicitly mentioned. It is
esp. permitted to store and distribute this set of text files at
university or other public mirror sites where security/safety related
information is stored for unrestricted public access for free.

Any other use, esp. including the distribution of these text files on
CD-ROMs or any publication as a whole or in parts, is ONLY permitted
after contact with the supervisor, Prof. Dr. Klaus Brunnstein, or
authorized members of the Virus Test Center at Hamburg University, and
this agreement must be in explicit writing, prior to any publication.
No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any methods,
products, instructions or ideas contained in the material herein.

Prof. Dr. Klaus Brunnstein
University of Hamburg, Germany
(April 15, 1999)