=========================================
File 7EVAL.TXT
Evaluation of VTC Scanner Test "1999-03":
=========================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval #1:  Development of DOS Scanner Detection Rates
          Table E1: Improvement of DOS scanners from 1997-02 to 1999-03
Findings #1:  DOS scanner detection rates need improvement!
Eval #2:  Evaluation of overall DOS AV detection rates
Eval #2.1: Overall grades of DOS scanners
Findings #2:  Quality of best 3 AV DOS scanners not yet perfect
Eval #3:  In-The-Wild Detection under DOS
Findings #3:  High ITW detection rates and implied risks
Eval #4:  Evaluation for detection by virus classes under DOS
          4.1 Grading the Detection of file viruses
          4.2 Grading the Detection of macro viruses
          4.3 Grading the Detection of boot viruses
          4.4 Grading of Poly-virus detection
          4.5 Grading of VKit virus detection
Findings #4:  Performance of DOS scanners by virus classes
Eval #5:  Detection of Packed File and Macro Viruses under DOS
Findings #5:  Detection of packed viral objects insufficient
Eval #6:  False Positive Detection in Clean Files and Macros (DOS/W-NT)
Findings #6:  Avoidance of False-Positive Alarms insufficient
Eval #7:  Evaluation of File and Macro Malware detection (DOS/W-NT)
Findings #7:  AntiMalware detection under DOS/W-NT improving
Eval #8:  Overall virus detection rates under Windows-98
Findings #8:  Virus detection rates under W-98 on high level
Eval #9:  Overall virus detection rates under Windows-NT
Findings #9:  Virus detection rates under W-NT on high level
Eval #10: File/Macro Virus detection under 32-bit engines
Findings #10: Only few W-32 scanners perform equally on W-98/W-NT
Eval #11: Evaluation for malware virus detection under Windows 98/NT
Findings #11: AntiMalware quality of AV products is developing

This part of the VTC "1999-03" test report evaluates the detailed
results as given in sections (files):
   6BDOSFIL.TXT  File Virus/Malware results DOS
   6CDOSBOO.TXT  Boot Virus results DOS
   6DDOSMAC.TXT  Macro Viruses/Malware results DOS
   6FW98.TXT     File/Macro Viruses/Malware results Win-98
   6GWNT.TXT     File/Macro Viruses/Malware results Win-NT
   6HCMP32.TXT   Comparison File/Macro results Win-98/NT

Eval #1: Development of DOS Scanner Detection Rates:
====================================================
Concerning the performance of DOS scanners, a comparison of virus
detection results in the previous 4 tests "1997-02/07" and "1998-02/10"
with "1999-03" shows how scanners behave and how manufacturers work in
adapting their products to the growing threat of new viruses and
malware.

Table E1 lists the development of the detection rates of scanners
(most recent version in each test) and calculates the change
(+ indicating improvement) in detection rates between the last test
(1998-10) and the present one (1999-03). Finally, the "mean change",
both as absolute and relative improvement of detection rates, is
given in the last row of Table E1.

This comparison concentrates on file and macro virus detection
quality. VTC tests do NOT test for physical boot sector detection
(see 4testcon.txt), so results may be unfair to those scanners which
analyse the physical layout of boot viruses. Therefore, boot virus
detection results are not discussed here in detail (related results
are available in 6CDOSBOO.TXT).

For reasons of fairness, it must be noted that improvement is much
more difficult to achieve for those products which have already
reached a very high level of detection and quality (say: more than
90 or 95%) than for those products with lower detection rates. Some
products have incorporated new engines and included formerly separate
scanners (e.g. for macro viruses), which leads to improved
performance. Generally, changes in the order of about +-1.5% are less
significant, as this is about the growth rate of the virus population
per month; detection thus depends strongly upon whether some virus is
reported (and analysed and included) just before a new update is
delivered.
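The DELTA columns and the "mean change" row of Table E1 follow from simple arithmetic. A minimal Python sketch (not VTC's actual tooling; scanner names and rates are illustrative values copied from the table, with '-' entries represented as None):

```python
# Sketch of the Table E1 arithmetic (not VTC's actual tooling).
# Rates are illustrative file-virus values copied from the table;
# '-' entries are represented as None.

def delta(prev, curr):
    """Change in detection rate between two tests; None if a rate is missing."""
    if prev is None or curr is None:
        return None
    return round(curr - prev, 1)

# (1998-10, 1999-03) pairs for a few scanners
file_rates = {
    "AVA": (97.9, 97.6),
    "DRW": (93.1, 98.2),
    "SCN": (87.8, 99.8),
    "NOD": (96.9, None),   # no 1999-03 result -> no DELTA
}

deltas = {name: delta(p, c) for name, (p, c) in file_rates.items()}

# "Mean change": averaged only over products with results in both tests
changes = [d for d in deltas.values() if d is not None]
mean_change = round(sum(changes) / len(changes), 1)
```

Note that the mean change is computed only over products tested in both runs, which is why it can differ from the change of the overall mean row (new and retired products shift the latter).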
Table E1: Improvement of DOS scanners from 1997-02 to 1999-03:
==============================================================
      ------- File Virus Detection -------  ------ Macro Virus Detection ------
SCAN  97/02 97/07 98/02 98/10 99/03 DELTA   97/02 97/07 98/02 98/10 99/03 DELTA
NER     %     %     %     %     %     %       %     %     %     %     %     %
-------------------------------------------------------------------------------
ALE    98.8  94.1  89.4    -     -     -     96.5  66.0  49.8    -     -     -
AVA    98.9  97.4  97.4  97.9  97.6  -0.3    99.3  98.2  80.4  97.2  95.9  -1.3
AVG    79.2  85.3  84.9  87.6  87.1  -0.5    25.2  71.0  27.1  81.6  82.5  +0.9
AVK      -     -     -   90.0  75.0 -15.0      -     -     -   99.7  99.6  -0.1
AVP    98.5  98.4  99.3  99.7  99.7   0.0    99.3  99.0  99.9 100.0  99.8  -0.2
ANT    73.4  80.6  84.6  75.7    -     -     58.0  68.6  80.4  56.6    -     -
DRW    93.2  93.8  92.8  93.1  98.2  +5.1    90.2  98.1  94.3  99.3  98.3  -1.0
DSS    99.7  99.6  99.9  99.9  99.8  -0.1    97.9  98.9 100.0 100.0 100.0   0.0
FMA      -     -     -     -     -     -     98.6  98.2  99.9    -     -     -
FPR    90.7  89.0  96.0  95.5  98.7  +3.2    43.4  36.1  99.9  99.8  99.8   0.0
FSE      -     -   99.4  99.7  97.6  -2.1      -     -   99.9  90.1  99.6  +9.5
FWN      -     -     -     -     -     -     97.2  96.4  91.0  85.7    -     -
HMV      -     -     -     -     -     -       -     -   98.2  99.0  99.5  +0.5
IBM    93.6  95.2  96.5    -     *     *     65.0  88.8  99.6    -     *     *
INO      -     -   92.0  93.5  98.1  +4.6      -     -   90.3  95.2  99.8  +4.6
IRS      -   81.4  74.2    -   51.6    -       -   69.5  48.2    -   89.1    -
ITM      -   81.0  81.2  65.8  64.2  -1.6    81.8  58.2  68.6  76.3  70.9  -5.4
IVB     8.3    -     -     -   96.9    -       -     -     -     -     -     -
NAV    66.9  67.1  97.1  98.1  77.2 -20.9    80.7  86.4  98.7  99.8  99.7  -0.1
NOD      -     -     -   96.9    -     -       -     -     -     -   99.8    -
NVC    87.4  89.7  94.1  93.8  97.6  +3.8    13.3  96.6  99.2  90.8    -     -
PAN      -     -   67.8    -     -     -       -     -   73.0    -     -     -
PAV      -   96.6  98.8    -   73.7    -       -   93.7 100.0    -   99.5    -
PCC      -     -     -     -     -     -       -   67.6    -     -     -     -
PCV    67.9    -     -     -     -     -       -     -     -     -     -     -
PRO      -     -     -     -   35.5    -       -     -     -     -   81.5    -
RAV      -     -     -   71.0    -     -       -     -     -   99.5  99.2  -0.3
SCN    83.9  93.5  90.7  87.8  99.8 +12.0    95.1  97.6  99.0  98.6 100.0  +1.4
SWP    95.9  94.5  96.8  98.4    -     -     87.4  89.1  98.4  98.6    -     -
TBA    95.5  93.7  92.1  93.2    *     *     72.0  96.1  99.5  98.7    *     *
TSC      -     -   50.4  56.1  39.5 -16.6      -     -   81.9  17.0  76.5 +59.5
TNT    58.0    -     -     -     *     *       -     -     -     -     *     *
VDS      -   44.0  37.1    -     -     -     16.1   9.9   8.7    -     -     -
VET      -   64.9    -     -   65.3    -       -   94.0  97.3  97.5  97.6  +0.1
VRX      -     -     -     -     -     -       -     -     -     -     -     -
VBS    43.1  56.6    -   35.5    -     -       -     -     -     -     -     -
VHU    19.3    -     -     -     -     -       -     -     -     -     -     -
VSA      -     -   56.9    -     -     -       -     -   80.6    -     -     -
VSP      -     -     -   76.1  71.7  -4.4      -     -     -     -     -     -
VSW      -     -   56.9    -     -     -       -     -   83.0    -     -     -
VTR    45.5    -     -     -     -     -      6.3    -     -     -     -     -
XSC    59.5    -     -     -     -     -       -     -     -     -     -     -
-------------------------------------------------------------------------------
Mean  74.2% 84.8% 84.4% 85.4% 81.2% -2.2%   69.6% 80.9% 83.8% 89.6% 93.6% +4.3%
-------------------------------------------------------------------------------

******************************************************************
Findings #1: DOS zoo detection rates need improvement!
******************************************************************
Findings
 #1.1) The ability of scanners to detect file viruses under DOS has
       decreased, both for those scanners which participated in the
       last VTC test (from 85.4% to 83.2%) and in general (including
       new products), to now only 81.2%. On the other hand, 11 (out
       of 19) products detected In-The-Wild file viruses completely
       (100.0%).
 #1.2) On the better side, the ability of scanners to detect macro
       viruses improved significantly (by another 4.3%) to now 93.6%
       (mean detection rate). Again, 2 scanners detect ALL zoo macro
       viruses, and 17 (out of 21) scanners detect ALL ITW macro
       viruses. This indicates that contemporary macro viruses are
       not technically difficult to process.
 #1.3) Evidently, most AV producers invest relatively more work into
       the detection of macro viruses than of file viruses;
       concerning file viruses, many AV producers concentrate
       essentially on In-The-Wild viruses and neglect the threat of
       zoo viruses. There is a growing risk that the threat of file
       viruses is underestimated!
********************************************************************
Eval #2: Evaluation of overall DOS AV detection rates:
======================================================
The following grid is applied to classify scanners:
 - detection rate =100%     : scanner is "perfect"
 - detection rate above 99% : scanner is graded "excellent"
 - detection rate above 95% : scanner is graded "very good"
 - detection rate above 90% : scanner is graded "good"
 - detection rate of 80-90% : scanner is graded "good enough"
 - detection rate of 70-80% : scanner is graded "not good enough"
 - detection rate of 60-70% : scanner is graded "rather bad"
 - detection rate of 50-60% : scanner is graded "very bad"
 - detection rate below 50% : scanner is graded "useless"

Remark: Following the growth in virus numbers, the need for finer
granularity in the highest grades resulted in assigning the grade
"excellent", from this test on, only to products which detect AT
LEAST 99% in each detection category. Grades "very good" and "good"
have been adapted accordingly.

Eval #2.1: Overall grades of DOS scanners:
==========================================
To assign an "overall AV grade" (including file and macro virus
detection, for unpacked objects), the lowest of the related results
is used to classify each scanner. If several scanners of the same
producer have been tested, grading is applied to the most recent
version (which is in most cases the version with the highest
detection rates). Only scanners for which all tests were completed
are considered. (For problems in the test: see 8problms.txt.)
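The grading grid and the "lowest related result" rule of Eval #2.1 can be written down compactly. A minimal sketch (function names are my own, not VTC's; boundary handling at 80/70/60/50 follows the "of 80-90%" reading, with the lower bound belonging to the range):

```python
# VTC grading grid and Eval #2.1 rule as code (illustrative sketch).

def grade(rate):
    """Map a detection rate (percent) to its VTC grade."""
    if rate == 100.0:
        return "perfect"
    if rate > 99.0:
        return "excellent"
    if rate > 95.0:
        return "very good"
    if rate > 90.0:
        return "good"
    if rate >= 80.0:
        return "good enough"
    if rate >= 70.0:
        return "not good enough"
    if rate >= 60.0:
        return "rather bad"
    if rate >= 50.0:
        return "very bad"
    return "useless"

def overall_grade(rates):
    """Eval #2.1: the overall grade uses the LOWEST of the related results."""
    return grade(min(rates))
```

For example, DSS's zoo rates (99.8% file, 100.0% macro) give `overall_grade([99.8, 100.0])` = "excellent", since the lower of the two rates falls into the >99% band.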
The following list indicates those scanners graded into one of the
upper three categories, with file and macro virus detection rates in
unpacked forms, and with perfect ITW virus detection (rate=100%):

                                (file/macro zoo; file/macro ITW)
                                --------------------------------
"Perfect" DOS scanners:    NONE

"Excellent" DOS scanners:  DSS  ( 99.8% 100.0%; 100.0% 100.0%)
                           SCN  ( 99.8% 100.0%; 100.0% 100.0%)
                           AVP  ( 99.7% 100.0%; 100.0% 100.0%)

"Very Good" DOS scanners:  FPR  ( 98.7% 100.0%;  99.8% 100.0%)
                           DRW  ( 98.2% 100.0%;  98.3% 100.0%)
                           INO  ( 98.1% 100.0%;  99.8% 100.0%)
                           FSE  ( 97.6% 100.0%;  99.6% 100.0%)
                           AVA  ( 97.6% 100.0%;  95.9% 100.0%)
                           NOD  ( 96.9% 100.0%;  99.8% 100.0%)

**************************************************************
Findings #2: Quality of best 3 DOS scanners not yet perfect
             Excellent DOS scanners: DSS, SCN and AVP
**************************************************************
Finding
 #2.1) The overall virus detection quality of the best DOS scanners
       has reached a very acceptable level, also for viruses which
       are not "in-the-wild", but with an evident bias towards
       higher macro virus detection rates.
 #2.2) 3 scanners - DSS, SCN and AVP - are almost perfect.
**************************************************************
Eval #3: In-The-Wild Detection under DOS:
=========================================
Concerning "In-The-Wild" viruses, the following grid is applied:
 - detection rate is 100% : scanner is "perfect"
 - detection rate is >99% : scanner is "excellent"
 - detection rate is >95% : scanner is "very good"
 - detection rate is >90% : scanner is "good"
 - detection rate is <90% : scanner is "risky"

The following DOS products reach 100% both for file and macro virus
detection and are rated "perfect" in this category (alphabetically
ordered):
                                  ( File   Macro   Boot )
                                  -----------------------
"Perfect" DOS ITW scanners:  AVG  (100.0%  100.0%  100.0%)
                             AVP  (100.0%  100.0%  100.0%)
                             DSS  (100.0%  100.0%  100.0%)
                             FPR  (100.0%  100.0%  100.0%)
                             FSE  (100.0%  100.0%  100.0%)
                             INO  (100.0%  100.0%  100.0%)
                             NOD  (100.0%  100.0%  100.0%)
                             NVC  (100.0%  100.0%  100.0%)

One product reached 100% ITW detection of file and macro viruses but
misses the 100% level for ITW boot viruses:

"Very good" DOS ITW scanner: AVA  (100.0%  100.0%   98.7%)

As a macro-only product, HMV also reaches "perfect" 100% ITW
detection. Several other scanners reach 100% macro ITW detection,
with ITW file and boot viruses falling into lower categories:

"Good" DOS ITW scanners:     NAV  ( 92.0%  100.0%   98.7%)

**************************************************************
Findings #3: High ITW detection rates and implied risks:
**************************************************************
Findings
 #3.1) In-The-Wild detection of the best DOS scanners has been
       significantly improved since the last test. The number of
       perfect scanners (in this category) has jumped from 6 to 8.
 #3.2) The concentration of some AV producers on reaching 100%
       In-The-Wild detection rates is coupled with unacceptably low
       detection rates for overall file, macro and boot zoo viruses.
**************************************************************
Eval #4: Evaluation for detection by virus classes under DOS:
=============================================================
Some scanners are specialised in detecting some class of viruses
(either by deliberately limiting themselves to one class, esp. macro
viruses, or because that part is significantly better than other
parts). It is therefore worth noting which scanners perform best in
detecting file, boot and macro viruses.

Compared to the last test (1998-10), the number of "excellent" macro
virus detectors has grown significantly (as has the class of "good"
ones, which is not listed here); in contrast, "standard" file (and
even more: boot) viruses seem to be handled comparably less
carefully in product upgrading.

Two special tests of file viruses were also performed to determine
the quality of AV product maintenance. One test was concerned with
almost 11,000 viruses generated with the VKIT virus generator. Some
AV products count each of the potential 15,000 viruses as a new
variant, while others count all VKIT viruses as just ONE virus.
Fortunately, a high proportion of the tested products detects these
viruses (see 4.5), although reliability of detection is
significantly lower than normal (see 6BDOSFIL.TXT).

Another special test was devoted to the detection of 10,000
polymorphic generations each of the following polymorphic viruses:
Maltese.Amoeba, MTE.Encroacher.B, NATAS and TREMOR. Detection rates
were "almost perfect".

Products rated "perfect" (=100%), "excellent" (>99%) and "very good"
(>95%) are listed.
4.1 Grading the Detection of zoo file viruses:
----------------------------------------------
"Perfect" DOS scanner:          === NONE ===

"Excellent" DOS scanners:       DSS ( 99.8%)
                                SCN ( 99.8%)
                                AVP ( 99.7%)

"Very Good" DOS file scanners:  FPR ( 98.7%)
                                DRW ( 98.2%)
                                INO ( 98.1%)
                                AVA ( 97.6%)
                                FSE ( 97.6%)
                                NVC ( 98.1%)
                                NOD ( 97.6%)

4.2 Grading the Detection of zoo macro viruses:
-----------------------------------------------
"Perfect" DOS macro scanners:   DSS (100.0%)
                                SCN (100.0%)

"Excellent" DOS macro scanners: AVP ( 99.8%)
                                FPR ( 99.8%)
                                INO ( 99.8%)
                                NOD ( 99.8%)
                                NAV ( 99.7%)
                                AVK ( 99.6%)
                                FSE ( 99.6%)
                                HMV ( 99.5%)
                                RAV ( 99.2%)

"Very Good" DOS macro scanners: DRW ( 98.3%)
                                VET ( 97.6%)
                                AVA ( 95.9%)

4.3 Grading the Detection of zoo boot viruses:
----------------------------------------------
"Perfect" DOS boot scanner:     === NONE ===

"Excellent" DOS boot scanners:  DSS ( 99.1%)
                                NOD ( 99.1%)
                                AVK ( 99.0%)
                                FSE ( 99.0%)
                                PAV ( 99.0%)

"Very Good" DOS boot scanners:  AVP ( 98.2%)
                                NVC ( 97.8%)
                                INO ( 96.7%)
                                AVA ( 95.3%)

4.4 Grading of Poly-virus detection:
------------------------------------
Based on the detection data (see 6BDOSFIL.TXT, table FDOS.FA), and
with the additional conditions that
  1) all infected objects for all viruses were detected
  2) with full reliability of identification and detection,
the following products can be rated as perfect Poly-detectors:

"Perfect" Poly-detectors:       AVG (100.0)   AVK (100.0)
                                AVP (100.0)   DRW (100.0)
                                FPR (100.0)   FSE (100.0)
                                IRS (100.0)   NAV (100.0)
                                NOD (100.0)   PAV (100.0)

The following products are "almost perfect" as they reach a 100%
detection rate at least with rounding, but with less precise
diagnostics:

"Almost Perfect" Poly-detectors: AVA, DSS, INO, ITM, NAV, NVC, SCN,
                                 VET and VSP.
4.5 Grading of VKit virus detection:
------------------------------------
Based on the detection data (see 6BDOSFIL.TXT, table FDOS.FB), and
with the additional conditions that
  1) all infected objects for all viruses were detected
  2) with full reliability of identification and detection,
NO product was "perfect", but several detected almost all samples
(rounded to 100.0%), though with some unreliability of
identification:

"Perfect" VKIT detectors:        NONE

"Almost Perfect" VKIT detectors: AVK, AVP, DSS, FPR, FSE, PAV, SCN
                                 and TSC.

****************************************************************
Finding #4: Performance of DOS scanners by virus classes:
            Perfect scanners for macro zoo: DSS, SCN
            Perfect scanners for Polymorphic virus set:
               AVG, AVK, AVP, DRW, FPR, FSE, IRS, NAV, NOD, PAV.
            No perfect scanner for boot, file and VKit zoo
****************************************************************
Finding
 #4.1) Specialised scanners (esp. those specialising in macro
       viruses) are not superior to the best overall scanners, even
       concerning large collections such as VTC's "zoo" testbeds.
****************************************************************

Eval #5: Detection of Packed File and Macro Viruses under DOS:
==============================================================
Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects. It seems therefore reasonable to test
whether at least ITW viral objects compressed with given popular
methods (PKZIP, ARJ, LHA and RAR) are also detected. (Remark:
compared to the last test, where detection of packed zoo viruses was
tested, the test condition was reduced, as "only" detection of
packed ITW viruses was addressed!)

Results (see 6BDOSFIL.TXT and 6DDOSMAC.TXT) are rather
DISAPPOINTING: One product - AVP - is "perfect", as it detects ALL
ITW file and macro viruses packed with ALL 4 methods!
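Why archive support matters for on-access scanning can be illustrated with a toy example: only after unpacking does the engine see the infected member. This is a minimal sketch, not any tested product's implementation; `SIGNATURE`, `looks_infected` and `scan_zip` are hypothetical names, and a trivial byte-signature check stands in for a real engine:

```python
import io
import zipfile

# Toy stand-in for a scanner engine: flag data containing a signature.
SIGNATURE = b"FAKE-VIRUS-TEST-PATTERN"   # hypothetical, illustration only

def looks_infected(data: bytes) -> bool:
    return SIGNATURE in data

def scan_zip(archive_bytes: bytes) -> list:
    """Unpack a ZIP in memory and return names of flagged members."""
    hits = []
    with zipfile.ZipFile(io.BytesIO(archive_bytes)) as zf:
        for name in zf.namelist():
            if looks_infected(zf.read(name)):
                hits.append(name)
    return hits

# Build a small in-memory ZIP with one clean and one "infected" member.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("report.doc", b"harmless content")
    zf.writestr("dropper.exe", b"MZ" + SIGNATURE + b"\x00")
archive = buf.getvalue()

# A scanner without ZIP support sees only compressed bytes; after
# unpacking, the infected member is found:
# scan_zip(archive) -> ["dropper.exe"]
```

A real product must repeat this for each supported format (PKZIP, ARJ, LHA, RAR) and recurse into nested archives, which is why per-method results differ so widely in the table below.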
2 more scanners can be rated "excellent", as they reliably detect at
least all ITW viruses packed with 3 methods. But only 8 scanners
(out of 21) detected at least ONE macro virus packed with at least
ONE compression method.

The following table lists ALL scanners which detect file and macro
viruses in objects compressed with AT LEAST TWO packing methods on
an acceptable level (>80%):

                 Packed ITW File Viruses     Packed ITW Macro Viruses
                 ( %ZIP  %LHA  %ARJ  %RAR)   ( %ZIP  %LHA  %ARJ  %RAR)
----------------------------------------------------------------------
"Perfect" Detection:
  AVP            (100.0 100.0 100.0 100.0)   (100.0 100.0 100.0 100.0)
3 Methods detected:
  DSS            (100.0 100.0 100.0   0.0)   ( 97.6  97.6  97.6   0.0)
  NOD            (100.0   0.0 100.0 100.0)   (100.0   0.0 100.0 100.0)
  SCN            (100.0 100.0 100.0   0.0)   (100.0 100.0 100.0   0.0)
  DRW            ( 93.1  93.1  93.1   0.0)   (100.0 100.0 100.0   0.0)
2 Methods detected:
  AVK            (100.0   0.0 100.0   0.0)   (100.0   0.0 100.0   0.0)
  FPR            (100.0   0.0 100.0   0.0)   (100.0   0.0 100.0   0.0)
  FSE            ( 93.1   0.0  93.1   0.0)   (100.0   0.0 100.0   0.0)
  PAV            ( 93.1   0.0  93.1   0.0)   (100.0   0.0 100.0   0.0)
----------------------------------------------------------------------

Remark: Much more data were collected on the precision and
reliability of virus detection in packed objects. But in the present
state, it seems NOT justified to add this differentiation to the
results discussed here.

********************************************************************
Findings #5: Detection of viruses in packed objects is insufficient:
             Only one product is perfect: AVP
********************************************************************
Findings
 #5.1) Only one product - AVP - can be rated "perfect" concerning
       detection of infected packed objects, at least on the level
       of ITW file and macro viruses.
 #5.2) VERY FEW products have reached an acceptable level of
       detecting viruses in packed infected objects with 2 or 3
       compression methods. Significant investment of work is
       needed here.
********************************************************************
Eval #6: False-Positive Detection in Clean Files and Macros:
============================================================
First introduced in VTC test "1998-10", a set of clean (and non-
malicious) objects has been added to the file and macro virus
testbeds to determine the ability of scanners to avoid
False-Positive (FP) alarms. This ability is essential for
"excellent" and "very good" scanners, as there is no automatic aid
for customers to handle such cases (besides the psychological impact
on the customer's work). Therefore, the grid used for grading AV
products must be significantly more rigid than the one used for
detection (see Eval #2). The following grid is applied to classify
scanners:
 - False Positive rate = 0.0%: scanner is graded "perfect"
 - False Positive rate < 0.5%: scanner is graded "excellent"
 - False Positive rate < 2.5%: scanner is graded "very good"
 - False Positive rate < 5.0%: scanner is graded "good enough"
 - False Positive rate <10.0%: scanner is graded "rather bad"
 - False Positive rate <20.0%: scanner is graded "very bad"
 - False Positive rate >20.0%: scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, the following
AV products running under DOS reported NO SINGLE False-Positive
alarm BOTH in the file and macro zoo testbeds and are therefore
rated "perfect":

FP-avoiding "perfect" DOS scanners: AVK, FSE, PAV and SCN

Several DOS scanners gave NO FP alarm EITHER on clean files or
macros:

Perfect FP-avoidance on DOS clean file testbed:
   AVK, AVP, FPR, FSE, NAV, NVC, PAV, PRO, SCN and TSC.
Perfect FP-avoidance on DOS clean macro testbed:
   AVA, AVK, DSS, FSE, PAV and SCN.

Comparing related results with the behaviour of 32-bit scanner
engines, and esp.
using results produced under Win-NT, there are (different from the
last test, where DSS was rated "perfect") 2 AV products which avoid
ANY FP alarm on both clean file and macro objects BOTH for DOS and
Win-NT:

FP-avoiding "perfect" Win-NT scanners: AVK and SCN!

Several more W-NT scanners also gave NO FP alarm on clean files or
macros:

Perfect FP-avoidance under Win-NT for clean file testbed:
   AVK, AVP, FPR, FSE, NAV, NVC, PAV, PRO, RAV, RA7 and SCN.
Perfect FP-avoidance on Win-NT clean macro testbed:
   AVA, AVK, DSS and SCN.

Evidently, avoidance of false-positive alarms is less advanced for
macro viruses (see 6DDOSMAC.TXT, table FDOS.M4). Concerning
avoidance of False-Positive alarms both under DOS AND Windows-NT,
only two products are rated "perfect": AVK and SCN!

****************************************************************
Findings #6: Avoidance of False-Positive Alarms is insufficient.
             FP-avoiding perfect DOS scanners:  AVK, FSE, PAV, SCN
             FP-avoiding perfect W-NT scanners: AVK, SCN
****************************************************************
Findings
 #6.1) VERY FEW products reliably avoid ANY False-Positive alarm on
       clean file and macro objects, either under DOS or Win-NT.
 #6.2) Only 2 products avoid ANY false-positive alarm BOTH under
       DOS and Windows-NT: AVK and SCN!
 #6.3) Overall, the quality of FP-avoidance has degenerated since
       the last test: the number of false-positive alarms is
       significantly LARGER compared to the last test, whereas the
       testbed was deliberately NOT CHANGED.
 #6.4) AV producers should intensify work to avoid FP alarms.

*****************************************************************
Eval #7: Evaluation of File and Macro Malware detection (DOS/W-NT):
===================================================================
Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware.
An essential argument for this category is that customers are
interested in being warned about and protected not only from
viruses but also from other malicious objects such as trojans etc.,
the payload of which may be disastrous to their work (e.g. stealing
passwords). Regrettably, the awareness of AV producers of the need
to protect their users against related threats is still
underdeveloped. Manifold arguments are presented why AV products
are not the best protection against non-viral malware; from a
technical point of view, these arguments may seem conclusive, but
at the same time, almost nothing is done to support customers with
adequate AntiMalware software. On the other hand, AV methods (such
as scanning for the presence or absence of characteristic features)
are also applicable - though not ideal - to detecting non-viral
malware.

Since VTC test "1999-03", malware detection is a mandatory part of
VTC tests, both for submitted products and for those downloaded as
free evaluation copies. A growing number of scanners is indeed able
to detect non-viral malware.
The following grid (admittedly with reduced granularity) is applied
to classify detection of file and macro malware:
 - detection rate =100%     : scanner is "perfect"
 - detection rate > 90%     : scanner is "excellent"
 - detection rate of 80-90% : scanner is "very good"
 - detection rate of 60-80% : scanner is "good enough"
 - detection rate of < 60%  : scanner is "not good enough"

As in the last test, still NO product can be rated a "perfect AM
detector", but now 4 scanners under DOS AND W-NT, and 6 scanners
under W-NT, are graded "excellent":

                                 ===== Malware Detection =====
                                 = under DOS ==  = under W-NT =
                                 (File/Macro-mw; File/Macro-mw)
                                 ------------------------------
"Excellent" DOS/W-NT scanners:   DSS (97.5% 98.6%; 97.6% 98.6%)
                                 SCN (97.2% 97.9%; 96.7% 98.6%)
                                 AVK (94.9% 95.8%; 94.8% 95.8%)
                                 PAV (94.8% 94.4%; 94.9% 94.4%)
                                 ------------------------------
"Excellent" W-NT-only scanners:  AVP (88.3% 95.8%; 94.9% 91.5%)
                                 FSE (88.7% 95.8%; 99.4% 98.6%)
                                 ------------------------------

Moreover, the following scanners reach 90% detection either for
file or macro malware for at least one operating system (DOS or
Win-NT):

"Excellent" scanners in at least one category/under one OS:
   AVA, AVK, AVP, AVX, DSS, FPR/FMA, FSE, FWN, HMV, INO, IRS, NAV,
   NOD, NVC, PAV, RA7, RAV and SCN.

**************************************************************
Findings #7: AntiMalware detection under DOS/W-NT improving
             No "perfect" but 4 "excellent" AM products:
                for DOS and W-NT: DSS, SCN, AVK, PAV
                for W-NT only:    AVP, FSE
**************************************************************
Findings
 #7.1: The ability of AV products to also detect non-viral malware
       is improving. Now, 6 products detect file/macro malware on a
       90% level either under DOS or under W-NT (4 under both).
 #7.2: Evidently, AV producers invest more work into macro malware
       detection, where 13 products (under DOS) and 18 (under W-NT)
       detect macro malware on a 90% level.
 #7.3: With continuing growth of the malware testbeds, AV producers
       are well advised to also improve their products in this area.
**************************************************************

Eval #8: Overall virus detection rates under Windows-98:
========================================================
The number of scanners running under Windows 98 is growing fast
(last test: 15 products; this test: 23 products). The following
table compares results of file and macro virus detection under
Windows 98 from the last test ("1998-10", where W-98 was first
tested) with this test, including relative improvement (DELTA)
since the last test:

Table AB: Comparison: File/Macro Virus Detection Rate
          in last 2 VTC tests under Windows 98:
===========================================================
          File Virus Detection   Macro Virus Detection
SCAN      98/10  99/03  DELTA    98/10  99/03  DELTA
NER         %      %      %        %      %      %
-----------------------------------------------------------
ACU          -      -      -        -    97.6     -
ANT       91.3      -      -     84.3      -      -
ANY          -      -      -     70.7      -      -
AVA       96.6   97.6   +1.0     96.7   95.9   -0.8
AVG          -   87.3      -        -   82.5      -
AVK       99.6   90.8   -8.8     99.6   99.6    0.0
AVP       99.9   99.9    0.0    100.0   99.2   -0.8
AVX          -   74.2      -        -      -      -
DSS       99.9   99.9    0.0    100.0  100.0    0.0
DRW          -      -      -        -   98.3      -
DWW          -   89.5      -        -      -      -
FPR/FMA      -   93.9      -     92.4   99.8   +6.4
FSE       99.8  100.0   +0.2    100.0  100.0    0.0
FWN          -      -      -     99.6   99.7   +0.1
HMV          -      -      -        -   99.5      -
IBM       92.8      *      *     94.5      *      *
INO       93.5   98.1   +4.6     88.1   99.8  +10.7
IRS       96.7   97.6   +0.9     99.0   99.5   +0.5
ITM          -   64.2      -        -   70.9      -
IVB          -      -      -     92.8   95.0   +2.2
NAV          -   96.8      -     95.3   99.7   +2.4
NOD          -   97.6      -        -   99.8      -
NVC       93.6   97.0   +3.4        -   99.1      -
PAV       98.4   99.9   +1.5     99.5   99.5    0.0
PCC          -   81.2      -        -   98.0      -
PRO          -   37.3      -        -   58.0      -
RAV       84.9      -      -     92.2      -      -
SCN       86.6   99.8  +13.2     97.7  100.0   +2.3
SWP       98.4      -      -     98.6      -      -
TBA       92.6      *      *     98.7      *      *
TSC          -   55.3      -        -   76.5      -
VBS          -      -      -     41.5      -      -
VBW          -   26.5      -        -   93.4      -
VET          -   66.3      -        -   97.6      -
VSP          -   86.4      -        -    0.4      -
-----------------------------------------------------------
Mean      95.0%  84.2%  +1.6%    92.1%  90.3%  +1.8%
-----------------------------------------------------------
Generally, the ability of W-98 scanners to detect file and macro
viruses "in the mean" has only moderately improved (+1.6% and
+1.8%, respectively). On the other hand, some products achieved
impressive improvements, such as SCN (file: +13.2, macro: +2.3),
INO (file: +4.6, macro: +10.7) and FPR/FMA (macro: +6.4).

The same grid as for the DOS classification is applied to classify
scanners according to their ability to detect file and macro
viruses under Windows 98. This time, one product - FSE - reached a
100% detection rate for both file and macro (zoo) viruses and
therefore falls into the category "Perfect W-98 scanner". 4
scanners reach grade "Excellent" (>99% detection), and 7 more
scanners are rated "Very Good" (>95%):

"Perfect" Windows 98 scanners:    FSE (100.0% 100.0%)

"Excellent" Windows 98 scanners:  DSS ( 99.9% 100.0%)
                                  SCN ( 99.8% 100.0%)
                                  PAV ( 99.9%  99.5%)
                                  AVP ( 99.9%  99.2%)

"Very Good" Windows 98 scanners:  DWW ( 98.2%  98.2%)
                                  INO ( 98.1%  99.8%)
                                  NOD ( 97.6%  99.7%)
                                  IRS ( 97.6%  99.5%)
                                  AVA ( 97.6%  95.9%)
                                  NVC ( 97.0%  99.1%)
                                  NAV ( 96.8%  99.7%)

"Good" Windows 98 scanners:       FPR ( 93.9%  99.8%)
                                  AVK ( 90.8%  99.6%)

For detection of macro viruses under Windows 98, the following 18
scanners detect at least 95% of zoo macro viruses:
   DSS, FSE and SCN (100%); INO and NOD (99.8%), FWN and NAV
   (99.7%), AVK (99.6%), IRS, HMV and PAV (99.5%), AVP (99.2%),
   NVC (99.1%); DRW (98.3%), DWW (98.2%), PCC (98.0%), ACU (97.6%),
   AVA (95.9%), IVB (95.0%).

**************************************************************
Findings #8: Virus detection rates under W-98 on high level
**************************************************************
Findings
 #8.1: Detection rates for file and esp. macro viruses for scanners
       under Windows 98 have reached a fairly high level:
          Perfect scanner (100%):    1
          Excellent scanners (>99%): 4
          Very Good scanners (>95%): 7
 #8.2: AV producers should invest more work into file virus
       detection, esp.
into VKIT virus detection (where results are not equally promising)
as well as detection of viruses in compressed objects (essential
for on-access scanning).

**************************************************************
Eval #9: Overall virus detection rates under Windows-NT:
========================================================
The number of scanners running under Windows NT is still small,
though growing. Significantly fewer products were available for
these tests, compared with the traditional DOS scene. The following
table summarizes results of file and macro virus detection under
Windows-NT in this test ("1999-03") as compared with previous
tests:

Scan    ==== File Virus Detection ====  === Macro Virus Detection ===
ner     97/07 98/02 98/10 99/03 Delta   97/07 98/02 98/10 99/03 Delta
---------------------------------------------------------------------
ANT      88.9  69.2  91.3    -     -     92.2    -   85.7    -     -
ANY        -     -   69.7    -     -       -     -   70.5    -     -
AVA        -   97.4  96.6  97.1  +0.5      -   91.9  97.2  95.2  -2.0
AVG        -     -     -   87.3    -       -     -     -   82.5    -
AVK        -     -   99.6  90.2  -9.4      -     -   99.6  99.6   0.0
AVP        -     -   83.7  99.9 +16.2      -     -  100.0  99.2  -0.8
AVX        -     -     -   74.2    -       -     -     -   98.9    -
AW         -   56.4    -     -     -       -     -   61.0    -     -
DRW        -     -     -   93.3    -       -     -     -   98.3    -
DWW        -     -     -     -     -       -     -     -   98.2    -
DSS      99.6  99.7  99.9  99.3  -0.6    99.0 100.0 100.0 100.0   0.0
FPR/FMA    -   96.1    -   98.7    -       -   99.9  99.8  99.8   0.0
FSE        -   85.3  99.8 100.0  +0.2      -     -   99.9 100.0  +0.1
FWN        -     -     -     -     -       -     -   99.6  99.7  +0.1
HMV        -     -     -     -     -       -     -   99.0  99.5  +0.5
IBM      95.2  95.2  77.2    *     *     92.9  92.6  98.6    *     *
INO        -   92.8    -   98.1    -       -   89.7    -   99.8    -
IRS        -   96.3    -   97.6    -       -   99.1    -   99.5    -
IVB        -     -     -     -     -       -     -   92.8  95.0  +2.2
NAV      86.5  97.1    -   98.0    -     95.6  98.7  99.9  99.7  -0.2
NOD        -     -     -   97.6    -       -     -     -   99.8    -
NVC      89.6  93.8  93.6  96.4  +2.8    96.6  99.2    -   98.9    -
PAV      97.7  98.7  98.4  97.2  -1.2    93.5  98.8  99.5  99.4  -0.1
PRO        -     -     -   37.3    -       -     -     -   58.0    -
RAV        -   81.6  84.9  85.5  +0.6      -   98.9  99.5  99.2  -0.3
RA7        -     -     -   89.3    -       -     -     -   99.2    -
PCC      63.1    -     -     -     -       -   94.8    -     -     -
PER        -     -     -     -     -       -   91.0    -     -     -
SCN      94.2  91.6  71.4  99.1 +27.7    97.6  99.1  97.7 100.0  +2.3
SWP      94.5  96.8  98.4    -     -     89.1  98.4  97.5    -     -
TBA        -   93.8  92.6    *     *     96.1    -   98.7    *     *
TNT        -     -     -     *     *       -     -   44.4    *     *
VET      64.9    -     -   65.4    -       -   94.0    -   94.9    -
VSA        -   56.7    -     -     -       -   84.4    -     -     -
VSP        -     -     -   87.0    -       -     -     -   86.7    -
---------------------------------------------------------------------
Mean:    87.4% 88.1% 89.0% 89.2% +4.0%   94.7% 95.9% 91.6% 95.3% +0.1%
---------------------------------------------------------------------

Generally, the ability of W-NT scanners to detect file viruses "in
the mean" was only moderately improved (+0.2%), but those products
participating in the last test improved by an impressive 4.0%. On
the other hand, "mean" macro virus detection improved by 3.7% to
now 95.3% (though not as good as in "1998-02"!), but products from
the last test improved only by 0.1%. Indeed, one scanner achieved
impressive improvements: SCN (file: +27.7%, macro: +2.3%).

The same grid as for the DOS and W-98 classification is applied to
grade scanners according to their ability to detect file and macro
viruses under Windows NT. As under W-98, one product - FSE -
reached a 100% detection rate for both file and macro (zoo) viruses
and therefore falls into the category "Perfect W-NT scanner".
3 scanners reach grade "Excellent" (>99% detection), and 9
scanners are rated "Very Good" (>95%):

"Perfect" Windows NT scanners:
    FSE (100.0% 100.0%)

"Excellent" Windows NT scanners:
    DSS ( 99.3% 100.0%)
    SCN ( 99.1% 100.0%)
    AVP ( 99.9%  99.2%)

"Very Good" Windows NT scanners:
    FPR ( 98.7%  99.8%)
    DWW ( 98.2%  98.2%)
    INO ( 98.1%  99.8%)
    NAV ( 98.0%  99.7%)
    NOD ( 97.6%  99.8%)
    IRS ( 97.6%  99.5%)
    PAV ( 97.2%  99.4%)
    AVA ( 97.1%  95.2%)
    NVC ( 96.4%  98.9%)

"Good" Windows NT scanners:
    AVK ( 90.2%  99.6%)

For detection of macro viruses under Windows NT, the following
21 scanners detect at least 95% of zoo macro viruses:

 Perfect:   DSS, FSE and SCN (100.0%);
 Excellent: FPR, INO and NOD (99.8%), FWN and NAV (99.7%),
            AVK (99.6%), HMV and IRS (99.5%), PAV (99.4%),
            AVP, RAV and RA7 (99.2%);
 Very Good: AVX and NVC (98.9%), DRW (98.3%), DWW (98.2%),
            AVA (95.2%), IVB (95.0%).

**************************************************************
 Findings #9: Virus detection rates under W-NT on high level
              One "perfect" Windows-NT zoo scanner: FSE
              More: 3 "excellent" and 9 "very good" scanners.
**************************************************************
 Findings #9.1: Detection rates for file and esp. macro viruses
       for scanners under Windows NT have reached a fairly high
       level, similar to W-98:
            Perfect scanners   (100%): 1
            Excellent scanners (>99%): 3
            Very Good scanners (>95%): 9
 #9.2: AV producers should invest more work into file virus
       detection, esp. into VKit virus detection (where results
       are not equally promising), as well as into detection of
       viruses in compressed objects (essential for on-access
       scanning).
**************************************************************

Eval #10: File/Macro Virus detection under 32-bit engines:
==========================================================

Concerning 32-bit engines as used in Windows-98 and Windows-NT,
it is interesting to test the validity of the hypothesis that
related engines produce the same detection and identification
quality.
(For details see 6HCMP32.TXT). When comparing results from the
related tests, it is interesting to observe that identical
detection rates on the file/macro zoo level are presently
achieved by only a few (5 out of 20) products, all of at least
"very good" quality (>95%) both for zoo file and macro viruses:

 Equal results for W-98 and W-NT
 for zoo file AND macro viruses:
 --------------------------------
    FSE (100.0% 100.0%)
    AVP ( 99.9%  99.2%)
    DWW ( 98.2%  98.2%)
    NOD ( 97.6%  99.8%)
    IRS ( 97.6%  99.5%)

If looking only at zoo macro virus detection, 12 (out of 20)
products achieved at least "very good" quality (>95%):

 Equal results for W-98 and W-NT
 for zoo macro viruses:
 --------------------------------
 Perfect:   DSS, FSE and SCN (100.0%);
 Excellent: FPR, INO and NOD (99.8%), FWN and NAV (99.7%),
            IRS (99.5%), AVP (99.2%);
 Very Good: DWW (98.2%), IVB (95.0%)

BTW: all these scanners detect In-The-Wild file and macro
viruses on the 100% level.

*****************************************************************
 Findings #10: Only few W-32 scanners perform equally on W-98/W-NT
               Best uniform W-32 performance: FSE, AVP, DWW, NOD, IRS
*****************************************************************
 Findings #10.1: The assumption that 32-bit engines in scanners
       produce the same detection rate for different instantiations
       of 32-bit operating systems (esp. for Windows-98 and
       Windows-NT) holds only for 5 scanners.
 Findings #10.2: Analysis of ITW detection rates is NOT sufficient
       to determine the behaviour of 32-bit engines and does not
       guarantee equal detection rates for different W-32 platforms
       (esp. W-98/W-NT).
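The engine-equality check discussed in Eval #10 can be sketched in a few lines. This is an illustrative sketch only (not VTC's actual test tooling): the `equal_engines` helper and its `tolerance` parameter are assumptions, the rate pairs are taken from the lists above, and AVK's W-98 value is a hypothetical example of an unequal pair.

```python
# Sketch (not VTC tooling): test the hypothesis that a product's
# W-98 and W-NT 32-bit engines yield identical zoo detection rates.
# Pairs are (W-98 rate, W-NT rate) in percent for zoo macro viruses,
# taken from the lists above.
macro_rates = {
    "FSE": (100.0, 100.0),
    "AVP": (99.2, 99.2),
    "DWW": (98.2, 98.2),
    "NOD": (99.8, 99.8),
    "IRS": (99.5, 99.5),
    "AVK": (99.5, 99.6),  # hypothetical unequal pair for contrast
}

def equal_engines(rates, tolerance=0.0):
    """Return scanners whose W-98 and W-NT rates agree within tolerance."""
    return sorted(name for name, (w98, wnt) in rates.items()
                  if abs(w98 - wnt) <= tolerance)

print(equal_engines(macro_rates))  # the scanners with uniform engines
```

A non-zero tolerance would also admit near-equal results; with an exact comparison, only the uniform engines remain.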
*****************************************************************

Eval #11: Evaluation for malware virus detection under Windows 98/NT:
=====================================================================

With Windows 98 and Windows-NT often used for downloading
potentially hazardous objects from the Internet, it is
interesting to measure the ability of AntiVirus products to also
act as AntiMalware products. The same grid is applied as for the
grading of DOS AM products (see Eval #7). Presently, NO AV
product can be graded "Perfect" (all rates 100.0%), but 6
scanners perform as AM products with grade "Excellent" (>90%),
both for W-98 and W-NT, as the following table shows:

          Detection of File Malware   Detection of Macro Malware
Scanner      Win-98      Win-NT          Win-98      Win-NT
------------------------------------------------------------------
  FSE        99.3%       99.4%           98.6%       98.6%
  AVK        94.8%       94.9%           95.8%       95.8%
  DSS        97.6%       97.6%           98.6%       98.6%
  SCN        97.3%       96.7%           98.6%       98.6%
  PAV        94.9%       94.9%           94.4%       94.4%
  AVP        94.9%       94.9%           91.5%       91.5%
------------------------------------------------------------------

Detection of macro malware is evidently better supported than
file malware detection, as several more AV products detect macro
malware under W-98 (14 products) and W-NT (16 products). As no
product was rated "perfect", the following products reached a
"very good" level of macro malware detection under both Win-98
and Win-NT:

 Detection of macro malware under W-98/W-NT
 ------------------------------------------
   DSS, FSE, SCN  (98.6% 98.6%)
   FPR            (97.9% 97.9%)
   FWN, HMV, NOD  (96.5% 96.5%)
   AVK, INO       (95.8% 95.8%)
   IRS            (95.1% 95.1%)
   PAV            (94.4% 94.4%)
   AVP            (91.5% 91.5%)
   NAV            (90.8% 91.5%)
   NVC            (90.1% 90.1%)
 ------------------------------------------

Evidently, some AV products are able to help protect users by
detecting file and macro-related malware at a significant level.
Fortunately, the related products also show good to excellent
results in detecting viral malware.
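The AntiMalware grading grid applied above can be made explicit as a small sketch. The thresholds follow the text ("Perfect" requires all rates at 100.0%, "Excellent" requires all rates above 90%); the function name `am_grade` and the "ungraded" fallback are assumptions for illustration, not part of the VTC evaluation tooling.

```python
# Sketch of the AntiMalware grading grid used above (assumed helper,
# not VTC tooling). "Perfect" means every rate is 100.0%; "Excellent"
# means every rate exceeds 90%.
def am_grade(rates):
    """Grade an AM product from its file/macro malware rates (percent)."""
    if all(r == 100.0 for r in rates):
        return "Perfect"
    if all(r > 90.0 for r in rates):
        return "Excellent"
    return "ungraded"

# FSE's W-98/W-NT file and macro malware rates from the table above:
print(am_grade([99.3, 99.4, 98.6, 98.6]))
```

Run against the table above, all six listed products fall into the "Excellent" band, since each of their four rates lies above 90% but below 100%.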
***************************************************************
 Findings #11: AntiMalware quality of AV products is developing
               No "perfect" AM product for W-98 and W-NT
               But 6 "excellent" AM products:
                   FSE, AVK, DSS, SCN, PAV, AVP
***************************************************************
 Findings #11.1: Some AntiMalware producers help customers
       detect also non-viral malware under 32-bit operating
       systems, esp. Win-98 and Win-NT.
 Findings #11.2: The ability to detect macro malware is more
       developed than the detection of file malware.
 Findings #11.3: Much more work must be invested to reliably
       detect file and macro malware and to protect customers
       from downloading trojans etc.
***************************************************************