=========================================
File 7EVAL.TXT
Evaluation of VTC Scanner Test "2000-08":
=========================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval #1:  Development of Macro+Script Virus Detection Rates under DOS
Eval #3:  In-The-Wild Macro Detection under DOS
Eval #5:  Detection of Packed Macro Viruses (DOS/W-98/W-NT/W-2k)
Eval #6:  False Positive Detection in Clean Macros (DOS/W-98/W-NT/W-2k)
Eval #7:  Evaluation of Macro Malware Detection (DOS/W-98/W-NT/W-2k)
Eval #8:  Macro and Script Virus Detection Rates under Windows-98
Eval #9:  Macro and Script Virus Detection Rates under Windows-NT
Eval #10: Macro and Script Virus Detection Rates under Windows-2000
Eval #11: Macro Virus Detection under 32-bit Engines

Remark: The paragraph numbering matches the report structure of "full"
VTC tests, which also include DOS/boot viruses and file viruses
(previous test: "1999-09", next test: "2001-02").

This part of the VTC "2000-08" test report evaluates the detailed
results as given in the following sections (files):
   6DDOSMAC.TXT  Macro+Script Viruses/Malware results DOS
   6FW98.TXT     Macro+Script Viruses/Malware results W-98
   6GWNT.TXT     Macro+Script Viruses/Malware results W-NT
   6IW2k.TXT     Macro+Script Viruses/Malware results W-2k
   6HCMP32.TXT   Comparison Macro+Script results W-98/NT/2k

Eval #1: Development of DOS Scanner Detection Rates:
====================================================
Concerning the performance of DOS scanners, a comparison of virus
detection results in the previous 8 tests ("1997-02" to "2000-04")
with "2000-08" shows how scanners behave and how manufacturers work
at adapting their products to the growing threat of new viruses and
malware. Table E1 lists the development of the detection rates of
scanners (most current version in each test), and it calculates the
change ("+" indicating improvement) in detection rates between the
last test (2000-04) and the current one (2000-07).
Finally, the "mean change", both in absolute and relative improvement
of detection rates, is also given (last row of Table E1). This
comparison concentrates on macro virus detection quality, with script
virus detection added as a new feature in the same table. In this
test, neither detection of file viruses nor of boot viruses was
tested.

For reasons of fairness, it must be noted that improvement is much
more difficult to achieve for products which have already reached a
very high level of detection and quality (say, more than 95%) than
for products with lower detection rates. Some products have
incorporated new engines (esp. for 32-bit platforms) and included
formerly separate scanners (e.g. for macro viruses), which leads to
improved performance.

Generally, changes in the order of about +/-1.5% are less
significant, as this is about the growth rate of new viruses per
month; detection therefore depends strongly upon whether some virus
is reported (and analysed and included) just before a new update is
delivered.
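The per-product change and the "mean change" described above amount to a
simple difference per scanner plus an average over the products tested in
both runs. A minimal sketch (not VTC's actual tooling; variable names are
ours, the three value pairs are taken from the macro columns of Table E1):

```python
# Sketch of the delta and "mean change" computation behind Table E1:
# per-product difference between the 2000-04 rate and the current rate,
# then the mean of those differences over all products in both tests.

rates = {                  # scanner -> (rate 2000-04, current rate), percent
    "ANT": (85.9, 93.3),   # value pairs copied from Table E1
    "DRW": (98.4, 97.6),
    "NAV": (97.4, 97.0),
}

deltas = {name: round(new - old, 1) for name, (old, new) in rates.items()}
mean_change = round(sum(deltas.values()) / len(deltas), 2)

print(deltas["ANT"])   # -> 7.4 (matches the "Diff" column for ANT)
print(mean_change)     # mean change over these three products
```

A positive mean indicates overall improvement, which is how the "+" sign
in the last row of Table E1 should be read.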
Table E1: Improvement of DOS scanners from 1997-02 to 2000-08:
==============================================================
Detection:                                                                                  Script
     --------- File Virus ------------- + --------------- Macro Virus ----------------- +  Virus
SCAN 9702 9707 9802 9810 9903 9909 0004 I 9702 9707 9802 9810 9903 9909 0004 0007 Diff I 0007
NER    %    %    %    %    %    %    %  I    %    %    %    %    %    %    %    %       I    %
---------------------------------------+----------------------------------------------+-----
ALE  98.8 94.1 89.4   -    -    -    -  I 96.5 66.0 49.8   -    -    -    -    -    -  I   -
ANT  73.4 80.6 84.6 75.7   -    -  92.8 I 58.0 68.6 80.4 56.6   -    -  85.9 93.3  7.4 I 55.2
AVA  98.9 97.4 97.4 97.9 97.6 97.4 97.5 I 99.3 98.2 80.4 97.2 95.9 94.6 93.7   -    -  I   -
AVG  79.2 85.3 84.9 87.6 87.1 86.6   -  I 25.2 71.0 27.1 81.6 82.5 96.6   -    -    -  I   -
AVK    -    -    -  90.0 75.0   -    -  I   -    -    -  99.7 99.6   -    -  100~   -  I 91.5
AVP  98.5 98.4 99.3 99.7 99.7 99.8 99.6 I 99.3 99.0 99.9 100% 99.8 100% 99.9   -    -  I   -
CMD    -    -    -    -    -    -  99.5 I   -    -    -    -    -  99.5 100% 100~  0.0 I 93.5
DRW  93.2 93.8 92.8 93.1 98.2 98.3   -  I 90.2 98.1 94.3 99.3 98.3   -  98.4 97.6 -0.8 I 60.8
DSE  99.7 99.6 99.9 99.9 99.8   -    -  I 97.9 98.9 100% 100% 100%   -    -    -    -  I   -
FMA    -    -    -    -    -    -    -  I 98.6 98.2 99.9   -    -    -    -    -    -  I   -
FPR  90.7 89.0 96.0 95.5 98.7 99.2 99.6 I 43.4 36.1 99.9 99.8 99.8 99.7 100% 100~  0.0 I 90.5
FSE    -    -  99.4 99.7 97.6 99.3 99.9 I   -    -  99.9 90.1 99.6 97.6 99.9   -    -  I   -
FWN    -    -    -    -    -    -    -  I 97.2 96.4 91.0 85.7   -    -    -    -    -  I   -
HMV    -    -    -    -    -    -    -  I   -    -  98.2 99.0 99.5   -    -    -    -  I   -
IBM  93.6 95.2 96.5   -    -    -    -  I 65.0 88.8 99.6   -    -    -    -    -    -  I   -
INO    -    -  92.0 93.5 98.1 94.7 94.6 I   -    -  90.3 95.2 99.8 99.5 99.7 99.7  0.0 I 77.8
IRS    -  81.4 74.2   -  51.6   -    -  I   -  69.5 48.2   -  89.1   -    -    -    -  I   -
ITM    -  81.0 81.2 65.8 64.2   -    -  I   -  81.8 58.2 68.6 76.3   -    -    -    -  I   -
IVB   8.3   -    -    -  96.9   -    -  I   -    -    -    -    -    -    -    -    -  I   -
MR2    -    -    -    -    -  65.4   -  I   -    -    -    -    -  69.6   -    -    -  I   -
NAV  66.9 67.1 97.1 98.1 77.2 96.0 93.3 I 80.7 86.4 98.7 99.8 99.7 98.6 97.4 97.0 -0.4 I 24.8
NOD    -    -    -  96.9   -  96.9 98.3 I   -    -    -    -  99.8 100% 99.4   -    -  I   -
NVC  87.4 89.7 94.1 93.8 97.6   -  99.1 I 13.3 96.6 99.2 90.8   -  99.6 99.9 99.9  0.0 I 83.7
PAN    -    -  67.8   -    -    -    -  I   -    -  73.0   -    -    -    -    -    -  I   -
PAV    -  96.6 98.8   -  73.7 98.8 98.7 I   -    -  93.7 100% 99.5 98.8 99.9   -    -  I   -
PCC    -    -    -    -    -    -    -  I   -  67.6   -    -    -    -    -    -    -  I   -
PCV  67.9   -    -    -    -    -    -  I   -    -    -    -    -    -    -    -    -  I   -
PRO    -    -    -    -  35.5   -    -  I   -    -    -    -  81.5   -    -    -    -  I   -
RAV    -    -    -  71.0   -    -    -  I   -    -    -  99.5 99.2   -    -    -    -  I   -
SCN  83.9 93.5 90.7 87.8 99.8 97.1 99.9 I 95.1 97.6 99.0 98.6 100% 100% 100% 100~  0.0 I 85.6
SWP  95.9 94.5 96.8 98.4   -  99.0 98.4 I 87.4 89.1 98.4 98.6   -  98.4 98.4   -    -  I   -
TBA  95.5 93.7 92.1 93.2   -    -    -  I 72.0 96.1 99.5 98.7   -    -    -    -    -  I   -
TSC    -    -  50.4 56.1 39.5 51.6   -  I   -    -  81.9 76.5 59.5 69.6   -    -    -  I   -
TNT  58.0   -    -    -    -    -    -  I   -    -    -    -    -    -    -    -    -  I   -
VDS    -  44.0 37.1   -    -    -    -  I 16.1  9.9  8.7   -    -    -    -    -    -  I   -
UKV    -    -    -    -    -    -    -  I   -    -    -    -    -    -    -   0.0   -  I  0.0
VET    -  64.9   -    -  65.3   -    -  I   -  94.0 97.3 97.5 97.6   -    -    -    -  I   -
VIT    -    -    -    -    -    -   7.6 I   -    -    -    -    -    -    -    -    -  I   -
VRX    -    -    -    -    -    -    -  I   -    -    -    -    -    -    -    -    -  I   -
VBS  43.1 56.6   -  35.5   -    -    -  I   -    -    -    -    -    -    -    -    -  I   -
VHU  19.3   -    -    -    -    -    -  I   -    -    -    -    -    -    -    -    -  I   -
VSA    -    -  56.9   -    -    -    -  I   -    -  80.6   -    -    -    -    -    -  I   -
VSP    -    -    -  76.1 71.7 79.6   -  I   -    -    -    -    -    -    -    -    -  I   -
VSW    -    -  56.9   -    -    -    -  I   -    -  83.0   -    -    -    -    -    -  I   -
VTR  45.5   -    -    -    -    -    -  I  6.3   -    -    -    -    -    -    -    -  I   -
XSC  59.5   -    -    -    -    -    -  I   -    -    -    -    -    -    -    -    -  I   -
---------------------------------------+----------------------------------------------+-----
Mean 74.2 84.8 84.4 85.4 81.2 90.6 98.3 I 69.6 80.9 83.8 89.6 93.6 88.2 98.0 98.6 +0.7 I 66.4
---------------------------------------+----------------------------------------------+-----

Remark: since test "2000-07", "100.0%" or "100%" denotes "exactly
100%", whereas "100.0~" or "100~" denotes "100% rounded up".

Generally, the ability of DOS scanners to detect macro zoo viruses
"in the mean" has further improved (from 98.0% to 98.6%) and has
reached a rather high level.
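The "100%" vs "100~" distinction from the remark above can be made precise
with a small helper (a sketch; the function name and the one-decimal
rounding rule are our assumptions):

```python
# Sketch of the notation used in the result tables: "100%" means every
# sample was detected, "100~" means the rate merely rounds up to 100.0
# at one decimal place; all other rates are printed with one decimal.

def format_rate(detected: int, total: int) -> str:
    rate = 100.0 * detected / total
    if detected == total:
        return "100%"              # exactly 100%
    if round(rate, 1) == 100.0:
        return "100~"              # 100% only after rounding up
    return f"{rate:.1f}"

print(format_rate(1000, 1000))     # -> 100%
print(format_rate(9999, 10000))    # -> 100~  (99.99% rounds to 100.0)
print(format_rate(999, 1000))      # -> 99.9
```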
While NO product is perfect with 100% detection of ALL ITW viruses in
ALL instantiations (files), the following grid is applied to grade
macro virus detection under DOS (as well as under Windows-based
platforms):
 1) detection rate of ITW macro viruses:  100% (perfect) mandatory
 2) detection rate of ITW macro objects:  100% (perfect) mandatory
 3) detection rate of zoo macro viruses:  100% (perfect) or
    100~ (almost perfect) or >99% (excellent)
 4) detection rate of zoo macro objects:  100% (perfect) or
    100~ (almost perfect) or >99% (excellent)

                              -- Macro ITW --   -- Macro Zoo --
                              (virus   files)   (virus   files)
   ------------------------------------------------------------
   "Perfect" DOS scanner:              NONE
   ------------------------------------------------------------
   "Almost Perfect" DOS scanners:
                          CMD  (100%   100%)    (100~   100~)
                          SCN  (100%   100%)    (100~   100~)
   ------------------------------------------------------------
   "Excellent" DOS scanners:
                          AVK  (100%   100%)    (100~   99.9)
                          FPR  (100%   100%)    (100~   99.9)
                          NVC  (100%   100%)    (99.9   99.8)
                          INO  (100%   100%)    (99.7   99.7)
   ------------------------------------------------------------

HOWEVER: Detection rates for script viruses (even on a small
collection of VBS, mIRC and JavaScript viruses) are FAR FROM
ACCEPTABLE, with mean detection just at 66.4% (BTW: rates are even
lower on Windows platforms).

Grading of script virus detection under DOS:
--------------------------------------------
NO product reaches "very good" detection (>95%), AND
3 products reach "good" detection (>90%):
   CMD (93.5), AVK (91.5), FPR (90.5)

***************************************************************************
Findings #1: AV companies seem to be less interested in DOS scanners,
             which have reached a very high level of macro virus
             detection, with 4 products detecting "almost all" zoo
             viruses (though not "exactly all").
             The detection rate for script viruses (esp. VBS, mIRC
             and JavaScript) is unacceptably low and needs more work.
***************************************************************************
Findings #1.1) The number of products submitted for DOS tests is now
               down to 10; with the proliferation of Windows
               platforms, DOS scanners become less relevant, and AV
               companies concentrate on other platforms.
Findings #1.2) Detection rates for zoo macro viruses have reached a
               very high level (mean: 98.6%), with 4 products
               detecting "almost all" macro viruses: AVK, CMD, FPR,
               SCN; 2 of them, CMD and SCN, also detect "almost all"
               instantiations.
Findings #1.3) HOWEVER: detection rates for script viruses (even on
               a small collection of VBS, mIRC and JavaScript
               viruses) are FAR FROM acceptable, with a mean
               detection rate as low as 66.4%:
                  NO product is "perfect" or even "very good" (>95%)
                  3 products are "good": CMD (93.5), AVK (91.5),
                                         FPR (90.5)
               As related threats grow significantly, much more work
               must be invested.
***************************************************************************

Evaluation grid for detection rates:
====================================
Generally, the following grid is applied to classify scanners
concerning their ability to detect any kind of zoo viruses and
malware:
   - detection rate  =100.0%  : scanner is "perfect"
   - detection rate  ~100.0   : scanner is "almost perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

Besides grading products in related categories according to their
performance, it is interesting to compare how products developed.
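The evaluation grid above translates directly into a threshold function.
A Python sketch follows (the function name is ours; note that a rate of
"almost perfect" ~100.0 cannot be told apart from lower "excellent" rates
once only a single rounded number is available, so it is folded into
"excellent" here):

```python
# Sketch of the zoo-virus grading grid; rate is a percentage, and
# 100.0 must mean *exactly* 100% for the "perfect" grade.

def grade_zoo_detection(rate: float) -> str:
    if rate == 100.0: return "perfect"
    if rate > 99.0:   return "excellent"       # incl. "almost perfect" ~100
    if rate > 95.0:   return "very good"
    if rate > 90.0:   return "good"
    if rate >= 80.0:  return "good enough"
    if rate >= 70.0:  return "not good enough"
    if rate >= 60.0:  return "rather bad"
    if rate >= 50.0:  return "very bad"
    return "useless"

print(grade_zoo_detection(98.6))   # -> very good  (mean DOS macro rate)
print(grade_zoo_detection(66.4))   # -> rather bad (mean DOS script rate)
```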
In comparison with previous results (VTC test "2000-04"), it is noted
whether a product remained in the same category (=), improved into a
higher category (*) or lost a grade (-).

Eval #3: In-The-Wild Detection under DOS:
=========================================
Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is 100.0% : scanner is "perfect"
   - detection rate is  >99%  : scanner is "excellent"
   - detection rate is  >95%  : scanner is "very good"
   - detection rate is  >90%  : scanner is "good"
   - detection rate is  <90%  : scanner is "risky"

100% detection of In-the-Wild macro viruses is now an absolute
requirement. Out of 10 products submitted for DOS tests, *** 7 ***
products detect all ITW macro viruses in all instantiations:
   AVK, CMD, DRW, FPR, INO, NVC and SCN.

In comparison with the last test (VTC "2000-04"), where 11 (out of
14) products detected ALL ITW viruses (including macro AND file
ones), the new result seems to indicate that AV companies have
reached a high level but seem to lose interest in this platform.

At the time of testbed freezing, only very few script viruses were
reported "In-The-Wild". These included esp. 2 JavaScript viruses
(JS/Kak.A, JS/Cleaner.A) and 2 VBS viruses (VBS/BubbleBoy.A,
VBS/Freelink.A). With only few samples available, it was regarded as
inadequate to publish ITW-detection results of script viruses for
any platform (results were slightly better than zoo results, see
result tables).

**************************************************************
Findings #3: ITW macro detection rates rather perfect!
**************************************************************
Findings #3.1) 7 (of 10) products detect all ITW macro viruses in
               all instantiations (samples):
                  AVK, CMD, DRW, FPR, INO, NVC and SCN
Findings #3.2) AV companies seem to lose interest in DOS AV products.
**************************************************************

Eval #5: Detection of Packed Macro Viruses (DOS/W-98/W-NT/W-2k):
================================================================
Detection of macro viruses within packed objects becomes essential
for on-access scanning, esp. for incoming email possibly loaded with
malicious objects. It seems therefore reasonable to test whether at
least ITW viral objects compressed with given popular methods (PKZIP,
ARJ, LHA and RAR) are also detected. Tests are performed only on
In-The-Wild viruses packed once (no recursive packing). As the last
test showed that AV products are rather far from perfect detection
of packed viruses, the testbed has been left essentially unchanged
to ease comparison and improvement.

Results (see 6DDOS.TXT, 6FW98.TXT, 6GWNT.TXT and 6IW2K.TXT) are
AGAIN rather DISAPPOINTING, esp. as we have to report major problems
of products in scanning the whole testbed (although not very large),
as reported in 8PROBLMS.TXT.

The following grid is applied for evaluating AV products concerning
detection of packed viruses:

Category #1: "Perfect packed macro virus detector":
===================================================
A "perfect" product would detect ALL macro viruses in all
instantiations packed with ALL 4 packers (ZIP, LHA, ARJ and RAR):
   ---------------------------------------------------------------------
   "Perfect" packed DOS  macro virus detectors: AVK,CMD,SCN
   "Perfect" packed W-32 macro virus detectors: AVK,AVP,AVX,CMD,PAV,SCN
   ---------------------------------------------------------------------

Category #2: "Excellent" packed macro virus detector:
=====================================================
An "excellent" product would detect ALL macro viruses in ALL
instantiations, packed with 3 packers:
   ---------------------------------------------------------------------
   "Excellent" packed DOS  macro virus detector: ---
   "Excellent" packed W-32 macro virus detector: AVG,NAV
   ---------------------------------------------------------------------

Category #3: "Very Good" packed macro virus detector:
=====================================================
A "very good" product would detect ALL macro viruses in ALL
instantiations, packed with 3 packers:
   ---------------------------------------------------------------------
   "Very Good" packed DOS  macro virus detector: FPR,INO
   "Very Good" packed W-32 macro virus detector: FPW,INO,RAV,NVC
   ---------------------------------------------------------------------

Remark: Much more data were collected on the precision and
reliability of virus detection in packed objects. But in the present
state, it seems NOT justified to add differentiation to the results
discussed here.

**********************************************************************
Findings #5: Detection of packed macro viral objects significantly
             improved on all platforms!
**********************************************************************
Findings #5.1) "Perfect"   DOS products:  AVK,CMD,SCN
               "Excellent" DOS products:  ---
               "Very Good" DOS products:  FPR,INO
        #5.2)  "Perfect"   W-32 products: AVK,AVP,AVX,CMD,PAV,SCN
               "Excellent" W-32 products: AVG,NAV
               "Very Good" W-32 products: FPW,INO,RAV,NVC
        #5.3)  There is still need for improvement of several
               products.
**********************************************************************

Eval #6: False-Positive Detection in Clean Macros for all platforms:
====================================================================
First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the file and macro virus
testbeds to determine the ability of scanners to avoid
False-Positive (FP) alarms. This ability is essential for
"excellent" and "very good" scanners, as there is no automatic aid
for customers to handle such cases (besides the psychological impact
on the customer's work).
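The FP rate graded in Eval #6 is simply the share of clean-testbed objects
on which a scanner raises an alarm. A minimal sketch (the function name is
ours, and the testbed size of 500 is a made-up illustration, not the
actual VTC clean-macro count):

```python
# Sketch of the False-Positive rate used for Eval #6: every alarm
# raised on a clean (non-malicious) object counts as one FP.

def false_positive_rate(alarms_on_clean: int, clean_objects: int) -> float:
    """FP rate in percent over the clean-object testbed."""
    return 100.0 * alarms_on_clean / clean_objects

print(false_positive_rate(0, 500))   # -> 0.0 (a "perfect" FP-avoiding scanner)
print(false_positive_rate(2, 500))   # -> 0.4
```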
Therefore, the grid used for grading AV products must be
significantly more rigid than the one used for detection (see Eval
#2). The following grid is applied to classify scanners:
   - False Positive rate =  0.0% : scanner is graded "perfect"
   - False Positive rate <  0.5% : scanner is graded "excellent"
   - False Positive rate <  2.5% : scanner is graded "very good"
   - False Positive rate <  5.0% : scanner is graded "good enough"
   - False Positive rate < 10.0% : scanner is graded "rather bad"
   - False Positive rate < 20.0% : scanner is graded "very bad"
   - False Positive rate > 20.0% : scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, it is
interesting to compare results for the DOS platform and the 32-bit
platforms (W-98, W-NT and W-2k). The following AV products reported
NOT A SINGLE False Positive alarm for the macro zoo testbed and are
therefore rated "perfect":
   ------------------------------------------------------------------------
   FP-avoiding "perfect" DOS  scanners: ANT(=),AVK(+),INO(+),SCN(=)
   FP-avoiding "perfect" W-NT scanners: AVA,AVG,AVK,AVP,PAV,PRO,SCN
   FP-avoiding "perfect" W-98 scanners: AVA,AVG,AVK,AVP,DSE,INO,PAV,PRO,SCN
   FP-avoiding "perfect" W-2k scanners: ANT,AVA,AVG,AVK,CLE,INO,PRO,SCN
   ------------------------------------------------------------------------
   FP-avoiding "perfect" products/ALL platforms: AVK, SCN
   ------------------------------------------------------------------------

Remark: moreover, QHL has a zero FP rate, but at an insufficient
level of macro virus detection.

*********************************************************************
Findings #6: Avoidance of False-Positive Alarms is improving.
*********************************************************************
Findings #6.1) Only 2 products avoid ANY False Positive alarm on ALL
               platforms: AVK, SCN
        #6.2)  Distinguishing platforms:
               FP-avoiding perfect DOS  scanners: ANT, AVK, INO, SCN
               FP-avoiding perfect W-98 scanners: AVA, AVG, AVK, AVP,
                                        DSE, INO, PAV, PRO, SCN
               FP-avoiding perfect W-NT scanners: AVA, AVG, AVK, AVP,
                                        PAV, PRO, SCN
               FP-avoiding perfect W-2k scanners: ANT, AVA, AVG, AVK,
                                        CLE, INO, PRO, SCN
        #6.3)  The number of scanners avoiding FP alarms on specific
               platforms has increased since the last test.
        #6.4)  AV producers should intensify work to avoid FP alarms.
**********************************************************************

Eval #7: Evaluation of Macro Malware detection for all platforms:
=================================================================
Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category is
that customers are interested in also being warned about, and
protected from, non-viral and non-wormy malicious objects such as
trojans etc, the payload of which may be disastrous to their work
(e.g. stealing passwords).

Regrettably, the awareness among AV producers of the need to protect
their users against related threats is still underdeveloped.
Manifold arguments are presented why AV products are not the best
protection against non-viral malware; from a technical point of
view, these arguments may seem conclusive, but at the same time,
almost nothing is done to support customers with adequate
AntiMalware software. On the other hand, AV methods (such as
scanning for the presence or absence of characteristic features) are
also applicable - though not ideal - to detect non-viral malware.

Since VTC test "1999-03", malware detection is a mandatory part of
VTC tests, both for submitted products and for those downloaded as
free evaluation copies. A growing number of scanners is indeed able
to detect non-viral malware.
The following grid (admittedly with reduced granularity) is applied
to classify detection of file and macro malware:
   - detection rate   =100%   : scanner is "perfect"
   - detection rate   > 90%   : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate of < 60%  : scanner is "not good enough"

We are glad to observe that 2 scanners now detect ALL malware
objects "perfectly" (=100.0%) on ALL platforms, and that 2 more
detect them at least under W-98 and W-2k (though surprisingly not
under W-NT):
   2 products detect macro malware on ALL platforms "perfectly":
      CMD, FPR/FPW
   2 more products detect macro malware under W-98 and W-2k:
      FSE, SCN

            ----- Macro Malware Detection ------
            ==DOS==  ==W-NT==  ==W-98==  ==W-2k==
   FPR/FPW:  100%     100%      100%      100%
   CMD:      100%     100%      100%      100%
   FSE:      ----     99.7      100%      100%
   SCN:      98.8     97.0      100%      100%
   -------------------------------------------

***************************************************************
Findings #7: Macro Malware detection on ALL platforms is slowly
             improving.
             2 perfect macro malware detectors on ALL platforms:
                FPR/FPW, CMD
             2 more perfect macro malware detectors under W-98
             and W-2k: FSE, SCN
****************************************************************
Findings #7.1: The ability of AV products to also detect non-viral
               malware is improving for macro malware.
        #7.2:  2 products are "perfect" for all platforms:
                  FPR/FPW, CMD
        #7.3:  2 more products are "perfect" under W-98 and W-2k:
                  FSE, SCN
        #7.4:  With the continuing growth of malware testbeds and
               growing threats to customers, AV producers MUST
               improve their products also in this area.
*****************************************************************

Eval #8: Overall virus detection rates under Windows-98:
========================================================
The following table summarizes the results of file and macro virus
detection under Windows 98 since 1998-10, including the relative
improvement (DELTA) from the last test to the current results:

Table AB: Comparison: File/Macro/Script Virus Detection Rate
          in last 5 VTC tests under Windows 98:
============================================================
Detection of:                                                          -Script-
Scan     --------- File Virus ------- + --------- Macro Virus -------- + Virus-
ner      9810  9903  9909  0004 DELTA I 9810  9903  9909  0004  0008 DELTA I 0008
           %     %     %     %     %  I    %     %     %     %     %     % I    %
-----------------------------------+---------------------------------------+----
ACU        -     -     -     -     -  I    -  97.6     -     -     -     - I    -
AN5        -     -  87.2     -     -  I    -     -  89.3     -     -     - I    -
ANT     91.3     -  86.5  92.8     -  I 84.3     -  89.5  90.2  96.4   6.2 I 55.2
ANY        -     -     -     -     -  I 70.7     -     -     -     -     - I    -
AVA     96.6  97.6  97.2  97.5   0.3  I 96.7  95.9  93.9  94.3  94.1  -0.2 I 15.0
AVG        -  87.3  87.0  85.4  -1.6  I    -  82.5  96.6  97.5  97.9   0.4 I    -
AVK     99.6  90.8  99.8  99.7  -0.1  I 99.6  99.6  100   99.9  100   -0.1 I 91.2
AVP     99.9  99.9  99.8  99.9   0.1  I 100   99.2  100   99.9  100   -0.1 I 88.2
AVX        -  74.2  75.7  77.4   1.7  I    -     -  98.7  94.5  99.0   4.5 I 61.4
CLE        -     -     -     -     -  I    -     -     -     -     -     - I  4.2
CMD        -     -  98.4  99.6   1.2  I    -     -  99.6  100   100%   0.0 I 93.5
DSS/DSE 99.9  99.9     *  99.8     -  I 100   100      *  100   100%   0.0 I 95.8
DRW/DWW    -  89.5  98.3  96.7  -1.6  I    -  98.3  98.8  98.4  97.5  -0.9 I 59.8
ESA        -     -     -  58.0     -  I    -     -     -  88.9     -     - I    -
FPR/FMA    -  93.9  99.4  99.7   0.3  I 92.4  99.8  99.7  100      -     - I    -
FPW        -     -  99.2  99.6   0.4  I    -     -  99.9  100   100%   0.0 I 90.8
FSE     99.8  100   99.9  100    0.1  I 100   100   100   100   100%   0.0 I 96.7
FWN        -     -     -     -     -  I 99.6  99.7  99.9  99.8     -     - I    -
HMV        -     -     -     -     -  I    -  99.5     -     -     -     - I    -
IBM     92.8     *     *     *     -  I 94.5     *     *     *     -     - I    -
INO     93.5  98.1  97.1  98.7   1.6  I 88.1  99.8  98.1  99.7  99.8   0.1 I 78.1
IRS     96.7  97.6     -     -     -  I 99.0  99.5     -     -     -     - I    -
ITM        -  64.2     -     -     -  I    -     -     -     -     -     - I    -
IVB        -     -     -     -     -  I 92.8  95.0     -     -     -     - I    -
MKS        -     -     -     -     -  I    -     -     -  97.1     -     - I    -
MR2        -     -  65.9     -     -  I    -     -  64.9     -     -     - I    -
NAV        -  96.8  97.6  96.8  -0.8  I 95.3  99.7  98.7  98.0  97.7  -0.3 I 36.6
NOD        -  97.6  98.3  98.3   0.0  I    -  99.8  100   99.4     -     - I    -
NV5        -     -  99.0     -     -  I    -     -  99.6     -     -     - I    -
NVC     93.6  97.0  99.0  99.1   0.1  I    -  99.1  99.6  99.9  99.9   0.0 I 83.7
PAV     98.4  99.9  99.6  100    0.4  I 99.5  99.5  86.7  99.9  100    0.1 I 90.2
PCC        -  81.2     -     -     -  I    -  98.0     -     -     -     - I    -
PER        -     -     -     -     -  I    -     -     -  53.7  67.2  13.5 I 18.0
PRO        -  37.3  39.8  44.6   4.8  I    -  58.0  61.9  67.4  69.1   1.7 I 12.1
QHL        -     -     -     -     -  I    -     -     -   0.0     -     - I  6.9
RAV     84.9     -  86.9  86.5  -0.4  I 92.2     -  98.1  97.9  96.9  -1.0 I 47.1
SCN     86.6  99.8  99.7  100    0.3  I 97.7  100   99.8  100   100    0.0 I 95.8
SWP     98.4     -  99.0  99.6   0.6  I 98.6     -  98.5  98.6     -     - I    -
TBA     92.6     *     *     *     -  I 98.7     *     *     *     -     - I    -
TSC        -  55.3  53.8     -     -  I    -  76.5  64.9     -     -     - I    -
VBS        -     -     -     -     -  I 41.5     -     -     -     -     - I    -
VBW        -  26.5     -     -     -  I 93.4     -     -     -     -     - I    -
VET        -  66.3     *     *     *  I    -  97.6     *     *     -     - I    -
VSP        -  86.4  79.7  78.1  -1.6  I    -   0.4   0.3     -     -     - I    -
-----------------------------------+---------------------------------------+----
Mean    95.0  84.2  89.7  91.6   0.3  I 92.1  90.4  93.5  95.0  95.6  +1.3 I 61.0
-----------------------------------+---------------------------------------+----

Generally, the ability of W-98 scanners to detect macro zoo viruses
"in the mean" has further improved (from 95.0% to 95.6%). Moreover,
4 scanners are "perfect" (100% detection rate) as they detect ALL
macro viruses in the zoo testbed, and 4 more are "almost" perfect
(missing few viruses, with detection rates rounded to 100%).
Requiring 100% detection of ALL ITW viruses in ALL instantiations
(files), the following grid is applied to grade macro virus
detection under W-98:
 1) detection rate of ITW macro viruses:  100% (perfect) mandatory
 2) detection rate of ITW macro objects:  100% (perfect) mandatory
 3) detection rate of zoo macro viruses:  100% (perfect) or
    100~ (almost perfect) or >99% (excellent)
 4) detection rate of zoo macro objects:  100% (perfect) or
    100~ (almost perfect) or >99% (excellent)

                              -- Macro ITW --   -- Macro Zoo --
                              (virus   files)   (virus   files)
   ------------------------------------------------------------
   "Perfect" W-98 scanners:
                          CMD  (100%   100%)    (100%   100%)
                          DSE  (100%   100%)    (100%   100%)
                          FPW  (100%   100%)    (100%   100%)
                          FSE  (100%   100%)    (100%   100%)
   ------------------------------------------------------------
   "Almost Perfect" W-98 scanners:
                          AVK  (100%   100%)    (100~   100~)
                          AVP  (100%   100%)    (100~   100~)
                          PAV  (100%   100%)    (100~   100~)
                          SCN  (100%   100%)    (100~   100~)
   ------------------------------------------------------------
   "Excellent" W-98 scanners:
                          NVC  (100%   100%)    (99.9   99.8)
                          INO  (100%   100%)    (99.8   99.8)
                          AVX  (100%   99.6)    (99.0   99.1)
   ------------------------------------------------------------

Grading of best W-98 products:
------------------------------
4 products are "perfect" as they detect ALL zoo and ITW macro
viruses in all instantiations: CMD, DSE, FPW and FSE.
4 products are "almost perfect" as they detect ALL ITW macro viruses
and all BUT ONE zoo macro viruses/samples: AVK, AVP, PAV and SCN.
3 products are "excellent" as they detect ALL ITW macro viruses and
all BUT FEW zoo macro viruses/samples: NVC, INO and AVX.

HOWEVER: Detection rates for script viruses (even on a small
collection of VBS, mIRC and JavaScript viruses) are FAR FROM
ACCEPTABLE, with a mean detection rate (of 20 products) just at 61%.
Grading of script virus detection under W-98:
---------------------------------------------
3 (out of 20) products reach 95% detection (="very good"):
   FSE (96.7), DSE (95.8), SCN (95.8)
AND 4 products reach 90% detection (="good"):
   CMD (93.5), AVK (91.2), FPW (90.8), PAV (90.2)

*********************************************************************
Findings #8: Macro Virus detection rates under W-98 on high level
             but Script Virus detection rates insufficient.
*********************************************************************
Findings #8.1: Detection rates for macro viruses of scanners under
               Windows 98 are rather stable at a fairly high,
               though not perfect, level:
                  Perfect scanners (100%):   4 (last test: 3)
                     CMD, DSE, FPW and FSE.
                  Almost perfect scanners:   4
                     AVK, AVP, PAV and SCN.
                  Excellent scanners (>99%): 3 (last test: 7)
                     NVC, INO and AVX.
Findings #8.2: HOWEVER: Similar to DOS (see #1.3), detection rates
               of script viruses for W-98 are FAR FROM acceptable:
                  NO product is "perfect" (100%) or "excellent" (>99%)
                  3 products are "very good" (>95%): FSE, DSE, SCN
                  4 products are "good" (>90%):      CMD, AVK, FPW, PAV
Findings #8.3: AV producers must invest significantly more work and
               quality into detection of script viruses as this
               threat is significantly growing for W-98 users!
**********************************************************************

Eval #9: Overall virus detection rates under Windows-NT:
========================================================
The number of scanners running under Windows NT is growing. For the
first time, more W-NT products were available for these tests than
for the traditional DOS scene. It can safely be assumed that the
shift from 16-bit to 32-bit platforms will further continue.
The following table summarizes the results of file and macro virus
detection under Windows-NT in the last 7 VTC tests:

Table AC: Comparison: File/Macro/Script Virus Detection Rate
          in last 7 VTC tests under Windows NT:
============================================================
Detection of:                                                                                 Script
Scan     --------- File Virus ------------- + ---------------- Macro Virus --------------- +  Virus
ner      9707 9802 9810 9903 9909 0004 Delta I 9707 9802 9810 9903 9909 0004 0008 Delta    I  0008
---------------------------------------------+---------------------------------------------+------
ANT      88.9 69.2 91.3   -  87.2 92.8  12.6 I 92.2   -  85.7   -  89.3 90.2 96.4  +6.2    I 55.2
ANY        -    -  69.7   -    -    -     -  I   -    -  70.5   -    -    -    -     -     I   -
ATD        -    -    -    -    -  100%    -  I   -    -    -    -    -  99.9   -     -     I   -
AVA        -  97.4 96.6 97.1 97.4 97.2  -0.2 I   -  91.9 97.2 95.2 93.3 94.3 94.1  -0.2    I 15.0
AVG        -    -    -  87.3 87.0 85.4  -1.6 I   -    -    -  82.5 96.6 97.5 97.9  +0.4    I 45.8
AVK        -    -  99.6 90.2 99.8 99.7  -0.1 I   -    -  99.6 99.6 100% 99.9 100   +0.1    I 91.8
AVP        -    -  83.7 99.9 99.8 99.9   0.1 I   -    -  100% 99.2 100% 99.9 100   +0.1    I 88.2
AVX        -    -    -  74.2 75.2 80.4   5.2 I   -    -    -  98.9 98.7 94.5 99.0  +4.5    I 61.4
AW         -  56.4   -    -    -    -     -  I   -  61.0   -    -    -    -    -     -     I   -
CLE        -    -    -    -    -    -     -  I   -    -    -    -    -    -    -     -     I  4.2
CMD        -    -    -    -    -  99.6    -  I   -    -    -    -    -  100% 100%   0.0    I 93.5
DRW/DWW    -    -    -  93.3 98.3 98.3   0.0 I   -    -    -  98.3 98.8 98.4 97.5  -0.9    I 59.8
DSS/E    99.6 99.7 99.9 99.3   *    -     -  I 99.0 100% 100% 100%   *    -    -     -     I   -
ESA        -    -    -    -    -  58.0    -  I   -    -    -    -    -  88.9   -     -     I   -
FPR/FMA    -  96.1   -  98.7 99.4   -     -  I   -  99.9 99.8 99.8 99.7   -    -     -     I   -
FPW        -    -    -    -    -  99.6    -  I   -    -    -    -  99.7 100% 100%   0.0    I 90.8
FSE        -  85.3 99.8 100% 99.9 100%   0.1 I   -    -  99.9 100% 100% 100% 100%   0.0    I 96.7
FWN        -    -    -    -    -    -     -  I   -    -  99.6 99.7   -  99.9   -     -     I   -
HMV        -    -    -    -    -    -     -  I   -    -  99.0 99.5   -    -    -     -     I   -
IBM      95.2 95.2 77.2   *    *    *     *  I 92.9 92.6 98.6   *    *    *    *     *     I   *
INO        -  92.8   -  98.1 98.0 98.7   0.7 I   -  89.7   -  99.8 99.7 99.7 99.8  +0.1    I 78.1
IRS        -  96.3   -  97.6   -    -     -  I   -  99.1   -  99.5   -    -    -     -     I   -
IVB        -    -    -    -    -    -     -  I   -    -  92.8 95.0   -    -    -     -     I   -
MKS        -    -    -    -    -  78.0    -  I   -    -    -    -    -  97.1   -     -     I   -
MR2        -    -    -    -  61.9   -     -  I   -    -    -    -  69.6   -    -     -     I   -
NAV      86.5 97.1   -  98.0 97.6 96.8  -0.8 I 95.6 98.7 99.9 99.7 98.7 98.0 97.7  -0.3    I 36.6
NOD        -    -    -  97.6 98.2 98.3   0.1 I   -    -    -  99.8 100% 99.4   -     -     I   -
NVC      89.6 93.8 93.6 96.4   -  99.1    -  I 96.6 99.2   -  98.9 98.9 99.9 99.9   0.0    I 83.7
NVN        -    -    -    -  99.0   -     -  I   -    -    -    -  99.5   -    -     -     I   -
PAV      97.7 98.7 98.4 97.2 99.6 100%   0.4 I 93.5 98.8 99.5 99.4 99.7 99.9 100   +0.1    I 90.2
PCC      63.1   -    -    -    -    -     -  I   -  94.8   -    -    -    -    -     -     I   -
PER        -    -    -    -    -    -     -  I   -    -    -    -    -    -  85.0    -     I  0.0
PRO        -    -    -  37.3 42.4 45.6   3.2 I   -    -    -  58.0 61.9 67.4 69.1  +1.7    I 13.1
QHL        -    -    -    -    -    -     -  I   -    -    -    -    -   0.0   -     -     I  6.9
RAV        -  81.6 84.9 85.5   -  88.0    -  I   -  98.9 99.5 99.2   -  97.9 96.9  -1.0    I 47.1
RA7        -    -    -  89.3   -    -     -  I   -    -    -  99.2   -    -    -     -     I   -
SCN      94.2 91.6 71.4 99.1 99.8 99.8   0.0 I 97.6 99.1 97.7 100% 100% 100% 100%   0.0    I 95.8
SWP      94.5 96.8 98.4   -  99.0 99.6   0.6 I 89.1 98.4 97.5   -  98.4 98.6   -     -     I   -
TBA        -  93.8 92.6   *    *    *     -  I 96.1   -  98.7   *    *    *    -     -     I   -
TNT        -    -    -    *    *    *     -  I   -    -  44.4   *    *    *    -     -     I   -
VET      64.9   -    -  65.4   *    *     -  I   -  94.0   -  94.9   *    *    -     -     I   -
VSA        -  56.7   -    -    -    -     -  I   -  84.4   -    -    -    -    -     -     I   -
VSP        -    -    -  87.0 69.8 78.1   8.3 I   -    -    -  86.7  0.3  0.0   -     -     I   -
---------------------------------------------+---------------------------------------------+------
Mean:    87.4 88.1 89.0 89.2 90.0 91.0   1.8 I 94.7 95.9 91.6 95.3 95.1 96.5 96.3  +0.6    I 57.7
---------------------------------------------+---------------------------------------------+------

Generally, the "mean" detection rate of macro viruses is stable on
an acceptable though not overly good level (96.3%); again, products
which participated in the last VTC tests succeed in following the
growth by keeping detection rates (+0.6%) on high levels, from where
spectacular improvements are less easy.
The same grid used for Windows-98 is applied to select the best
products:

                     -- Macro ITW --    -- Macro Zoo --
                     (virus  files)     (virus  files)
  -----------------------------------------------------
  "Perfect" W-NT scanners:
     CMD             (100%   100%)      (100%   100%)
     FPW             (100%   100%)      (100%   100%)
     FSE             (100%   100%)      (100%   100%)
     SCN             (100%   100%)      (100%   100%)
  -----------------------------------------------------
  "Almost Perfect" W-NT scanners:
     AVK             (100%   100%)      (100~   100~)
     AVP             (100%   100%)      (100~   100~)
     PAV             (100%   100%)      (100~   100~)
  -----------------------------------------------------
  "Excellent" W-NT scanners:
     NVC             (100%   100%)      (99.9   99.9)
     INO             (100%   100%)      (99.8   99.8)
     AVX             (100%   99.6)      (99.0   99.1)
  -----------------------------------------------------

Grading of best W-NT products:
------------------------------
4 products are "perfect" as they detect ALL zoo and ITW macro viruses
  in all instantiations:                        CMD, FPW, FSE and SCN.
3 products are "almost perfect" as they detect ALL ITW macro viruses
  and all BUT ONE zoo macro virus/sample:       AVK, AVP and PAV.
3 products are "excellent" as they detect ALL ITW macro viruses and
  all BUT FEW zoo macro viruses/samples:        NVC, INO and AVX.

HOWEVER: regarding the detection of script viruses, the situation is
worse than that reported under DOS and even worse than under W-98
(mean rate: 61%), as only 57.7% are detected in the mean.

Grading of script virus detection under W-NT:
---------------------------------------------
NO product is "perfect" (100%) or "excellent" (>99%)
 2 products are "very good": FSE (96.7%), SCN (95.8%)
 4 products are "good":      CMD (93.5%), AVK (91.8%), FPW (90.8%),
                             PAV (90.2%)

*********************************************************************
Findings #9: Macro virus detection rates under W-NT on high level
             but script virus detection rates insufficient.
*********************************************************************
Findings #9.1: Detection rates for macro viruses of scanners under
          Windows NT are rather stable on a fairly high level, with
          4 products on "perfect" level:
             Perfect scanners (100%):   4 (last test: 1)
                 CMD, FPW, FSE, SCN.
             Almost perfect scanners:   3
                 AVK, AVP, PAV.
             Excellent scanners (>99%): 3 (last test: 8)
                 NVC, INO, AVX.
Findings #9.2: HOWEVER: similar to DOS (see #1.3) and W-98 (see #8.2),
          detection rates of script viruses under W-NT are FAR FROM
          acceptable:
             NO product is "perfect" (100%) or "excellent" (>99%)
             2 products are "very good": FSE, SCN
             4 products are "good":      CMD, AVK, FPW, PAV
Findings #9.3: AV producers must invest significantly more work and
          quality into detection of script viruses as this threat is
          growing significantly for W-NT users!
**********************************************************************

Eval #10: Overall virus detection rates under Windows-2000:
===========================================================

For the first time, VTC also tested 20 products submitted for the
newly deployed platform "Windows-2000" (aka W-2k). Compared to the
"usual" problems which we experienced in previous tests (esp. under
W-98), we observed NO SINGLE CRASH under W-2k (which is worth
mentioning).

The following table summarizes the results of macro and script virus
detection under Windows-2000 in this test.
Table AD: Comparison: Macro/Script Virus Detection Rates in
          VTC test "2000-08" under W-2k:
=======================================================
Detection of:
Scan      I  - Macro Virus -  I  - Script Virus -
ner       I       0008        I       0008
----------+-------------------+------------------
ANT       I       93.3        I       53.9
AVA       I       94.1        I       15.0
AVG       I       97.9        I       45.8
AVK       I      100.0        I       91.5
AVP       I      100.0        I       88.2
AVX       I       99.0        I       61.4
CLE       I        -          I        4.2
CMD       I      100.0%       I       93.5
DRW       I       97.5        I       59.8
FPW       I      100.0%       I       90.8
FSE       I      100.0%       I       96.7
INO       I       99.8        I       78.1
NAV       I       97.7        I       36.6
NVC       I       99.9        I       83.7
PAV       I      100.0        I       90.2
PER       I       85.0        I        0.0
PRO       I       69.1        I       12.1
QHL       I        0.0        I        6.9
RAV       I       96.9        I       47.1
SCN       I      100.0%       I       95.8
----------+-------------------+------------------
Mean:     I       91.1%       I       57.6%
----------+-------------------+------------------

The same grid used for W-98 and W-NT is applied to select the best
products:

                     -- Macro ITW --    -- Macro Zoo --
                     (virus  files)     (virus  files)
  -----------------------------------------------------
  "Perfect" W-2k scanners:
     CMD             (100%   100%)      (100%   100%)
     FPW             (100%   100%)      (100%   100%)
     FSE             (100%   100%)      (100%   100%)
     SCN             (100%   100%)      (100%   100%)
  -----------------------------------------------------
  "Almost Perfect" W-2k scanners:
     AVK             (100%   100%)      (100~   100~)
     AVP             (100%   100%)      (100~   100~)
     PAV             (100%   100%)      (100~   100~)
  -----------------------------------------------------
  "Excellent" W-2k scanners:
     NVC             (100%   100%)      (99.9   99.9)
     INO             (100%   100%)      (99.8   99.8)
     AVX             (100%   99.6)      (99.0   99.1)
  -----------------------------------------------------

Grading of best W-2k products:
------------------------------
4 products are "perfect" as they detect ALL zoo and ITW macro viruses
  in all instantiations:                        CMD, FPW, FSE and SCN.
3 products are "almost perfect" as they detect ALL ITW macro viruses
  and all BUT ONE zoo macro virus/sample:       AVK, AVP and PAV.
3 products are "excellent" as they detect ALL ITW macro viruses and
  all BUT FEW zoo macro viruses/samples:        NVC, INO and AVX.

In comparison with macro virus detection (mean value: 91.1%), the
detection of script viruses is insufficient (mean value: 57.6%).
But it is interesting to observe that the detection rates of several
products are more than 1.0% higher than those of their W-98 and W-NT
versions:

Grading of script virus detection under W-2k:
---------------------------------------------
NO product is "perfect" (100%) or "excellent" (>99%)
 3 products are "very good": FSE (97.2), SCN (96.6), CMD (95.3)
 5 products are "good":      AVK (93.9), AVP (92.4), FPW (92.4),
                             PAV (92.4)

*********************************************************************
Findings #10: Macro virus detection rates under W-2k on high level
              but script virus detection rates insufficient.
*********************************************************************
Findings #10.1: Detection rates for macro viruses of scanners under
          Windows 2000 start on a high level, with 4 products on
          "perfect" level:
             Perfect scanners (100%):   CMD, FPW, FSE, SCN.
             Almost perfect scanners:   AVK, AVP, PAV.
             Excellent scanners (>99%): NVC, INO, AVX.
Findings #10.2: HOWEVER: similar to all other platforms, detection
          rates of script viruses under W-2k are FAR FROM acceptable:
             NO product is "perfect" (100%) or "excellent" (>99%)
             3 products are "very good": FSE, SCN, CMD
             5 products are "good":      AVK, AVP, FPW, PAV
Findings #10.3: AV producers must invest significantly more work and
          quality into detection of script viruses as this threat is
          growing significantly for W-2k users!
**********************************************************************

Eval #11: File/Macro Virus detection under 32-bit engines:
==========================================================

Concerning the 32-bit engines used under W-98, W-NT and W-2k, it is
interesting to test the validity of the hypothesis that related
engines produce the same detection and identification quality on all
platforms (for details see 6HCMP32.TXT). When comparing the results
of the related tests, the good news is that 32-bit engines
increasingly behave equally well under W-98 and W-NT, now also
including Windows-2000, for which products were tested for the first
time.
Equal detection of zoo macro viruses:   14 (of 19) products
                of ITW macro viruses:   17 (of 19) products
                of zoo script viruses:  15 (of 20) products

*********************************************************************
Findings #11: Several W-32 scanners perform equally on W-98/W-NT/W-2k
*********************************************************************
Findings #11.1: The assumption that 32-bit engines in scanners
          produce the same detection rates for different
          instantiations of 32-bit operating systems (including
          Windows-98, Windows-NT and Windows-2000) is now correct
          for many though not all scanners.
Findings #11.2: Analysis of ITW detection rates is NOT sufficient to
          determine the behaviour of 32-bit engines and does not
          guarantee equal detection rates for different W-32
          platforms (esp. W-98/W-NT/W-2k).
*********************************************************************
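The per-product comparison behind Findings #11 can be sketched as
follows: a product counts as "equal" when it reports identical
detection rates under W-98, W-NT and W-2k. The rates below are
illustrative placeholders, not the actual per-platform results:

```python
# Sketch: count products whose detection rate for one test category
# is identical under W-98, W-NT and W-2k.
# The numbers are illustrative placeholders, not measured results.

rates = {
    # product: (W-98, W-NT, W-2k) detection rate in percent
    "SCN": (100.0, 100.0, 100.0),
    "FSE": (100.0, 100.0, 100.0),
    "AVX": (99.1, 99.0, 99.0),   # differs between platforms
    "PRO": (69.1, 69.1, 69.1),
}

equal = sorted(p for p, r in rates.items() if len(set(r)) == 1)
print(len(equal), "of", len(rates), "products detect equally:", equal)
```

Applying this check per category (zoo macro, ITW macro, zoo script)
yields counts of the form "14 (of 19) products" as listed above.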