=========================================
File 7EVAL-WNT.TXT
Evaluation of results for Macro and Script Virus/Malware
detection under Windows-NT in VTC Test "2001-07":
=========================================
Formatted with non-proportional font (Courier)

Foreword:
This test is based on VTC test "2001-04". Products submitted for that
test (with engines and signatures dated before Dec.12, 2000) were
applied to 2 differential testbeds:
  Testbed ".011" contained all those macro/script viruses/malware
    which were reported between Nov.1, 2000 and Jan.31, 2001.
  Testbed ".014" contained all those macro/script viruses/malware
    which were reported between Feb.1 and April 30, 2001.
For both testbeds, only those viruses were included in In-The-Wild
tests which had been newly reported as "In-The-Wild" during the
related period.

The goal of test "2001-07" is to determine to which degree AV
products are able to reliably detect macro/script viruses and malware
found after delivery of engines and signatures. In order to enable
comparison with VTC test "2001-04", products were used in the same
mode, esp. with the same options and parameters as in that test.
Therefore, this is NOT EXACTLY a test for heuristic detection (where
one would set switches/options accordingly, e.g. to exclude detection
by signatures). ("Heureka" is a word from ancient Greek meaning
"I found it").
Content of this file:
*********************************************************************
Eval WNT.0A:  Development of Zoo macro virus detection rates
Eval WNT.0B:  Development of ITW macro virus detection rates
Eval WNT.0C:  Development of zoo script virus detection rates
Eval WNT.0D:  Development of macro malware detection rates
Eval WNT.SUM  Grading of WNT products according to "Heureka" results
*********************************************************************

This part of VTC "2001-07" test report evaluates the detailed results
as given in section (file):
   6GWNT.TXT   Macro/Script Viruses/Malware results (W-NT)

The following (15) products participated in this special "heuristic"
scanner test for WNT products:
--------------------------------------------------------
Products submitted for aVTC test under Windows-NT:
--------------------------------------------------------
   AV3 v: 3.0.304.0            sig: Dec.04,2000
   AVG 6 v: 6.220              sig: Dec.11,2000
   AVK 10 v: 10,0,0,0          sig: Dec.07,2000
   AVP Platinum v: 3.5.311.0   sig: Dec.07,2000
   CMD v: 4.60                 sig: Dec.11,2000
   FPR v: 3.08b                sig: Dec.11,2000
   FPW v: 3.08b                sig: Dec.11,2000
   FSE v: 5.21                 sig: Dec.01,2000
   INO v: 4.53 Enterprise Ed.  sig: Dec.11,2000
   NVC v: 4.86                 sig: Dec.01,2000
   PAV v: 3.0.132.4            sig: Dec.07,2000
   PER v: 6.60                 sig: Nov.30,2000
   RAV v: 8.1.001              sig: Dec.11,2000
   SCN v: 4.12.0               sig: Dec.06,2000
   VSP v: 12.02.2              sig: Dec.11,2000
--------------------------------------------------------
Remark: NAV and Pro, which participated in VTC test "2001-04", were
withdrawn from this test, as new versions were announced.
Eval WNT.0A: Development of Zoo macro virus detection rates:
============================================================
            Total known    I      New viruses detected:
Testbed     "2000-0A"      I    "2001-01"       "2001-04"
------------------------+------------------------------
            Viruses        I    New Viruses     New Viruses
Scanner     detected       I    detected        detected
------------------------+------------------------------
Testbed    6233  100.0     I     267  100.0      278  100.0
------------------------+------------------------------
AV3        5966   95.7     I     139   52.1      156   56.1
AVG        6128   98.3     I     226   84.6      216   77.7
AVK        6230  100.0     I     195   73.0      147   52.9
AVP        6230  100.0     I     195   73.0      147   52.9
CMD        6233  100.0     I     225   84.3      214   77.0
FPR        6233  100.0     I     225   84.3      214   77.0
FPW        6233  100.0     I     225   84.3      214   77.0
FSE        6233  100.0     I     246   92.1      247   88.8
INO        6215   99.7     I     250   93.6      256   92.1
NVC        6219   99.8     I     146   54.7       78   28.1
PAV        6230  100.0     I     196   73.4      151   54.3
PER        4252   68.2     I     199   74.5      221   79.5
RAV        6208   99.6     I     223   83.5      223   80.2
SCN        6233  100.0     I     261   97.8      270   97.1
VSP           1    0.0     I       0    0.0        0    0.0
------------------------+------------------------------
Mean              90.8%    I           73.7%           66.0%
------------------------+------------------------------

Analysis: the ability of AV products to detect macro viruses which
were first reported after the related engine and signatures were
shipped evidently differs widely:
 - with one exception (PER), AV products continuously lose
   detection quality over time
 - compared with the mean detection rate in the reference test
   (90.8%), the loss of detection quality is
     * in the first 3 months: -17.1%
     * in the next 3 months:  -24.8%

In order to determine the quality (starting with the high detection
rate in reference test "2001-04"), we define the "loss vector":
   loss vector = (detection rate in reference test,
                  loss in months 1-3,
                  loss in months 4-6)
In order to classify product behaviour, we grade products according
to their loss in detection quality.
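The loss-vector definition above amounts to two subtractions per
product. A minimal Python sketch (the function name is illustrative,
not part of the original report; rates are taken from the zoo macro
virus table):

```python
# Sketch of the "loss vector" defined above: the reference-test
# detection rate plus the loss (in percentage points) over each
# following 3-month period. Results are rounded to one decimal,
# matching the report's tables.

def loss_vector(ref_rate, rate_months_1_3, rate_months_4_6):
    """Return (reference rate, loss in months 1-3, loss in months 4-6)."""
    return (ref_rate,
            round(ref_rate - rate_months_1_3, 1),
            round(ref_rate - rate_months_4_6, 1))

# Example with SCN's zoo macro rates: 100.0% in the reference test,
# 97.8% and 97.1% of the new viruses in the two periods.
print(loss_vector(100.0, 97.8, 97.1))   # (100.0, 2.2, 2.9)
```

The same call with INO's rates (99.7, 93.6, 92.1) reproduces the
-6.1% / -7.6% losses listed below.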
When considering only products with losses of at most 20% in the
first 3-month period, the following products behaved best in the
"Heureka" test:
------------------------------------
            detection rate  loss in    loss in
AV product  in ref-test     month 1-3  month 4-6
------------------------------------
SCN          100.0%          - 2.2%     - 2.9%
INO           99.7%          - 6.1%     - 7.6%
FSE          100.0%          - 7.9%     -11.2%
AVG           98.3%          -13.7%     -20.6%
CMD          100.0%          -15.7%     -23.0%
FPR          100.0%          -15.7%     -23.0%
FPW          100.0%          -15.7%     -23.0%
------------------------------------------------

*************************************************************
Result "Heureka-A": concerning new zoo macro viruses, the
  following 2 products (SCN, INO) miss less than 10% over
  each 3-month period; FSE follows closely:
  -----------------------------------------------------
  SCN    1st period: - 2.2%    2nd period: - 2.9%
  -----------------------------------------------------
  INO    1st period: - 6.1%    2nd period: - 7.6%
  -----------------------------------------------------
  FSE    1st period: - 7.9%    2nd period: -11.2%
**************************************************************

Eval WNT.0B: Development of ITW macro virus detection rates:
============================================================
            Total known    I      New viruses found:
Testbed     "2000-0A"      I    "2001-01"       "2001-04"
------------------------+------------------------------
            Viruses        I    New Viruses     New Viruses
Scanner     detected       I    detected        detected
------------------------+------------------------------
Testbed     147  100.0     I      22  100.0       11  100.0
------------------------+------------------------------
AV3         147  100.0     I      21   95.5        7   63.6
AVG         147  100.0     I      22  100.0        9   81.8
AVK         147  100.0     I      22  100.0        9   81.8
AVP         147  100.0     I      22  100.0        9   81.8
CMD         147  100.0     I      22  100.0       11  100.0
FPR         147  100.0     I      22  100.0       11  100.0
FPW         147  100.0     I      22  100.0       11  100.0
FSE         147  100.0     I      22  100.0       11  100.0
INO         147  100.0     I      22  100.0       10   90.9
NVC         147  100.0     I      22  100.0        8   72.7
PAV         147  100.0     I      22  100.0        9   81.8
PER         114   77.6     I      11   50.0       10   90.9
RAV         147  100.0     I      22  100.0       11  100.0
SCN         147  100.0     I      22  100.0       11  100.0
VSP           0    0.0     I       0    0.0        0    0.0
------------------------+------------------------------
Mean              91.8%    I           89.7%           83.0%
------------------------+------------------------------

Concerning In-The-Wild macro viruses, the situation is -
understandably - much better than with zoo viruses, as most ITW
viruses are essentially new variants of virus families found some
time (if not long) ago. Consequently, generic detection of new
viruses in such - often fast-growing - strains is available in most
scanners. Indeed, several products detect ALL new variants in both
periods.

Applying the same grading criteria as for zoo viruses, the following
products behaved best in the "Heureka" test:
------------------------------------
            detection rate  loss in    loss in
AV product  in ref-test     month 1-3  month 4-6
------------------------------------
CMD          100.0%          - 0.0%     - 0.0%
FPR          100.0%          - 0.0%     - 0.0%
FPW          100.0%          - 0.0%     - 0.0%
FSE          100.0%          - 0.0%     - 0.0%
RAV          100.0%          - 0.0%     - 0.0%
SCN          100.0%          - 0.0%     - 0.0%
-------------------------------------
INO          100.0%          - 0.0%     - 9.1%
AVG          100.0%          - 0.0%     -18.2%
AVK          100.0%          - 0.0%     -18.2%
AVP          100.0%          - 0.0%     -18.2%
PAV          100.0%          - 0.0%     -18.2%
NVC          100.0%          - 0.0%     -27.3%
-------------------------------------

*****************************************************************
Result "Heureka B": The following *6* products detect In-The-Wild
  macro viruses with highest persistency over both 3-month periods:
          CMD, FPR, FPW, FSE, RAV, SCN
*****************************************************************

Eval WNT.0C: Development of zoo script virus detection rates:
=============================================================
            Total known    I      New viruses found:
Testbed     "2000-0A"      I    "2001-01"       "2001-04"
------------------------+------------------------------
            Viruses        I    New Viruses     New Viruses
Scanner     detected       I    detected        detected
------------------------+------------------------------
Testbed     477  100.0     I      85  100.0      121  100.0
------------------------+------------------------------
AV3         139   29.1     I      21   24.7       14   11.6
AVG         276   57.9     I      44   51.8       38   31.4
AVK         476   99.8     I      70   82.4       91   75.2
AVP         476   99.8     I      77   90.6       95   78.5
CMD         462   96.9     I      32   37.6       33   27.3
FPR         463   97.1     I      32   37.6       33   27.3
FPW         462   96.9     I      32   37.6       33   27.3
FSE         477  100.0     I      74   87.1       95   78.5
INO         442   92.7     I      58   68.2       62   51.2
NVC         422   88.5     I      51   60.0       57   47.1
PAV         476   99.8     I      78   91.8       95   78.5
PER         105   22.0     I      18   21.2       12    9.9
RAV         405   84.9     I      58   68.2       75   62.0
SCN         477  100.0     I      78   91.8       88   72.7
VSP         407   85.3     I      55   64.7       73   60.3
------------------------+------------------------------
Mean              83.4%    I           61.0%           49.3%
------------------------+------------------------------

In comparison with zoo macro viruses, the situation concerning script
(= VBS, JavaScript etc.) viruses is much less favourable, as the
following table demonstrates (again, only AV products with a
detection rate loss of less than 20% in the first period are listed,
with a minimum detection rate of 90% in the reference test):
------------------------------------
            detection rate  loss in    loss in
AV product  in ref-test     month 1-3  month 4-6
------------------------------------
PAV           99.8%          - 8.0%     -21.3%
SCN          100.0%          - 8.2%     -27.3%
AVP           99.8%          - 9.2%     -21.3%
FSE          100.0%          -12.9%     -21.5%
AVK           99.8%          -17.4%     -24.6%
------------------------------------

*****************************************************************
Result "Heureka C": Zoo script virus detection is significantly
  less well developed. With the best products losing more than 8%
  in both 3-month periods, there is a strong need for improved
  persistent detection methods, esp. as this category addresses
  many mass-emailing viruses!
*****************************************************************

Eval WNT.0D: Development of macro malware detection rates:
==========================================================
            Total known    I      New malware found:
Testbed     "2000-0A"      I    "2001-01"       "2001-04"
------------------------+------------------------------
            Malware        I    New Malware     New Malware
Scanner     detected       I    detected        detected
------------------------+------------------------------
Testbed     403  100.0     I      23  100.0       41  100.0
------------------------+------------------------------
AV3         329   81.6     I       6   26.1       13   31.7
AVG         323   80.1     I      13   56.5       23   56.1
AVK         400   99.3     I      16   69.6       25   61.0
AVP         400   99.3     I      16   69.6       25   61.0
CMD         402   99.8     I      16   69.6       27   65.9
FPR         402   99.8     I      16   69.6       27   65.9
FPW         402   99.8     I      16   69.6       27   65.9
FSE         403  100.0     I      19   82.6       28   68.3
INO         378   93.8     I      18   78.3       29   70.7
NVC         399   99.0     I      10   43.5       21   51.2
PAV         400   99.3     I      16   69.6       25   61.0
PER         234   58.1     I      15   65.2       25   61.0
RAV         391   97.0     I      18   78.3       24   58.5
SCN         403  100.0     I      18   78.3       28   68.3
VSP           1    0.2     I       0    0.0        0    0.0
------------------------+------------------------------
Mean              87.1%    I           61.8%           56.4%
------------------------+------------------------------

As non-replicative malware (esp. trojan horses) has recently been
reported in large numbers from Internet newsgroups and on websites,
it is essential that scanners detect at least new instantiations of
known malware.
The following table lists those products with detection rates of at
least 80% in the reference test which do not lose more than 30% in
detection rate during the 1st 3-month period:
------------------------------------
            detection rate  loss in    loss in
AV product  in ref-test     month 1-3  month 4-6
------------------------------------
FSE          100.0%          -17.4%     -31.7%
SCN          100.0%          -21.7%     -31.7%
AVK           99.3%          -29.7%     -38.3%
AVP           99.3%          -29.7%     -38.3%
PAV           99.3%          -29.7%     -38.3%
RAV           97.0%          -18.7%     -38.5%
INO           93.8%          -15.5%     -23.1%
-------------------------------------

******************************************************************
Result "Heureka D": The persistency of non-replicative malware
  detection needs further improvement. Even those 5 AV products
  which detect macro malware "perfectly" (100.0%) or "excellently"
  (>=99.0%) lose detection quality already in the 1st 3-month
  period.
******************************************************************

Eval WNT.SUM: Grading of WNT products according to "Heureka" results
====================================================================
In order to grade the "persistent quality" of AV products in
detecting new viruses in consecutive 3-month periods, we define the
following grid (applied to the first 3-month period):
 Category I:   "perfect persistency":
                loss per 3-month period  =  0.0%
 Category II:  "excellent persistency":
                0.0% <  loss per 3-month period <=  5.0%
 Category III: "very good persistency":
                5.0% <  loss per 3-month period <= 10.0%
 Category IV:  "good persistency":
               10.0% <  loss per 3-month period <= 15.0%
 Category V:   "acceptable persistency":
               15.0% <  loss per 3-month period <= 20.0%

Except for In-The-Wild macro viruses, NO product reaches "perfectly
persistent" detection rates:
*******************************************************
Overall, NO AV product detects macro and script malware
(including viruses) with perfect persistency.
*******************************************************

The following table lists those AV products which fall into
categories I (perfect), II (excellent) and III (very good):

Test category:          Perfect         Excellent   Very Good
---------------------------------------------------------------
WNT zoo macro virus:    ---             SCN         INO,FSE
WNT ITW macro virus:    CMD,FPR,FPW,    ---         ---
                        FSE,RAV,SCN
WNT zoo script virus:   ---             ---         PAV,SCN,AVP
----------------------------------------------------------------
WNT macro malware:      ---             ---         ---
----------------------------------------------------------------

In order to support the race for more customer protection, we rank
the products' performance in this test with a simple algorithm,
counting places over all test categories (weighting "perfect" with
3 points, "excellent" with 2 points and "very good" with 1 point):
************************************************************
 "Perfect" WNT persistent AV product:   =NONE=
************************************************************
 "Excellent" WNT AV products:
    1st place:  SCN                 ( 6 points)
    2nd place:  FSE                 ( 4 points)
    3rd place:  CMD, FPR, FPW, RAV  ( 3 points)
    7th place:  AVP, INO, PAV       ( 1 point )
**************************************************************
 No "perfect", "excellent" or "very good" persistent
 AM (anti-malware) product
**************************************************************
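As a compact illustration, the persistency grid and the
place-counting algorithm described above can be sketched together in
Python. The weights and placements are copied from the summary; all
function names and the data layout are my own, not part of the
original report:

```python
# Illustrative sketch of two steps defined above:
# (1) mapping a per-period loss onto the persistency grid,
# (2) summing weighted places over the test categories.

def persistency_category(loss):
    """Grade a detection-quality loss (percentage points per 3 months)."""
    if loss == 0.0:
        return "I (perfect)"
    elif loss <= 5.0:
        return "II (excellent)"
    elif loss <= 10.0:
        return "III (very good)"
    elif loss <= 15.0:
        return "IV (good)"
    elif loss <= 20.0:
        return "V (acceptable)"
    return "unrated (loss > 20%)"

# Weighting from the summary: perfect = 3, excellent = 2,
# very good = 1; placements copied from the summary table.
WEIGHTS = {"perfect": 3, "excellent": 2, "very good": 1}
PLACEMENTS = {
    "zoo macro":     {"excellent": ["SCN"], "very good": ["INO", "FSE"]},
    "ITW macro":     {"perfect": ["CMD", "FPR", "FPW", "FSE", "RAV", "SCN"]},
    "zoo script":    {"very good": ["PAV", "SCN", "AVP"]},
    "macro malware": {},
}

scores = {}
for grades in PLACEMENTS.values():
    for grade, products in grades.items():
        for product in products:
            scores[product] = scores.get(product, 0) + WEIGHTS[grade]

print(persistency_category(2.2))                         # II (excellent)
print(sorted(scores.items(), key=lambda kv: -kv[1])[0])  # ('SCN', 6)
```

The computed scores reproduce the published ranking: SCN 6, FSE 4,
then CMD/FPR/FPW/RAV with 3 and AVP/INO/PAV with 1 point.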