=========================================
File 7EVALW2k.TXT
-----------------------------------------
Evaluation of results for File, Macro and Script Virus/Malware
detection under Windows-2000 in aVTC Test "2002-12":
=========================================

Formatted with non-proportional font (Courier).

Content of this file:
=====================
**********************************************************************
Eval W2k:    Development of detection rates under Windows-2000:
**********************************************************************
Eval W2k.01: Development of W-2000 Scanner Detection Rates
             Table W2k-A: Comparison File/Macro/Script virus
                          detection rates
Eval W2k.02: In-The-Wild Detection under W-2000
Eval W2k.03: Evaluation of overall W-2000 AV detection rates
Eval W2k.04: Evaluation of detection by virus classes under W-2000
             W2k.04.1 Grading the detection of file viruses under W-2000
             W2k.04.2 Grading the detection of macro viruses under W-2000
             W2k.04.3 Grading the detection of script viruses under W-2000
Eval W2k.05: Detection of Packed Viruses by virus classes under W-2000
             W2k.05.1 Detection of packed file viruses under W-2000
             W2k.05.2 Detection of packed macro viruses under W-2000
Eval W2k.06: Avoidance of False Alarms (File, Macro) under W-2000
             W2k.06.1 Avoidance of false alarms (file) under W-2000
             W2k.06.2 Avoidance of false alarms (macro) under W-2000
Eval W2k.07: Detection of Malware by classes under W-2000
             W2k.07.1 Detection of file malware under W-2000
             W2k.07.2 Detection of macro malware under W-2000
             W2k.07.3 Detection of script malware under W-2000
Eval W2k.SUM: Grading of W-2000 products
**********************************************************************

This part of the VTC "2002-12" test report evaluates the detailed
results as given in section (file):
   6iW2k.TXT   File/Macro/Script Viruses/Malware results W-2000 (W2k)

The following *19* products participated in this scanner test for
W-2000 products:

------------------------------------------------------------------------
Products submitted for aVTC test under Windows-2000:
------------------------------------------------------------------------
AVA  v(def): CLI Lguard32, version 3.0   Eng: 3.0.406.14
             Sig/date: December 13, 2001
AVG  v(def): 6.0.309   date: December 13, 2001
             Sig: 170   date: December 17, 2001
AVK  v(def): W2K: 10,1,0,0
             Sig: 10.0.435   date: December 13, 2001
AVP  v(def): 3.55.160.0   date: December 12, 2001
BDF  v(def): v6.1   def: 51456
CMD  v(def): 4.64.0   date: August 11, 2001   Eng: 3.55.160.3203
             Sign.def date:  December 17, 2001
             Macro.def date: December 16, 2001
DRW  v(def): 4.26 (DrWeb32.txt)   date: September 25, 2001 (test.ful)
             Sig/date: December 17, 2001 (drwtoday.vbd)
FPR  v(def): 3.11b   date: ---
             Sig/date: December 17, 2001 (binary viruses)
             Sig/date: December 16, 2001 (macro viruses)
FPW  v(def): ---   sig: ---
FSE  v(def): 1.00.1251   Sig/date: December 14, 2001
             Eng: 3.09.507 (F-PROT)
             Eng: 3.55.160.3210 (AVP)
             Eng: 1.02.15 (Orion)
IKA  v(def): 5.04   Sig/date: December 16, 2001
INO  v(def): Eng: 49.00   date: December 14, 2001
             Sig/date: December 17, 2001
MR2  v(def): 1.20
NAV  v(def): 7.60.926   Eng: 4.1.0.15   Sig: rev.3
             Sig/date: December 14, 2001
NVC  v(def): 5.00.36
             Sig/date: December 17, 2001 (binary viruses)
             Sig/date: December 16, 2001 (macro viruses)
PRO  v(def): 7.1.C04   date: December 17, 2001
             Sig/date: December 17, 2001
RAV  v(def): 8.3.1 command line for Win32 i386   Eng: 8.5 for i386
             Sig/date: December 17, 2001 at 16:22:24
SCN  v(def): 4.1.60   Sig: 4177   date: December 17, 2001
VSP  v(def): 12.34.1   date: December 17, 2001
             Sig/date: December 17, 2001
------------------------------------------------------------------------

Eval W2k.01: Scanner Detection Rates under Windows-2000:
========================================================

The number of scanners running under Windows 2000 is growing.
Evidently, AV producers now invest more work in the development of
the W32-related platforms, and here in the detection of macro viruses
(with minor improvements) and script viruses (with major improvements).

The following table summarizes the results of file, macro and script
virus detection under Windows-2000 (from test 0008 through 0212):

Table W2k-A: Comparison: File/Macro/Script Virus Detection Rate:
================================================================
Scan I == File Virus ==  I ======= Macro Virus ========  I ====== Script Virus =======
ner  I    Detection      I         Detection             I        Detection
-----+-------------------+-------------------------------+------------------------------
Test I 0104 0212  Delta  I 0008 0104 0110 0212  Delta    I 0008 0104 0110 0212  Delta
-----+-------------------+-------------------------------+------------------------------
ANT  I    -    -     -   I 93.3    -    -    -     -     I 53.9    -    -    -     -
AVA  I 95.0 96.2   +1.2  I 94.1 95.7 97.7 97.8   +0.1    I 15.0 29.1 29.6 31.5   +1.9
AVG  I 81.9 80.6   -1.3  I 97.9 98.3 98.4 98.1   -0.3    I 45.8 57.9 62.9 63.9   +1.0
AVK  I 99.8 99.9   +0.1  I 100~ 100~ 100% 100~   -0.0    I 91.5 99.8 100% 99.0   -1.0
AVP  I 99.9 100~   +0.1  I 100~ 100~ 100~ 100~    0.0    I 88.2 99.8 100% 98.9   -1.1
BDF* I    - 82.9     -   I 99.0    -    - 99.0     -     I 61.4    -    - 72.4     -
CLE  I    -    -     -   I    -    -    -    -     -     I  4.2    -    -    -     -
CMD  I 97.8 98.5   +0.7  I 100% 100% 100~ 99.9   -0.1    I 93.5 96.9 93.2 89.1   -4.1
DRW  I    - 98.3     -   I 97.5    - 99.5 99.4   -0.1    I 59.8    - 95.4 94.7   -0.7
FPR  I 97.8 98.8   +1.0  I    - 100% 100~ 100~    0.0    I    - 96.9 94.6 88.7   -5.9
FPW  I 97.8 98.8   +1.0  I 100% 100% 100~ 100~    0.0    I 90.8 96.9 94.6 88.7   -5.9
FSE  I    - 100~     -   I 100% 100% 100% 100~   -0.0    I 96.7 100% 100% 99.5   -0.5
IKA  I    - 89.2     -   I    -    -    - 96.2     -     I    -    -    - 81.2     -
INO  I 97.9 98.7   +0.8  I 99.8 99.7 99.9 99.9    0.0    I 78.1 93.1 93.9 94.7   +0.8
MCV  I    -    -     -   I    -    - 88.5    -     -     I    -    - 27.7    -     -
MR2  I    -  9.5     -   I    -    -  0.7 10.4   +9.7    I    -    - 83.3 81.0   -2.3
NAV  I 93.9 98.3   +4.4  I 97.7 97.0 99.5 99.6   +0.1    I 36.6 54.5 94.2 96.8   +2.6
NVC  I 98.1 97.8   -0.3  I 99.9 99.8 99.8 99.8    0.0    I 83.7 88.5 91.3 87.6   -3.7
PAV  I 97.5    -     -   I 100~ 99.4 100%    -     -     I 90.2 98.5 100%    -     -
PER  I    -    -     -   I 85.0 68.2    -    -     -     I  0.0 22.0    -    -     -
PRO  I 70.6 70.4   -0.2  I 69.1 67.1    - 72.7     -     I 12.1 40.7    - 59.8     -
QHL  I    -    -     -   I  0.0    -    -    -     -     I  6.9    -    -    -     -
RAV  I 93.5 94.7   +1.2  I 96.9 99.6 99.5 99.9   +0.4    I 47.1 84.9 82.5 96.1  +13.6
SCN  I 89.0 99.8  +10.9  I 100% 100% 100% 100%    0.0    I 95.8 100% 99.8 99.6   -0.2
VSP  I    - 14.0     -   I    -  0.0  0.~  0.~    0.0    I    - 85.3 84.0 81.2   -2.8
-----+-------------------+-------------------------------+------------------------------
Mean I 97.6 85.6%  +0.9  I 99.9 89.7 88.0 88.0%  +2.4%   I 57.6 79.4 84.8 84.4%  -0.5%
Mean I                   I                               I
>10% I      89.8%    -   I           98.9 92.9%          I           91.9 84.4%    -
-----+-------------------+-------------------------------+------------------------------
  *) BDF was tested under the code name AVX in earlier tests (AVX=BDF).

Remark: for abbreviations of products (code names), see appendix
        A5CodNam.txt.

In comparison with the last test, the mean detection rate for file zoo
viruses is significantly reduced (by more than 11%) to 85.6%; for
macro viruses it is unchanged at too low a level (88.0%); and for
script viruses it is almost unchanged at too low a level (84.4%).
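The "Mean" and "Mean >10%" rows above can be reproduced with simple
arithmetic. The following Python sketch (an illustrative subset of the
data, not VTC's actual tooling) shows one reading, in which "Mean >10%"
excludes scanners at or below a 10% detection rate:

------------------------------------------------------------------------
# A minimal sketch of the mean and delta computations in Table W2k-A.
# Rates are percentages; only an illustrative subset of products is used.

rates_0212 = {   # file zoo detection rates, test "0212"
    "AVA": 96.2, "AVG": 80.6, "AVK": 99.9, "AVP": 100.0,
    "MR2": 9.5,  "SCN": 99.8, "VSP": 14.0,
}
rates_0104 = {   # same products, previous test "0104", where available
    "AVA": 95.0, "AVG": 81.9, "AVK": 99.8, "AVP": 99.9, "SCN": 89.0,
}

def mean(values):
    values = list(values)
    return sum(values) / len(values)

# Plain mean over all participating products:
m_all = mean(rates_0212.values())

# "Mean >10%": scanners scoring 10% or less are excluded as outliers:
m_10 = mean(v for v in rates_0212.values() if v > 10.0)

# Per-product delta against the previous test, only where both exist:
delta = {p: rates_0212[p] - rates_0104[p]
         for p in rates_0212 if p in rates_0104}

print(f"mean: {m_all:.1f}%   mean>10%: {m_10:.1f}%")
print({p: f"{d:+.1f}" for p, d in delta.items()})
------------------------------------------------------------------------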
Concerning file zoo virus detection, NO product is able to detect ALL
viruses (rating: "perfect"), but several products detect more than 99%
and are rated "excellent": AVP and FSE (both 100~), AVK (99.9%),
SCN (99.8%).

Concerning macro zoo virus detection, ONE product detects ALL viruses
and is rated "perfect": SCN. In addition, 12 products detect >99% of
viruses and are rated "excellent": AVK,AVP,FPR,FPW,FSE (all: 100~),
CMD,INO,RAV (all: 99.9%), NAV,NVC (both 99.8%), DRW (99.4%) and
BDF (99.0%).

Concerning script zoo virus detection, NO product detects ALL viruses,
and only 3 products detect more than 99% and are rated "excellent":
SCN (99.6%), FSE (99.5%), AVK (99.0%).

****************************************************************
Findings W2k.1: For W-2000, no improvement in file, macro and
      script zoo virus detection rates can be reported.
      Here, significant work is needed. Mean detection
      rates remain unacceptably low:
         mean file zoo virus detection rate:  85.6%
         mean macro virus detection rate:     88.0%
         mean script virus detection rate:    84.4%
      ------------------------------------------------
      Concerning file zoo viruses:
        NO product detects ALL viruses ("perfect")
        4 products detect more than 99% and are
          rated "excellent": AVP,FSE;AVK,SCN
      ------------------------------------------------
      Concerning macro zoo viruses:
        1 product detects ALL macro zoo viruses in all
          files and is rated "perfect": SCN
        12 products detect almost all macro viruses in
          almost all files and are rated "excellent":
          AVK,AVP,FPR,FPW,FSE;CMD,INO,RAV;NAV,NVC;DRW;BDF
      ------------------------------------------------
      Concerning script zoo viruses:
        NO product detects ALL viruses ("perfect")
        3 products detect almost all script viruses in
          almost all files and are rated "excellent":
          SCN,FSE,AVK
****************************************************************

Eval W2k.02: In-The-Wild (File,Macro,Script) Detection under W-2000
===================================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
  - detection rate is 100% : scanner is "perfect"
  - detection rate is >99% : scanner is "excellent"
  - detection rate is >95% : scanner is "very good"
  - detection rate is >90% : scanner is "good"
  - detection rate is <90% : scanner is "risky"

100% detection of In-The-Wild viruses, including detection of ALL
instantiations (infected objects) of those viruses, is now an ABSOLUTE
REQUIREMENT for the rating "perfect", for file, macro and script
viruses alike (it must be observed that detection and identification
are not completely reliable). Presently, 6 scanners are "perfect" in
this category: AVK,AVP,DRW,FSE,NAV,SCN. NO further scanner reaches the
grade "excellent" (100% of ALL ITW viruses detected while missing only
a few infected objects, i.e. >99% object detection).
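The perfect/excellent distinction for ITW detection rests on pairs of
rates per virus class: the share of ITW viruses detected at all, and
the share of infected objects (instantiations) detected. A minimal
Python sketch of this rule follows; the RAV figures in it are purely
illustrative, not measured values:

------------------------------------------------------------------------
# A minimal sketch of the ITW grading rule above. Per class, two rates
# matter: (virus %, infected-object %). "Perfect" requires 100%/100% in
# every class; "excellent" requires all viruses found with only a few
# objects missed (>99%).

itw = {   # (virus %, object %) for file, macro, script; RAV illustrative
    "SCN": [(100.0, 100.0), (100.0, 100.0), (100.0, 100.0)],
    "RAV": [(100.0, 99.9),  (100.0, 99.8),  (100.0, 100.0)],
}

def itw_grade(pairs):
    if all(v == 100.0 and o == 100.0 for v, o in pairs):
        return "perfect"
    if all(v == 100.0 and o > 99.0 for v, o in pairs):
        return "excellent"
    return "below excellent"

for product, pairs in itw.items():
    print(product, itw_grade(pairs))   # SCN -> perfect, RAV -> excellent
------------------------------------------------------------------------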
                      ITW virus/file detection
                  ( FileV.    MacroV.    ScriptV. )
----------------------------------
"Perfect" W2k ITW scanners:
   AVK   (100% 100%;  100% 100%;  100% 100%)
   AVP   (100% 100%;  100% 100%;  100% 100%)
   DRW   (100% 100%;  100% 100%;  100% 100%)
   FSE   (100% 100%;  100% 100%;  100% 100%)
   NAV   (100% 100%;  100% 100%;  100% 100%)
   SCN   (100% 100%;  100% 100%;  100% 100%)
----------------------------------
"Excellent" W2k ITW scanners:
   === NONE ===
----------------------------------

Concerning detection of ITW file viruses:
   6 scanners are perfect:        NAV,AVK,AVP,DRW,FSE,SCN
   2 more scanners are excellent: RAV,INO

Detection of ITW macro viruses is much better developed than detection
of file viruses and infected objects:
   7 scanners are rated "perfect":
      AVK,AVP,DRW,FSE,INO,NAV,SCN: all (100% 100%)
   10 more scanners are rated "excellent":
      AVG,BDF,CMD,FPR,FPW,NVC: all (100% 99.9%), RAV (100% 99.8%),
      AVA (100% 99.6%), IKA (100% 99.5%), PRO (100% 99.3%)

Detection of ITW script viruses is relatively best, as most products
(13 of 19) detected ALL viruses in ALL infected objects:
   13 scanners are rated "perfect":
      AVG,AVK,AVP,CMD,DRW,FPR,FPW,FSE,MR2,NAV,NVC,RAV,SCN:
      all (100% 100%)
   2 more scanners are rated "excellent": IKA,INO (100% 99.5%)

******************************************************************
Findings W2k.2: 6 AV products (out of 19) detect ALL In-The-Wild
      file, macro and script viruses in ALL instantiations
      (files) and are rated "perfect": AVK,AVP,DRW,FSE,NAV,SCN
      NO scanner is rated "excellent": ---
      *************************************************
      Concerning detection of ITW file viruses:
         6 "perfect" scanners:   NAV,AVK,AVP,DRW,FSE,SCN
         2 "excellent" scanners: RAV,INO
      Concerning detection of ITW macro viruses:
         7 "perfect" scanners:   AVK,AVP,DRW,FSE,INO,NAV,SCN
        10 "excellent" scanners: AVG,BDF,CMD,FPR,FPW,NVC,RAV,
                                 AVA,IKA,PRO
      Concerning detection of ITW script viruses:
        13 "perfect" scanners:   AVG,AVK,AVP,CMD,DRW,FPR,FPW,
                                 FSE,MR2,NAV,NVC,RAV,SCN
         2 "excellent" scanners: IKA,INO
*****************************************************************

Eval W2k.03: Evaluation of overall W-2000 AV detection rates (zoo,ITW)
======================================================================

The following grid is applied to classify scanners:
  - detection rate =100%     : scanner is graded "perfect"
  - detection rate above 99% : scanner is graded "excellent"
  - detection rate above 95% : scanner is graded "very good"
  - detection rate above 90% : scanner is graded "good"
  - detection rate of 80-90% : scanner is graded "good enough"
  - detection rate of 70-80% : scanner is graded "not good enough"
  - detection rate of 60-70% : scanner is graded "rather bad"
  - detection rate of 50-60% : scanner is graded "very bad"
  - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script
virus detection, for unpacked objects), the lowest of the related
results is used to classify each scanner; only scanners for which all
tests were completed are considered. (For problems in tests, see
8problms.txt.) Besides grading products in related categories
according to their performance, it is interesting to compare how
products developed. The "weakest class" rule is sketched below.
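Read as an algorithm, the overall grade is simply the grid grade of a
scanner's weakest virus class. A minimal Python sketch follows; the
boundary handling of the "80-90%"-style ranges is an assumption:

------------------------------------------------------------------------
# A minimal sketch of the overall-grade rule: each scanner is graded by
# the LOWEST of its file/macro/script zoo detection rates (in percent).
# The listing below additionally requires 100% ITW detection.

def grade(rate):
    if rate == 100.0: return "perfect"
    if rate >  99.0:  return "excellent"
    if rate >  95.0:  return "very good"
    if rate >  90.0:  return "good"
    if rate >= 80.0:  return "good enough"
    if rate >= 70.0:  return "not good enough"
    if rate >= 60.0:  return "rather bad"
    if rate >= 50.0:  return "very bad"
    return "useless"

def overall_grade(file_rate, macro_rate, script_rate):
    # the weakest virus class determines the overall grade
    return grade(min(file_rate, macro_rate, script_rate))

# SCN in this test: 99.8% file, 100% macro, 99.6% script zoo detection
print(overall_grade(99.8, 100.0, 99.6))   # -> excellent
------------------------------------------------------------------------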
The following list indicates those scanners graded into one of the
upper three categories, with file, macro and script virus detection
rates in unpacked samples, and with perfect ITW virus detection
(rate=100%):

                          Zoo test:            ITW test:
                  (file/ macro/script;   file/macro/script)
--------------------------------------
"Perfect" W2k scanners:
                  ========= NONE =========
"Excellent" W2k scanners:
   SCN  (99.8%  100% 99.6%;  100% 100% 100%)
   FSE  ( 100~  100~ 99.5%;  100% 100% 100%)
   AVK  (99.9%  100~ 99.0%;  100% 100% 100%)
--------------------------------------
"Very Good" W2k scanners:
   AVP  ( 100~  100~ 98.9%;  100% 100% 100%)
   NAV  (98.3% 99.6% 96.8%;  100% 100% 100%)
--------------------------------------

******************************************************************
Findings W2k.3: Now, NO W2k product is rated overall "perfect"
      (in the last test: 3 products!): ---
      3 "excellent" overall scanners: SCN,FSE,AVK
      2 "very good" overall scanners: AVP,NAV
******************************************************************

Eval W2k.04: Evaluation of detection by virus classes under W-2000:
===================================================================

Some scanners are specialised in detecting some class of viruses
(either by deliberately limiting themselves to one class, esp. macro
viruses, or by detecting one class significantly better than others).
It is therefore worth noting which scanners perform best in detecting
file, macro and script viruses. Products rated "perfect" (=100%),
"excellent" (>99%) and "very good" (>95%) are listed, where ITW virus
detection must be 100% (this listing rule is sketched after the
findings box below).

W2k.04.1 Grading the Detection of file viruses under W2k
--------------------------------------------------------
"Perfect" W2k file scanners:      === NONE ===
"Excellent" W2k file scanners:    AVP  ( 100~ )
                                  FSE  ( 100~ )
                                  AVK  (99.9%)
                                  SCN  (99.8%)
"Very Good" W2k file scanners:    FPR  (98.8%)
                                  FPW  (98.8%)
                                  INO  (98.7%)
                                  CMD  (98.5%)
                                  DRW  (98.3%)
                                  NAV  (98.3%)
                                  NVC  (97.8%)
                                  AVA  (96.2%)

W2k.04.2 Grading the Detection of macro viruses under W2k
---------------------------------------------------------
"Perfect" W2k macro scanners:     SCN  (100.0%)
"Excellent" W2k macro scanners:   AVK  ( 100~ )
                                  AVP  ( 100~ )
                                  FPR  ( 100~ )
                                  FPW  ( 100~ )
                                  FSE  ( 100~ )
                                  CMD  ( 99.9%)
                                  INO  ( 99.9%)
                                  RAV  ( 99.9%)
                                  NVC  ( 99.8%)
                                  NAV  ( 99.6%)
                                  DRW  ( 99.4%)
                                  BDF  ( 99.0%)
"Very Good" W2k macro scanners:   AVG  ( 98.1%)
                                  AVA  ( 97.8%)
                                  IKA  ( 96.2%)

W2k.04.3 Grading the Detection of script viruses under W2k:
-----------------------------------------------------------
"Perfect" W2k script scanners:    === NONE ===
"Excellent" W2k script scanners:  SCN  ( 99.6%)
                                  FSE  ( 99.5%)
                                  AVK  ( 99.0%)
"Very Good" W2k script scanners:  AVP  ( 98.9%)
                                  NAV  ( 96.8%)
                                  RAV  ( 96.1%)

***********************************************************************
Finding W2k.4: Performance of W2k scanners by virus classes:
      Perfect scanners for file zoo:    ---
      Excellent scanners for file zoo:  AVP,FSE,AVK,SCN
      Very Good scanners for file zoo:  INO,CMD,DRW,NAV,NVC,AVA,FPR,FPW
      Perfect scanners for macro zoo:   SCN
      Excellent scanners for macro zoo: AVK,AVP,FPR,FPW,FSE,CMD,INO,
                                        RAV,NVC,NAV,DRW,BDF
      Very Good scanners for macro zoo: AVG,AVA,IKA
      Perfect scanners for script zoo:  ---
      Excellent scanners for script zoo: SCN,FSE,AVK
      Very Good scanners for script zoo: AVP,NAV,RAV
***********************************************************************
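The listing rule stated in the introduction of Eval W2k.04 combines the
detection grid with the ITW requirement: a scanner appears in a class
listing with the grid grade of its zoo rate, but only if its ITW
detection in that class is 100%. A minimal Python sketch:

------------------------------------------------------------------------
# A minimal sketch of the per-class listing rule of Eval W2k.04.

def class_listing_grade(zoo_rate, itw_rate):
    if itw_rate < 100.0:
        return None                    # not listed at all
    if zoo_rate == 100.0: return "perfect"
    if zoo_rate >  99.0:  return "excellent"
    if zoo_rate >  95.0:  return "very good"
    return None                        # below "very good": not listed

print(class_listing_grade(99.6, 100.0))   # SCN, script zoo -> excellent
------------------------------------------------------------------------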
Eval W2k.05: Detection of Packed File and Macro Viruses under W-2k
==================================================================

Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects. It seems therefore reasonable to test
whether at least ITW viral objects compressed with 6 popular methods
(PKZIP, ARJ, LHA, RAR, WinRAR and CAB) are also detected. Tests are
performed only on In-The-Wild viruses packed once (no recursive
packing). As the last test showed that AV products are rather far from
perfect detection of packed viruses, the testbed has essentially been
left unchanged to ease comparison and measurement of improvement.

ATTENTION: for packing objects in the ITW testbeds, we used WinRAR 2.0.
As WinRAR 2.0 did not properly pack VTC's very large file testbed,
this testbed was packed with WinRAR 2.9, which at that time had been
available in its final version (after a longer beta period) for more
than 3 months. Only during evaluation did we detect that ONLY ONE
product (RAV) was at all able to handle WinRAR 2.9-packed malware, at
least to some degree (though not sufficiently for the grade
"perfect"). Consequently, this evaluation does NOT include WinRAR; it
covers ARJ, CAB, LHA, RAR and ZIP.

Concerning overall detection of BOTH file and macro virus samples, the
grading rules are as follows (see also the sketch after the findings
box below):

A "perfect" product would detect ALL packed viral samples (100%) for
all 5 packers:
----------------------------------------------------
"Perfect" packed virus detectors:    AVK,AVP,BDF,SCN
----------------------------------------------------
An "excellent" product would reach 100% detection of packed viral
samples for at least 4 packers:
----------------------------------------------------
"Excellent" packed virus detectors:  DRW,RAV
----------------------------------------------------
A "very good" product would reach 100% detection of packed viral
samples for at least 3 packers:
----------------------------------------------------
"Very Good" packed virus detector:   FSE
----------------------------------------------------

Concerning detection of packed file viruses only:
  "Perfect" packed file virus detectors:    AVK,AVP,BDF,FSE,SCN
  "Excellent" packed file virus detectors:  DRW,RAV
  "Very Good" packed file virus detectors:  ---

Concerning detection of ALL packed macro viruses:
  "Perfect" packed macro virus detectors:   AVK,AVP,BDF,CMD,FPR,FPW,SCN
  "Excellent" packed macro virus detectors: DRW,RAV
  "Very Good" packed macro virus detectors: AVG,FSE,NAV

Remark: Much more data were collected on the precision and reliability
of virus detection in packed objects. But in the present state, it
seems NOT justified to add such differentiation to the results
discussed here.

***********************************************************************
Findings W2k.5: Concerning OVERALL detection of packed file AND
      macro viruses, 4 products are "perfect": AVK,AVP,BDF,SCN
      And 2 products are "excellent":          DRW,RAV
      One more product is "very good":         FSE
      *******************************************************
      Concerning detection of packed FILE viruses:
        5 products are "perfect":   AVK,AVP,BDF,FSE,SCN
        2 products are "excellent": DRW,RAV
      *****************************************************
      Concerning detection of packed MACRO viruses:
        7 products are "perfect":   AVK,AVP,BDF,CMD,FPR,FPW,SCN
        2 products are "excellent": DRW,RAV
        3 products are "very good": AVG,FSE,NAV
***********************************************************************
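The packed-virus grading reduces to counting, per product, the packers
for which ALL packed ITW samples are detected. A minimal Python sketch
(the input data is illustrative, not a measured result):

------------------------------------------------------------------------
# A minimal sketch of the packed-virus grading above: per packer, a
# product either detects ALL packed ITW samples (100%) or it does not;
# the grade counts the packers that are handled completely.

PACKERS = {"ARJ", "CAB", "LHA", "RAR", "ZIP"}   # WinRAR excluded, see above

def packed_grade(fully_detected):
    # fully_detected: set of packers with 100% detection of packed samples
    n = len(fully_detected & PACKERS)
    if n == len(PACKERS): return "perfect"     # 100% for all 5 packers
    if n >= 4:            return "excellent"   # 100% for at least 4 packers
    if n >= 3:            return "very good"   # 100% for at least 3 packers
    return "below very good"

# Illustrative: a product handling every packer except CAB
print(packed_grade({"ARJ", "LHA", "RAR", "ZIP"}))   # -> excellent
------------------------------------------------------------------------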
Eval W2k.06: Avoidance of False Alarms (File, Macro) under W-2000:
==================================================================

First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the file and macro virus
testbeds to determine the ability of scanners to avoid False-Positive
(FP) alarms. This ability is essential for "excellent" and "very good"
scanners, as there is no automatic aid for customers to handle such
cases (besides the psychological impact on a customer's work).
Therefore, the grid used for grading AV products must be significantly
more rigid than the one used for detection.

The following grid is applied to classify scanners (a sketch follows
the findings box below):
  - False Positive rate = 0.0% : scanner is graded "perfect"
  - False Positive rate < 0.5% : scanner is graded "excellent"
  - False Positive rate < 2.5% : scanner is graded "very good"
  - False Positive rate < 5.0% : scanner is graded "good enough"
  - False Positive rate <10.0% : scanner is graded "rather bad"
  - False Positive rate <20.0% : scanner is graded "very bad"
  - False Positive rate >20.0% : scanner is graded "useless"

It is good to observe that ALL 19 scanners avoid FP alarms on clean
files almost completely (18 products without any false alarm, one more
below 0.5%). Concerning clean macro objects, however, only 8 (out of
19) products are "perfect" in avoiding any alarm, and 2 more products
are "excellent" as they alert only on a single sample (<0.5%).

Overall, the following products did not issue any false alarm:
---------------------------------------------------------------------
"Perfect" file-FP AND macro-FP avoiding W2k products:
                 AVA,AVG,BDF,INO,PRO,NAV,SCN,VSP
---------------------------------------------------------------------
"Perfect" file-FP avoiding W2k products:
                 AVA,AVG,AVK,AVP,BDF,CMD,FPR,FPW,FSE,IKA,
                 INO,MR2,NAV,NVC,PRO,RAV,SCN,VSP
"Excellent" file-FP avoiding W2k product:
                 DRW
---------------------------------------------------------------------
"Perfect" macro-FP avoiding W2k products:
                 AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
"Excellent" macro-FP avoiding W2k products:
                 AVK,RAV
---------------------------------------------------------------------

********************************************************************
Findings W2k.06: Avoidance of False-Positive alarms is rather well
      developed, at least for file-FP avoidance.
      8 overall FP-avoiding "perfect" W2k scanners:
                 AVA,AVG,BDF,INO,PRO,NAV,SCN,VSP
      ***************************************************
      Concerning file-FP avoidance, 18 products are
      "perfect": AVA,AVG,AVK,AVP,BDF,CMD,FPR,FPW,FSE,IKA,
                 INO,MR2,NAV,NVC,PRO,RAV,SCN,VSP
      And 1 more product is "excellent": DRW
      ***************************************************
      Concerning macro-FP avoidance, 8 products are
      "perfect": AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
      And 2 more products are "excellent": AVK,RAV
********************************************************************
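The FP grid above grades the share of clean objects on which a scanner
alerts. A minimal Python sketch; the testbed size used in the example
is illustrative, not the actual VTC count:

------------------------------------------------------------------------
# A minimal sketch of the FP grading: the false-positive rate is the
# percentage of clean objects on which a scanner raises an alarm.

def fp_grade(alarms, clean_objects):
    rate = 100.0 * alarms / clean_objects
    if rate == 0.0: return "perfect"
    if rate <  0.5: return "excellent"
    if rate <  2.5: return "very good"
    if rate <  5.0: return "good enough"
    if rate < 10.0: return "rather bad"
    if rate < 20.0: return "very bad"
    return "useless"

print(fp_grade(0, 300))   # no alarm on 300 clean objects -> perfect
print(fp_grade(1, 300))   # one alarm, ~0.33%             -> excellent
------------------------------------------------------------------------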
Eval W2k.07: Detection of File, Macro and Script Malware under W-2k
===================================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category is
that customers are interested in also being warned about and protected
from non-viral and non-wormy malicious objects such as trojans etc.,
the payload of which may be disastrous to their work (e.g. stealing
passwords). Since VTC test "1999-03", malware detection is a mandatory
part of VTC tests, both for submitted products and for those
downloaded as free evaluation copies. A growing number of scanners is
indeed able to detect non-viral malware.

The following grid (admittedly with reduced granularity) is applied
to classify detection of file and macro malware:
  - detection rate =100%     : scanner is "perfect"
  - detection rate > 90%     : scanner is "excellent"
  - detection rate of 80-90% : scanner is "very good"
  - detection rate of 60-80% : scanner is "good enough"
  - detection rate of < 60%  : scanner is "not good enough"

Generally, detection of malware needs significant further development,
as mean detection rates over platforms show:
  mean detection rate for file malware:   73.5% (73.5% for scanners >10%)
                      for macro malware:  84.9% (89.6% for scanners >10%)
                      for script malware: 51.0% (53.8% for scanners >10%)

Concerning file, macro AND script malware detection:
------------------------------------------------------------
"Perfect" file/macro/script malware detectors under W2k:  ---
------------------------------------------------------------
"Excellent" file/macro/script malware detectors under W2k:
   FSE  ( 99.3%  100%  97.4%)
   AVK  ( 98.7%  100%  95.7%)
   AVP  ( 98.7%  100%  95.7%)
   SCN  ( 92.8%  100%  98.3%)
------------------------------------------------------------
"Very Good" file/macro/script malware detector under W2k:
   RAV  ( 86.1% 98.3%  92.1%)
------------------------------------------------------------

Concerning only file malware detection:
----------------------------------------------------------
"Perfect" file malware detectors under W2k:    ---
"Excellent" file malware detectors under W2k:  FSE (99.3%),
   AVK,AVP (98.7%), SCN (92.8%), FPR,FPW (91.3%), CMD (91.0%)
"Very Good" file malware detectors under W2k:  RAV (86.1%), INO (82.4%)
----------------------------------------------------------

Concerning only macro malware detection:
-------------------------------------------------------------
"Perfect" macro malware detectors under W2k:   AVK,AVP,FSE,SCN (100%)
"Excellent" macro malware detectors under W2k: CMD,FPR,FPW,RAV (99.3%),
   NVC (98.2%), INO (93.8%), NAV (92.9%), BDF (91.8%), IKA (91.6%),
   DRW (91.1%), AVA (90.7%)
"Very Good" macro malware detectors under W2k: AVG (80.2%)
-------------------------------------------------------------

And concerning script malware detection only:
-------------------------------------------------------------
"Perfect" script malware detectors under W2k:   ---
"Excellent" script malware detectors under W2k: SCN (98.3%),
   FSE (97.4%), AVK,AVP (95.7%), NAV (91.5%)
"Very Good" script malware detector under W2k:  RAV (82.1%)
-------------------------------------------------------------

*******************************************************************
Findings W2k.7: File/Macro/Script malware detection under W2k is
      less developed compared to the last test:
      0 products are "perfect":   ---
      4 products are "excellent": FSE,AVK,AVP,SCN
      1 product is "very good":   RAV
      ***************************************************
      Concerning only file malware detection,
      0 products are "perfect":   ---
      7 products are "excellent": FSE,AVK,AVP,SCN,FPR,FPW,CMD
      2 products are "very good": RAV,INO
      ***************************************************
      Concerning only macro malware detection,
      4 products are "perfect":    AVK,AVP,FSE,SCN
      11 products are "excellent": CMD,FPR,FPW,RAV,NVC,INO,NAV,
                                   BDF,IKA,DRW,AVA
      1 product is "very good":    AVG
      ***************************************************
      Concerning only script malware detection,
      0 products are "perfect":   ---
      5 products are "excellent": SCN,FSE,AVK,AVP,NAV
      1 product is "very good":   RAV
*******************************************************************

Eval W2k.SUM: Grading of W-2000 products:
=========================================

Under the scope of VTC's grading system, a "Perfect W2k AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99% of
    zoo samples, in ALL categories (file, macro and script viruses),
    always with the same high precision of identification and in
    every infected sample,
 2) Will detect ALL ITW viral samples in compressed objects for all
    (5) popular packers, and
 3) Will NEVER issue a False-Positive alarm on any sample which is
    not viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) Will be a "Perfect AntiVirus product", that is:
       100% ITW detection AND >99% zoo detection
       AND high precision of identification
       AND high precision of detection
       AND 100% detection of ITW viruses in compressed objects
       AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    at least in unpacked forms, reliably at high rates (>90%).

*****************************************************************
In VTC test "2002-12", we found  *** NO perfect W2k AV product ***
             and we found        *** NO perfect W2k AM product ***
*****************************************************************

But several products seem to approach our definitions on a rather high
level (taking into account "perfect" defined on the 100% level, and
"excellent" defined by >99% for virus detection and >90% for malware
detection):

Test category:            "Perfect"              "Excellent"
-----------------------------------------------------------------
W2k file ITW test:        NAV,AVK,AVP,DRW,       INO,RAV
                          FSE,SCN
W2k macro ITW test:       AVK,AVP,DRW,FSE,       AVG,BDF,CMD,FPR,FPW,
                          INO,NAV,SCN            NVC,RAV,AVA,IKA,PRO
W2k script ITW test:      AVG,AVK,AVP,CMD,DRW,   IKA,INO
                          FPR,FPW,FSE,MR2,NAV,
                          NVC,RAV,SCN
-----------------------------------------------------------------
W2k file zoo test:        ---                    AVP,FSE,AVK,SCN
W2k macro zoo test:       SCN                    AVK,AVP,FPR,FPW,FSE,
                                                 CMD,INO,RAV,NVC,NAV,
                                                 DRW,BDF
W2k script zoo test:      ---                    SCN,FSE,AVK
-----------------------------------------------------------------
W2k file pack test:       AVK,AVP,BDF,FSE,SCN    DRW,RAV
W2k macro pack test:      AVK,AVP,BDF,CMD,FPR,   DRW,RAV
                          FPW,SCN
W2k file FP avoidance:    AVA,AVG,AVK,AVP,BDF,   DRW
                          CMD,FPR,FPW,FSE,IKA,
                          INO,MR2,NAV,NVC,PRO,
                          RAV,SCN,VSP
W2k macro FP avoidance:   AVA,AVG,BDF,INO,       AVK,RAV
                          PRO,NAV,SCN,VSP
-----------------------------------------------------------------
W2k file malware test:    ---                    FSE,AVK,AVP,SCN,
                                                 FPR,FPW,CMD
W2k macro malware test:   AVK,AVP,FSE,SCN        CMD,FPR,FPW,RAV,NVC,
                                                 INO,NAV,BDF,IKA,DRW,AVA
W2k script malware test:  ---                    SCN,FSE,AVK,AVP,NAV
-----------------------------------------------------------------

In order to support the race for more customer protection, we evaluate
the order of performance in this W2k test with a simple algorithm:
counting placements in the table above, weighting "perfect" twice and
"excellent" once (a sketch of this counting scheme is given at the end
of this file). This yields, for the first places:

************************************************************
"Perfect" W-2000 AntiVirus product:  =NONE=  (maximum 20 points)
"Excellent" W-2000 AV products:
   1st place:  SCN              (18 points)
   2nd place:  AVK              (16 points)
   3rd place:  AVP              (14 points)
   4th place:  FSE              (13 points)
   5th place:  NAV              (11 points)
   6th place:  DRW              (10 points)
   7th place:  BDF,INO,RAV      ( 9 points)
  10th place:  FPR              ( 8 points)
  11th place:  AVG              ( 7 points)
  12th place:  CMD,FPW          ( 6 points)
  14th place:  NVC,AVA          ( 5 points)
  16th place:  IKA,MR2,PRO,VSP  ( 4 points)
************************************************************
"Perfect" W-2000 AntiMalware product:  =NONE=  (maximum 26 points)
"Excellent" W-2000 AntiMalware products:
   1st place:  SCN              (22 points)
   2nd place:  AVK              (20 points)
   3rd place:  AVP              (18 points)
   4th place:  FSE              (17 points)
   5th place:  NAV              (13 points)
   6th place:  DRW              (11 points)
   7th place:  BDF,FPR,INO,RAV  (10 points)
  11th place:  CMD,FPW          ( 8 points)
  13th place:  NVC              ( 6 points)
  14th place:  IKA              ( 5 points)
************************************************************
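The point totals above follow the counting scheme stated in the text:
2 points per "perfect" placement and 1 point per "excellent" placement
in the summary table, over the 10 virus-related categories for the AV
ranking (maximum 20 points) and all 13 categories for the AM ranking
(maximum 26 points). A minimal Python sketch, with SCN's placements
read off the summary table:

------------------------------------------------------------------------
# A minimal sketch of the ranking computation used above.

def points(placements):
    # placements: one entry per test category: "perfect", "excellent"
    # or None (neither grade reached)
    weight = {"perfect": 2, "excellent": 1}
    return sum(weight.get(p, 0) for p in placements)

# SCN's AV placements in this test, from the summary table:
scn_av = [
    "perfect", "perfect", "perfect",      # file/macro/script ITW
    "excellent", "perfect", "excellent",  # file/macro/script zoo
    "perfect", "perfect",                 # file/macro pack
    "perfect", "perfect",                 # file/macro FP avoidance
]
print(points(scn_av))   # -> 18, matching SCN's first place
------------------------------------------------------------------------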