=========================================
File 7EVALW98.TXT
-----------------------------------------
Evaluation of results for File, Macro and Script Virus/Malware
detection under Windows-98 in aVTC Test "2002-12":
=========================================
Formatted with non-proportional font (Courier)

Content of this file:
************************************************************************
Eval W98:     Development of detection rates under Windows-98
************************************************************************
Eval W98.01:  Development of W-98 Scanner Detection Rates
              Table W98-A1: File Virus Detection Rate in last 8 VTC tests
              Table W98-A2: Macro/Script Virus Detection Rate in last 8 VTC tests
Eval W98.02:  In-The-Wild Detection under W-98
Eval W98.03:  Evaluation of overall W-98 AV detection rates (zoo, ITW)
Eval W98.04:  Evaluation of detection by virus classes under W-98
              W98.04.1  Grading the detection of file viruses under W-98
              W98.04.2  Grading the detection of macro viruses under W-98
              W98.04.3  Grading the detection of script viruses under W-98
Eval W98.05:  Detection of Packed File and Macro Viruses under W-98
Eval W98.06:  Avoidance of False Alarms (File, Macro) under W-98
Eval W98.07:  Detection of File, Macro and Script Malware under W-98
Eval W98.SUM: Grading of W-98 products
************************************************************************

This part of the VTC "2002-12" test report evaluates the detailed results
as given in the following section (file):
    6FW98.TXT    File/Macro Virus/Malware results under W-98

The following *18* products participated in this scanner test for W-98:
------------------------------------------------------------------
Products submitted for aVTC test under Windows-98:
------------------------------------------------------------------
AVA  v(def): CLI Lguard32, version 3.0    Eng: 3.0.406.14
             sig-date: Dec.13,2001
AVG  v(def): 6.0.309   date: Dec.13,2001  sig: 170
             sig-date: Dec.17,2001
AVK  v(def): 10,1,0,0  sig: 10.0.435      sig-date: Dec.13,2001
AVP  v(def): 3.55.160.0  date: Dec.12,2001
BDF  v(def): v6.3.6 w98
CMD  v(def): 4.62.4    date: Aug.11,2001  Eng: 3.55.160.3203
             Sig: Sign.def                sig-date: Dec.17,2001
             Sig: Sign2.def               sig-date: Dec.17,2001
             Sig: Macro.def               sig-date: Dec.17,2001
DRW  v(def): 4.26 (DrWeb32.txt)           sig-date: Sept.25,2001
             Sig: (drwtoday.vbd)          sig-date: Dec.17,2001
FPR  v(def): 3.11b
             Sig: (binary viruses)        sig-date: Dec.17,2001
             Sig: (macro viruses)         sig-date: Dec.16,2001
FPW  v(def)/sig: ---
FSE  v(def)/sig: ---
INO  v(def): Eng: 49.00  date: Dec.14,2001  sig-date: Dec.17,2001
MR2  v(def): 1.20                         sig-date: Dec.12,2001
NAV  v(def): 7.60.926  Eng: 4.1.0.15  Sig: rev.3
             sig-date: Dec.14,2001
NVC  v(def): 5.00.36
             Sig: (binary viruses)        sig-date: Dec.17,2001
             Sig: (macro viruses)         sig-date: Dec.16,2001
PRO  v(def): 7.1.C04   date: Dec.17,2001  sig-date: Dec.17,2001
RAV  v(def): 8.3.1 command line for Win32 i386  Eng: 8.5 for i386
             sig-date: Dec.17,2001
SCN  v(def): 4.1.60    Sig: 4177          sig-date: Dec.17,2001
VSP  v(def): 12.34.1   date: Dec.17,2001  sig-date: Dec.17,2001
------------------------------------------------------------------


Eval W98.01: Development of Scanner Detection Rates under Windows-98:
=====================================================================

The following tables summarize the results of file virus (A1) and of
macro and script virus (A2) detection under Windows-98 in the last 8
aVTC tests.
Table W98-A1: File Virus Detection Rate in last 7 VTC tests under W-98:
=======================================================================
Scan    --------------- File Virus Detection ---------------
ner       9810   9903   9909   0004   0104   0212    DELTA
--------------------------------------------------------------
ACU         -      -      -      -      -      -       -
ADO         -      -      -      -    99.9     -       -
AN5         -      -    87.2     -      -      -       -
ANT       91.3     -    86.5   92.8     -      -       -
ANY         -      -      -      -      -      -       -
ATR         -      -      -      -      -      -       -
AVA/3     96.6   97.6   97.2   97.5   95.2   96.2     +1.0
AVG         -    87.3   87.0   85.4   81.9   80.6     -1.3
AVK       99.6   90.8   99.8   99.7   99.8   99.9     +0.1
AVP       99.9   99.9   99.8   99.9   99.9   100~     +0.~
BDF=AVX     -    74.2   75.7   77.4     -    82.9       -
CLE         -      -      -      -     0.1     -       -
CMD         -      -    98.4   99.6   97.8   98.5     +0.7
DSS/DSE   99.9   99.9     *    99.8   99.9     -       -
DRW/DWW     -    89.5   98.3   96.7   98.5   98.3     -0.2
ESA         -      -      -    58.0     -      -       -
FPR/FMA     -    93.9   99.4   99.7   97.8   98.8     +1.0
FPW         -      -    99.2   99.6   97.8   98.8     +1.0
FSE       99.8  100.0   99.9  100.0   99.7   100~     +0.3
FWN         -      -      -      -      -      -
HMV         -      -      -      -      -      -
IBM       92.8     *      *      *      *      *       *
INO       93.5   98.1   97.1   98.7   97.9   98.7     +0.8
IRS       96.7   97.6     -      -      -      -       -
ITM         -    64.2     -      -      -      -       -
IVB         -      -      -      -      -      -       -
MKS         -      -      -      -      -      -       -
MR2         -      -    65.9     -    50.1    1.3    -48.8% (!)
NAV         -    96.8   97.6   96.8   93.9   11.6    -82.3% (!)
NOD         -    97.6   98.3   98.3     -      -       -
NV5         -      -    99.0     -      -      -       -
NVC       93.6   97.0   99.0   99.1   98.1   97.8     -0.3
PAV       98.4   99.9   99.6  100.0   99.7     -       -
PCC         -    81.2     -      -      -      -       -
PER         -      -      -      -      -      -       -
PRO         -    37.3   39.8   44.6   69.9   69.5     -0.4
QHL         -      -      -      -      -      -       -
RAV       84.9     -    86.9   86.5   93.6   96.7     +3.1
SCN       86.6   99.8   99.7  100.0   99.9   99.8     -0.1
SWP       98.4     -    99.0   99.6     -      -       -
TBA       92.6     *      *      *      *      *       *
TSC         -    55.3   53.8     -      -      -       -
VBS         -      -      -      -      -      -       -
VBW         -    26.5     -      -      -      -       -
VET         -    66.3     *      *      *      *       *
VSP         -    86.4   79.7   78.1   64.9    4.9    -60.0% (!)
--------------------------------------------------------------
Mean      95.0   84.2   89.7   91.6   87.4   79.7%    -0.0% (without !)
Mean (rate>10%):                             89.3%      -
--------------------------------------------------------------
Remarks: For abbreviations of products (code names), see appendix
         A5CodNam.txt.
(!) Results of 3 products - MR2, NAV and VSP - are influenced by the
    fact that these products crashed 3 times after having scanned only
    a minor part of the file testbed.
(*) Products no longer available.
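Remark on the summary rows: the DELTA column corresponds to the difference
between the two most recent test results (0212 minus 0104), and the "Mean"
row averages over the products that delivered a result in the given test.
The snippet below is a minimal illustrative sketch of that arithmetic, not
part of the VTC tooling; the function names are assumptions, and the example
values are taken from the 0212 column above (with "100~" treated as 100.0).

```python
# Minimal sketch of how the summary rows of Table W98-A1 can be recomputed.
# DELTA: change between the two most recent tests; "Mean": average over the
# products that delivered a result in a given test.

def delta(previous, current):
    """Change between the last two tests; None if either result is missing."""
    if previous is None or current is None:
        return None
    return round(current - previous, 1)

def mean(rates, threshold=0.0):
    """Average over all available results above `threshold` (in percent)."""
    values = [r for r in rates if r is not None and r > threshold]
    return round(sum(values) / len(values), 1) if values else None

print(delta(95.2, 96.2))   # AVA row: +1.0

# 0212 results of the 18 products in this test (100~ taken as 100.0):
results_0212 = [96.2, 80.6, 99.9, 100.0, 82.9, 98.5, 98.3, 98.8, 98.8, 100.0,
                98.7, 1.3, 11.6, 97.8, 69.5, 96.7, 99.8, 4.9]
print(mean(results_0212))                  # ~79.7 (all 18 products)
print(mean(results_0212, threshold=10.0))  # ~89.3 (products with rate > 10%)
```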
Table W98-A2: Comparison: Macro/Script Virus Detection Rate in last 7 aVTC tests under W-98:
===========================================================================================
Scan   ---------------- Macro Virus Detection -----------------  + -- Script Virus Detection --
ner     9810  9903  9909  0004  0008  0104  0110  0212  DELTA    I  0008  0104  0110  0212  DELTA
------------------------------------------------------------------+------------------------------
ACU       -   97.6    -     -     -     -     -     -     -      I    -     -     -     -     -
ADO       -     -     -     -     -   99.9    -     -     -      I    -   99.8    -     -     -
AN5       -     -   89.3    -     -     -     -     -     -      I    -     -     -     -     -
ANT     84.3    -   89.5  90.2  96.4    -   97.4    -     -      I  55.2    -   81.8    -     -
ANY     70.7    -     -     -     -     -     -     -     -      I    -     -     -     -     -
ATR       -     -     -     -     -     -     -     -     -      I    -    2.7    -     -     -
AVA/3   96.7  95.9  93.9  94.3  94.1  95.7  97.7  97.8  +0.1     I  15.0  30.0  33.7  31.5  -2.2
AVG       -   82.5  96.6  97.5  97.9  98.3  98.4  98.1  -0.3     I    -   57.9  62.9  63.9  +1.0
AVK     99.6  99.6 100.0  99.9  100~  100~  100%  100~  -0.~     I  91.2  99.8  100%  99.0  -1.0
AVP    100.0  99.2 100.0  99.9  100~  100~  100~  100~   0.0     I  88.2  99.8  100%  98.9  -1.1
BDF=AVX   -     -   98.7  94.5  99.0    -   99.1  99.0  -0.1     I  61.4    -   70.1  72.4  +2.3
CLE       -     -     -     -     -    0.0    -     -     -      I   4.2   6.3    -     -     -
CMD       -     -   99.6 100.0  100%  100%  100~  99.9  -0.~     I  93.5  96.9  93.9  89.1  -4.8
DRW       -   98.3  98.8  98.4    -   98.0  99.5  99.4  -0.1     I    -   95.6  95.4  94.7  -0.7
DSE    100.0 100.0    *  100.0  100%  99.9  97.8    -     -      I  95.8  100%  73.0    -     -
ESA       -     -     -   88.9    -     -     -     -     -      I    -     -     -     -     -
FPR     92.4  99.8  99.7 100.0    -   100%  100~  100~   0.0     I    -   96.9  94.6  88.7  -5.9
FPW       -     -   99.9 100.0  100%  100%  100~  100~   0.0     I  90.8  96.9  94.6  88.7  -5.9
FSE    100.0 100.0 100.0 100.0  100%  100%  100%  100~   0.0     I  96.7  100%  100%  99.5  -0.5
FWN     99.6  99.7  99.9  99.8    -     -     -     -     -      I    -     -     -     -     -
HMV       -   99.5    -     -     -     -     -     -     -      I    -     -     -     -     -
IBM     94.5    *     *     *     *     *     *     *     *      I    *     *     *     *     *
INO     88.1  99.8  98.1  99.7  99.8  99.7  99.9  99.9   0.0     I  78.1  92.7  95.1  94.7  -0.4
IRS     99.0  99.5    -     -     -     -     -     -     -      I    -     -     -     -     -
ITM       -     -     -     -     -     -     -     -     -      I    -     -     -     -     -
IVB     92.8  95.0    -     -     -     -     -     -     -      I    -     -     -     -     -
MKS       -     -     -   97.1    -   44.2    -     -     -      I    -     -     -     -     -
MR2       -     -   64.9    -     -     -   40.8  37.9  -2.9     I    -   85.1  83.3  81.0  -2.3
NAV     95.3  99.7  98.7  98.0  97.7  97.0  99.5  99.8  +0.3     I  36.6  65.5  94.2  97.0  +2.8
NOD       -   99.8 100.0  99.4    -     -     -     -     -      I    -     -     -     -     -
NV5       -     -   99.6    -     -     -     -     -     -      I    -     -     -     -     -
NVC       -   99.1  99.6  99.9  99.9  99.8  99.8  99.8   0.0     I  83.7  88.5  91.3  87.6  -3.7
PAV     99.5  99.5  86.7  99.9  100~  99.5  100%    -     -      I  90.2  99.8  100%    -     -
PCC       -   98.0    -     -     -     -     -     -     -      I    -     -     -     -     -
PER       -     -     -   53.7  67.2  68.5    -     -     -      I  18.0  22.0    -     -     -
PRO       -   58.0  61.9  67.4  69.1  67.1    -   72.7    -      I  12.1  40.7    -   59.8    -
QHL       -     -     -    0.0    -    0.0   0.0    -     -      I   6.9    -     -     -     -
RAV     92.2    -   98.1  97.9  96.9  99.6  99.5  99.9  +0.4     I  47.1  84.9  82.5  96.1 +13.6
SCN     97.7 100.0  99.8 100.0  100%  100%  100%  100%   0.0     I  95.8  100%  99.8  99.6  -0.2
SWP     98.6    -   98.5  98.6    -     -     -     -     -      I    -     -     -     -     -
TBA     98.7    *     *     *     *     *     *     *     *      I    -     -     -     -     -
TSC       -   76.5  64.9    -     -     -     -     -     -      I    -     -     -     -     -
VBS     41.5    -     -     -     -     -     -     -     -      I    -     -     -     -     -
VBW     93.4    -     -     -     -     -     -     -     -      I    -     -     -     -     -
VET       -   97.6    *     *     *     *     *     *     *      I    -     -     -     -     -
VSP       -    0.4   0.3    -     -    0.0   0.0   0.~   0.~     I    -   85.3  84.0  81.2  -2.8
------------------------------------------------------------------+------------------------------
Mean    92.1  90.3  93.5  95.0  95.6  84.7  87.1  89.1% -0.~%     I  61.0  76.0  86.5  84.6% -0.9%
Without extreme low detectors:              96.3  94.4%   -       I                    84.6%   -
------------------------------------------------------------------+------------------------------

Concerning zoo file viruses, the "mean" detection rate is reduced to 79.7%
(down from 87.4%); this result is influenced by the fact that one formerly
successful product - NAV - crashed 3 times without producing significant
detection results, whereas other products were rather stable (mean
deviation: +-0.0%).
Concerning zoo file virus detection, NO product detects ALL viruses, but
4 products detect ALMOST all viruses and samples (>99%) and are rated
"excellent": AVP, FSE (both: 100~), AVK (99.9%), SCN (99.8%).

Concerning zoo macro viruses, the "mean" detection rate slightly increased
(from 87.1% to 89.1%) but is still insufficient, although 6 products detect
all or almost all macro viruses. Concerning zoo macro virus detection, ONE
product detects ALL viruses: SCN (100%). In addition, 12 products detect
ALMOST all viruses and samples (>99%) and are rated "excellent":
AVK, AVP, FPR, FPW, FSE (all: 100~), CMD, INO, RAV (all: 99.9%),
NVC, NAV (99.8%), DRW (99.4%), BDF (99.0%).

Concerning zoo script viruses, the "mean" detection rate is slightly reduced
(from 86.5% to 84.6%) and is still at an unacceptably low level. Concerning
zoo script virus detection, NO product detects ALL viruses, but 3 products
detect ALMOST all viruses and samples (>99%) and are rated "excellent":
SCN (99.6%), FSE (99.5%), AVK (99.0%).

****************************************************************
Findings W98.1: Mean detection rates for file and script viruses have
                decreased, while detection rates for macro viruses show
                a slight improvement. Mean detection rates remain
                unacceptably low:
                   mean file zoo virus detection rate:  79.7%
                   mean macro virus detection rate:     89.1%
                   mean script virus detection rate:    84.6%
                ------------------------------------------------
                Concerning file virus detection:
                  NO product detects ALL file viruses
                  4 products detect ALMOST all viruses (>99%) and
                    are rated "excellent": AVP,FSE,AVK,SCN
                ------------------------------------------------
                Concerning macro virus detection only:
                  1 product detects ALL macro zoo viruses and is
                    rated "perfect": SCN
                  12 products detect >99% of macro zoo viruses and
                    are rated "excellent": AVK,AVP,FPR,FPW,FSE,CMD,
                    INO,RAV,NVC,NAV,DRW,BDF
                -------------------------------------------------
                Concerning script virus detection:
                  NO product detects ALL script viruses
                  3 products detect ALMOST all viruses (>99%) and
                    are rated "excellent": SCN,FSE,AVK
****************************************************************


Eval W98.02: In-The-Wild (File/Macro/Script) Detection under W-98
=================================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is 100% : scanner is "perfect"
   - detection rate is >99% : scanner is "excellent"
   - detection rate is >95% : scanner is "very good"
   - detection rate is >90% : scanner is "good"
   - detection rate is <90% : scanner is "risky"

100% detection of In-the-Wild viruses - especially detection of ALL
instantiations (samples) of those viruses - is now an ABSOLUTE REQUIREMENT,
also for macro and script viruses (it must be observed that detection and
identification are not completely reliable).

6 products are "perfect" in detecting ALL ITW file, macro and script
viruses, and 2 more products detect ALMOST all ITW viruses (>99%).
The following table lists all products which detect ITW viruses with at
least 99% detection rates:

                           ITW virus/file detection
                           ( FileV.      MacroV.     ScriptV. )
----------------------------------
"Perfect" W98 ITW scanners:
      AVK  (100% 100%;  100% 100%;  100% 100%)
      AVP  (100% 100%;  100% 100%;  100% 100%)
      DRW  (100% 100%;  100% 100%;  100% 100%)
      FSE  (100% 100%;  100% 100%;  100% 100%)
      NAV  (100% 100%;  100% 100%;  100% 100%)
      SCN  (100% 100%;  100% 100%;  100% 100%)
-----------------------------------
"Excellent" W98 ITW scanners:
      INO  (100% 99.8%; 100% 100%;  100% 99.2%)
      RAV  (100% 99.1%; 100% 99.8%; 100% 100%)
-----------------------------------

Concerning only detection of ITW file viruses, 6 scanners are "perfect"
in detecting ALL viruses and ALL samples: AVK,AVP,DRW,FSE,NAV,SCN.
And 2 more detect ALL viruses in almost all samples: INO (99.8%), RAV (99.1%).

Concerning only detection of ITW macro viruses, 7 scanners are "perfect"
in detecting ALL viruses and ALL samples: AVK,AVP,DRW,FSE,INO,NAV,SCN.
And 9 more products detect ALL viruses in almost all samples:
AVG,BDF,CMD,FPR,FPW,NVC (all: 99.9%), RAV (99.8%), AVA (99.6%), PRO (99.3%).

Concerning only detection of ITW script viruses, 12 scanners are "perfect"
in detecting ALL viruses and ALL samples:
AVG,AVK,AVP,CMD,DRW,FPR,FPW,FSE,NAV,NVC,RAV,SCN.
And 1 more product detects ALL viruses in almost all samples: INO (99.2%).

**************************************************************
Findings W98.2: 6 AV products (out of 18) detect ALL ITW file, macro
                and script viruses in ALL samples and are rated
                "perfect":  AVK,AVP,DRW,FSE,NAV,SCN
                2 more products detect ALL ITW viruses in ALMOST all
                samples (>99%) and are rated "excellent":  INO,RAV
                *********************************************
                Concerning ITW file virus detection only,
                6 products are "perfect" as they detect ALL viruses
                in ALL samples:  AVK,AVP,DRW,FSE,NAV,SCN
                2 products detect ALL ITW viruses in ALMOST all
                samples (>99%) and are rated "excellent":  INO,RAV
                *********************************************
                Concerning ITW macro virus detection only,
                7 products detect ALL viruses in ALL files:
                   AVK,AVP,DRW,FSE,INO,NAV,SCN
                9 products detect ALL ITW viruses in ALMOST all
                samples (>99%) and are rated "excellent":
                   AVG,BDF,CMD,FPR,FPW,NVC,RAV,AVA,PRO
                *********************************************
                Concerning ITW script virus detection only,
                12 products detect ALL viruses in ALL files:
                   AVG,AVK,AVP,CMD,DRW,FPR,FPW,FSE,NAV,NVC,RAV,SCN
                1 product detects ALL ITW viruses in ALMOST all
                samples (>99%) and is rated "excellent":  INO
**************************************************************


Eval W98.03: Evaluation of overall W-98 AV detection rates (zoo, ITW)
====================================================================

The following grid is applied to classify scanners:
   - detection rate  = 100%   : scanner is graded "perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script virus
detection, for unpacked objects), the lowest of the related results is
used to classify each scanner; see the sketch after this paragraph.
Only scanners for which all tests were completed are considered.
(For problems in test: see 8problms.txt.)
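The grading rule above can be read as a small threshold classifier over the
detection rates. The following is a minimal illustrative sketch, not part of
the original VTC tooling; the function names, the example call and the exact
handling of band boundaries (e.g. "80-90%") are assumptions made here. The
same pattern, with different thresholds, also covers the False-Positive grid
in Eval W98.06 and the malware grid in Eval W98.07.

```python
# Minimal sketch of the VTC-style grading grid (illustrative only).
# Boundary handling of the "80-90%" style bands is an assumption.

def grade_detection(rate: float) -> str:
    """Map a detection rate (in percent) to a VTC grade."""
    if rate == 100.0: return "perfect"
    if rate >  99.0:  return "excellent"
    if rate >  95.0:  return "very good"
    if rate >  90.0:  return "good"
    if rate >= 80.0:  return "good enough"
    if rate >= 70.0:  return "not good enough"
    if rate >= 60.0:  return "rather bad"
    if rate >= 50.0:  return "very bad"
    return "useless"

def overall_grade(file_rate: float, macro_rate: float, script_rate: float) -> str:
    """Overall AV grade: the lowest of the zoo results determines the grade."""
    return grade_detection(min(file_rate, macro_rate, script_rate))

# Example with rates taken from the tables above (SCN: 99.8 / 100 / 99.6):
print(overall_grade(99.8, 100.0, 99.6))   # -> "excellent"
```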
The following list indicates those scanners graded into one of the upper
three categories, with file, macro and script virus detection rates for
unpacked samples, and with perfect ITW virus detection (rate = 100%).

Under W-98, NO product reached a 100% detection rate for file, macro and
script viruses, BOTH ZOO and ITW (and there, for ALL files). But 3 scanners
are graded "excellent" (>99%), and 2 more scanners are rated "very good"
(>95%):

                    (zoo: file/macro/script;   ITW: file/macro/script)
--------------------------------------------------
"Perfect" W-98 zoo scanners:
               ================ NONE =============
--------------------------------------------------
"Excellent" W-98 zoo scanners:
      SCN      (99.8%  100%  99.6%;  100% 100% 100%)
      FSE      ( 100~  100~  99.5%;  100% 100% 100%)
      AVK      (99.9%  100~  99.0%;  100% 100% 100%)
--------------------------------------------------
"Very Good" W-98 zoo scanners:
      AVP      ( 100~  100~  98.9%;  100% 100% 100%)
      RAV      (96.7% 99.9%  96.1%;  100% 100% 100%)
--------------------------------------------------

********************************************************************
Findings W98.3:  NO W98 product is overall rated "perfect":  ---
                 3 products are rated "excellent":  SCN,FSE,AVK
                 2 products are rated "very good":  AVP,RAV
*********************************************************************


Eval W98.04: Evaluation of detection by virus classes under W-98:
=================================================================

Some scanners are specialised in detecting some class of viruses, either
by deliberately limiting themselves to one class (esp. macro viruses) or
by detecting one class significantly better than others. It is therefore
worth noting which scanners perform best in detecting file, macro and
script viruses; in all cases, 100% detection of ITW viruses in all files
is required for a product to be graded. Products rated "perfect" (=100%),
"excellent" (>99%) and "very good" (>95%) are listed (where ITW virus
detection must be 100%).
W98.04.1 Grading the Detection of file viruses under W98
--------------------------------------------------------
"Perfect" W98 file scanners:      === NONE ===
"Excellent" W98 file scanners:    AVP   ( 100~ )
                                  FSE   ( 100~ )
                                  AVK   ( 99.9%)
                                  SCN   ( 99.8%)
"Very Good" W98 file scanners:    FPR   ( 98.8%)
                                  FPW   ( 98.8%)
                                  INO   ( 98.7%)
                                  CMD   ( 98.5%)
                                  DRW   ( 98.3%)
                                  NVC   ( 97.8%)
                                  RAV   ( 96.7%)
                                  AVA   ( 96.2%)

W98.04.2 Grading the Detection of macro viruses under W98
---------------------------------------------------------
"Perfect" W98 macro scanners:     SCN   (100.0%)
"Excellent" W98 macro scanners:   AVK   ( 100~ )
                                  AVP   ( 100~ )
                                  FPR   ( 100~ )
                                  FPW   ( 100~ )
                                  FSE   ( 100~ )
                                  CMD   ( 99.9%)
                                  INO   ( 99.9%)
                                  RAV   ( 99.9%)
                                  NAV   ( 99.8%)
                                  NVC   ( 99.8%)
                                  DRW   ( 99.4%)
                                  BDF   ( 99.0%)
"Very Good" W98 macro scanners:   AVG   ( 98.1%)
                                  AVA   ( 97.8%)

W98.04.3 Grading the Detection of script viruses under W98:
-----------------------------------------------------------
"Perfect" W98 script scanners:    === NONE ===
"Excellent" W98 script scanners:  SCN   ( 99.6%)
                                  FSE   ( 99.5%)
                                  AVK   ( 99.0%)
"Very Good" W98 script scanners:  AVP   ( 98.9%)
                                  NAV   ( 97.0%)
                                  RAV   ( 96.1%)

***********************************************************************
Finding W98.4:  Performance of W98 scanners by virus classes:
                Perfect   scanners for file zoo:   ---
                Excellent scanners for file zoo:   AVP,FSE,AVK,SCN
                Very Good scanners for file zoo:   FPR,FPW,INO,CMD,
                                                   DRW,NVC,RAV,AVA
                Perfect   scanners for macro zoo:  SCN
                Excellent scanners for macro zoo:  AVK,AVP,FPR,FPW,FSE,
                                                   CMD,INO,RAV,NAV,NVC,
                                                   DRW,BDF
                Very Good scanners for macro zoo:  AVG,AVA
                Perfect   scanners for script zoo: ---
                Excellent scanners for script zoo: SCN,FSE,AVK
                Very Good scanners for script zoo: AVP,NAV,RAV
***********************************************************************


Eval W98.05: Detection of Packed File and Macro Viruses under W-98
==================================================================

Detection of file and macro viruses within packed objects becomes essential
for on-access scanning, esp. for incoming email possibly loaded with
malicious objects. It seems therefore reasonable to test whether at least
ITW viral objects compressed with 6 popular methods (PKZIP, ARJ, LHA, RAR,
WinRAR and CAB) are also detected. Tests are performed only on In-The-Wild
viruses packed once (no recursive packing). As the last test showed that
AV products are rather far from perfect detection of packed viruses, the
testbed has essentially been left unchanged to ease comparison and to make
improvements visible.

ATTENTION: for packing objects in the ITW testbeds, we used WinRAR 2.0.
As WinRAR 2.0 did not properly pack VTC's very large file testbed, this
testbed was packed with WinRAR 2.9, which at that time had been available
in its final version (after a longer period of beta versions) for more
than 3 months. Only upon evaluation did we detect that ONLY ONE product
(RAV) was able to handle WinRAR 2.9 packed malware at all, at least to
some degree (though not sufficiently for the grade "perfect").
Consequently, this evaluation does NOT include WinRAR. The following
evaluation includes: ARJ, CAB, LHA, RAR, ZIP.
Concerning overall detection of BOTH file and macro virus samples:

A "perfect" product would detect ALL packed viral samples (100%) for
all (5) packers:
   ---------------------------------------------
   "Perfect" packed virus detectors:    AVK,AVP,BDF,SCN
   ---------------------------------------------
An "excellent" product would reach 100% detection of packed viruses for
at least 4 packers:
   -----------------------------------------------
   "Excellent" packed virus detector:   DRW
   -----------------------------------------------
A "very good" product would reach 100% detection of packed viruses for
at least 3 packers:
   -------------------------------------------------
   "Very Good" packed virus detector:   ---
   -------------------------------------------------

Concerning detection of packed file viruses only:
   "Perfect" packed file virus detectors:     AVK,AVP,BDF,SCN
   "Excellent" packed file virus detectors:   DRW,RAV
   "Very Good" packed file virus detectors:   ---

Concerning detection of ALL packed macro viruses:
   "Perfect" packed macro virus detectors:    AVK,AVP,BDF,CMD,FPR,FPW,SCN
   "Excellent" packed macro virus detectors:  DRW
   "Very Good" packed macro virus detectors:  AVG,FSE

Remark: Much more data were collected on the precision and reliability of
virus detection in packed objects. But in the present state, it seems NOT
justified to add such differentiation to the results discussed here.

***********************************************************************
Findings W98.5: Concerning OVERALL detection of packed file AND macro
                viruses, 4 products are "perfect":  AVK,AVP,BDF,SCN
                And 1 product is "excellent":       DRW
                No product is "very good":          ---
                *******************************************************
                Concerning detection of packed FILE viruses:
                4 products are "perfect":    AVK,AVP,BDF,SCN
                2 products are "excellent":  DRW,RAV
                *****************************************************
                Concerning detection of packed MACRO viruses:
                7 products are "perfect":    AVK,AVP,BDF,CMD,FPR,FPW,SCN
                1 product is "excellent":    DRW
                2 products are "very good":  AVG,FSE
***********************************************************************


Eval W98.06: Avoidance of False Alarms (File/Macro) under W-98:
===============================================================

First introduced in aVTC test "1998-10", a set of clean (and non-malicious)
objects has been added to the file and macro virus testbeds to determine
the ability of scanners to avoid False-Positive (FP) alarms. This ability
is essential for "excellent" and "very good" scanners, as there is no
automatic aid for customers to handle such cases (besides the psychological
impact on the customer's work). Therefore, the grid used for grading AV
products must be significantly more rigid than the one used for detection.
The following grid is applied to classify scanners:
   - False Positive rate =  0.0% : scanner is graded "perfect"
   - False Positive rate <  0.5% : scanner is graded "excellent"
   - False Positive rate <  2.5% : scanner is graded "very good"
   - False Positive rate <  5.0% : scanner is graded "good enough"
   - False Positive rate < 10.0% : scanner is graded "rather bad"
   - False Positive rate < 20.0% : scanner is graded "very bad"
   - False Positive rate > 20.0% : scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, 8 (out of 18)
products in test reported NOT A SINGLE False Positive alarm, both in the
file and in the macro zoo testbeds, and are therefore rated "perfect":
---------------------------------------------------------------------
"Perfect" file-FP AND macro-FP avoiding W98 products:
      AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
---------------------------------------------------------------------
"Perfect" file-FP avoiding W98 products:
      AVA,AVG,AVK,AVP,BDF,CMD,FPR,FPW,FSE,INO,MR2,NAV,NVC,PRO,RAV,SCN,VSP
"Excellent" file-FP avoiding W98 products:
      DRW
---------------------------------------------------------------------
"Perfect" macro-FP avoiding W98 products:
      AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
"Excellent" macro-FP avoiding W98 products:
      AVK,RAV
---------------------------------------------------------------------

*******************************************************************
Findings W98.6: Avoidance of False-Positive alarms is rather well
                developed, at least for file-FP avoidance.
                8 overall FP-avoiding "perfect" W98 scanners:
                   AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
                ***************************************************
                Concerning file-FP avoidance, 17 (of 18) products
                are "perfect":  AVA,AVG,AVK,AVP,BDF,CMD,FPR,FPW,FSE,
                                INO,MR2,NAV,NVC,PRO,RAV,SCN,VSP
                1 product is "excellent":  DRW
                ***************************************************
                Concerning macro-FP avoidance, 8 products are
                "perfect":  AVA,AVG,BDF,INO,NAV,PRO,SCN,VSP
                And 2 more products are "excellent":  AVK,RAV
********************************************************************


Eval W98.07: Detection of File, Macro and Script Malware under W-98
====================================================================

Since test "1997-07", aVTC also tests the ability of AV products to detect
non-viral malware. An essential argument for this category is that
customers are interested in also being warned about, and protected from,
non-viral and non-worm malicious objects such as trojans, the payload of
which may be disastrous to their work (e.g. stealing passwords). Since
aVTC test "1999-03", malware detection is a mandatory part of aVTC tests,
both for submitted products and for those downloaded as free evaluation
copies. A growing number of scanners is indeed able to detect non-viral
malware.
The following grid (admittedly with reduced granularity) is applied to
classify detection of file, macro and script malware:
   - detection rate  = 100%   : scanner is "perfect"
   - detection rate  >  90%   : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate  <  60%   : scanner is "not good enough"

Generally, detection of malware needs significant further development,
as the mean detection rates show:
   mean detection rate for file   malware: 75.0% (75.0% for scanners >10%)
                       for macro  malware: 84.5% (89.5% for scanners >10%)
                       for script malware: 51.4% (54.4% for scanners >10%)

In this test, NO product "perfectly" detects all (non-viral) malware in
ALL 3 categories (file, macro and script malware) of the aVTC testbeds.

Concerning file, macro AND script malware detection:
------------------------------------------------------------
"Perfect" file/macro/script malware detectors under W98:    ---
------------------------------------------------------------
"Excellent" file/macro/script malware detectors under W98:
      FSE   ( 99.1%  100%   97.4%)
      AVK   ( 98.7%  100%   95.7%)
      AVP   ( 98.7%  100%   95.7%)
      SCN   ( 92.8%  100%   98.3%)
------------------------------------------------------------
"Very Good" file/macro/script malware detector under W98:
      RAV   ( 86.1%  99.3%  82.1%)
------------------------------------------------------------

Concerning only file malware detection:
----------------------------------------------------------
"Perfect" file malware detectors under W98:    ---
"Excellent" file malware detectors under W98:
      FSE (99.1%), AVK,AVP (98.7%), SCN (92.8%),
      FPR,FPW (91.3%), CMD (91.0%)
"Very Good" file malware detectors under W98:
      RAV (86.1%), INO (83.0%)
----------------------------------------------------------

Concerning only macro malware detection:
----------------------------------------------------------
"Perfect" macro malware detectors under W98:
      AVK,AVP,FSE,SCN (100%)
"Excellent" macro malware detectors under W98:
      CMD,FPR,FPW,RAV (99.3%), NVC (98.2%), INO (93.8%),
      NAV (93.1%), BDF (91.8%), DRW (91.1%), AVA (90.7%)
"Very Good" macro malware detectors under W98:
      AVG (80.2%)
-----------------------------------------------------------

And concerning only script malware detection:
-------------------------------------------------------------
"Perfect" script malware detectors under W98:    ---
"Excellent" script malware detectors under W98:
      SCN (98.3%), FSE (97.4%), AVK,AVP (95.7%), NAV (92.3%)
"Very Good" script malware detectors under W98:
      RAV (82.1%)
-------------------------------------------------------------

*******************************************************************
Findings W98.7: Generally, detection of malware is insufficient, as is
                indicated by the mean detection rates:
                   for file   malware: 75.0%
                   for macro  malware: 84.5%
                   for script malware: 51.4%
                ***************************************************
                Concerning overall malware detection (including
                file, macro AND script malware) under W98:
                0 products are "perfect":    ---
                4 products are "excellent":  FSE,AVK,AVP,SCN
                1 product is "very good":    RAV
                ***************************************************
                Concerning only file malware detection,
                0 products are "perfect":    ---
                7 products are "excellent":  FSE,AVK,AVP,SCN,FPR,FPW,CMD
                2 products are "very good":  RAV,INO
                ***************************************************
                Concerning only macro malware detection,
                4 products are "perfect":    AVK,AVP,FSE,SCN
                10 products are "excellent": CMD,FPR,FPW,RAV,NVC,INO,
                                             NAV,BDF,DRW,AVA
                1 product is "very good":    AVG
                ***************************************************
                Concerning only script malware detection,
                0 products are "perfect":    ---
                5 products are "excellent":  FSE,SCN,AVK,AVP,NAV
                1 product is "very good":    RAV
******************************************************************


Eval W98.SUM: Grading of W-98 products:
=======================================

Under the scope of aVTC's grading system, a "perfect W-98 AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) will detect ALL viral samples "In-The-Wild" AND at least 99% of zoo
    samples, in ALL categories (file, macro and script viruses), always
    with the same high precision of identification and in every infected
    sample,
 2) will detect ALL ITW viral samples in compressed objects for all (5)
    popular packers, and
 3) will NEVER issue a False-Positive alarm on any sample which is not
    viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) will be a "Perfect AntiVirus product", that is:
       100% ITW detection AND >99% zoo detection
       AND high precision of identification
       AND high precision of detection
       AND 100% detection of ITW viruses in compressed objects
       AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software, at
    least in unpacked form, reliably at high rates (>90%).
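Definitions (1) and (2) combine several threshold conditions; the sketch
below restates them as predicates for clarity. It is illustrative only:
the dataclass fields, the chosen rate representation (percent values per
category) and the strict ">99%" / ">90%" boundary readings are assumptions
made here, not part of the VTC evaluation scripts.

```python
# Illustrative sketch of Definitions (1) and (2) as predicates.
from dataclasses import dataclass

@dataclass
class ProductResult:
    itw_rates: dict         # e.g. {"file": 100.0, "macro": 100.0, "script": 100.0}
    zoo_rates: dict         # zoo detection per category, in percent
    packed_itw_rates: dict  # per packer (ARJ, CAB, LHA, RAR, ZIP), in percent
    fp_rate: float          # false-positive rate over the clean testbeds, in percent
    malware_rates: dict     # non-viral malware detection per category, in percent

def is_perfect_av(p: ProductResult) -> bool:
    """Definition (1): 100% ITW, >99% zoo, 100% packed ITW, no false positives."""
    return (all(r == 100.0 for r in p.itw_rates.values())
            and all(r > 99.0 for r in p.zoo_rates.values())
            and all(r == 100.0 for r in p.packed_itw_rates.values())
            and p.fp_rate == 0.0)

def is_perfect_am(p: ProductResult) -> bool:
    """Definition (2): a perfect AV product that also detects malware at >90%."""
    return is_perfect_av(p) and all(r > 90.0 for r in p.malware_rates.values())
```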
*******************************************************************
 In aVTC test "2002-12", we found  *** NO perfect W-98 AV product ***
                  and we found     *** NO perfect W-98 AM product ***
*******************************************************************

But several products approach these definitions on a rather high level
(taking into account "perfect" results, defined at the 100% level, and
"excellent" results, defined as >99% for virus detection and >90% for
malware detection):

Test category:           "Perfect"                 "Excellent"
-----------------------------------------------------------------
W98 file ITW test:       AVK,AVP,DRW,FSE,          INO,RAV
                         NAV,SCN
W98 macro ITW test:      AVK,AVP,DRW,FSE,          AVG,BDF,CMD,FPR,FPW,
                         INO,NAV,SCN               NVC,RAV,AVA,PRO
W98 script ITW test:     AVG,AVK,AVP,CMD,DRW,FPR,  INO
                         FPW,FSE,NAV,NVC,RAV,SCN
-----------------------------------------------------------------
W98 file zoo test:       ---                       AVP,FSE,AVK,SCN
W98 macro zoo test:      SCN                       AVK,AVP,FPR,FPW,FSE,CMD,
                                                   INO,RAV,NAV,NVC,DRW,BDF
W98 script zoo test:     ---                       SCN,FSE,AVK
-----------------------------------------------------------------
W98 file pack test:      AVK,AVP,BDF,SCN           DRW,RAV
W98 macro pack test:     AVK,AVP,BDF,CMD,          DRW
                         FPR,FPW,SCN
W98 file FP avoidance:   AVA,AVG,AVK,AVP,BDF,      DRW
                         CMD,FPR,FPW,FSE,INO,MR2,
                         NAV,NVC,PRO,RAV,SCN,VSP
W98 macro FP avoidance:  AVA,AVG,BDF,INO,NAV,      AVK,RAV
                         PRO,SCN,VSP
-----------------------------------------------------------------
W98 file malware test:   ---                       FSE,AVK,AVP,SCN,
                                                   FPR,FPW,CMD
W98 macro malware test:  AVK,AVP,FSE,SCN           CMD,FPR,FPW,RAV,NVC,
                                                   INO,NAV,BDF,DRW,AVA
W98 script malware test: ---                       FSE,SCN,AVK,AVP,NAV
-----------------------------------------------------------------

In order to support the race for more customer protection, we evaluate
the order of performance in this W-98 test with a simple algorithm:
counting places over all test categories, weighting a "perfect" entry
twice and an "excellent" entry once (see the sketch after the rankings
below). This gives the following first places:

************************************************************
"Perfect" W-98 AntiVirus product:      =NONE=      (20 points)

"Excellent" W-98 AntiVirus products:
     1st place:  SCN          (18 points)
     2nd place:  AVK          (16 points)
     3rd place:  AVP          (14 points)
     4th place:  FSE,NAV      (11 points)
     6th place:  BDF,DRW      (10 points)
     8th place:  INO,RAV      ( 9 points)
    10th place:  CMD,FPR,FPW  ( 8 points)
    13th place:  AVG          ( 7 points)
    14th place:  NVC          ( 6 points)
    15th place:  AVA,PRO      ( 5 points)
    17th place:  VSP          ( 4 points)
    18th place:  MR2          ( 2 points)
************************************************************
"Perfect" W-98 AntiMalware product:    =NONE=      (26 points)

"Excellent" W-98 AntiMalware products:
     1st place:  SCN                  (22 points)
     2nd place:  AVK                  (20 points)
     3rd place:  AVP                  (18 points)
     4th place:  FSE                  (15 points)
     5th place:  NAV                  (13 points)
     6th place:  BDF,DRW              (11 points)
     8th place:  CMD,FPR,FPW,INO,RAV  (10 points)
    13th place:  NVC                  ( 7 points)
    14th place:  AVA                  ( 6 points)
************************************************************
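The point totals above can be reproduced by a small tally over the summary
table: 2 points per "perfect" entry and 1 point per "excellent" entry, summed
over the 10 AV test categories (maximum 20 points), plus the 3 malware
categories for the AM ranking (maximum 26 points). The sketch below is an
illustration of that counting scheme; the data structure and the example
entry are assumptions made here, not part of the original evaluation scripts.

```python
# Illustrative sketch of the W-98 ranking scheme: a "perfect" entry counts
# 2 points, an "excellent" entry counts 1 point.

AV_CATEGORIES = [
    "file ITW", "macro ITW", "script ITW",
    "file zoo", "macro zoo", "script zoo",
    "file pack", "macro pack",
    "file FP avoidance", "macro FP avoidance",
]
MALWARE_CATEGORIES = ["file malware", "macro malware", "script malware"]

def score(results: dict, categories: list) -> int:
    """results maps category -> "perfect" / "excellent" / None (not graded)."""
    points = {"perfect": 2, "excellent": 1}
    return sum(points.get(results.get(cat), 0) for cat in categories)

# Example entry (grades for SCN taken from the summary table above):
scn = {
    "file ITW": "perfect", "macro ITW": "perfect", "script ITW": "perfect",
    "file zoo": "excellent", "macro zoo": "perfect", "script zoo": "excellent",
    "file pack": "perfect", "macro pack": "perfect",
    "file FP avoidance": "perfect", "macro FP avoidance": "perfect",
    "file malware": "excellent", "macro malware": "perfect",
    "script malware": "excellent",
}
print(score(scn, AV_CATEGORIES))                       # 18 points (AV ranking)
print(score(scn, AV_CATEGORIES + MALWARE_CATEGORIES))  # 22 points (AM ranking)
```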