=========================================
File 7EVALDOS.TXT
-----------------------------------------
Evaluation of VTC Scanner Test "2002-12":
=========================================

Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval A: Development of detection rates under DOS:
-------------------------------------------------
Eval DOS.01: Development of DOS Scanner Detection Rates
   Table DOS-A1: File Virus Detection Rate in last 11 VTC tests
   Table DOS-A2: Macro/Script Virus Detection Rate in last 10 VTC tests
Eval DOS.02: In-The-Wild Detection under DOS
Eval DOS.03: Evaluation of overall DOS AV detection rates
Eval DOS.04: Evaluation of detection by virus classes under DOS
   DOS.04.1 Grading the Detection of file viruses under DOS
   DOS.04.2 Grading the Detection of macro viruses under DOS
   DOS.04.3 Grading the Detection of script viruses under DOS
Eval DOS.05: Detection of Packed (File, Macro) Viruses under DOS
Eval DOS.06: Avoidance of False Alarms (File, Macro) under DOS
Eval DOS.07: Detection of File, Macro and Script Malware under DOS
Eval DOS.SUM Grading of DOS products

**********************************************************************

This part of the VTC "2002-12" test report evaluates the detailed
results as given in section (file):
   6dDOS.TXT   File, Macro and Script Viruses/Malware results DOS

The following *13* products participated in this test for DOS products:

-------------------------------------------------
Products submitted for aVTC test under DOS:
-------------------------------------------------
AVA  v(def): V7.70
AVG  v(def): 6.0
AVP  v(def): 3.0 build 135
CMD  v(def): 4.62.4
DRW  v(def): 4.26
FPR  v(def): 3.11b
INO  v(def): 6.0 n(s)
MR2  v(def): 1.20
NAV  v(def): corporate edition 14.12.
NVC  v(def): 5.30.02
RAV  v(def): 8.1.001 engine 8.5
SCN  v(def): 4.16.0
VSP  v(def): 12.34.1
-------------------------------------------------

Eval DOS.01: Development of Scanner Detection Rates under DOS:
==============================================================

DOS-based scanners become less important as AV producers invest more
work into the development of the W32-related platforms; consequently,
the number of scanners available for tests was further reduced, to now
13 scanners.

   **************************************************
   Consequently, this will be VTC's last test for DOS!
   **************************************************

The following tables summarize the results of virus detection under
DOS in all VTC tests:
   Table DOS-A1: summary of file virus (zoo) detection
   Table DOS-A2: summary of macro and script virus (zoo) detection

Table DOS-A1: File Virus Detection Rate in 10 VTC tests under DOS:
==================================================================
      ----------------- File (ZOO) Virus Detection ------------------
SCAN  9702  9707  9802  9810  9903  9909  0004  0104  0212  Delta
NER     %     %     %     %     %     %     %     %     %      %
-------------------------------------------------------------------------
ALE   98.8  94.1  89.4    -     -     -     -     -     -      -
ANT   73.4  80.6  84.6  75.7    -     -   92.8    -     -      -
AVA   98.9  97.4  97.4  97.9  97.6  97.4  97.5  95.2  96.9   +1.7
AVG   79.2  85.3  84.9  87.6  87.1  86.6    -   81.9   (*)     -
AVK     -     -     -   90.0  75.0    -     -   99.7    -      -
AVP   98.5  98.4  99.3  99.7  99.7  99.8  99.6  99.8  100~   +0.2
CMD     -     -     -     -     -     -   99.5    -   98.5     -
DRW   93.2  93.8  92.8  93.1  98.2  98.3    -     -    (*)     -
DSE   99.7  99.6  99.9  99.9  99.8    -     -     -     -      -
FMA     -     -     -     -     -     -     -     -     -      -
FPR   90.7  89.0  96.0  95.5  98.7  99.2  99.6  97.8  98.8   +1.0
FSE     -     -   99.4  99.7  97.6  99.3  99.9    -     -      -
FWN     -     -     -     -     -     -     -     -     -      -
HMV     -     -     -     -     -     -     -     -     -      -
IBM   93.6  95.2  96.5    -     -     -     -     -     -      -
INO     -     -   92.0  93.5  98.1  94.7  94.6  91.0  93.8   +2.8
IRS     -   81.4  74.2    -   51.6    -     -     -     -      -
ITM     -   81.0  81.2  65.8  64.2    -     -     -     -      -
IVB    8.3    -     -     -   96.9    -     -     -     -      -
MR2     -     -     -     -     -   65.4    -     -    (*)     -
NAV   66.9  67.1  97.1  98.1  77.2  96.0  93.3  90.8  98.4   +7.6
NOD     -     -     -   96.9    -   96.9  98.3    -     -      -
NVC   87.4  89.7  94.1  93.8  97.6    -   99.1    -    (*)     -
PAN     -     -   67.8    -     -     -     -     -     -      -
PAV     -   96.6  98.8    -   73.7  98.8  98.7  99.9    -      -
PCC     -     -     -     -     -     -     -     -     -      -
PCV   67.9    -     -     -     -     -     -     -     -      -
PRO     -     -     -     -   35.5    -     -     -     -      -
RAV     -     -     -   71.0    -     -     -     -   96.7     -
SCN   83.9  93.5  90.7  87.8  99.8  97.1  99.9  99.8  99.8    0.0
SWP   95.9  94.5  96.8  98.4    -   99.0  98.4    -     -      -
TBA   95.5  93.7  92.1  93.2    -     -     -     -     -      -
TSC     -     -   50.4  56.1  39.5  51.6    -     -     -      -
TNT   58.0    -     -     -     -     -     -     -     -      -
VDS     -   44.0  37.1    -     -     -     -     -     -      -
UKV     -     -     -     -     -     -     -     -     -      -
VET     -   64.9    -     -   65.3    -     -     -     -      -
VIT     -     -     -     -     -     -    7.6    -     -      -
VRX     -     -     -     -     -     -     -     -     -      -
VBS   43.1  56.6    -   35.5    -     -     -     -     -      -
VHU   19.3    -     -     -     -     -     -     -     -      -
VSA     -     -   56.9    -     -     -     -     -     -      -
VSP     -     -     -   76.1  71.7  79.6    -     -   61.6     -
VSW     -     -   56.9    -     -     -     -     -     -      -
VTR   45.5    -     -     -     -     -     -     -     -      -
XSC   59.5    -     -     -     -     -     -     -     -      -
-------------------------------------------------------------------------
Mean  74.2  84.8  84.4  85.4  81.2  90.6  98.3  95.1  93.5%  +2.2%
-------------------------------------------------------------------------
(*) For test problems, see 8problm.txt.
Table DOS-A2: Macro/Script Virus Detection Rate in last 9 VTC tests under DOS:
==============================================================================
     ------------------ Macro Virus Detection -----------------------+ - Script Virus Detection -
SCAN 9702 9707 9802 9810 9903 9909 0004 0008 0104 0110 0212 Delta I 0008 0104 0110 0212 Delta
NER    %    %    %    %    %    %    %    %    %    %    %     %  I    %    %    %    %    %
-----------------------------------------------------------------+--------------------------
ALE  96.5 66.0 49.8   -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
ANT  58.0 68.6 80.4 56.6   -    -  85.9 93.3   -  97.0   -     -  I 55.2   -  81.8   -    -
AVA  99.3 98.2 80.4 97.2 95.9 94.6 93.7   -  92.0 93.0 92.8  -0.2 I   -  30.0 33.7 31.5 -2.2
AVG  25.2 71.0 27.1 81.6 82.5 96.6   -    -  98.3 98.4 98.1  -0.3 I   -  57.9 62.9 63.9 +1.0
AVK    -    -    -  99.7 99.6   -    -  100~ 100~ 100%   -     -  I 91.5 99.4 100%   -    -
AVP  99.3 99.0 99.9 100% 99.8 100% 99.9   -  100~ 100% 100~   0.0 I   -  99.8 100% 98.9   -
CMD    -    -    -    -    -  99.5 100% 100~   -  100~ 99.9  -0.1 I 93.5   -  93.9 98.1   -
DRW  90.2 98.1 94.3 99.3 98.3   -  98.4 97.6 98.0 99.5 99.4  -0.1 I 60.8 95.6 95.4 94.7   -
DSE  97.9 98.9 100% 100% 100%   -    -    -    -    -    -     -  I   -    -    -    -    -
FMA  98.6 98.2 99.9   -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
FPR  43.4 36.1 99.9 99.8 99.8 99.7 100% 100~ 100% 100~ 100~   0.0 I 90.5 96.9 94.6 88.7 -5.9
FSE    -    -  99.9 90.1 99.6 97.6 99.9   -    -    -    -     -  I   -    -    -    -    -
FWN  97.2 96.4 91.0 85.7   -    -    -    -    -    -    -     -  I   -    -    -    -    -
HMV    -    -  98.2 99.0 99.5   -    -    -    -    -    -     -  I   -    -    -    -    -
IBM  65.0 88.8 99.6   -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
INO    -    -  90.3 95.2 99.8 99.5 99.7 99.7 99.3   -  99.9    -  I 77.8 66.0   -  94.7   -
IRS    -  69.5 48.2   -  89.1   -    -    -    -    -    -     -  I   -    -    -    -    -
ITM    -  81.8 58.2 68.6 76.3   -    -    -    -    -    -     -  I   -    -    -    -    -
IVB    -    -    -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
MR2    -    -    -    -    -  69.6   -    -  44.2 40.8 37.9  -2.9 I   -  85.1 83.3 81.0 -2.3
NAV  80.7 86.4 98.7 99.8 99.7 98.6 97.4 97.0 93.8 99.5 99.8  +0.3 I 24.8 31.2 94.2 97.0 +2.8
NOD    -    -    -    -  99.8 100% 99.4   -    -    -    -     -  I   -    -    -    -    -
NVC  13.3 96.6 99.2 90.8   -  99.6 99.9 99.9 99.8   -  99.8    -  I 83.7 88.5   -  87.6   -
PAN    -    -  73.0   -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
PAV    -    -  93.7 100% 99.5 98.8 99.9   -  100~ 100%   -     -  I   -  99.8 100%   -    -
PCC    -  67.6   -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
PCV    -    -    -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
PRO    -    -    -    -  81.5   -    -    -    -    -    -     -  I   -    -    -    -    -
RAV    -    -    -  99.5 99.2   -    -    -    -  99.5 99.9  +0.4 I   -    -  82.5 96.1 +13.6
SCN  95.1 97.6 99.0 98.6 100% 100% 100% 100~ 100% 100% 100%   0.0 I 85.6 100% 99.8 99.6 -0.2
SWP  87.4 89.1 98.4 98.6   -  98.4 98.4   -    -    -    -     -  I   -    -    -    -    -
TBA  72.0 96.1 99.5 98.7   -    -    -    -    -    -    -     -  I   -    -    -    -    -
TSC    -    -  81.9 76.5 59.5 69.6   -    -    -    -    -     -  I   -    -    -    -    -
TNT    -    -    -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
VDS  16.1  9.9  8.7   -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
UKV    -    -    -    -    -    -    -   0.0   -    -    -     -  I  0.0   -    -    -    -
VET    -  94.0 97.3 97.5 97.6   -    -    -    -    -    -     -  I   -    -    -    -    -
VIT    -    -    -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
VRX    -    -    -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
VBS    -    -    -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
VHU    -    -    -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
VSA    -    -  80.6   -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
VSP    -    -    -    -    -    -    -    -   0.0  0.0  0.0   0.0 I   -  85.3 84.0 81.2 -2.8
VSW    -    -  83.0   -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
XSC    -    -    -    -    -    -    -    -    -    -    -     -  I   -    -    -    -    -
-----------------------------------------------------------------+---------------------------
Mean 69.6 80.9 83.8 89.6 93.6 88.2 98.0 98.6 93.8 87.7 86.7 -0.3% I 66.4 79.7 86.2 84.9% 0.4%
Without extreme values:                           94.4 94.0%   -  I                84.9%   -
-----------------------------------------------------------------+---------------------------

Concerning file zoo viruses, the mean detection rate is FURTHER
DECLINING. After reaching its best mean value of 98.3% in April 2000,
the mean detection rate is now down by almost 5%, to 93.5%.

Concerning macro zoo viruses, the "mean" detection rate is also
FURTHER DECLINING, now down to 86.7% after having reached its maximum
of 98.6% in August 2000.

Concerning script zoo viruses, presently the fastest growing sector,
the detection rate ALSO DECLINES further, to an unacceptable 84.9%.
**********************************************************************
Findings DOS.1: Evidently, AV producers don't invest in DOS products.
      For ALL virus classes, detection rates are declining, with mean
      detection rates
          for file zoo viruses   down to 93.5%,
          for macro zoo viruses  down to 86.7%, and
          for script zoo viruses down to 84.9%.
      Such detection rates are unacceptably low.
      ----------------------------------------------------
      Concerning detection of file zoo viruses, NO product
      detects ALL viruses and is "perfect". But 2 products
      detect almost all (>99%) viruses and are "excellent":
                                                    AVP,SCN
      5 more products are "very good":
                                        AVA,CMD,FPR,NAV,RAV
      ----------------------------------------------------
      Concerning detection of macro zoo viruses, 1 product
      detects ALL viruses and is "perfect":            SCN
      8 more products detect >99% of macro zoo viruses and
      are rated "excellent":
                           AVP,FPR,CMD,INO,RAV,NAV,NVC,DRW
      ----------------------------------------------------
      Concerning detection of script zoo viruses, NO product
      detects ALL viruses and is "perfect".
      1 product detects >99% of viruses: "excellent":  SCN
      And 3 more products detect >95% of script viruses and
      are "very good":                         AVP,NAV,RAV
      ****************************************************
      Overall, NO product detects ALL file, macro and
      script viruses. But 1 product detects >99% of all
      file, macro and script viruses and is
      "Overall Perfect":                             SCN
********************************************************************

Eval DOS.02: In-The-Wild (Boot,File,Macro,Script) Detection under DOS
=====================================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is 100% : scanner is "perfect"
   - detection rate is >99% : scanner is "excellent"
   - detection rate is >95% : scanner is "very good"
   - detection rate is >90% : scanner is "good"
   - detection rate is <90% : scanner is "risky"

100% detection of In-the-Wild viruses, and especially detection of ALL
instantiations of those viruses, is now an ABSOLUTE REQUIREMENT for
macro and script viruses (it must be observed that detection and
identification are not completely reliable).

The following *3* DOS products (out of 13) detect 100% for ALL
platforms (boot, file, macro AND script), both of ALL viruses AND of
ALL infected objects, and are rated "perfect" in this category:

   "Perfect" ITW boot/file/macro/script scanners:
      ITW Viruses,Files (Boot       File       Macro      Script)
      -----------------------------------------------------------
      AVP   (100%,100%  100%,100%  100%,100%  100%,100%)
      NAV   (100%,100%  100%,100%  100%,100%  100%,100%)
      SCN   (100%,100%  100%,100%  100%,100%  100%,100%)
      -----------------------------------------------------------

Concerning detection of ITW boot viruses, 8 (out of 13) products are
"perfect" in detecting ALL ITW viruses and infected objects:
      AVP, CMD, DRW, FPR, NAV, NVC, SCN, VSP: all (100% 100%)

Concerning detection of ITW file viruses, 3 (out of 13) products are
"perfect" in detecting ALL ITW viruses and infected objects:
      AVP, NAV, SCN: all (100%, 100%)
And 2 more products detect ALL ITW file viruses in ALMOST ALL samples
(>99%) and are rated "excellent":
      INO (100% 99.8%), RAV (100% 99.1%)

Concerning detection of ITW macro viruses, 5 (out of 13) products are
"perfect" in detecting ALL ITW viruses and infected objects:
      AVP, DRW, INO, NAV, SCN: all (100% 100%)
And 5 more products detect ALL ITW macro viruses in ALMOST ALL samples
(>99%) and are rated "excellent":
      AVG,CMD,FPR,NVC: all (100% 99.8%); RAV (100% 99.8%)

Concerning ITW script detection, now 9 (of 13) scanners detect ALL ITW
script viruses in ALL infected objects:
      AVG,AVP,CMD,DRW,FPR,NAV,NVC,RAV,SCN: ALL (100% 100%)
And 1 more product detects ALL ITW script viruses but misses a few
samples:
      INO (100% 99.2%)

*************************************************************
Findings DOS.2: 3 AV products (out of 13) detect ALL In-The-Wild
      boot, file, macro and script viruses in ALL instantiations
      (files) and are rated "perfect":
                                   AVP,NAV,SCN
      --------------------------------------------
      8 AV products detect ALL ITW boot viruses in
      ALL samples and are rated "perfect":
                   AVP,CMD,DRW,FPR,NAV,NVC,SCN,VSP
      --------------------------------------------
      3 AV products detect ALL ITW file viruses and
      infected objects and are rated "perfect":
                                       AVP,NAV,SCN
      2 AV products detect ALL ITW file viruses but
      miss a few (<1%) files and are rated
      "excellent":                         INO,RAV
      --------------------------------------------
      5 products can be rated "perfect" concerning
      detection of ITW macro viruses:
                               AVP,DRW,INO,NAV,SCN
      5 AV products detect ALL ITW macro viruses but
      miss a few (<1%) files and are rated
      "excellent":             AVG,CMD,FPR,NVC,RAV
      --------------------------------------------
      Concerning detection of ITW script viruses,
      9 products are rated "perfect" as they detect
      ALL viruses in ALL samples:
               AVG,AVP,CMD,DRW,FPR,NAV,NVC,RAV,SCN
      And 1 product detects all ITW viruses but
      misses one sample and is "excellent":    INO
*************************************************************

Eval DOS.03: Evaluation of overall DOS AV detection rates (zoo,ITW)
===================================================================

The following grid is applied to classify scanners:
   - detection rate   =100%  : scanner is graded "perfect"
   - detection rate above 99%: scanner is graded "excellent"
   - detection rate above 95%: scanner is graded "very good"
   - detection rate above 90%: scanner is graded "good"
   - detection rate of 80-90%: scanner is graded "good enough"
   - detection rate of 70-80%: scanner is graded "not good enough"
   - detection rate of 60-70%: scanner is graded "rather bad"
   - detection rate of 50-60%: scanner is graded "very bad"
   - detection rate below 50%: scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script
virus detection, for unpacked objects), the lowest of the related
results is used to classify each scanner. Only scanners for which all
tests were completed are considered.
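As an aside, the grading grid above is simple enough to express as a
small lookup function. The following Python sketch is our own
illustration (it is not part of the VTC test tooling); it merely
encodes the thresholds listed above:

```python
def grade(rate):
    """Map an overall detection rate (in percent) to the VTC verbal grade."""
    if rate == 100.0:
        return "perfect"
    if rate > 99.0:
        return "excellent"
    if rate > 95.0:
        return "very good"
    if rate > 90.0:
        return "good"
    if rate >= 80.0:
        return "good enough"
    if rate >= 70.0:
        return "not good enough"
    if rate >= 60.0:
        return "rather bad"
    if rate >= 50.0:
        return "very bad"
    return "useless"

# E.g. SCN's 99.8% file zoo rate grades as "excellent",
# VSP's 61.6% as "rather bad".
```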
(For problems in the test, see 8problms.txt.)

The following list indicates those scanners graded into one of the
upper three categories, with file, macro and script virus detection
rates in unpacked samples, and with perfect ITW virus detection
(rate=100%).

Under DOS, NO product reached a 100% detection rate for file, macro
and script zoo viruses and was rated "perfect". But 1 scanner is
graded "excellent" (>99%), and 3 more scanners are rated "very good"
(>95%):

             (zoo:file/macro/script;  file/macro/script:ITW)
     ----------------------------------------------
     "Excellent" DOS scanners:
             SCN  (99.8%  100%   99.6%;  100% 100% 100%)
     ----------------------------------------------
     "Very Good" DOS scanners:
             AVP  (100~   100~   98.9%;  100% 100% 100%)
             NAV  (98.4%  99.8%  97.0%;  100% 100% 100%)
             RAV  (96.7%  99.9%  96.1%;  100% 100% 100%)
     ----------------------------------------------

*****************************************************************
Findings DOS.3:  NO "perfect" overall scanner:    ---
                 1 "excellent" overall scanner:   SCN
                 3 "very good" overall scanners:  AVP,NAV,RAV
*****************************************************************

Eval DOS.04: Evaluation of detection by virus classes under DOS:
================================================================

Some scanners are specialised in detecting some class of viruses
(either deliberately limiting themselves to one class, esp. macro
viruses, or detecting one class significantly better than others).
It is therefore worth noting which scanners perform best in detecting
macro and script viruses.
Products rated "perfect" (=100%), "excellent" (>99%) and "very good"
(>95%) with 100% ITW virus detection are listed:

DOS.04.1 Grading the Detection of file viruses under DOS
----------------------------------------------------------
   "Perfect" DOS file scanners:      ---
   "Excellent" DOS file scanners:    AVP  (100~)
                                     SCN  (99.8%)
   "Very Good" DOS file scanners:    FPR  (98.8%)
                                     CMD  (98.5%)
                                     NAV  (98.4%)
                                     AVA  (96.9%)
                                     RAV  (96.7%)

DOS.04.2 Grading the Detection of macro viruses under DOS
----------------------------------------------------------
   "Perfect" DOS macro scanners:     SCN  (100.0%)
   "Excellent" DOS macro scanners:   AVP  (100~)
                                     FPR  (100~)
                                     CMD  (99.9%)
                                     INO  (99.9%)
                                     RAV  (99.9%)
                                     NAV  (99.8%)
                                     NVC  (99.8%)
                                     DRW  (99.4%)
   "Very Good" DOS macro scanners:   AVG  (98.1%)

DOS.04.3 Grading the Detection of Script viruses under DOS:
-----------------------------------------------------------
   "Perfect" DOS script scanners:    ---
   "Excellent" DOS script scanners:  SCN  (99.6%)
   "Very Good" DOS script scanners:  AVP  (98.9%)
                                     NAV  (97.0%)
                                     RAV  (96.1%)

************************************************************************
Findings DOS.4: Performance of DOS scanners by virus classes:
   ---------------------------------------------
   Perfect   scanners for file zoo+ITW:   ---
   Excellent scanners for file zoo+ITW:   AVP,SCN
   Very Good scanners for file zoo+ITW:   FPR,CMD,NAV,AVA,RAV

   Perfect   scanners for macro zoo+ITW:  SCN
   Excellent scanners for macro zoo+ITW:  AVP,FPR,CMD,INO,RAV,NAV,NVC,DRW
   Very Good scanners for macro zoo+ITW:  AVG

   Perfect   scanners for script zoo+ITW: ---
   Excellent scanners for script zoo+ITW: SCN
   Very Good scanners for script zoo+ITW: AVP,NAV,RAV
************************************************************************

Eval DOS.05: Detection of Packed ITW File and Macro Viruses under DOS
=====================================================================

Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects.
It therefore seems reasonable to test whether at least ITW viral
objects compressed with 6 popular methods (PKZIP, ARJ, LHA, RAR,
WinRAR and CAB) are also detected.

ATTENTION: for packing objects in the ITW testbeds, we used WinRAR
2.0. As WinRAR 2.0 did not properly pack VTC's very large file
testbed, this testbed was packed with WinRAR 2.9, which at that time
had been available in its final version (after a longer availability
of beta versions) for more than 3 months. Only upon evaluation did we
detect that ONLY ONE product (RAV) was able to handle WinRAR
2.9-packed malware at all, at least to some degree (though not
sufficiently for the grade "perfect"). Consequently, this evaluation
does NOT include WinRAR. The following evaluation includes:
ARJ, CAB, LHA, RAR, ZIP.

Tests are performed only on In-The-Wild viruses packed once (no
recursive packing). As the last test showed that AV products are
rather far from perfect detection of packed viruses, the testbed has
remained essentially unchanged to ease comparison and improvement.
A "perfect" product would detect ALL packed ITW file/macro viruses
(100%) for 5 packers:
   -------------------------------------------------------
   "Perfect" packed file/macro virus detectors:    AVP,SCN
   -------------------------------------------------------
An "excellent" product would detect ALL packed ITW file/macro viruses
(100%) for 4 packers:
   -------------------------------------------------------
   "Excellent" packed file/macro virus detector:   DRW
   -------------------------------------------------------
A "very good" product would detect ALL packed ITW file/macro viruses
(100%) for 3 packers:
   -------------------------------------------------------
   "Very Good" packed file/macro virus detector:   RAV
   -------------------------------------------------------

Concerning detection of packed file ITW viruses:
   "Perfect" packed file virus detectors (5 packers):   AVP,SCN
   "Excellent" packed file virus detector (4 packers):  DRW
   "Very Good" packed file virus detector (3 packers):  RAV

Some products which failed to detect all packed ITW file viruses were
able to detect packed macro viruses at a significantly higher level:
   "Perfect" packed macro virus detectors (5 packers):  AVP,CMD,FPR,SCN
   "Excellent" packed macro virus detector (4 packers): DRW
   "Very Good" packed macro virus detectors (3 packers):AVG,RAV

Remark: Much more data were collected on the precision and reliability
of virus detection in packed objects. But in the present state, it
seems NOT justified to add differentiation to the results discussed
here.
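The packer-based grading above depends only on how many of the 5
packers a product handles with full detection. As an illustrative
Python sketch (our own, purely hypothetical; not VTC tooling):

```python
# The 5 packers actually evaluated (WinRAR excluded, see above).
PACKERS = {"ARJ", "CAB", "LHA", "RAR", "ZIP"}

def packed_grade(perfect_packers):
    """Grade a product by the packers for which it detects ALL packed
    ITW file/macro viruses (100%)."""
    n = len(set(perfect_packers) & PACKERS)
    if n == 5:
        return "perfect"
    if n == 4:
        return "excellent"
    if n == 3:
        return "very good"
    return "below very good"  # not listed in the report's top categories
```

For example, AVP and SCN reach 100% for all 5 packers and grade
"perfect", while DRW handles 4 and grades "excellent".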
***********************************************************************
Findings DOS.05: Detection of packed viral objects needs improvement:
     Perfect   packed ITW file/macro virus detectors:  AVP,SCN
     Excellent packed ITW file/macro virus detector:   DRW
     Very Good packed ITW file/macro virus detector:   RAV
     **************************************************************
     "Perfect" packed file virus detectors:    AVP,SCN
     "Excellent" packed file virus detector:   DRW
     "Very Good" packed file virus detector:   RAV
     **************************************************************
     "Perfect" packed macro virus detectors:   AVP,CMD,FPR,SCN
     "Excellent" packed macro virus detector:  DRW
     "Very Good" packed macro virus detectors: AVG,RAV
***********************************************************************

Eval DOS.06: Avoidance of False Alarms (File and Macro) under DOS:
==================================================================

First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the macro virus testbeds to
determine the ability of scanners to avoid False-Positive (FP) alarms.
This ability is essential for "excellent" and "very good" scanners, as
there is no automatic aid for customers to handle such cases (besides
the psychological impact on the customer's work). Therefore, the grid
used for grading AV products must be significantly more rigid than the
one used for detection.
The following grid is applied to classify scanners:
   - False Positive rate = 0.0%: scanner is graded "perfect"
   - False Positive rate < 0.5%: scanner is graded "excellent"
   - False Positive rate < 2.5%: scanner is graded "very good"
   - False Positive rate < 5.0%: scanner is graded "good enough"
   - False Positive rate <10.0%: scanner is graded "rather bad"
   - False Positive rate <20.0%: scanner is graded "very bad"
   - False Positive rate >20.0%: scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, 5 (out of 13)
products reported NO SINGLE False Positive alarm in the file AND macro
zoo testbeds and are therefore rated "perfect":
   ----------------------------------------------------------------
   "Perfect" FP avoiding DOS scanners:    AVA,INO,NAV,SCN,VSP
   "Excellent" FP avoiding DOS scanners:  ---
   "Very Good" FP avoiding DOS scanners:  AVP,CMD,FPR,RAV
   ----------------------------------------------------------------

In this category, products performed much better at avoiding false
positive alarms on file viruses (9 products) than on macro FPs
(6 products):
   --------------------------------------------------------------
   "Perfect" file -FP avoiding DOS scanners:
                       AVA,AVP,CMD,FPR,INO,NAV,RAV,SCN,VSP
   "Perfect" macro -FP avoiding DOS scanners:
                       AVA,AVG,INO,NAV,SCN,VSP
   --------------------------------------------------------------

******************************************************************
Findings DOS.6: Avoidance of False-Positive Alarms is improving,
      though still regarded as insufficient.
      Generally, for both file and macro, FP-avoiding
      "perfect" DOS scanners:      AVA,INO,NAV,SCN,VSP
      **************************************************
      "Perfect" file -FP avoiding DOS scanners:
                   AVA,AVP,CMD,FPR,INO,NAV,RAV,SCN,VSP
      "Perfect" macro -FP avoiding DOS scanners:
                   AVA,AVG,INO,NAV,SCN,VSP
******************************************************************

Eval DOS.07: Detection of File, Macro and Script Malware under DOS
==================================================================

Since test "1997-07", VTC has also tested the ability of AV products
to detect non-viral malware. An essential argument for this category
is that customers are interested in also being warned about and
protected from non-viral and non-wormy malicious objects, such as
trojans etc., the payload of which may be disastrous to their work
(e.g. stealing passwords). Since VTC test "1999-03", malware detection
is a mandatory part of VTC tests, both for submitted products and for
those downloaded as free evaluation copies. A growing number of
scanners is indeed able to detect non-viral malware.
The following grid (admittedly with reduced granularity) is applied to
classify detection of macro and script malware:
   - detection rate    =100% : scanner is "perfect"
   - detection rate    > 90% : scanner is "excellent"
   - detection rate of 80-90%: scanner is "very good"
   - detection rate of 60-80%: scanner is "good enough"
   - detection rate of < 60% : scanner is "not good enough"

Generally, detection of malware needs significant further development,
as the mean detection rates show:
   mean detection rate for file malware:   73.7% (80.9% for scanners >10%)
                       for macro malware:  81.9% (88.7% for scanners >10%)
                       for script malware: 49.7% (53.6% for scanners >10%)

Concerning file, macro and script malware detection:

                           Malware type: File/Macro/Script
--------------------------------------------------------------------------
"Perfect" file/macro/script malware detectors:   ---
"Excellent" file/macro/script malware detectors:
                                 AVP  ( 98.7%  100.0%  95.7%)
                                 SCN  ( 92.9%  100.0%  98.3%)
"Very Good" file/macro/script malware detector:
                                 RAV  ( 86.1%   99.3%  82.1%)
--------------------------------------------------------------------------

Concerning file malware detection only:

                                        File Malware
   --------------------------------------------------
   "Perfect" file malware detectors:    ---
   "Excellent" file malware detectors:  AVP  ( 98.7%)
                                        SCN  ( 92.9%)
                                        FPR  ( 90.9%)
                                        CMD  ( 90.6%)
   "Very Good" file malware detector:   RAV  ( 86.1%)
   --------------------------------------------------

Macro malware detection is MUCH better developed than file malware
detection:

                                        Macro Malware
   --------------------------------------------------
   "Perfect" macro malware detectors:   AVP  (100.0%)
                                        SCN  (100.0%)
   "Excellent" macro malware detectors: CMD  ( 99.3%)
                                        FPR  ( 99.3%)
                                        RAV  ( 99.3%)
                                        NVC  ( 98.2%)
                                        INO  ( 93.8%)
                                        NAV  ( 93.1%)
                                        DRW  ( 91.1%)
   "Very Good" macro malware detectors: AVA  ( 80.4%)
                                        AVG  ( 80.2%)
   --------------------------------------------------

Concerning script malware detection, much work in this fast-growing
area must still be invested:
                                         Script Malware
   ---------------------------------------------------
   "Perfect" script malware detectors:   ---
   "Excellent" script malware detectors: SCN  ( 98.3%)
                                         AVP  ( 95.7%)
                                         NAV  ( 92.3%)
   "Very Good" script malware detector:  RAV  ( 82.1%)
   ---------------------------------------------------

*******************************************************************
Findings DOS.7: File, Macro and Script Malware detection under DOS
     is only slowly improving. Mean detection rates are
     significantly lower than for virus detection:
          Mean file malware detection:   73.7%
          Mean macro malware detection:  81.9%
          Mean script malware detection: 49.7%
     ***************************************************
     NO product detects ALL file, macro and script
     malware samples and is "perfect":            ---
     2 products are rated "excellent":        AVP,SCN
     1 product is rated "very good":              RAV
     ***************************************************
     Concerning file malware detection:
     NO product is rated "perfect":               ---
     4 products are rated "excellent":  AVP,SCN,FPR,CMD
     1 product is rated "very good":              RAV
     ***************************************************
     Concerning macro malware detection:
     2 products detect ALL macro malware samples
     and are rated "perfect":                 AVP,SCN
     7 products are rated "excellent":
                          CMD,FPR,RAV,NVC,INO,NAV,DRW
     2 products are rated "very good":        AVA,AVG
     ***************************************************
     Concerning script malware detection:
     NO product detects ALL script malware samples
     and can be rated "perfect":                  ---
     3 products are rated "excellent":    AVP,NAV,SCN
     1 product is rated "very good":              RAV
******************************************************************

Eval DOS.SUM: Grading of DOS products:
======================================

Under the scope of VTC's grading system, a "Perfect DOS AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
  1) Will detect ALL viral samples "In-The-Wild" AND at least 99% of
     zoo samples, in ALL categories (macro and script-based viruses),
     always with the same high precision of identification and in
     every infected sample,
  2) Will detect ALL ITW viral samples in compressed objects for all
     (5) popular packers, and
  3) Will NEVER issue a False Positive alarm on any sample which is
     not viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
  1) Will be a "Perfect AntiVirus product", that is:
        100% ITW detection AND
        >99% zoo detection AND
        high precision of identification AND
        high precision of detection AND
        100% detection of ITW viruses in compressed objects AND
        0% False-Positive rate,
  2) AND it will also detect essential forms of malicious software,
     at least in unpacked forms, reliably at high rates (>90%).

********************************************************************
In VTC test "2002-12", we found  **** NO perfect DOS AV product ****
                 AND we found    **** NO perfect DOS AM product ****
********************************************************************

But several products seem to approach our definition on a rather high
level (taking into account the highest value of "perfect" defined on
the 100% level, and "excellent" defined by 99% for virus detection and
90% for malware detection):

Test category:           "Perfect"               "Excellent"
-----------------------------------------------------------------
DOS boot ITW test:       AVP,CMD,DRW,FPR,
                         NAV,NVC,SCN,VSP
DOS file ITW test:       AVP,NAV,SCN             INO,RAV
DOS macro ITW test:      AVP,DRW,INO,NAV,SCN     AVG,CMD,FPR,NVC,RAV
DOS script ITW test:     AVG,AVP,CMD,DRW,FPR,    INO
                         NAV,NVC,RAV,SCN
-----------------------------------------------------------------
DOS file zoo test:       ---                     AVP,SCN
DOS macro zoo test:      SCN                     AVP,FPR,CMD,INO,
                                                 NAV,NVC,DRW,RAV
DOS script zoo test:     ---                     SCN
-----------------------------------------------------------------
DOS file pack test:      AVP,SCN                 DRW
DOS macro pack test:     AVP,CMD,FPR,SCN         DRW
DOS file FP avoidance:
                         AVA,AVP,CMD,FPR,INO,    ---
                         NAV,RAV,SCN,VSP
DOS macro FP avoidance:  AVA,AVG,INO,NAV,SCN,VSP ---
------------------------------------------------------------------
DOS file malware test:   ---                     AVP,SCN,FPR,CMD
DOS macro malware test:  AVP,SCN                 CMD,FPR,RAV,
                                                 NVC,INO,NAV,DRW
DOS script malware test: ---                     AVP,NAV,SCN
------------------------------------------------------------------

In order to support the race for more customer protection, we evaluate
the order of performance in this DOS test with a simple algorithm:
counting placements, weighting "perfect" twice and "excellent" once,
for the first places:

************************************************************
"Perfect" DOS AntiVirus product:    =NONE=  (22 points)
"Excellent" DOS AV products:
             1st place:  SCN        (20 points)
             2nd place:  AVP        (16 points)
             3rd place:  NAV        (13 points)
             4th place:  CMD,FPR    (10 points)
             6th place:  DRW,INO    ( 9 points)
             8th place:  RAV        ( 7 points)
             9th place:  NVC,VSP    ( 6 points)
            11th place:  AVG        ( 5 points)
            12th place:  AVA        ( 4 points)
************************************************************
"Perfect" DOS AntiMalware product:  =NONE=  (28 points)
"Excellent" DOS AntiMalware products:
             1st place:  SCN        (24 points)
             2nd place:  AVP        (20 points)
             3rd place:  NAV        (15 points)
             4th place:  CMD,FPR    (12 points)
             6th place:  DRW,INO    (10 points)
             8th place:  RAV        ( 8 points)
             9th place:  NVC        ( 7 points)
************************************************************
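For illustration, the AV point tally can be reproduced from the
summary table above with a few lines of Python. This is our own
sketch (the data structure and function names are ours; the
per-category grades are copied from the table):

```python
# Per-category AV results copied from the summary table above:
# {category: (set of "perfect" products, set of "excellent" products)}
AV_RESULTS = {
    "boot ITW":   ({"AVP", "CMD", "DRW", "FPR", "NAV", "NVC", "SCN", "VSP"},
                   set()),
    "file ITW":   ({"AVP", "NAV", "SCN"}, {"INO", "RAV"}),
    "macro ITW":  ({"AVP", "DRW", "INO", "NAV", "SCN"},
                   {"AVG", "CMD", "FPR", "NVC", "RAV"}),
    "script ITW": ({"AVG", "AVP", "CMD", "DRW", "FPR", "NAV", "NVC", "RAV", "SCN"},
                   {"INO"}),
    "file zoo":   (set(), {"AVP", "SCN"}),
    "macro zoo":  ({"SCN"},
                   {"AVP", "FPR", "CMD", "INO", "NAV", "NVC", "DRW", "RAV"}),
    "script zoo": (set(), {"SCN"}),
    "file pack":  ({"AVP", "SCN"}, {"DRW"}),
    "macro pack": ({"AVP", "CMD", "FPR", "SCN"}, {"DRW"}),
    "file FP":    ({"AVA", "AVP", "CMD", "FPR", "INO", "NAV", "RAV", "SCN", "VSP"},
                   set()),
    "macro FP":   ({"AVA", "AVG", "INO", "NAV", "SCN", "VSP"}, set()),
}

def tally(results):
    """Weight each "perfect" grade with 2 points, each "excellent" with 1."""
    scores = {}
    for perfect, excellent in results.values():
        for product in perfect:
            scores[product] = scores.get(product, 0) + 2
        for product in excellent:
            scores[product] = scores.get(product, 0) + 1
    return scores

scores = tally(AV_RESULTS)
# Reproduces the AV ranking above, e.g. SCN: 20, AVP: 16, NAV: 13 points.
```

Adding the three malware categories in the same way yields the
AntiMalware ranking (maximum 28 points instead of 22).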