=========================================
File 7EVALDOS.TXT
Evaluation of VTC Scanner Test "2001-10":
=========================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval A: Development of detection rates under DOS:
-------------------------------------------------
Eval DOS.01: Development of DOS Scanner Detection Rates
             Table DOS-A2: Macro/Script Virus Detection Rate
                           in last 10 VTC tests
Eval DOS.02: In-The-Wild Detection under DOS
Eval DOS.03: Evaluation of overall DOS AV detection rates
Eval DOS.04: Evaluation of detection by virus classes under DOS
             DOS.04.2 Grading the Detection of macro viruses under DOS
             DOS.04.3 Grading the Detection of script viruses under DOS
Eval DOS.05: Detection of Packed Macro Viruses under DOS
Eval DOS.06: Avoidance of False Alarms (Macro,Script) under DOS
Eval DOS.07: Detection of Macro and Script Malware under DOS
Eval DOS.SUM: Grading of DOS products

**********************************************************************

This part of the VTC "2001-10" test report evaluates the detailed
results as given in section (file):
    6dDOS.TXT   Macro+Script Viruses/Malware results DOS

The following *14* products participated in this test for DOS:
-----------------------------------------------------------------
Products submitted for aVTC test under DOS:
-----------------------------------------------------------------
ANT  v(def): 6.8.0.56                     sig: June 22,2001
AVA  v(def): 7.70-53                      sig: June 25,2001
AVG  v(def):                              sig: June   ,2001
AVK  v(def): 3.0 Build 133                sig: June  4,2001
AVP  v(def): 3.0 build 135                sig: June 22,2001
CMD  v(def): 4.61.5                       sig: June 25,2001
DRW  v(def): 4.25                         sig: June 20,2001
FPR  v(def): 3.09d                        sig: June 25,2001
MR2  v(def): 1.17                         sig: June   ,2001
NAV  v(def):                              sig: June 22,2001
PAV  v(def): 3.0 Build 131                sig: June  8,2001
RAV  v(def): 8.1.001                      sig: June 25,2001
SCN  v(def): 4.14.0  scan eng: 4.1.40     sig: June 20,2001
VSP  v(def): 12.22.1                      sig: June   ,2001
-----------------------------------------------------------------

Eval DOS.01: Development of Scanner Detection Rates under DOS:
==============================================================

DOS-based scanners are becoming less important as AV producers invest
more work in the development of W32-related platforms. In this test,
21 scanners for W32 platforms were submitted, whereas "only" 14 scanners
were submitted for DOS.

The following table summarizes the results of macro and script virus
detection under DOS in the last 10 VTC tests:

Table DOS-A2: Macro/Script Virus Detection Rate in last 10 VTC tests under DOS:
==============================================================================
     ------------------ Macro Virus Detection --------------- + ScriptVirus Detection
SCAN 9702 9707 9802 9810 9903 9909 0004 0008 0104 0110 Delta I 0008 0104 0110 Delta
NER    %    %    %    %    %    %    %    %    %    %    %   I   %    %    %    %
----------------------------------------------------------------+---------------------
ALE  96.5 66.0 49.8  -    -    -    -    -    -    -     -   I  -    -    -     -
ANT  58.0 68.6 80.4 56.6  -    -   85.9 93.3  -   97.0   -   I 55.2  -   81.8   -
AVA  99.3 98.2 80.4 97.2 95.9 94.6 93.7  -   92.0 93.0 +1.0  I  -   30.0 33.7 +3.7
AVG  25.2 71.0 27.1 81.6 82.5 96.6  -    -   98.3 98.4 +0.1  I  -   57.9 62.9 +5.0
AVK   -    -    -   99.7 99.6  -    -   100~ 100~ 100%  0.0  I 91.5 99.4 100% +0.6
AVP  99.3 99.0 99.9 100% 99.8 100% 99.9  -   100~ 100%  0.0  I  -   99.8 100% +0.2
CMD   -    -    -    -    -   99.5 100% 100~  -   100~   -   I 93.5  -   93.9   -
DRW  90.2 98.1 94.3 99.3 98.3  -   98.4 97.6 98.0 99.5 +1.5  I 60.8 95.6 95.4 -0.2
DSE  97.9 98.9 100% 100% 100%  -    -    -    -    -     -   I  -    -    -     -
FMA  98.6 98.2 99.9  -    -    -    -    -    -    -     -   I  -    -    -     -
FPR  43.4 36.1 99.9 99.8 99.8 99.7 100% 100~ 100% 100~  0.0  I 90.5 96.9 94.6 -2.3
FSE   -    -   99.9 90.1 99.6 97.6 99.9  -    -    -     -   I  -    -    -     -
FWN  97.2 96.4 91.0 85.7  -    -    -    -    -    -     -   I  -    -    -     -
HMV   -    -   98.2 99.0 99.5  -    -    -    -    -     -   I  -    -    -     -
IBM  65.0 88.8 99.6  -    -    -    -    -    -    -     -   I  -    -    -     -
INO   -    -   90.3 95.2 99.8 99.5 99.7 99.7 99.3  -     -   I 77.8 66.0  -     -
IRS   -   69.5 48.2  -   89.1  -    -    -    -    -     -   I  -    -    -     -
ITM   -   81.8 58.2 68.6 76.3  -    -    -    -    -     -   I  -    -    -     -
IVB   -    -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
MR2   -    -    -    -    -   69.6  -    -   44.2 40.8 -3.4  I  -   85.1 83.3 -1.8
NAV  80.7 86.4 98.7 99.8 99.7 98.6 97.4 97.0 93.8 99.5 +5.7  I 24.8 31.2 94.2 +63.0
NOD   -    -    -    -   99.8 100% 99.4  -    -    -     -   I  -    -    -     -
NVC  13.3 96.6 99.2 90.8  -   99.6 99.9 99.9 99.8  -     -   I 83.7 88.5  -     -
PAN   -    -   73.0  -    -    -    -    -    -    -     -   I  -    -    -     -
PAV   -    -   93.7 100% 99.5 98.8 99.9  -   100~ 100%  0.0  I  -   99.8 100% +0.2
PCC   -   67.6  -    -    -    -    -    -    -    -     -   I  -    -    -     -
PCV   -    -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
PRO   -    -    -    -   81.5  -    -    -    -    -     -   I  -    -    -     -
RAV   -    -    -   99.5 99.2  -    -    -    -   99.5   -   I  -    -   82.5   -
SCN  95.1 97.6 99.0 98.6 100% 100% 100% 100~ 100% 100%  0.0  I 85.6 100% 99.8 -0.2
SWP  87.4 89.1 98.4 98.6  -   98.4 98.4  -    -    -     -   I  -    -    -     -
TBA  72.0 96.1 99.5 98.7  -    -    -    -    -    -     -   I  -    -    -     -
TSC   -    -   81.9 76.5 59.5 69.6  -    -    -    -     -   I  -    -    -     -
TNT   -    -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
VDS  16.1  9.9  8.7  -    -    -    -    -    -    -     -   I  -    -    -     -
UKV   -    -    -    -    -    -    -   0.0   -    -     -   I  0.0  -    -     -
VET   -   94.0 97.3 97.5 97.6  -    -    -    -    -     -   I  -    -    -     -
VIT   -    -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
VRX   -    -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
VBS   -    -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
VHU   -    -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
VSA   -    -   80.6  -    -    -    -    -    -    -     -   I  -    -    -     -
VSP   -    -    -    -    -    -    -    -   0.0  0.0   0.0  I  -   85.3 84.0 -1.3
VSW   -    -   83.0  -    -    -    -    -    -    -     -   I  -    -    -     -
VTR   6.3  -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
XSC   -    -    -    -    -    -    -    -    -    -     -   I  -    -    -     -
----------------------------------------------------------------+---------------------
Mean 69.6 80.9 83.8 89.6 93.6 88.2 98.0 98.6 93.8 87.7% +0.4% I 66.4 79.7 86.2% +6.1%
Without extreme values:                       (94.4%)         I            (+0.4%)
----------------------------------------------------------------+---------------------

Concerning macro viruses, the "mean" detection rate is declining
further (down to 87.7% from 93.8%). Even if scanners with extremely low
detection rates (<30%) are not counted, the resulting mean of 94.4% is
unacceptably low. On the other hand, products that also participated in
the last VTC test managed to raise their detection rates (+0.4%).
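As a cross-check, the two means quoted above can be recomputed from the
"0110" macro column of Table DOS-A2; a minimal sketch (values transcribed
from the table, with "100~" and "100%" both entered as 100.0):

```python
# Macro virus detection rates (%) of the 14 DOS products, "0110" column
# of Table DOS-A2; "100~" (rounded) and "100%" are both entered as 100.0.
rates_0110 = {
    "ANT": 97.0, "AVA": 93.0, "AVG": 98.4, "AVK": 100.0, "AVP": 100.0,
    "CMD": 100.0, "DRW": 99.5, "FPR": 100.0, "MR2": 40.8, "NAV": 99.5,
    "PAV": 100.0, "RAV": 99.5, "SCN": 100.0, "VSP": 0.0,
}

# Plain mean over all 14 products.
mean = sum(rates_0110.values()) / len(rates_0110)

# "Without extreme values": drop scanners below 30% before averaging.
trimmed = [r for r in rates_0110.values() if r >= 30.0]
trimmed_mean = sum(trimmed) / len(trimmed)

print(round(mean, 1))          # 87.7, as quoted in the table
print(round(trimmed_mean, 1))  # 94.4, the "without extreme values" figure
```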
Concerning script viruses, presently the fastest-growing sector, the
mean detection rate is improving (now 86.2%) but is still regarded as
insufficient. One scanner improved its detection rate by an impressive
63%; the other (10) scanners that also participated in the last test
improved their detection rates by 0.4% "in the mean".

********************************************************************
Findings DOS.1: For DOS, macro zoo virus detection rates need
                improvement in the mean, but several products detect
                ALL or almost all macro zoo viruses. The situation
                is worse for script virus detection.
                ----------------------------------------------------
                Now, 4 (of 14) products detect ALL macro zoo viruses
                and are rated "perfect":  AVK,AVP,PAV,SCN
                5 more products are "excellent": CMD,FPR,DRW,NAV,RAV
                ----------------------------------------------------
                Detection rates for script viruses are improving
                (now 86.2%) but still need significant work. Here,
                3 (of 14) products detect ALL script zoo viruses
                and are rated "perfect":  AVK,AVP,PAV
                1 more product is "excellent":  SCN
                ****************************************************
                Overall, 3 products detect ALL macro AND script zoo
                viruses and are "Overall Perfect": AVK,AVP,PAV
                And 1 product is rated "excellent" as it detects
                >99% of all zoo viruses:  SCN
********************************************************************

Eval DOS.02: In-The-Wild (Macro,Script) Detection under DOS
===========================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is 100% : scanner is "perfect"
   - detection rate is >99% : scanner is "excellent"
   - detection rate is >95% : scanner is "very good"
   - detection rate is >90% : scanner is "good"
   - detection rate is <90% : scanner is "risky"

100% detection of In-The-Wild viruses, especially detection of ALL
instantiations of those viruses, is now an ABSOLUTE REQUIREMENT for
macro and script viruses (it must be observed that detection and
identification are not completely reliable).

The following 6 DOS products (of 14) reach 100% for macro AND script
virus and file detection and are rated "perfect" in this category
(alphabetically ordered):

                 ITW Viruses&Files
                 ( Macro   Script)
     -------------------------
     "Perfect" ITW macro/script scanners:
     AVK       ( 100.0%  100.0%)
     AVP       ( 100.0%  100.0%)
     DRW       ( 100.0%  100.0%)
     NAV       ( 100.0%  100.0%)
     PAV       ( 100.0%  100.0%)
     SCN       ( 100.0%  100.0%)
     -------------------------

In comparison, ITW macro virus detection is better developed than ITW
script virus detection.

Concerning ITW macro detection, now 9 (of 14) scanners detect ALL ITW
macro viruses:   AVG,AVK,AVP,CMD,DRW,FPR,NAV,PAV,SCN
Concerning ITW script detection, now 6 (of 14) scanners detect ALL ITW
script viruses:  AVK,AVP,DRW,NAV,PAV,SCN

*************************************************************
Findings DOS.2: 6 AV products (out of 14) detect ALL
                In-The-Wild macro and script viruses in
                >99.9% of files:
                    AVK, AVP, DRW, NAV, PAV, SCN
                --------------------------------------------
                9 products can be rated "perfect" concerning
                detection of ITW macro viruses:
                    AVG,AVK,AVP,CMD,DRW,FPR,NAV,PAV,SCN
                --------------------------------------------
                And 6 products are rated "perfect" as they
                detect ALL ITW script viruses:
                    AVK,AVP,DRW,NAV,PAV,SCN
*************************************************************

Eval DOS.03: Evaluation of overall DOS AV detection rates (zoo,ITW)
===================================================================

The following grid is applied to classify scanners:
   - detection rate =100%     : scanner is graded "perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

To assign an "overall AV grade" (including macro and script virus
detection, for unpacked objects), the lowest of the related results is
used to classify each scanner. Only scanners for which all tests were
completed are considered (for problems in the test, see 8problms.txt).

The following list shows the scanners graded into one of the upper
three categories, with macro and script virus detection rates for
unpacked samples, and with perfect ITW virus detection (rate=100%).

Under DOS, *3* products reached a 100% detection rate for macro and
script viruses, both zoo and In-The-Wild, and are rated "perfect".
1 scanner is graded "excellent" (>99%), and 1 more scanner is rated
"very good" (>95%):

                (zoo: macro/script ; ITW: macro/script)
     ----------------------------------------------
     "Perfect" DOS scanners:
     AVK       ( 100%   100%  ;  100%  100% )
     AVP       ( 100%   100%  ;  100%  100% )
     PAV       ( 100%   100%  ;  100%  100% )
     ----------------------------------------------
     "Excellent" DOS scanners:
     SCN       ( 100%   99.8% ;  100%  100% )
     ----------------------------------------------
     "Very Good" DOS scanners:
     DRW       ( 99.5%  95.4% ;  100%  100% )
     ----------------------------------------------

*****************************************************************
Findings DOS.3:  3 "perfect" overall scanners:   AVK, AVP, PAV
                 1 "excellent" overall scanner:  SCN
                 1 "very good" overall scanner:  DRW
*****************************************************************

Eval DOS.04: Evaluation of detection by virus classes under DOS:
================================================================

Some scanners specialise in detecting certain classes of viruses
(either by deliberately limiting themselves to one class, esp. macro
viruses, or by detecting one class significantly better than others).
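The grading grid of Eval DOS.03 amounts to a simple threshold lookup
from detection rate to grade; a minimal sketch (the `grade` function
name is illustrative, not part of the VTC tests):

```python
def grade(rate: float) -> str:
    """Map a detection rate (%) to the Eval DOS.03 grade."""
    if rate == 100.0:
        return "perfect"
    if rate > 99.0:
        return "excellent"
    if rate > 95.0:
        return "very good"
    if rate > 90.0:
        return "good"
    if rate >= 80.0:
        return "good enough"
    if rate >= 70.0:
        return "not good enough"
    if rate >= 60.0:
        return "rather bad"
    if rate >= 50.0:
        return "very bad"
    return "useless"

# Examples from the results above:
print(grade(100.0))  # perfect   (AVK zoo macro)
print(grade(99.8))   # excellent (SCN zoo script)
print(grade(95.4))   # very good (DRW zoo script)
```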
It is therefore worth noting which scanners perform best at detecting
macro and script viruses. Products rated "perfect" (=100%), "excellent"
(>99%) and "very good" (>95%) are listed; ITW detection MUST be 100%.

DOS.04.2 Grading the Detection of macro viruses under DOS
----------------------------------------------------------
     "Perfect" DOS macro scanners:
     AVK       (100.0%)
     AVP       (100.0%)
     PAV       (100.0%)
     SCN       (100.0%)
     "Excellent" DOS macro scanners:
     CMD       (100~  )
     FPR       (100~  )
     DRW       ( 99.5%)
     NAV       ( 99.5%)
     RAV       ( 99.5%)
     "Very Good" DOS macro scanners:
     AVG       ( 98.4%)

DOS.04.3 Grading the Detection of script viruses under DOS:
-----------------------------------------------------------
     "Perfect" DOS script scanners:
     AVK       (100.0%)
     AVP       (100.0%)
     PAV       (100.0%)
     "Excellent" DOS script scanners:
     SCN       ( 99.8%)
     "Very Good" DOS script scanners:
     DRW       ( 95.4%)

************************************************************************
Finding DOS.4:  Performance of DOS scanners by virus classes:
                ---------------------------------------------
                Perfect scanners for macro zoo+ITW:    AVK,AVP,PAV,SCN
                Excellent scanners for macro zoo+ITW:  CMD,FPR,DRW,NAV,RAV
                Perfect scanners for script zoo+ITW:   AVK,AVP,PAV
                Excellent scanners for script zoo+ITW: SCN
************************************************************************

Eval DOS.05: Detection of Packed Macro Viruses under DOS
========================================================

Detection of macro viruses within packed objects is becoming essential
for on-access scanning, esp. for incoming email possibly loaded with
malicious objects. It therefore seems reasonable to test whether at
least ITW viral objects compressed with 6 popular methods (PKZIP, ARJ,
LHA, RAR, WinRAR and CAB) are also detected. Tests are performed only
on In-The-Wild viruses packed once (no recursive packing). As the last
test showed that AV products are rather far from perfect detection of
packed viruses, the testbed has been left essentially unchanged to ease
comparison and improvement.
A "perfect" product would detect ALL packed viral macro samples (100%) for all (6) packers: --------------------------------------------------------- "Perfect" packed virus detectors: AVK,AVP,CMD,FPR,PAV,SCN --------------------------------------------------------- Here, significant progress was made as only 1 product was rated "perfect" in last test! An "excellent" product would reach 100% detection of packed viral samples (ITW macro) for at least 5 packers: -------------------------------------------------------- "Excellent" packed macro virus detector: --- -------------------------------------------------------- A "very good" product would detect viral samples (ITW macro) for at least 4 packers: ------------------------------------------------------ "Very Good" packed macro virus detector: AVG, DRW, RAV ------------------------------------------------------ Remark: Much more data were collected on precision and reliability of virus detection in packed objects. But in the present state, it seems NOT justified to add differentiation to results discussed here. ******************************************************************** Findings DOS.5: Detection of packed viral objects needs improvement Perfect packed macro virus DOS detectors: AVK,AVP,CMD,FPR,PAV,SCN Excellent packed macro virus DOS detector: --- Very Good packed macro virus DOS detector: AVG,DRW,RAV ******************************************************************** Eval DOS.06: Avoidance of False Alarms (Macro,Script) under DOS: ================================================================ First introduced in VTC test "1998-10", a set of clean (and non-malicious) objects has been added to the macro virus testbeds to determine the ability of scanners to avoid False-Positive (FP) alarms. This ability is essential for "excellent" and "very good" scanners as there is no automatic aid to customers to handle such cases (besides the psychological impact on customerīs work). 
Therefore, the grid used for grading AV products must be significantly
more rigid than the one used for detection. The following grid is
applied to classify scanners:
   - False Positive rate = 0.0%: scanner is graded "perfect"
   - False Positive rate < 0.5%: scanner is graded "excellent"
   - False Positive rate < 2.5%: scanner is graded "very good"
   - False Positive rate < 5.0%: scanner is graded "good enough"
   - False Positive rate <10.0%: scanner is graded "rather bad"
   - False Positive rate <20.0%: scanner is graded "very bad"
   - False Positive rate >20.0%: scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, 6 (out of 14)
products reported NOT A SINGLE False-Positive alarm in the macro zoo
testbeds and are therefore rated "perfect":
     ----------------------------------------------------------------
     "Perfect" FP avoiding DOS scanners: ANT, AVA, AVG, AVK, PAV, SCN
     ----------------------------------------------------------------
Moreover, VSP avoids FPs, but at a low level of virus detection.

As all other products had more than 2.5% false alarms, no product was
rated "excellent" (<0.5%) or "very good" (<2.5%).

****************************************************************
Findings DOS.6: Avoidance of False-Positive alarms is improving,
                though it is still regarded as insufficient.
                FP-avoiding "perfect" DOS scanners:
                    ANT, AVA, AVG, AVK, PAV, SCN
****************************************************************

Eval DOS.07: Detection of Macro and Script Malware under DOS
============================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category is
that customers are interested in also being warned about, and protected
from, non-viral and non-wormy malicious objects such as trojans, whose
payload may be disastrous to their work (e.g. stealing passwords).
Since VTC test "1999-03", malware detection has been a mandatory part
of VTC tests, both for submitted products and for those downloaded as
free evaluation copies. A growing number of scanners is indeed able to
detect non-viral malware.

The following grid (admittedly with reduced granularity) is applied to
classify detection of macro and script malware:
   - detection rate =100%     : scanner is "perfect"
   - detection rate > 90%     : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate of < 60%  : scanner is "not good enough"

Concerning macro AND script malware detection:

                 Macro/Script Malware
     ---------------------------------------------------------------
     "Perfect" macro/script malware detectors:
     AVP       (100.0%  100.0%)
     SCN       (100.0%  100.0%)
     "Excellent" macro/script malware detectors:
     AVK       ( 99.8%  100.0%)
     PAV       ( 99.8%  100.0%)
     "Very Good" macro/script malware detector:
     RAV       ( 97.7%   81.8%)
     ---------------------------------------------------------------

Macro malware detection is MUCH better developed than script malware
detection:

                 Macro Malware
     --------------------------------------------------
     "Perfect" macro malware detectors:
     AVP       (100.0%)
     SCN       (100.0%)
     "Excellent" macro malware detectors:
     AVK       ( 99.8%)
     PAV       ( 99.8%)
     FPR       ( 99.5%)
     CMD       ( 99.5%)
     RAV       ( 97.7%)
     DRW       ( 90.8%)
     "Very Good" macro malware detectors:
     NAV       ( 86.4%)
     AVG       ( 82.6%)
     ANT       ( 80.0%)
     --------------------------------------------------

Concerning script malware detection, much work must still be invested
in this fast-growing area:

                 Script Malware
     --------------------------------------------------
     "Perfect" script malware detectors:
     AVP       (100.0%)
     AVK       (100.0%)
     PAV       (100.0%)
     SCN       (100.0%)
     "Excellent" script malware detectors:  -------
     "Very Good" script malware detector:
     RAV       ( 81.8%)
     --------------------------------------------------

*******************************************************************
Findings DOS.7: Macro and script malware detection under DOS is
                improving impressively:
                2 products detect ALL macro and script malware
                samples and are "perfect":        AVP,SCN
                2 products are rated "excellent": AVK,PAV
                1 product is rated "very good":   RAV
                ***************************************************
                Concerning macro malware detection:
                2 products detect ALL macro malware samples and
                are rated "perfect":              AVP,SCN
                6 products are rated "excellent":
                                           AVK,PAV,FPR,CMD,RAV,DRW
                3 products are rated "very good": NAV,AVG,ANT
                ***************************************************
                Concerning script malware detection:
                4 products detect ALL script malware samples and
                are rated "perfect":              AVP,AVK,PAV,SCN
                0 products are rated "excellent": ---
                1 product is rated "very good":   RAV
******************************************************************

Eval DOS.SUM: Grading of DOS products:
======================================

Within the scope of VTC's grading system, a "Perfect DOS AV/AM
product" would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99.9% of
    zoo samples, in ALL categories (macro and script-based viruses),
    always with the same high precision of identification and in every
    infected sample,
 2) Will detect ALL ITW viral samples in compressed objects for all
    (6) popular packers, and
 3) Will NEVER issue a False-Positive alarm on any sample which is not
    viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) Will be a "Perfect AntiVirus product", that is:
    100% ITW detection AND >99% zoo detection AND high precision of
    identification AND high precision of detection AND 100% detection
    of ITW viruses in compressed objects AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software, at
    least in unpacked form, reliably at high rates (>90%).
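Definition (1) can be read as a predicate over a product's test
results; a minimal sketch under the stated thresholds (the `Results`
container and the `perfect_av` name are illustrative, and precision of
identification is not modelled):

```python
from dataclasses import dataclass

@dataclass
class Results:                  # illustrative container, not VTC's format
    itw_detection: float        # % of In-The-Wild samples detected
    zoo_detection: float        # % of zoo samples detected (worst category)
    packers_at_100: int         # packers (of 6) with 100% ITW detection
    false_positive_rate: float  # % of clean objects flagged

def perfect_av(r: Results) -> bool:
    """Definition (1): a 'Perfect AntiVirus (AV) product'."""
    return (r.itw_detection == 100.0        # clause 1: ALL ITW samples
            and r.zoo_detection >= 99.9     # clause 1: >=99.9% of zoo samples
            and r.packers_at_100 == 6       # clause 2: all 6 packers
            and r.false_positive_rate == 0.0)  # clause 3: no FP alarm

# AVK's figures in this test satisfy the definition:
print(perfect_av(Results(100.0, 100.0, 6, 0.0)))  # True
# Any false positive at all fails clause 3:
print(perfect_av(Results(100.0, 100.0, 6, 3.0)))  # False
```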
*********************************************************************
In VTC test "2001-10", we found
          ** 2 perfect DOS AV products: AVK,PAV **
**********************************************
but we found
          **** No perfect DOS AM product ****
*********************************************************************

But several products seem to approach our definition at a rather high
level (taking into account the highest value, "perfect", defined at
the 100% level, and "excellent", defined as >99% for virus detection
and >90% for malware detection):

Test category:           "Perfect"                "Excellent"
-----------------------------------------------------------------
DOS zoo macro test:      AVK,AVP,PAV,SCN          CMD,FPR,DRW,NAV,RAV
DOS zoo script test:     AVK,AVP,PAV              SCN
DOS ITW tests:           AVK,AVP,DRW,NAV,PAV,SCN  ---
DOS pack-tests:          AVK,AVP,CMD,FPR,PAV,SCN  ---
DOS FP avoidance:        ANT,AVA,AVG,AVK,PAV,SCN  ---
-----------------------------------------------------------------
DOS Macro Malware Test:  AVP,SCN                  AVK,PAV,FPR,CMD,RAV,DRW
DOS Script Malware Test: AVP,AVK,PAV,SCN          ---
-----------------------------------------------------------------

In order to support the race for more customer protection, we evaluate
the order of performance in this DOS test with a simple algorithm,
counting the majority of places (weighting "perfect" twice and
"excellent" once) for the first places:

************************************************************
"Perfect" DOS AntiVirus products:
     1st place: AVK,PAV             (10 points)
************************************************************
"Excellent" DOS AV products:
     3rd place: SCN                 ( 9 points)
     4th place: AVP                 ( 8 points)
     5th place: CMD,DRW,FPR,NAV     ( 3 points)
     9th place: ANT,AVA,AVG         ( 2 points)
    12th place: RAV                 ( 1 point )
************************************************************
"Perfect" DOS AntiMalware product:  =NONE=  (14 points)
"Excellent" DOS AntiMalware products:
     1st place: AVK,PAV,SCN         (13 points)
     4th place: AVP                 (12 points)
     5th place: DRW                 ( 5 points)
     6th place: FPR,CMD             ( 4 points)
     8th place: RAV                 ( 2 points)
************************************************************
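The AV point count above ("perfect" weighted twice, "excellent" once,
over the five AV test categories) can be reproduced from the summary
table; a minimal sketch (the dictionary layout is illustrative):

```python
# AV test categories from the summary table:
# category -> (set of "perfect" products, set of "excellent" products)
av_categories = {
    "zoo macro":  ({"AVK", "AVP", "PAV", "SCN"},
                   {"CMD", "FPR", "DRW", "NAV", "RAV"}),
    "zoo script": ({"AVK", "AVP", "PAV"}, {"SCN"}),
    "ITW":        ({"AVK", "AVP", "DRW", "NAV", "PAV", "SCN"}, set()),
    "packed":     ({"AVK", "AVP", "CMD", "FPR", "PAV", "SCN"}, set()),
    "FP":         ({"ANT", "AVA", "AVG", "AVK", "PAV", "SCN"}, set()),
}

points: dict[str, int] = {}
for perfect, excellent in av_categories.values():
    for p in perfect:
        points[p] = points.get(p, 0) + 2   # "perfect" weighted twice
    for p in excellent:
        points[p] = points.get(p, 0) + 1   # "excellent" weighted once

# AVK and PAV lead with 10 points, SCN has 9, AVP has 8 --
# matching the AV ranking given in the report.
print(sorted(points.items(), key=lambda kv: -kv[1]))
```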