=========================================
File 7iEVAW2k.TXT
-----------------------------------------
Evaluation of results for File, Macro and Script Virus/Malware
detection under Windows-2000 in aVTC Test "2004-07":
=========================================
Formatted with non-proportional font (Courier)

**********************************************************************
Content of this file:
**********************************************************************
Eval W2k:     Development of detection rates under Windows-2000:
**********************************************************************
Eval W2k.01:  Development of W-2000 Scanner Detection Rates
              Table W2k-A: Comparison File/Macro/Script virus
              detection rates
Eval W2k.02:  In-The-Wild Detection under W-2000
Eval W2k.03:  Evaluation of overall W-2000 AV detection rates
Eval W2k.04:  Evaluation of detection by virus classes under W-2000
              W2k.04.1 Grading the Detection of file viruses under W-2000
              W2k.04.2 Grading the Detection of macro viruses under W-2000
              W2k.04.3 Grading the Detection of script viruses under W-2000
Eval W2k.05:  Detection of Packed Viruses by virus classes under W-2000
              AND Loss of Virus Detection through Packing
Eval W2k.06:  Avoidance of False Alarms (Macro) under W-2000
Eval W2k.07:  Detection of Malware by classes under W-2000
Eval W2k.SUM: Grading of W-2000 products
**********************************************************************

This part of the VTC "2004-07" test report evaluates the detailed
results as given in section (file):
   6iW2k.TXT   File/Macro/Script Viruses/Malware results W-2000 (W2k)

The following *25* products participated in this scanner test for
W-2000 products:

===============================================================
Products submitted for aVTC test under Windows-2000:
===============================================================
ANT = AntiVir                v:6.18.0.1       H+B EDV Datentechnik     Germany
AVA = Avast!                 v:0301-9         ALWIL Software           Czech Republic
AVG = AVG Antivirus System   v:6.0.456        GriSoft                  Czech Republic
AVK = AntiVirenKit 10        v:12.0.3         GData Software           Germany
AVP = Kaspersky Anti-Virus (KAV) v:4.0.5.37   Kaspersky Lab            Russia
BDF = BitDefender Professional v:7.0 build 2473 SOFTWIN                Romania
CMD = Command Antivirus      v:4.74.3         Command Software Systems USA
DRW = Dr. Web                v:4.29b          DialogueScience          Russia
FIR = Fire Anti-virus Kit    v:2.7            Prognet Technologies     India
FPR = F-PROT                 v:3.12d          Frisk Software           Iceland
FSE = F-SECURE               v:1.02.2410      F-Secure Corporation     Finland
GLA = Gladiator AV           v:3.0.0          "Gladiator"
IKA = Ikarus Virus Utilities v:2.27           IKARUS Software          Austria
INO = eTrust AV              v:6.0.102        Computer Associates      USA
NAV = Norton Antivirus       v:8.00.9374      Symantec                 USA
NVC = Norman Virus Control   v:5.50           Norman Data Defense      Norway
PAV = Power AV               v:11.0.5         GData Software           Germany
PER = Peruvian AntiVirus     v:7.90           PER Systems              Peru
PRO = Protector              v:7.2.D01        Proland Software         India
QHL = Quick Heal             v:6.08           Cat Computer Services    India
RAV = RAV Antivirus v8       v:8.3.1          GeCAD Software           Romania
SCN = McAfee VirusScan       v:4.1.60         Network Associates       USA
SWP = Sophos AV              v:3.66           Sophos                   UK
VBR = VirusBuster            v:2              Leprechaun               Australia
VSP = VirScanPlus            v:12.762         Ralph Roth               Germany
===============================================================

Eval W2k.01: Scanner Detection Rates under Windows-2000:
========================================================

The number of scanners running under Windows 2000 is growing.
Evidently, AV producers now invest more work in the development of
products for W32-related platforms, here especially in the detection
of macro viruses (with minor improvements) and script viruses (with
major improvements). The following tables summarize the results of
file, macro and script virus detection under Windows-2000 (from test
0008 through 0407; file virus results from 0104):

Table W2k-A: Comparison: File/Macro/Script ZOO Virus Detection Rate:
====================================================================

Table W2k-A1: Detection performance for file ZOO viruses:
=========================================================
Scan I ==== File Virus ===
ner  I     Detection
-----+---------------------
Test I 0104 0212 0407 Delta
-----+---------------------
ANT  I   -    -  90.9    -
AVA  I 95.0 96.2 95.7  -0.5
AVG  I 81.9 80.6 79.8  -0.8
AVK  I 99.8 99.9 100~  +0.1
AVP  I 99.9 100~ 100~   0.0
BDF  I   -  82.9 84.4  +1.5
CLE  I   -    -    -     -
CMD  I 97.8 98.5 98.6  +0.1
DRW  I   -  98.3 79.0 -19.3
FIR  I   -    -  75.4    -
FPR  I 97.8 98.8 99.5  +0.7
FPW  I 97.8 98.8   -     -
FSE  I   -  100~ 100~   0.0
GLA  I   -    -  40.7    -
IKA  I   -  89.2 90.8  +1.6
INO  I 97.9 98.7 95.8  -2.9
MCV  I   -    -    -     -
MR2  I   -   9.5   -     -
NAV  I 93.9 98.3 99.3  +1.0
NVC  I 98.1 97.8 95.0  -2.8
PAV  I 97.5   -  100~    -
PER  I   -    -  35.9    -
PRO  I 70.6 70.4 67.2  -3.2
QHL  I   -    -  59.0    -
RAV  I 93.5 94.7 99.4  +4.7
SCN  I 89.0 99.8 100~  +0.2
SWP  I   -    -  98.2    -
VBR  I   -    -  68.5    -
VSP  I   -  14.0 14.6  +0.6
-----+---------------------
Mean : 93.6 85.6 82.7 -1.5%
M.>10%:     89.8 82.7
-----+---------------------

Table W2k-A2: Detection performance for macro/script ZOO viruses:
=================================================================
Scan I ======== Macro Virus ========  + ======== Script Virus =======
ner  I         Detection              I         Detection
-----+--------------------------------+--------------------------------
Test I 0008 0104 0110 0212 0407 Delta I 0008 0104 0110 0212 0407 Delta
-----+--------------------------------+--------------------------------
ANT  I 93.3   -    -    -  97.9   -   I 53.9   -    -    -  87.5   -
AVA  I 94.1 95.7 97.7 97.8 97.9 +0.1  I 15.0 29.1 29.6 31.5 88.7 +57.2
AVG  I 97.9 98.3 98.4 98.1 98.0 -0.1  I 45.8 57.9 62.9 63.9 68.2  +4.3
AVK  I 100~ 100~ 100% 100~ 100~  0.0  I 91.5 99.8 100% 99.0 99.8  +0.8
AVP  I 100~ 100~ 100~ 100~ 100~  0.0  I 88.2 99.8 100% 98.9 99.7  +0.8
BDF  I 99.0   -    -  99.0 98.1 -0.9  I 61.4   -    -  72.4 94.3 +21.9
CLE  I   -    -    -    -    -    -   I  4.2   -    -    -    -    -
CMD  I 100% 100% 100~ 99.9 99.9  0.0  I 93.5 96.9 93.2 89.1 98.5  +9.4
DRW  I 97.5   -  99.5 99.4 99.4  0.0  I 59.8   -  95.4 94.7 95.4  +0.7
FIR  I   -    -    -    -  85.9   -   I   -    -    -    -  75.9   -
FPR  I   -  100% 100~ 100~ 99.9 -0.1  I   -  96.9 94.6 88.7 99.3 +10.6
FPW  I 100% 100% 100~ 100~   -    .   I 90.8 96.9 94.6 88.7   -    .
FSE  I 100% 100% 100% 100~ 100~  0.0  I 96.7 100% 100% 99.5 100%  +0.5
GLA  I   -    -    -    -   1.5   -   I   -    -    -    -  49.5   -
IKA  I   -    -    -  96.2 96.5 +0.3  I   -    -    -  81.2 91.7 +10.5
INO  I 99.8 99.7 99.9 99.9 99.9  0.0  I 78.1 93.1 93.9 94.7 97.5  +2.8
MCV  I   -    -  88.5   -    -    -   I   -    -  27.7   -    -    -
MR2  I   -    -   0.7 10.4   -    -   I   -    -  83.3 81.0   -    -
NAV  I 97.7 97.0 99.5 99.6 99.9 +0.3  I 36.6 54.5 94.2 96.8 98.3  +1.5
NVC  I 99.9 99.8 99.8 99.8 99.2 -0.6  I 83.7 88.5 91.3 87.6 99.7 +12.1
PAV  I 100~ 99.4 100%   -  100~   -   I 90.2 98.5 100%   -    -    -
PER  I 85.0 68.2   -    -  69.8   -   I  0.0 22.0   -    -  22.9   -
PRO  I 69.1 67.1   -  72.7 73.1 +0.4  I 12.1 40.7   -  59.8 70.4 +10.6
QHL  I  0.0   -    -    -    -    -   I  6.9   -    -    -  29.1   -
RAV  I 96.9 99.6 99.5 99.9 99.8 -0.1  I 47.1 84.9 82.5 96.1 99.7  +3.6
SCN  I 100% 100% 100% 100% 100~ -0.0  I 95.8 100% 99.8 99.6 100%  +0.4
SWP  I   -    -    -    -  99.7   -   I   -    -    -    -  96.8   -
VBR  I   -    -    -    -  98.4   -   I   -    -    -    -  46.4   -
VSP  I   -   0.0  0.~  0.~  0.1 +0.1  I   -  85.3 84.0 81.2 83.5  +2.3
-----+--------------------------------+--------------------------------
Mean : 99.9 89.7 88.0 88.0 88.1 -0.1% I 57.6 79.4 84.8 84.4 83.2 +8.8%
M.>10%:     98.9 92.9 96.1            I      91.9      84.4 83.2
-----+--------------------------------+--------------------------------

Remark: for abbreviations of products (code names), see appendix
A5CodNam.txt. "100~" denotes a detection rate which rounds to 100%
although not all samples were detected. (A small sketch of the
"Mean"/"M.>10%" computation follows Findings W2k.1 below.)

In comparison with the last test, mean detection rates
   for FILE ZOO viruses are VISIBLY reduced (by 2.9%) to 82.7%,
      with the best products showing only a minor reduction in both
      tests;
   for MACRO ZOO viruses are ALMOST UNCHANGED at 88.1%, with rather
      stable detection rates over the last 2 tests;
   for SCRIPT viruses are almost unchanged at too low a level: 83.2%,
      but with significantly IMPROVED detection (+8.8%) for almost
      all products which also participated in the last test; indeed,
      4 newly participating products detect <50%, which negatively
      influences the mean detection rate.
Overall detection level for ALL categories is still INSUFFICIENT!

Concerning FILE ZOO virus detection, NO product is able to detect ALL
viruses (rating: "perfect"), but 8 products detect more than 99% and
are rated "excellent":
   AVK, AVP, FSE, PAV, SCN (all: 100~), FPR (99.5%), RAV (99.4%),
   NAV (99.3%)

Concerning MACRO ZOO virus detection, NO product detects ALL viruses
(rating: "perfect"), but 13 products detect more than 99% and are
rated "excellent":
   AVK, AVP, FSE, PAV, SCN (all: 100~), CMD, FPR, INO, NAV
   (all: 99.9%), RAV (99.8%), SWP (99.7%), DRW (99.4%), NVC (99.2%)

Concerning SCRIPT ZOO virus detection, 2 products detect ALL viruses
and are rated "perfect":
   FSE, SCN (100%)
In addition, 5 products detect more than 99% and are rated
"excellent":
   AVK (99.8%), AVP, PAV, RAV (all: 99.7%), FPR (99.3%)

****************************************************************
Findings W2k.1: For W-2000, mean detection rates of zoo viruses
   are still insufficient, with file virus detection rates going
   down, macro virus detection rates being stable and script
   virus detection rates improved. Significant work is still
   needed. Mean detection rates remain unacceptably low:
      mean file zoo virus detection rate:   82.7%
      mean macro virus detection rate:      88.1%
      mean script virus detection rate:     83.2%
   ------------------------------------------------
   Concerning file zoo viruses:
      NO product detects ALL viruses ("perfect")
      8 products detect more than 99% and are rated "excellent":
         AVK,AVP,FSE,PAV,SCN;FPR,RAV,NAV.
   -------------------------------------------------
   Concerning macro zoo viruses:
      NO product detects ALL macro zoo viruses
      13 products detect almost all macro viruses in almost all
      files and are rated "excellent":
         AVK,AVP,FSE,PAV,SCN;CMD,FPR,INO,NAV;RAV,SWP,DRW,NVC.
   -------------------------------------------------
   Concerning script zoo viruses:
      2 products detect ALL viruses and are rated "perfect":
         FSE,SCN.
      5 products detect almost all script viruses in almost all
      files and are rated "excellent":
         AVK,AVP,PAV,RAV;FPR.
****************************************************************
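The "Mean" and "M.>10%" rows of tables W2k-A1/A2 are plain arithmetic
means over the products tested in a given run; the "M.>10%" variant
excludes products detecting 10% or less, so that essentially
non-detecting entries (e.g. MR2 at 9.5% in test 0212) do not distort
the trend. The following is a minimal Python sketch of this
computation, not VTC's actual evaluation code; the rates are an
illustrative subset of column 0212 of Table W2k-A1, so the printed
means do not correspond to any table row:

   def mean_rate(rates, floor=None):
       # Mean detection rate; None marks a product not tested in a
       # run. With floor set, products at or below it are excluded,
       # as in the "M.>10%" row.
       tested = [r for r in rates if r is not None]
       if floor is not None:
           tested = [r for r in tested if r > floor]
       return sum(tested) / len(tested)

   # Illustrative subset of column 0212 (AVA, AVG, AVK, AVP, MR2, VSP):
   col_0212 = [96.2, 80.6, 99.9, 100.0, None, 9.5, 14.0]
   print(round(mean_rate(col_0212), 1))              # plain mean: 66.7
   print(round(mean_rate(col_0212, floor=10.0), 1))  # "M.>10%":   78.1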
Eval W2k.02: In-The-Wild (File, Macro, Script) Detection under W-2000
=====================================================================

Concerning "In-The-Wild" viruses, the following grid is applied (a
sketch of this grading follows Findings W2k.2 below):
   - detection rate is  100% : scanner is "perfect"
   - detection rate is  >99% : scanner is "excellent"
   - detection rate is  >95% : scanner is "very good"
   - detection rate is  >90% : scanner is "good"
   - detection rate is =<90% : scanner is "risky"

100% detection of In-The-Wild viruses, especially including ALL
instantiations of those viruses, is an ABSOLUTE REQUIREMENT for file,
macro and script viruses if a product is to be rated "perfect" (it
must be observed that identification, that is equal naming of all
related infected objects, may not be completely reliable).

Presently, 1 scanner is "perfect" in this category: SCN. 8 more
scanners are "excellent": they miss just 1 ITW virus and only few
infected objects (>99% detection rate).

                          ITW virus/file detection
                      ( FileV.      MacroV.     ScriptV.  )
        -------------------------------------------
        "Perfect" W2k ITW scanners:
        SCN  ( 100% 100%;  100% 100%;   100% 100% )
        -------------------------------------------
        "Excellent" W2k ITW scanners:
        AVK  (99.1% 99.8%; 100% 100%;   100% 100% )
        AVP  (99.1% 99.8%; 100% 100%;   100% 100% )
        FSE  (99.1% 99.8%; 100% 100%;   100% 100% )
        NAV  (99.1% 99.8%; 100% 100%;   100% 100% )
        PAV  (99.1% 99.8%; 100% 100%;   100% 100% )
        FPR  (99.1% 99.8%; 100% 99.9%;  100% 99.4%)
        INO  (99.1% 99.5%; 100% 99.9%;  100% 99.4%)
        RAV  (99.1% 99.8%; 100% 99.9%;  100% 99.4%)
        --------------------------------------------

Concerning only detection of ITW file viruses:
   1 scanner is perfect:
      SCN (100% 100%)
   11 scanners are excellent (missing one virus):
      AVK,AVP,DRW,FPR,FSE,NAV,PAV,RAV (ALL: 99.1% 99.8%),
      INO,SWP (99.1% 99.5%), AVA (99.1% 99.1%)

Detection of ITW macro viruses is much better developed than
detection of file viruses and infected objects:
   10 scanners are rated "perfect":
      ANT,AVK,AVP,BDF,DRW,FSE,NAV,PAV,SCN,SWP (ALL: 100% 100%)
   8 scanners are rated "excellent":
      AVG,CMD,FPR,INO,RAV (ALL: 100% 99.9%), AVA (100% 99.5%),
      IKA (100% 99.4%), PRO (100% 99.3%)

Detection of ITW script viruses is less successful than in the last
test (where 12 out of 19 products were rated "perfect"):
   6 scanners are rated "perfect":
      AVK,AVP,FSE,NAV,PAV,SCN (ALL: 100% 100%)
   3 scanners are rated "excellent":
      FPR,INO,RAV (100% 99.4%)

******************************************************************
Findings W2k.2:
   1 AV product (out of 25) detects ALL In-The-Wild file, macro
     and script viruses in ALL instantiations (files) and is
     "perfect": SCN
   8 scanners are "excellent":
     AVK,AVP,FSE,NAV,PAV,FPR,INO,RAV
   *************************************************
   Concerning detection of ITW file viruses:
      1 "perfect" scanner:    SCN
     11 "excellent" scanners: AVK,AVP,DRW,FPR,FSE,NAV,PAV,RAV,
                              INO,SWP,AVA
   Concerning detection of ITW macro viruses:
     10 "perfect" scanners:   ANT,AVK,AVP,BDF,DRW,FSE,NAV,PAV,SCN,SWP
      8 "excellent" scanners: AVG,CMD,FPR,INO,RAV,AVA,IKA,PRO
   Concerning detection of ITW script viruses:
      6 "perfect" scanners:   AVK,AVP,FSE,NAV,PAV,SCN
      3 "excellent" scanners: FPR,INO,RAV
*****************************************************************
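Each pair of percentages in the lists above gives the rate of ITW
viruses detected and the rate of their instantiations (infected
files) detected. Grading follows the grid at the top of this section,
with the extra rule that "perfect" requires 100% on both levels. A
minimal Python sketch of this rule (the function name is
illustrative, not VTC tooling):

   def itw_grade(virus_rate, file_rate):
       # "Perfect" demands ALL ITW viruses AND all their
       # instantiations (files); other grades use the virus rate.
       if virus_rate == 100.0 and file_rate == 100.0:
           return "perfect"
       if virus_rate > 99.0: return "excellent"
       if virus_rate > 95.0: return "very good"
       if virus_rate > 90.0: return "good"
       return "risky"

   print(itw_grade(100.0, 100.0))  # SCN: perfect
   print(itw_grade(99.1, 99.8))    # e.g. AVK file ITW: excellent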
Eval W2k.03: Evaluation of overall W-2000 AV detection rates (zoo, ITW)
=======================================================================

The following grid is applied to classify scanners:
   - detection rate  =100%    : scanner is graded "perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script
virus detection, for unpacked objects), the lowest of the related
results is used to classify each scanner (see the sketch after
Findings W2k.3 below). Only scanners for which all tests were
completed are considered. (For problems in the test: see
8problms.txt.)

Besides grading products in the related categories according to their
performance, it is interesting to compare how products developed. The
following list indicates those scanners graded into one of the upper
three categories, with file, macro and script virus detection rates
for unpacked samples, and with perfect ITW virus detection
(rate = 100%). Different from last time, only ONE product detects ITW
viruses in all 3 categories at a 100% rate (last time: 5 products!).

                  Zoo test:            ITW test:
           (file/macro/script;   file/macro/script)
     --------------------------------------
     "Perfect" W2k scanners:
           ========= NONE =========
     "Excellent" W2k scanners:
     SCN   (99.8% 100% 99.6%;   100% 100% 100%)

******************************************************************
Findings W2k.3:
   Now, NO W2k product is overall rated "perfect":  ---
   1 "excellent" overall scanner:  SCN
   0 "very good" overall scanners: ---
******************************************************************
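The overall grade follows mechanically from the grid above: each
scanner's lowest zoo rate (file, macro or script) is looked up in the
threshold table, and "perfect" additionally presumes 100% ITW
detection in all three categories. A minimal sketch, assuming rates
in percent and treating the grid boundaries as inclusive lower
bounds; names and structure are this illustration's, not VTC's:

   GRID = [(100.0, "perfect"),     (99.0, "excellent"),
           (95.0,  "very good"),   (90.0, "good"),
           (80.0,  "good enough"), (70.0, "not good enough"),
           (60.0,  "rather bad"),  (50.0, "very bad")]

   def overall_grade(zoo, itw):
       # zoo/itw: {"file": ..., "macro": ..., "script": ...} in percent.
       worst = min(zoo.values())
       label = next((l for t, l in GRID if worst >= t), "useless")
       if label == "perfect" and min(itw.values()) < 100.0:
           label = "excellent"   # "perfect" also requires 100% ITW
       return label

   # SCN, the only overall "excellent" scanner in this test:
   print(overall_grade({"file": 99.8, "macro": 100.0, "script": 99.6},
                       {"file": 100.0, "macro": 100.0, "script": 100.0}))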
Eval W2k.04: Evaluation of detection by virus classes under W-2000:
===================================================================

Some scanners are specialised in detecting some class of viruses
(either deliberately limiting themselves to one class, esp. macro
viruses, or detecting one class significantly better than others).
It is therefore worth noting which scanners perform best in detecting
macro and script viruses. Products rated "perfect" (=100%),
"excellent" (>99%) and "very good" (>95%) are listed.

W2k.04.1 Grading the Detection of file viruses under W2k
--------------------------------------------------------
"Perfect" W2k file scanners:     === NONE ===
"Excellent" W2k file scanners:   AVK ( 100~ )
                                 AVP ( 100~ )
                                 FSE ( 100~ )
                                 PAV ( 100~ )
                                 SCN ( 100~ )
                                 FPR ( 99.5%)
                                 RAV ( 99.4%)
                                 NAV ( 99.3%)
"Very Good" W2k file scanners:   CMD ( 98.6%)
                                 SWP ( 98.2%)
                                 INO ( 95.8%)
                                 AVA ( 95.7%)
                                 NVC ( 95.0%)

W2k.04.2 Grading the Detection of macro viruses under W2k
---------------------------------------------------------
"Perfect" W2k macro scanners:    === NONE ===
"Excellent" W2k macro scanners:  AVK ( 100~ )
                                 AVP ( 100~ )
                                 FSE ( 100~ )
                                 PAV ( 100~ )
                                 SCN ( 100~ )
                                 CMD ( 99.9%)
                                 FPR ( 99.9%)
                                 INO ( 99.9%)
                                 NAV ( 99.9%)
                                 RAV ( 99.8%)
                                 SWP ( 99.7%)
                                 DRW ( 99.4%)
                                 NVC ( 99.2%)
"Very Good" W2k macro scanners:  VBR ( 98.4%)
                                 BDF ( 98.1%)
                                 AVG ( 98.0%)
                                 AVA ( 97.9%)
                                 ANT ( 97.9%)
                                 IKA ( 96.5%)

W2k.04.3 Grading the Detection of Script viruses under W2k:
-----------------------------------------------------------
"Perfect" W2k script scanners:   FSE ( 100% )
                                 SCN ( 100% )
"Excellent" W2k script scanners: AVK ( 99.8%)
                                 AVP ( 99.7%)
                                 PAV ( 99.7%)
                                 RAV ( 99.7%)
                                 FPR ( 99.3%)
"Very Good" W2k script scanners: NAV ( 98.7%)
                                 CMD ( 98.5%)
                                 INO ( 97.5%)
                                 SWP ( 96.8%)
                                 DRW ( 95.4%)

***********************************************************************
Finding W2k.4: Performance of W2k scanners by virus classes:
   Perfect scanners for file zoo:     ---
   Excellent scanners for file zoo:   AVK,AVP,FSE,PAV,SCN,FPR,RAV,NAV
   Very Good scanners for file zoo:   CMD,SWP,INO,AVA,NVC

   Perfect scanners for macro zoo:    ---
   Excellent scanners for macro zoo:  AVK,AVP,FSE,PAV,SCN,CMD,INO,FPR,
                                      NAV,RAV,SWP,DRW,NVC
   Very Good scanners for macro zoo:  VBR,BDF,AVG,AVA,ANT,IKA

   Perfect scanners for script zoo:   FSE,SCN
   Excellent scanners for script zoo: AVK,AVP,PAV,RAV,FPR
   Very Good scanners for script zoo: NAV,CMD,INO,SWP,DRW
***********************************************************************

Eval W2k.05: Detection of Packed File and Macro Viruses under W-2k
==================================================================

Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects. It seems therefore reasonable to test
whether at least ITW viral objects compressed with 6 popular methods
(PKZIP, ARJ, LHA, RAR 1.5, WinRAR 3.0 and CAB) are also detected.
Tests are performed only on In-The-Wild viruses packed once (no
recursive packing). As the last test showed that AV products are
rather far from perfect detection of packed viruses, the testbed has
essentially been left unchanged to ease comparison and improvement.

Following the analysis of detection of ITW file AND macro viruses,
where 100% ITW detection is mandatory, the following grading is
applied (a sketch of this count follows Findings W2k.5 below):
   "perfect":   100% detection of ITW file AND macro viruses
                packed with 6 packers
   "excellent": 100% detection of ITW file AND macro viruses
                packed with 5 packers
   "very good": 100% detection of ITW file AND macro viruses
                packed with 4 packers
   "good":      100% detection of ITW file AND macro viruses
                packed with 3 packers

**********************************************************
No product detected ITW samples packed with ALL 6 packers.
**********************************************************
Due to the fact that only ONE product fulfils the condition "100% ITW
detection" (whereas all others MISS at least one virus in the file
ITW testbed), and this only for 5 packers, only one product can be
graded at all.

Concerning detection of BOTH file and macro virus samples, an
"excellent" product must detect ALL viral samples (100%) with at
least 5 packers:
   ----------------------------------------------------
   "Excellent" packed virus detectors:  SCN
   ----------------------------------------------------

Concerning detection of packed file viruses only, all products except
one miss at least one sample of an ITW file virus:
   "Perfect"   packed file virus detectors:  ---
   "Excellent" packed file virus detectors:  SCN
   "Very Good" packed file virus detectors:  ---
   "Good"      packed file virus detectors:  ---

Comment: 16 products missed just ONE ITW virus, but VTC's strong
requirement for grading is that ALL ITW viruses MUST be detected!

Concerning detection of ALL packed macro viruses:
   "Perfect"   packed macro virus detectors:  AVK,AVP,BDF,FSE,PAV
   "Excellent" packed macro virus detectors:  AVA,DRW,QHL,RAV,SCN
   "Very Good" packed macro virus detectors:  CMD,FPR,INO,NAV,SWP
   "Good"      packed macro virus detectors:  ---

One NEW table analyses whether all ITW viruses which a product
detects in UNPACKED form are also detected when PACKED with one of
the 6 packers. This analysis is applied to ALL products, including
those which do NOT fulfil the "100% ITW detection" criterion. The
related tables show that some scanners detect ALL ITW viruses
RELIABLY both in unpacked and packed forms, but some scanners show
significant LOSSES of detection for file and macro viruses (tables
W2K.F3L and W2K.M3L).

"Perfect scanners" (6 packers):
   No LOSS in detection of file AND macro ITW viruses: AVP,FSE
   No LOSS in detection of ITW file viruses:           AVK,AVP,FSE
   No LOSS in detection of ITW macro viruses:          AVP,FSE
"Excellent scanners" (5 packers):
   No LOSS in detection of file AND macro ITW viruses: PAV,RAV,SCN
   No LOSS in detection of ITW file viruses:           PAV,RAV,SCN
   No LOSS in detection of ITW macro viruses:          AVA,BDF,DRW,
                                                       PAV,QHL,RAV,SCN
"Very Good scanners" (4 packers):
   No LOSS in detection of file AND macro ITW viruses: SWP
   No LOSS in detection of ITW file viruses:           BDF,DRW,SWP
   No LOSS in detection of ITW macro viruses:          INO,NAV,SWP

***********************************************************************
Findings W2k.5:
   Concerning detection of packed file AND macro viruses:
      NO product is "perfect":    ---
      1 product is "excellent":   SCN
      NO product is "very good":  ---
   *******************************************************
   Concerning detection of packed FILE ITW viruses:
      0 products are "perfect":   ---
      1 product is "excellent":   SCN
      0 products are "very good": ---
   *******************************************************
   Concerning detection of packed MACRO viruses:
      5 products are "perfect":   AVK,AVP,BDF,FSE,PAV
      5 products are "excellent": AVA,DRW,QHL,RAV,SCN
      5 products are "very good": CMD,FPR,INO,NAV,SWP
   *******************************************************
   Concerning EQUAL detection of UNPACKED AND PACKED ITW file
   AND macro viruses:
      2 "perfect" products have NO LOSS for 6 packers:   AVP,FSE
      3 "excellent" products have NO LOSS for 5 packers: PAV,RAV,SCN
      1 "very good" product has NO LOSS for 4 packers:   SWP
***********************************************************************
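Grading in Eval W2k.05 thus reduces to counting for how many of the
6 packers a product detects 100% of the packed ITW samples (6 =
"perfect" down to 3 = "good"); the loss analysis applies the same
count to the viruses a product finds in unpacked form. A minimal
Python sketch of the count, with a hypothetical product that handles
everything except CAB archives (the rate dictionary is an assumption
of the illustration):

   PACKERS = ("PKZIP", "ARJ", "LHA", "RAR 1.5", "WinRAR 3.0", "CAB")
   PACK_GRADES = {6: "perfect", 5: "excellent", 4: "very good", 3: "good"}

   def pack_grade(rates):
       # rates: {packer: detection rate in percent} over the packed
       # ITW testbed; only full (100%) detection counts for a packer.
       full = sum(1 for p in PACKERS if rates.get(p, 0.0) >= 100.0)
       return PACK_GRADES.get(full)   # None below 3 packers: ungraded

   example = {p: 100.0 for p in PACKERS if p != "CAB"}  # hypothetical
   print(pack_grade(example))   # excellent (5 of 6 packers)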
Eval W2k.06: Avoidance of False Alarms (File, Macro) under W-2000:
==================================================================

First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the file and macro virus
testbeds to determine the ability of scanners to avoid False-Positive
(FP) alarms. This ability is essential for "excellent" and "very
good" scanners, as there is no automatic aid for customers to handle
such cases (besides the psychological impact on a customer's work).
Therefore, the grid used for grading AV products must be
significantly more rigid than the one used for detection.

The following grid is applied to classify scanners (a sketch of this
conversion follows the lists below):
   - False Positive rate  = 0.0% : scanner is graded "perfect"
   - False Positive rate <= 0.6% : scanner is graded "excellent"
   - False Positive rate <  2.5% : scanner is graded "very good"
   - False Positive rate <  5.0% : scanner is graded "good enough"
   - False Positive rate < 10.0% : scanner is graded "rather bad"
   - False Positive rate < 20.0% : scanner is graded "very bad"
   - False Positive rate >=20.0% : scanner is graded "useless"

It is good to observe that ALL 25 scanners avoid FP alerts on clean
files. Concerning clean macro objects, however, only 11 (out of 25)
products are "perfect" in avoiding any alarm, and 7 more products are
"excellent" as they alert on at most 2 samples (<=0.6%).

Remark: the testbed included 2 CLEAN (non-viral, non-malicious) macro
objects which were taken from a goat generator. While there is no
reason for alerting on such samples, SOME AV experts argue that these
samples must be detected as they may be used by VX groups. Indeed,
some scanners diagnosed those samples as an "infection", which is
misleading, and this is counted as a "false positive diagnosis" (a
warning would be acceptable).

Overall, the following products did not issue any false alarm:
---------------------------------------------------------------------
"Perfect" file-FP AND macro-FP-avoiding W2k products:
     ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
"Excellent" file-FP AND macro-FP-avoiding W2k products:
     AVK,AVP,CMD,FPR,FSE,PAV,RAV
---------------------------------------------------------------------
"Perfect" file-FP-avoiding W2k products:
     ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,IKA,INO,NAV,
     NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
"Excellent" file-FP-avoiding W2k products:
     ---
---------------------------------------------------------------------
"Perfect" macro-FP-avoiding W2k products:
     ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
"Excellent" macro-FP-avoiding W2k products:
     AVK,AVP,CMD,FPR,FSE,PAV,RAV
---------------------------------------------------------------------
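The FP grid converts alarm counts on the clean testbeds into grades;
the "excellent" products above alerted on at most 2 clean macro
samples, which stays at or below the 0.6% bound. A minimal Python
sketch of the conversion; the clean-testbed size of 334 objects is an
illustrative assumption consistent with that bound, not the actual
count:

   def fp_grade(false_positives, clean_samples):
       # Map a false-positive rate to the Eval W2k.06 grid.
       rate = 100.0 * false_positives / clean_samples
       if rate == 0.0:  return "perfect"
       if rate <= 0.6:  return "excellent"
       if rate <  2.5:  return "very good"
       if rate <  5.0:  return "good enough"
       if rate < 10.0:  return "rather bad"
       if rate < 20.0:  return "very bad"
       return "useless"

   print(fp_grade(0, 334))   # perfect
   print(fp_grade(2, 334))   # 2/334 = 0.599% -> excellent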
********************************************************************
Findings W2K.06: Avoidance of False-Positive alarms is rather well
   developed, at least for file-FP avoidance.
   11 overall FP-avoiding "perfect" W2k scanners:
      ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
   7 more products are "excellent":
      AVK,AVP,CMD,FPR,FSE,PAV,RAV
   ***************************************************
   Concerning file-FP avoidance, ALL 25 products are "perfect":
      ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,IKA,INO,NAV,
      NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
   ***************************************************
   Concerning macro-FP avoidance,
      11 products are "perfect":
         ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
      7 products are "excellent":
         AVK,AVP,CMD,FPR,FSE,PAV,RAV
********************************************************************

Eval W2k.07: Detection of Macro and Script Malware under W-2k
=============================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category is
that customers are interested in also being warned about and
protected from non-viral and non-wormy malicious objects such as
trojans etc., the payload of which may be disastrous to their work
(e.g. stealing passwords). Since VTC test "1999-03", malware
detection is a mandatory part of VTC tests, both for submitted
products and for those downloaded as free evaluation copies. A
growing number of scanners is indeed able to detect non-viral
malware.

The following grid (admittedly with reduced granularity) is applied
to classify detection of file and macro malware:
   - detection rate  =100%    : scanner is "perfect"
   - detection rate  > 90%    : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate  < 60%    : scanner is "not good enough"

Generally, detection of malware needs significant further
development, as mean detection rates over platforms show:
   Mean detection rate for file malware:   65.6% (70.8% for scanners >10%)
                       for macro malware:  84.2% (91.4% for scanners >10%)
                       for script malware: 62.0% (64.4% for scanners >10%)

In comparison to the last test (2002-12), mean detection rates
developed as follows:
   for file malware:   from 73.5% to 65.6%: MUCH LOWER.
   for macro malware:  from 84.9% to 84.2%: LOWER.
   for script malware: from 51.0% to 62.0%: IMPROVED.
Concerning File, Macro AND Script malware detection:
   ------------------------------------------------------------
   "Perfect" file/macro/script malware detectors under W2k:
        ---
   ------------------------------------------------------------
   "Excellent" file/macro/script malware detectors under W2k:
        FSE ( 99.8%  100%  98.5%)
        AVK ( 99.7%  100%  98.5%)
        PAV ( 99.3%  100%  97.9%)
        AVP ( 98.2%  100%  98.2%)
        SCN ( 97.7% 99.8%  98.2%)
   ------------------------------------------------------------
   "Very Good" file/macro/script malware detectors under W2k:
        NAV ( 92.8% 98.4%  89.7%)
        FPR ( 98.9%  100%  89.1%)
   ------------------------------------------------------------

Concerning only file malware detection:
   "Perfect" file malware detectors under W2k:
        ---
   "Excellent" file malware detectors under W2k:
        FSE(99.8%), AVK(99.7%), PAV(99.3%), FPR(98.9%),
        AVP(98.2%), SCN(97.7%), NAV(92.8%)
   "Very Good" file malware detectors under W2k:
        CMD,SWP (both: 89.0%), INO(83.5%)

Concerning only macro malware detection:
   "Perfect" macro malware detectors under W2k:
        AVK,AVP,CMD,FSE,PAV (all: 100%)
   "Excellent" macro malware detectors under W2k:
        SCN(99.8%), INO(99.4%), RAV(99.2%), DRW(99.0%),
        NAV,SWP (both: 98.4%), NVC(97.1%), BDF(96.3%),
        AVA(95.9%), VBR(94.8%), IKA(91.5%)
   "Very Good" macro malware detectors under W2k:
        ANT(86.4%)

And concerning script malware detection only:
   "Perfect" script malware detectors under W2k:
        ---
   "Excellent" script malware detectors under W2k:
        AVK,FSE (both: 98.5%), AVP,SCN (both: 98.2%),
        PAV(97.9%), RAV(93.3%)
   "Very Good" script malware detectors under W2k:
        NAV(89.7%), FPR(89.1%), AVA(86.7%)

*******************************************************************
Findings W2k.07: File and macro malware detection under W2k is less
   developed than in the last test, whereas script malware detection
   is improving.
   Concerning overall malware detection:
      0 products are "perfect":    ---
      5 products are "excellent":  FSE,AVK,PAV,AVP,SCN
      2 products are "very good":  NAV,FPR
   ***************************************************
   Concerning only file malware detection,
      0 products are "perfect":    ---
      7 products are "excellent":  FSE,AVK,PAV,FPR,AVP,SCN,NAV
      3 products are rated "very good": CMD,SWP,INO
   ***************************************************
   Concerning only macro malware detection,
      5 products are "perfect":    AVK,AVP,CMD,FSE,PAV
     11 products are "excellent":  SCN,INO,RAV,DRW,NAV,SWP,NVC,
                                   BDF,AVA,VBR,IKA
      1 product is rated "very good": ANT
   ***************************************************
   Concerning only script malware detection,
      0 products are "perfect":    ---
      6 products are "excellent":  AVK,FSE,AVP,SCN,PAV,RAV
      3 products are rated "very good": NAV,FPR,AVA
******************************************************************

Eval W2k.SUM: Grading of W-2000 products:
=========================================

Under the scope of VTC's grading system, a "Perfect W2k AV/AM
product" would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99% of
    zoo samples, in ALL categories (file, macro and script-based
    viruses), always with the same high precision of identification
    and in every infected sample,
 2) Will detect ALL ITW viral samples in compressed objects for the
    6 popular packers,
2A) Will detect ALL ITW samples both in unpacked instantiations AND
    packed with ALL (6) popular packers, and
 3) Will NEVER issue a False-Positive alarm on any sample which is
    not viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) Will be a "Perfect AntiVirus product", that is:
    100% ITW detection AND >=99% zoo detection,
    AND high precision of identification,
    AND high precision of detection,
    AND 100% detection of ITW viruses in compressed objects,
    AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    at least in unpacked forms, reliably at high rates (>=90%).
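Both definitions merely combine criteria that were graded separately
in the preceding sections; expressed as predicates they read as
follows. This is a minimal Python sketch of the stated conditions
only (rates in percent; the dictionaries keyed by category or packer
are assumptions of the illustration, not VTC's evaluation code):

   def is_perfect_av(itw, zoo, packed_itw, fp_rate):
       # Definition (1): 100% ITW and >=99% zoo in every category,
       # 100% on ITW samples for all 6 packers, no false positives.
       return (all(r == 100.0 for r in itw.values()) and
               all(r >= 99.0 for r in zoo.values()) and
               all(r == 100.0 for r in packed_itw.values()) and
               fp_rate == 0.0)

   def is_perfect_am(perfect_av, malware):
       # Definition (2): a perfect AV product that also detects all
       # essential malware classes reliably (>=90%, unpacked forms).
       return perfect_av and all(r >= 90.0 for r in malware.values())

   # Hypothetical rates: perfect AV, but macro malware below 90%:
   print(is_perfect_am(True, {"file": 95.0, "macro": 88.0,
                              "script": 92.0}))   # False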
*****************************************************************
In VTC test "2004-07", we found *** NO perfect W2k AV product ***
              and we found      *** NO perfect W2k AM product ***
*****************************************************************

But several products seem to approach our definition on a rather high
level (taking into account the highest value of "perfect" defined on
the 100% level, and "excellent" defined by 99% for virus detection
and 90% for malware detection):

Test category:            "Perfect"             "Excellent"
------------------------------------------------------------------
W2k file ITW test:        SCN                   AVK,AVP,DRW,FPR,FSE,NAV,
                                                PAV,RAV,INO,SWP,AVA
W2k macro ITW test:       ANT,AVK,AVP,BDF,DRW,  AVG,CMD,FPR,INO,RAV,
                          FSE,NAV,PAV,SCN,SWP   AVA,IKA,PRO
W2k script ITW test:      AVK,AVP,FSE,NAV,      FPR,INO,RAV
                          PAV,SCN
------------------------------------------------------------------
W2k file zoo test:        ---                   AVK,AVP,FSE,PAV,SCN,
                                                FPR,RAV,NAV
W2k macro zoo test:       ---                   AVK,AVP,FSE,PAV,SCN,CMD,
                                                INO,FPR,NAV,RAV,SWP,DRW,NVC
W2k script zoo test:      FSE,SCN               AVK,AVP,PAV,RAV,FPR
------------------------------------------------------------------
W2k file pack test:       ---                   SCN
W2k macro pack test:      AVK,AVP,BDF,FSE,PAV   AVA,DRW,QHL,RAV,SCN
W2k pack/unpack test:     AVP,FSE               PAV,RAV,SCN
------------------------------------------------------------------
W2k file FP avoidance:    ANT,AVA,AVG,AVK,AVP,  ---
                          BDF,CMD,DRW,FIR,FPR,
                          FSE,GLA,IKA,INO,NAV,
                          NVC,PAV,PER,PRO,QHL,
                          RAV,SCN,SWP,VBR,VSP
W2k macro FP avoidance:   ANT,AVA,AVG,BDF,GLA,  AVK,AVP,CMD,FPR,
                          INO,NAV,PRO,SCN,SWP,  FSE,PAV,RAV
                          VSP
------------------------------------------------------------------
W2k file malware test:    ---                   FSE,PAV,AVK,FPR,
                                                AVP,SCN,NAV
W2k macro malware test:   AVK,AVP,CMD,FSE,PAV   SCN,INO,RAV,DRW,NAV,
                                                SWP,NVC,BDF,AVA,IKA,VBR
W2k script malware test:  ---                   AVK,FSE,AVP,SCN,PAV,RAV
------------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this W2k test with a simple
algorithm: counting graded places over all categories, weighting
"perfect" twice and "excellent" once, for the first places (a sketch
of this computation follows the ranking lists):

************************************************************
"Perfect" W-2000 AntiVirus product:    =NONE=   (22 points)
"Excellent" W-2000 AV products:
    1st place:  SCN          (17 points)
    2nd place:  FSE          (16 points)
    3rd place:  AVP          (15 points)
    4th place:  AVK,PAV      (13 points)
    6th place:  NAV,RAV      (11 points)
    8th place:  FPR          ( 9 points)
    9th place:  BDF,INO,SWP  ( 8 points)
   12th place:  AVA,DRW      ( 7 points)
   14th place:  ANT          ( 6 points)
   15th place:  AVG,CMD,PRO  ( 5 points)
   18th place:  GLA,VSP      ( 4 points)
   20th place:  IKA,NVC,QHL  ( 3 points)
   22nd place:  FIR,PER,VBR  ( 2 points)
************************************************************
"Perfect" W-2000 AntiMalware product:  =NONE=   (28 points)
"Excellent" W-2000 AntiMalware products:
    1st place:  SCN,FSE      (20 points)
    3rd place:  AVP          (19 points)
    4th place:  AVK,PAV      (17 points)
    6th place:  NAV,RAV      (13 points)
    8th place:  FPR          (10 points)
    9th place:  BDF,INO,SWP  ( 9 points)
   12th place:  AVA,DRW      ( 8 points)
   14th place:  CMD          ( 7 points)
   15th place:  IKA,NVC      ( 4 points)
   17th place:  VBR          ( 3 points)
************************************************************
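As a sketch of the point scheme referenced above: every "perfect"
entry in the summary table earns 2 points and every "excellent" entry
1 point, so a product "perfect" in all 11 AV categories would reach
the 22-point maximum quoted above (28 points with the 3 malware
categories included), and tied totals share a place. A minimal Python
illustration over a small subset of the table, not the full scoring
run:

   from collections import defaultdict

   WEIGHT = {"perfect": 2, "excellent": 1}

   def rank(entries):
       # entries: (product, grade) pairs collected over all categories.
       points = defaultdict(int)
       for product, grade in entries:
           points[product] += WEIGHT.get(grade, 0)
       return sorted(points.items(), key=lambda kv: -kv[1])

   # Illustrative subset (script zoo + file pack tests only):
   subset = [("FSE", "perfect"), ("SCN", "perfect"),
             ("AVK", "excellent"), ("SCN", "excellent")]
   print(rank(subset))   # [('SCN', 3), ('FSE', 2), ('AVK', 1)]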